Wednesday, October 23, 2019

Book Review Essay: “Thinking, Fast and Slow”

I read the international bestseller “Thinking, Fast and Slow” by Daniel Kahneman (winner of the Nobel Prize) over the last 3-4 weeks. I found it a very interesting book: it takes a very critical look at the human brain and mind, and it gave me many insights into decision-making and into the errors we make automatically, every day, without noticing them. Kahneman talks throughout about “System 1” and “System 2”. System 1 is fast; it’s intuitive, associative, metaphorical, automatic, impressionistic, and it can’t be switched off. Its operations involve no sense of intentional control, but it’s the “secret author of many of the choices and judgments you make”, and it’s the hero of the book. System 2 is slow, deliberate, effortful. Its operations require attention. System 2 takes over, rather unwillingly, when things get difficult. It’s “the conscious being you call ‘I’”, and we naturally identify with it; one of Kahneman’s main points is that this is a mistake. You’re wrong to identify only with System 2, for you are also and equally and profoundly System 1. Kahneman compares System 2 to a supporting character who believes herself to be the lead actor and often has little idea of what’s going on.

System 2 is slothful and tires easily, so it usually accepts what System 1 tells it. It’s often right to do so, because System 1 is for the most part pretty good at what it does; it’s highly sensitive to subtle environmental cues, signs of danger, and so on. It does, however, pay a high price for speed. It loves to simplify, to assume WYSIATI (“what you see is all there is”), even as it gossips and embroiders and confabulates. It’s hopelessly bad at the kind of statistical thinking often required for good decisions, it jumps wildly to conclusions, and it’s subject to a fantastic suite of irrational biases and interference effects (the halo effect, the “Florida effect”, framing effects, anchoring effects, confirmation bias, outcome bias, hindsight bias, availability bias, the focusing illusion, and so on).

Thousands of experiments have been conducted, right across the broad board of human life, all to the same general effect. We don’t know who we are or what we’re like, we don’t know what we’re really doing, and we don’t know why we’re doing it. That’s a System 1 exaggeration, for sure, but there’s more truth in it than you can easily imagine. Judges think they make considered decisions about parole based strictly on the facts of the case. It turns out (to simplify only slightly) that it is really their blood-sugar levels sitting in judgment.

We also hugely underestimate the role of chance in life (this again is System 1’s work). Analysis of the performance of fund managers over the longer term shows that you’d do just as well if you entrusted your financial decisions to a monkey throwing darts at a board. There is a tremendously powerful illusion that sustains managers in the belief that their results, when good, are the result of skill; Kahneman explains how the illusion works. The fact remains that “performance bonuses” are awarded for luck, not skill. They might as well be handed out on the roll of a die: they’re completely unjustified. This may be why some banks now speak of “retention bonuses” rather than performance bonuses, but the idea that retention bonuses are needed depends on the shared myth of skill, and since the myth is known to be a myth, the system is profoundly dishonest – unless the dart-throwing monkeys are going to be cut in.
In an experiment designed to test the “anchoring effect”, highly experienced judges were given a description of a shoplifting offence. They were then “anchored” to different numbers by being asked to roll a pair of dice that had been secretly loaded to produce only two totals: three or nine. Finally, they were asked whether the prison sentence for the shoplifting offence should be greater or fewer, in months, than the total showing on the dice. Normally the judges would have made extremely similar judgments, but those who had just rolled nine proposed an average of eight months, while those who had rolled three proposed an average of only five months. All were unaware of the anchoring effect.

The same goes for all of us, almost all the time. We think we’re smart; we’re confident we won’t be unconsciously swayed by the high list price of a house. We’re wrong. (Kahneman admits his own inability to counter some of these effects.) We’re also hopelessly subject to the “focusing illusion”, which can be conveyed in one sentence: “Nothing in life is as important as you think it is when you’re thinking about it.” Whatever we focus on bulges in the heat of our attention until we assume its role in our life as a whole is greater than it really is.

Daniel Kahneman won the Nobel prize in economics in 2002, largely for work he had done together with Amos Tversky, and Thinking, Fast and Slow has its roots in their joint work. It is an outstanding book, distinguished by beauty and clarity of detail, precision of presentation and gentleness of manner.
