Two (worthwhile) ways of thinking

There are many ways to test our decision-making ability. What I propose is a simple riddle, but one that must be answered instinctively.

“On a lawn there is a patch of grass; every day the patch doubles in size, and it takes 48 days for it to cover the entire lawn. How many days does it take to cover half the lawn?”

Who says 24?

….

Who says 96?

Who says 47?

Most people are certain that they make decisions rationally, i.e. that they weigh the alternatives optimally, evaluating the pros and cons of each option so as to arrive, in a reasonable amount of time, at the choice that best serves the goal they have set.

If the answer chosen was 47, this is probably the case: since the patch doubles every day, a lawn that is fully covered on day 48 must have been half covered the day before, on day 47.

If it was 24 or 96, it is proof that instinct, at least on this occasion, got the better of reason. What is too often underestimated is the fact that intuition leads us astray: systematically, recurrently, and predictably, as biases and heuristics teach.
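To see it in numbers, here is a minimal sketch in Python (purely illustrative, not from the article) that simulates the doubling patch and checks both the correct answer and the tempting one:

```python
# Illustrative check: a patch that doubles every day and fills the lawn on day 48.
size = 1          # patch size on day 0 (arbitrary unit)
sizes = {}
for day in range(49):
    sizes[day] = size
    size *= 2

lawn = sizes[48]              # the lawn is whatever the patch reaches on day 48
print(sizes[47] == lawn / 2)  # True: the lawn is half covered on day 47
print(sizes[24] / lawn)       # ~6e-08: on day 24 the patch is still almost invisible
```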

Why is it so easy to make mistakes?

Proving the point are two personalities at once strong and antithetical. Both were grandsons of Eastern European rabbis, sharing a deep interest in the way “people function in their normal states, practicing psychology as an exact science, and both searching for simple, powerful truths […], gifted with minds of shocking productivity”. Both were Jewish atheists in Israel.

Their names were Amos Tversky and Daniel Kahneman, the psychologist who would go on to win the Nobel Prize in economics.

Amos Tversky was optimistic and brilliant, because “When you’re a pessimist and the bad thing happens, you experience it twice: once when you worry and the second time when it happens.” He could reason about scientific questions with experts in fields far removed from his own, yet he was almost ethereal, intolerant of social conventions and of metaphors: “They replace genuine uncertainty about the world with semantic ambiguity. A metaphor is a cover-up.”

Kahneman, by contrast, was born in Tel Aviv and spent his childhood in Paris. In 1940 the German occupation put the family at risk; hidden in the south of France, they managed to survive (with the exception of his father, who died of untreated diabetes). After the war, the rest of the family emigrated to Palestine.

If Tversky was a night owl, Kahneman is an early riser who often wakes up alarmed about something. He is prone to pessimism, claiming that by “expecting the worst, one is never disappointed.” This pessimism extends to the expectations he has for his own research, which he likes to call into question: “I have a sense of discovery whenever I find a flaw in my thinking.”

Tversky liked to say, “People aren’t that complicated. Relationships between people are complicated.” But then he would stop and add, “Except for Danny.”

They were different, but anyone who saw them together, talking for hours on end, knew that something special was happening. It is to the two of them that we owe our understanding of why we go wrong when we make decisions.

Kahneman’s book is an immense work, dedicated to his late colleague: in the end, it all comes down to being slow or fast.


It’s all about being slow or fast

When it comes to thinking or making a decision, the brain mobilizes two systems: System 1 (S1) and System 2 (S2). S1 is intuitive, impulsive, automatic, unconscious, fast and economical, and it loves to jump to conclusions. S2, on the other hand, is conscious, deliberative, reflective, slow, often lazy and laborious to start up.

S1 and S2 do not really exist; they are a handy analogy (or label) that helps us understand what is going on in our heads. It is thanks to S1, for instance, that we can tie our shoes without really paying attention to the action, notice that one object is farther away or closer than another, instantly detect fear on a person’s face, or answer within moments the question: “What is the capital of France?”

It is thanks to S2 that we can focus on the voice of a specific person in a noisy, crowded room, find our car in a packed parking lot, dictate our phone number, fill out questionnaires, do mental arithmetic and learn poems by heart. It would not be possible to perform several complex tasks like these simultaneously: we can perform several actions at once, but only if they are simple and require little mental effort.
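To make the analogy concrete, here is a toy sketch in Python (a loose illustration of the fast/slow distinction, not a model from Kahneman’s book): an “S1-style” function answers the lawn riddle instantly with the tempting shortcut, while an “S2-style” function takes the slower route and works the doubling backwards.

```python
# Toy illustration of fast vs. slow answering; not how the brain actually works.

def answer_fast(full_days: int) -> int:
    """S1-style: jump to the tempting conclusion (half the lawn -> half the days)."""
    return full_days // 2          # 24 for the riddle: effortless, but wrong

def answer_slow(full_days: int) -> int:
    """S2-style: deliberately walk the doubling backwards until half the lawn is covered."""
    day, covered = full_days, 1.0
    while covered > 0.5:
        day -= 1
        covered /= 2
    return day                     # 47: slower, but correct

print(answer_fast(48), answer_slow(48))   # 24 47
```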

Bias, heuristics and intuitions

Both systems are active throughout our waking hours, but while the first works automatically, the second idles in a low-effort mode in which only a small fraction of its capacity is used. Its standing order is to consume as few calories as possible.

Normally S2 follows S1’s advice without making any changes. However, when System 1 runs into trouble, it calls on System 2 to help analyze the information and suggest a solution to the problem. Likewise, when S2 realizes that its partner is about to make a mistake, it steps in: for example, when you would like to insult your boss, but something stops you. That something is System 2.

However, S2 does not always check System 1’s judgments, and this is what leads to error. How? Exactly as it did with the riddle proposed at the opening of this article.

While it is indisputable that System 1 is at the origin of most of our errors (biases and heuristics), it is also true that it produces many “expert intuitions”, the automatic reflexes that are essential in our lives for making important decisions in fractions of a second. It is thanks to System 1 that a surgeon in the operating room or a firefighter facing a blaze can make life-and-death choices in an emergency and, very often, get them right in those few moments.

The trouble is that S1 does not know its own limits. It tends to make unforgivable mistakes when assessing the statistical probability of an event.

Using System 1, we generally underestimate the risk that rare catastrophic events will occur, while overestimating the probability that they will recur soon after such disasters have happened. In short, if on the one hand it helps us take countless decisions, on the other its very speed generates errors, because it does not analyze the available data for as long as the operation would require; it prefers to jump to conclusions in order to show us, quickly and effortlessly, how to act.
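As a purely illustrative aside (the numbers are mine, not Kahneman’s): elementary probability shows how misleading intuition about “rare” events can be. An event with probability p on any given day has probability 1 - (1 - p)^n of occurring at least once in n days, and that figure grows faster than intuition suggests.

```python
# Illustrative only: how likely is a "rare" event to happen at least once?
# Assumed example: daily probability p = 1/10,000, checked over longer horizons.

p = 1 / 10_000
for years in (1, 10, 50):
    days = 365 * years
    at_least_once = 1 - (1 - p) ** days
    print(f"{years:>2} years: {at_least_once:.1%}")
# ~3.6% after 1 year, ~30.6% after 10 years, ~83.9% after 50 years
```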

System 1 is therefore easily influenced. This is why nudges were born: to prevent errors, but above all to protect people from being crushed by the volatility of their System 1 or the slowness of their System 2. The nudge is the gentle push, described in what Kahneman cites as the bible of behavioral economics, “that directs people to make the right choice”.

Laura Mondino

Sources

Lewis M., A Nobel friendship. Kahneman and Tversky, the meeting that changed the way we think, Raffaello Cortina Editore, Milan, 2017, pp. 165-166

Stanovich K., West R., Individual differences in reasoning: Implications for the rationality debate?, Behavioral and brain sciences (2000) 23, 645-726 http://pages.ucsd.edu/~mckenzie/StanovichBBS.pdf

Kahneman D., Thinking, Fast and Slow (Italian edition: Pensieri lenti e veloci), Mondadori, Milan, 2016, p. 23.
