Blog
October 31, 2024

Thinking Fast and Slow: the study

Arnaud Weiss
5 minute read

In the 1970s, the future Nobel Prize-winning economist Daniel Kahneman and the researcher Amos Tversky sought to understand the irrational decisions people make in economic matters. They theorized the concept of "cognitive bias" and highlighted our profound irrationality. I have always thought of myself as a Cartesian, logical person. Reading System 1 / System 2 (Thinking Fast and Slow) shook my certainties. It is certainly the book that has served me best as an entrepreneur. Here is a modest summary, with my comments.

In this article, discover:

  • The two systems that co-exist in your brain to process information
  • Why you shouldn't (always) trust your instincts
  • The most important lesson of Kahneman's work

In Introduction to Psychoanalysis, Freud explains that humanity has suffered three narcissistic wounds:

  • Copernicus' observation that the Earth is not at the center of the universe.
  • Darwin's discovery of the animal ancestry of Homo sapiens, which is only one species among others.
  • And finally, the psychoanalytic movement's highlighting of the role of the unconscious: "the ego is not master in its own house" (Sigmund Freud).

"A third denial will be inflicted on human megalomania by the psychological research of our day." Sigmund Freud (full extract)

Half a century later, Kahneman and Tversky inflicted a fourth narcissistic wound on humanity by highlighting our difficulty in making rational choices. But where does this difficulty come from?

Two systems co-exist in our brain to process information and make decisions.

 

System 1: fast thinking 🐇
  • Characteristics 📊: intuitive, emotional, unconscious, fast
  • Examples 🔎: driving on a highway, choosing clothes, discerning anger in someone's voice

System 2: slow thinking 🐢
  • Characteristics 📊: logical, effortful, self-aware, slow
  • Examples 🔎: filling out a tax form, solving 24 x 12, learning the rules of a new board game


 

The first, System 1 (fast thinking), is the intuitive, emotional and automatic mode of thinking. You use it every day, without realizing it, to navigate your environment. It is unconscious.

Think of all the micro-decisions you make every day without realizing it: choosing your clothes or going to the office. System 1 is also responsible for your unconscious reflexes, such as dodging a ball that is thrown at your face.
In contrast, the second mode of thinking, System 2 (slow thinking), is a deliberate and conscious effort of thought. It comes into play when we have to solve complex or new problems.

Let's take a concrete example. Try to solve the multiplication 24 x 12. The answer does not come naturally to you; you have to make a conscious effort to work it out. Kahneman discovered that this effort is even visible! For a brief moment, your pupils dilated with the effort of engaging System 2 (source).

The human body always seeks to minimize the cognitive effort needed to accomplish a task (see our article on habits on this subject). This is why, even when faced with complex problems, System 1 automatically kicks in and offers an intuitive solution. The problem? That solution is very often wrong: System 1 is prone to systematic errors on logical and statistical questions.

When faced with logical or statistical problems, our intuition is wrong

Let's take a concrete example with a short problem. Read the three statements below.

  • A bat and a ball cost €11 in total
  • The bat costs €10 more than the ball
  • How much does the ball cost?

What is the first answer that comes to mind? Write it down, then take a piece of paper and solve the problem.

Your System 1 shouted out the easy answer: €1. But when you thought about it, you realized that this was not possible. If the ball costs €1, the bat costs €11, and the sum of the two is €12, not €11. The correct answer is €0.50.
In a study of several thousand students, 50% gave the wrong answer at the prestigious universities of Harvard, MIT and Princeton. For less selective universities, the rate rose to 80%.
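Engaging System 2 here simply means writing the two constraints down and solving them, instead of trusting the first number that comes to mind. A minimal sketch (in Python, purely illustrative) of that deliberate check:

```python
# Bat-and-ball puzzle as two equations:
#   bat + ball = 11   (total price)
#   bat = ball + 10   (price difference)
# Substituting the second into the first: ball + (ball + 10) = 11

ball = (11 - 10) / 2
bat = ball + 10

# The correct answer satisfies both constraints...
assert bat + ball == 11
assert bat - ball == 10

# ...while System 1's instant answer (€1) does not:
intuitive_ball = 1
assert intuitive_ball + (intuitive_ball + 10) != 11

print(f"ball = €{ball:.2f}, bat = €{bat:.2f}")  # ball = €0.50, bat = €10.50
```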

Still not convinced? Here are some more examples from Shane Frederick's Cognitive Reflection Test (link):

  • If it takes 5 minutes for 5 machines to make 5 widgets, how long would it take 100 machines to make 100 widgets?
  • There are water lilies in a pond. Every day the area covered by the water lilies doubles in size. If it takes 48 days for the water lilies to cover the whole lake, how long does it take for them to cover half of it?
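For readers who want to check their intuition the System 2 way, here is a small illustrative sketch (spoiler: the correct answers appear in the output):

```python
# Widget puzzle: 5 machines make 5 widgets in 5 minutes,
# so each machine makes one widget every 5 minutes.
rate = 5 / (5 * 5)            # widgets per machine-minute (0.2)
minutes = 100 / (100 * rate)  # time for 100 machines to make 100 widgets

# Water-lily puzzle: coverage doubles daily, and the pond is full on day 48.
# Walk backwards from full coverage to find the half-covered day.
coverage, day = 1.0, 48
while coverage > 0.5:
    coverage /= 2
    day -= 1

print(minutes, day)  # 5.0 47
```

In both cases the intuitive answers (100 minutes, 24 days) are wrong: doubling works backwards one step at a time, and the machines work in parallel.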

The first key lesson of Thinking Fast and Slow is therefore: if you want to make rational and informed decisions, you must systematically go beyond your first intuition. Make a (painful) conscious effort to analyze the situation and weigh up the various arguments.

Is willpower an infinite faculty, or does it have a limited reservoir? Can the mere sight of a banknote make you more selfish? Is it rational to stay at the cinema even if you don't like the film? I will answer these questions in the next article, where we will explore "cognitive biases".

To make sure you don't miss any of our articles, subscribe to our newsletter and receive a monthly summary of our top stories!


Our intuition or 'system 1' is subject to cognitive biases. These biases lead us to make systematic errors of judgment when faced with complex problems. However, to save effort, we often use our intuition to solve them, rather than our analytical thinking. In fact, analytical thinking - our "system 2" - is more energy-intensive.

In this second part, I detail the cognitive biases that most affect our daily lives. Beyond simple intellectual curiosity, being aware of their existence is, in my opinion, the first step towards making fairer and more informed choices.

In this article, discover:

  • The insidious priming effect, or the power of symbols
  • Confirmation bias as a barrier to fair choices
  • The halo effect: why we often judge a book by its cover
     

The insidious priming effect, or the power of symbols

The priming effect describes the influence that mere exposure to a concept has on our subsequent behaviour. This influence is primarily mental, but also physical.

In a famous experiment conducted by the psychologist John Bargh, two groups of students aged 18 to 22 were asked to form sentences from groups of words. While the first group had completely random words, the second group had only words associated with old age: greying, wrinkled, dentures, etc.

Once the task was completed, the students were invited to go to another room for a second exercise. The researchers measured their speed of movement without their knowledge. They were surprised to see that the students exposed to the lexicon of old age walked significantly slower than the first group!

During John Bargh's experiment, the students were not aware of the real purpose of the experiment. Their speed was measured without their knowledge.

Numerous experiments have subsequently confirmed these observations. For example, after being exposed to the symbolism of money (Monopoly money), adults behaved more selfishly than a control group.

The discovery of this bias led me to two conclusions:

  • Our daily environment influences our behaviour. We must shape it to suit our goals.
  • The use of particular iconography by regimes or companies has a real impact on people.

Confirmation bias as a barrier to fair choices

Confirmation bias is our tendency to select only the information that confirms our existing beliefs or ideas. It can be summed up in a phrase from Warren Buffett:

"Humans excel at interpreting new information in such a way that their previous conclusions remain unchanged."

This bias, omnipresent in everyday life, is the root cause of many errors of judgement. On the one hand, we readily accept low-quality evidence that confirms our beliefs and opinions. On the other, we are blind to very strong signals that we are wrong, and we reject solid evidence that supports an opposing view. This is easy to observe in any emotionally charged discussion or political debate.

But why does this bias exist? Is it dishonesty or simple intellectual laziness? In The Little Book of Stupidity, Sia Mohajer offers an answer:

We look for elements that confirm our beliefs and opinions about the world, but exclude those that contradict them... In order to simplify the world and make it conform to our expectations (...) Accepting information that confirms our ideas is easy and requires little energy. In trying to save energy, our mind will search for information in such a way that our interpretation of the evidence will be biased.

The explanation lies in the effort required to sincerely question our beliefs. Accepting that we have been wrong for a long time is painful; refusing to see the error is much more comfortable.

There is a second element, highlighted by Arthur Schopenhauer in The Art of Being Right. In a debate, we are driven by our ego to avoid losing face at all costs. To do this, we use any means necessary, and we are prepared to defend an argument tooth and nail, even if it is flawed. Vanity takes precedence over the search for truth.
 

But how can the effects of confirmation bias be mitigated? When faced with an important choice, there are several avenues:

  • Force yourself to collect all the relevant information, whether or not it supports your position;
  • Do the exercise of dismantling your own hypothesis with substantiated arguments.
     

The halo effect: why we often judge a book by its cover

Exaggerated emotional coherence, or the "halo effect", describes the influence of one positive or negative characteristic on the perception of a person's other characteristics. For example, a person who is judged to be attractive is unconsciously attributed other qualities: honesty, generosity, kindness, etc.

This phenomenon was measured by the London School of Economics in a study in which they asked participants to estimate the IQ of people based on their photographs. Men who were judged to be attractive were given 13.6 IQ points more than the average. For women, the difference was 11.4 points.
In his book, Kahneman also draws on the experiment conducted by Solomon E. Asch in 1946, which you can find below.

Who intuitively seems nice? 🤨

Alan: intelligent - hard-working - impulsive - critical - stubborn - envious
Ben: envious - stubborn - critical - impulsive - hard-working - intelligent

Experiment by Solomon E. Asch, "Forming Impressions of Personality"

Participants rated Alan more positively than Ben, even though the two were described with exactly the same character traits, just presented in a different order!

One explanation for this phenomenon is the human brain's need for coherence. For the sake of convenience, we construct a simplified view of the world. We are not comfortable with ambiguity and incoherence. We then tend to put everything into boxes according to a binary system. 

Thus, we are more forgiving after a good first impression, out of consistency with the positive image of the person that has formed in our mind. 

So what conclusions can we draw from this discovery?

  • Let's take care of our first impression: it has a considerable impact. (Note: this is the whole point of our onboarding solution, investing in the first impression a new recruit forms of the company.)
  • In our relationships with others, let's strive to go beyond first impressions. Let's accept complexity and refuse to make judgments in the absence of relevant data.

There are many other cognitive biases, but I have chosen to focus on the most prominent ones in this article. They have a particularly strong impact on recruitment and performance evaluations, so they matter all the more if you are an entrepreneur or work in HR. If you are interested in the subject, I strongly suggest you read Kahneman's book. If not, you can re-read my first article on the subject, or the one on the art of negotiation, which discusses the anchoring bias.
 

References

  • Sigmund Freud, Introduction to Psychoanalysis, 1916 (extract)
  • Bernard Tursky, David Shapiro, Andrew Crider, Daniel Kahneman, "Pupillary, Heart Rate, and Skin Resistance Changes During a Mental Task", 1969 (source)
  • Shane Frederick, "Cognitive Reflection and Decision Making", 2005 (link)
  • W. Mischel, E. Ebbesen, "Attention in Delay of Gratification", 1970 (link)
  • J. A. Bargh, M. Chen, L. Burrows, "Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action", 1996 (link)
  • Satoshi Kanazawa, J. L. Kovar, "Why Beautiful People Are More Intelligent", 2004 (link)
  • Solomon E. Asch, "Forming Impressions of Personality", Journal of Abnormal and Social Psychology 41, 1946, pp. 258-290 (link)
  • Sia Mohajer, The Little Book of Stupidity
  • Arthur Schopenhauer, The Art of Being Right
     