Our Broken Minds

There’s a special feeling of illumination when two things you have learned separately suddenly click together. I had that pleasure recently while going through Daniel Kahneman’s book Thinking, Fast and Slow.


Kahneman argues that the human mind operates in two modes, which he creatively calls System 1 and System 2. The first is fast, impulsive, automatic, and emotional, and corresponds to what we would informally call intuition. The second is slow, lazy, deliberate, effortful, and rational. System 2 is the conscious one that proclaims the “I,” despite the fact that System 1 does most of the work. When I say the word “elephant” and a mental picture of an elephant comes up in your head, that’s System 1 acting. It’s the same system that says “2” whenever someone says “1 + 1”. System 1 is also present when you’re driving. Think of any activity that occurs involuntarily or can be delegated to autopilot, and you can safely assume it’s an activity of System 1. Any activity that demands your attention, on the other hand, is System 2 at work; whatever carries a real cognitive load gets delegated to System 2.

The entire book discusses the cooperation of these two systems, their merits and faults, and the consequences of their inner workings, which have effects throughout life generally and economics specifically. There’s one aspect of their cooperation that I would like to emphasize: System 1 formulates first impressions and feelings, and System 2, most of the time, accepts them as drafts that turn into beliefs. In short, Kahneman puts it as “[i]f System 1 is involved, the conclusion comes first and the arguments follow.”


That particular sentence caught my attention because I had heard it before. Not in that exact phrasing, but along the same lines. Jonathan Haidt keeps hammering the phrase “intuitions come first, strategic reasoning second” throughout his book The Righteous Mind. In both formulations, intuition comes first and strategic reasoning trails behind. Though neither author explicitly references the other, I believe they are talking about the same thing.


As much as we like to believe we’re in total control of our thoughts and behaviour, we aren’t. Many of our thoughts and actions occur unconsciously. The elephant mentioned earlier is one example: being told “do not think of an elephant” conjures up an elephant anyway, despite your conscious effort not to. We do have control, but not total control. We have conscious control by virtue of System 2, albeit faulty control. Its faultiness is evident in the long list of cognitive biases that scientists have identified in various experiments.1 A lovely metaphor Haidt uses is System 1 as an elephant and System 2 as its rider. The rider can control the elephant only so far: as long as the elephant is willing to be led down a certain path and the rider has enough energy to steer it. The elephant can affect the path, the time, and the entire experience of the journey. If the elephant wants to sway left, the rider cannot prevent it outright; the best the rider can do is work that sway into the journey and guide the elephant back toward the intended path. The effect of the sway is not undone, and the marks are still visible on the ground. The elephant’s idiosyncrasies are the cognitive biases.


Let’s start with the anchoring effect. This bias lets the first piece of information we encounter pull whatever judgment comes right after it, regardless of whether the two are related. Dan Ariely, in one of his books, describes an experiment that used the last two digits of subjects’ social security numbers as the anchor. The subjects were shown wine bottles, told of their qualities, asked to write down the last two digits of their social security number, and then asked to put a price on each bottle. The data showed that subjects whose social security numbers ended in high two-digit numbers priced the bottles higher than those whose numbers ended in low ones. Kahneman describes an experiment in which judges were presented with the case of a woman caught shoplifting, asked to roll a pair of dice, asked to say whether her sentence in months would be higher or lower than the number the dice showed, and then asked to give the actual number of months. The result: “On average, those who had rolled a 9 said they would sentence her to 8 months; those who rolled a 3 said they would sentence her to 5 months; the anchoring effect was 50%” (Kahneman).
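Kahneman quantifies anchoring with an index: roughly, the difference between the average judgments under the two anchors divided by the difference between the anchors themselves. If that is how the 50% above was computed (a reasonable assumption; the quote itself does not spell it out), the arithmetic works out as:

    anchoring effect = (difference in average sentences) / (difference in anchors)
                     = (8 months - 5 months) / (9 - 3)
                     = 3 / 6
                     = 50%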

Considering this effect, how much are we in control?


Which do you think kills more people: sharks or cows? Most people would guess sharks, but it turns out cows cause more deaths than sharks do.2 This is the availability heuristic: the tendency to treat whatever is easy to recall as more frequent or more significant than it really is. Media reports subject us to this effect all the time. Many people fear flying because of plane crashes yet aren’t nearly as afraid of riding in a car, even though far more people die in car wrecks than in plane crashes. The media has an interesting relationship with this effect: the rarer the event, the more newsworthy it is, and the more newsworthy it is, the easier it is to recall.

Considering this effect, how much are we in control?


Another cognitive bias is the framing effect. A surgical procedure described as having a 90% success rate gets a more favorable reaction than the same procedure described as having a 10% failure rate, or a 10% chance of dying on the table. The gist of the information is identical; the reaction it garners is not.

Considering this effect, how much are we in control?


Those are only three cognitive biases from a long list, and even for these three the discussion here is brief; much more can be said about them.


The bad news is: our brains are broken, because System 1 is the default mode and keeps working at inappropriate times.


The good news is: we have System 2. We might not be able to control every thought, but we can influence our thoughts and control our actions.3


The bad news is: System 2 has only so much energy before it gets tired and submits to every suggestion of System 1. This phenomenon of System 2 exhaustion and surrender to System 1 is known as ego depletion. If you have ever been on a strict diet and then, one day, had a wild lapse of self-control, you have experienced ego depletion.

Considering this effect, how much are we in control?


What I’m getting at with this discussion is a call to be aware of our thoughts and judgments. We should always keep a check on our conceptions: are they the product of deliberate System 2 thinking, or unverified impressions from System 1? We must always be cautious and run every thought through System 2 before accepting it. We are responsible; or so believes System 2.


Footnotes:

  2. https://www.washingtonpost.com/news/wonk/wp/2015/06/16/chart-the-animals-that-are-most-likely-to-kill-you-this-summer
  3. http://nymag.com/scienceofus/2016/02/a-neuroscience-finding-on-free-will.html
