You’ll agree: the process of thinking is complicated. And now you want me to describe it in 400 words or less?
Bare bones, then, the basics. There are two types of thinking at our disposal. A quick, efficient, economical one, called Type I, that relies on shortcuts. Type II, on the other hand, is slow, deliberate, effortful. Type I shortcuts are called heuristics: strategies that save you time and usually lead to a correct outcome. Emphasis on usually – they are error-prone compared to Type II thinking, which rewards your investment of effort with greater overall accuracy.
We could not function in daily life without Type I, even though it can lead us astray. Malcolm Gladwell made that point forcefully in his book Blink, although he stretched his enthusiasm a bit too far for my taste. But clearly, quick intuitive judgments can produce life-saving results – look at expert firefighters or pediatric nurses in the ER – and for most of us Type I is good enough, until it isn't.
We can be drawn into errors by relying too heavily on knowledge that is readily available – the availability heuristic – because we have encountered it frequently, because it is vivid, or because we heard it from a trusted source, even when that knowledge does not apply. We are also prone to reason mistakenly from single incidents to whole categories (smoking can't be bad – my aunt smoked a pack a day and lived to 99…) and vice versa, from stereotypes to individuals (Germans are punctual, obedient, bad cooks – must be true for Heuer). (You lucked out on two out of three.) Relying on examples rather than statistics is so much easier when you are short on time and/or distracted!
We can voluntarily shift between these dual modes of thinking, but Type II needs the right cues and circumstances. If you MUST make a quick decision, it won't engage; if you have time and the outcome matters a lot, it will be your choice, in order to avoid dangerous error. And in general we all do better when presented with frequencies ("12 out of 1000 people will get the flu without vaccination") rather than "1.2% will" or "the probability is .012." Presenting problems in the right way, then, really helps people to be better thinkers.
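In case the equivalence isn't obvious, here is the arithmetic spelled out (my own worked step, not part of the original studies):

$$\frac{12}{1000} = 0.012 = 1.2\%$$

Same number, three framings – but only the frequency version lets Type I picture twelve actual people.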
Finally, we have a tendency to seek confirmation for our beliefs – confirmation bias – instead of looking for evidence that might challenge them; we also cling to our beliefs – belief perseverance – even when disconfirming evidence is right before our eyes.
A good example is conspiracy theories: the government is accused of having orchestrated a mass shooting in a nightclub. (Have you seen how much of that is actually floating around on the web after Orlando?) If the government denies it, you can judge that as clever manipulation to hide something. If the government admits to involvement (fat chance), you have your confirmation. If the government remains silent, you can take that as an attempt to keep the secret. In all cases, your belief was never threatened. Think about that!
I knew it – I blew my word count.