Friday, August 3, 2012

Are smart people more prone to thinking errors?

We begin with two simple questions, drawn from a study which examines how people think.  Here goes:

·         Question #1: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost? (take a second, we have time)

·         Question #2: In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

The answers . . . appear below*. And while you ponder, consider this:

As humans, we like to think of ourselves as rational beings, capable of digesting and processing information and arriving at a sound, reasoned answer or decision.  But decades of research have shown that, when it comes to thinking, we aren’t rational at all.  Instead, we take mental shortcuts (as when answering questions one and two) and exhibit a range of biases that lead us astray. 

A recent study sought to learn whether smart people are less prone to such biases – that is, whether intelligence provides some kind of buffer against bias.  The researchers’ conclusion? Apparently not. Said the study authors: “. . . cognitive ability provides no inoculation at all from the bias blind spot.” Ah, the bias blind spot**.  Yes, we all have one – it’s when we think that biased thinking is more prevalent in others than in ourselves.

And here’s the crazy part – the smarter you are, the larger your bias blind spot. The findings come from a study conducted by Richard West and Russell Meserve of James Madison University, and Keith Stanovich of the University of Toronto.  And, they maintain, self-awareness and introspection don’t appear to help – that is, no matter how self-aware or introspective you are, you’re still prone to these common mental biases (an unnerving conclusion, to be sure).

What kinds of biases are we talking about – the ones that even smart people suffer from?

·         The Planning Fallacy – the tendency to underestimate how long it will take to complete a task;

·         Framing Effect – this effect explains why a food item labeled “98% fat free” is more desirable than one labeled “contains 2% fat”;

·         Myside Bias – this is the tendency to evaluate evidence in a way that favors the opinion you already hold on a subject;

·         Anchoring Bias – a quick story best illustrates this type of bias, courtesy of Jonah Lehrer, in an article he crafted for The New Yorker: “Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small ‘anchor’—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold”;

·         Base-Rate Neglect – this is when we ignore probabilities and focus too much on the specific situation; and

·         Outcome Bias – this type of bias shows up when we judge the quality of a decision by how the decision worked out, rather than by how it was made.

So let’s recap: we are all prone to various biases, and being smart doesn’t seem to mitigate them (in some cases, it actually hurts). And neither self-awareness nor introspection appears to weaken these biases.  What’s a human to do?  (thinking . . . thinking )   

* Answer #1: the ball costs 5 cents. And if you missed it, don’t feel too bad. Reportedly, 50% of students at Harvard, Princeton, and MIT also gave the incorrect answer. Answer #2: 47 days.
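For readers who want to see the arithmetic spelled out, here’s a quick sketch (in Python, not part of the original study) of why the intuitive answers – 10 cents and 24 days – are wrong:

```python
# Question 1: bat + ball = 1.10, and bat = ball + 1.00.
# Substituting: (ball + 1.00) + ball = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs(bat + ball - 1.10) < 1e-9   # prices total $1.10
assert abs(bat - ball - 1.00) < 1e-9   # bat costs exactly $1.00 more
print(f"ball costs ${ball:.2f}")       # prints "ball costs $0.05"

# Question 2: the patch doubles every day, so on the day BEFORE it covers
# the whole lake, it must have covered exactly half of it.
full_day = 48
half_day = full_day - 1
print(f"half the lake is covered on day {half_day}")  # day 47
```

The intuitive answers feel right precisely because System 1 latches onto the surface numbers (10 cents, 48 / 2 = 24) instead of checking the constraints.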

** in his article, Lehrer points to one theory on why the bias blind spot exists: “One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.”
