Intellect no shelter from cognitive bias

Choose your evidence carefully (photo by rocket ship)

In a recent post I shared my observation that, as the usefulness of many psychotropics has been cast into doubt by a growing body of evidence, many people who used to chide questioners to “follow the evidence” now resort to anecdotes (and to denigrating other methods) to defend their advocacy for these drugs.

One story, about our ability to rationalize, from an interview in Wired:

Wired: You write that people find it easier to rationalize stealing when they’re taking things rather than actual cash. You did an experiment where you left Coca-Colas in a dorm refrigerator along with a pile of dollar bills. People took the Cokes but left the cash. What’s going on there?

This, I think, is one of the most worrisome experiments we’ve ever conducted, and it’s again about rationalization. There’s a story about a kid who gets in trouble at school for stealing a pencil from another kid, and the father comes home and says, ‘Johnny, that’s terrible, you never steal, and besides, if you need a pencil, let me know and I’ll bring you a box from the office.’

Why is that slightly amusing? Because we recognize that if we were taking the pencil from the office we would not have to confront that we are being immoral, in the way that we would if we took $10 from the petty cash box (even if we used that cash to buy pencils).

Another, from The New Yorker, about how intellect is no protection from cognitive bias [emphasis mine]:

But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.

Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.

And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.”

Keep this in mind when you read headlines like “Addiction Treatment in America: Not Based in Science, Not Truly ‘Medical’.” Ask yourself why she gives the impression that Twelve Step Facilitation is not an evidence-based practice. Or why she fails to mention that the best-informed and most culturally empowered addicts do not seek the treatments she advocates. They receive the kind of treatment she sneers at (albeit reliably high-quality versions of it), and they enjoy stellar outcomes.

This isn’t to say that there’s no truth in what she or other critics have to say. (Dismissing them on that basis would be an example of a cognitive error called the argument from fallacy.) Just be suspicious of people who don’t acknowledge their own bias, particularly when they say their only agenda is evidence or science.