Intellect is no shelter from cognitive bias

Choose your evidence carefully by rocket ship

In a recent post I shared my observation that, as the usefulness of many psychotropics has been cast into doubt by a growing body of evidence, many people who used to chide questioners to “follow the evidence” now resort to anecdotes, and to denigrating other methods, to defend their advocacy for these drugs.

One story, about our ability to rationalize, comes from an interview in Wired:

Wired: You write that people find it easier to rationalize stealing when they’re taking things rather than actual cash. You did an experiment where you left Coca-Colas in a dorm refrigerator along with a pile of dollar bills. People took the Cokes but left the cash. What’s going on there?

 This, I think, is one of the most worrisome experiments we’ve ever conducted, and it’s again about rationalization. There’s a story about a kid who gets in trouble at school for stealing a pencil from another kid, and the father comes home and says, ‘Johnny, that’s terrible, you never steal, and besides, if you need a pencil, let me know and I’ll bring you a box from the office.’

Why is that slightly amusing? Because we recognize that if we were taking the pencil from the office we would not have to confront that we are being immoral, in the way that we would if we took $10 from the petty cash box (even if we used that cash to buy pencils).

Another about how intellect is no protection from cognitive bias in The New Yorker [emphasis mine]:

But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.

Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.

And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.”

Keep this in mind when you read headlines like Addiction Treatment in America: Not Based in Science, Not Truly ‘Medical’. Ask yourself why she’d give the impression that Twelve Step Facilitation is not an evidence-based practice. Or why she’d fail to mention that the best informed and culturally empowered addicts do not seek the treatments she advocates. They receive the kind of treatment she sneers at (albeit reliably high-quality versions of that treatment), and they enjoy stellar outcomes.

This isn’t to say that there’s no truth in what she or other critics have to say. (That would be an example of a cognitive error called the argument from fallacy.) Just be suspicious of people who don’t acknowledge their bias, particularly when they say their only agenda is evidence or science.

The limits of empiricism

While listening to On Point last week I was struck by an argument on a show that focused on Charles Murray’s new book. I have no interest in arguing the merits of his thesis here, but he believes that, for a variety of reasons, America has been dividing by class, and he is profoundly concerned about the implications. In one segment he expresses concern that one result is a growing concentration of the smartest people in the elite class and, by extension, a growing concentration of the least smart people in the lower classes. The host and the other guest push back against what they hear as genetic determinism. Exasperated, Murray says, “There’s a statistical relationship between parental IQ and child IQ… on average, parents with high IQs will produce offspring with higher IQs than parents with lower IQs… It’s a fact!… I’m talking about an empirical relationship that is not contestable!”

I have no interest in entering this debate on this blog, but I think the exchange offers a chance to step outside of the debates in our field.

Murray’s insistence that he was simply reporting a data point shows how blind we can be to our own narratives. He seems only vaguely aware that he has already attributed meaning to the data point: its source, its implications, its importance, and its characteristics (that it is fixed rather than malleable, that genetic determinants are powerful and important in comparison to other determinants, etc.).
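The gap between Murray’s “statistical relationship” and the meanings layered on top of it can be made concrete with a toy simulation. The sketch below is a hypothetical illustration (all numbers are my assumptions, not Murray’s data): it generates parent–child IQ pairs from a purely mechanical regression-to-the-mean model, and the same positive correlation would emerge whether the transmission mechanism were genetic, environmental, or some mix, which is exactly why the bare correlation cannot settle the narrative.

```python
import random

random.seed(0)

def simulate(n=10000, transmission=0.4):
    # Toy model, not real data: child IQ regresses toward the
    # population mean of 100. The `transmission` coefficient is
    # agnostic about mechanism; identical statistics would arise
    # from genetic, environmental, or mixed causal stories.
    pairs = []
    for _ in range(n):
        parent = random.gauss(100, 15)
        child = 100 + transmission * (parent - 100) + random.gauss(0, 13.75)
        pairs.append((parent, child))
    return pairs

def correlation(pairs):
    # Plain Pearson correlation, computed by hand to keep the
    # sketch dependency-free.
    n = len(pairs)
    mx = sum(p for p, _ in pairs) / n
    my = sum(c for _, c in pairs) / n
    cov = sum((p - mx) * (c - my) for p, c in pairs) / n
    vx = sum((p - mx) ** 2 for p, _ in pairs) / n
    vy = sum((c - my) ** 2 for _, c in pairs) / n
    return cov / (vx * vy) ** 0.5
```

Running `correlation(simulate())` yields a clearly positive correlation (roughly the transmission coefficient), a “fact” in Murray’s sense; everything beyond that number is interpretation.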

The host and the other guest were so troubled by the meaning Murray ascribed that all of their responses focused on that meaning; they never really engaged with the data point itself.

A lot of drug policy debates follow a very similar pattern. I find myself frustrated with people who argue that their position is empirically based, as though the meaning they derive from their facts is self-evident, as though they hold the only rational understanding and their conclusions are value-free.

In turn, I could do a better job of responding to their data and concerns.