Cognitive Dissonance and the “oh sh*t!” circuit

by snork
Filed under Climate, Science at January 2nd, 2010 - 10:30 am
Note: this post didn’t start out this way, but as things developed, it turned out to be the first in a trilogy of posts on science, climate, energy, economics, technology, and the future. They will all be a bit meatier than the typical Blogmocracy post, in the mold of Coldwarrior’s three-part series on the Balkans.

Wired magazine has a lot of junky stuff in it, and every once in a while, something brilliant. This is one such piece. The title is misleading and somewhat banal; the article is about a lot more than just neuroscience. It’s about philosophy of science, and even political philosophy. Let me describe how this all applies to the Climategate fiasco. The article starts out with a case study of a failure that led to a later success. Science, technology, and more broadly the history of mankind are full of such stories. But that was just a lead-in to an actual study of science labs: which ones are productive, which ones aren’t, and why. Here we get to the crux:

Dunbar came away from his in vivo studies with an unsettling insight: Science is a deeply frustrating pursuit. Although the researchers were mostly using established techniques, more than 50 percent of their data was unexpected. (In some labs, the figure exceeded 75 percent.) “The scientists had these elaborate theories about what was supposed to happen,” Dunbar says. “But the results kept contradicting their theories. It wasn’t uncommon for someone to spend a month on a project and then just discard all their data because the data didn’t make sense.” Perhaps they hoped to see a specific protein but it wasn’t there. Or maybe their DNA sample showed the presence of an aberrant gene. The details always changed, but the story remained the same: The scientists were looking for X, but they found Y.

Well, gosh. That weren’t s’posed to happen.

Dunbar was fascinated by these statistics. The scientific process, after all, is supposed to be an orderly pursuit of the truth, full of elegant hypotheses and control variables. (Twentieth-century science philosopher Thomas Kuhn, for instance, defined normal science as the kind of research in which “everything but the most esoteric detail of the result is known in advance.”) However, when experiments were observed up close — and Dunbar interviewed the scientists about even the most trifling details — this idealized version of the lab fell apart, replaced by an endless supply of disappointing surprises. There were models that didn’t work and data that couldn’t be replicated and simple studies riddled with anomalies. “These weren’t sloppy people,” Dunbar says. “They were working in some of the finest labs in the world. But experiments rarely tell us what we think they’re going to tell us. That’s the dirty secret of science.”

How did the researchers cope with all this unexpected data? How did they deal with so much failure? Dunbar realized that the vast majority of people in the lab followed the same basic strategy. First, they would blame the method. The surprising finding was classified as a mere mistake; perhaps a machine malfunctioned or an enzyme had gone stale. “The scientists were trying to explain away what they didn’t understand,” Dunbar says. “It’s as if they didn’t want to believe it.”

When you think about it, “hide the decline” fits right in with that not wanting to believe what the data are telling you. What is interesting about the most famous of the CRUtape Letters™ is that the writers took this denial (yes, that’s a correct use of the word) of the data to the next level and started working together to sweep the facts under the carpet.

So what is the take-away message so far? 1) Life is full of surprises, and 2) if you go into your experiments looking for some particular result, there’s a very good chance that you’ll end up fighting your own experiment. As Feynman said, the easiest person to fool is yourself. But why do people react this way?

The experiment would then be carefully repeated. Sometimes, the weird blip would disappear, in which case the problem was solved. But the weirdness usually remained, an anomaly that wouldn’t go away.

This is when things get interesting. According to Dunbar, even after scientists had generated their “error” multiple times — it was a consistent inconsistency — they might fail to follow it up. “Given the amount of unexpected data in science, it’s just not feasible to pursue everything,” Dunbar says. “People have to pick and choose what’s interesting and what’s not, but they often choose badly.” And so the result was tossed aside, filed in a quickly forgotten notebook. The scientists had discovered a new fact, but they called it a failure.

The reason we’re so resistant to anomalous information — the real reason researchers automatically assume that every unexpected result is a stupid mistake — is rooted in the way the human brain works.

Hmmm. So scientists aren’t androids. Who knew?

Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists — our views dictated by nothing but the facts — we’re actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn’t that most experiments fail — it’s that most failures are ignored.

So much for the myth of the scientist just looking for the facts and letting the chips fall where they may. The article goes on to describe a follow-up experiment, in which Dunbar showed subjects videos of two different-sized balls being dropped: in the “correct” video they fall at the same rate, as Galileo demonstrated, while in the “incorrect” one the bigger ball falls faster, the way naive intuition expects.

Furthermore, when Dunbar monitored the subjects in an fMRI machine, he found that showing non-physics majors the correct video triggered a particular pattern of brain activation: There was a squirt of blood to the anterior cingulate cortex, a collar of tissue located in the center of the brain. The ACC is typically associated with the perception of errors and contradictions — neuroscientists often refer to it as part of the “Oh shit!” circuit — so it makes sense that it would be turned on when we watch a video of something that seems wrong.

This explains a lot, and not just about science and scientists. What he’s just shown is that we all develop models of reality in our heads, and then resist evidence that contradicts those models.

The lesson is that not all data is created equal in our mind’s eye: When it comes to interpreting our experiments, we see what we want to see and disregard the rest. The physics students, for instance, didn’t watch the video and wonder whether Galileo might be wrong. Instead, they put their trust in theory, tuning out whatever it couldn’t explain. Belief, in other words, is a kind of blindness.

The lesson here seems clear, but it isn’t. The physics students, in this case, were right. But being right doesn’t mean that you’re not engaging in selective cognition. Their schooling in classical gravity got them past the wrong intuition that the untrained subjects had, but that same kind of schooling led a lot of physicists, a century ago, to resist Einsteinian gravity. So knowledge is a double-edged sword: it can reinforce your correctness, but it can also reinforce your wrongness. Remember this famous Reagan quote:

It isn’t that Liberals are ignorant. It’s just that they know so much that isn’t so.

Seems like Ron understood something intuitively about this problem.

Now the author goes off on an interesting side trip:

In 1918, sociologist Thorstein Veblen was commissioned by a popular magazine devoted to American Jewry to write an essay on how Jewish “intellectual productivity” would be changed if Jews were given a homeland. At the time, Zionism was becoming a potent political movement, and the magazine editor assumed that Veblen would make the obvious argument: A Jewish state would lead to an intellectual boom, as Jews would no longer be held back by institutional anti-Semitism. But Veblen, always the provocateur, turned the premise on its head. He argued instead that the scientific achievements of Jews — at the time, Albert Einstein was about to win the Nobel Prize and Sigmund Freud was a best-selling author — were due largely to their marginal status. In other words, persecution wasn’t holding the Jewish community back — it was pushing it forward.

The reason, according to Veblen, was that Jews were perpetual outsiders, which filled them with a “skeptical animus.” Because they had no vested interest in “the alien lines of gentile inquiry,” they were able to question everything, even the most cherished of assumptions. Just look at Einstein, who did much of his most radical work as a lowly patent clerk in Bern, Switzerland. According to Veblen’s logic, if Einstein had gotten tenure at an elite German university, he would have become just another physics professor with a vested interest in the space-time status quo. He would never have noticed the anomalies that led him to develop the theory of relativity.

Predictably, Veblen’s essay was potentially controversial, and not just because he was a Lutheran from Wisconsin. The magazine editor evidently was not pleased; Veblen could be seen as an apologist for anti-Semitism. But his larger point is crucial: There are advantages to thinking on the margin. When we look at a problem from the outside, we’re more likely to notice what doesn’t work. Instead of suppressing the unexpected, shunting it aside with our “Oh shit!” circuit and Delete key, we can take the mistake seriously. A new theory emerges from the ashes of our surprise.

Based on the resounding success of Israeli science and technology, this would seem to be disproven. But wait. There WE go jumping to a specious conclusion. Since there are approximately as many Jews in the US as in Israel, a fair comparison would be the scientific achievements of American vs. Israeli Jews. I don’t have that information at my fingertips, but I believe that considerably more American than Israeli Jews have received Nobel Prizes over the past 60 years, which would tend to confirm the hypothesis. This is a whole interesting discussion unto itself, but for now we’ll have to say the jury is out. On to the money quote vis-à-vis climate science:

Modern science is populated by expert insiders, schooled in narrow disciplines. Researchers have all studied the same thick textbooks, which make the world of fact seem settled. This led Kuhn, the philosopher of science, to argue that the only scientists capable of acknowledging the anomalies — and thus shifting paradigms and starting revolutions — are “either very young or very new to the field.” In other words, they are classic outsiders, naive and untenured. They aren’t inhibited from noticing the failures that point toward new possibilities.

Is that what is “settled”, Mr. Gore? And the naive and untenured should be prevented from publishing, right, Mr. Mann?

But not every lab meeting was equally effective. Dunbar tells the story of two labs that both ran into the same experimental problem: The proteins they were trying to measure were sticking to a filter, making it impossible to analyze the data. “One of the labs was full of people from different backgrounds,” Dunbar says. “They had biochemists and molecular biologists and geneticists and students in medical school.” The other lab, in contrast, was made up of E. coli experts. “They knew more about E. coli than anyone else, but that was what they knew,” he says. Dunbar watched how each of these labs dealt with their protein problem. The E. coli group took a brute-force approach, spending several weeks methodically testing various fixes. “It was extremely inefficient,” Dunbar says. “They eventually solved it, but they wasted a lot of valuable time.”

The diverse lab, in contrast, mulled the problem at a group meeting. None of the scientists were protein experts, so they began a wide-ranging discussion of possible solutions. At first, the conversation seemed rather useless. But then, as the chemists traded ideas with the biologists and the biologists bounced ideas off the med students, potential answers began to emerge. “After another 10 minutes of talking, the protein problem was solved,” Dunbar says. “They made it look easy.”

Which brings up another major climate issue: multidisciplinarity. When McIntyre and McKitrick trashed Mann’s hockey stick, it was a case of a couple of statisticians elbowing their way into the climatological lair. Mann and company basically told them that they knew everything they needed to know about statistics and wouldn’t be needing any assistance. But which approach discovers the truth faster? Hands down, the multidisciplinary team.
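
To make the statistical point concrete, here is a minimal sketch, in Python, of the “short-centering” artifact at the heart of the McIntyre and McKitrick critique. This is not the actual code behind Mann’s reconstruction or M&M’s script; the series count, noise parameters, and window sizes are made-up stand-ins. It just illustrates the claim: center each series on only the recent calibration window, take the leading principal component, and hockey-stick shapes tend to emerge even from trendless red noise.

```python
# Toy demonstration of the "short-centering" PCA artifact.
# All sizes and parameters below are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(42)
N_SERIES, N_YEARS, CAL = 70, 600, 100  # proxies, years, calibration window

def ar1_series(n, rho=0.9):
    """A trendless AR(1) 'red noise' stand-in for a proxy record."""
    x = np.empty(n)
    x[0] = rng.standard_normal()
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def pc1(data, window):
    """Leading principal component after centering each series on `window`."""
    centered = data - data[window].mean(axis=0)
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]

def hockey_stick_index(pc):
    """How far the calibration-window mean sits from the overall mean,
    in units of the overall standard deviation (sign ignored)."""
    return abs(pc[-CAL:].mean() - pc.mean()) / pc.std()

full, short = [], []
for _ in range(50):  # repeat over many synthetic "proxy networks"
    X = np.column_stack([ar1_series(N_YEARS) for _ in range(N_SERIES)])
    full.append(hockey_stick_index(pc1(X, slice(None))))         # full centering
    short.append(hockey_stick_index(pc1(X, slice(-CAL, None))))  # short centering

print(f"mean hockey-stick index, full-period centering: {np.mean(full):.2f}")
print(f"mean hockey-stick index, short centering:       {np.mean(short):.2f}")
# The short-centered figure typically comes out well above the full-centered one.
```

That, in essence, was the argument in M&M’s 2005 paper: feed persistent red noise into a short-centered principal component analysis, and the leading component comes out hockey-stick shaped far more often than chance alone would produce.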

I think you can see how the climate train wreck is a perfect illustration of all of this. But in a more general sense, it explains cognitive dissonance, and why there’s so much “la la la, I can’t hear you” when Teh Won is criticized. A similar setup is operating: his supporters all live in their hermetic cloisters, hear nothing but echoes of how wonderful Dear Leader is, and when counterevidence surfaces, the “Oh shit!” circuit wipes it out just like antivirus software.

I think it’s obvious how this sets up self-reinforcing social networks.