Monday, January 30, 2006

Use and Abuse of Hidden Biases

Recently published:
Studies presented at the conference, for example, produced evidence that emotions and implicit assumptions often influence why people choose their [fill-in-the-blank], and that [fill-in-the-blank] stubbornly discount any information that challenges their preexisting beliefs.
What would you guess goes in the blanks?

As you can read in today’s Washington Post, the terms in the two blanks above are not ‘religion’ and ‘believers,’ as you might have feared, but “political affiliations” and “partisans.” Perhaps there is still reason for concern, however: it’s no accident that politics and religion are proverbially paired as subjects to be avoided in social situations calling for the decorous sidestepping of conflict. The following phenomenon, for example, surely rings true for religious as well as political partisans:
Emory University psychologist Drew Westen put self-identified Democratic and Republican partisans in brain scanners and asked them to evaluate negative information about various candidates. Both groups were quick to spot inconsistency and hypocrisy—but only in candidates they opposed.
So what do the brain scans have to do with this? This is where it gets interesting:
When presented with negative information about the candidates they liked, partisans of all stripes found ways to discount it, Westen said. Furthermore, when the unpalatable information was rejected, the scans showed that “reward centers” in the volunteers’ brains were activated; in effect, the volunteers gave themselves feel-good pats. The psychologist observed that the way these subjects dealt with unwelcome information had curious parallels with drug addiction, as addicts also reward themselves for wrong-headed behavior.
Ah, the sweet rewards—or should we say guilty pleasures?—of uncritical loyalty and devotion.

It seems reasonable to guess that a biologically measurable tendency like this has an evolutionary basis: aside from the fact that the tendency to follow the leader of the pack surely preceded the ability to rationally question him, in terms of survival (and happiness) value Social Coherence probably gives Truth a serious run for its money.

Perhaps we can be grateful that the human brain’s primal addiction to filtered information, its natural capacity for cognitive dissonance, prevents valuable bonds from unraveling unnecessarily or prematurely; but as with any evolved tendency—whose very existence at least hints that it ‘works,’ or at least used to, at some level, for something—there are reasons not to allow it unchecked sway. (Cases in point: sex and violence.) We rightly value valiant fidelity and half-blind, long-suffering charity. But it is also written that some wrenching conversions are necessary, even if they divide families and promise not peace but the sword.

Listening—giving new and scary views fair consideration—doesn’t seem to have a lot to recommend it, since it may cost you your life, or at least your life as you know it. And yet, depending on party affiliation, we cannot help admiring and hoping to emulate the likes of Joseph and Jesus, or Spinoza and Socrates—all men who found unexamined religious and even physical lives not worth living, and preferred the truth that made them lonely but free to the loyalty that would have made them happy and prosperous.

13 Comments:

I also briefly commented on this study.

I know pretty much nothing about cognition, but it seems like this is a coping mechanism for mortal life--a shortcut for making decisions, if you will. 

Comment by Jared | 1/30/2006 08:44:00 PM  

Oh my, I look awful silly having missed that!

I agree that emotions play an indispensable role in decision-making. For virtually all decisions there is insufficient information (and computational ability) to make a 100% rational determination of the best course, and without emotional impulses to tip the balance we'd be stuck in paralyzing indecision.

What I find interesting here is the claim that emotion not only tips the balance in semi-rational decisions, but unless consciously resisted actively prevents the information collection necessary to make a more rational reconsideration of one's current position. Emotion provides a ratcheting as well as "tipping" function.
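A toy sketch of what I mean, for the programmatically inclined (purely illustrative Python; the function, thresholds, and numbers are all invented, not anything from the study):

```python
# Toy model of "emotion tips the balance, then ratchets":
# (1) an affective nudge breaks ties when evidence alone is inconclusive;
# (2) once committed, contrary evidence must clear a much higher bar.
# All names and numbers are invented for illustration.

def decide(evidence_a, evidence_b, affect, committed=None, ratchet=2.0):
    """Return 'A' or 'B'. `affect` is a small signed nudge toward A;
    `ratchet` is how much contrary evidence a committed chooser demands."""
    margin = evidence_a - evidence_b + affect   # > 0 favors A
    if committed is None:
        return "A" if margin >= 0 else "B"      # emotion breaks the tie
    if committed == "A":
        return "B" if margin < -ratchet else "A"
    return "A" if margin > ratchet else "B"

print(decide(1.0, 1.0, affect=0.1))                  # 'A': nudge breaks a dead heat
print(decide(1.0, 2.0, affect=0.1, committed="A"))   # 'A': ratchet discounts contrary news
print(decide(1.0, 4.5, affect=0.1, committed="A"))   # 'B': only overwhelming evidence switches
```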

Now if you'll excuse me I have to go watch 24 and ponder whether Jack Bauer is an addictive loyalist or self-questioning rationalist...

Comment by Christian Y. Cardall | 1/30/2006 09:06:00 PM  

Again, in my uninformed opinion, it seems like the manifestation of chosen loyalties or trust. By choosing a "team" (political, religious, or whatever) you have essentially outsourced some of your decision making. This is a natural result of the fact that we can't be experts on everything.

It makes sense to me that once you have chosen who you are "for" or "against," emotion plays a large role and you would be reluctant to re-evaluate your choice.

What interested me, which I did not bring out, was the mental pay-off that people felt when they rejected contrary information. In its extreme form, I suppose, are people who actually glory in ignorance. But its milder form is something to watch out for also--in any area, including religion.

Comment by Jared | 1/30/2006 09:39:00 PM  

Yeah, as my post indicates I found the soothing brain chemicals angle fascinating too.

It does seem very common, probably most common, that conflicting emotional stimuli rather than some overarching self-will to rationality motivate the consideration of previously rejected views or sources of information. I suppose this is why the Church cautions youth about media and friends---both potent emotional influences---even more than "alternate [intellectual] voices."

However, one potent emotional stimulus, or rather reaction, may be a sense of betrayal engendered by an isolated fact that is sufficiently incontrovertible to cast a previously trusted loyalty into doubt. This would be a case where rationality and emotion are feeding off each other almost immediately in generating a conversion. 

Comment by Christian Y. Cardall | 1/30/2006 10:19:00 PM  

These kinds of identities work the same, basically regardless of the social context. In fact, a lot of research has shown that racial and religious identities are profoundly stronger than political ones. So we should probably expect the information-filtering effects discussed above -- as well as the biochemical mechanism -- to apply to stereotypes related to those other identities just as much as to partisanship.

By the way, the mental "payoff" for rejecting contradictory information is precisely the opposite of "cognitive dissonance."

The last point I'll make on this -- one always wants to be careful about running on at the mouth when discussing topics where one has actual expertise, as for instance party ID -- is that partisanship has been shown to be a useful cognitive heuristic. People with party identifications are usually able to reach the same vote choices and issue positions as others with similar social and ideological characteristics -- but partisans expend much less mental effort in the process. Is the same heuristic effect important in religious identities? Probably -- they help us filter out moral, theological, and authority claims we would reject anyway, just more quickly.
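To make the "less mental effort" point concrete, here is a minimal Python sketch -- the candidates, issues, and weights are all made up for illustration, not drawn from any actual study -- showing how a party label can reach the same choice as issue-by-issue evaluation at a fraction of the effort:

```python
# Minimal sketch of partisanship as a cognitive heuristic: the party
# label reaches the same choice as full issue-by-issue evaluation,
# with far less "effort." All names and numbers are invented.

CANDIDATES = {
    "A": {"taxes": -1, "defense": +1, "trade": -1, "welfare": +1},
    "B": {"taxes": +1, "defense": -1, "trade": +1, "welfare": -1},
}
PARTY_OF = {"A": "Blue", "B": "Red"}

def full_evaluation(my_views, effort):
    """Score every candidate on every issue (expensive)."""
    scores = {}
    for name, positions in CANDIDATES.items():
        scores[name] = 0
        for issue, stance in positions.items():
            effort[0] += 1                    # one comparison per issue
            scores[name] += stance * my_views[issue]
    return max(scores, key=scores.get)

def party_heuristic(my_party, effort):
    """Just pick the candidate wearing the right label (cheap)."""
    for name, party in PARTY_OF.items():
        effort[0] += 1                        # one label check
        if party == my_party:
            return name

views = {"taxes": -1, "defense": +1, "trade": -1, "welfare": +1}
e_full, e_fast = [0], [0]
print(full_evaluation(views, e_full), e_full[0])   # A 8  (same choice, 8 comparisons)
print(party_heuristic("Blue", e_fast), e_fast[0])  # A 1  (same choice, 1 check)
```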

Comment by RoastedTomatoes | 1/30/2006 10:48:00 PM  

Thanks RoastedTomatoes, I wrote sloppily---I should have written something like "natural capacity for tolerating, or better, dissipating cognitive dissonance."

I like the cognitive heuristic insight. Explaining it that way reminds us that this kind of dynamic plays a very real (and not unjustified) role not just in things like religion and politics, but also in the way even physical scientists filter incoming information in their own or related fields. A biological mechanism that helps one to make fruitful bets on how to expend one's limited resources of attention is clearly valuable---though evidently it's far from foolproof!  

Comment by Christian Y. Cardall | 1/30/2006 11:20:00 PM  

Atran's In Gods We Trust actually goes through some of these issues. I was doing various blog posts on it before I got terribly busy at work. I should start them up again.

Comment by clark | 1/31/2006 02:39:00 AM  

Clark, that seems to be a book a number of people have mentioned... I guess I should look into it. 

Comment by Christian Y. Cardall | 1/31/2006 09:11:00 AM  

It's quite good. If you're already fairly familiar with the research in cognitive science and child development, then a lot of it will be repetitive. (I've found myself skimming a lot of pages.) But it really does get to the heart of most of these issues, although clearly it can't reflect data from the past 3 years.

Comment by clark | 1/31/2006 01:34:00 PM  

I'm not familiar with cognitive science or child development, so that wouldn't bother me. I see it has only 4 reviews on Amazon, but all give 5 stars. Surely he must have detractors... 

Comment by Christian Y. Cardall | 1/31/2006 02:33:00 PM  

It really isn't *that* controversial a book (unlike, say, Pinker). That's not to say everyone will agree. But I think even those who disagree would still think the book quite good.

Comment by clark | 1/31/2006 11:39:00 PM  

There is an important limitation in these kinds of fMRI studies that is rarely mentioned (you'd think those researching bias would be especially cautious about avoiding conclusions that might be biased): these studies represent a snapshot in time, and fail to account for the plasticity of the brain (and the CNS in general).

Unfortunately, as the Washington Post story is based on a conference presentation, not on a peer-reviewed publication, I cannot find details on the methodology of this study in order to provide a proper critique. (What constitutes a "partisan" anyway?)

The traditional teaching has been that the adult brain is essentially fixed, unlike that of a child or adolescent. However, current research makes it clear this is not the case. For example, adults who learn a string instrument demonstrate growth in that portion of the homunculus of the motor cortex in the frontal lobe that controls finger movement. Buddhist monks who spend thousands of hours meditating demonstrate a massive increase in the amplitude of gamma waves on EEG testing.

In other words, even accepting the results of Westen et al. at face value, it is not clear from the study that the activation of reward centers of the brain in conjunction with the rejection of unpalatable information is an inherent property of the human brain. Perhaps it is a developed characteristic of the "partisan." Or perhaps it is a normal variant of the adult brain.

The authors themselves tacitly acknowledge this by comparing their study group to drug addicts. Drug addiction is an acquired condition. While there may be some genetic predisposition to certain addictions (this is clearly the case in alcohol dependence), it has not been shown for any other commonly abused substance. It is clear, however, that substance abuse alters brain chemistry--dramatically (as it should; that is how the drugs work). Additionally, alterations in brain chemistry have been shown to lead to changes in brain anatomy on a gross scale. Any ER doc or radiologist can spot the brain of an alcoholic on a CT scan a mile away (cerebellar degeneration, cerebral atrophy, etc.). Despite the genetic component to alcoholism, these changes are only evident in those who actually abuse alcohol.

To properly do this study you would have to set up an experiment like the following:

Take several sets of monozygotic and dizygotic twins (to isolate genetic differences), separate some of them at birth so they are raised in different households (to isolate cultural differences), then perform fMRI testing serially until middle age or so. Then crunch your numbers. Only with longitudinal testing can you distinguish inherent from acquired traits.
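The logic of that design can even be sketched numerically. Here is a purely illustrative Python simulation -- all effect sizes invented, with no connection to any real data; the only standard ingredient is Falconer's classical twin-study estimate, h^2 = 2(r_MZ - r_DZ) -- showing how serial measurement could reveal an acquired component gradually diluting an inherited one:

```python
# Illustrative simulation of the proposed longitudinal twin design.
# Each twin's "reward-center activation" = genes + environment + an
# acquired drift from years of partisan exposure. Falconer's estimate
# h^2 = 2 * (r_MZ - r_DZ) gauges the heritable share at each wave.
# All effect sizes are invented for illustration.
import random, statistics

def correlation(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys))

def simulate_pairs(n, monozygotic, years, drift=0.05):
    """Trait scores for n twin pairs raised apart, measured at a given age."""
    pairs = []
    for _ in range(n):
        g1 = random.gauss(0, 1)                       # twin 1's genetic factor
        # MZ twins share all genes; DZ twins' genetic factors correlate 0.5.
        g2 = g1 if monozygotic else 0.5 * g1 + 0.75 ** 0.5 * random.gauss(0, 1)
        # Raised apart: independent environments and partisan exposures.
        t1 = g1 + random.gauss(0, 1) + drift * years * random.random()
        t2 = g2 + random.gauss(0, 1) + drift * years * random.random()
        pairs.append((t1, t2))
    return pairs

for years in (0, 20, 40):                             # serial fMRI waves
    mz = simulate_pairs(2000, True, years)
    dz = simulate_pairs(2000, False, years)
    h2 = 2 * (correlation(*zip(*mz)) - correlation(*zip(*dz)))
    print(f"wave at +{years}y: estimated h^2 ~ {h2:.2f}")  # shrinks as drift accrues
```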

Of course, such a study would still have limitations. What about the influence of Western versus non-Western culture? What about aboriginal cultures--are the results similar? And what about the psychological version of the Heisenberg uncertainty principle: does the act of serially measuring psychological traits itself alter one's psychology?

fMRI (and its less common sibling, the PET scanner) is the new darling of academic psychologists everywhere with access to scanner time. Why? Because it's an easy way to get a publication. All you have to do is get scanner time, get a bunch of volunteers, stick them in the scanner, and start showing them images and asking them questions (color preference, pornography, drug addiction, anorexia, bulimia, politics, sexual preference...just find something that no one has reported yet; if it's been reported, see if you can replicate it). Then write your paper. Obviously it's not quite that straightforward, but not far off either.

I witnessed this firsthand during a summer doing fMRI and MR spectroscopy research. I myself spent many hours in the scanner looking at stimuli, trying to "think" about what they were telling me to think about while my brain was imaged.

I think it's a useful tool, but researchers and press alike frequently make exaggerated claims based on the research, when the reality is not so simple.

Comment by Taylor Cardall | 2/01/2006 03:48:00 PM  

Taylor, thanks for the expert knowledge. Truly media accounts can way overhype things, or mangle them beyond recognition. But I'm confused as to why the elaborate time series and controls you describe are necessary in this case. I don't doubt brain plasticity in general; but 'primitive' faculties like I imagine the "reward centers" would be---are they not well-known, pretty universally localized regions, activation of which has a well-identified meaning?

And don't despise the ease of publication arising from a new empirical approach. Rejoice in the deluge of data! 

Comment by Christian Y. Cardall | 2/01/2006 10:24:00 PM  
