


Guest Columnist: Jay Ingram

The science of bias

Think you’re open to a good argument? Think again. Research shows that the side we take on topics from global warming to child vaccinations is less about facts than about how we think the world should be.

Jay Ingram

One of Canada’s best known science broadcasters, Jay Ingram is co-host and producer of Daily Planet, Canada’s first daily science show. From 1979 to 1982, he hosted CBC Radio’s science program Quirks and Quarks. He has presented on science topics for radio and television, and has won numerous awards for bringing science to the mainstream. He has written 10 books, including The Daily Planet Book of Cool Ideas – Global Warming and What People are Doing About It.

Contrary to what many scientists would like to believe, there actually is a climate change debate out there. Some don’t like to acknowledge it because a “debate” suggests two sides, each of which has equal merit. Those same scientists (many of them climate specialists but many more not) don’t rate the climate change “debate” worthy of its name, think the media are misguided to argue for equal time for both sides, and so dismiss it. Call them climate change “believers.” Whatever they think, though, it is undeniable that there is a lot of noise about climate change out there. People are talking and arguing.

Recently in Toronto, I heard Gwynne Dyer, one of the go-to guys on international issues, argue that even though there are blogs and newspapers tossing accusations back and forth, governments pay no attention to climate change “nonbelievers” — those who do not believe that carbon dioxide emissions are warming the planet. I wonder. It’s certainly not hard to find prominent political leaders in this country using familiar phrases like “the science isn’t settled” or “there’s a debate raging,” and referencing with approval the controversial author Bjorn Lomborg — all favourite rhetorical devices of what you might call the “nonbelievers.”

Those scientists willing to admit there is a debate argue: “If we just had a scientifically literate public, this would never happen.” In their minds, if everyone were able to understand the data and evaluate it critically, then the discussion or debate or whatever would be over.

The flaw in this reasoning is simple: if you think you’re going to effect change in the public by giving them more data, or wishing more data on them, and thus converting them to your view (whatever it is), you’re wrong. People respond to debates or disagreements based on what they already believe, not on how many facts they know.

The Cultural Cognition Project in the U.S., a group composed mostly of legal experts and psychologists, has published several papers on why people line up the way they do on contentious issues. Topics they’ve studied include the risks of nanotechnology, the advisability of vaccinating 12- and 13-year-old girls against human papillomavirus (HPV), and climate change.

They argue that it is possible to characterize people along two dimensions: one extending from individualist at one end to communitarian at the other, and the other ranging from hierarchical to egalitarian. There are pretty straightforward sociometric ways of determining where we stand on each. The result: we make decisions that “reflect and reinforce one or another idealized version of how society should be organized.”1

Your alignment along these two sociological dimensions (hierarchical or egalitarian, individualist or communitarian) can affect which side you take in an argument, irrespective of the facts.

It’s easy to see how your sociocultural alignment could affect how you interpret an argument. For instance, an individualist will never be in favour of greater governmental control, and so will always be suspicious of the claims of the Intergovernmental Panel on Climate Change. An egalitarian will instead argue for restrictions on commerce and industry, which they claim are forces for disparity.

There are psychological mechanisms that ensure each of us views new data through our own private lens. One is called “identity-protective cognition,” the idea that a person will always refuse to acknowledge that their favourite behaviours are damaging or even dangerous. Another human mind trick is “confirmation bias”: all of us can read two scientific papers which, while appearing equal in quality, represent diametrically opposed views, and decide that the one that favours our point of view is superior.

Part of the Cultural Cognition Project’s global warming study was to ask volunteers if a particular author (a made-up person) was a trustworthy source of information. If a participant tended toward being hierarchical and individualist, and the author was presented as being a “nonbeliever,” then yes, they judged him to be a reliable source. In this case, those who are hierarchical are presumably unwilling to see the social and governmental elites threatened by radical re-ordering. Egalitarians and communitarians of course believed exactly the opposite. The author’s supposed credentials, which were also presented to the group, had absolutely no effect one way or the other. What was important was what people believed even before they entered the experiment.

So what? Can we take anything away from research like this to help resolve disputes, whether over global warming or other issues? One step is pretty obvious, and the best media people know this well: know your audience. But more important, I think, is to accept that these data apply to you too. We all unconsciously deploy the confirmation bias when we need it. Perhaps the most important lesson here is to reexamine our own beliefs and how they skew our take on the information that floods us about the environment, public health and other issues every day.

The views expressed in this article are those of the author.


1. Kahan, D., Braman, D., and Jenkins-Smith, H. “Cultural Cognition of Scientific Consensus.” Cultural Cognition Project Working Paper No. 77 (www.culturalcognition.net)