Citizen Action Monitor

Providing people with accurate information doesn’t change minds

People holding baseless beliefs get a “rush” from sticking to their guns, even when they’re wrong.

No 1901 Posted by fw, February 28, 2017

*****

The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.

*****

“Even after the evidence ‘for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,’ the researchers noted…. The Stanford studies became famous. Coming from a group of [Stanford University psychology] academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?” —Elizabeth Kolbert, Information Clearing House

Below is a heavily abridged repost of Kolbert’s provocative essay, which goes a long way towards explaining Trump’s appeal to such a large segment of the US population. Kolbert, a staff writer at The New Yorker since 1999 and winner of the 2015 Pulitzer Prize for general nonfiction, concludes her piece on a discouraging note:

“These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter, the literature is not reassuring.” 

To read Kolbert’s complete article on the Information Clearing House’s website, click on the following linked title.

Don’t miss the link to a related article at the end of this post, which offers a positive viewpoint – it is possible to overcome your own pre-existing beliefs, i.e., your confirmation biases.

**********

Why Facts Don’t Change Our Minds by Elizabeth Kolbert, Information Clearing House / The New Yorker, February 27, 2017

People can’t think straight, the Stanford researchers concluded – a finding confirmed by thousands of subsequent experiments

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted…. The Stanford studies became famous. Coming from a group of [Stanford University psychology] academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding.

“Reasonable-seeming people are often totally irrational.” How did we become this way?

As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

As it turns out, reasoning evolved gradually in humans

In a new book, The Enigma of Reason (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.

Reasoning enabled humans living in groups to cooperate in solving problems, giving them an adaptive advantage over those humans (and animals) who lacked this trait

Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

Hypersociability can be a double-edged sword – going along to get along can lead to faulty, biased thinking, a.k.a. “confirmation bias”

Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments.

Case Study – The Stanford study of capital punishment

One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The students [in both groups] were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics.

Pro-capital punishment students rated the study finding a deterrent effect more highly – and vice versa for students who opposed capital punishment

The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse.

What’s more, reading the studies strengthened each group’s initial beliefs

At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.

So much for the power of reasoning

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner.

Basing one’s judgment on pre-existing beliefs shared by one’s “in-crowd” can lead people to dismiss evidence and engage in high-risk behavior

To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”

While humans are adept at spotting weaknesses in others’ reasoning, we tend to be blind to our own weaknesses

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.

…..

“As a rule, strong feelings about issues do not emerge from deep understanding” – Steven Sloman and Philip Fernbach, The Knowledge Illusion

Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia “annexed” the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

If I go along to get along with my group’s baseless opinion, then my opinion is baseless, even though we all may feel more confident in our shared view

Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.

When asked to explain the reasons for their stance on political issues, people have trouble justifying the high level of certainty in their judgments

“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe…. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.

If we spent less time pontificating and more time working through the implications of our beliefs we might realize how clueless we are

Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”

…..

Persistent baseless beliefs can be deadly – e.g., refusing to be vaccinated

In Denying to the Grave: Why We Ignore the Facts That Will Save Us (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous. Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place. “Immunization is one of the triumphs of modern medicine,” the Gormans note. But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved. (They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)

Why people feel a “rush” when sticking to their guns, even if they are wrong

The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

Providing people with accurate information just doesn’t work – the challenge of communicating with them remains

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them. There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous. (Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.) But here they encounter the very problems they have enumerated. Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. “The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”

In this age of a Liar-in-Chief, Steve Bannon, and “alternative facts”, rational voices are at a loss

The Enigma of Reason, The Knowledge Illusion, and Denying to the Grave were all written before the November election. And yet they anticipate Kellyanne Conway and the rise of “alternative facts.” These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter, the literature is not reassuring.

Elizabeth Kolbert has been a staff writer at The New Yorker since 1999. She won the 2015 Pulitzer Prize for general nonfiction for The Sixth Extinction: An Unnatural History.

This article appears in other versions of the February 27, 2017, issue, with the headline That’s What You Think.

SEE ALSO

Confirmation Bias (Part Two): How to Overcome Your Own Pre-Existing Beliefs by Peter M. Sandman, The Peter Sandman Risk Communication Website, November 11, 2016 – “What counts is knowing that confirmation bias significantly distorts how we process information; knowing when it’s important to find a less biased way to check out whether we’re actually right or not; and knowing some ways to help overcome our confirmation bias when we have decided we should.” In this article, Dr. Sandman provides a definition of “confirmation bias” and then offers 16 ways to overcome your own pre-existing beliefs. In Part One, he explained how confirmation bias works and how risk consultants can help leaders to improve the chances of getting their message through to an audience that is predisposed not to hear their message.
