Citizen Action Monitor

Why do Trump supporters cheer for a man who is publicly known to be a racist and congenital liar?

Attribute it to their need to conform to fellow supporters’ core belief that he speaks for them. So much for critical, independent thinking.

No 2509 Posted by fw, August 22, 2019

To access links to other posts by Nate Hagens about The Human Predicament, click on the Tab titled Teachings of Dr. Nate Hagens about The Human Predicament – Links to Posts

Dr. Nate Hagens

“In a famous experiment on ‘authority bias’ called the Milgram experiment, 65% of volunteers delivered what they thought were fatal doses of 450-volt electric shocks to human subjects under the calm assurance “to continue” from the experiment administrators, who were doctors in lab coats. The other 35% of participants still delivered very high-voltage shocks to the point of unconsciousness, but refused to administer the highest-level shocks. Interestingly, none of these 35% insisted that the experiment itself be terminated. Nor did they leave the room to check that the victim was okay without first asking for permission. In interviews prior to this experiment, respondents predicted that only the most sadistic 1% of participants would be willing to hurt another participant with electric shocks. Yet 100% of the participants did administer shocks. And two-thirds were willing to kill a stranger solely because they were instructed to by another stranger in a white coat. Presumably, deferring to authority relieves the individual of responsibility for their own actions or inactions, an important, if unsettling, bias. So much for independent thinking.” —Dr. Nate Hagens

This passage was taken from Hagens’ video, Self-Blindness, Part 2/2 of (Cognitive Biases), Brain and Behavior, 8/10. Hagens picks up where he left off in Self-Blindness, Part 1/2 of (Cognitive Biases), Brain and Behavior, 7/10. In Part 1/2, Hagens focused on the “doozies” of “cognitive dissonance” and “optimism bias,” worried that these cognitive biases risk precluding our taking action on real threats to our survival.

In today’s Part 2/2, Hagens looks at social conformity blind spots, including: the intense pressure for social conformity, conformity bias to core beliefs of groups, authority bias, loss aversion, confirmation bias, and moral bias. Our brain’s cognitive blind spots are what they are, and as Hagens puts it: “These are the brains we’ve got to work with. It’s best we understand them.”

As mentioned in my August 5 post, Dr. Nate Hagens, professor at the University of Minnesota, has uploaded a set of 34 videos on the nature of our human predicament. These videos, which he uses in his teachings and public lectures, are grouped in three sections — Brain and Behavior; Energy and Economy; and The Big Picture. They are freely available for all on YouTube, offering a total of 6 hours of viewing time. The August 5 post provides a list of all 34 titles along with every embedded video.

Posted below is an embedded video of Hagens’ Video 8, Segment 2 of 2 on Self-Blindness. Accompanying this video is my transcript featuring added subheadings, highlighted text, added hyperlinks and images. Alternatively, this video, without the transcript, can be viewed by clicking on the following linked title.

**********

Self-Blindness – Part 2 (Cognitive Biases), Brain & Behavior 8/10, by Nate Hagens, Reality 101 – UMN Nexus One, January 31, 2019 (10:02)

TRANSCRIPT

Lots of cognitive biases revolving around social conformity issues.

Our need to conform with others around us can cause us to mistrust our own sensory experiences

In the graph on the left depicting the famous Asch experiments, participants had to match the lines in Exhibit 1 with the lines in Exhibit 2. In a controlled trial with no pressure from other participants, less than 1% of the participants gave an incorrect answer. But, when there were confederates — fake participants in cahoots with the scientists running the experiment — only one-third of the responses agreed that line C was equal in length to the line in Exhibit 1.

In a related study called the Smoke-Filled Room, experimenters released smoke outside of a closed door. When left alone in the room, 75% of people would report smoke unexpectedly coming from under the door, and for good reason since dying in a building fire is one of the few immediate dangers we’re still subject to in the modern world. However, if a person was in the room with two other people who ignored the smoke, 90% of them ignored it as well. Basically, the risk of some minor social awkwardness was more threatening than the danger of fire.

Being part of a group that witnesses a worrying incident may cause us to delay our decision to get involved

It’s almost as if our physical world concerns are considered adequately dealt with once others around us have the same information. This is a huge problem as, not so metaphorically, we now live in a smoke-filled world.

Extrapolating social conformity biases

Re conformity bias to core beliefs of groups — It’s not necessary for a core belief of a large group to be true for members of a group to believe it

In human societies, the organizing power of a belief is far more important than the validity of the belief itself. Indeed, it’s not a requirement that the central beliefs be valid or even be sane, because their truth is relatively unimportant. What is vital is that the belief be reliably held across a large population.

Two core survival mandates of humans who evolved to live in small tribes — social cohesion and relative status and loyalty

Like other social primates, humans evolved to exist in small tribes in which two core survival mandates coexisted. Number one – the cohesiveness of the tribe against outgroups, predators and other challenges. And number two – the relative status and loyalty of members within the tribe.

Early human tribes were limited to about 150 close relationships to ensure social cohesion and loyalty

The size of primate tribes is generally limited by the size of the primate brain, because knowing another tribe member, and keeping track of what they’re up to, takes a lot of brainpower. Humans have the largest primate brains, and thus had the largest tribes. But, still, this was limited to roughly 150 close relationships before the ability to keep track of the others broke down.

Biased belief in the reliability of the beliefs of others allowed humans to form larger and more powerful social units

However, our species stumbled on a way of getting past that limit of 150, resulting in our ability to form far larger and more powerful units, which out-competed and/or killed off all the other human species and some competing large predators. This was the supernormal stimulus: a belief. A good case can be made that belief in the reliability of the beliefs of others is the basis of global human society, regardless of how valid those beliefs are.

The “authority bias” — Deferring to authority relieves individuals of taking responsibility for their own actions

In a famous experiment on “authority bias” called the Milgram experiment, 65% of volunteers delivered what they thought were fatal doses of 450-volt electric shocks to human subjects under the calm assurance “to continue” from the experiment administrators, who were doctors in lab coats. The other 35% of participants still delivered very high-voltage shocks to the point of unconsciousness, but refused to administer the highest-level shocks. Interestingly, none of these 35% insisted that the experiment itself be terminated. Nor did they leave the room to check that the victim was okay without first asking for permission.

So much for independent thinking.

In interviews prior to this experiment, respondents predicted that only the most sadistic 1% of participants would be willing to hurt another participant with electric shocks. Yet 100% of the participants did administer shocks. And two-thirds were willing to kill a stranger solely because they were instructed to by another stranger in a white coat.

Presumably, deferring to authority relieves the individual of responsibility for their own actions or inactions, an important, if unsettling bias.

These are my two dogs: Frank and Maisy. If there’s a stick lying around, neither of them is interested in it. But once I call attention to the stick, thus assigning value to it, they will fight over it. And whoever gains control, will vigorously defend it from the other one. This concept also applies to humans. It’s called “loss aversion.”

Humans are biased to respond much more to losses than to gains

Humans respond much more to losses than to gains. Suppose you start with $10,000 and gain $1,000 while researchers scan your brain to measure how happy you are and how your brain responds. If you then lose $1,000, dropping from $11,000 back to $10,000, your brain responds far more negatively to that loss than it responded positively to the equivalent gain.

We can see how this dynamic evolved, because a week without food might have meant death. But an extra week of food didn’t mean an extra week of life.

As we get into energy and money in Nexus 9, we’ll see how this loss aversion dynamic will likely be of central importance in coming decades.

Confirmation bias – tendency to seek evidence that confirms what we already think and to ignore disconfirming evidence

Okay, this is a big one and one you’ve probably even heard about in high school. Sir Francis Bacon famously said: “Man prefers to believe what he prefers to be true.” I believe this quote is true, and I prefer it to be.

Confirmation bias is displayed when we selectively search our environment for evidence and information that confirm what we already think. Implicitly then, we also ignore, and therefore, don’t spend any time seeking out information that does not support our existing beliefs.

Someone who thinks global warming is a socialist scam is not going to be reading the latest IPCC forecast. Someone who thinks humans are going to go extinct because of climate change isn’t going to read any hopeful mitigation literature.

You can imagine lots of different ways this manifests. People who believe technology is going to solve all our problems are the first to gravitate towards press conferences advertising Mars travels within a decade.

Confirmation bias is a real thing. We’re all susceptible to it.

Moral bias – we will do “the right thing” as long as it isn’t too demanding

Here’s a bias we don’t often think of – moral bias. There are lots of them. When presented with information about a starving child, we will commit to donate money to help her. When shown a picture of her with her brother in a similar condition, however, our intended donation goes down by 25%. When we learn there are two children in need, logic would suggest our donation should increase, not decrease. If you then add a whole village of children who need help, the intent to donate drops dramatically.

It’s not yet known why this paradoxical behavior exists, only that it does. It may have to do with the mirror neuron system which seems to underlie part of the biological basis for empathy – If we see someone feeling pain or other stress, our brains tend to light up as though we feel it too.

Small tribal groups never had a need to feel empathy for very large groups, so being empathetic to large groups did not become part of human behavior

However, as tribal primates, there was never a need to feel empathy for vast numbers of others, or vast numbers of other species, for that matter. Stone age minds, modern predicament.

Okay. This is just a selection of the many discovered and researched biases in human behavior. There are many, many more.

Okay, let’s summarize this video.

  • We didn’t evolve to need to understand and see reality in the true sense.
  • We have a catalog of blind spots and biases as long as your arm. Some we covered here were cognitive dissonance, optimism bias, the intense pressure for social conformity, loss aversion, confirmation bias – but there are many others.
  • Being aware of our blind spots is necessary for some of us – most of us – to avoid some of the societal challenges we face in coming decades.
  • These are the brains we’ve got to work with. It’s best we understand them.

We’re all guilty of each and every one of our cognitive biases

In this survey of prevalent cognitive biases, I am guilty of each and every one of them, just as most of you are. Being aware of them makes me both humble and a little bit more tolerant. It helps me understand where other people are coming from, and, when I make very important decisions about my life or future, I try to be aware of these blind spots. I have “little Nate” looking over my shoulder, kind of chiming in periodically.

Humans evolved to be wrong, but knowing this, our modern human brains have the intellectual capacity to override these limitations

In summary, we thus evolved to be wrong. The world we perceive is NOT the physical world which exists. Yet, for the first time we have learned this truth, and so are not, in principle, absolutely limited by these built-in errors.

So as college students, who may have just had their first exposure to cognitive biases:

  • How did you feel about being told that you have biases?
  • What are the implications of having these – and other – biases when dealing with our environmental challenges, or social inequality, or your own personal relationships?
  • Spend 10 minutes meditating this week. Try to “hear” the various modules in your brain as they process and relay information. What did you learn?

As an add-on to this video, there’s a particular field of bias that has special relevance to our planetary predicament, which will be the subject of the next video. From your current understanding of human brain and behavior, can you guess what it might be?

FAIR USE NOTICE – For details click here
