The Bias Blind Spot

In my novel, Alemeth, I told the story of an antebellum family who ran a cotton plantation in Mississippi.  They owned sixty African-American slaves.  Their belief in the righteousness of the southern cause was based on their view that slavery was sanctioned by Holy Scripture.  Essentially, they believed that God had charged them with a duty to perpetuate the peculiar institution.

One of the mysteries that attracted me to this true story was how so many people could have been wrong about an institution that nearly all mankind today agrees is evil.  I wanted to understand how their wrongness came to be.  Of course, this family was not alone.  Their neighbors, their churches, their doctors, their lawyers, and their newspapermen shared their views.  At the risk of gross oversimplification, it is at least roughly true that about twenty million northerners thought slavery wrong, and five or six million southerners thought it right.

I’m not talking about related questions, like whether slavery was worth going to war over, or whether it justified secession; I’m not talking about whether there were some in the north who supported slavery, or who were racists, or whether there were individual abolitionists in the south.  I’m talking about whether people thought slavery an evil that should be immediately abolished, or an economic necessity that ought to be preserved for the foreseeable future – and on that point, the people of the South showed amazing agreement with each other.  Consider one indication of just how geographically lopsided the distribution of opinions was: the large number of Christian denominations that split into separate northern and southern churches over the slavery question.

If every person had simply thought out the rightness and wrongness of it for himself, there’d have been a thorough mixture of opinions in every state, north and south.  Differences of detail notwithstanding, the geographically lopsided distribution of opinions on the central question – the very question that made civil war possible – convinces me that something else was going on.

How was it that nearly all the good white people lived up north, and nearly all the bad ones lived in the south? 

Okay, not really.  I know that couldn’t be true.  So I wonder, how did it happen that nearly all the smart people lived up north, and nearly all the stupid ones lived in the south?

Okay, really, not that either.  While I was mulling this mystery over, my daughter Jen forwarded me a blog post by someone I don’t know – his name is Sean Blanda – called “The ‘Other Side’ is Not Dumb” (https://medium.com/@SeanBlanda/the-other-side-is-not-dumb-2670c1294063#.blt9vqmzr).  I think Sean is right.  On average, surely the people of the south were as good, and as smart, as their northern counterparts.  So perhaps being “right” or “wrong” has little to do with how smart you are?  Or how good you are?

Was it self-interest, tradition, and peer pressure that caused the people of the south to descend into such widespread error?  A sort of groupthink, perhaps, arising from common backgrounds and perspectives?  Fair enough.  But what, then, about the beliefs of those in the North?  Was the correct position of the north regarding slavery due to an absence of groupthink, self-interest, and peer pressure there?  Was the south riddled with conditions that contributed to southern bias, while the north was able to arrive at the “right” answer because it was free of any such influences?

Maybe so.  Maybe we could all agree about the errors and biases of the south, now that we all agree about the evils of slavery.  But what of those controversies on which we don’t yet agree?  In political election cycles, the country always seems split fairly evenly between Republicans and Democrats.  Is it possible that one side’s views are explained by cultural bias, but the other side’s views are not?  According to the Pew Research Center, about 30% of the world’s population is Christian, and a somewhat smaller share (about 22%) is Muslim.  Is it possible that the 30% is simply better informed than the 22%?  That the 22% are smarter than the 30%?  That one view is the result of cultural biases and the happenstance of birthplace and family influence, but the other view is not?  Are the debates over gun control, abortion, global warming, vegan diets, and same-sex marriage debates between smart people and stupid people?  Between the good people and the bad people?

Finally, what are the odds that, on each and every issue, it’s ME who recognizes the truth (because it really is the truth), while my opponents’ incorrectness can be explained by bias? 

In Being Wrong (HarperCollins, 2010), Kathryn Schulz writes, “Let’s say that I believe that drinking green tea is good for my health.  Let’s also say that I’ve been drinking three cups of green tea a day for twenty years, that I come from a long line of green tea drinkers, and that I’m the CEO of a family-owned corporation, Green Tea International.  An impartial observer would instantly recognize that I have three very compelling reasons to believe in the salubrious effects of green tea, none of which have anything to do with whether those effects are real…  I have powerful social, psychological, and practical reasons to believe in the merits of green tea.”

Makes sense, doesn’t it?  In the example just given, Schulz is writing about what would be obvious to an impartial observer.  But more important is what’s obvious to partial observers – to those who are convinced that the other side is wrong.  If we’re talking about people we’re convinced are wrong (like those who supported slavery), it’s natural to believe that their views are shaped by – and therefore depend on – their peculiar life experiences.  Yet when it comes to the things we have decided we’re right about, we’re unable to see that our beliefs are a function of our own life experiences in the same way.  Because we believe that the Statue of Liberty really towers above New York Harbor, we believe it is objectively real, regardless of our subjective perspective, culture, or bias.  To us, everything that’s “obviously true” is like another Statue of Liberty.

“Sure, it may be that my father was a civil rights activist and my mother worked for George McGovern, but I hold my liberal views because they are objectively right…”  Or, “Sure, it may be that I grew up reading the Christian Bible, but my faith in Jesus has nothing to do with that happenstance; I have faith in Jesus because he has revealed himself to me…”  When people believe that something is true, they believe it not because of anything about themselves or their own backgrounds; they believe it because – well, because it’s true.

At the same time, because we believe that slavery was wrong, we are quick to conclude that those who supported it did so only because of such cultural bias.  This readiness to see bias as the reason for the (erroneous) beliefs of others, while being unable to see that bias may explain why we ourselves believe certain things, is something professional psychologists call the “bias blind spot.”  A quick Google search on “the bias blind spot” turns up a host of scientific studies of this phenomenon.  Many have confirmed it: we are quick to ascribe bias (from whatever source) to those we disagree with, while denying it in ourselves.

In a May 2005 article in Personality and Social Psychology Bulletin (Ehrlinger, Gilovich, & Ross, “Peering into the Bias Blind Spot: People’s Assessments of Bias in Themselves and Others”), the authors explored two empirical consequences of the phenomenon: First, that people are more inclined to think they are guilty of bias in the abstract than in any specific instance.  (“Sure, I recognize that I’m capable of bias; but doggone it, not when it comes to this.”)  Second, that people tend to believe that their own personal connection to a given issue is a source of accuracy and enlightenment – while simultaneously believing that such personal connections, in those who hold different views, are a source of bias.

I find the second point especially interesting.  Think about it:  As to the beliefs I hold most dear on some controversial subject, do I have personal experiences that are relevant?  If so, do I consider those personal experiences as giving me special insights into the matter?  Now ask the same question about the typical person on the other side of that issue.  Do I see that person’s different experiences not as sources of valuable insight, but as reasons to explain away his or her error?  Personally, I’ve been guilty of this double standard often.

Schulz points out that when we try to understand why people disagree with us, our first tendency is to assume they don’t have all the information we have – something Schulz calls the Ignorance Assumption.  So we try to educate them.  If our efforts to educate them don’t work – if they adhere to their mistaken beliefs even after we’ve given them the benefit of our own information and experiences – then we decide they must be less able than we are to properly evaluate the evidence.  (In other words, we decide they’re just not as smart as we are – Schulz’s “Idiocy Assumption.”)  Finally, if we become convinced they’re actually smart people, we find ourselves considering them morally flawed – selfish at best, just plain rotten at worst (Schulz’s “Evil Assumption”).

At the end of the day, it might just be that I’m right about a few things.  But if so, I doubt it’s because I’m smarter, or a better person, than those on the other side.  And it’s certainly not because I have no cultural biases of my own.

I’ll end by quoting Schulz one more time: “If we assume that people who are wrong are ignorant, or idiotic, or evil – well, small wonder that we prefer not to confront the possibility of error in ourselves.”

– Joe


2 thoughts on “The Bias Blind Spot”

  1. Interesting and thought provoking as usual, Joe.  I forget where I read this, but the article stated that the only reliable demographic of people who changed their minds when presented with new facts was scientists.  Smarter (non-scientific) people dug deeper to find more obscure facts or arguments to counter the new facts and hold on to their beliefs.  And as I recall, less intelligent people simply ignored or dismissed the new facts out of hand.  Interestingly, engineers and other technically educated people did not share the same “plasticity” of mind as research scientists.  I think there is something special about the mindset of scientists that predisposes them to change with new information.  Evidence leads to the conclusion – not the other way around.  The really good ones are always questioning the existing paradigm.  They would like nothing more than to overthrow it with a theory that does a better job of incorporating known facts and makes better predictions.  Unlike Political Science – which is more like an oxymoron…

    1. Thanks, Tom.  I agree.  I think I read something similar about scientists, and I think that, on the whole, it’s very probably true.  That said, I do think scientists, as human beings, are sometimes driven by the same experiential biases and peer pressures as the rest of us, so I like how you refer to “the really good ones.”  Sometimes, I think the worst of them can demonstrate a lot of smugness about their claims to objectivity.  But I wish we all had as much curiosity as I think the average scientist does.
