Thoughts and Opinions

Rip Van Winkle Returns

Sometimes I feel like Rip Van Winkle.  A career in civil rights and employment law kept me in the midst of political issues and controversies for over thirty years, but upon my retirement in 2003, I decided to enjoy a less stressful life:  to do so, I would isolate myself from the news.  So I went into a deep sleep.  For sixteen years now, I’ve been dreaming of beautiful things.  During my slumber, I played with grandchildren, I gardened, I wrote historical fiction, I read some of my daughter’s old college psychology texts – nothing that would raise my blood pressure.  I especially enjoyed reading about the psychology of human error, and confirmation bias.

In Being Wrong (HarperCollins, 2010) Kathryn Schulz quotes the French essayist Montaigne as asserting that people “are swept [into a belief] – either by the custom of their country or by their parental upbringing, or by chance – as by a tempest, without judgment or choice, indeed most often before the age of discretion.”  In keeping with that view, Schulz asserts that the single best predictor of someone’s political ideology is their parents’ political ideology.  That had certainly been true in my case, and as I researched the actual lives of the players in my historical fiction, I had discovered how true it was for them as well.  I was forced to ask myself the difficult question of whether I believed what I did, not because it made objective sense, but because of an inherited, or at least culturally guided, confirmation bias of my own.

Now, even when asleep, our bodies can sense the presence of heat, cold, or other stimuli, and in a similar way, though I was asleep, I did hear snippets of the outside world from time to time.  The classic movie I’d recorded (so I could fast-forward through campaign ads) having ended, I’d be startled when the TV screen suddenly defaulted to the late news.  In the car, entranced by Smetana’s Moldau or Charles Mingus’s rendition of “I’ll Remember April,” I’d be jarred awake by a piece of headline news before my hand could turn the radio off.  So I wasn’t totally asleep; not totally unaware of what was going on in the modern world.  Just mostly so.

Now, think what you will of him, but few will deny that Donald Trump makes for engaging theater.  So no surprise that occasional sound bites of last summer’s slugfest between Donald and Hillary began to intrude on my dream, appealing to my own interest in politics the way a voice whispering “one little drink won’t hurt you” might appeal to an alcoholic, even after sixteen years on the wagon.  And – no one will be surprised to hear this – since awakening from my sixteen-year political slumber, I’ve been feeling like old Rip Van Winkle himself, rubbing my eyes in disbelief at how much has changed during my absence, aghast at just how divisive this country became while I slept.  My conservative friends had become so opinionated and cocksure that I found myself trying to articulate liberal replies, in an effort to moderate their extremism.  My liberal friends had become so arrogant and dismissive of their opponents that it seemed I had to join them or become their enemy.  Two months ago, I started this blog as the only response I could think of to a world that seemed to have gone out of control as I slept.  And because of this blog, I have started, once again, to be sucked into the vortex of the news.

I still know little of what went down during my reverie.  As I emerge from my slumber, I imagine myself having something like Van Winkle’s naivete.  Perhaps that naivete will be apparent to others, as I dare to comment on the modern political scene.  But let the chips fall where they may, I’m going to comment – because I’ve decided my long slumber may actually be of help to the mission at hand.

My brother James alerted me today to an article I found most interesting, and this article is actually the focus of my post today.  But before I get to it, I’m afraid that, for some on the right, it might be an immediate turnoff to mention that it came from Vox.  Vox is a news source I’d never heard of until today, as it was created during the period of my deep slumber.  From what I’ve been able to gather this afternoon, it’s apparently viewed by the right as being very left.  So I feel constrained to offer, first, a word of caution about sources.

In Kathryn Schulz’s catalogue of the types of non-rational, illogical thinking to which we human beings are prone,  she points out that “[i]nstead of trusting a piece of information because we have vetted its source, we trust a source, and therefore accept its information.”  That’s understandable in some cases, but not a good thing for one aspiring to real communication across the political divide.  And in this case, I feel I have an advantage – having never heard of Vox before, I hold no biases for or against the source.  I neither trust it nor distrust it.  I can only consider what I read in it on its own merits.

Anyway, I hear today that Ezra Klein launched Vox in the eleventh year of my slumber with an article titled “How Politics Makes Us Stupid.”  I haven’t read it, but it apparently focused on the scientific work of Dan Kahan, a professor at Yale Law School whose earlier work showed that the ability to reason soundly, particularly about political subjects, is undermined by the need to protect one’s core beliefs.  Hence, “how politics makes us stupid.”  Now, lost as I may have been in the land of Nod, that came as no surprise to me: it sounded like run-of-the-mill confirmation bias, and I had digested the concept of confirmation bias years ago, before ever going to sleep, along with half a package of Oreo cookies.  But of greater interest to me is what appeared in Vox this week.  Klein has now reported on the work of Professor Kahan again, this time describing a way to escape our human susceptibility to confirmation bias:  CURIOSITY.

Apparently, as described by Klein (http://www.vox.com/science-and-health/2017/2/1/14392290/partisan-bias-dan-kahan-curiosity), Kahan’s new research shows that some of us – on both the right and the left – are more scientifically curious than others.  And that those of us who are scientifically curious are less prone to confirmation bias – or, to use Kahan’s phrase – less prone to let our politics make us “stupid.”  The point appears to be that confirmation bias interferes with sound thinking on both the left and the right, but that curiosity – a trait that exists on both the left and the right – is the common predictive factor that makes us less susceptible to the “stupidity” toward which confirmation bias pushes us.

Now, I haven’t vetted the source.  I haven’t even read Kahan’s actual findings.  I know better than to rely on the second-hand report of any intermediary, trusted or not.  But I have to confess, I’m doggone interested.  For the past several weeks, I’ve been asserting that political debate is for people who want to prove that they’re right in the eyes of a judge – not for people who want to convince people with whom they disagree.  In a debating class, there’s some sort of third-party judge.  In a courtroom, there’s a judge or jury.  In a political debate, there’s the undecided viewing public that is the effective judge.  In every case, the efforts of the debaters are designed to win points with the third-party judges by making the other side look erroneous, ignorant, or (best of all) just plain foolish.

How surprised I was, upon waking from my slumber, to discover that modern internet discussion is conducted the same way – as if there were some third-party judge present to determine a winner.  After a thirty-year legal career, I can tell you that I never saw a plaintiff convince a defendant she was right, nor a defendant convince a plaintiff that he was.  Rubbing the sleepy dust from my eyes, I had to wonder what these internet debaters thought they were doing in their efforts to “win an argument” (by showing how stupid their adversaries were) in the absence of any third-party judge.  Weren’t they quite obviously driving their opponents deeper into their convictions?  In Being Wrong, Schulz describes exactly that phenomenon – how such efforts to “persuade” actually have the opposite effect.  And I’ve been saying that, surely, it makes more sense to conduct political discourse with a sincere attitude of wanting to learn from one’s adversaries, rather than proving (to ourselves?) how stupid our adversaries are.  I’ve been asking whether, paradoxically, a sincere desire to learn from someone else isn’t more likely to result in his or her learning from us at the same time.  And I’ve been wondering if there isn’t some psychological study that backs up that theory.

So here, today, comes my brother James, providing me exactly the sort of scientific study I’ve been looking for.  A desire to learn — curiosity — could it really make us less susceptible to confirmation bias?  Perhaps this is all just confirmation bias, on my part, fitting as well as it does with what I already suspected. So I want to check into it further.  I will check into it further.  But in the meantime, doggone it, it seems clear to me that curiosity must be the remedy, just as Kahan and Klein say.  If being curious isn’t close to being open-minded, and if being open-minded isn’t essential to learning, and if learning isn’t something we should all strive to experience, then what is?  And how come there’s all this debating and berating that has been shown to keep us from ever learning anything?

The world has changed a great deal in my years in the land of Nod.  Now that I’m awake, feeling (like old Rip Van Winkle) a good bit naïve and ill-informed, with no real clue about the strange new world I find around me, I am very, very thankful for that slumber.  For after thinking over what Kahan’s research has apparently shown, I believe my deep sleep may have done me a huge favor; by being politically asleep for these sixteen years, what strikes some (myself included) as naivete may be just what I need to be curious about what’s been going on in the world; curious about who’s right, and who’s wrong; and ready, and willing, to learn from people who aren’t already my mental clones.

I’ll close by applauding another website I learned of just today: The Lystening Project.  The Lystening Project is an innovative approach to fostering open-mindedness and civility in political discourse, conceived of by a class of San Francisco high school students in what is surely a kindred spirit to that of We May Be Wrong.  Check out their website for yourself, but from what I gather, their idea is to assess participants’ political leanings through a short survey of opinions, and then to pair them with people of opposing views for dialogue across the divide.   I especially like the “oath” that participants must take before undertaking such paired dialogue:

“The Lystening Oath”

I will take a moment to connect with the other person as a human being first.

I will enter this conversation with the goal of understanding, not convincing.

I will not vilify, demean or degrade others or their views.

I will enter this conversation with goodwill and I will assume goodwill on the part of the other person.

I will do my best to express my own views and how I came to believe them.

Reminds me of the rules for the WMBW Forum.  I can’t imagine a better oath to ask people to take, and I thank the students’ advisor, Elijah Colby, for bringing their project to my attention.  Check them out for yourself at https://thelysteningproject.wixsite.com/thelysteningproject. Help if you can.

— Joe

Please follow, share and like us:
Facebook
Follow by Email
Pinterest
Google+
https://wemaybewrong.org/wp/2017/02/09/rip-van-winkle-returns/
Twitter
RSS

The WMBW Forum

The We May Be Wrong Forum is now up and running.

We’re hoping it will be a different kind of discussion Forum — not one in which people belittle those they think are wrong.  Not one where people are trying to “win a debate” or feel good by surrounding themselves with people who think like they do.  Rather, a Forum in which people can participate in discussions with people who may disagree, but without fear of being mocked or ridiculed, because they aren’t contestants in debate, but partners in a search for understanding.

Impossible?  Maybe.  Altruistic?  Of course.  But we think of it as a worthwhile experiment in this age of rampant incivility, and we’re moving forward.

So you’re invited to check out the new WMBW discussion Forum, and help us make the experiment succeed.

–Joe


MLK and the Dream

I had a dream last night;  I woke up this morning thinking about it. And my train of thought went from there to Martin Luther King’s dream.  Remembering the late civil rights leader led me to contemplate a sort of ironic coincidence: that last Monday – the 16th – the Martin Luther King Holiday – was the very day I made the final revisions to my novel, Alemeth, and began the process of formatting it for the printing company.

Completion of the novel is the fulfillment of a dream.  I could trace its origins back to the early 1960s, possibly even to the very year of King’s famous 1963 speech.  That was when my grandmother first showed me some of the letters my great uncle Alemeth had written home from the front lines during the Civil War.   Or I could trace its origins to a dinner that Karen and I had with our friends Roger and Lynda ten years ago, when a lively discussion got me thinking about a novel that explored (or even tested) the differences between fiction and non-fiction.  Or I could trace it back seven years, when I chose to write Alemeth’s life story.  No matter how far back I go to date the novel’s origins, it has been many years in the making. Somewhere along the way, a novel based on Alemeth’s life became a dream, and it seemed ironic that the dream had finally been fulfilled on the Martin Luther King Holiday.

But the coincidence seemed ironic for reasons deeper than that my novel has been sort of a dream for me.  It seems ironic because the themes of King’s famous “I Have a Dream” speech and the themes of Alemeth are so closely related.

For King’s dream, we need scant reminder.  “[O]ne day… even the state of Mississippi… will be transformed into an oasis of freedom and justice.” “[M]y four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character…” My great uncle, Alemeth Byers, the title character of my novel, was the son of a cotton planter in Mississippi.  The family owned sixty slaves when the Civil War began.   In calling Mississippi “a state sweltering with the heat of injustice, sweltering with the heat of oppression,” Martin Luther King had been talking about my family.

Early in my research into Alemeth’s life, I began to confront what, for me, was terribly unsettling.  I knew my grandparents to be among the kindest, most “Christian,” most tolerant people I had ever known.  But as I grew older, my research into their lives, and into their parents’ lives, revealed more and more evidence of racial bigotry.  In old correspondence, these prejudices pop up often – and most alarming of all – when I looked honestly at the historical record, I saw those prejudices getting passed down, from generation to generation.

In one respect, I felt I was confronting a paradox of the highest order.  My mother was kind and loving, and my sense was that her kindness was in large part because her parents had been kind.  My instincts applied the same presumption to their parents, as if “loving and kind” were a trait carried down in the genes, or at least in the serious study of Christian Scripture.  (My grandmother was a Sunday School teacher; my childhood visits to her house always included Bible study.)  Presuming that my great grandparents were as kind and loving as my grandparents, and knowing that they, too, had been devout Christians, I found it paradoxical that all this well-studied and well-practiced Christianity not only tolerated racial bigotry but, in great uncle Alemeth’s day, was used to justify a war to preserve human bondage.  Frankly, it made no sense.  I wondered: How did these people square their Christian beliefs with their ownership of so many slaves?  With their support for a war intended to preserve their “property rights” in these other people?

It was even more unsettling, then, to realize how the “squaring” had occurred.   George Armstrong’s The Christian Doctrine of Slavery (Charles Scribner, New York, 1857) made a fascinating read.  That work expounded, in argument after argument, based on scripture after scripture, how God had created the separate races, given Moses Commandments which made no mention of slavery, instructed the Israelites to make slaves of their heathen enemies (Leviticus 25:44-46), sent a Son to save us who never once condemned slavery though he lived in its midst, and inspired Saint Paul to send the slave Onesimus back to Philemon with instructions to be a good, obedient slave to his master.  Armstrong’s work was perhaps the most impactful, but by no means did it represent an isolated view.  My research uncovered source after source that made plain how the slave owners of the ante-bellum South were able to square their support of slavery with their Christianity: they did so by interpreting Christian Scripture as supporting the institution.  Indeed, in some sermons of the day, the case was made that being a good Christian required a commitment to the defense of slavery, because civilized white people had a Christian duty to care for their “savage” African slaves.  In the end, of course, they were so convinced they were right that they were willing to go to war and fight (and die) for it.  (Their cause being a righteous one, the killing of people in support of it met all the requirements for a “Just War” as traditional Christian doctrine expounded it.)

For me, it was an eye-opener to realize that southern Christians based their support of slavery squarely on Christian scripture.  It was also an eye-opener to see how the beliefs and attitudes of the community were shared, both horizontally and vertically.  By horizontally, I mean how family members, neighbors, newspapers, courts, elected representatives, school teachers and preachers all worked together to homogenize the Southern attitude toward slavery.  (It was rare to find a voice of dissent – the conclusion seems compelling that the few dissenters tended to keep their opinions to themselves, for fear of being run out of town, as those considered “unsound on the slavery question” generally were.)  By vertically, I mean how attitudes and beliefs were passed down from one generation to the next, most strongly within immediate families, but also within whole communities and cultures.  My research extended back in time to the racism of our national heroes, Washington and Jefferson, and forward in time through my grandparents, my parents, and –

Indeed.  What about myself?  Historical research proves again and again how, once accepted in a family or community, “wrong” attitudes and beliefs can be passed down so easily from one generation to the next.  Is it possible I could be exempt from such influences?  Somehow free to form my opinions entirely on reason and logic, safe from any familial or cultural biases? All my historical research has led me to conclude that we are most prone to be blind to the wrongness within that which is most familiar; if that’s true, what are the ramifications for my own attitudes and beliefs?  How much of the racism inherent in my family history manages to cling to my own way of thinking?  I hope none of it, of course, but how likely is it that some of it persists?

I will repeat a quote from The Gulag Archipelago, which I already mentioned in a prior WMBW post and which I managed to squeeze into Alemeth as well.  Alexander Solzhenitsyn expressed a wish for an easier world:

If only there were evil people somewhere insidiously committing evil deeds and it were necessary only to separate them from the rest of us and destroy them. But the line between good and evil cuts through the heart of every human being. And who’s willing to destroy a piece of his own heart?

I’ll have more to say in later posts about what psychologists call the “bias blind spot.”  For now, suffice it to say that much as I share King’s dream for a day when prejudice will be a thing of the past, I fear that as long as we have families, as long as parents teach their children, as long as such a thing as “culture” exists, we will all have our prejudices.  Many of them, I believe, will have been inherited from our parents and grandparents.  Others from school teachers, preachers, news sources, national heroes, or friends.  A rare few, perhaps, we will have created entirely on our own.  But they will be there.  And others who see this the same way I do have suggested an idea that makes a great deal of sense to me: that to begin the path toward a more just world, we’d do well to begin by trying (as best we can) to identify what our own biases are.

In Alemeth, I have tried to take a step in that direction.  Early in the evolution of the novel, I found myself asking whether it was I who was creating Alemeth, or Alemeth who had created me.  It’s a novel about my family – about the culture that began the process of making me what I am – and it’s not an entirely pretty picture.  But the dream that inspired it, and the research and thought given to the project, is also largely responsible for the existence of something else.  I don’t think I’ll be giving too much away if I give you a hint: the last four words of the novel are “we may be wrong.”

— Joe


WYSIATI

Have you ever seen a swan?  If so, how many have you seen?

For four years, my family lived on a pond that we shared with a family of swans.  I saw this one family a lot.  More recently, I’ve seen a few more swans, but given that swans live long, maintain monogamous relationships, and tend to remain in the same habitat, I suspect I’ve been seeing the same swans over and over again. I’d take a wild guess that I’ve seen a total of thirty swans in my life.  You might ask yourself the same question now: how many do you suppose you’ve seen?  (We’ll return to the matter of swans in a moment.)

I’ve been on vacation in Florida, so it’s been a couple of weeks since my last WMBW post.  During the holidays I was able to read a couple of excellent books: one of them, Thinking, Fast and Slow, by the psychologist and Nobel prize winner Daniel Kahneman, asserts that we have two systems in our brains – one designed to produce immediate beliefs, the other to size things up more carefully.  The other, Being Wrong, by journalist Kathryn Schulz, explores both the reasons we err and the emotional costs of recognizing our wrongness.  Both books have done much to clarify my intuitive beliefs about error.  If you suspect this is a case of “confirmation bias” you’re probably right – but at least my confirmation bias gives me a defense to those who say admitting doubt is tantamount to having no beliefs at all.  (I can’t have a bias in favor of a belief unless I have a belief to begin with, right?)

Well, to those who fear that I totter on the brink of nihilism, I assure you I do have beliefs.  And perhaps my strongest belief is that we human beings err – a lot. Since starting this website, I’ve started to see people committing errors with truly alarming frequency.  The experience helps me understand witch hunts, as I now see error the way Cotton Mather saw witches and Joe McCarthy saw communists – everywhere.  The difference, I’d submit, is that Cotton Mather never suspected himself of being a witch, and Joe McCarthy never suspected himself of being a communist. In contrast, I see myself being wrong every day.  In fact, most of the errors I’ve been discovering lately have been my own.

My willingness to admit to chronic wrongness may be partly due to the fact that Schulz devotes much of her book to rehabilitating the reputation of wrongness – pointing out that, far from being the shameful thing most of us consider it to be, wrongness is endemic to who we are and how we think – specifically, to our most common method of rational thought – reasoning by induction.

Consider this diagram:

[Figure: a horizontal white bar passing behind a dark rectangle, hiding its middle]

Reasoning by induction, says Schulz, is what causes even a four-year-old child to “correctly” answer the question of what lies behind the dark rectangle. By way of contrast, she says, a computer can’t answer such a puzzle. The reason? A computer is “smart” enough to understand that the dark rectangle may hide an infinite number of things, from a stop sign to a bunny rabbit to a naked picture of Lindsay Lohan. Without inductive reasoning, the computer will have to consider (and reject) an infinite number of possibilities before deciding on an answer. We humans, on the other hand, are much more efficient – we’re able to form nearly instantaneous conclusions, not by considering all the possibilities we don’t see, but by coming up with plausible explanations for what we do see. Even to a four-year-old child, it seems highly probable that the image behind the dark rectangle is the unseen middle of the white bar behind it. It’s certainly plausible, so we immediately adopt it as a belief, without having to exhaust an endless list of other explanations. Inductive reasoning makes us the intelligent, quick-thinking creatures we are.

In his book, Daniel Kahneman calls this WYSIATI. His acronym stands for “What you see is all there is.” Like Schulz, he points out that this is how human beings generally think – by forming plausible beliefs on the basis of the things we see, rather than by tediously rejecting an endless list of things we don’t. And, like Schulz, he gives this sort of thinking credit for a good deal of the power of the human brain.

But there’s a downside, a cost to such efficiency, which brings us back to swans. If you’re like me, you probably believe that swans are white, no?

“Which swans?” you might ask.

“Well,” I might well reply, “all of them.”

I first formed the belief that swans are white after seeing just a handful of them. Once I’d seen a dozen, I’d become pretty sure all of them were white. And by the time I’d seen my thirteenth swan, and my fourteenth, confirmation bias had kicked in, leaving me convinced that my belief in the whiteness of swans was valid. It only took one or two more swans before I was convinced that all swans are white. Schulz says it was the philosopher Karl Popper who asked, “How can I be sure that all swans are white if I myself have seen only a tiny fraction of all the swans that have ever existed?”
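Popper’s point can even be put into numbers.  This is my own aside, not anything Schulz or Kahneman offers, but Laplace’s classical “rule of succession” is one standard way to quantify how far a run of white swans actually gets you: after seeing n white swans and no others, a cautious estimate of the probability that the next swan is white is (n + 1) / (n + 2), which approaches certainty but never reaches it.  A few lines of Python make the point:

```python
from fractions import Fraction

def prob_next_white(n_white: int) -> Fraction:
    """Laplace's rule of succession: after observing n_white white swans
    (and no swans of any other color), estimate the probability that the
    NEXT swan is white as (n + 1) / (n + 2) -- close to 1, but never 1."""
    return Fraction(n_white + 1, n_white + 2)

# Even a lifetime total of thirty swans leaves real room for doubt:
for n in (1, 12, 30, 1000):
    print(n, prob_next_white(n), float(prob_next_white(n)))
# 30 swans -> 31/32 = 0.96875; 1000 swans -> 1001/1002, still short of 1.
```

The exact fraction matters less than the asymptote: induction can make “all swans are white” ever more plausible, but no finite count of white swans can ever make it certain.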

Schulz observes that as children, we likely observed someone flipping a light switch only a few times before concluding that flipping switches always turns on lights. After seeing a very small sample – say, a golden retriever, a shih tzu, and Scooby Doo — children have sufficient information to understand the concept of “dog.” We form our beliefs based on very small samples.

Kahneman describes how and why it’s so common for groups to underestimate how long a project will take: the group imagines all the steps they anticipate, adding up the time each step will take; it factors in a few delays it reasonably foresees, and the time such delays will likely take; and it even builds in an extra cushion, to give it some wiggle room. But almost invariably, it underestimates the time its project ends up taking, because in fact, the number of things that could cause delays is virtually infinite and, well, you can’t know what you don’t know. In a sense, to use Kahneman’s phrase, you can’t help but feel that “what you see is all there is.”

Now here’s what I think is a critical point. The way inductive reasoning takes such very small samples and draws global conclusions about them makes sense when worlds are very small. If the child’s world is her own house, it’s probably true that all the wall switches turn on lights – it’s only when she enters factories and astronomical observatories years later that wall switches turn on giant engines and rotate telescopes. Here in Virginia, all swans probably are white; I’ll only see black swans if I go to Australia or South America, which I may never do. There wasn’t much problem thinking of the world as flat until there were ocean voyages and jetliners. Both as individuals, and as a species, we grow up in very small, homogeneous worlds in which our inductive reasoning serves us well.

But the real world is more varied and complex. It’s when we expand our frames of reference – when we encounter peoples, cultures and worlds different from those of our youth – that what we “know to be true” is likeliest to be challenged.  And by that time, we’ve become convinced that we’re right. All experience has proven it. Everyone we know knows the truth of what we know. After all, our very conceptions of self, our sense of our worth, and our very comfort depend on our being right about the Truth.

More, later, about the emotions involved when one of these strangers challenges that which we know to be true.

— Joe


Some Thoughts, this Christmas Eve

On the WMBW home page, a brief bio refers to my personal religious leanings as “other — really.”   To elaborate at any length might ruin the Holiday Season for most of you, and it’s Christmas Eve: a voice in my head (that of my late mother, I suspect) urges me to mark the occasion with something appropriate to the season.  So I have a few things to share.

First, I highly recommend David Wong’s wonderful article, “10 Things Christians and Atheists Can (and Must) Agree On.”  It’s from the December 16, 2007 issue of Cracked.  (Along with Mad magazine, Cracked deserves at least some credit or blame for making me the man I am today.)  But while Wong’s article is humorous in many respects, it’s also very much in tune with We May Be Wrong.  Well, sort of.  I mean, actually, since Wong’s 10 Things article has had over 1.8 million views, that’s a bit like saying the ocean’s in tune with the last drop of rain to fall into it. But I hope you catch my drift.  I really wish not just 1.8 million, but 1.8 billion, had read Wong’s article.  In addition to being the sort of article I’d love to publish on this website, it also has a bunch of really cool pictures. Check them out!

http://www.cracked.com/article_15759_10-things-christians-atheists-can-and-must-agree-on.html

Second, that same voice (yes, now I’m sure it belongs to my late mother) tells me that because it’s Christmas eve, I ought to say something about Jesus.   And since I respect Jesus at least as much as I respect David Wong, I’ll post four of my favorite things about Jesus.

1. He is said to have preached that one should love one’s neighbor, and even one’s enemy.

2.  He is said to have preached, “Whosoever shall say, Thou fool, shall be in danger of hell fire.”

3. He apparently instructed his followers not to swear.  As I read it, he didn’t seem to be talking about four-letter words; rather, he seemed to be warning against swearing to the certain truth of anything.  (“Swear not at all: neither by heaven… nor by the earth…because thou canst not make one hair white or black.  But let your communication be, Yea, yea; Nay, nay: for whatsoever is more than these cometh of evil.”)

4. It is said that he repeatedly asked his followers, “Why beholdest thou the mote that is in thy brother’s eye, but perceivest not the beam that is in thine own eye?”

I’d like to think that with that sort of philosophy, if Jesus had been connected to the internet, he might not have disapproved entirely of We May Be Wrong.

My third Christmas offering is a link to a YouTube video of this season’s performances at our (backyard) Friend’s Theater.

I know that my mother would have liked it.  She was an ardent Christian, but she was also a ham.

As for Jesus, I’d like to think he wouldn’t have been offended that we chose Clement Clarke Moore’s poem to perform this season, rather than Luke’s rendition of the Nativity.  As I read the gospels, Jesus comes across as a pretty humble guy who (laughing with us, not at us) might have chuckled at our ineptness — and that’s what I like most about him.

-Joe
Please follow, share and like us:
Facebook
Follow by Email
Pinterest
Google+
https://wemaybewrong.org/wp/2016/12/24/some-thoughts-this-christmas-eve/
Twitter
RSS

Baby, It’s Cold Outside

Surely everyone knows the classic Ray Charles and Betty Carter duet in which Ray is intent on getting Betty to stay at his place for just one more drink, while Betty protests, insisting she can’t.  Hammering away with insistence that “It’s cold outside,” Ray eventually prevails on Betty to stay and enjoy the fire.  Snuggling up to him, happy to be together in harmony, Betty joins Ray in singing the final line, “Ah, but it’s cold outside!”

It’s a great study of persuasion in action – the use of words to produce apparent agreement.  I say “apparent” because – well, no, on second thought, I won’t go there.  The time’s not right to take up the subject of the obstacles words pose for minds that wish to share the same thought.  For today, let’s assume that words mean the same thing for everybody. And let’s use them, like Ray Charles so artfully does, for making a case.

If you’ve been following this website, you know that one of our friends made a suggestion that we include one or more “objective truths that everybody could agree on.”  Daunted by the prospect, I sought help from our readers.  The first to answer the call was my longtime friend Anne Beale.  Picking up where Ray and Betty left off, Anne declared that an objective truth to which everyone could agree was, “It’s cold outside.”

Now, I thought this nomination brilliant.  If you don’t know Anne, she lives in South Dakota, where the average low temperature in December is 5 degrees Fahrenheit, the average high only 25.  As it happens, reading Anne’s comment was the first thing I did after getting up at 6:00 a.m., and I was still dressed in the wool sweater I’d worn under the covers during the night – a wool sweater I’d worn over a night shirt, which I’d worn over a tee shirt, which I’d worn over a tank top.  With the help of these four layers, I’d endured a night of record low temperatures here in Virginia, but with the covers off, I was already shivering as I sat down at my desktop to read Anne’s post.  So I had no choice but to agree with her – it was very cold outside.

Then I read the nomination submitted by another longtime friend, Philip McIntyre.  Philip nominated an entire slate of candidates.  His description of his nominees – the physical laws of nature – wasn’t quite as pithy as Anne’s, but (always gracious) Philip pointed out that perhaps his post “built on” Anne’s.  You can read Philip’s comment for yourself, but I’d venture the opinion that Philip actually agreed with Anne regarding her nominee: that it was, in fact, cold outside.  One of Philip’s sentences began, “The cold temperature outside right now is…” which strikes me as coming pretty doggone close to agreement.  (Philip, I might point out, lives in Buffalo, where the average low in December is 11, and the high, at 31, is still below freezing.)

Now, at that point, I was surprised, but elated.  As best I could tell, (“with three precincts reporting”) there was universal acceptance of an objective truth.  It was, in fact, cold outside.  But then, this morning, as I sat down to record my elation and post “It’s Cold Outside” on the WMBW website’s Home page, I discovered a third nomination.  While the third comment didn’t expressly disagree with Anne – while it wasn’t so contentious, for example, as to say, “Heck no, you fool, it’s hotter ‘n blazes, dammit!” – the writer did write, “Mightn’t the only objective truth be that we do not know what we do not know?”

Definitely food for thought there; I for one was tempted to make a fine breakfast of it, for at least several paragraphs.  But loath to digress, I strove to stay focused on the question at hand: could everyone agree that “It’s cold outside”?  The new writer’s suggestion that there might be only one objective truth everyone could agree on – and that such uniquely objective truth was neither a physical law of nature nor a statement about the weather – forced me to conclude that the new writer was advancing a position in irreconcilable disagreement with Anne.

I hasten to add that the writer – my brother David – lives in south Georgia, where the average high this time of year is a near-tropical 65.  Well, there you go.  Despite his obvious effort to avoid confrontation with his friends to the north, David, by postulating that he might have put forward the only objective truth, had in a single stroke destroyed our unanimity of belief.  (It was easy to see, in that moment, how the Civil War might have started, and as my longtime friend Ron Beuch has now suggested with his comment – even as I write this post – bias can be very hard to shed.)

We May Be Wrong is a truly nascent phenomenon.*  During our first three weeks of existence, our growth has been phenomenal.  We already have a huge number of readers.  (At least thirty, I’d be willing to bet.)  But even with only four of us weighing in on the question, we appeared unable to agree that “It’s cold outside” was an objective truth which everybody could agree to.

Now, saddened as I was at this setback, I turned to Philip’s nominees – the physical laws of nature.  Searching for the sort of harmony Ray Charles had achieved with Betty Carter, I asked myself, is it possible that we four, at least, could all agree to the objective truth of Philip’s nominees?  I mean, perhaps, in South Dakota, “It’s Cold Outside” is a physical law of nature.   And perhaps “We don’t know what we don’t know” is a physical law of nature in south Georgia.  So maybe Philip’s comment deserved a closer look.  Maybe, if Anne and David already considered their nominations to be physical laws of nature, they already agreed with Philip, implicitly, and in that case, if I could see my way clear to agreement, Philip’s nomination would have agreement from all four of us.  (And maybe the other twenty-six of us, like Betty Carter, would eventually come around?)

First, I was a little concerned that Philip hadn’t nominated any one Law of Nature in particular, or even multiple such laws, but simply a category, “Physical Laws of Nature.”  It’s been a long time since I was in school, and if I ever knew, I’ve forgotten just how many physical laws of nature the experts have determined there are.   In fact, I’m left wondering what, exactly, a Physical Law of Nature is.  But as with the obstacles posed by words, I’ll forego the temptation to go down that perilous path.  Assume with me, if you will, that we all share a common understanding of what the Laws of Nature are.

I understand that this assumption is not an easy one to make.  In Philip’s comment, he writes, “The problem is, they [the laws of Nature] are so hard to understand.”  Well, I’d sure agree with that.  Relativity?  The space-time continuum?  Quantum mechanics?  They all elude my full understanding, to be sure, and maybe my partial understanding as well.  In fact, even gravity sometimes mystifies me (and not only when I’ve had too much to drink).  But that’s precisely why I wonder about Philip’s statement that, “properly understood,” the physical laws of nature are constant and immutable.  Having agreed that such laws are very hard to understand, I have great difficulty agreeing with anything about what they are when they’re “properly understood,” because I doubt very much that I properly understand them.

But surely I quibble.  And meanwhile, I’m actually more troubled by a different question.  Philip writes that the physical laws of nature are “constant and immutable” in the sense that they “will produce exactly the same result every time in exactly the same set of circumstances.”  I’ve been up all night (well, much of it, anyway) pondering the significance of the italicized words in that sentence.

Now, before I continue, I should acknowledge my own biases.  I personally believe in the value of the scientific method.  As I understand it, scientific “proofs” are all about “reliability” which I believe is the scientific word for what Philip is talking about.  When the scientist keeps extraneous factors under “control,” and can accurately predict the outcome of an experiment time and time again, always getting the same (predictable, identical) result, the scientist is said to have demonstrated “reliability.”   It’s another word for scientific “proof,” as far as I know.  I think there’s much to be said for the scientific method, as a means of learning new things about the physical world.  So if there’s any confirmation bias at work here, I’m pre-wired to agree with what Philip is saying.

But his qualification, “in exactly the same set of circumstances,” nags at me.  Can something be said to be a “law” at all, much less a “constant and immutable” one, if it all depends on an exact set of circumstances?  Isn’t a “law,” by definition, something that operates across circumstances?  There’s a saying in the (legal) law that you can’t have one rule for Monday and another for Tuesday.  It stands for the proposition that for a law to be a law, it has to apply to varied circumstances.  The trooper who issues a speeding ticket says, “I’m sorry, sir, but that’s the law,” by which he is essentially saying, “it doesn’t matter that you’re late for a meeting; the law is the law.  Circumstances don’t matter.”  Believe me, I know that laws often get riddled with exceptions which are essentially driven by variations in circumstance.  Murder?  >> Guilty!  (Oh, self-defense? >> an exception >> innocent.  But murder!  >> Guilty!  Oh, insanity?  >> an exception >> innocent.)   But in the legal world, I’d venture to say, the exceptions are like little “mini-laws” that live within the more general law, running contrary to it in result, but similar to it in form, in that they apply to all the circumstances they purport to include.  Riddled as they are with exceptions, both the general laws themselves and the little “mini-laws” that deal with exceptions are general principles that cut across variations in circumstance.  So I wonder: if every single variation in circumstance had its own special “law,” would there really be any law at all?   With each thing subject to rules applicable only to it, wouldn’t we have anarchy and lawlessness?

David’s nominee, “We do not know what we do not know,” strikes me as a classic tautology, a class of self-evident propositions that also includes “All I can say is what I can say,” “a rose is a rose…” and (importantly) “we do know what we do know.”  As such, rather than being the only objective truth, it seems but one of an infinite number of such truths.  At the point at which each unique thing in the world can claim that it is what it is, that it does what it does, etc., it seems plausible to think we might not have objective truth at all, but the very essence of complete subjectivity.

As Philip appears to acknowledge, Anne’s nominee, “It’s cold outside,” seems to result from a constant and immutable set of laws, in the sense of being scientifically predictable, repeatable, and reliable — as long as you remain “in exactly the same set of circumstances.”  For people in South Dakota, in the month of December, when there are no forest fires raging for miles around, when the sun is at an oblique angle to the hills around Sioux Falls, when none of the moose are wearing overcoats or carrying space heaters, etc., it will always be cold outside.

Last night I finished the book of David Foster Wallace essays, Both Flesh and Not, in which I read Wallace’s delightful essay, “Twenty-Four Word Notes.”  In that essay, Wallace discusses the class of adjectives that he calls “uncomparables,” the first of which is the word “unique.”  Since “unique” means “one of a kind,” he points out that one thing cannot be more “unique” than another; a thing is either unique or it’s not.  Wallace asserts that other uncomparable adjectives include precise, correct, inevitable, and accurate.  “[I]f you really think about them,” he writes, “the core assertions in sentences like, ‘War is becoming increasingly inevitable as Middle East tensions rise,’ [are] nonsense.  If something is inevitable, it is bound to happen; it cannot be bound to happen and then somehow even more bound to happen.”

Philip’s comment uses three key adjectives in describing the physical laws of nature.  He calls them “objective,” “constant” and “immutable.”  I’ll bet that if David Wallace were still alive, he’d agree that “constant” and “immutable” are uncomparables, and perhaps “objective” as well.  If you’re not always constant, then are you really constant at all?  If you’re not always “immutable” – because, on some occasions, you can change – then are you “immutable” at all? If something is “objective” because it doesn’t depend on one’s individual circumstances, then can it depend on any individual circumstances at all, and still be objective?

It seems to me that the class of tautologies is infinitely large, because everything is what it is, everything does what it does, and none of these subjective “truths” have to apply to anything else.  So it strikes me as pertinent to ask, ‘Does a truth transcend mere tautology when it applies to anything more than itself?’  And if so, once the gap between two discrete indivisible units is bridged by a “law” that applies to both, is it now a “law of nature” in any meaningful sense?  A constant, immutable, objective truth, because it applies not just to a single set of circumstances, but to a second set as well?  I wonder whether, to qualify as a constant, immutable “objective truth,” a law would only have to apply to two sets of circumstances, or to ten, or to a hundred.

If the “physical laws of nature” include Einsteinian relativity, then isn’t everything ultimately dependent on point of view (i.e., subjective)?  Well, not the great constant, c, the speed of light, you say?  But as I understand it, the speed of light in a vacuum can never be surpassed provided we’re not talking about dark matter, black holes, or parallel universes, and provided we’ve narrowed our consideration to the post-Big Bang era, which insulates our perspective as surely (it seems to me) as the vast Atlantic Ocean insulated pre-Columbian Europe.  And if scientists admit (as I understand they do) that for time prior to the Big Bang, all bets are off, then how is our understanding of physical laws not dependent on our point of view, i.e., subjective?

So pending a reply from Philip or others, who may yet convince me I’m wrong, I’m not yet prepared to agree that the physical laws of nature can lay claim to being “objective” truth.  The original challenge put to the website was to include not just any old objective truth, but an objective truth everyone could agree to.  Alas, much as I hope for our readership to grow, I fear this website may never appeal to those who live on the other side of the Bang, or in any quadrant of the multiverse, or in the world of dark matter, for that matter.

Oh well.  A day or so ago, when there were only three of us, I was, for however brief a time, able to bask in the comfort of pure harmony, knowing everyone agreed that it’s cold outside.  Today, I’ll close by reporting that it’s a few degrees warmer outside.  And in the game of Hide and Seek in which I fear never coming to know the truth, I think that warmth means I may be getting closer.

_______

* Nascent: “(especially of a process or organization) just coming into existence and beginning to display signs of future potential.” – See https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=nascent

–Joe

https://wemaybewrong.org/wp/2016/12/18/baby-its-cold-outside/

Objective Truth, Anyone?

Friends of ours have been making some great suggestions for the website.  One friend – to whom we owe many thanks — made several suggestions that led to changes in the site.   But one of his suggestions we did not incorporate.  That suggestion was: “If you listed some objective truths that everyone could agree to, the website would have a better chance of surviving.”

As website editor, I didn’t adopt this suggestion since I felt unable to identify an objective truth everyone could agree to — even if the very survival of the website was at stake.

But then another friend came forward with a suggestion of his own.  He opined that if the website’s following is going to continue to grow the way we’d like, it will need a thriving discourse among its readers – a discourse triggered, perhaps, by the founders’ occasional thoughts, but itself becoming the main reason people return to the website. His suggestion was to leave implementation of the first friend’s suggestion up to our readers.  Unsure as I am what the outcome might be, I decided to give our readers that chance.

So, friends:  consider this your chance to contribute to the website in this most fundamental way.  Take a moment, if you would, to write a “comment” in reply to this post.    Do you think there are any objective truths that everyone could agree to? If not, tell us why not.  If so, give us an example.

I’m anxious to see what you come up with, and I fully expect to learn from whatever you choose to share.

— Joe
https://wemaybewrong.org/wp/2016/12/15/objective-truth-anyone/

The Answering Machine

I woke this morning to the sad news that Kessler (our daughter’s dog) was being taken to the Vet today, to be put down.

I went next door, to her house, to say my goodbyes to the dog.  Looking into Kessler’s cloudy eyes, I thought I could see evidence of suffering.  I thought I could see returned affection.  I tried to convey what comfort and love I could, wondering whether I really had any idea what was going through the mind of the dog during these, his last hours – or whether I was projecting my own, human thoughts into him – and whether what I was seeing was merely a reflection of myself.

I also observed the way the three grandchildren were dealing with the situation.  “But, Mom, I don’t want Kessler to die.”  Death is something they’ve never really encountered before.  They’ve seen road kill; they’ve even experienced the death of some of their chickens; but that’s not the same thing.  Kessler is older than any of the kids; he’s been their pet their entire lives; in a few hours, he’ll be gone; they are looking at mortality with new eyes today – experiencing the process, not just the aftermath.

Returning home, I found myself wondering about the difference between adults and children of elementary school age.  These children encounter new things every day. Their lives consist almost entirely of new experiences.  At this point, they seem to take it for granted that they have an awful lot yet to learn.  For them, not understanding things is the normal state of mind.  And I wondered – what happens to us, as we become adults?  How do we lose our childhood sense of wonder, our belief that the world is full of things we do not understand?  How do we come to think that, because we are adults, we now understand so much that we can be sure of ourselves?

As I came back into our house, I noticed a message waiting on our answering machine.  (Yes, Karen and I have land lines, not “smart” phones.)  Picking up the receiver, I listened to a delightful message that Karen had saved.  It had been left, three days ago, by Jackson, our five year old grandson.  The reason she’d saved it was that she is a grandmother and the recording was cute:  Having only made five or six phone calls in his life, Jackson had obviously never encountered an answering machine before.

“We’re not available right now,” said my recorded voice.  “Please leave a message.”

“Poppi?” said Jackson, recognizing my voice.  “This is Jackson.”

A long pause as he awaited a reply that never came.

“Hello.  This is Jackson.  Hello?”

Another pause; the barely audible voice of his mother (coaching him) from the background; her words indecipherable.

“This is Jackson,” he said at last.  “Call me.”

More indecipherable coaching from the background.

“This is Jackson,” he said again.  “Call me.  Goodbye.”

I pictured the scene in my mind, imagining the thoughts that had gone through Jackson’s mind three days ago, during his first encounter with an answering machine.  I’d already been trying to remember what it was like to discover new things all the time, and I couldn’t have asked for a better reminder.  I recalled my own fascination with telephones – maybe 1958 — back in the day when we picked up the receiver and waited for a human being to ask us, “Number please?” Remembering how we’d tell her (yes, always a her) what the number was that we wanted her to connect us to.  How she would magically connect us to others, across vast distances.  We didn’t understand how it all worked, of course, but then, we didn’t expect to.  The world was full of things we didn’t understand.

I was still reminiscing when, at that very moment, the door opened and in walked Jackson, in person this time.  Well, me being an adult and him being a five-year-old, I couldn’t pass up the chance to teach him something about life, by which of course I meant life as it really is.  (You know – I wanted to help him along on his path to an adult world in which he would understand just about everything I understood – even old fashioned answering machines.)  So I decided it was time for Jack to have his second encounter with an answering machine.

“Hey Jack, buddy, come over here.  I’ve got something I’d like you to listen to.”  I picked up the telephone, dialed *86, and pressed 1 to retrieve our messages.  There was only one saved message – Jackson’s.  I put the phone to his ear so that, hearing his own voice, he could learn about answering machines.

There was a confused look on Jack’s face.  I was sure it was because he was perplexed by the sound of his own voice.  But when I moved my head closer in order to hear what Jackson was hearing, I could hear a robotic, clipped adult male voice that (at least from my perspective) sounded nothing like me.

“Voice message received at 2:31 p.m.“ said the robotic recording.  “December 6th.  From (804) 551…”

“Poppi?” asked the living Jackson in front of me, mistaking the robotic voice for mine.  “This is Jackson.” This time, instead of listening to a three day old recording of Jackson, I was looking into his eyes as he spoke.

“Poppi?”  came the reply.  We both heard Jackson’s recorded voice — from three days ago – at the same time.  I waited for him to realize it was his own voice he was listening to.  “This is Jackson,” said the recording.

“Who is this?” asked the real Jackson, standing in front of me – and again, he got an answer.

“Hello?” came the voice from the phone.  “This is Jackson.  Hello?”

“Hello,” replied the little boy in front of me.  “What do you want?”

“This is Jackson” said the recorded voice.  “Call me.”

The living boy in front of me searched his five year old brain for a sensible answer, but found none.  There was a long pause; some muffled whispering in the background could be heard over the phone. Finally, Jackson’s recorded voice broke the silence:

“This is Jackson” it repeated.  “Call me.  Goodbye.”

Without missing a beat, the living boy in front of me politely replied, “Goodbye,” and handed the receiver back to me, obviously very confused.  It took me a long time to stop laughing.  When I’d mustered sufficient composure, I informed Jack that, due to the miracle of the answering machine, the voice he’d been hearing was his own, and that he’d been having a conversation with himself.  As my words sank in, his face lit up, one of the biggest smiles I’ve ever seen took over, and he started laughing with me.  We laughed together for a long time.

It’s a true story, really – at least, it’s the truth as I perceive it.  The story of Jackson and the answering machine is my story; I have no idea how Jackson perceives it; no idea how the story would play out, if he told it, from his perspective.  I wish that, somehow, I could climb into the mind of another person – or even a dog – and see whether we’re perceiving the same reality.  But, sadly, I don’t seem to know how to do that.  Sometimes it seems that what I take to be reality is really nothing other than the playing back of a recording — a recording I don’t recognize, but which, in fact, is only my own voice, repeating things I’ve already thought, and said, and have come to believe I fully understand, and nothing more.

But it’s been a good day, all things considered. True, Jackson and I and the rest of the family are all sad about the passing of our dear friend, Kessler.  But as I think about Jackson and the answering machine – as I’m amused anew by his innocence — as I’m joyful anew at his discovery – I’m especially attracted to the way he was able to laugh at his own folly.  I wish I could learn to do that more myself.

— Joe

https://wemaybewrong.org/wp/2016/12/09/the-answering-machine/

How to Get On Our Mailing List

Several people have reported they’ve been to the WMBW website and can’t find any place to subscribe to the Thoughts and Opinions posts, or to be put on the mailing list.

One interested reader went so far as to insist that the website did not even offer a way to subscribe.  (When we told him he was definitely, unmistakably, absolutely wrong, and called him an idiot, he accused us of hypocrisy — imagine that? 🙂  Just kidding.)

Seriously, for anyone who has missed the website’s explanation of how to get on the mailing list, the answer is to click on the round, gray “Follow” icon, which takes you to a screen provided by SpecificFeeds.com, where you can enter the e-mail address at which you’d like to receive notifications.

This round, gray “Follow” icon appears (along with other social media icons) at the end of every post on the Thoughts and Opinions page (including this one).  It also appears (along with the other social media icons) in the middle of the screen (quite prominently) once you have been on the website for sixty seconds.  (We thought it a little rude to throw an invitation to Follow us in the middle of the screen before a first-time visitor had even had a chance to look over the home page; if they left the website before sixty seconds was up, the odds seemed small they’d want to follow us anyway.  Also, since subscribing is really a request to be notified of more posts, and the same icons appear at the end of every post, we figured having that option at the end of every post made sense.)

FYI, WMBW selected SpecificFeeds.com to provide our mailing list plug-in because:

  • we understand that a third-party provider would help prevent robotic subscribers,
  • they appear serious about protecting subscribers’ e-mail addresses,
  • they allow subscribers several options for how to receive WMBW news and notifications, and
  • they were free.

That said, we’re really just giving them a try, and if you have complaints or suggestions regarding our subscription process, we’d love to hear from you.

(And again, we’ll never share our mailing list with anyone.)
https://wemaybewrong.org/wp/2016/12/06/how-to-get-on-our-mailing-list/

Goldilocks and the Case Against Reality

In my last blog, I credited Richard Dawkins with reminding me how human beings are able to see only a narrow band of the electromagnetic spectrum.  As Dawkins put it, “[N]atural selection shaped our brains to survive in a world of large, slow things.”  Would we be better off if we could see not only “visible light” but infrared and ultraviolet as well?  Or, like Goldilocks, do we have no use for chairs that are too large or too small?  Are we better off if we devote our attention to the things that are “just right” for creatures of our own size and needs?

I’m no evolutionary biologist, but I’ve long been fascinated by the anthropocentric idea that evolution first made us the dominant species on earth, and will now ensure we remain at the pinnacle of creation – presumably, because we’re so much more intelligent than any other creature on earth, so that no other species will ever be able to catch up.  Some people seem to believe evolution will ensure that our brains get ever larger and that we’ll ascend the evolutionary ladder ever higher toward omniscience.  The idea that, instead, natural selection has shaped our brains “to survive in a world of large slow things” – causing us to be blind to smaller and faster things, for our own good – is surely a different idea of evolution altogether.  I’ve been intending to research that question and to blog about what I found.

This morning, my brother David sent me hurtling in that direction faster and farther than I’d imagined possible.  David – who’s been kind enough to join me in starting We May Be Wrong – sent me a link to an article by Amanda Gefter that appeared in Quanta and was reprinted in The Atlantic.  It’s called The Case Against Reality, about the theories of cognitive scientist Donald D. Hoffman.  In the article, Hoffman says:

The classic argument is that those of our ancestors who saw more accurately had a competitive advantage over those who saw less accurately and thus were more likely to pass on their genes that coded for those more accurate perceptions, so after thousands of generations we can be quite confident that we’re the offspring of those who saw accurately, and so we see accurately. That sounds very plausible. But I think it is utterly false.

The illustration Hoffman proceeds to use, in order to simplify the point, reminded me of the story of Goldilocks.  He asks us to think of a creature that needs water for survival.  Too much of it and the creature will drown; too little and it will die of thirst.  What the creature really needs, for purposes of survival, is simply to know whether something contains a beneficial (medium) amount of water or not.  In Goldilocks terms, that it’s “just right.”

What I’ll call the “Goldilocks factor” strikes me as lying behind our inability to see ultraviolet or infrared light.  We don’t see the extremes of electromagnetic frequencies because we don’t need to, and because having all that extra information would bog down our brains with useless minutiae.  It’s just not efficient for a biological organism to spend its energy dealing with things of no immediate consequence to its survival – and taking the time to do so could even be fatal.  If Papa Bear’s porridge is so hot as to scald Goldilocks’ tongue, she has no reason to concern herself with whether it’s 300 or 350 degrees.  If she did, she’d succumb to what has been aptly dubbed “paralysis by analysis,” and her tongue would get very burned while she figured it out.

Hoffman compares it to what we see on a desktop interface.  We see icons, not binary code.  “Evolution,” he says, “has shaped us with perceptions that allow us to survive. They guide adaptive behaviors. But part of that involves hiding from us the stuff we don’t need to know. And that’s pretty much all of reality, whatever reality might be.”

I hesitate to further describe Gefter’s article lest it decrease the chance you’ll follow the link and read it for yourselves.  But in a nutshell, Hoffman’s view is that the world presented to us by our perceptions is nothing like reality – or that there is no such thing as objective reality — or that the only realities are our individual perceptions – or – well, doggone it, please read the article for yourself.

http://www.theatlantic.com/science/archive/2016/04/the-illusion-of-reality/479559/

It’s a beaut.  Thanks for sharing it, Dave.

https://wemaybewrong.org/wp/2016/12/03/goldilocks-and-the-case-against-reality/