Comparing Apples and Oranges

You know: the very point of saying “it’s like comparing apples and oranges” is that it’s difficult, maybe even impossible, to do so, because — well — because they’re just not the same.  Consider this picture:

[Photo: forty-nine apples and one orange]

Forty-nine apples and one orange.   If I put all this fruit in a bag, mix it up and pull one piece out at random, the odds will be 49 to 1 that I’ll pull out an apple.  That is, 49 to 1 against the orange.

Now a question for you: Assuming a random draw, will I be surprised if I pull out an apple?  Answer: no, I won’t.  I fully expect to pull out an apple, due to the odds.  I assume you wouldn’t be surprised either.  I also assume we’d both be surprised if I pulled out the orange, for the same reason.  Am I right?

Now,  I feel as I do without qualification — by which I mean, for example, that if I pick out the orange, my surprise won’t be greater or less depending on whether the orange weighs nine ounces or ten, and I won’t be surprised if I pull an apple from the bag, regardless of the number of leaves on its stem.   The fact is I expect an apple, and as long as I get an apple, I’ll have no cause for surprise.  Right?

But now another question, and this one’s a little harder. What are the odds of my picking out an apple with two leaflets on its stem?  You can scroll back and look at the picture if you want, but try to answer the question without doing so: what are the odds of my picking an apple with two leaflets on its stem?

Ready?

Alright. Hard, wasn’t it?  If you went back to look at the picture, you found there was only one apple with two leaflets on its stem. Knowing that, you determined that the odds against my picking that particular apple were 49:1, the same odds as existed against my picking the orange.  Yet it’s pretty clear, as already determined, I would have been surprised if I’d picked the orange, but I wouldn’t have been surprised if I’d picked the only apple in the bag with two leaflets on its stem.

My real question, then, is why the difference?  And the only answer that makes sense to me comes not from probability theory, but from psychology.  I’m surprised if I draw the orange because, being mindful of the differences between the orange and the apples, I expected an apple. But not being mindful of the uniqueness of the two-leafed apple, I lumped all the apples together and treated them as if they were all the same.  I focused on the fact that the odds against the orange were 49:1, while never forming a similar expectation about the improbability of choosing the two-leafed apple.

Here, then, is my conclusion:  In pulling fruit from the bag, the actual improbability of every single piece of fruit is the same. Yet the perceived improbability of choosing the orange is far greater than the perceived improbability of drawing the two-leafed apple, because… well… because I hadn’t been paying attention to the differences among the apples.
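
For readers who like to check the arithmetic, here is a quick simulation of the bag (a minimal Python sketch of my own; the “two-leafed apple” label is just for illustration).  It shows that every specific piece of fruit has the same 1-in-50 chance, while the category “apple” covers 49 of the 50:

```python
import random

# One bag: 48 ordinary apples, 1 unique two-leafed apple, 1 orange.
bag = ["apple"] * 48 + ["two-leafed apple"] + ["orange"]

trials = 100_000
draws = [random.choice(bag) for _ in range(trials)]

# Each *specific* piece of fruit has the same 1-in-50 chance...
print("P(orange)           ~", draws.count("orange") / trials)            # ~0.02
print("P(two-leafed apple) ~", draws.count("two-leafed apple") / trials)  # ~0.02

# ...but the *category* "apple" covers 49 of the 50 pieces.
p_apple = (draws.count("apple") + draws.count("two-leafed apple")) / trials
print("P(any apple)        ~", p_apple)                                   # ~0.98
```

The surprise, in other words, tracks the categories we happen to be using (orange versus apple), not the underlying 1-in-50 odds, which are identical for every individual piece.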

Also, the division of the 50 pieces of fruit into only two categories – apples and oranges – was a subjective choice.  I could have grouped the fruit into large and small, or into three groups based on relative sweetness.  Or according to the number of leaves on the stem, in which case the orange would have been in a group with twenty apples.

Now, in any group of 50 pieces of fruit, no two are going to be exactly alike – the two-leafedness of one will be matched by the graininess of another, the seed count of a third, the sweetness of a fourth, and so on.  But we elect to ignore (or de-emphasize) a whole slew of possible differences, in order to focus on one or two traits.  Only by ignoring (or at least de-emphasizing) other differences do we construct a homogeneous group, treating all 49 of the red fruits the same for purposes of comparison to the orange one — treating them all as “apples” rather than one or two McIntosh, one or two sweet ones, etc.  That’s why I’m not surprised when I pick out that one, unique apple, despite the 49:1 odds against it.

Now consider a related point: that (subjective) decision about what criteria to base comparisons on, while ignoring other criteria, not only explains why we’re surprised if we select the orange, but how we estimate odds in the first place.  In fact, if we consider all their attributes, every piece of fruit is unique. The odds against picking any one are 49:1.  Yet, if we only focus on the uniqueness of the orange, our impression of odds will be vastly different than if we focus on fruit size, or sweetness, or seed count.

It isn’t some sort of unalterable constant of nature that determines how we perceive odds – it’s what we’re mindful of, and our resulting (subjective) expectations.

In an earlier post, Goldilocks and the Case Against Reality, I wrote of the concept that the limited focus which characterizes our brains has been useful to us.  (If I could see every part of the electromagnetic spectrum, I’d be overwhelmed by the flood of information, so I’m advantaged by being able to see only visible light.)  My brain is just too small and slow to deal with all the information out there.  Even if I’d happened to notice there was only one two-leafed apple, I could never have taken the time to absorb all the differences among the forty-nine apples.  Compare that, say, to the difficulty of absorbing the different facial features of every person on this tiny, one-among-trillions planet.  I cope with reality by ignoring vast complexities of things I don’t understand, lumping a lot of very special things into groups for the very reason that I can’t get my brain to focus on all their differences.

Now, this lesson about comparing apples and oranges teaches me something about God, and I hope you’ll give me a chance to explain.

The astronomer Fred Hoyle is said to have written, “The probability of life originating on Earth is no greater than the chance that a hurricane, sweeping through a scrapyard, would have the luck to assemble a Boeing 747.”  Hoyle apparently used the improbability of life as an argument for the theory of intelligent design. Hoyle’s statement was then quoted in The God Delusion (Houghton Mifflin, 2006), by the atheist Richard Dawkins, who said that the “improbability” of life is readily explained by Darwinian evolution, and declared, “The argument from improbability, properly deployed, comes close to proving that God does not exist.”

Now, whether either of these beliefs makes sense to me, I’ll leave for another day.  My focus is on trying to understand any argument based on the “improbability” of life, and it’s because of what I’ve learned from the fruit.

I agree that the odds are against a hurricane assembling a 747, and against life’s existence exactly as it is today.  But is my surprised reaction to such improbabilities any different than my surprise at the random drawing of an orange, but not at the two-leafed apple?  Imagine, for a moment, that some other configuration of scrap parts had been left in the hurricane’s wake – one that appeared entirely “random” to me.  Upon careful inspection, I find that a piece of thread from the pilot’s seat lies in the precise middle of what was once the scrap heap.  A broken altimeter lies 2 meters NNE of there.  The knob of the co-pilot’s throttle abuts a palm frond 14.83 inches from that.  The three hinges of the luggage compartment door have formed an acute triangle, which (against all odds) points precisely north; the latch from the first class lavatory door is perched atop the life jacket from Seat 27-B….

I trust you get the picture.  Complex?  Yes.  Unique?  Yes.  So I ask, what are the odds the triangle of hinges would point exactly north?  The odds against that alone seem high, and if we consider the odds against every other location and angle, once all the pieces of scrap have been located, what are the odds that every single one of them would have ended up in precisely the configuration they did?

In retrospect, was it just the assembly of the 747 that was wildly against the odds?  It seems to me that every unique configuration of parts is improbable, and astronomically so.  Among a nearly infinite set of possible outcomes, any specific arrangement ought to surprise me, no?  Yet I’m only surprised at the assembly of the 747.  What I expect to see in the aftermath of the hurricane is a helter-skelter mess, and I’m only surprised when I don’t.

But on what do I base my expectation of seeing “a helter-skelter mess?” Indeed, what IS a “helter-skelter mess”?  Doesn’t that term really mean “all those unique and unlikely arrangements I lump together because, like the apples, I’m unmindful of the differences between them, unmindful of the reasons for those differences, ignorant of how and why they came to be as they are?”

Suppose, instead, that with the help of a new Super-Brain, I could understand not only all the relevant principles of physics but all the relevant data – the location, size, shape and weight of every piece of scrap in the heap before the storm – and suppose further that when the storm came, I understood the force and direction of every molecule in the air, etc.  With all that data, wouldn’t I be able to predict exactly where the pieces of scrap would end up?  In that case, would any configuration seem improbable to me?  I suggest the answer is no.  There’d be one configuration I’d see as certain, and the others would all be patently impossible.

Compare it to a deck of cards.  We can speak of the odds against dealing a certain hand because the arrangement of cards in the shuffled deck is unknown to us.  Once the cards have been dealt, I can tell you with certainty what the odds were that they’d be dealt as they were: it was a certainty, given the order they had previously taken in the deck.  And if I’d known the precise arrangement of the cards in the deck before it was dealt, I could say, with certainty, how they would be dealt.  Perfect hindsight and foreknowledge are alike in that neither admits of probabilities; in each case — in a state of complete understanding — there are only certainties and impossibilities.  The shuffling of a deck of cards doesn’t mean that any deal of particular cards is possible; it means that we, the subjective observers, are now ignorant of the arrangement that has resulted.  The very concepts of luck, probability and improbability are constructs of our limited brains.  Assessments of probability have developed as helpful ways for human beings to cope, because we live in a world of unknowns.
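
If it helps to see that point in miniature, here is a sketch of my own (not anything from the books discussed here) of how the “odds” of a hand collapse once the deck’s order is known:

```python
import math
import random

# Build and shuffle a standard 52-card deck.
deck = [rank + suit for rank in "23456789TJQKA" for suit in "SHDC"]
random.shuffle(deck)

# To someone who hasn't seen the shuffle, any particular 5-card hand is
# one of C(52, 5) equally likely possibilities.
print(f"Order unknown: 1 in {math.comb(52, 5):,} for any specific hand")

# To someone who HAS seen the deck's order, there are no odds left:
# the next five cards are a certainty, and every other hand is impossible.
print("Order known:  ", deck[:5], "with probability 1.0")
```

Same deck, same deal; the probabilities change only because the observer’s knowledge does.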

Now, let’s return to the scrap heap, one more time.  But this time, we don’t have an all-knowing Super-Brain.  This time, we’re just a couple of ants, crawling across the site after the hurricane has left.  On the off-chance that the hurricane has left a fully assembled 747, would we be mindful of how incredibly unlikely that outcome had been?  I suspect not. A 747 has no usefulness or meaning for an ant, so we probably wouldn’t notice the structure involved, the causes and purposes of each part being where it is. From our perspective as ants, that assembled 747 might as well be a helter-skelter mess — an array of meaningless unknowns.

Now, after we’ve traversed the 747, something else catches our little ant eyes. Immediately, we scramble up the side of the anthill, plunge into the entrance, race down the pathway to the Queen’s deep chamber, and announce with excitement that something truly amazing has happened.

“It’s surely against astronomical odds,” I say. “I wouldn’t believe it myself, had I not seen it with my own two eyes!”

“What is it?” the Queen’s courtiers demand to know.

“A great glass jar of sweet jelly has appeared,” you say, “just outside the entrance to our anthill!  That jelly could have landed anywhere in the jungle.  What are the odds it would land just outside the entrance to our hill?  A thousand to one?  A million to one?  There must be a reason…”

Well, there probably is some reason, it seems to me.  But the difference in approaches taken by people and ants to the perceived “improbabilities” here reminds me of comparing apples to oranges.  It’s not just that apples are different from oranges.  Whether “God” made us or not, we’re all unique, in many, many ways.  Some of us — I’ll call them the oranges — attribute perceived improbability to “plain ol’ luck.” Others, like one-leafed apples, attribute it to intelligent design.  Others, like leafless apples, say that improbability nearly proves the non-existence of God.  I say, what we perceive as improbable depends on whether we’re ants or people.  Our surprise varies widely, depending on the criteria we’re (subjectively) mindful of.  But as unique as we are, we’re all alike in one respect: we all have limited brains, and that’s why we need concepts like probability —to cope with our profound lack of understanding.

So, call me a two-leafed apple if you like, but when I encounter the improbable — the fact that the grains of sand on a beach happen to be arranged exactly as they are, and the snowflakes in a blizzard move exactly as they do — I try to remember that what I experience as “randomness” is just a name I give to what I can’t get my mind around.  “Improbability” tells me nothing about God, one way or the other, except that, if God does exist, she gave me a brain that’s incapable of fully understanding the uniqueness of things, or why any of it exists.

And I’m okay with that.

— Joe

 

Self Reflection

What follows was submitted to the WMBW website as a comment on one of my earlier posts.  I was moved by it; I wanted to share it; so I got the author’s permission to post it as a guest blog in its own right.  My old friend, Ron Beuch, has clearly been doing some honest self-examination. I’m pleased to be able to share what he wrote:

The old man sits at the bench in his favorite sweats, the one with the hoodie.  (A friend gave it to him for helping to build a stone water feature for his patio.) The overhead garage door is closed because the wind is blowing and the temperature is around freezing. This limits his light to the overhead LED spots that he installed recently.

Surrounded by the stuff of forty years, he hears the rattling of the doors and thinks of the sixty-foot black walnut tree that fell on his stuff last summer. He puts aside his newly acquired paranoia of wind and inspects the silver he has been tasked to rescue: two wine goblets, two dinner forks, two dinner knives and a large serving fork. The wife has bumped his quota because she knows that speed comes with experience.

As he polishes these items he reflects on the memories connected to them, the romantic dinners with his wife, the family dinners on holidays, the parties with friends. The memory of his drinking problem dims the glow for a moment, but fortunately he won that battle. The white noise of the space heater warming his feet competes with the tinnitus that is buzzing like a summer evening in the background. As he dives closer in focus to the depths of the shine to see the blemishes that might mar the surface, his memory does the same with his past. When inspecting his psyche, some of the smallest details of his biggest party fouls surface.

“Whose kids are those?”

“Whose tree is that?”

“I didn’t know sports bras came in that size.”

He can’t help but feel better when he compares these to what is coming out of the TV today.

Thanks, Ron.

 

-Joe

Rip Van Winkle Returns

Sometimes I feel like Rip Van Winkle.  A career in civil rights and employment law kept me in the midst of political issues and controversies for over thirty years, but upon my retirement in 2003, I decided to enjoy a less stressful life:  to do so, I would isolate myself from the news.  So I went into a deep sleep.  For sixteen years now, I’ve been dreaming of beautiful things.  During my slumber, I played with grandchildren, I gardened, I wrote historical fiction, I read some of my daughter’s old college psychology texts – nothing that would raise my blood pressure.  I especially enjoyed reading about the psychology of human error, and confirmation bias.

In Being Wrong (Harper Collins, 2010), Kathryn Schulz quotes the French essayist Montaigne as asserting that people “are swept [into a belief] – either by the custom of their country or by their parental upbringing, or by chance – as by a tempest, without judgment or choice, indeed most often before the age of discretion.”  In keeping with that view, Schulz asserts that the single best predictor of someone’s political ideology is their parents’ political ideology.  That had certainly been true in my case, and as I researched the actual lives of the players in my historical fiction, I had discovered how true it was for them as well.  I was forced to ask myself the difficult question of whether I believed what I did, not because it made objective sense, but because of an inherited or at least culturally-guided confirmation bias of my own.

Now, even when asleep, our bodies can sense the presence of heat, cold, or other stimuli, and in a similar way, though I was asleep, I did hear snippets of the outside world from time to time.  The classic movie I’d recorded (so I could fast-forward through campaign ads) having ended, I’d be startled when the TV screen suddenly defaulted to the late news.  In the car, entranced by Smetana’s Moldau or Charles Mingus’s rendition of “I’ll Remember April,” I’d be jarred awake by a piece of headline news before my hand could turn the radio off.  So I wasn’t totally asleep; not totally unaware of what was going on in the modern world.  Just mostly so.

Now, think what you will of him, few will deny that Donald Trump makes for engaging theater.  So no surprise, occasional sound bites of last summer’s slugfest between Donald and Hillary began to intrude on my dream, appealing to my own interest in politics the way a voice whispering “one little drink won’t hurt you” might appeal to an alcoholic, even after sixteen years on the wagon.  And – no one will be surprised to hear this – since awakening from my sixteen-year political slumber, I’ve been  feeling like old Rip Van Winkle himself, rubbing my eyes in disbelief at how much has changed during my absence, aghast at just how divisive this country had become while I slept.  My conservative friends had become so opinionated and cocksure that I found myself trying to articulate liberal replies in response, in an effort to moderate their extremism.  My liberal friends had become so arrogant and dismissive of their opponents that it seemed I had to join them, or become their enemy.  Two months ago, I started this blog as the only response I could think of to a world that seemed to have gone out of control as I slept.  And because of this blog, I have started, once again, to be sucked into the vortex of the news.

I still know little of what went down during my reverie.  As I emerge from my slumber, I imagine myself having something like Van Winkle’s naivete.  Perhaps that naivete will be apparent to others, as I dare to comment on the modern political scene.  But let the chips fall where they may, I’m going to comment – because I’ve decided my long slumber may actually be of help to the mission at hand.

My brother James alerted me today to an article I found most interesting, and this article is actually the focus of my post today.  But before I get to it, I’m afraid that, for some on the right, it might be an immediate turnoff to mention that it came from Vox.  Vox is a news source I’d never heard of until today, as it was created during the period of my deep slumber.  From what I’ve been able to gather this afternoon, it’s apparently viewed by the right as being very left.  So I feel constrained to offer, first, a word of caution about sources.

In Kathryn Schulz’s catalogue of the types of non-rational, illogical thinking to which we human beings are prone,  she points out that “[i]nstead of trusting a piece of information because we have vetted its source, we trust a source, and therefore accept its information.”  That’s understandable in some cases, but not a good thing for one aspiring to real communication across the political divide.  And in this case, I feel I have an advantage – having never heard of Vox before, I hold no biases for or against the source.  I neither trust it nor distrust it.  I can only consider what I read in it on its own merits.

Anyway, I hear today that Ezra Klein launched Vox in the eleventh year of my slumber with an article titled “How Politics Makes Us Stupid.”  I haven’t read it, but it apparently focused on the scientific work of Dan Kahan, a professor at Yale Law School whose earlier work showed that the ability to reason soundly, particularly about political subjects, is undermined by the need to protect one’s core beliefs.  Hence, “how politics makes us stupid.”  Now, lost as I may have been in the land of Nod, that came as no surprise to me:  it sounded like run-of-the-mill confirmation bias, and I had digested the concept of confirmation bias years ago, before ever going to sleep, along with half a package of Oreo cookies.  But of greater interest to me is what appeared in Vox this week.  Klein has now written about the work of Professor Kahan again, this time reporting a way to escape our human susceptibility to confirmation bias:  CURIOSITY.

Apparently, as described by Klein (http://www.vox.com/science-and-health/2017/2/1/14392290/partisan-bias-dan-kahan-curiosity), Kahan’s new research shows that some of us – on both the right and the left – are more scientifically curious than others.  And that those of us who are scientifically curious are less prone to confirmation bias – or, to use Kahan’s phrase – less prone to let our politics make us “stupid.”  The point appears to be that confirmation bias interferes with sound thinking on both the left and the right, but that curiosity – a trait that exists on both the left and the right – is the common predictive factor that makes us less susceptible to the “stupidity” toward which confirmation bias pushes us.

Now, I haven’t vetted the source.  I haven’t even read Kahan’s actual findings.  I know better than to rely on the second-hand report of any intermediary, trusted or not.  But I have to confess, I’m doggone interested.  For the past several weeks, I’ve been asserting that political debate is for people who want to prove that they’re right in the eyes of a judge – not for people who want to convince people with whom they disagree.  In a debating class, there’s some sort of third-party judge.  In a courtroom, there’s a judge or jury.  In a political debate, there’s the undecided viewing public that is the effective judge.  In every case, the efforts of the debaters are designed to win points with the third-party judges by making the other side look erroneous, ignorant, or (best of all) just plain foolish.

How surprised I was, upon waking from my slumber, to discover that modern internet discussion is conducted the same way – as if there were some third-party judge present to determine a winner.  After a thirty-year legal career, I can tell you that I never saw a plaintiff convince a defendant she was right, nor a defendant convince a plaintiff that he was.  Rubbing the sleepy dust from my eyes, I had to wonder what these internet debaters thought they were doing in their efforts to “win an argument” (by showing how stupid their adversaries were) in the absence of any third-party judge.  Weren’t they quite obviously driving their opponents deeper into their convictions?  In Being Wrong, Schulz describes exactly that phenomenon – how such efforts to “persuade” actually have the opposite effect.  And I’ve been saying that, surely, it makes more sense to conduct political discourse with a sincere attitude of wanting to learn from one’s adversaries, rather than proving (to ourselves?) how stupid our adversaries are.  I’ve been asking whether, paradoxically, a sincere desire to learn from someone else isn’t more likely to result in his or her learning from us at the same time.  And I’ve been wondering if there isn’t some psychological study that backs up that theory.

So here, today, comes my brother James, providing me exactly the sort of scientific study I’ve been looking for.  A desire to learn — curiosity — could it really make us less susceptible to confirmation bias?  Perhaps this is all just confirmation bias, on my part, fitting as well as it does with what I already suspected.  So I want to check into it further.  I will check into it further.  But in the meantime, doggone it, it seems clear to me that curiosity must be the remedy, just as Kahan and Klein say.  If being curious isn’t close to being open-minded, and if being open-minded isn’t essential to learning, and if learning isn’t something we should all strive to experience, then what is?  And how come there’s all this debating and berating that has been shown to keep us from ever learning anything?

The world has changed a great deal in my years in the land of Nod.  Now that I’m awake, feeling (like old Rip Van Winkle) a good bit naïve and ill-informed, with no real clue about the strange new world I find around me, I am very, very thankful for that slumber.  For after thinking over what Kahan’s research has apparently shown, I believe my deep sleep may have done me a huge favor; by being politically asleep for these sixteen years, what strikes some (myself included) as naivete may be just what I need to be curious about what’s been going on in the world; curious about who’s right, and who’s wrong; and ready, and willing, to learn from people who aren’t already my mental clones.

I’ll close by applauding another website I learned of just today: The Lystening Project.  The Lystening Project is an innovative approach to fostering open-mindedness and civility in political discourse, conceived of by a class of San Francisco high school students in what is surely a kindred spirit to that of We May Be Wrong.  Check out their website for yourself, but from what I gather, their idea is to assess participants’ political leanings through a short survey of opinions, and then to pair them with people of opposing views for dialogue across the divide.   I especially like the “oath” that participants must take before undertaking such paired dialogue:

“The Lystening Oath”

I will take a moment to connect with the other person as a human being first.

I will enter this conversation with the goal of understanding, not convincing.

I will not vilify, demean or degrade others or their views.

I will enter this conversation with goodwill and I will assume goodwill on the part of the other person.

I will do my best to express my own views and how I came to believe them.

Reminds me of the rules for the WMBW Forum.  I can’t imagine a better oath to ask people to take, and I thank the students’ advisor, Elijah Colby, for bringing their project to my attention.  Check them out for yourself at https://thelysteningproject.wixsite.com/thelysteningproject. Help if you can.

— Joe

The WMBW Forum

The We May Be Wrong Forum is now up and running.

We’re hoping it will be a different kind of discussion Forum — not one in which people belittle those they think are wrong.  Not one where people are trying to “win a debate” or feel good by surrounding themselves with people who think like they do.  Rather, a Forum in which people can participate in discussions with people who may disagree, but without fear of being mocked or ridiculed, because they aren’t contestants in debate, but partners in a search for understanding.

Impossible?  Maybe.  Altruistic?  Of course.  But we think of it as a worthwhile experiment in this age of rampant incivility, and we’re moving forward.

So you’re invited to check out the new WMBW discussion Forum, and help us make the experiment succeed.

–Joe

MLK and the Dream

I had a dream last night;  I woke up this morning thinking about it. And my train of thought went from there to Martin Luther King’s dream.  Remembering the late civil rights leader led me to contemplate a sort of ironic coincidence: that last Monday – the 16th – the Martin Luther King Holiday – was the very day I made the final revisions to my novel, Alemeth, and began the process of formatting it for the printing company.

Completion of the novel is the fulfillment of a dream.  I could trace its origins back to the early 1960s, possibly even to the very year of King’s famous 1963 speech.  That was when my grandmother first showed me some of the letters my great uncle Alemeth had written home from the front lines during the Civil War.  Or I could trace its origins to a dinner that Karen and I had with our friends Roger and Lynda ten years ago, when a lively discussion got me thinking about a novel that explored (or even tested) the differences between fiction and non-fiction.  Or I could trace it back seven years, when I chose to write Alemeth’s life story.  No matter how far back I go to date the novel’s origins, it has been many years in the making. Somewhere along the way, a novel based on Alemeth’s life became a dream, and it seemed ironic that the dream had finally been fulfilled on the Martin Luther King Holiday.

But the coincidence seemed ironic for reasons deeper than that my novel has been sort of a dream for me.  It seems ironic because the themes of King’s famous “I Have a Dream” speech and the themes of Alemeth are so closely related.

For King’s dream, we need scant reminder.  “[O]ne day… even the state of Mississippi… will be transformed into an oasis of freedom and justice.” “[M]y four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character…” My great uncle, Alemeth Byers, the title character of my novel, was the son of a cotton planter in Mississippi.  The family owned sixty slaves when the Civil War began.   In calling Mississippi “a state sweltering with the heat of injustice, sweltering with the heat of oppression,” Martin Luther King had been talking about my family.

Early in my research into Alemeth’s life, I began to confront what, for me, was terribly unsettling.  I knew my grandparents to be among the kindest, most “Christian,” most tolerant people I knew.  But as I grew older, my research into their lives, and into their parents’ lives, revealed more and more evidence of racial bigotry.  In old correspondence, these prejudices pop up often – and most alarming of all – when I looked honestly at the historical record, I saw those prejudices getting passed down, from generation to generation.

In one respect, I felt I was confronting a paradox of the highest order.  My mother was kind and loving, and my sense was that her kindness was in large part because her parents had been kind.  My instincts applied the same presumption to their parents, as if “loving and kind” were a trait carried down in the genes, or at least in the serious study of Christian Scripture.  (My grandmother was a Sunday School teacher; my childhood visits to her house always included Bible study.)  Presuming that my great grandparents were as kind and loving as my grandparents, and knowing that they, too, had been devout Christians, I found it paradoxical that all this well-studied and well-practiced Christianity not only tolerated racial bigotry but, in great uncle Alemeth’s day, was used to justify a war to preserve human bondage.  Frankly, it made no sense.  I wondered: How did these people square their Christian beliefs with their ownership of so many slaves?  With their support for a war intended to preserve their “property rights” in these other people?

It was even more unsettling, then, to realize how the “squaring” had occurred.  George Armstrong’s The Christian Doctrine of Slavery (Charles Scribner, New York, 1857) made a fascinating read.  That work expounded, in argument after argument, based on scripture after scripture, how God had created the separate races, given Moses the Commandments, which made no mention of slavery, instructed the Israelites to make slaves of their heathen enemies (Leviticus 25:44-46), sent a Son to save us who never once condemned slavery though he lived in its midst, and inspired Saint Paul to send the slave Onesimus back to Philemon with instructions to be a good, obedient slave to his master.  Armstrong’s work was perhaps the most impactful, but by no means did it represent an isolated view.  My research uncovered source after source that made plain how the slave owners of the ante-bellum South were able to square their support of slavery with their Christianity: they did so by interpreting Christian Scripture as supporting the institution.  Indeed, in some sermons of the day, the case was made that being a good Christian required a commitment to the defense of slavery, because civilized white people had a Christian duty to care for their “savage” African slaves.  In the end, of course, they were so convinced they were right that they were willing to go to war and fight (and die) for it.  (Their cause being a righteous one, the killing of people in support of it met all the requirements for a “Just War” as traditional Christian doctrine expounded it.)

For me, it was an eye-opener to realize that southern Christians based their support of slavery squarely on Christian scripture.  It was also an eye-opener to see how the beliefs and attitudes of the community were shared, both horizontally and vertically.  By horizontally, I mean how family members, neighbors, newspapers, courts, elected representatives, school teachers and preachers all worked together to homogenize the Southern attitude toward slavery.  (It was rare to find a voice of dissent – the conclusion seems compelling that the few dissenters tended to keep their opinions to themselves, for fear of being run out of town, as those considered “unsound on the slavery question” generally were.)  By vertically, I mean how attitudes and beliefs were passed down from one generation to the next, most strongly within immediate families, but also within whole communities and cultures.  My research extended back in time to the racism of our national heroes, Washington and Jefferson, and forward in time through my grandparents, my parents, and –

Indeed.  What about myself?  Historical research proves again and again how, once accepted in a family or community, “wrong” attitudes and beliefs can be passed down so easily from one generation to the next.  Is it possible I could be exempt from such influences?  Somehow free to form my opinions entirely on reason and logic, safe from any familial or cultural biases? All my historical research has led me to conclude that we are most  prone to be blind to the wrongness within that which is most familiar; if that’s true, what are the ramifications for my own attitudes and beliefs?  How much of the racism inherent in my family history manages to cling to my own way of thinking?  I hope none of it, of course, but how likely is it that some of it persists?

I will repeat a quote from  The Gulag Archipelago, which I already mentioned in a prior WMBW post and which I managed to squeeze into Alemeth as well.  Alexander Solzhenitsyn expressed a wish for an easier world:

If only there were evil people somewhere insidiously committing evil deeds and it were necessary only to separate them from the rest of us and destroy them. But the line between good and evil cuts through the heart of every human being. And who’s willing to destroy a piece of his own heart?

I’ll have more to say in later posts about what psychologists call the “bias blind spot.”  For now, suffice it to say that much as I share King’s dream for a day when prejudice will be a thing of the past, I fear that as long as we have families, as long as parents teach their children, as long as such a thing as “culture” exists, we will all have our prejudices.  Many of them, I believe, will have been inherited from our parents and grandparents.  Others from school teachers, preachers, news sources, national heroes, or friends.  A rare few, perhaps, we will have created entirely on our own.  But they will be there.  And others who see this the same way I do have suggested an idea that makes a great deal of sense to me: that to begin the path toward a more just world, we’d do well to begin by trying (as best we can) to identify what our own biases are.

In Alemeth, I have tried to take a step in that direction.  Early in the evolution of the novel, I found myself asking whether it was I who was creating Alemeth, or Alemeth who had created me.  It’s a novel about my family – about the culture that began the process of making me what I am – and it’s not an entirely pretty picture.  But the dream that inspired it, and the research and thought given to the project, are also largely responsible for the existence of something else.  I don’t think I’ll be giving too much away if I give you a hint: the last four words of the novel are “we may be wrong.”

— Joe

WYSIATI

 

Have you ever seen a swan?  If so, how many have you seen?

For four years, my family lived on a pond that we shared with a family of swans.  I saw this one family a lot.  More recently, I’ve seen a few more swans, but given that swans live long, maintain monogamous relationships, and tend to remain in the same habitat, I suspect I’ve been seeing the same swans over and over again. I’d take a wild guess that I’ve seen a total of thirty swans in my life.  You might ask yourself the same question now: how many do you suppose you’ve seen?  (We’ll return to the matter of swans in a moment.)

I’ve been on vacation in Florida, so it’s been a couple of weeks since my last WMBW post.  During the holidays I was able to read a couple of excellent books: one of them, Thinking, Fast and Slow, by the psychologist and Nobel prize winner Daniel Kahneman, asserts that we have two systems in our brains – one designed to produce immediate beliefs, the other to size things up more carefully.  The other, Being Wrong, by journalist Kathryn Schulz, explores both the reasons we err and the emotional costs of recognizing our wrongness.  Both books have done much to clarify my intuitive beliefs about error.  If you suspect this is a case of “confirmation bias” you’re probably right – but at least my confirmation bias gives me a defense to those who say admitting doubt is tantamount to having no beliefs at all.  (I can’t have a bias in favor of a belief unless I have a belief to begin with, right?)

Well, to those who fear that I totter on the brink of nihilism, I assure you I do have beliefs.  And perhaps my strongest belief is that we human beings err – a lot.  Since starting this website, I’ve begun to see people committing errors with truly alarming frequency.  The experience helps me understand witch hunts, as I now see error the way Cotton Mather saw witches and Joe McCarthy saw communists – everywhere.  The difference, I’d submit, is that Cotton Mather never suspected himself of being a witch, and Joe McCarthy never suspected himself of being a communist.  In contrast, I see myself being wrong every day.  In fact, most of the errors I’ve been discovering lately have been my own.

My willingness to admit to chronic wrongness may be partly due to the fact that Schulz devotes much of her book to rehabilitating the reputation of wrongness – pointing out that, far from being the shameful thing most of us consider it to be, wrongness is endemic to who we are and how we think – specifically, to our most common method of rational thought – reasoning by induction.

Consider this diagram:

[Diagram: a white bar passing behind a dark rectangle, its middle hidden from view]

Reasoning by induction, says Schulz, is what causes even a four-year-old child to “correctly” answer the question of what lies behind the dark rectangle. By way of contrast, she says, a computer can’t answer such a puzzle. The reason? A computer is “smart” enough to understand that the dark rectangle may hide an infinite number of things, from a stop sign to a bunny rabbit to a naked picture of Lindsay Lohan. Without inductive reasoning, the computer will have to consider (and reject) an infinite number of possibilities before deciding on an answer. We humans, on the other hand, are much more efficient – we’re able to form nearly instantaneous conclusions, not by considering all the possibilities we don’t see, but by coming up with plausible explanations for what we do see. Even to a four-year-old child, it seems highly probable that the image behind the dark rectangle is the unseen middle of the white bar. It’s certainly plausible, so we immediately adopt it as a belief, without having to exhaust an endless list of other explanations. Inductive reasoning makes us the intelligent, quick-thinking creatures we are.

In his book, Daniel Kahneman calls this WYSIATI. His acronym stands for “What you see is all there is.” Like Schulz, he points out that this is how human beings generally think – by forming plausible beliefs on the basis of the things we see, rather than by tediously rejecting an endless list of things we don’t. And, like Schulz, he gives this sort of thinking credit for a good deal of the power of the human brain.

But there’s a downside, a cost to such efficiency, which brings us back to swans. If you’re like me, you probably believe that swans are white, no?

“Which swans?” you might ask.

“Well,” I might well reply, “all of them.”

I first formed the belief that swans are white after seeing just a handful of them. Once I’d seen a dozen, I’d become pretty sure all of them were white. And by the time I’d seen my thirteenth swan, and my fourteenth, confirmation bias had kicked in, leaving me convinced that my belief in the whiteness of swans was valid. It only took one or two more swans before I was convinced that all swans are white. Schulz says it was the philosopher Karl Popper who asked, “How can I be sure that all swans are white if I myself have seen only a tiny fraction of all the swans that have ever existed?”

Schulz observes that as children, we likely observed someone flipping a light switch only a few times before concluding that flipping switches always turns on lights. After seeing a very small sample – say, a golden retriever, a shih tzu, and Scooby Doo — children have sufficient information to understand the concept of “dog.” We form our beliefs based on very small samples.
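
To put a rough number on how quickly induction firms up a belief, consider Laplace’s old “rule of succession” (my own addition here, not an example from Schulz or Kahneman): after seeing n white swans and nothing else, a simple estimate of the chance that the next swan is white is (n+1)/(n+2).

```python
def prob_next_is_white(n_white: int) -> float:
    """Laplace's rule of succession: after n successes in n trials,
    estimate the chance that the next trial succeeds too."""
    return (n_white + 1) / (n_white + 2)

for n in (3, 12, 30):
    print(f"after {n:2d} white swans: P(next is white) ~ {prob_next_is_white(n):.2f}")
# after  3 white swans: ~0.80
# after 12 white swans: ~0.93
# after 30 white swans: ~0.97
```

Note that this is only the chance that the next swan is white; it says nothing about all swans, which is exactly the gap Popper was pointing at.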

Kahneman describes how and why it’s so common for groups to underestimate how long a project will take: the group imagines all the steps it anticipates, adding up the time each step will take; it factors in a few delays it reasonably foresees, and the time such delays will likely take; and it even builds in an extra cushion, to give itself some wiggle room. But almost invariably, it underestimates the time its project ends up taking, because in fact, the number of things that could cause delays is virtually infinite and, well, you can’t know what you don’t know. In a sense, to use Kahneman’s phrase, you can’t help but feel that “what you see is all there is.”
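
The arithmetic behind that chronic underestimation is simple and brutal. Even if each individual source of delay is unlikely, the chance that none of them strikes shrinks fast as the list of possible delays grows (a toy calculation with invented numbers, not Kahneman’s own):

```python
# Suppose a project faces 40 independent possible delays,
# each with only a 5% chance of occurring.
n_risks = 40
p_each = 0.05

# Probability that at least one delay occurs: 1 - P(no delay at all).
p_some_delay = 1 - (1 - p_each) ** n_risks
print(f"Chance of at least one delay: {p_some_delay:.0%}")  # ~87%
```

And that list of forty, of course, omits the delays nobody thought to put on it, which is WYSIATI in a nutshell.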

Now here’s what I think is a critical point. The way inductive reasoning takes such very small samples and draws global conclusions about them makes sense when worlds are very small. If the child’s world is her own house, it’s probably true that all the wall switches turn on lights – it’s only when she enters factories and astronomical observatories years later that wall switches turn on giant engines and rotate telescopes. Here in Virginia, all swans probably are white; I’ll only see black swans if I go to Australia or South America, which I may never do. There wasn’t much problem thinking of the world as flat until there were ocean voyages and jetliners. Both as individuals, and as a species, we grow up in very small, homogeneous worlds in which our inductive reasoning serves us well.

But the real world is more varied and complex. It’s when we expand our frames of reference – when we encounter peoples, cultures and worlds different from those of our youth – that what we “know to be true” is likeliest to be challenged.  And by that time, we’ve become convinced that we’re right. All experience has proven it. Everyone we know knows the truth of what we know. After all, our very conceptions of self, and of our worth, and our very comfort, depend on our being right about the Truth.

More, later, about the emotions involved when one of these strangers challenges that which we know to be true.

— Joe

Some Thoughts, this Christmas Eve

On the WMBW home page, a brief bio refers to my personal religious leanings as “other — really.”   To elaborate at any length might ruin the Holiday Season for most of you, and it’s Christmas Eve: a voice in my head (that of my late mother, I suspect) urges me to mark the occasion with something appropriate to the season.  So I have a few things to share.

First, I highly recommend David Wong’s wonderful article, “10 Things Christians and Atheists Can (and Must) Agree On.”  It’s from the December 16, 2007 issue of Cracked.  (Along with Mad magazine, Cracked deserves at least some credit or blame for making me the man I am today.)  But while Wong’s article is humorous in many respects, it’s also very much in tune with We May Be Wrong.  Well, sort of.  I mean, actually, since Wong’s 10 Things article has had over 1.8 million views, that’s a bit like saying the ocean’s in tune with the last drop of rain to fall into it. But I hope you catch my drift.  I really wish not just 1.8 million, but 1.8 billion, had read Wong’s article.  In addition to being the sort of article I’d love to publish on this website, it also has a bunch of really cool pictures. Check them out!

http://www.cracked.com/article_15759_10-things-christians-atheists-can-and-must-agree-on.html

Second, that same voice (yes, now I’m sure it belongs to my late mother) tells me that because it’s Christmas eve, I ought to say something about Jesus.   And since I respect Jesus at least as much as I respect David Wong, I’ll post four of my favorite things about Jesus.

1. He is said to have preached that one should love one’s neighbor, and even one’s enemy.

2.  He is said to have preached, “Whosoever shall say, Thou fool, shall be in danger of hell fire.”

3. He apparently instructed his followers not to swear.  As I read it, he didn’t seem to be talking about four-letter words; rather, he seemed to be warning against swearing to the certain truth of anything.  (“Swear not at all: neither by heaven… nor by the earth…because thou canst not make one hair white or black.  But let your communication be, Yea, yea; Nay, nay: for whatsoever is more than these cometh of evil.”)

4. It is said that he repeatedly asked his followers, “Why beholdest thou the mote that is in thy brother’s eye, but perceivest not the beam that is in thine own eye?”

I’d like to think that with that sort of philosophy, if Jesus had been connected to the internet, he might not have disapproved entirely of We May Be Wrong.

My third Christmas offering is a link to a YouTube video of this season’s performances at our (backyard) Friend’s Theater.

I know that my mother would have liked it.  She was an ardent Christian, but she was also a ham.

As for Jesus, I’d like to think he wouldn’t have been offended that we chose Clement Clarke Moore’s poem to perform this season, rather than Luke’s rendition of the Nativity.  As I read the gospels, Jesus comes across as a pretty humble guy who (laughing with us, not at us) might have chuckled at our ineptness — and that’s what I like most about him.

-Joe


Baby, It’s Cold Outside

Surely everyone knows the classic Ray Charles and Betty Carter duet in which Ray is intent on getting Betty to stay at his place for just one more drink, while Betty protests, insisting she can’t.  Hammering away with insistence that “It’s cold outside,” Ray eventually prevails on Betty to stay and enjoy the fire.  Snuggling up to him, happy to be together in harmony, Betty joins Ray in singing the final line, “Ah, but it’s cold outside!”

It’s a great study of persuasion in action – the use of words to produce apparent agreement.  I say “apparent” because – well, no, on second thought, I won’t go there.  The time’s not right to take up the subject of the obstacles words pose for minds that wish to share the same thought.  For today, let’s assume that words mean the same thing for everybody. And let’s use them, like Ray Charles so artfully does, for making a case.

If you’ve been following this website, you know that one of our friends made a suggestion that we include one or more “objective truths that everybody could agree on.”  Daunted by the prospect, I sought help from our readers.  The first to answer the call was my longtime friend Anne Beale.  Picking up where Ray and Betty left off, Anne declared that an objective truth to which everyone could agree was, “It’s cold outside.”

Now, I thought this nomination brilliant.  If you don’t know Anne, she lives in South Dakota, where the average low temperature in December is 5 degrees Fahrenheit, the average high only 25.   As it happens, reading Anne’s comment was the first thing I did after getting up at 6:00 a.m., and I was still dressed in the wool sweater I’d worn under the covers during the night – a wool sweater I’d worn over a night shirt, which I’d worn over a tee shirt, which I’d worn over a tank top.  With the help of these four layers, I’d endured a night of record low temperatures here in Virginia, but with the covers off, I was already shivering as I sat down at my desktop to read Anne’s post.  So I had no choice but to agree with her – it was very cold outside.

Then I read the nomination submitted by another longtime friend, Philip McIntyre. Philip nominated an entire slate of candidates.  His description of his nominees – the physical laws of nature – wasn’t quite as pithy as Anne’s, but (always gracious) Philip pointed out that perhaps his post “built on” Anne’s.  You can read Philip’s comment for yourself, but I’d venture the opinion that Philip actually agreed with Anne regarding her nominee: that it was, in fact, cold outside.  One of Philip’s sentences began, “The cold temperature outside right now is…” which strikes me as coming pretty doggone close to agreement. (Philip, I might point out, lives in Buffalo, where the average low in December is 11, and the high, at 31, is still below freezing.)

Now, at that point, I was surprised, but elated.  As best I could tell (“with three precincts reporting”), there was universal acceptance of an objective truth.  It was, in fact, cold outside.  But then, this morning, as I sat down to record my elation and post “It’s Cold Outside” on the WMBW website’s Home page, I discovered a third nomination.  While the third comment didn’t expressly disagree with Anne – while it wasn’t so contentious, for example, as to say, “Heck no, you fool, it’s hotter ’n blazes, dammit!” – the writer did ask, “Mightn’t the only objective truth be that we do not know what we do not know?”

Definitely food for thought there; I for one was tempted to make a fine breakfast of it, for at least several paragraphs.  But loath to digress, I strove to stay focused on the question at hand – i.e., could everyone agree that “It’s cold outside”?  The new writer’s suggestion that there might be only one objective truth everyone could agree on – and that such uniquely objective truth was neither a physical law of nature nor a statement about the weather – forced me to conclude that the new writer was advancing a position in irreconcilable disagreement with Anne.

I hasten to add that the writer – my brother David – lives in south Georgia, where the average high this time of year is a near-tropical 65.  Well, there you go.  Despite his obvious effort to avoid confrontation with his friends to the north, David, by postulating that he might have put forward the only objective truth, had in a single stroke destroyed our unanimity of belief. (It was easy to see, in that moment, how the Civil War might have started, and as my longtime friend Ron Beuch has now suggested with his comment – even as I write this post – bias can be very hard to shed.)

We May Be Wrong is a truly nascent phenomenon.*  During our first three weeks of existence, our growth has been phenomenal.  We already have a huge number of readers.  (At least thirty, I’d be willing to bet.)  But even with only four of us weighing in on the question, we appeared unable to agree that “It’s cold outside” was an objective truth which everybody could agree to.

Now, saddened as I was at this setback, I turned to Philip’s nominees – the physical laws of nature.  Searching for the sort of harmony Ray Charles had achieved with Betty Carter, I asked myself, is it possible that we four, at least, could all agree to the objective truth of Philip’s nominees?  I mean, perhaps, in South Dakota, “It’s Cold Outside” is a physical law of nature.   And perhaps “We don’t know what we don’t know” is a physical law of nature in south Georgia.  So maybe Philip’s comment deserved a closer look.  Maybe, if Anne and David already considered their nominations to be physical laws of nature, they already agreed with Philip, implicitly, and in that case, if I could see my way clear to agreement, Philip’s nomination would have agreement from all four of us.  (And maybe the other twenty-six of us, like Betty Carter, would eventually come around?)

First, I was a little concerned that Philip hadn’t nominated any one Law of Nature in particular, or even multiple such laws, but simply a category, “Physical Laws of Nature.”  It’s been a long time since I was in school, and if I ever knew, I’ve forgotten just how many physical laws of nature the experts have determined there are.   In fact, I’m left wondering what, exactly, a Physical Law of Nature is.  But as with the obstacles posed by words, I’ll forego the temptation to go down that perilous path.  Assume with me, if you will, that we all share a common understanding of what the Laws of Nature are.

I understand that this assumption is not an easy one to make.  In Philip’s comment, he writes, “The problem is, they [the laws of Nature] are so hard to understand.”  Well, I’d sure agree with that.  Relativity?  The space-time continuum?  Quantum mechanics?  They all elude my full understanding, to be sure, and maybe my partial understanding as well.  In fact, even gravity sometimes mystifies me (and not only when I’ve had too much to drink).  But that’s precisely why I wonder about Philip’s statement that, “properly understood,” the physical laws of nature are constant and immutable.  Having agreed that such laws are very hard to understand, I have great difficulty agreeing with anything about what they are when they’re “properly understood,” because I doubt very much that I properly understand them.

But surely I quibble.  And meanwhile, I’m actually more troubled by a different question.  Philip writes that the physical laws of nature are “constant and immutable” in the sense that they “will produce exactly the same result every time in exactly the same set of circumstances.”  I’ve been up all night (well, much of it, anyway) pondering the significance of that final qualification.

Now, before I continue, I should acknowledge my own biases.  I personally believe in the value of the scientific method.  As I understand it, scientific “proofs” are all about “reliability,” which I believe is the scientific word for what Philip is talking about.  When the scientist keeps extraneous factors under “control,” and can accurately predict the outcome of an experiment time and time again, always getting the same (predictable, identical) result, the scientist is said to have demonstrated “reliability.”   It’s another word for scientific “proof,” as far as I know.  I think there’s much to be said for the scientific method, as a means of learning new things about the physical world.  So if there’s any confirmation bias at work here, I’m pre-wired to agree with what Philip is saying.

But his qualification, “in exactly the same set of circumstances,” nags at me.  Can something be said to be a “law” at all, much less a “constant and immutable” one, if it all depends on an exact set of circumstances?  Isn’t a “law,” by definition, something that operates across circumstances?  There’s a saying in the (legal) law that you can’t have one rule for Monday and another for Tuesday.  It stands for the proposition that for a law to be a law, it has to apply to varied circumstances.  The trooper who issues a speeding ticket says, “I’m sorry, sir, but that’s the law,” by which he is essentially saying, “it doesn’t matter that you’re late for a meeting; the law is the law.  Circumstances don’t matter.”  Believe me, I know that laws often get riddled with exceptions which are essentially driven by variations in circumstance.  Murder?  >> Guilty!  (Oh, self-defense? >> an exception >> innocent.  But murder!  >> Guilty!  Oh, insanity?  >> an exception >> innocent.)   But in the legal world, I’d venture to say, the exceptions are like little “mini-laws” that live within the more general law, running contrary to it in result, but similar to it in form, in that they apply to all the circumstances they purport to include.  Riddled as they are with exceptions, both the general laws themselves and the little “mini-laws” that deal with exceptions are general principles that cut across variations in circumstance.  So I wonder: if every single variation in circumstance had its own special “law,” would there really be any law at all?   With each thing subject to rules applicable only to it, wouldn’t we have anarchy and lawlessness?

David’s nominee, “We do not know what we do not know,” strikes me as a classic tautology, a member of the class of self-evident propositions that also includes “All I can say is what I can say,” “a rose is a rose…” and (importantly) “we do know what we do know.”  As such, rather than being the only objective truth, it seems to be just one of an infinite number of such truths.  At the point at which each unique thing in the world can claim that it is what it is, that it does what it does, etc., it seems plausible to think we might have not objective truth at all, but the very essence of complete subjectivity.

As Philip appears to acknowledge, Anne’s nominee, “It’s cold outside,” seems to result from a constant and immutable set of laws, in the sense of being scientifically predictable, repeatable, and reliable — as long as you remain “in exactly the same set of circumstances”: for people in South Dakota, in the month of December, when there are no forest fires raging for miles around, when the sun is at an oblique angle to the hills around Sioux Falls, when none of the moose are wearing overcoats or carrying space heaters, etc., it will always be cold outside.

Last night I finished the book of David Foster Wallace essays, Both Flesh and Not, in which I read Wallace’s delightful essay, “Twenty-Four Word Notes.”  In that essay, Wallace discusses the class of adjectives he calls “uncomparables,” the first of which is the word “unique.”  Since “unique” means “one of a kind,” he points out that one thing cannot be more “unique” than another; a thing is either unique or it’s not.  Wallace asserts that other uncomparable adjectives include precise, correct, inevitable, and accurate.  “[I]f you really think about them,” he writes, “the core assertions in sentences like, ‘War is becoming increasingly inevitable as Middle East tensions rise,’ [are] nonsense.  If something is inevitable, it is bound to happen; it cannot be bound to happen and then somehow even more bound to happen.”

Philip’s comment uses three key adjectives in describing the physical laws of nature.  He calls them “objective,” “constant” and “immutable.”  I’ll bet that if David Foster Wallace were still alive, he’d agree that “constant” and “immutable” are uncomparables, and perhaps “objective” as well.  If you’re not always constant, then are you really constant at all?  If you’re not always “immutable” – because, on some occasions, you can change – then are you “immutable” at all?  If something is “objective” because it doesn’t depend on one’s individual circumstances, then can it depend on any individual circumstances at all and still be objective?

It seems to me that tautologies comprise an infinitely large class of “truths,” because everything is what it is, everything does what it does, and none of these subjective “truths” has to apply to anything else.  So it strikes me as pertinent to ask: does a truth transcend mere tautology when it applies to anything more than itself?  And if so, once the gap between two discrete, indivisible units is bridged by a “law” that applies to both, is it now a “law of nature” in any meaningful sense?  A constant, immutable, objective truth, because it applies not just to a single set of circumstances, but to a second set as well?  I wonder: to qualify as a constant, immutable “objective truth,” would a law have to apply to only two sets of circumstances, or to ten, or to a hundred?

If the “physical laws of nature” include Einsteinian relativity, then isn’t everything ultimately dependent on point of view (i.e., subjective)?  Well, not the great constant, c, the speed of light, you say?  But as I understand it, the speed of light in a vacuum can never be surpassed provided we’re not talking about dark matter, black holes, or parallel universes, and provided we’ve narrowed our consideration to the post-Big Bang era, which insulates our perspective as surely (it seems to me) as the vast Atlantic Ocean insulated pre-Columbian Europe.  And if scientists admit (as I understand they do) that for times prior to the Big Bang all bets are off, then how is our understanding of physical laws not dependent on our point of view, i.e., subjective?
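(To be fair, the one piece of relativity I can almost follow cuts the other way.  As I understand the textbook rule, and this is my sketch of the standard formula, not anything Philip wrote, velocities don’t simply add: if an observer moving at speed v launches something at speed u, calling the combined speed w, the result is

$$ w = \frac{u + v}{1 + uv/c^2} $$

and if you set u = c, the algebra collapses to

$$ w = \frac{c + v}{1 + cv/c^2} = \frac{c + v}{1 + v/c} = c $$

no matter what v is.  Every point of view, in other words, is guaranteed to measure the same c.  Whether that makes c objective, or merely uniformly subjective, I leave as an open question.)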

So pending a reply from Philip or others, who may yet convince me I’m wrong, I’m not yet prepared to agree that the physical laws of nature can lay claim to being “objective” truth.  The original challenge put to the website was to include not just any old objective truth, but an objective truth everyone could agree to.  Alas, much as I hope for our readership to grow, I fear this website may never appeal to those who live on the other side of the Bang, or in any quadrant of the multiverse, or in the world of dark matter, for that matter.

Oh well.  A day or so ago, when there were only three of us, I was, for however brief a time, able to bask in the comfort of pure harmony, knowing everyone agreed that it’s cold outside.  Today, I’ll close by reporting that it’s a few degrees warmer outside.  And in the game of Hide and Seek in which I fear never coming to know the truth, I think that warmth means I may be getting closer.

_______

* Nascent: “(especially of a process or organization) just coming into existence and beginning to display signs of future potential.” – See https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=nascent

–Joe

Objective Truth, Anyone?

Friends of ours have been making some great suggestions for the website.  One friend – to whom we owe many thanks – made several suggestions that led to changes in the site.  But one of his suggestions we did not incorporate.  That suggestion was: “If you listed some objective truths that everyone could agree to, the website would have a better chance of surviving.”

As website editor, I didn’t adopt this suggestion, since I felt unable to identify an objective truth everyone could agree to — even if the very survival of the website were at stake.

But then another friend came forward with a suggestion of his own.  He opined that if the website’s following is going to continue to grow the way we’d like, it will need a thriving discourse among its readers – a discourse triggered, perhaps, by the founders’ occasional thoughts, but itself becoming the main reason people return to the website. His suggestion was to leave implementation of the first friend’s suggestion up to our readers.  Unsure as I am what the outcome might be, I decided to give our readers that chance.

So, friends: consider this your chance to contribute to the website in this most fundamental way.  Take a moment, if you would, to write a “comment” in reply to this post.  Do you think there are any objective truths that everyone could agree to?  If not, tell us why not.  If so, give us an example.

I’m anxious to see what you come up with, and I fully expect to learn from whatever you choose to share.

— Joe

 

The Answering Machine

I woke this morning to the sad news that Kessler (our daughter’s dog) was being taken to the vet today to be put down.

I went next door, to her house, to say my goodbyes to the dog.  Looking into Kessler’s cloudy eyes, I thought I could see evidence of suffering.  I thought I could see returned affection.  I tried to convey what comfort and love I could, wondering whether I really had any idea what was going through the mind of the dog during these, his last hours – or whether I was projecting my own human thoughts onto him – and whether what I was seeing was merely a reflection of myself.

I also observed the way the three grandchildren were dealing with the situation.  “But, Mom, I don’t want Kessler to die.”  Death is something they’ve never really encountered before.  They’ve seen road kill; they’ve even experienced the death of some of their chickens; but that’s not the same thing.  Kessler is older than any of the kids; he’s been their pet their entire lives; in a few hours, he’ll be gone; they are looking at mortality with new eyes today – experiencing the process, not just the aftermath.

Returning home, I found myself wondering about the difference between adults and children of elementary school age.  These children encounter new things every day. Their lives consist almost entirely of new experiences.  At this point, they seem to take it for granted that they have an awful lot yet to learn.  For them, not understanding things is the normal state of mind.  And I wondered – what happens to us, as we become adults?  How do we lose our childhood sense of wonder, our belief that the world is full of things we do not understand?  How do we come to think that, because we are adults, we now understand so much that we can be sure of ourselves?

As I came back into our house, I noticed a message waiting on our answering machine.  (Yes, Karen and I have landlines, not “smart” phones.)  Picking up the receiver, I listened to a delightful message that Karen had saved.  It had been left three days ago by Jackson, our five-year-old grandson.  The reason she’d saved it was that she is a grandmother and the recording was cute: having made only five or six phone calls in his life, Jackson had obviously never encountered an answering machine before.

“We’re not available right now,” said my recorded voice.  “Please leave a message.”

“Poppi?” said Jackson, recognizing my voice.  “This is Jackson.”

A long pause as he awaited a reply that never came.

“Hello.  This is Jackson.  Hello?”

Another pause; the barely audible voice of his mother (coaching him) from the background; her words indecipherable.

“This is Jackson,” he said at last.  “Call me.”

More indecipherable coaching from the background.

“This is Jackson,” he said again.  “Call me.  Goodbye.”

I pictured the scene in my mind, imagining the thoughts that had gone through Jackson’s head three days ago, during his first encounter with an answering machine.  I’d already been trying to remember what it was like to discover new things all the time, and I couldn’t have asked for a better reminder.  I recalled my own fascination with telephones – maybe 1958 – back in the day when we picked up the receiver and waited for a human being to ask us, “Number please?”  Remembering how we’d tell her (yes, always a her) the number we wanted her to connect us to.  How she would magically connect us to others, across vast distances.  We didn’t understand how it all worked, of course, but then, we didn’t expect to.  The world was full of things we didn’t understand.

I was still reminiscing when the door opened and in walked Jackson, in person this time.  Well, me being an adult and him being a five-year-old, I couldn’t pass up the chance to teach him something about life, by which of course I meant life as it really is.  (You know – I wanted to help him along on his path to an adult world in which he would understand just about everything I understood – even old-fashioned answering machines.)  So I decided it was time for Jack to have his second encounter with an answering machine.

“Hey Jack, buddy, come over here.  I’ve got something I’d like you to listen to.”  I picked up the telephone, dialed *86, and pressed 1 to retrieve our messages.  There was only one saved message – Jackson’s.  I put the phone to his ear so that, hearing his own voice, he could learn about answering machines.

There was a confused look on Jack’s face.  I was sure it was because he was perplexed by the sound of his own voice.  But when I moved my head closer in order to hear what Jackson was hearing, I could hear a robotic, clipped adult male voice that (at least from my perspective) sounded nothing like me.

“Voice message received at 2:31 p.m.,” said the robotic recording.  “December 6th.  From (804) 551…”

“Poppi?” asked the living Jackson in front of me, mistaking the robotic voice for mine.  “This is Jackson.” This time, instead of listening to a three day old recording of Jackson, I was looking into his eyes as he spoke.

“Poppi?” came the reply.  We both heard Jackson’s recorded voice – from three days ago – at the same time.  I waited for him to realize it was his own voice he was listening to.  “This is Jackson,” said the recording.

“Who is this?” asked the real Jackson, standing in front of me – and again, he got an answer.

“Hello?” came the voice from the phone.  “This is Jackson.  Hello?”

“Hello,” replied the little boy in front of me.  “What do you want?”

“This is Jackson,” said the recorded voice.  “Call me.”

The living boy in front of me searched his five-year-old brain for a sensible answer, but found none.  There was a long pause; some muffled whispering in the background could be heard over the phone.  Finally, Jackson’s recorded voice broke the silence:

“This is Jackson,” it repeated.  “Call me.  Goodbye.”

Without missing a beat, the living boy in front of me politely replied, “Goodbye,” and handed the receiver back to me, obviously very confused.  It took me a long time to stop laughing.  When I’d mustered sufficient composure, I informed Jack that, thanks to the miracle of the answering machine, the voice he’d been hearing was his own, and that he’d been having a conversation with himself.  His face lit up as my words sank in; one of the biggest smiles I’ve ever seen took over, and he started laughing with me.  We laughed together for a long time.

It’s a true story, really – at least, it’s the truth as I perceive it.  The story of Jackson and the answering machine is my story; I have no idea how Jackson perceives it; no idea how the story would play out, if he told it, from his perspective.  I wish that, somehow, I could climb into the mind of another person – or even a dog – and see whether we’re perceiving the same reality.  But, sadly, I don’t seem to know how to do that.  Sometimes it seems that what I take to be reality is really nothing other than the playing back of a recording — a recording I don’t recognize, but which, in fact, is only my own voice, repeating things I’ve already thought, and said, and have come to believe I fully understand, and nothing more.

But it’s been a good day, all things considered.  True, Jackson and I and the rest of the family are all sad about the passing of our dear friend, Kessler.  But as I think about Jackson and the answering machine, as I’m amused anew by his innocence and joyful anew at his discovery, I’m especially taken with the way he was able to laugh at his own folly.  I wish I could learn to do that more often myself.

— Joe