News, Thoughts and Opinions

My Kid Brother

It’s been ten days now since my kid brother died.  Ten days to digest what has happened.  Ten days to decide what to say about it here, in this blog.

I know that, because this post only reflects my thoughts as they stand today, it, too, will pass, just as my kid brother did.  In some future year, if I come back and read what I’m now writing from that future perspective, these thoughts will likely seem outdated.  I’ll look back and think how foolish I was, how I missed what was really important.  I’ll be embarrassed at how short-sighted I was.

Be that as it may, I do have thoughts, and in my present grief, those thoughts seem worth sharing.

I was six when he was born.  Old enough that I remember our mother sitting on a toilet, her water having broken, telling me she was about to have a baby.  I was wrong, at that point, to conclude that babies come from bowel movements.  But time passes.  We learn new things.  We gain new perspectives.

At his memorial service last Friday, I read excerpts from letters written when we were little – letters our mother had written to her own mother.  Because she wrote a letter every few weeks, the collection of letters makes for a detailed chronology of our childhood.  In many ways, my kid brother and I lived privileged lives, but in other ways, we had it pretty tough.  There was a constant absence of parents, there was alcoholism, there was neglect, there were beatings, and he got his share of all of them.

Perhaps as a result, my brother was a problem child.  When he was not yet two years old, Mom wrote to her mother that he was “the bane of [her] existence.”  By the time he was five, she had written that he was a “little menace,” that he had wrecked our sister’s doll house, that he’d bitten other kids at school, that “we run when we see him coming,” and that the other kids in the carpool were scared to death of him.  As years passed, new elements were added to her description of the problem child.  He was accident prone.  Stitches.  Burns.  Broken bones.  He was in trouble at school.  He cut classes.  His grades were suffering.  In high school, he was voted “most accident prone.”

Our mother’s letters were relatively complimentary of me.  From a very young age, I was told it was my job to set an example for my brother.  I tried my best to do so.  Even as a youngster, I saw it as my role to help him see right from wrong.  Later in life, I was his landlord twice.  When he dropped out of college, I hired him for his first job.  And when, in his forties, his life began to fall apart, I found myself still trying to straighten him out.  If he had a hard time making good decisions, it was clearly my job to help him.

His death at age sixty-one wasn’t a total surprise.  His bad choices played a big role – perhaps the biggest role – in his early departure.  And because of that, the anger I felt was as much anger at him as it was for him.  Why hadn’t he listened to me and all the others who were telling him to change his lifestyle, to start eating better, to get some exercise, to stop smoking, to stop drinking, to stop his life of physical abuse to himself, to stay on the right side of the law?  For a while, anger at his shortcomings dominated all my feelings about what had happened.

Then, from the idea that I could have done more to arrest his downward spiral, came guilt.  Heck – if his fun-loving lifestyle was responsible for what had happened, how much had I contributed to it?  How often had I laughed at the stories he told of his joy-rides and escapades?  How often had I repeated those stories, with outer dismay but obvious inner admiration?  How often had I smiled and looked the other way, merely shaking my head as he went back to the bar and bought another drink, or drew in another lungful of smoke?

In the ten days since my brother passed, there’s been an outpouring of affection for him such as I’ve never seen.  Hundreds of people have mentioned how his fun-loving nature had given them joy again and again and again.  At first, I shook my head in further disapproval, thinking they didn’t know him well enough.  Couldn’t they see that the very joie-de-vivre they were applauding was what had killed him?

After a few days, I realized that my brother’s irrepressible pursuit of happiness had brought lots of happiness to others, myself included.  I started to realize there was a good side to him.  That if I were going to be angry at his self-destruction, I had also to celebrate the positive side of his personality, the side of my affable, happy-go-lucky, friendly bon vivant that had brought so much joy to so many.

Now, today, yet another realization.  My kid brother’s personality grew from the same soup as mine, the same familial soup as all six of us who grew up as siblings together.  Two of us had severe learning disabilities.  They, too, always needed help from the rest of us.  They, too, finally passed, after a lifetime of dependence.  Two more – the two who (with me) remain in this world of the living – turned out to be philosophic sorts.  Good people.  Deeply religious.  Like me, they have become serious, thoughtful people who see themselves as caretakers, responsible for the lives of others.

And much as I applaud this aspect of their lives, I realized, this morning, that with my kid brother’s recent passing, I’d lost the one sibling who could be counted on to make me laugh.  He was the guy with whom I’d shot pool, played Stratego, chess, Space Invaders and poker, gone camping, whitewater rafting and skinny-dipping, drunk whiskey, conceived practical jokes, gone on vacations, exchanged stories, engaged in word-play – he was the guy I’d spoken with by telephone for an hour a day for the past ten months, laughing at his jokes and stories even as I cringed at his late-life struggles.

And today, as I face the rest of my life without him, I realize what his loss means to me personally.  For much as I love and respect my remaining siblings, my wife, and my children, they, like me, are responsible people.  They, like me, are usually serious people.  As the anger and guilt of my grief turns, today, to a depression I hope will be short-lived, I realize that I’ve lost the longest running and most reliable source of laughter in my life – a laughter and joie de vivre I could always (until now) count on recurring, even in the worst of times.  I’ve lost my main “laughter fix.”

In a sense, we had become co-dependent.   Just as his joie de vivre had all but extinguished his sense of “responsibility,” my sense of responsibility had all but extinguished my joie de vivre.  If he depended on me as a voice of reason, I depended on him for laughter.  And, today, I’m terribly afraid of a future without that laughter.  I feel like I’m the addicted one, and that I now face my own withdrawal.

It makes me wonder:  Had my very responsibility – the side of me that became his employer, his landlord, the guy who lent him money, who bailed him out of foreclosure – facilitated his lack of it?  And had his joking, his sprees, the joy-rides I could always count on him to supply when I needed my own escape from responsibility, had they facilitated the lack of laughter in my own life, when he wasn’t around to provide it?

I don’t know.  And I apologize if I’ve gone on too long about my kid brother and me.  But I think it raises a larger question.

If, early in life, we grow accustomed to seeing one aspect of a person or thing, does that make it harder for us, later in life, to see other sides of that same thing?

Most things on earth are a combination of good and bad.  Rain spoils parades but feeds the grass.  Sunshine makes for great picnics and cancerous skin.  Fires destroy, yet replenish the soil.  Aren’t all things multi-faceted?  If we see that the barn has two great big doors, does it become harder for us to remember the side that lacks them?  If we view a nation, or a religion, or a political party or platform, from one perspective – in a good light, or a bad one – does it become harder for us to see the other side: the harm it causes, or the good things it does and the values it stands for?

When I was in school, I was president of a debating club.  There were always two sides to the debate.  We had to argue either for or against the proposition.  When I practiced law, there were generally two sides, and once again, we had to argue for, or against, the proposition.  Everything in life, it seems, leads us to be for certain things and against others.  We go about arming ourselves with the evidence that our adopted side is right and the other side wrong.  We come to believe that every barn has two doors – or that it doesn’t.  No team in the debate is assigned to argue that both views of the barn are true, depending on your point of view.

For sixty-one years, I viewed my kid brother as dependent on me and others for the sense of responsibility we knew he needed.  It didn’t dawn on me, until today, how much I’ve depended on him for laughter and fun I never realized I needed.

Thank you, bro, for that important life lesson.  I wish you the best, and I hope I learn as much from you tomorrow.



Happy Halloween

Since we planned to be out of town for Halloween this year, we produced our annual Haunted House last night, a bit before official trick-or-treat night. “We” means myself and my volunteer crew, of which, this year, there were thirteen members.  What an appropriate number for a Haunted House!

As usual, it took several weeks back in September for me to get psyched.  First, I had to stop thinking about my other projects.  I had to come up with a theme, decide on characters, scenes, and devices, and develop a story line in my mind, imagining the experience our visitors would have, before I could nail the first nail.  As I created the structure that defined the maze-like path to be followed, as I shot each staple into the black plastic walls intended to keep visitors’ footsteps and focus in the right direction, as I adjusted the angle and dimness of each tea light to reveal only what I wanted to reveal, eventually the construction of the house drew me into the scenes and characters I was imagining.  And as usual, now that “the day after” has arrived, I’ve awoken before the sun rises, my mind crawling with memories of last night’s screams and laughter.  I try to go back to much-needed sleep, but the thoughts of next year’s possibilities get in the way.  It’s the same old story.  Once my mind gets psyched for the Haunted House, it starts to wear a groove in a horrific path; now, it will take something powerful to lift it out of that groove.

I wish I’d done more theater in my life.  I suppose some of my elaborate practical jokes might lay claim to theater.  I’ve even tried my hand at a few crude movies of the narrative, “artsy” sort.  But mostly, it’s been novels and haunted houses.  I suppose I’ve wanted to tell stories with pictures and words ever since I was a kid.  It’s how I’ve always imagined who I am.

In my efforts to be a better writer, I’ve read much on the craft of writing, from popular books like Stephen King’s On Writing to academic tomes like Mieke Bal’s Narratology.  But among the ghouls and monsters on my mind this dark morning comes the memory of a book on writing I read a few years back, one by Lisa Cron called Wired for Story.  That book makes the point that human brains have evolved to give us a highly developed capacity – indeed, a need – to think in terms of stories, and that we’re now hard-wired to do so.

The opening words of Ms. Cron’s book set the neurological stage:

“In the second it takes you to read this sentence, your senses are showering you with over 11,000,000 pieces of information.  Your conscious mind is capable of registering about forty of them.  And when it comes to actually paying attention?  On a good day, you can process seven bits of data at a time…”

Cron’s book goes on to describe how the very success of our species depends on our capacity to translate overwhelming experience into simple stories.  I don’t know the source, or even if it’s true – maybe from The Agony and the Ecstasy? — but Michelangelo is said to have observed that when he sculpted, he didn’t create art, he just removed everything that wasn’t art.  In my own writing, I’ve come to realize how true that is.  Research produces so many pieces of data, and because I find it fascinating, my temptation is to share it all with my readers.  But thorough research is a little bit like real life, which is to say, like Cron’s 11,000,000 pieces of information.  That much information simply doesn’t make a story, any more than the slab of marble Michelangelo started with makes art.

Our brains are not wired to deal with such overloads, but to ensure our survival, which they do by “imagining” ourselves in hypothetical situations, scoping out what “might” happen to us if we eat the apple, smell the flower, or step in front of the oncoming bus.  Every memory we have is similarly a story – not a photographic reproduction of reality, but an over-simplified construct designed to make sense of our experience.  Think of what you were doing a minute before you started reading this post.  What do you remember?  Certainly not every smell, every sound, every thought that crossed your mind, every pixel of your peripheral vision.  What you remember of that moment is a microcosm of what you remember about your entire life.  Sure, you can remember what you were doing September 11, 2001, but how many details of your own life on that infamous day could you recall, if you devoted every second of tomorrow to the task?  And that was a very memorable day.  What do you recall of  September 11, 1997?  Chances are you have no idea of the details of your experience that day.  The fact is, we don’t remember 99.99% of our lives.  All we remember are the pieces of the narrative stories we tell ourselves about who we are, which is to say, what our experiences have been.

The same holds true about our thoughts of the future.  As we drive down the road, we don’t forecast whether the next vehicle we pass will be a blue Toyota or a green Chevy.  We do, however, forecast whether our boss will be angry when we ask for a raise, or whatever might happen that’s important to us when we arrive at our destination (which is, usually, a function of why we’re going there).  Whether we’re thinking about the past, the present, or the future, we see ourselves as the protagonist in a narrative story defined by the very narrow story-view we’ve shaped to date, which includes our developing notions of what’s important to us.  Our proficiency at doing this is what has helped us flourish as a species.  This is why photographers tend to see more sources of light in the world, and painters more color, while novelists see more words and doctors see more symptoms of illness.  The more entrenched we are in who we’ve become, the more distinctive our way of perceiving reality becomes.

Understanding ourselves as hard-wired for dealing with simple, limited stories rather than the totality of our actual experience – not to mention the totality of universal experience – has important ramifications for self-awareness.  As the psychologist Jean Piaget taught us, from our earliest years, we take our experiences and form conclusions about the patterns they appear to represent.  As long as new experiences are consistent with these constructs, we continue interpreting the world on the basis of them.  When a new experience presents itself that may not fit neatly into the pattern, we either reject it or (often with some angst) we begrudgingly modify our construct of reality to incorporate it.  From that point forward, we continue to interpret new experiences in accordance with our existing constructs, seeing them as consistent with our understanding of “reality” (as previously decided-on) whenever we can make it fit.

And so, from earliest childhood, we form notions of reality based on personal experience.  The results are the stories we tell of ourselves and of our worlds, stories which have a past and which continue to unfold before us.  As Cron points out, we are the protagonists in these stories.  And I’d like to make an additional point: that in the stories we tell ourselves, we are sometimes the heroes.  We are sometimes the victims.  But unless we are psychopathic, we are rarely, if ever, the villains.

There are, of course, plenty of villains in these stories, but the villains are always other people.  In your story, maybe the villains are big business, or big government; evil Nazis or evil communists; aggressive religious zealots, cold-blooded, soul-less atheists, or even Satan himself. It could be your heartless neighbor who lets his dog keep you up all night long with its barking, or the unfeeling cop who just gave you that unjust speeding ticket.

As you think of the current chapter of your life story, who are the biggest villains?  And are you one of them?  I doubt it.  But I suggest asking ourselves, what are the stories the villains tell about themselves?  What is it that makes them see themselves as the heroes of their stories, or the victims?  Isn’t it reasonable to assume that their stories make as much sense to them as our stories make to us?

We have formed our ideas about reality based on our own experiences, because they make sense to us.  Indeed, our stories make sense to us because they are the only way we can get our minds around a reality that’s throwing 11,000,000 pieces of information at us every second of our waking lives.  We live in a reality of mountain ranges, full of granite and marble.  Michelangelo finds meaning in it by chipping away everything that isn’t The Pieta, Auguste Rodin by chipping away everything that isn’t The Thinker.  When they find meaning in such small samples of worldwide rock, is it any wonder they see reality differently?

Psychologists tell us that self-esteem is important to mental health, so it’s no wonder that in the stories we tell ourselves, we are the heroes on good days, the victims on bad ones, and the villains only every third leap year or so.  Others are the normal villains.  But if I’m your villain, and you’re mine, then we can’t both be right – or can we?  An objective observer would say that your story makes excellent sense to you, for the same reasons my story makes excellent sense to me.  Both are grounded in experience, and your experience is quite different from mine.  Even more importantly, I think, your “story” represents about 7/11,000,000th of your life experiences while my story represents about 7/11,000,000th of mine.

But confirmation bias means that we fight like heck to conform new experience to our pre-existing stories.  If a new experience doesn’t demand a complete re-write, we’ll find a way to fit it in.  It’s like we’re watching a movie in a theater.  If some prankster projectionist has spliced in a scene from another movie, the whole story we’re watching makes no sense to us and sometimes we want to start over, from the beginning.    If our stories are wrong, our entire understanding of who we are and how we fit in becomes a heap of tangled film on the projection room floor.

One of the things I love about Halloween is how it lets us imagine ourselves as something different.  I mean, Christmas puts our focus on Jesus or Santa Claus, role models to emulate, but their larger-than-life accomplishments and abilities are distinctly other than the selves we know.  Mother’s Day and Valentine’s Day encourage us to focus on other people.  Halloween is the one holiday that encourages us to pretend to be something we’re not – to put aside our existing views of the world “as it really is” and become whatever our wildest imaginations might see us as.  I think that’s why I like it so much.  Obviously, I’m not really a vampire ghoul from Transylvania, but when my current worldview is based on a tiny 7/11,000,000th slice of my own personal experience, how much less accurate can that new self-image be?

I think of “intelligence” as the ability to see things from multiple points of view.  The most pig-headed dullards I know are those who seem so stuck in their convictions that they can’t even imagine the world as I or others see it.  I tend to think that absent the ability to see things from multiple points of view, we’d have no basis for making comparisons, no basis for preferences, no basis for judgment, and therefore, no basis for wisdom.

Halloween is the one time of year I really get to celebrate my imagination, to change my story from one in which I’m hero or victim to one in which I’m a villain.  As I try to see things from a weird, offbeat, or even seemingly evil point of view, I get practice in trying to see things as others see them.  For me, it seems a very healthy habit to cultivate.

But I must end on a note of caution.  As someone who tries to tell stories capable of captivating an audience, I am keenly aware of a conflict.  As the dramatist, my goal is to channel your experience, your thoughts, your attention, along a path I’ve staked out, to an end I have in mind.  When I’m successful, I create the groove.  My audience follows it.  In this respect, good story-telling, when directed toward others, is a form of mind control.

But what about story-telling to oneself?  It’s probably good news that in real life, there isn’t just one Stephen King or Tom Clancy trying to capture your attention or lead you to some predetermined goal.  Every book, movie, TV commercial, internet pop-up ad, billboard, preacher, politician, news reporter, self-help guru and next door neighbor has a story to tell, and wants you to follow it.  The blessing of being exposed to 11,000,000 pieces of information every second is that we’re not in thrall to a single voice trying to control the way we see the world.  But does this mean we’re free?  The reduction of the world’s complexity into a single world-view is a story that IS told by a single voice — our own.  All of our individual experiences to date have been shaped by our brains into a story, a story in which we are the heroes and victims.  The most powerful things that seek to control our views of the world are those stories.  We’ve been telling ourselves one since the day we began to experience reality.  My own?  Since early childhood, I have seen myself as a story-teller.  Since September, the imminence of Halloween has forced me, almost unwillingly at first, to focus on my annual Haunted House.  At first, it was hard.  But in just a few weeks, the themes, characters, scenes, and devices of this story took such a hold on me that I woke up this morning unable to think of anything else.

Such are the pathways of our minds.  If my thoughts can be so channeled in just a few weeks, how deep are the grooves I’ve been cutting for over sixty years?  Am I really free to change the story of my life, or am I the helpless victim of the story I’ve been telling?

This week, try imagining yourself as something very different.  Something you’d normally find very weird, maybe even distasteful.  But remember – don’t imagine yourself the villain.  Imagine yourself, in this new role, as part hero, part victim. Get outside your prior self, and have a Happy Halloween.

— Joe


Knowing Right from Wrong

Two items I heard on the radio yesterday struck me as worthy of comment.

First was the news of Sunday night’s tragedy in Las Vegas.  Questions of motive apparently loom large.  President Trump first called the shooter “pure evil.”  Now he’s saying the shooter was “very, very sick.”

I also heard yesterday that the Supreme Court would soon be deciding a case involving a woman sent back to jail because she tested positive on a drug test, in violation of the terms of her parole.  Her lawyer is apparently arguing that the action amounts to re-incarceration due to a “disease” (addiction), and is therefore unconstitutional.  My own reaction is that the woman wasn’t incarcerated for having an illness (her addiction) but for something she did (use drugs, and test positive on a drug test).  But the fact that the woman’s conduct arguably sprang from her illness/addiction leads me to compare her to the Vegas shooter.  Ultimately, the question becomes whether an offense that results from “sickness” is excusable, and whether it can be distinguished from an offense that results from something else, something that is not some sort of sickness – “evil,” perhaps.  If so, then all we have to do is figure out the difference between evil and sickness.

While I’m at it, allow me to throw in the killing of Osama Bin Laden, just to round out the analytical field.  By the killing of Osama Bin Laden, I mean both the killing he ordered and the killing that finally brought him down.  Premeditated.  Innocent lives lost in the process.  Evil?  Justifiable?  Sickness?  Other?

There’s nothing particularly new about such questions.  They take us back to the legal requirements for justifiable homicide.  To the religious doctrine of the just war.  To the philosophical question of whether an end ever justifies a means.  To the debate over determinism and free will.  All these issues have defied resolution for centuries.  I have my opinions, but instead of advancing them here, I’d like to use them as the background for raising two other matters that have been on my mind.

The first, I’ll call the question of knowledge.  When I studied Latin in school, I learned the distinction between two Latin verbs, cognoscere and scire.  When I studied French, I encountered the same difference between two French verbs, connaître and savoir, which evolved from the Latin.  All four verbs are translated into English as “to know.”  But in both Latin and French, a distinction is observed between knowing in the sense of being somewhat familiar with something, and knowing in the sense of being aware of a fact or a field of knowledge, authoritatively, or with certainty.  In Latin and French, if you want to say you “know” your neighbor, you use the word cognoscere or connaître, because you really only mean to say you’re somewhat familiar with her.  But if you want to say that you know your own name, or where you live, or the words of the Gettysburg Address, you use scire or savoir, to assert that you have essentially complete and authoritative knowledge of the subject.

These two types of knowledge seem rather different from each other.  For many years, I thought it a shame that the English word “to know” gets used to cover both types.  I thought it important to distinguish between those situations in which we really know something and those in which we simply have a passing familiarity, and I found English lacking due to its failure to make that distinction.  But now, I think differently.  Now, I question whether we really know anything with certainty.  If we can’t see all four sides of a barn simultaneously, how can we say we “know” the barn, as opposed to being familiar with just one aspect of it?  Is the most we can ever say about anything that we are somewhat familiar with it?  If there really is just the one sort of knowledge, then maybe we’re right to have just one word for it.  Maybe the Romans and French were wrong to think both types of knowledge possible.

Meanwhile, what do we mean by right and wrong?  Mostly, I’ve been thinking about politics in this regard, not drug use or homicide.  I’ve been wondering whether terms like right and wrong should be abandoned altogether when it comes to politics.  I mean, every political issue I can think of seems to me to be more easily analyzed in terms of what (if any) group benefits, versus what (if any) group gets hurt.   Is it more accurate to say that a policy or practice is “right” when viewed from one group’s perspective, and “wrong” when viewed from another?

Take, for example, immigration reform.  You might argue that tightening controls favors those who already live in a country, and disfavors those who want to enter it.  Assuming that’s true, would that make the tightening right, or wrong?  Doesn’t it depend on whose perspective you’re adopting?

Arguably, capital punishment hurts convicted murderers while benefiting taxpayers who would otherwise bear the costs associated with long prison terms.  We can argue about deterrence, and whether capital punishment deters future criminals and therefore benefits potential future victims.  But what does it mean to argue that capital punishment is “right” or “wrong”?  The simplistic precept “It is wrong to kill” either condemns all killing, including the killing of Osama Bin Laden,  or it provides no answer at all because the real question is when it is right to kill and when it isn’t.  I have the same question about higher taxes, about the Affordable Health Care Act, about environmental regulations, and about every other political issue I can think of.  “Right” and “wrong” seem too absolute to be helpful in understanding complex tradeoffs which may well benefit some groups while hurting others.

I can follow a discussion pretty well when it’s phrased as a discussion of what groups will arguably benefit by some policy or proposal, and what groups (if any) will be hurt.  But I have difficulty when the same debate is phrased in terms of what’s “wise” or what’s “sound policy,” because it seems to me always to come back to “wise for whom?”  Immigration reform might be good for the American economy, but is it good for the rest of the world?  Obamacare may benefit those who have preexisting conditions or are poor and unhealthy, but not those who are healthy or wealthy.  Is a Pennsylvania law “wise” if it helps Pennsylvanians but hurts New Yorkers?  Is an American policy “wise” because it helps Americans, even if it hurts Russians, Filipinos, or Cubans?

It may help us express how disturbed we are by the shooting in Las Vegas if we call it “pure evil,” but I don’t see it that way.  (Frankly, I don’t know what “pure evil” means.)  Rather, it seems to me we all have personal points of view, which is to say, minds that tell us stories.  In those stories, we ourselves are often the unappreciated heroes.  In some other stories, we may be the victims.  But in how many stories are we purveyors of unadulterated wrong?  I believe that the Vegas shooter told himself a story in which he was a hero, or a victim, or both.  And if we do things because they make sense to us, in the context of the people, values, religion or nation with which we identify, and in the context of the stories we see ourselves acting in, then do we have anything more than a subjective point of view, a limited perspective incapable of assessing a more objective or universal wisdom about right and wrong?  I think we all suffer from genuine mental impairments – if not anything as egregious as sociopathic aggression or drug addiction, then more common ailments like self-interest, self-delusion, arrogance, bad habit, confirmation bias or simply poor judgment resulting from our fallibility.  At best, we have a passing familiarity with right and wrong, not authoritative knowledge of it.  At worst, we are all sick, and so occupy ground not entirely unlike that of the Las Vegas shooter or the drug addict.

Maybe it’s time to stop the litmus test of good versus evil.  To recognize instead that what benefits one person may hurt another.  That when our government incarcerates an addict, storms a deranged mass shooter’s hotel room, or takes the life of a militant dictator, we are not making God-like moral judgements that one person is “good” and another “pure evil,” but simply making practical tradeoffs to protect certain interests at the expense of others.  And maybe, in the next political discussion we have, it’ll prove helpful to stop talking about who and what are wrong, but who will likely benefit and who be hurt.

My hunch is that the Vegas shooter saw something as pure evil – and that whatever it was, it wasn’t himself.  His idea of evil was likely different from ours.  Indeed, he may have considered us examples of pure evil.  We’re wired to think we’re somehow different from him; that, unlike him, we know the difference between right and wrong.  At times like these, in the face of senseless atrocity, it’s easy to feel that way, to see a fundamental difference between him and us:  After all, we say smugly, we would never indiscriminately kill scores of people.

But we killed over six hundred thousand in our civil war.  We killed a hundred thousand at Hiroshima. We’ve killed in Vietnam, Iraq, and Afghanistan.  In a few weeks, when the Vegas shootings are no longer front page news, we’ll be calling each other stupid, or evil, or just plain wrong, as if we have nothing in common with the Vegas shooter.  As if we have the unerring ability to identify what’s right and wrong, and to do so with the full understanding the Romans and French called scire and savoir.

Different as we may be in other respects, I say we all suffer from that disease.

Families of victims in Vegas, you’re in our thoughts and prayers.

– Joe


Top Ten Blunders – Your Nominations

A month ago, I asked for your thoughts about the greatest blunders of all time.  I was thinking of blunders from long ago, especially “a list that considers only past human blunders, removed from the passions of the present day.”  I observed, “My special interest lies in blunders where large numbers of people… have believed that things are one way, where the passage of time has proven otherwise.  I believe such a list might help remind us of our own fallibility, as a species…”

I got only five nominations.  (I imagine the rest of you are simply reluctant to nominate your own blunders.  But hey.  All of us have done things we’d rather our children not hear about.)  As for those of you who did nominate, I’m grateful, even if your nominations do imply that blame lies elsewhere than ourselves.  The five I received are certainly food for thought.

One was, “Founding Fathers missed huge by not imposing term limits.”  According to a recent Rasmussen opinion poll, 74% of Americans now favor term limits, with only 13% opposed.*  One could argue the jury is in: the verdict being that the founding fathers should have imposed term limits.  That said, with the average length of service in the U.S. House being 13.4 years, we routinely elect our representatives to seven or more consecutive terms.  And Michigan voters have sent John Dingell back to Congress for over fifty-seven years, even longer than his father’s decades of service before him.  Do they feel differently about term limits in Michigan?  If the founding fathers’ failure to impose term limits was a great blunder, don’t the American voters make a far greater blunder every two years when they send these perennial office holders back to Washington, year after year?  I mean, it’s at least arguable that the Founding Fathers were right in failing to impose term limits.  But who can deny the hypocrisy when an electorate that favors term limits (that means us, folks) does what they themselves would prohibit?  Millions of Americans today are either wrong in favoring term limits, or wrong in re-electing the same Congressmen over and over again – and surely wrong in doing both simultaneously.  At least if measured by the number of people involved, the blunder we commit today strikes me as greater than that committed by a handful of wigged men in 1789.

A second nomination: “Y2K has to be in the top 20?”  That one sure brings a smile to my face.  You remember the “experts’” predictions of the global catastrophe we’d see when all those computers couldn’t handle years starting with anything but a 1 and a 9.  Then, when the time came, nothing happened.  I don’t know of a single problem caused by Y2K.  If judged by the certainty of the so-called experts, and the size of the gap between the predicted calamity and what actually transpired, Y2K clearly deserves recognition.

But compare Y2K to other predictions of doom.  There can be no predicted calamity greater than the end of existence itself.  Wikipedia’s article, “List of Dates Predicted for Apocalyptic Events,” includes 152 dates that have been predicted for the end of the world.  And they haven’t been limited to freakish fringes of society.  Standouts include Pope Sylvester II’s prediction that the world would end on January 1, 1000, Pope Innocent III’s that it would end 666 years after the rise of Islam, Martin Luther’s prediction that it would end no later than 1600, and Christopher Columbus’s that it would end in 1501.  (When that year ended successfully, he revised his prediction to 1658, long after he’d be dead; he apparently didn’t want to be embarrassed again.)  Cotton Mather’s prediction of 1697 had to be amended twice.  Jim Jones predicted the end in 1967 and Charles Manson in 1969.  My favorite on Wikipedia’s list dates from May 19, 1780, when “a combination of smoke from forest fires, a thick fog, and cloud cover” was taken by members of the Connecticut General Assembly as a sign that the end times had arrived.  (It’s my favorite because it may help explain why the founding fathers saw no need for term limits.)  But fully half of the Wikipedia list consists of predictions made since 1900.  Over twenty-five have been made since the Y2K blunder.  The recent predictions include one from a former Presidential candidate (Pat Robertson), who predicted the world would end in 2007.  And though not yet included by Wikipedia, last month’s solar eclipse brought out yet more predictions of the end of the world – never mind that only a tiny fraction of the earth’s surface was in a position to notice it.  (Would the world only end across a thin strip of North America?)

We can laugh at Christopher Columbus, but what of the fact that the list of doomsday prophecies continues to grow, despite how often the doomsayers have been wrong?  Measured by the enormity of the subject matter and the apparent widespread lack of concern about being “twice bitten,” man’s fondness for predicting when the world will end as a result of some mystical interpretation of ancient texts strikes me as a bigger blunder than Y2K – and unlike Y2K, it shows no sign of going away.

A third nomination: “The earth is flat.”  The archetypal human blunder.  Months ago, while struggling to think of other blunders as egregious, I was led by Google to a Wikipedia article on “the flat earth myth,” which I assumed was exactly what I was looking for.  But to my dismay, I read that the “flat earth myth” is not the old belief that the world was flat; rather, it is the current, widely-held belief that people in the Middle Ages believed the earth to be flat!  I’d spent a lifetime feeling proudly superior to the ignorant medieval masses.  Was it me, after all, who was wrong?

My discovery reminded me of the difficulty of ranking human error.  The article asserted that throughout the Middle Ages, the “best minds of the day” knew the earth was not flat.  The “myth” was created in the 17th Century, as part of a Protestant campaign against Catholic Church teachings, accelerated by the fictional assertion in Washington Irving’s popular biography of Christopher Columbus that members of the Spanish court questioned Columbus’s belief that the earth was round.  Gershwin’s unforgettable, “They all laughed at Christopher Columbus…” etched the myth forever in our minds.  The article quotes Stephen Jay Gould: “[A]ll major medieval scholars accepted the Earth’s roundness as an established fact of cosmology.”  The blunder wasn’t a relic of the Middle Ages, but an error of current understanding based on a post-enlightenment piece of popular fiction!

Meanwhile, the Flat Earth Society lives on to this day.  Its website “offers a home to those wayward thinkers that march bravely on with REASON and TRUTH in recognizing the TRUE shape of the Earth – Flat.”  Most of them, I think, are dead serious.  But wait.  Which is the greater blunder: that of the medieval masses who saw their world as a patchwork of coastlines, rolling hillsides, mountains, valleys, and flat, grassy plains?  Or that of the experts, the major scholars who “knew” in the Middle Ages that the earth was a sphere?  The earth is not a sphere at all, we now know, but a squat, oblong shape that bulges around the equator because of the force of its spin.  Or is that an error, too?  Need we mention that spheres don’t have mountains and valleys?  Need we mention that the surface of the earth, at a sub-atomic level, is anything but curved?  Aren’t all descriptions of the earth’s shape simply approximations?  And if we can accept approximations on the basis that they serve a practical purpose, then is the observable general flatness of the earth today any more “wrong” than a medieval scholar’s belief in sphericity?  Who really needs to know that the atoms forming the surface of the earth are mostly empty space?  The “wrongness” in our concepts of earth’s shape isn’t static, but evolving.

The oldest of the historical blunders nominated for WMBW’s top ten list have an ancient, scriptural flavor.

The first: “The number one thing that went wrong with humanity [was] when the first man said to another, ‘I think I heard god last night!’ and the other believed him.”**

The second comes from a different perspective: “The greatest blunder had to be Eve eating of the fruit of the tree of knowledge, having been tempted to be like God, deciding for herself what is good and what is evil.  Every person [becomes] his own god. The hell of it is, everyone decides differently, and we’re left to fight it out amongst ourselves.”**

The other three nominators thought that Y2K, belief in a flat earth, and failure to impose term limits should be considered for a place somewhere on the top ten list.  (Actually, Y2K’s sponsor only suggested it belonged somewhere in the top 20.)  But the two “religious” nominations were each called the biggest blunder of all.  (One was “the number one thing,” while the other “had to be” the greatest blunder.)   What is it about belief in God that prompts proponents and opponents alike to consider the right belief so important, and holding the wrong one the single greatest blunder of all time?

If you believe in God, though He doesn’t exist, you’re in error, but I don’t see why that error qualifies as the greatest blunder of all time, even when millions suffer from the same delusion.  I remember seeing an article in Science Magazine a few years ago, surveying the research that has attempted to determine whether believers tend to act more morally than non-believers.  Most of the studies showed little or no difference in conduct between the two groups.  For those who don’t believe in God, isn’t it one’s conduct, not one’s belief-system, that is the best measure of error?  For them, why does belief even matter?

If you don’t believe in God, though He does exist, you face a different problem.  If you believe as my mother did – that believing in God (not just any God, but the right God, in the right way) means you’ll spend eternity in Heaven rather than Hell – it’s easy to see why being wrong would matter to you. If roughly half the people in the world are headed to eternal damnation, that’s at least a problem bigger than term limits.

But there is a third alternative on the religious question.  If you’ve looked at the WMBW Home Page or my Facebook profile, you may have noticed my description of my own religious views – “Other, really.”  One of the main reasons for that description is pertinent to this question about the greatest blunders, so I will identify its relevant aspect here: “If God exists, He may care about what I do, but He’s not so vain as to care whether I believe in Him.”  My point is not to advance my reasons for that belief here, simply to point out that it may shed light on why many rank error on the religious question so high on the list of all-time blunders, while I do not.  Many believers, I think, believe it’s critically important to believe, so they try hard to get others to do so.  Non-believers react, first by pointing to examples of believers who’ve been guilty of wrongdoing, and eventually by characterizing the beliefs themselves as the reason for the wrongdoing.  In any case, the nominations concerned with religious beliefs were offered as clearly deserving the number one spot, while our “secular” nominations were put forward with less conviction about where on the list they belong — and that difference may have meaning, or not.

In my solicitation, I acknowledged the gray line between opinion and fact.  To some believers, the terrible consequences of not heeding the Word of God have been proven again and again, as chronicled throughout the Hebrew Bible.  To some non-believers, the terrible consequences of belief have been proven by the long list of wars and atrocities carried out in the name of Gods.  Whichever side you take, the clash of opinions remains as strong as ever.

So, what do I make of it all?  Only that I’d hoped for past, proven blunders which might remind us of our great capacity for error.  Instead, I discover evidence of massive self-contradiction on the part of the current American electorate; a growing list of contemporaries who, as recently as last month, are willing to predict the imminent end of the world; my own blunder, unfairly dismissive of the past, fooled by a piece of Washington Irving fiction; and a world as divided as ever regarding the existence of God.

To this observer, what it all suggests is that there’s nothing unique about the ignorance of past ages; and that an awfully large chunk of modern humanity is not only wrong, but likely making what some future generation will decide are among the greatest blunders of all time.

Sic transit gloria mundi.



** I’ve done some editing of punctuation in both of these nominations: I apologize if I’ve thereby changed the submitter’s meaning.



Asking the Ad Hominem Question

I generally wince when someone debating Topic X starts talking about his opponents, giving reasons he thinks his opponents believe as they do, trying to discredit their position by psycho-analyzing the reasons they hold it or expressing his disapproval of “the sort of people” who hold such positions.  It’s typically a variant of a thought analyzed well by Kathryn Schulz in Being Wrong: “I think the way I do because all evidence and logic support me; the only reason you think the way you do is because you suffer from… [here, fill in the blank.]”  As I see it, such ad hominem arguments are often resorted to by those unable to make a good argument on Topic X itself.  Moreover, by making the debate personal, the ad hominem debater usually comes across as insulting, and that’s a sure-fire recipe for things to get ugly quickly.

I think it’s quite different to pose an ad hominem question to oneself.  Asking ourselves why we believe what we do, when others don’t agree with us, can be a mind-opening exercise.  (In case it’s not clear, “I believe what I believe because it’s true, and others disagree because they’re stupid” doesn’t count.)

Allow me to offer an example.  Having gotten some flak from readers for my thoughts about Charlottesville, I decided to ask myself the ad hominem question in an effort to understand why I favor removing statues of Confederate generals from public squares, when others don’t.  What is it about my background that causes me to favor such removal?

I’m pretty sure it was my career as an employment lawyer, a capacity in which I was often asked to advise employers on diversity issues and strategies for legally maintaining a dedicated, harmonious, loyal (and therefore productive) workforce.    Many of my clients experienced  variations on a problem I’ll call cultural conflict in the workplace, by which I don’t mean conflict between employer and employees, but among employees themselves.

The conflict involved was often racial, religious or gender-based.  For example, one company piped music from a radio station into its warehouses, only to discover that one group wanted to listen to a country music station, another a Latin station, and a third an R&B, Hip-Hop or Motown station.  Each group claimed it was being discriminated against if it didn’t get its way.  Another variant of the problem was when assembly line workers came to work wearing T-shirts that other employees found offensive – one T-shirt featured a burning cross; another the picture of a person wearing a white sheet and hood while aiming a gun at the viewer; another featured the “N” word; still others featured raised fists and the words “Black Power,” or implied a revolution against “white rule.”

A frequent variant on the “culture conflict” problem involved office environments where employees shared cubicles and wanted to decorate their cubicles with words or images that their neighbors or cubicle-mates found offensive.  In one case, a Christian employee began to hang skeletons, ghouls, devils and demons all over a shared cubicle, beginning in August, in preparation for Halloween.  Her Christian cubicle-mate believed that celebrating Halloween at all was the work of the devil; the cubicle-mate countered by hanging crucifixes, pictures of Jesus, manger scenes and Bible quotations on the shared cubicle wall, saying that devil worshipers would go to Hell.  A third resident of that same cubicle corner – the one who actually complained to management – had religious convictions that prohibited the celebration of any holidays or the use of any religious imagery at all, on the grounds that all of it was idol-worship; she wanted it all removed.

Perhaps the most common variant of the culture conflict was in workplaces where male employees wanted to hang calendars or other pictures of naked (or scantily clad) women, while  women (and some men) objected on the grounds the working environment was made illegally offensive as a result.

In one case, there was already a racially charged atmosphere: a group of white ‘good ole boys’ always ate at one lunch table while blacks ate at another.  There’d been some mild taunting back and forth, but nothing too serious, when one day, several of the white employees started “practicing their knot tying skills” by making rope nooses in plain view of the blacks at the other table.  The blacks saw an obvious message which the whites of course denied.

In all such cases, the employer was left to decide what to do.  There were difficult legalities to deal with.  Some employers tried to address the problem by declaring that employees could post no personal messages on company property (like cubicle walls), but could post what they wanted on their own property (their lunch boxes, tool boxes, T-shirts, etc.).  But the public/private property distinction didn’t end the problem.  Someone who brings a swastika and a “Death to All Jews” decal to work on his lunchbox is an obvious problem for workplace harmony, regardless of what the law says about it.

Surely, my background in this area shapes my views about cultural conflict regarding statues in public squares.  And I think what decided my position on statues was that such problems arose among my clients scores of times, yet never once was it the employer itself that wanted to post the material, wear the T-shirt, celebrate the holiday, practice tying knots, or whatever.  It was always a question of playing referee in the conflict between opposing groups of employees.

I believe it’s by analogy to that situation that I instinctively consider the problem faced by a government body deciding what or whom to memorialize in the public square.  I don’t claim it’s an easy task.  But if a company or city had ever asked me if I thought it ought to hang crucifixes in its cubicles, display a picture of the devil in its lunchroom, hang a Confederate flag or a Playboy centerfold in the conference room, or have its managers fashion nooses during an employee meeting, I’d have been flabbergasted.  It’s not that Robert E. Lee is like Satan, or Jesus, or a Playboy centerfold, if we’re talking moral qualities, or what OUGHT to be offensive.  Rather, it’s the fact that, in my experience, all that mattered to the employers was that some of their employees considered the displays offensive.  When the display was controversial, it was viewed as a problem.  And without exception, my clients took pains not to introduce controversial images themselves.

In abstract theory, I can imagine that some symbols or ideas might be so important to the common good that an employer (or city council) should celebrate them, despite their being divisive.  (A statue of the sitting President?) But in the case of Confederate generals who fought to preserve an institution that has been illegal for 150 years now, my own cultural background — including my work experience —gives me no clue as to what their countervailing importance might be.

Anyway, I really do wince when people make ad hominem arguments against their opponents, but I like asking the ad hominem question of myself.  Whatever you think about Confederate generals, I’d love to hear from you if you’ve given the thought-experiment a try, especially if it has helped you understand differences in points of view between yourself and others.

— Joe


Thoughts About Charlottesville

Last week’s tragedy in Charlottesville  has touched close to home here in Richmond, the capital of the old Confederacy.  Lt. H. Jay Cullen was one of two police officers killed in the effort to restore peace.  His viewing is tonight; his funeral is tomorrow.    My optometrist is attending because she serves as a delegate to the state house.  My daughter is attending because she’s a former co-worker and friend of Lt. Cullen’s wife.  Amidst the grief and mourning, the firestorm of what passes for debate regarding the whole affair cries out for a WMBW perspective.

A few months back, when the removal of four Confederate statues in New Orleans was in the news, my own thinking distinguished among the statues.  I thought the removal of some made sense, but not others.  I was struck by the fact that no one else seemed to consider them as separate cases.  Everyone seemed to have adopted an all-or-nothing posture: either you were for, or against, the “removal of the statues,” as if alignment with one side or the other mattered more than considering the merits of each statue on its own.  Was I the only one in my circle who saw a middle ground?  I still worry about a group-think tendency to align entirely with one side or the other.  Such polarizing alignment seems to me precisely what led to the Civil War in the first place.  But in the meantime, Charlottesville has caused me to consider the matter anew – and I’ve decided I was wrong about the statues in New Orleans.

I approved of the removal of most of the New Orleans statues, but felt otherwise about the statue of Robert E. Lee.  My opposition was on the ground that Lee was a good (if imperfect) man and that to remove his statue did an injustice, both to history and to him personally.  Now, I believe that I was wrong about the Lee statue, and I’m moved to explain why.

First, let’s consider what history tells us about the man in relation to slavery.  While historians disagree on certain details, it seems clear that Lee personally ordered the corporal punishment of slaves who resisted his authority.  Today, all but the most extreme white supremacists can agree that this was wrong.  Of course, Lee’s treatment of his slaves was not remarkably better or worse than the racism of thousands of other white men who owned slaves in those days; he apparently believed what most white Southerners (as well as many in the North) believed: that the Bible made it their Christian duty to “look after” African Americans.  And for Lee, as for most slave-owners, this paternalistic attitude included both kindness (especially as a reward for loyalty and good work) and infliction of severe corporal punishment (as a deterrent to disobedience).  These days, it’s extremely hard to understand how so many people could have been so wrong, but hundreds of thousands of Lee’s white peers thought and acted as he did in their relation to black Americans.  This conduct was widespread; it shames us all.  While being widespread doesn’t justify what Lee did, it makes it a lot easier for me to recognize that Lee was much like the rest of us: i.e., capable of well-intended conduct that future generations may condemn as fundamentally, grievously wrong.

My admiration for Lee – which continues – comes despite his participation in the injustice of slavery.  I also admire slave-owners like George Washington and Thomas Jefferson (despite the fact that Jefferson described African Americans as having “a very strong and disagreeable odor,” a capacity for love limited to “eager desire” more than sentiment, and a capacity for reason he insisted was far inferior to that of the white man).  I admire these white men for the good that they did, despite my recognition that they were so grievously wrong about African Americans and slavery.

Lee, the man, was more than a participant in the repugnant institution of slavery.  He was a great military strategist.  He was a man who sacrificed his personal welfare for what he saw as his duty to his country.  And perhaps most importantly for me, he became a significant force for reconciliation after the war.  When the government of the Confederacy collapsed and its armies surrendered, many Southerners wanted to continue the fight for slavery, on an underground, guerrilla-warfare basis.  This stubborn, “never-say-die” sentiment led to formation of the K.K.K., and to the worst atrocities of the Reconstruction era.  Indirectly, it led to the current existence of hate groups like the Nazi group that marched in Charlottesville.   In the face of such atrocities, Robert E. Lee advocated against continued resistance, calling repeatedly for reconciliation with the north, for fair and decent treatment of the freedman, respect for the law, and the putting aside of past hatreds in order to restore unity, harmony, and civility.  True, he didn’t support giving blacks the right to vote, and I fault him for that, but he lived in an era when that was viewed as an extremist position.  And if one looks at his postwar record as a whole, he was primarily a peacemaker.   A man who sought a better life for blacks and whites alike, and had to swallow a great deal of personal pride to do so.  Indeed, I think he might have been an early supporter of WMBW, had it been around in his day!  Chiefly for that reason, I count myself as a fan and supporter of the man.

And because I admire the man, I was, until recently, opposed to the removal of statues honoring him.

But what now?  Sadly, it sometimes takes jarring events, close to home, to get us to change our minds.  And in this case, I have changed mine.  Many of those opposed to removal of Lee’s statues say that removal is an affront to history.  That was my own thinking just a couple of months ago.  But now I think I was wrong.  Whatever we may think of a particular person, good or bad, is there not room to distinguish between our sentiments about the person and the reasons to erect (or take down) a statue?  And aren’t the reasons for erection or removal of a statue important?  Let’s consider, carefully and dispassionately, the possible reasons for removing Lee’s statue.

First, if Lee’s participation in the atrocities of slavery obliges us to take down his statue, it seems to me consistency would require us to demolish the Washington Monument and the Jefferson Memorial.  Can a consistent standard for statues limit us to memorializing only those leaders who were entirely free from wrong?  Limiting statues to perfect people may put a lot of sculptors out of work…

What about the assertions of Anderson Cooper, the Los Angeles Times, and others, who claim that Lee’s statue should be taken down because Lee (unlike Washington and Jefferson) was a traitor to his country?  Anderson Cooper asserted that Washington fought for his country, and Lee against his.  The Los Angeles Times’ headline said that Washington’s ownership of slaves was not the equivalent of Lee’s “treason.”  But didn’t George Washington lead an army against his mother-country (England)?  And didn’t Lee lead an army that defended his homeland (the Commonwealth of Virginia) against an army that had invaded it and was bearing down on its capital?  In labeling Lee a traitor, or Washington a patriot, I believe it important to distinguish between today’s perspectives and those at the time these men lived.  Washington and Jefferson were subjects of the British Crown, and readily admitted that they were engaged in rebellion against their government, for which they’d be found guilty of treason if they lost.  Washington and Jefferson were among the rebels who embedded slavery in the Constitution in the first place.  And by the time Lee threw in his lot with Virginia, the Supreme Court of the United States had upheld slavery as the law of the land.  Today, we think of our primary patriotic allegiance as belonging to the United States, which we regard as a single, unitary country.  But our pledge of allegiance to such a unified country resulted from the Federal victory in the Civil War (which is to say, in large part, from Lee’s willingness to give up the fight to preserve slavery, and to accede to the prevailing egalitarian view).  Prior to that war, we had been a confederated union of sovereign states.  The very words “commonwealth” and “state” reflect the idea of nationhood to which patriotism was generally believed to adhere.
As but one example of the perceived sovereignty of the individual states: when Meriwether Lewis went west in the early 1800s, meeting with Native American tribes who knew nothing of the new government in Washington, he gave the same prepared speech to all of them, a speech that referred to President Thomas Jefferson as the great chief of “seventeen nations” (the number of sovereign states then comprising the U.S.).  I believe strongly that Lee’s sense of allegiance to his homeland, the Commonwealth of Virginia, was an honorable, patriotic position – more so, by far, than Washington’s taking up of arms against England.  As I understand history, it was Washington who was the traitor to his country; Lee was the dutiful servant of his.

Is the difference, then, that the 1776 war for American independence was a morally “just war,” and the war to preserve the southern Confederacy and slavery an unjust one?   A lot of historians question whether the atrocities allegedly committed by England really were sufficient to warrant rebellion against them, but assuming you think Lee’s war relatively wrong, compared to Washington’s, I see Lee’s status, in this regard, as similar to the status of thousands of soldiers whose names appear on the Vietnam Memorial.  As a people, we have adopted the principle of honoring servicemen who have fought for their sovereign government, even when the war in which they served is judged by history to have been wrong.  If we remove the statue of Lee because he served his homeland in an unjust war, what are we to do with the Vietnam War Memorial?

What about the argument that most attracted me, that to remove the statue of Lee is to rewrite history?  It’s undeniable that Lee played an important part in history, but so did Lord Cornwallis, John Wilkes Booth, and Lee Harvey Oswald.  To have no statues honoring them is not to rewrite history, nor to deny the place of these individuals in it.  It is simply to recognize the difference between preserving history and the reasons to honor praiseworthy individuals by erecting public memorials to them.  A public statue is a symbol, intended to celebrate an idea.    If we ignore what the subjects of our statues symbolize, we risk celebrating the wrong things.  So, the right question, I think, is not whether Lee was a great general, or played an important role in history, or owned or punished his slaves, or was a traitor or a dutiful servant.  The right question to ask, I would suggest, is what does his memorial symbolize?

In considering that question, I think the key consideration is that while Washington was a traitor to his country, he did fight for ours.  And while Lee was a loyal, dutiful patriot who fought for his homeland, he did fight against the unified country that arose from that war.  It is not to demean the man’s character, or his service, or history itself, to recognize that he fought to divide what has (since) become the nation to which we now pledge allegiance.  If our public memorials are intended to remind us of our public principles, then it is the principle of unity, as a nation, that seems especially in need of attention these days – not the division for which Lee fought.  I have no idea whether Lord Cornwallis owned any slaves.  And he may have been a fine and honorable man, even a role model.  But we Americans don’t erect public statues to honor him.  In one sense, Lee symbolizes the opposition to the current American government every bit as much as Cornwallis does.  I see no loss in failing to memorialize either man.

As for the many arguments in the nature of “If we remove this statue, what next?”, I believe there are matters of institutional purpose to consider.  I doubt the NAACP will ever erect a statue to George Washington, nor should it: as a white slave owner, he is inimical to the interests of that organization.  The racist Louis Agassiz’s name has, thankfully, been removed from schools named to honor him, but I believe his name properly remains on Agassiz Glacier, in Glacier National Park, because Agassiz remains respected for his pioneering scientific work on glaciers.  As abhorrent as I consider the Nazis to be, if they want to erect a statue of Adolf Hitler on their own property, they’d have the right to do so.  As for Washington and Lee, I do not believe that the college bearing their names should feel compelled to change its name or remove the statues I presume exist on its campus to remember them.  Washington saved the school with his financial support; as the college’s president, Lee greatly expanded it.  I believe the college should honor these men for that institution-specific history, and if that includes maintaining statues to both men, I support that.  In that context, Washington and Lee would symbolize, and be accorded honor for, their service to that institution.  In 1962, the United States Military Academy named one of its barracks after Lee.  I think that appropriate, because Lee was a brilliant military strategist and because he had served as that school’s superintendent.  And I think George Washington should (and will) properly remain on our dollar bills, and be honored in our national capital, because despite his racism and ownership of slaves, and despite his being a traitor to his sovereign country, he was still instrumental in the establishment of this country.  By this reasoning, even a statue of John Wilkes Booth might be appropriate at Ford’s Theater.
My point is that there’s a proper role for institutional purpose in the choice of who an institution recognizes through its memorials.  Even if we get to the point of tearing down his memorial in Washington, a statue of Thomas Jefferson will always be appropriate at Monticello, and a statue of Lee appropriate at Stratford Hall.  To remove some statues of Robert E. Lee does not require the removal of all of them, and certainly doesn’t mean to erase him from history; much depends, I think, on the institution and its purposes.

So where does that leave us?  The City Councils of New Orleans and Charlottesville are institutions, and institutions of a particular type: they have been elected to represent all their citizens.  They should be celebrating the current government (American, not British; the USA, not the CSA).  And they should be choosing memorials that symbolize the current ideals of the people they represent – the ideals of a diverse nation that has come together in peace. In these divisive times, it is as important as ever that they choose symbols of tolerance and inclusion.   By virtue of his position as opposition commander in an effort to divide the union, Robert E. Lee necessarily symbolizes opposition to the national government that won the war.  He symbolizes a divided country, one in which the north would have been free to abolish slavery as long as the south was free to continue it.  That’s not an ideal any government in the United States should want to memorialize.   It is past time to stop celebrating it, or anyone who represents ethnic, racial or ideological division.

Right or wrong, those are my views.  But this week, as I watched our president, and our news media, address the issue from opposite sides, I was struck again by the all-or-nothing positioning on both sides.  Trump and the media both talked about “the two sides” – those for, and against, removal – sometimes as if all those who opposed removal were white supremacists or Neo-Nazis.  Are we no longer capable of a more nuanced analysis?  Must every individual be vilified by association with the worst of the people on the other side?  Must people classify me as a Nazi, if I utter a single word of respect for a man like Robert E. Lee, or a liberal destroyer of history, if I support the removal of his statue?

Changing people’s minds will only happen when people start listening to each other.  These days, it seems, no one is listening to anybody; people seem interested in knowing whether you’re for them or against them, and that’s it – not your reasons, not the finer points of what you have to say, or the reasoning behind it.  The scary thing is, it’s remarkably like the situation in 1860, when the polar opposites took their corners and came out fighting, leaving hundreds of thousands of casualties in their wake.  In my view, the only way to avoid a repeat of such violence is to be alert to the possible faults in ourselves; and to be willing to continue looking for the good in people even after we see the bad in them.  We have to be willing to learn from those we think are wrong.  Otherwise, I believe, we will all share responsibility for the violence to come.

So though I join the call for removing his statues from public places, I still think we can learn from Robert E. Lee.  In an 1865 letter to a magazine editor, he wrote, at the end of the war, “It should be the object of all to avoid controversy, to allay passion, give full scope to reason and to every kindly feeling. By doing this and encouraging our citizens to engage in the duties of life with all their heart and mind, with a determination not to be turned aside by thoughts of the past and fears of the future, our country will not only be restored in material prosperity, but will be advanced in science, in virtue and in religion.”

How I wish that Lee himself had been in Charlottesville last week, to make that point to all those present.   I wonder if any of those whose acts led to violence had any idea of that side of Robert E. Lee.  Or did both “sides” simply think of him as a symbol of an era in which white supremacy was the law of the land, and align themselves accordingly?

The next fight close to home will no doubt involve the statues of all the confederate generals lining Monument Avenue here in Richmond.  The very short video attached, courtesy of our local TV station, offers a message I think Robert E. Lee would have approved of.



The Top Ten Blunders of All Time

For several months now, I’ve been plagued by the thought that certain ways of “being wrong” are different from others.  I’ve wondered whether I’ve confused anything by not mentioning types of error and distinguishing between them.  For example, there are errors of fact and errors of opinion.  (It’s one thing to be wrong in thinking that 2 + 2 = 5, or that Idaho is further south than Texas, while it’s quite another to be “wrong” about whether Gwyneth Paltrow or Meryl Streep is a better actor.)  Meanwhile, different as statements of fact may be from statements of opinion, all such propositions have in common that they are declarations about present reality.  Not so a third type of error – judgmental errors about what “ought” to be done.  Should I accept my friend’s wedding invitation?  Should I apologize to my brother?  Should we build a wall on the Mexican border?  I might be wrong in my answer to all such questions, but how is it possible to know?

Is there a difference between matters of opinion (Paltrow is better than Streep) and ethics (it’s wrong to kill)?  Many would say there’s an absolute moral standard against which ethics ought to be judged, quite apart from questions of personal preference; others would argue that such standards are themselves a matter of personal preference.  I’ve thought a lot about how different types of error might be distinguished.  But every time I think I’m getting somewhere, I wind up deciding I was wrong.

One of the ways I’ve come full circle relates to the distinction between past and future.  It’s one thing to be wrong about something that has, in fact, happened, and another to be wrong about a prediction of things to come.  Right?  Isn’t one a matter of fact, and the other a matter of opinion?  In doing the research for my recent novel, Alemeth, I came across the following tidbit from the Panola Star of December 24, 1856:

The past is disclosed; the future concealed in doubt.  And yet human nature is heedless of the past and fearful of the future, regarding not science and experience that past ages have revealed.

Here I was, writing a historical novel about the divisiveness that led to civil war. I was driven to spend seven years on the project because of the sentiment expressed in that passage: that we can, and ought to, learn from the past.  (Specifically, we should learn that when half the people in the country feel one way, and half the other, both sides labeling the other stupid, or intentionally malicious, an awful lot of people are likely wrong about the matter in question, and the odds seem pretty close to even that any given individual (that includes each of us) is one of the many in the wrong.  And importantly, the great divide wasn’t because all the smart people lived in some states, or that all the bad people lived in others: rather, people tended to think as they did because of the prevailing sentiments of the people around them. Hmnnn…)

Then, an interesting thing happened in the course of writing the book.  Research began teaching me how many pitfalls there are in thinking we can really know the past.  We have newspapers, and old letters, and other records, but how much is left out of such things?  How many mistakes might they contain?  Indeed, how many were distorted, intentionally, by partisan agendas at the time?  The more I came across examples of each of those things, the less sure I became that we can ever really know the past.  I often can’t remember what I myself was doing ten minutes ago; how will I ever be able to reconstruct how things were for tens of thousands of people a hundred years ago?  Indeed, the more I thought about it, the more I circled back on myself, wondering whether the opposite of where I’d started was true:  Because the past has, forever, already passed, we’ll never be able to return to it, to touch it, to look it directly in the eye, right?  Whereas we will have that ability with respect to things yet to come.  If that’s true, the future just might be more “verifiable” than the past.  I get dizzy just thinking about it.

Anyway, an idea I’ve been kicking around is to ask you, WMBW’s readers, to submit nominations for the ten greatest (human) blunders of all time.  I remain extremely interested in the idea, so if any of you are inclined to submit nominations, I’d be delighted.  But the reason I haven’t actually made the request before now stems from my confusion about categories of wrong.  Any list of “the ten greatest blunders of all time” would be focused on the past and perhaps the present, while presumably excluding the future.  But I’m tempted to exclude the present as well.  I mean, I feel confident there are plenty of strong opinions about, say, global warming – and since the destruction of our species, if not all life on earth, may be at stake, sending carbon into the air might well deserve a place on such a list.  Your own top ten blunders of all time list might include abortion, capitalism, Obamacare, the Wall, our presence in Afghanistan, our failure to address world hunger, etc., depending on your politics.  But a top ten list of blunders based on current issues (that is, based on the conviction that “the other side” is currently making a world class blunder) would surprise few of us.  It seems to me the internet and daily news already make the nominees for such a list clear.  What would be served by our adding to it here?

My interest, rather, has been in a list that considers only past human blunders, removed from the passions of the present day.  I believe such a list might help remind us of our own fallibility, as a species.  I for one am constantly amazed, when I research the past, at our human capacity for error.  Not just individual error, but widespread cultural error, or fundamental mistakes in accepted scientific thinking.  My bookshelves are full of celebrations of the great achievements of mankind, books that fill us with pride in our own wisdom, but where are the books which chronicle our stupendous errors, and so remind us of our fallibility? How could nearly all of Germany have got behind Hitler?  How could the South have gone to war to preserve slavery?  How could so many people have believed that disease was caused by miasma, or that applying leeches to drain blood would cure sickness, or that the earth was flat, or that the sun revolves around the earth?

What really interests me is not just how often we’ve been wrong, but how ready we’ve been to assert, at the time, that we knew we were right.  The English explorer John Davys shared the attitude of many who brought European culture to the New World, before Native Americans were sent off to reservations:

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

History is full of such declarations.  In researching the American ante-bellum South, not once did I come across anyone saying, “Now, this slavery thing is a very close question, and we may well be wrong, but we think, on balance, that…”  In the days before we knew that mosquitos carried Yellow Fever, scientific pronouncements asserted as fact that the disease was carried by wingless, crawling animalcula that crept along the ground.  This penchant for treating our beliefs as knowledge is why I so love the quote (attributed to various people) that runs, “It ain’t what people don’t know that’s the problem; it’s what they do know, that ain’t so.”

My special interest lies in blunders where large numbers of people – especially educated people, or those in authority – have believed that things are one way, where the passage of time has proven otherwise.  My interest is especially strong if the people were adamant, or arrogant, about what they believed.  Consider this, then, a request for nominations, if you will, especially of blunders with that sort of feel.

Yet be forewarned.  There’s a reason I haven’t been able to come up with a list of my own.  One is that, while I’m not particularly interested in errors of judgment or opinion, I’m not sure where the dividing line falls between fact and opinion.  Often, as in the debate over global warming, the very passions aroused are over whether the question is a matter of fact or opinion.  Quite likely, what we believe is fact; what our opponents believe is opinion.

The other is the shaky ground I feel beneath my feet when I try to judge historical error as if today’s understanding of truth will be the final word.   Remember when we “learned” that thalidomide would help pregnant women deal with morning sickness?  Or when we “learned” that saccharin causes cancer?  That red wine was good for the heart (or bad?  What are they saying on that subject these days?)  What about when Einstein stood Newton on his head, or the indications, now, that Einstein might not have got it all right?  If our history is replete with examples of wrongness, what reason is there to think that we’ve gotten past such blunders, that today’s understanding of truth is the standard by which we might identify the ten greatest blunders of all time?  Perhaps the greater blunder may be when we confidently identify, as a top ten false belief of the past, something which our grandchildren will discover has been true all along.…

If this makes you as dizzy as it does me, then consider this: The word “wrong” comes from Old English wrenc, a twisting; it’s related to Old High German renken, to wrench, which is why the tool we call a wrench is used to twist things.  This is all akin to the twisting we produce when we wring out a wet cloth, for when such cloth has been thoroughly twisted, wrinkled, or wrung out, we call it “wrong.” Something is wrong, in other words, when it’s gotten so twisted as to be other than straight.

But in an Einsteinian world, what is it to be straight?  The word “correct,” like the word “right” itself, comes from Latin rectus, meaning straight.  The Latin comes, in turn, from the Indo-European root reg-, the same root that gave us the Latin word rex, meaning the king.  Eric Partridge tells us that the king was so called because he was the one who kept us straight, which is to say, in line with his command.  The list of related words, not surprisingly, includes not only regular and regimen, not only reign, realm and region, but the German word Reich.  If the history of language tells us much about ourselves and how we think, then consider the regional differences in civil war America as an instance of rightness.  Consider the history of Germany’s Third Reich as an instance of rightness.  It seems we’ve always defined being “right” as a matter of conformity, in alignment with whatever authority controls our and our neighbors’ ideas.

Being wrong, on the other hand?   Is it destined to be defined only as the belief not in conformity to the view accepted by those in charge?  Sometimes I think I’ve got wrongness understood, thinking I know what it is, thinking I’m able to recognize it when I see it.  But I always seem to end up where I began, going around in circles, as if space itself is twisted, curved, or consisting of thirteen dimensions.   I therefore think my own nomination for the Ten Greatest Blunders of All Time has to go to Carl Linnaeus, for calling us Homo Sapiens. 

If you have a nomination of your own, please leave it as a comment on this thread, with any explanation, qualification, or other twist you might want to leave with it.

I’m looking forward to your thoughts.




The Tag Line

WMBW’s tagline is “Fallibility>Humility>Civility.”  It’s punctuated to suggest that one state of being should lead naturally to the next.  Because the relationship between these three concepts is central to the idea, today I’ve accepted my brother’s suggestion to comment on the meaning of the words.

Etymology books tell us that “fallibility” comes from the Latin fallere, a transitive verb that meant to cause something to stumble.  In the reflexive form, Cicero’s me fallit (“something caused me to stumble”) bestowed upon our concept of fallibility the useful idea that when one makes a mistake, it isn’t one’s own fault.  As Flip Wilson used to say, “the devil made me do it.”

This is something I adore about language – the way we speak is instructive because it mirrors the way we think.   Therefore, tracing the way language evolves, we can trace the logic (or illogic) of the way we have historically tended to think, and so we can learn something about ourselves.  Applying that concept here leads me to conclude that denying personal responsibility for our mistakes goes back at least as far as Cicero, probably as far as the origins of language itself, and perhaps even farther.  “I did not err,” our ancient ancestors taught their children to say; “something caused me to stumble.”

I also think it’s fun to examine the development of language to see how basic ideas multiply into related concepts, the way parents give rise to multiple siblings.  And so, from the Latin fallere come the French faux pas and the English words false, fallacy, fault, and ultimately, failure and fail.  While I’ve heard people admit that they were at fault when they stumbled, it’s far less common to hear anyone admit responsibility for complete failure.  If someone does, her friends tell her not to be so hard on herself.  Her psychiatrist is liable to label her abnormal, perhaps pathologically so: depressed, perhaps, or at least lacking in healthy self-esteem.  The accepted wisdom tells us that a healthier state of mind comes from placing blame elsewhere, rather than on oneself.  Most interesting.

Humility, meanwhile, apparently began life in the Indo-European root khem, which spawned similar-sounding words in Hittite, Tokharian, and various other ancient languages.  All such words meant the earth, the land, the soil, the ground – that which is lowly, one might say; the thing upon which all of us have been raised to tread.  In Latin the Indo-European root meaning the ground underfoot became humus, and led to English words like exhume, meaning to remove from the ground.  Not long thereafter, one imagines, the very ancient idea that human beings came from the ground (dust, clay, or whatever) or at least lived on it led to the Latin word homo, a derivative of humus, which essentially meant a creature of the ground (as opposed to those of the air or the sea).  From there came the English words human and humanity.  Our humanity, then, might be said to mean, ultimately, our very lowliness.

From the Latin, homo and humus give us two rather contrary sibling words.  These siblings remain in a classic rivalry played out to this day in all manner of ways.  On the one hand, homo and humus give us our word “humility,” the quality of being low to the ground.  We express humility when we kneel before a lord or bow low to indicate subservience.  In this light, humility might be said to be the very essence of humanity, since both embody our lowly, soiled, earth-bound natures.  But our human nature tempts us with the idea that it isn’t good to be so low to the ground.  To humiliate someone else is to put them in their place (to wit, low to the ground, or at least, low compared to us).  And while we share with dogs and many other creatures of the land the habit of getting low to express submissiveness, some of our fellow creatures of the land go so far as to lie down and bare the undersides of their necks to show submission.  Few of us are willing to demonstrate that degree of humility.

And so the concept of being a creature of the ground underfoot gives rise to a sibling rivalry — there arises what might be called the “evil twin” of humility, and it is the scientific name by which we distinguish ourselves from other land-based creatures: the perception that we are the best and wisest of them gives rise to homo sapiens, the wise land-creature.  As I’ve pointed out in an earlier blog, even that accolade wasn’t enough to satisfy us for long: now our scientists have bestowed upon us the name homo sapiens sapiens, or the doubly wise creatures of the earth.  I find much that seems telling in the tension between our humble origins and our self-congratulatory honorific.  As for the current state of the rivalry, I would merely point out that not one of our fellow creatures of the land, as far as I know, has ever called us wise.  It may be only us who think us so.

And now, I turn to “civility.”  Eric Partridge, my favorite etymologist, traces the word back to an Indo-European root kei, meaning to lie down.  In various early languages, that common root came to mean the place where one lies down, or one’s home.  (Partridge asserts that the English word “home” itself ultimately comes from the same root.)  Meanwhile, Partridge tells us, the Indo-European kei morphed into the Sanskrit word siva, meaning friendly.  (It shouldn’t be hard to imagine how the concepts of home and friendliness were early associated, especially given the relationship between friendliness and propagation.)  In Latin, a language which evolved in one of the ancient world’s most concentrated population centers, the root kei became the root ciu- seen in such words as ciuis (a citizen, or person in relation to his neighbors) and ciuitas (a city-state, an aggregation of citizens, the quality of being in such an inherently friendly relationship to others).  By the time we get to English, such words as citizen, citadel, city, civics and civilization, and of course civility itself, all owe their basic meaning to the idea of getting along well with those with whom we share a home.

In the olden days, when one’s home might have been a tent on the Savannah, or a group of villagers occupying one bank of the river, civility was important to producing harmony and cooperation among those who lay down to sleep together.  Such cooperation was important for families to work together and survive.  But as families became villages, villages became cities, and city-states became larger civilizations, we have been expanding the reach of people who sleep together.  (And I mean literally – my Florida-born son, my Japanese-born daughter-in-law, and my grandson, Ryu, who even as I write is flying back from Japan to Florida, remind me of that fact daily.)  Our family has spread beyond the riverbank to the globe.

Given the meanings of all these words, I would ask: how far do our modern senses of “home” and “family” extend?  What does it mean, these days, to be “civilized”?  What does it mean, oh doubly-wise creatures of the earth, to be “humane”?  And in the final analysis, what will it take to “fail”?

— Joe


My Favorite African Photo

I got back from an African safari vacation last night, very jet-lagged, having not slept for about 43 hours.  When I woke up this morning, I was anxious to start organizing the photographs from my trip.  Sitting down at the PC to do so, I found an e-mail from my erstwhile roommate, John, reminding me to send him photos of the wildlife I’d seen.  (John is an avid outdoorsman who once tried to make a living as a wildlife photographer.)  Having not yet gone through the photos myself, having not yet cropped, nor cut, nor selected any of them, I wasn’t ready to give John the full-blown “Here Are the Pics of My African Vacation” slideshow – but I decided I’d send him just one of them – both because it was my sentimental favorite, of all those I’d taken, and because I knew that John, of all people, would appreciate it.

Now, the reason John would appreciate this particular photo was not just that he’s an erstwhile wildlife photographer; almost all the photos I’d taken were of African wildlife.  But the year that John and I spent as college roommates, many decades ago, was marked by regular discussions of deep philosophical issues; and this photograph had become my favorite due to its philosophical implications, implications I felt sure John would appreciate.

As I learned on the Shamwari game preserve, most wild baboon troops of South Africa run quickly away at the approach of human beings.  But on the day this photograph was taken, I had come to the extreme southwestern tip of the African continent, a rocky, mountainous formation that rises high above sea level like the prow of a sailing ship that projects above the ocean waves.  In fact, here is a photograph – one taken from the Wikipedia article regarding the Cape of Good Hope – which shows the general topography of the place.

View at Cape Point

(Photo by Thomas Bjørkan – Own work, CC BY-SA 3.0)

Naturally enough, given the impressive topography, the Cape has become a tourist attraction.  The result of being a tourist attraction is that the native baboons of the Cape have lost their fear of human beings.  In the Cape Point parking lot, they were nearly as plentiful as the people, ready to pounce on anyone foolish enough to walk by with a sandwich in hand.  They were sitting on the roofs of cars.  They were scouting for half-open windows through which to steal picnic lunches.  They were on the rocks, in the bushes, outside the souvenir shop, intermingling fearlessly with us, their more advanced cousins.

I took the photograph in question – the photograph I wanted to send to John because it had become my favorite – while standing on the Cape, looking south like some fifteenth century Portuguese explorer from the bow of his ship, gazing across thousands of miles of ocean toward the south pole.  The Atlantic Ocean was to my right, the Indian not far to my left, and the Antarctic somewhere far in the distance in front of me. To my immediate left, on the summit of the mountain peak, a lighthouse had been built to guide ships rounding the Cape. Because of my fear of heights, I had not attempted the funicular or the steep climb from the funicular to the summit, but as I looked at the rocky cliff, with the triple-ocean breeze blowing into my face and the triple-ocean surf crashing into the old, unmoving rocks below, I noticed movement high up on the cliff’s stony face.  Tapping into the unconscious (but ingrained) ability of one primate to recognize the movements of another, I was drawn to it, a twitch on the horizon, a dark profile silhouetted against a bright sky.  He was maybe fifteen hundred feet away from me and several hundred feet higher than me, but I could see him settle onto one of the highest, most southerly rocks of the cliff side, clearly fixing his gaze southward, looking out over the oceans just as I’d been doing — except, of course, that he was braver and more agile than me, having dared to climb out onto the virtual bowsprit of the continent, where I would not.  I wondered why he wasn’t in the parking lot, with the rest of his kind, ready to pounce on a tourist; wondered why he had gone off on his own, to gaze across the oceans toward the vast unknown.

Like all primates, baboons are an intelligent species.  Scientists have recently discovered that they can acquire orthographic processing skills which form part of the ability to read.*  I wondered if this solitary philosopher was more intelligent than his fellows in the parking lot.  I imagined the thoughts he was having about other lands, far away.  I imagined him capable of evolving into another Bartolomeu Dias someday.  Gazing across the ocean and into the unknown, I wondered: wasn’t it possible he had seen ships pass, and wondered how he might build a ship of his own, to go exploring, some day?  I maximized the camera’s zoom and got the best picture of the contemplative creature I could.

The sight so impressed me, in fact, that for the rest of my time in Africa, I told people about it.  Last night – my first night home – I told my wife, and my daughters, and my grandson Jacob, about it.  Jacob in particular was wide-eyed as I promised to show him the photograph when he comes over this afternoon.  The profiled creature has become my hero; the photograph of him looking out across the ocean has stuck with me, and I haven’t been able to get it out of my head – more than the photographs of lions, cheetahs, elephants, and giraffes I took – even more than the elegant springbok herd, the pod of dangerous hippopotami, or the solitary, rare and elusive black rhinoceros.  It is my favorite photograph, despite the fact that, shot at full zoom and without a tripod, it came out slightly blurry.  It is my favorite not for its technical quality, but because of its fascinating philosophical implications.  And as I composed my e-mail to John this morning, he seemed the perfect person to appreciate those implications.

Anyway, this morning, as I composed my e-mail to John, I described the photograph I was sending him, explaining why it was my favorite, much as I had to Jacob last night, much as I have to you here.  As I was finishing my written description to John, my grandson Evan walked into the room. I invited him to come take a look at the photograph of the contemplative baboon.  I fetched it from the digital camera’s SD card and displayed it on my monitor.  Evan and I shared still more deep, philosophical observations about our most intriguing subject.  Finally, after Evan departed, I embedded the photograph into my e-mail to John, as I now do here:

You can see the solitary baboon toward the top of the picture, squatting on all fours, his tail raised behind him, dreaming of building his own ship and exploring the oceans on three sides of him.

Alternatively, you can do as I did.  To wit: as I embedded the photograph into my e-mail to John, I realized that I could blow it up even larger, digitally, than I’d been able to do through the zoom setting on the camera.  With the wonders of modern technology – my virtual icon in the shape of a magnifying glass with a plus sign – I was able to enlarge the photo enough to see the image at a level of detail not revealed by the camera’s telephoto lens.  Glints of sunlight on the rock, the baboon’s tail, his haunches.  Magnifying the image even more, I thought I might even have captured the contemplative expression on the creature’s face.  But the more I enlarged it, the more the baboon’s haunches looked like a torso, his legs hidden behind the rock; and the more his tail looked like a backpack.  With a final enlargement, I could see just how far this baboon had evolved toward being able to read – he was wearing a pair of glasses.

As you’ve figured out by now, the fascinating, contemplative creature was actually a tourist, just like me (only without the fear of heights). The only baboon in the picture had been on my side of the lens.  What will I tell my grandchildren now?  (At least until now, a few of them still look up to me.)  Is that Jacob, coming up the stairs now?

Still, the photograph remains my favorite wildlife photograph.  And the reason hasn’t changed, either: although it’s still a bit blurry, the photograph has deep, philosophical implications for the species it portrays.

— Joe

* Jonathan Grainger, Stéphane Dufau, Marie Montant, Johannes C. Ziegler, and Joël Fagot (2012). “Orthographic processing in baboons (Papio papio).” Science 336(6078): 245–248. PMID 22499949. doi:10.1126/science.1218152.




The Rather Large Ant

Thanks to my now-37-year-old son Daniel for today’s illustration of one reason we’re so easily and often wrong.  His e-mail to his mother, in thanks for his birthday present:
“Thanks mom, I love the shorts.  We recently changed to casual dress at work and all my old ones were probably bought when I was in junior high. I’ve been meaning to upgrade but I HATE clothes shopping, so this is a truly useful gift.
“Now on to the bad part. While I do enjoy the chocolates you gifted, I think we need to make those [chocolates] in-person gifts only from here on out. They actually held up to the temperature surprisingly well; they weren’t melted at all. However…
“As I was opening the package I noticed a rather large ant on my arm. I swatted it and continued opening the package.  I saw another ant on my leg.  As I went to send it to meet its brother, from the corner of my eye I saw two others skittering along the outside of the package.
“I figured a handful of ants had decided to check the package out.  While picking through the shorts I began to realize I should have taken this endeavor outside, as one after another appeared.  But I continued methodically, ensuring no ant escaped as I carefully separated each layer of the shorts, still blissfully overconfident in my ability to handle whatever lay ahead. When I got to the final pair I was a bit stressed out – I don’t like killing things, not even ants.  But this is my house now, and I gotta let them know.
“I lift that last pair up, and I see the expected few newly-disturbed ants run across its pockets. I’m happy this whole thing is almost over.  I reach toward one of the final survivors. He loses his footing, perhaps in fear as he sees that God has now chosen him, and falls to the bottom of the expected-to-be-empty box. Only it wasn’t. There is something truly awful about a swarm of anything, and ants are no exception. The silver lining is that millipedes would have been infinitely worse. I had the pride not to scream, but I jumped back in horror and revulsion. Right now the box is sitting in the backyard, to be dealt with in the safety of daylight.
“Thank you for the shorts and nightmares, mom.
“I love you more than you know.”
Dan’s e-mail was not composed with an eye toward appearing in this blog, but I offer it (with his permission) because it seems to me an excellent illustration of how we’re so good at persisting in error.  Central to this phenomenon, in my view, are the roles played by focus, expectation, and confidence.  In Dan’s case, the appearance of a single ant — a “rather large” ant, in fact — created a perception that it was a solitary intruder.  On the strength of that initial perception, the appearance of three other ants caused no more than the minimally necessary adjustment to the initial theory — a “handful” of ants was in the process of checking out the box.  And having adopted a careful strategy to deal with that belief — the uncovering and execution of every single threatening intruder — his very carefulness, his focused determination to execute that strategy, led to expectation, and to confidence in the expected result.  That very focus and confidence is what blinded him to the truth.
He’s a chip off the old block, alright — and so are the rest of us, I think: confirmation bias and WYSIATI (as Daniel Kahneman calls it) — What You See Is All There Is — are shared cognitive traits we’ve all inherited from common ancestors.  In the world in which we live, individual ants — the things we see — seem large.  Taken individually, each new ant simply confirms what we already believe; it takes a sudden swarm, discovered too late, to shock us into awareness of the way things really are.
Humility is the mother of wisdom, I think.  So my wish for today is that we can look at every ant we come across with wonder, knowing that behind every little thing we see, there’s a great many more — some of which are far larger — that we don’t.