Your Daily Dilemma

You’re approaching a door. 

Not one of those modern supermarket doors with motion sensors that open automatically, but one of those plain old doors with an old-fashioned brass knob you actually have to turn.

Cradled in your right arm is a large bag full of groceries.

In your left hand, you are carrying a Slurpee (or a Slushee, or whatever they call them).  By whatever name, it is a large styrofoam cup (bad for the environment), with a thin plastic lid (bad for the environment) and a plastic straw (the type that kills innocent birds), the liquid contents of which you have already mostly consumed (to the detriment of your gut health).  But the styrofoam cup is still full of ice and a couple ounces of a chemical-laden soft drink which, if consumed, will only further poison you.

But enough of that.  The immediate problem facing you is how to open the door.

To open the door with your right hand you’d have to put the bag of groceries down.  At your age, given the condition of your back, this isn’t as easy as it once was.  You might strain your back, or even fall and break your hip.   And if you put the bag down, it might fall over, spilling out all that marvelous junk food you were so looking forward to.  Is the dog around?  How much of it will he get, before you can stop him?

To open the door with your left hand, you’d either have to put the soft drink on the ground – risking problems similar to those just described, though not exactly the same – or try to turn the knob with the drink still in hand, hoping not to spill it.  But of course, if you spill the drink, there’s a floor to clean…

Which hand do you use to open the door?

Stop.  Really.  Stop and reflect on it.  Which hand would you use?

Some people would say the correct answer is the right hand.  Others would say the left.

But these people have been raised in a different world than mine – one in which there are only two answers, right or left. 

And whether they’ve chosen the right or the left, they feel quite sure that they have made the only sensible choice.

In my world, there are many, many options available.  And even if it were a simple choice between right and left (which it hardly ever is), there are so many unknowns, ramifications, risks, possibilities and preferences to consider, that it’s really all very subjective. I mean, maybe it would be better for the dog to eat the Twinkies, rather than you…

How is it even possible to think one choice “right,” and the other “wrong”?

How is it possible to think someone who chooses differently than you is either stupid or evil?

Ya got me.  Maybe, as kids, we all just drank too much Kool-Aid.

Wrong Parking Space

Fifteen years ago, I quit taking statins for high cholesterol; I’ve been resisting doctors’ pleas to resume them ever since. For several years now, the doctors have been recommending high blood pressure medication too. Dutifully, I added that recommendation to the list of those I respectfully decline to follow.

But this spring, a series of developments finally reduced (wore down?) my resistance. I’d felt some minor chest pains (more like muscle strain than anything serious), and after that, I began to notice that my blood pressure was way up. My wind was also down. Anyway, last week, I succumbed to an appointment with a cardiologist. The cardiologist insisted I come back for a nuclear stress test. The test was scheduled for this morning.

So I drive to the hospital. I pull into the parking garage and begin searching for an empty spot. The first level is full, so when I find an empty space on the second level, I start pulling into it – only to see a sign informing me that the space is reserved for the elderly. Dutifully, respectfully, I start to pull out of the space, until I happen to glance back at the sign.


After a lifetime of being young, I know that reserved spaces are for other people, not me. Right?

But I’ll be 69 next month.

Humbled yet again, I pull back into the space – apparently, the space where I belong.

– Joe

Sweatshirt Photo

Now you can judge for yourselves.

My last post recounted our domestic controversy about the color of my wife’s sweatshirt. It all began when I made a casual comment that, based on our “matching” sneakers, sweatpants, and sweatshirts, she and I were dressed alike. When she replied that my sweatshirt was gray and hers was green, I readily acknowledged that the match was not exact, and I’ll now happily submit to a judgment that Karen’s is sand, or tan, or mushroom, or any other label that simply proves I was wrong ever to think of it as being “gray” like mine.

But admitting I’m wrong is one thing. Admitting my spouse is right? That’s far, far harder. Can I manage it? Well… NO! I’ll DIE before I call it green!!!

Still, with so much controversy, I thought it only fair to post a photograph of the two. My gray shirt is on the right. Karen’s shirt — call it what you will — is on the left.

P.S. If anyone else says it’s green, I’ll — I’ll — well, I guess I’ll just have to count it as one more proof that there’s no such thing as objective reality.

Karen’s Sweatshirt

Karen and I were about to leave for the gym when I noticed we were both wearing white sneakers, black sweatpants and gray sweatshirts.  When I casually remarked on the coincidence, she surprised me by disagreeing.  Her green sweatshirt was nothing like my gray one, she said.  Mine was a classic gray, with no color at all; we both agreed about that.  But hers, she insisted, was clearly green.

Astounded, I examined her sweatshirt in every light I could.  To my eye, her gray sweatshirt was different from mine only in that it had an extremely slight brownish tint to it.  In certain lights, I thought I might detect some blue sparkle amidst the gray, and in other lights, red or purple.  If I really stretched, I could persuade myself there were occasional flecks of yellow, the way sunlight reflecting off a field of freshly fallen snow might sparkle with microscopic pinpricks of various colors.  But as I saw it, that was it.  The sweatshirt was clearly gray, as clearly as snow is white, and the mix of other tones, each of them barely noticeable, combined to give its grayness a little more earthiness than mine – no more. It was still clearly gray.

Our respective workouts at the gym did nothing to resolve our different perspectives.  So as we were leaving, Karen asked three women behind the membership counter to tell us what color her sweatshirt was.  Sensing marital discord, one of the ladies tactfully declined to venture an opinion.  But when a second said Karen’s sweatshirt was gray, I chortled with glee to have my opinion corroborated.  Karen’s dismay was evident.  Picking up on Karen’s dismay, the third woman studied the shirt carefully and announced that it was “tan.”  To my eye, there was a stronger hint of tan in the gray than of green, so on the drive home, I enjoyed that heady feeling a man gets when other women agree with him, especially in disagreement with his wife.  My self-satisfaction was further enhanced when, arriving home, Karen asked our daughter Kate her opinion.  Her answer – “sand” – was music to my ears.  I’ve never set foot on a green beach.

Now, we all know people can have different perceptions of the same thing.  But that’s not the point here.  At the moment Karen realized she didn’t have the support she’d expected, she blurted out, “Well.  It USED to be green!”

Aha!  For me, that explained so much.  The sweatshirt, nearly fifteen years old, had faded; Karen had clearly failed to notice the change.  In my very first WMBW blog, I’d told the story of two mistakes I’d just made on the golf course: one, forming an incorrect belief about the location of my ball; the other, more serious error, maintaining that belief thereafter, even in the face of evidence I was wrong.  If Karen’s sweatshirt had been green when she bought it, that would explain why she still thought it green.  She hadn’t noticed its gradual fading, so her once-green sweatshirt had always remained her green sweatshirt.

I was reminded of the time, forty years ago, when I wrote on an application for a new driver’s license that my hair color was blond.  When the clerk who took my application handed it back to me, saying my hair was brown, I argued with her.  My hair had always been blond.  It wasn’t until a look in the mirror at home that I realized she was right.  Examining myself with “new eyes,” I wondered how long I’d been ignoring the evidence while continuing my long-held belief. 

I thought I might post my thoughts about this phenomenon – the way we cling to our existing beliefs despite contrary evidence – here on WMBW.  But later that day Karen came gleefully home with the report that another daughter, Jen, agreed with her that the shirt was green.  I was crestfallen. For two weeks now, I’ve been bothered by that report.   Was my theory wrong?  Was it simply a matter of differences in the rods and cones of different observers?  Whether my theory about the persistence of old beliefs had validity or not, I felt compelled to admit that Karen’s sweatshirt was not persuasive evidence of it.   I’ve already written about rods and cones. Karen’s sweatshirt, it seemed, deserved no place in WMBW.

But wait.  Alert as you are, you might now be asking yourself, “Why then is he wasting my time with these reflections about the sweatshirt?”  Great question.  The answer is that, just last night, I found out still more about the sweatshirt: namely, I learned that Karen actually has two of them, and they are identical.  Same size, same style, same brand, same color.  Bought at the same time, some ten to fifteen years ago. Bought from the same store, one by Karen and one by Jen. Jen – the only other observer to call the sweatshirt green – had worn the identical green sweatshirt for years, back in the day when it really was green, before she gave it to Karen.

So now I blog to report that of five people who’ve based their opinions only on current evidence, there’s been a single tactful abstention, one “tan,” one “sand,” and two “grays.”  In contrast, the two “greens” come from the two women who bought the same green sweatshirts over a decade ago, wore them for years, and formed their beliefs long before the sun and frequent washings had done their work.  Five non-greens from people without prior beliefs, in contrast to two greens from people with prior beliefs.

Now, a sample size of only seven people may not be large enough to constitute statistical proof in support of my view.  That’s probably a good thing, because if a large sample size confirmed my theory, I might feel entitled to tell Karen I’d been proven right.  (And that’s rarely a good thing for one spouse to say to another.)  So this story is not one like my golf-ball post, about two mistakes – first forming an erroneous belief, then holding on to that belief without being willing to question it.  And this is not even a story about my conviction that long-held beliefs (whether accurate or not) persist in the face of recent contrary evidence.  (There are reasons our marriage has lasted 47 years.)

Rather, my point is simply that there’s always new evidence that can be brought to bear on one’s beliefs. In the case of Karen’s sweatshirt, when all I had was my own observation, that single observation was enough to persuade me that the shirt was gray. As Daniel Kahneman writes in Thinking, Fast and Slow, “You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it.”  The thing is, Karen, too, had built her story years ago, and based on the information she had at the time, the shirt was green.

For me, subsequent evidence (Karen’s perception that the shirt was green) was enough to get me looking closer, to question my perception, though it didn’t change my mind.  But next came evidence of the perceptions of others – proponents of tan, and sand, and another gray – that led me to a conclusion about retinal differences (not to mention to gloating that I was in the majority).  The next piece of evidence – that Karen had bought a green shirt long ago that had apparently faded – changed my understanding from a theory of retinal difference to one of believing that Karen was suffering from confirmation bias.  Next, with the evidence that Jen, too, thought the shirt green, I was thrust back in the direction of retinal differences.  Now, the most recent information – that Jen wore the identical shirt for years – has cast yet another light on the whole matter. Currently, I’m back to attributing this “minority view” to confirmation bias. But as for the continuing parade of evidence to consider, has it ended, or is there more to come?

Kahneman could have had my initial conclusion in mind (the simple story that the sweatshirt was gray because I perceived it as gray) when he wrote, “Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle.” Indeed, the more I learned, the more complicated the puzzle became. But I think Kahneman’s conclusion is profound: “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

I believe it’s natural for people to continue to believe what they’ve always believed, even in the face of contrary evidence.  And so, I suspect I’ll continue to believe that confirmation bias, aka close-mindedness, is a shared trait of our common human nature – at least until presented with contrary evidence. I only hope that I’ll be willing to consider that new evidence if it comes my way. I just don’t see any reason to believe that I already know everything about that sweatshirt that there is to know.

This morning, when I told Karen I was going to post my thoughts about all this, she looked me in the eye and said, “I STILL think it’s green.”   Well.  I still think it’s a sandy shade of gray. And I’m not calling Karen any more stubborn than I am, because she is a fantastic listener, always willing to consider new information. I only hope she feels the same way about me.

Best to all for the new year.


The Last Word

With much sadness, I have just now changed this website’s description of one of We May Be Wrong’s founding members – from the present tense, to the past.

In 1960, a 24-year-old Dr. Paul Clement Czaja (January 9, 1936 – May 8, 2018) had just earned his Ph.D. in philosophy when he persuaded Nancy Rambush (then headmaster of the Whitby School and founder of the American Montessori Association) to let him teach existential philosophy to children.  She was impressed with his enthusiasm and his willingness to work for practically nothing, but since she thought parents might not understand the importance of teaching philosophy to children, she asked if he wouldn’t mind teaching other things as well.  So Paul “officially” taught creative writing, Latin and various other subjects not often taught to ten-year-olds.  But philosophy was his first love, and it found its way into everything.

Only fourteen years older than me, Paul was more an older brother than a teacher.  He showed me how to love the world around me; introduced me to the joy of learning everything I could about it.  The way a magnifying glass could make fire; the way Latin could turn language on its head yet still come out as modern English; the thrill of catching butterflies in nets; the way the Greek Alphabet could be painted with Japanese brushes and jet-black ink; the vital inner parts of dissected foetal pigs; the wonders of the Trachtenberg system of mathematical calculation; the wiggling of microscopic paramecia in pond water; the thrill of catching people and their stories with a 35 millimeter still camera, that of making our own stories with  a 16 millimeter movie camera, and then, the even weirder thrill of telling stories with frame-by-frame, stop-motion photography; the writings of Gertrude Stein, William Carlos Williams, and James Baldwin; the power of telling stories of our own  with just pen and ink.  We spliced and edited rolls of movie film we’d made and, somehow, we even enjoyed diagramming sentences, rummaging through grammar the way we searched for the Indo-European roots of words. Though I was not yet a teenager, Paul introduced me to Ingmar Bergman movies, to Van Gogh’s Starry Night, to Rodin’s The Thinker, and to Edward Steichen’s photographic exhibition,  The Family of Man.

To say the least, it was not your typical middle-school education.

They say that when a butterfly flaps its wings, it can have profound effects on the other side of the world – a concept I first heard from Paul, I’m sure.  If I hadn’t met him, he wouldn’t have written the recommendation that got me into Phillips Exeter, and I wouldn’t have… well, if a single butterfly flapping its wings can have a profound impact, having Paul as a teacher every day (winter and summer) for four impressionable years was like being borne to Mexico by millions of Monarchs.  We stayed in touch during my later school years, and then persisted in friendship as the difference in our ages seemed to vanish with the passage of time.  And so, I was pleased that he joined We May Be Wrong in 2016 as one of our founding members.

But now, it’s time for a confession.  As we tried to get our new website off the ground, Paul proposed that WMBW publish a poem he had written.  Being a man of great faith, Paul wrote a lot about God – prayers, poems, meditations.  When he proposed that WMBW publish his poem, I disagreed on the ground that I didn’t want the brand new website to come across as “pro” or “anti” anything controversial.  I didn’t want to risk alienating potential followers, be they liberal or conservative, Republican or Democrat, believers or non-believers, by implying some sort of hidden agenda.  (The ONLY agenda was to be the benefit of listening to others with an open mind.)  Holding the keys to the publishing platform, I declined to publish his poem lest it be misunderstood to evangelize about God, rather than fallibility.  But even then, I told him, once the website has been up for a while, we might be able to publish that sort of thing.

Well, the time has come.  I wish I’d published it before he left the earth he loved for the better one he yearned for.  For Paul,  I can only say a prayer of thanks for all he did for me, and for so many other children, and now, share his wonderful poem.  (It seems only right that he should have the last word.)

Fire in the Soup: A Creation Story

It happened


this earth

had just cooled down


being molten magma

to being


and steaming




the vaporous skies

had emptied

eons of towering cumulus

clouds of rain

making oceans


were so great


the whole sphere


much more a watery world,


the rocky land


but one large

continental island


in the middle

of a now

beautiful blue planet.


And then

when the heavens

were no longer


by that thick


of sulphurous cloud cover,


the earth’s atmosphere


pure and clear



the stars of the universe

to shine

so brightly


the night sky

seemed to be


with black peppery dots,



that a flame

came streaking

through the sky


to earth


the warm soup

of the sea





that chemical mineral ooze


the very first


that ever was

on this

so singular planet.



when that protozoa

eventually became





the idea


that perhaps

that life causing



once upon a time


the oceanic soup

could be

the pure energy

that is




that were



all life


ever evolved


that first protozoa

would be




of the eternal God —







God is love.

Such a thought

seems to be

a happy,

hope filled,


kind of


–Paul Clement Czaja


Love and Long Life

I just got back from a foray into the unfamiliar.  The unfamiliar nearly always gets me thinking (which is why I love it so).  And sometimes, you get another WMBW post as a result.

In this case, the experience wasn’t entirely unfamiliar.  It was my fiftieth high school reunion.  But as familiar as some of the attendees were, I encountered people different from the ones I had known fifty years before.  Older, of course.  And wiser, I should hope.

There was a meeting of former classmates, a discussion session, planned as an exchange of ideas.  I was scheduled to speak briefly on the subject of We May Be Wrong.  The moderator who kicked off the session – now a well-known author and psychiatrist – began with the assertion that it is Love – along with the treasured relationships that bind us together on account of Love – that is the single best predictor of a long life.  In study after study, said the moderator, the correlation between a long life and a life rich in friendships and relationships with loved ones is strong – stronger, in fact, than with any other predictor.  By returning to campus because of our bonding, we had self-identified as people who, on average, would lead a long life.

I do not doubt such research results.  I instinctively feel that the proposition is true.  It resonates with my WMBW perspective.  I immediately bonded with the speaker, sensing we had much in common.  And as I waited for my turn at the podium, I hoped my message would be well received.  I looked forward, in other words, to increasing the bond between my audience and myself.  I wanted to feel more of that love, even if it didn’t add a few more hours to my life.

But then, the assertion was made that one reason we classmates had so bonded as to travel from across the country, and even from some other countries, to reassemble fifty years later, was that our common enemies had created an especially strong bond among us.  The school’s Dean during our years in school was identified as one of them; when a class member suggested we take time to share stories about the man, there was no shortage of volunteers.  (Needless to say, all the stories were about how inhuman and unfeeling he was, and every one was welcomed with nods, and laughter, and more love.) President Lyndon Johnson was another target, and the Vietnam war: both were identified as things we all opposed, things that brought us together in love and portended well for our future longevity.

The moderator then said we ought to be proud, describing us children of the sixties as the generation that had changed the course of America by championing love, peace, and understanding.

In fact, between 1964 and 1968, I and a few other nerds had been members of the Young Americans for Freedom.  If you’re too young to remember it, the YAF was a student group that supported the war in Vietnam and other conservative political positions.  What’s more, I’d never had a run-in with our dean.  While in school, I had had a problem identifying with all the “bonding love” my classmates felt; during my years there, I’d felt shunned and ridiculed by them on account of my minority beliefs.  The ridicule led me, in those vulnerable years, to withdraw from my politically-charged peers.  To keep my political views to myself.

As I recalled these teenage experiences, I found myself contemplating something that had been said at breakfast that morning by a different attendee.  She had cited recent scientific research to the effect that the electro-chemical activity in the brain that occurs when we are with loved ones, feeling the bonds of strong community, is precisely the same as the activity occurring when we come together and hate (or at least disapprove of) those not within our group.

This, too, struck me as all but self-evident.  After all, why is it that having a common enemy causes us to unite, to feel comfort,  security, and all the ties that bind?

A few minutes later, when I got up to speak about We May Be Wrong, I found that some things hadn’t changed from fifty years earlier.  I still hoped my thoughts would be well received.  I still wanted to feel some of that love.  To belong to the tribe.

The world is a lonely place, from the outside, looking in.

– Joe

Thanks to F. Lee Bailey – Part Two

Last time out, I was discussing F. Lee Bailey’s effort to identify various reasons a witness can be mistaken.  Bailey’s thesis was that juries don’t want to believe that witnesses lie, so the wisest and most effective way for a lawyer to discredit a witness is to point out for the jury other reasons – other than bald-faced lies – that a witness might not be telling the truth.  Attempting to come up with my own list, I offered examples that involved lack of information, interpretation of information, forgetfulness, the making of assumptions, the lack of focus, and unconscious force of habit.  Today, I continue that survey of reasons for error.

One common reason witnesses can seem to have diametrically opposed versions of reality, when neither is lying, has to do with language.  When my son was four years old, he was being particularly cranky one evening, whining out loud while I was trying to watch television.  I told him to behave himself or I’d put him to bed.  He quieted a bit, but only momentarily.  So I repeated my threat.  “Behave,” I said in a louder and sterner voice than the first time, “or you’re going to bed!”  Once again, the threat worked only briefly; when his whining began again, I repeated my threat a third time, even more sternly than before.  Again it worked, but when the whining returned only seconds later, at the limit of my patience, I cried out “Daniel, behave!!” in the fiercest tone I could muster.  Frightened nearly to death by my obvious anger, he stood there, his chin trembling in fear.

“I’m haive,” he assured me.  “I’m haive.”

The point is, words mean different things to different people.  Language can get in the way.  To my four-year-old son, I might as well have been babbling.  Was it his mistake, to misunderstand, or mine, to assume he understood what it meant to “be have”?  It was, in either case, a failure of communication.  And failure of communication consistently ranks high on lists of reasons for mistake.

Sometimes, we have trouble communicating even with ourselves, and when this happens, it suggests different reasons for error.  A couple of years before we left Florida, on a winter day when Karen had invited two guests to the house to paint for the afternoon, I agreed to cook them a meal.  A wall of sliding glass doors that looked out to the swimming pool gave the kitchen the best light for painting, and because of the light, it was the ladies’ chosen spot, as well as my work area for the day.  The meal included a spiced chutney made with coriander, cumin, and a little cayenne pepper.  Soon after preparing the chutney I felt a burning sensation in my right eye.  I rubbed the eye with the back of my hand, and then with a wet cloth, but rubbing the eye seemed only to increase the burning sensation.  My tear ducts went into high gear, but despite this natural defense, the burning did not abate.

I’d just recently started wearing contact lenses, and fearing that a lens could be trapping the offensive powder against my cornea, I worried it might be the reason my tear ducts were being ineffective.  The worry was heightened when I went to the sink and flushed my eye with a glass of water, with no consequent reduction in pain.

The urgency of removing the spices became an urgency to remove the contact lens – but I realized quickly that I was having great difficulty even locating the darn thing.   When I tried to squeeze it off and out, my fingers came up empty.  My inability to feel it suggested two possible explanations.  As had happened before, it might have become so closely fitted to the cornea that underlying suction was simply preventing its removal.  Alternatively, all that tearing (or the water from the sink)  had washed the lens down into the eyelid where (having assumed the shape of a folded burrito packed with spicy powder) it was making elimination of the powder impossible.  With the burning sensation getting stronger by the second, I raced from the kitchen to the closest mirror – in our bedroom upstairs – and pulled the lower eyelid down in a search for the offensive lens.  But what with hyperactive tear ducts, pain, and the lack of a functioning lens, my poor eyes couldn’t tell whether the lens was in the eyelid or not.  I couldn’t feel it there, or folded into the upper eyelid, or stuck stubbornly to the cornea itself.  Ever more determined to remove it, I kept pinching at the lens with my fingertips from everywhere in the eye socket it might possibly be.

Unsuccessful, I ran back downstairs, flung open the sliding glass doors and crouched at poolside, dunking my head into the winter-cold water, thrashing my head to generate as much flow as possible, convinced that this, at least, would flush out the offending lens.  But when I lifted my head from the water, the pain only increased.  The ladies were laughing now, asking what in the heck I was doing.  But caring only about the pain, I shut my eyes.  The pain increased.  Again and again, I tried to fish for the offending lens, sure that it was to blame, wherever it was.

The pain eventually stopped – but not until the ladies suggested I thoroughly wash my hands.  As soon as I did, I realized I could use my fingers to pinch around for the missing lens without adding more spice to the mix.  But even then, I couldn’t locate the lens.

Able at last to see well and think straight again, I found my glasses on the kitchen counter.  Only then did I realize the depths of my folly.  Removing the glasses had been the first thing I’d done, even before rubbing my eyes with the kitchen cloth.  I hadn’t been wearing my contact lenses that day at all.

How does one classify such an error?  You could ascribe it to my inexperience in the kitchen and consequent failure to wash the spices off my hands.  You could ascribe it to my inexperience with contact lenses.  You could ascribe it to my bad decision-making when under pressure, or to forgetfulness, or to lack of focus.  You might say that habit was to blame, as the removal of my glasses at the first sign of irritation was one of those unconscious habits that are so automatic we forget about them.  (In that case, lack of self-awareness about my own habits was also to blame.)  Finally, you might ascribe it to the presence of an idea – a false idea, but an idea nevertheless – that, once in command of my attention, made all the other reasons irrelevant.  The idea that a contact lens had trapped the powder had supplanted the powder itself as the culprit in need of ferreting out.  I’d entirely made up the story of the folded contact lens, but it was so graphic, so real, so painfully compelling, that it became the thing I focused on; it took command of my world.

One conclusion I draw is that it’s hard to classify reasons for error neatly into distinct types, because any one error may result from all sorts of factors.  But being human, I’m prone to think in terms of types and classifications; they help me think I better understand the world.  And when I do, I’m especially fond of this last-mentioned cause for error – the false stories we tell ourselves.  False as my story about the contact lens was, IT was the story playing out in my head; IT created the entire world with which my conscious self interacted.  For all intents and purposes, it became my reality.

I’ve enjoyed reading psychologists, philosophers and story-tellers share thoughts about the stories we tell ourselves.  I’ve especially enjoyed reading opinions about whether it’s possible for the human brain to know whether the world it perceives is “real” in any sense distinguishable from the stories we tell ourselves.  Ultimately, I don’t know if creating these stories is the most common reason for our errors or not, but I think they’re among the most interesting.

Finally, to F. Lee Bailey, in addition to conveying my thanks for getting me to think about the reasons people may be wrong, I’d like to convey a suggestion: that, possibly, people lie more often than he supposed.  Possibly, they just do it, most often, to themselves.

— Joe

Happy Halloween

Since we planned to be out of town for Halloween this year, we produced our annual Haunted House last night, a bit before official trick-or-treat night. “We” means myself and my volunteer crew, of which, this year, there were thirteen members.  What an appropriate number for a Haunted House!

As usual, it took several weeks back in September for me to get psyched.  First, I had to stop thinking about my other projects.  I had to come up with a theme, decide on characters, scenes, and devices, and develop a story line in my mind, imagining the experience our visitors would have, before I could nail the first nail.  As I created the structure that defined the maze-like path to be followed, as I shot each staple into the black plastic walls intended to keep visitors’ footsteps and focus in the right direction, as I adjusted the angle and dimness of each tea light to reveal only what I wanted to reveal, eventually, the construction of the house drew me into the scenes and characters I was imagining.  And as usual, now that “the day after” has arrived, I’ve awoken before the sun rises, my mind crawling with memories of last night’s screams and laughter.  I try to go back to much-needed sleep, but the thoughts of next year’s possibilities get in the way.  It’s the same old story.  Once my mind gets psyched for the Haunted House, it starts to wear a groove in a horrific path; now, it will take something powerful to lift it out of that groove.

I wish I’d done more theater in my life.  I suppose some of my elaborate practical jokes might lay claim to theater.  I’ve even tried my hand at a few crude movies of the narrative, “artsy” sort.  But mostly, it’s been novels and haunted houses.  I suppose I’ve wanted to tell stories with pictures and words ever since I was a kid.  It’s how I’ve always imagined who I am.

In my efforts to be a better writer, I’ve read much on the craft of writing, from popular books like Stephen King’s On Writing to academic tomes like Mieke Bal’s Narratology.  But among the ghouls and monsters on my mind this dark morning comes the memory of a book on writing I read a few years back, one by Lisa Cron called Wired for Story.  That book makes the point that human brains have evolved to give us a highly developed capacity – indeed, a need – to think in terms of stories, and that we’re now hard-wired to do so.

The opening words of Ms. Cron’s book set the neurological stage:

“In the second it takes you to read this sentence, your senses are showering you with over 11,000,000 pieces of information.  Your conscious mind is capable of registering about forty of them.  And when it comes to actually paying attention?  On a good day, you can process seven bits of data at a time…”

Cron’s book goes on to describe how the very success of our species depends on our capacity to translate overwhelming experience into simple stories.  I don’t know the source, or even if it’s true – maybe from The Agony and the Ecstasy? — but Michelangelo is said to have observed that when he sculpted, he didn’t create art, he just removed everything that wasn’t art.  In my own writing, I’ve come to realize how true that is.  Research produces so many pieces of data, and because I find it fascinating, my temptation is to share it all with my readers.  But thorough research is a little bit like real life, which is to say, like Cron’s 11,000,000 pieces of information.  That much information simply doesn’t make a story, any more than the slab of marble Michelangelo started with makes art.

Our brains are not wired to deal with such overloads, but to ensure our survival, which they do by “imagining” ourselves in hypothetical situations, scoping out what “might” happen to us if we eat the apple, smell the flower, or step in front of the oncoming bus.  Every memory we have is similarly a story – not a photographic reproduction of reality, but an over-simplified construct designed to make sense of our experience.  Think of what you were doing a minute before you started reading this post.  What do you remember?  Certainly not every smell, every sound, every thought that crossed your mind, every pixel of your peripheral vision.  What you remember of that moment is a microcosm of what you remember about your entire life.  Sure, you can remember what you were doing September 11, 2001, but how many details of your own life on that infamous day could you recall, if you devoted every second of tomorrow to the task?  And that was a very memorable day.  What do you recall of  September 11, 1997?  Chances are you have no idea of the details of your experience that day.  The fact is, we don’t remember 99.99% of our lives.  All we remember are the pieces of the narrative stories we tell ourselves about who we are, which is to say, what our experiences have been.

The same holds true about our thoughts of the future.  As we drive down the road, we don’t forecast whether the next vehicle we pass will be a blue Toyota or a green Chevy.  We do, however, forecast whether our boss will be angry when we ask for a raise, or whatever might happen that’s important to us when we arrive at our destination (which is, usually, a function of why we’re going there).  Whether we’re thinking about the past, the present, or the future, we see ourselves as the protagonist in a narrative story defined by the very narrow story-view we’ve shaped to date, which includes our developing notions of what’s important to us.  Our proficiency at doing this is what has helped us flourish as a species.  This is why photographers tend to see more sources of light in the world, and painters more color, while novelists see more words and doctors see more symptoms of illness.  The more entrenched we become in who we are, the more our perceptions of reality diverge from one another’s.

Understanding ourselves as hard-wired for dealing with simple, limited stories rather than the totality of our actual experience – not to mention the totality of universal experience – has important ramifications for self-awareness.  As the psychologist Jean Piaget taught us, from our earliest years, we take our experiences and form conclusions about the patterns they appear to represent.  As long as new experiences are consistent with these constructs, we continue interpreting the world on the basis of them.  When a new experience presents itself that may not fit neatly into the pattern, we either reject it or (often with some angst) we begrudgingly modify our construct of reality to incorporate it.  From that point forward, we continue to interpret new experiences in accordance with our existing constructs, seeing them as consistent with our understanding of “reality” (as previously decided-on) whenever we can make it fit.

And so, from earliest childhood, we form notions of reality based on personal experience.  The results are the stories we tell of ourselves and of our worlds, stories which have a past and which continue to unfold before us.  As Cron points out, we are the protagonists in these stories.  And I’d like to make an additional point: that in the stories we tell ourselves, we are sometimes the heroes.  We are sometimes the victims.  But unless we are psychopathic, we are rarely, if ever, the villains.

There are, of course, plenty of villains in these stories, but the villains are always other people.  In your story, maybe the villains are big business, or big government; evil Nazis or evil communists; aggressive religious zealots, cold-blooded, soul-less atheists, or even Satan himself. It could be your heartless neighbor who lets his dog keep you up all night long with its barking, or the unfeeling cop who just gave you that unjust speeding ticket.

As you think of the current chapter of your life story, who are the biggest villains?  And are you one of them?  I doubt it.  But I suggest asking ourselves, what are the stories the villains tell about themselves?  What is it that makes them see themselves as the heroes of their stories, or the victims?  Isn’t it reasonable to assume that their stories make as much sense to them as our stories make to us?

We have formed our ideas about reality based on our own experiences, because they make sense to us.  Indeed, our stories make sense to us because they are the only way we can get our minds around a reality that’s throwing 11,000,000 pieces of information at us every second of our waking lives.  We live in a reality of mountain ranges, full of granite and marble.  Michelangelo finds meaning in it by chipping away everything that isn’t The Pieta, Auguste Rodin by chipping away everything that isn’t The Thinker.  When they find meaning in such small samples of worldwide rock, is it any wonder they see reality differently?

Psychologists tell us that self-esteem is important to mental health, so it’s no wonder that in the stories we tell ourselves, we are the heroes on good days, the victims on bad ones, and the villains only every third leap year or so.  Others are the normal villains.  But if I’m your villain, and you’re mine, then we can’t both be right – or can we?  An objective observer would say that your story makes excellent sense to you, for the same reasons my story makes excellent sense to me.  Both are grounded in experience, and your experience is quite different from mine.  Even more importantly, I think, your “story” represents about 7/11,000,000th of your life experiences while my story represents about 7/11,000,000th of mine.

But confirmation bias means that we fight like heck to conform new experience to our pre-existing stories.  If a new experience doesn’t demand a complete re-write, we’ll find a way to fit it in.  It’s like we’re watching a movie in a theater.  If some prankster projectionist has spliced in a scene from another movie, the whole story we’re watching makes no sense to us and sometimes we want to start over, from the beginning.    If our stories are wrong, our entire understanding of who we are and how we fit in becomes a heap of tangled film on the projection room floor.

One of the things I love about Halloween is how it lets us imagine ourselves as something different.  I mean, Christmas puts our focus on Jesus or Santa Claus, role models to emulate, but their larger-than-life accomplishments and abilities are distinctly other than the selves we know.  Mother’s Day and Valentine’s Day encourage us to focus on other people.  Halloween is the one holiday that encourages us to pretend to be something we’re not – to put aside our existing views of the world “as it really is” and become whatever our wildest imaginations might see us as.  I think that’s why I like it so much.  Obviously, I’m not really a vampire ghoul from Transylvania, but when my current worldview is based on a tiny, 7/11,000,000th slice of my own personal experience, how much less accurate can that new self-image be?

I think of “intelligence” as the ability to see things from multiple points of view.  The most pig-headed dullards I know are those who seem so stuck in their convictions that they can’t even imagine the world as I or others see it.  I tend to think that absent the ability to see things from multiple points of view, we’d have no basis for making comparisons, no basis for preferences, no basis for judgment, and therefore, no basis for wisdom.

Halloween is the one time of year I really get to celebrate my imagination, to change my story from one in which I’m hero or victim to one in which I’m a villain.  As I try to see things from a weird, offbeat, or even seemingly evil point of view, I get practice in trying to see things as others see them.  For me, it seems a very healthy habit to cultivate.

But I must end on a note of caution.  As someone who tries to tell stories capable of captivating an audience, I am keenly aware of a conflict.  As the dramatist, my goal is to channel your experience, your thoughts, your attention, along a path I’ve staked out, to an end I have in mind.  When I’m successful, I create the groove.  My audience follows it.  In this respect, good story-telling, when directed toward others, is a form of mind control.

But what about story-telling to oneself?  It’s probably good news that in real life, there isn’t just one Stephen King or Tom Clancy trying to capture your attention or lead you to some predetermined goal.  Every book, movie, TV commercial, internet pop-up ad, billboard, preacher, politician, news reporter, self-help guru and next door neighbor has a story to tell, and wants you to follow it.  The blessing of being exposed to 11,000,000 pieces of information every second is that we’re not in thrall to a single voice trying to control the way we see the world.  But does this mean we’re free?  The reduction of the world’s complexity into a single world-view is a story that IS told by a single voice — our own.  All of our individual experiences to date have been shaped by our brains into a story, a story in which we are the heroes and victims.  The most powerful things that seek to control our views of the world are those stories.  We’ve been telling ourselves one since the day we began to experience reality.  My own?  Since early childhood, I have seen myself as a story-teller.  Since September, the imminence of Halloween has forced me, almost unwillingly at first, to focus on my annual Haunted House.  At first, it was hard.  But in just a few weeks, the themes, characters, scenes, and devices of this story took such a hold on me that I woke up this morning unable to think of anything else.

Such are the pathways of our minds.  If my thoughts can be so channeled in just a few weeks, how deep are the grooves I’ve been cutting for over sixty years?  Am I really free to change the story of my life, or am I the helpless victim of the story I’ve been telling?

This week, try imagining yourself as something very different.  Something you’d normally find very weird, maybe even distasteful.  But remember – don’t imagine yourself the villain.  Imagine yourself, in this new role, as part hero, part victim. Get outside your prior self, and have a Happy Halloween.

— Joe

My Favorite African Photo

I got back from an African safari vacation last night, very jet-lagged, having not slept for about 43 hours.  When I woke up this morning, I was anxious to start organizing the photographs from my trip.  Sitting down at the PC to do so, I found an e-mail from my erstwhile roommate, John, reminding me to send him photos of the wildlife I’d seen.  (John is an avid outdoorsman who once tried to make a living as a wildlife photographer.)  Having not yet gone through the photos myself, having not yet cropped, nor cut, nor selected any of them, I wasn’t ready to give John the full-blown “Here Are the Pics of My African Vacation” slideshow – but I decided I’d send him just one of them, both because it was my sentimental favorite of all those I’d taken, and because I knew that John, of all people, would appreciate it.

Now, the reason John would appreciate this particular photo was not just that he’s an erstwhile wildlife photographer; almost all the photos I’d taken were of African wildlife.  But the year that John and I spent as college roommates, many decades ago, was marked by regular discussions of deep philosophical issues; and this photograph had become my favorite due to its philosophical implications, implications I felt sure John would appreciate.

As I learned on the Shamwari game preserve, most wild baboon troops of South Africa run quickly away at the approach of human beings.  But on the day this photograph was taken, I had come to the extreme southwestern tip of the African continent, a rocky, mountainous formation that rises high above sea level like the prow of a sailing ship projecting above the ocean waves.  In fact, here is a photograph – one taken from the Wikipedia article regarding the Cape of Good Hope – which shows the general topography of the place.

View at Cape Point

(Photo by Thomas Bjørkan – Own work, CC BY-SA 3.0)

Naturally enough, given the impressive topography, the Cape has become a tourist attraction.  The result of being a tourist attraction is that the native baboons of the Cape have lost their fear of human beings.  In the Cape Point parking lot, they were nearly as plentiful as the people, ready to pounce on anyone foolish enough to walk by with a sandwich in hand.  They were sitting on the roofs of cars.  They were scouting for half-open windows through which to steal picnic lunches.  They were on the rocks, in the bushes, outside the souvenir shop, intermingling fearlessly with us, their more advanced cousins.

I took the photograph in question – the photograph I wanted to send to John because it had become my favorite – while standing on the Cape, looking south like some fifteenth century Portuguese explorer from the bow of his ship, gazing across thousands of miles of ocean toward the south pole.  The Atlantic Ocean was to my right, the Indian not far to my left, and the Antarctic somewhere far in the distance in front of me. To my immediate left, on the summit of the mountain peak, a lighthouse had been built to guide ships rounding the Cape. Because of my fear of heights, I had not attempted the funicular or the steep climb from the funicular to the summit, but as I looked at the rocky cliff, with the triple-ocean breeze blowing into my face and the triple-ocean surf crashing into the old, unmoving rocks below, I noticed movement high up on the cliff’s stony face.  Tapping into the unconscious (but ingrained) ability of one primate to recognize the movements of another, I was drawn to it, a twitch on the horizon, a dark profile silhouetted against a bright sky.  He was maybe fifteen hundred feet away from me and several hundred feet higher than me, but I could see him settle onto one of the highest, most southerly rocks of the cliff side, clearly fixing his gaze southward, looking out over the oceans just as I’d been doing — except, of course, that he was braver and more agile than me, having dared to climb out onto the virtual bowsprit of the continent, where I would not.  I wondered why he wasn’t in the parking lot, with the rest of his kind, ready to pounce on a tourist; wondered why he had gone off on his own, to gaze across the oceans toward the vast unknown.

Like all primates, baboons are an intelligent species.  Scientists have recently discovered that they can acquire orthographic processing skills which form part of the ability to read.*  I wondered if this solitary philosopher was more intelligent than his fellows in the parking lot.  I imagined the thoughts he was having about other lands, far away.  I imagined him capable of evolving into another Bartolomeu Dias someday.  Gazing across the ocean and into the unknown, I wondered: wasn’t it possible he had seen ships pass, and wondered how he might build a ship of his own, to go exploring, some day?  I maximized the camera’s zoom and got the best picture of the contemplative creature I could.

The sight so impressed me, in fact, that for the rest of my time in Africa, I told people about it.  Last night – my first night home – I told my wife, and my daughters, and my grandson Jacob, about it.  Jacob in particular was wide-eyed as I promised to show him the photograph when he comes over this afternoon.   The profiled creature has become my hero; the photograph of him looking out across the ocean has stuck with me, and I haven’t been able to get it out of my head – more than the photographs of lions, cheetahs, elephants, and giraffes I took – even more than the elegant springbok herd, the pod of dangerous hippopotami, or the solitary, rare and elusive black rhinoceros.  It is my favorite photograph, despite the fact that, fully zoomed in, and lacking a tripod for my camera, the image came out slightly blurry. It is my favorite not for its technical quality, but because of its fascinating philosophical implications.  And as I composed my e-mail to John this morning, he seemed the perfect person to appreciate those implications.

Anyway, this morning, as I composed my e-mail to John, I described the photograph I was sending him, explaining why it was my favorite, much as I had to Jacob last night, much as I have to you here.  As I was finishing my written description, my grandson Evan walked into the room.  I invited him to come take a look at the photograph of the contemplative baboon.  I fetched it from the digital camera’s SD card and displayed it on my monitor.  Evan and I shared still more deep, philosophical observations about our most intriguing subject.  Finally, after Evan departed, I embedded the photograph into my e-mail to John, as I now do here:

You can see the solitary baboon toward the top of the picture, squatting on all fours, his tail raised behind him, dreaming of building his own ship and exploring the oceans on three sides of him.

Alternatively, you can do as I did.  To wit: as I embedded the photograph into my e-mail to John, I realized that I could blow it up even larger, digitally, than I’d been able to do through the zoom setting on the camera.  With the wonders of modern technology – my virtual icon in the shape of a magnifying glass with a plus sign – I was able to enlarge the photo enough to see the image at a level of detail not revealed by the camera’s telephoto lens.  Glints of sunlight on the rock, the baboon’s tail, his haunches.  Magnifying the image even more, I thought I might even have captured the contemplative expression on the creature’s face.  But the more I enlarged it, the more the baboon’s haunches looked like a torso, his legs hidden behind the rock; and the more its tail looked like a back pack.  With a final enlargement, I could see how close this baboon had come to evolving the ability to read – he was wearing a pair of glasses.

As you’ve figured out by now, the fascinating, contemplative creature was actually a tourist, just like me (only without the fear of heights). The only baboon in the picture had been on my side of the lens.  What will I tell my grandchildren now?  (At least until now, a few of them still look up to me.)  Is that Jacob, coming up the stairs now?

Still, the photograph remains my favorite wildlife photograph.  And the reason hasn’t changed, either: although it’s still a bit blurry, the photograph has deep, philosophical implications for the species it portrays.

— Joe

*Jonathan Grainger; Stéphane Dufau; Marie Montant; Johannes C. Ziegler; Joël Fagot (2012). “Orthographic processing in baboons (Papio papio)”. Science. 336 (6078): 245–248. PMID 22499949. doi:10.1126/science.1218152.