Four Billion Years

Most of you have probably recognized my obsession with matters of scale – the idea that our brains are wired in ways that prevent us from grasping orders of magnitude.  Whether it be atomic particles measured in Planck lengths, galaxies measured in light-years, or even our difficulty estimating the number of grains of sand on a beach, I’ve been pondering the impossibility of genuine comprehension. Can we only deceive ourselves into thinking we can comprehend such numbers and distances?

As a child visiting the American Museum of Natural History in New York, I was impressed by the timeline of life on Earth laid out as a wall mural, and by the fact that mankind occupied such a small section of it compared to the history of all life.  But then I noticed that the scale had changed to give humanity more space than its proportionate share, as if children couldn’t handle the relative insignificance of humanity if we saw the real thing.  I don’t know whether that mural has been redone since the 1960s, but the Museum still offers for sale what it calls “The Big History Timeline Wallbook” for $19.95. Its website describes the six-foot-long timeline as “divided into 12 sections covering both natural history as well as the history of human civilizations.”  As I see it, giving human civilization one twelfth of a timeline of all existence is bound to leave kids with an inflated sense of the importance of humanity.

I’ve been trying to make things (spreadsheets, videos, books, diagrams) that don’t employ collapsing scales to appease our need to see the importance of ourselves.

For example, I recently tried to get a feel for the height of Mt. Everest compared to the size of the planet as a whole.  I’d seen plenty of diagrams in books and on the internet which show mountains on the surface of the Earth, but whenever the whole Earth itself was represented, the images were labeled with the caveat, “Not Drawn to Scale.”  The reason, of course, is that the diameter of the Earth is so much greater than the height of Everest that for an image of the Earth to fit on one page of a book, Mt. Everest would only be as high as a human hair is wide.  Such problems drive illustrators to resort to deception.  If the people at the American Museum of Natural History didn’t resort to collapsing scales, their six-foot-long diagram of history since the big bang would leave less than two feet for the history of all life on Earth, about 1/1,000th of an inch for the history of Homo sapiens, and about a twentieth of that for what we think of as human “civilization.”
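For anyone who wants to check the constant-scale arithmetic, here’s a rough sketch in Python. The round figures are assumptions on my part (a 13.8-billion-year universe, 4 billion years of life, 200,000 years of Homo sapiens, 10,000 years of civilization), not the Museum’s:

```python
# Lengths on a six-foot, constant-scale timeline of everything.
# All the durations below are round-figure assumptions.
TIMELINE_INCHES = 6 * 12   # the wallbook is six feet long

universe = 13.8e9          # years since the big bang (assumed)
for label, years in [("all life", 4e9),
                     ("Homo sapiens", 200_000),
                     ("civilization", 10_000)]:
    inches = TIMELINE_INCHES * years / universe
    print(f"{label:>12}: {inches:.5f} inches ({inches / 12:.2f} feet)")
```

At this scale, all of life gets under two feet, Homo sapiens about a thousandth of an inch, and civilization a twentieth of that.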

Every picture of Earth’s cloud-studded atmosphere I’ve seen has made it seem to extend quite far above the surface of the Earth – far higher, say, than Everest.  And we’ve all wondered why, as the Earth hurtles through space at sixty-seven thousand miles per hour, its atmosphere isn’t torn to shreds.   But when I looked, I couldn’t find an image showing me the thickness of the atmosphere compared to the thickness of the Earth.  It occurred to me that an Excel spreadsheet might be a handy tool for giving me a better understanding of what I wanted to grasp. So I made such an image myself: starting at the center of the Earth, each row of the spreadsheet added one kilometer, and I dragged that increment down the requisite number of rows. The image so drawn enabled me to “see” the picture that resulted – but only sort of.  Holding my finger down on the scroll bar continuously, the kilometers from the Earth’s center to its surface sped by in a blur that lasted about twenty seconds.  Then, suddenly, Everest and the atmosphere flashed by – so fast I couldn’t see them at all.
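The same experiment can be sketched without Excel. The figures here are round assumptions: a mean Earth radius of about 6,371 km, Everest about 8.8 km high, and the atmosphere taken (generously) to end at the 100 km Kármán line:

```python
# One "row" per kilometer from the Earth's center, as in the spreadsheet.
EARTH_RADIUS_KM = 6371    # assumed mean radius
EVEREST_KM = 8.8          # assumed height of Everest
KARMAN_LINE_KM = 100      # assumed "top" of the atmosphere

SCROLL_SECONDS = 20       # time to scroll from center to surface

# If 6,371 rows take 20 seconds, how long do the last few rows take?
everest_s = SCROLL_SECONDS * EVEREST_KM / EARTH_RADIUS_KM
atmosphere_s = SCROLL_SECONDS * KARMAN_LINE_KM / EARTH_RADIUS_KM
print(f"Everest scrolls by in {everest_s * 1000:.0f} milliseconds")
print(f"The whole atmosphere scrolls by in {atmosphere_s:.2f} seconds")
```

Everest occupies fewer than nine rows out of more than six thousand; no wonder it flashed past.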

Never again will I forget that our atmosphere is a thin film barely coating the surface of the planet.  The Earth’s hurtling through space has about as much chance of disrupting our atmosphere as a pitch toward home plate has of wiping the grass stains off a baseball.

In any case, a couple of years ago, I became determined to figure out a way to represent the history of life on Earth, giving mankind no more than our due space within the whole.  I wanted a timeline that did not vary in scale.  I wanted to express the place held by humanity not only with the “knowing” that our brains are capable of, but through the more real “experiencing” that I believe gets us closer to genuine understanding of very large numbers.  How might I use an experience to enhance my intellectual understanding?

One early idea was to use ping pong balls threaded together in a very long string, with a person having to actually walk the length of the string to experience how long it would be.  I wanted each ping pong ball to represent a length of time that a human being might actually understand.  I thought perhaps a thousand years – the time since William the Conqueror – might be such a number, to be represented by a single ping pong ball.  The math was easy – at that scale, I would only need four million ping pong balls to represent the four billion years of life on Earth.  But the idea of someone walking the length of the string came to an end when I realized that, leaving a 10 mm space between each 40 mm diameter ball, a string of four million such balls would be more than a hundred and twenty-four miles long.   Few people would be willing to walk such a length to “experience” that amount of time, so I felt compelled to change my approach. 
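The string-of-balls arithmetic checks out, as a few lines of Python will confirm (the 40 mm ball and 10 mm gap are the figures given above):

```python
# One 40 mm ping pong ball plus a 10 mm gap stands for 1,000 years.
BALL_MM, GAP_MM = 40, 10
YEARS_PER_BALL = 1_000
LIFE_ON_EARTH_YEARS = 4_000_000_000

balls = LIFE_ON_EARTH_YEARS // YEARS_PER_BALL   # four million balls
length_mm = balls * (BALL_MM + GAP_MM)
miles = length_mm / 1_609_344                   # millimeters per mile
print(f"{balls:,} balls -> a string about {miles:.1f} miles long")
```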

Marbles would be too tough to string together.  Lead fishing weights would be smaller and easy to crimp onto a long fishing line. But the combined weight of four million fishing weights became a prohibitive factor. 

So I turned from ping pong balls and fishing weights to the idea of making a sound or video recording.  But it was quickly apparent that an hour-long video couldn’t express the concept meaningfully. What if such a video counted off the history of life on Earth in increments of a thousand years each?  It would be pointless if the numbers on a screen were simply a blur; each would have to be visible for, say, half a second, even to be recognizable.  But even at two thousand years per second, the video would run for more than three weeks; counted off year by year, it would take a person over sixty years, watching continuously, to see it through. The video idea went the way of the ping pong balls.
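The running time is easy to compute for either choice of increment, assuming each number stays on screen for half a second:

```python
# How long a counting video runs at half a second per on-screen number.
SECONDS_PER_FRAME = 0.5
TOTAL_YEARS = 4_000_000_000

for years_per_frame in (1_000, 1):
    seconds = TOTAL_YEARS / years_per_frame * SECONDS_PER_FRAME
    print(f"{years_per_frame:>5} years per frame: "
          f"{seconds / 86_400:,.0f} days "
          f"(~{seconds / 31_557_600:.1f} years) of continuous watching")
```

Thousand-year increments give about three weeks of round-the-clock viewing; single years give more than six decades.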

The more ideas I considered, the more I felt myself inching closer to understanding the length of time involved, but the more it seemed that real understanding of such huge durations was impossible, as a practical matter.  We can say the words, “Life has been on Earth for four billion years,”  and we can multiply and divide numbers, but does any of that mean we can really appreciate how long four billion years is? By writing characters on a blackboard, we can manipulate decimal points and zeros to deceive ourselves into a false sense of “understanding” such large numbers.  We have also developed whole systems of math that make perfect logical sense of the so-called “imaginary” numbers like the square root of negative two.  We make calculations and even practical use of them, but none of us really “understands” them. I think big numbers are the same.  Math helps us deceive ourselves into a feeling of understanding when no real understanding exists. 

Still, we probably get by more efficiently in this real world precisely because our brains can’t comprehend such immensity. 

I remain keen on distinguishing between mere intellectual “understanding” (aka self-deception) and true experiential understanding.  Obviously, we can’t actually experience a duration as long as the four-billion-year history of life on Earth.  But try counting down that history, even a million years at a time, from four billion years ago, and see how far you get.  “Four billion.  Three billion, nine hundred and ninety-nine million.  Three billion, nine hundred and ninety-eight million.  Three billion…” It would be like singing yourself all the way through “100 Bottles of Beer on the Wall” dozens of times in a row. If we can’t bring ourselves even to say the numbers, even if we’re counting a million years at a time, then how can we think we really understand them?
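Even at a million years per count, the tally is daunting. A quick estimate (the three seconds it takes to pronounce a number like “three billion, nine hundred and ninety-nine million” is my assumption):

```python
# Counting down four billion years, a million years at a time.
counts = 4_000_000_000 // 1_000_000   # numbers to recite
SECONDS_PER_NUMBER = 3                # assumed speaking pace
seconds = counts * SECONDS_PER_NUMBER
print(f"{counts:,} numbers, roughly {seconds / 3600:.1f} hours of counting")
```

Four thousand numbers, hours of uninterrupted recitation – and that’s with each count standing in for a million years.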

In any case, while my string of ping pong balls will never see the light of day, my effort to put the life span of humanity into perspective has finally resulted in something real – a book titled “It’s Been Four Billion Years: The Story of Life on Earth a Million Years at a Time.”

It’s at the printer’s now, and will be available from retailers in September.  The retail price is $19.95 – same as the History Museum’s deceptively skewed timeline.  (Needless to say, my book maintains a constant scale, all the way through.)

The best part is that, in appreciation of your having subscribed to this blog, I’m willing to send you a free copy when they come out next month.  Just send me an e-mail with your snail-mail address, and I’ll send you one.

In the meantime, I’m thinking of doing another book, one that tries to express extremes of distance and size….

Until then, peace to you all.

Primates and Praise

Early in the history of the Christian church, bishops and archbishops came to be called “primates.” The word was not intended to evoke images of orangutans or macaques.  (It would be centuries before Carl Linnaeus classified Homo sapiens as a member of that order.) Rather, even in Latin, the word for “first” had been used to mean a superior, a leader, or a most excellent person, and the Christians had no problem designating their spiritual leaders with the term as well.

There are many things I like about my Christian heritage.  If Christians today preached what I believe the historical Jesus preached, I’d readily identify as a Christian.  But as I see it, modern Christianity gets Jesus wrong in a number of respects. 

When I was only eight, I was invited to spend the weekend in the countryside with a friend.  Since I’d have to miss Sunday mass, I made a phone call to ask for permission to do so.  My friend’s family got quite a laugh when, after the call, they discovered I hadn’t been calling home, but the church rectory. The “Father” they’d heard me addressing was not my biological father, but the parish priest.

I had already been taught to call all priests “Father,” and even when I talk to priests today, I use the term of respect I was taught as a child.

But it wasn’t long after the parish priest told me it would be a sin to miss Mass  that I came across Matthew 23:9, where Jesus is said to have told his followers “to call no man Father, for one is your Father, which is in Heaven.”  Given that scripture, I never understood how Christians developed the practice of calling their priests “Father” – especially in an age when fathers demanded so much respect – except, of course, that the priests had taught them to.

It’s easier for me to understand why hierarchies arose as church memberships and treasuries grew – and why words like “bishop” (from the Greek episkopos, meaning one who watches over) came into use.  And it seems almost inevitable that as such growth continued, layers of rank would have to be added for practical, administrative reasons.  So by the time the bishops of Canterbury, York, Armagh and St. Andrews had become powerful, it isn’t entirely surprising that these leaders would be called “primates.” But the primates were always first among “fathers,” and I still had a hard time squaring that with Matthew 23:9.

Nor was it that particular scripture alone.   According to Matthew 12:50, Jesus instructed his followers, “Whosoever shall do the will of my Father, which is in Heaven, the same is my brother, and sister, and mother.”  Jesus preached, “Blessed are the meek; for they shall inherit the earth” (Matt. 5:5) and “Whosoever therefore shall humble himself as this little child, the same is greatest in the kingdom of heaven” (Matt. 18:4). I read of a Jesus who washed the feet of his disciples, of a Jesus who frequently dismissed those who treated him with special reverence, of a Jesus who said to a man who addressed him as Good Master, “Why callest thou me good? There is none good but one, that is, God” (Matt. 19:17). I read of a Jesus who, when asked if he was King, replied only, “You said it” (Matt. 27:11), as if to disavow the title himself.  In fact, Jesus taught, in the Sermon on the Mount, that his followers should pray to the Father (for His was the power and the glory). And, if we believe Matthew 7:22-23, Jesus chastised those who would honor him, warning, “Many will say to me in that day, ‘Lord, Lord, have we not prophesied in thy name? and in thy name have cast out devils? And in thy name done many wonderful works?’ And then will I profess to them, I never knew you: depart from me, ye that work iniquity.”

One reason I haven’t been to church but a few times in the last fifty years is my lack of comfort with heaping praise on this man who fought so hard to avoid it.  Last month, I went to a Catholic mass for the first time in many years.  One of the first hymns sung was “To Jesus Christ, Our Sovereign King.”

“To Jesus Christ, our sovereign king, who is the world’s salvation, all praise and homage do we bring, and thanks, and adoration. Christ Jesus, victor!  Christ Jesus, Ruler! Christ Jesus, Lord and Redeemer! Your reign extend, O King benign, to every land and nation; for in your kingdom, Lord divine, alone we find salvation.  To you and to your church, great King, we pledge our hearts’ oblation – until, before your throne, we sing in endless jubilation.”

Homage? Kingdom?  Reign? Throne?  I was taught the theology behind this hymn.  But for me, the theology fails to justify adoration of a man who shunned adoration, who deflected all praise to God, his father in heaven.  To my way of thinking, Jesus would not have approved of such a hymn.

Meanwhile, whatever may be said in defense of praising Jesus, I have even greater trouble with adoration of mankind.

Consider this passage from Pope John Paul II’s Gospel of Life, Evangelium Vitae.  I can’t read it without thinking of Jesus’ teaching that the meek shall be blessed.

52. Man, as the living image of God, is willed by his Creator to be ruler and lord. Saint Gregory of Nyssa writes that “God made man capable of carrying out his role as king of the earth … Man was created in the image of the One who governs the universe. Everything demonstrates that from the beginning man’s nature was marked by royalty… Man is a king. Created to exercise dominion over the world, he was given a likeness to the king of the universe; he is the living image who participates by his dignity in the perfection of the divine archetype.”

I hope that my thoughts are not taken as an attack upon those who sing the hymn, or upon Pope John Paul II for his thoughts about mankind.  I mean no disrespect, and God knows, I may be wrong.  But as Christians prepare this month to celebrate Jesus and his birth, I’m moved to point out my inability to buy into these aspects of modern Christianity. As I like to think of it, “I prefer the original.”  Father, Primate, Pope, Homo sapiens sapiens.  Clearly, we are prone to bestow honor on ourselves.  I don’t know whether we inherited this tendency from other primates or not, but the Jesus I believe in warned us against it.

Precedential Impeachment

I was heartened this week that in the debate over the legality of the national emergency declared by President Trump, people are talking about the precedents such declarations set.   This has nothing to do with my feelings about immigration, and everything to do with my feelings about precedent – both the precedents set in the past, and whatever new precedent we may set by decisions made today.

As we draw closer to the issuance of the Mueller Report and the possibility of impeachment – which I’m still predicting – the time seemed right to reflect on precedent.

I begin with a reminder of some precedents set by voters.  After Marion Barry, then the married Mayor of the District of Columbia, was caught on tape in an FBI sting soliciting sex and smoking crack cocaine with a girlfriend, he was convicted by a majority-black jury and did time for the crime.  Yet soon after his release from prison, his constituency reelected him, first to the City Council and then as Mayor.  His campaign slogan was, “He May Not Be Perfect, But He’s Perfect for D.C.”  He won by large margins.

In his 1963 inaugural address as governor of Alabama, George Wallace, champion of the Jim Crow laws, declared that he stood for “segregation now, segregation tomorrow, segregation forever.”  Yet he was reelected Governor of Alabama several times and in 1968, carried five states in his third party campaign for President.

In 1969, Teddy Kennedy drove a car off a bridge with a young woman inside, and failed to report the fatal accident until others had already found her – and until any alcohol in his system had had time to dissipate.  He paid money to the woman’s family to make no public comments.  And yet, a year later, he was re-elected to his Senate seat with a 62% majority.  By the time he died, he’d been reelected six more times.  There was widespread support for his subsequent campaign for the Presidency.

I didn’t support any of these three politicians, but I’ve always supported the electorate’s right to be represented by whomever they desire.  American democracy has survived in part because we have enough faith in our system that we’re content to wait until the next election cycle to vote out administrations we find abhorrent.  As the cases of Barry, Wallace and Kennedy seem to make clear, we don’t require our political candidates to be free of wrongdoing.  The will of the electorate being supreme, it apparently includes the power to forgive, excuse, or simply ignore the misconduct of a candidate for office.  Misconduct is not, per se, grounds for disqualification, ineligibility, or removal.  If George Wallace had won the presidential election, would he have then been subject to impeachment for his segregationist views?   Should Ted Kennedy have been expelled from the Senate for his crimes at Chappaquiddick?  If he’d been elected president, would he have been subject to impeachment for those crimes?

With those questions in mind, I move on to the precedents Congress has set for removals from office.

Our Constitution permits each house of Congress to expel its own members on a two-thirds vote.  Whereas presidents must be accused of “high crimes and misdemeanors,” no similar standard is set out for Congress to expel its own members.  One might imagine that, with so many of them, far more crimes and misdemeanors have been committed by members of Congress over the years than by Presidents.  Yet only a handful of Congressmen have ever been expelled by vote of their peers.  The great majority of them were Congressmen from southern states expelled after those states seceded from the union; they were expelled for “support of the Confederacy,” i.e., for conduct that essentially amounted to treason.  Clearly, others have left office voluntarily amid scandal and disgrace, but apart from those Civil War rebels, there have apparently been only three members of Congress actually expelled:

William Blount was charged with treason in 1797, after a letter in his handwriting proved that he was conspiring with Great Britain to take over Spanish Louisiana and Florida.  (As a major land speculator, Blount stood to profit from the predicted increase in land prices.)   Treason is often cited as the most obvious of “high crimes and misdemeanors.”  Interestingly, though, Blount’s home state of Tennessee continued to elect him to its state senate, where he served as speaker until his death.

183 years later, Michael Myers of Pennsylvania was expelled for taking a $50,000 bribe from an undercover FBI agent in connection with the Abscam scandal.  Proof, again, was rock solid. And in 2002, Jim Traficant of Ohio was expelled after being criminally convicted on numerous counts of bribery, racketeering, and tax evasion. Again, solid proof.

Apart from that handful, that’s it.  My sense from this is that Congress has been amazingly cautious in expelling its own members.  By comparison, it has shown greater willingness  to go after presidents.  Still, it has only impeached two of them, Andrew Johnson and Bill Clinton.  Since Nixon’s impeachment was certain if he hadn’t resigned first, let’s add Nixon to the mix and call it three.  In case we’ve forgotten, I offer an attempt to summarize them:

President Andrew Johnson wanted to replace his Secretary of War, Edwin Stanton.  Having succeeded to the presidency as a result of the Lincoln assassination, Johnson, a Democrat, had inherited Republican Stanton from Lincoln.  Johnson and Stanton had very different views on reconstruction, and Johnson felt he had the right to a cabinet of his choosing.  The Republican-controlled Congress disagreed, passing a law that prevented the President from dismissing cabinet members without its consent.  Johnson vetoed the law.  Congress overrode the veto.  Johnson considered the law an unconstitutional interference by the legislative branch of government with the prerogatives of the executive branch, so he dismissed Stanton anyway.  For that, he was impeached.

Johnson’s view about the constitution turned out to be correct.  Years later, the Supreme Court decided that the law in question, restricting the President’s right to dismiss members of his cabinet, had been unconstitutional.  But in the meantime, the Republican-controlled House had already impeached the Democratic president for the “high crime and misdemeanor” of violating their law by dismissing his cabinet officer.  History has judged the impeachment as a highly partisan political squabble that paid little heed to the opinions of the public.   The case strikes me as an example of how not to use the impeachment power.

The articles of impeachment drawn up against Richard Nixon for his involvement in Watergate were for “obstruction of justice” and “abuse of power” (which boiled down to actions taken to cover up and impede investigation of an illegal break-in by his agents and supporters) and for “contempt of Congress,” i.e., failing to comply with Congressional subpoenas.   Personally, I wonder about that last charge – whether an executive failure to comply with a legislative subpoena is the sort of separation-of-powers dispute that characterized the Johnson impeachment.  But as for the first two charges, they were (1) for crimes by Nixon (perjury and obstruction), (2) in connection with investigation into another criminal act (essentially a burglary), (3) committed during the President’s term in office and (4) presumably committed for the purpose of influencing his reelection.   Unlike the Johnson case, there was no viable argument that the criminal laws violated were unconstitutional.  In my view, the impeachment articles proposed against Nixon offer a better example of an appropriate use of the impeachment process.

I also find it worth noting that at the time of his near-impeachment, Nixon was a highly unpopular president whose approval rating in public polls had dropped into the mid-twenties. A lot of the sentiment against Nixon was actually due to matters extraneous to the impeachment charges, most especially his conduct of the War in Vietnam.  But regardless of the cause of his low popularity, the Nixon case raises the question of the extent to which public sentiment should be a consideration in impeachment proceedings.  Thinking of Marion Barry, George Wallace, and Ted Kennedy, I’m reminded that we live in a democracy, in which the public’s right to representatives of their choice should not be lightly trifled with.  Any removal of an elected official from office serves to put Congress in the position of second-guessing the expressed will of the electorate.  And as you might suspect in a post on WMBW, any decision by a few people to override the expressed preferences of millions risks being nothing more than arrogance.  As noted in an earlier WMBW post, arrogance is the taking to yourself of authority not rightfully yours.  In a democracy, any time Congress removes someone elected by the people, it’s hard not to ask whether they’re overstepping their bounds.  That said, why would popular sentiment not be an appropriate consideration in deciding whether to impeach?  If “high crimes and misdemeanors” ultimately boils down to a political question, is that necessarily a bad thing?

The impeachment of Bill Clinton was for alleged perjury and obstruction of justice stemming from sexual misconduct with Monica Lewinsky and Paula Jones.  An Arkansas state employee, Jones alleged she’d been brought to then Governor Clinton’s motel room by state troopers, where he propositioned her and exposed himself to her.  She filed her sexual harassment lawsuit against Clinton within the applicable three year period of limitations. 

In the Me Too era, it’s interesting that the Jones lawsuit was only dismissed because the presiding judge found she could not prove that Clinton’s conduct damaged her.  (Not that she hadn’t done so, but that she could not do so.) 

That quirk of history aside, Clinton was asked in the Jones lawsuit about his relations with White House intern Monica Lewinsky.  Clinton’s later admissions and public apologies remove any significant doubt that he did in fact have a sexual relationship with Monica Lewinsky.  But in sworn testimony on multiple occasions, Clinton denied having any sort of sexual relationship with her, or even being alone with her.  The charge of obstruction of justice was for trying to influence the testimony of Lewinsky and Clinton’s own White House Secretary to support him in his sworn denials – efforts quite similar, it seems to me, to the obstruction of justice charges against Nixon.   

At the time, there were many who defended Clinton by minimizing the national significance of a President’s sexual activities.  Clinton complained that the inquiries were an invasion of his “privacy.” But the charges against Clinton weren’t for the sexual activity, they were for the alleged obstruction of justice that surrounded it, and for the perjury Clinton committed. (Nixon was widely considered a liar, but he was not charged with perjury, i.e., lying under oath, as Clinton was.)  As in Nixon’s case, there were two levels of misdeed – the underlying one (burglary, in Nixon’s case, sexual harassment in Clinton’s) and the subsequent misdeeds for which impeachment proceedings were brought – obstruction of justice and perjury.  The Democrat Clinton was impeached, but while Republicans split on the vote to remove him from office, every Democratic senator voted to acquit him of all charges, so conviction by a two-thirds majority failed.

In today’s environment, it seems unlikely that Clinton’s sexual activities would be dismissed as easily as many dismissed them in the 1990s.  So, if public sentiment is a factor (and I think it is, whether it should be or not), the acquittal of Clinton might have come down differently today.  And that’s true, I think, even though public sentiment about perjury and obstruction of justice doesn’t seem to have changed since then.  It’s public sentiment about sexual abuse by people in power that has changed.

I certainly wonder whether, if President Trump were impeached for committing perjury and obstructing justice with respect to, say, his relationship with Stormy Daniels, Democrats would unanimously vote to acquit him, as they did with Clinton.

As with Nixon’s obstruction of justice, there was no question about the constitutionality of the laws Clinton was accused of violating.  As with Nixon, the charges against Clinton were for crimes committed during the term of office.  Since the misconduct by Clinton occurred during his second term, it was not designed to influence an upcoming election, as Nixon’s presumably was, so a removal from office could not be said to be any sort of remedy for election fraud.  But public sentiment was quite different than it had been in the Nixon case.  In contrast to Nixon’s abysmal public approval ratings, Clinton’s remained in the mid-60s throughout his presidency, and reached a high in the low 70s after the impeachment proceedings.

Finally, I note that all three of these presidential impeachment efforts were brought by an opposition Congress – twice by Republicans against a Democratic President, and once by Democrats against a Republican President. No Congress has ever gone after a President of its own party.

Bottom line: there seems to be very little precedent for Congress to remove a president, or one of its own, from office.  Treason seems to be enough, and so does taking bribes, but there’s been a mixed record when it comes to perjury and obstruction of justice.  The differences seem better accounted for by partisan politics and by the political climate of the day, i.e., the popularity of the President accused.

I wonder whether, to some extent, this last factor is appropriate.  Other countries have procedures for recall elections.  In this country, we have them for other public offices.  Ultimately, in any democracy, one might think of impeachment and removal from office by Congress as a substitute for such a recall election.   I think the arguments are strong that Congressmen in red and blue states will, and should, be influenced in their actions by what they think their constituents want, and frankly, up to a point, I’m not bothered by that.  But Nixon should not have been impeached because he was unpopular, and Clinton should not have been acquitted because he was a Democrat.   There’s still precedent to be considered regarding the actual allegations made and proven.  And I strongly think it should be.

When Congress acts, I hope it doesn’t deprive us voters, collectively, of the right to be represented by the leaders we choose.  Otherwise, I may think them guilty of great arrogance. That said, I think there’s a point at which elected officials should unseat other elected officials; I just haven’t decided exactly where I think that point is. But as we try to sort such things out, I hope we act consistently with past precedent, and with awareness that we’ll be setting precedent for the future as well.

The Bias Blind Spot

In my novel, Alemeth, I told the story of an antebellum family that ran a cotton plantation in Mississippi.  They owned sixty African-American slaves.  Their belief in the righteousness of the southern cause was based on their view that slavery was sanctioned by Holy Scripture.  Essentially, they believed that God had charged them with a duty to perpetuate the peculiar institution.

One of the mysteries that attracted me to this true story was how so many people could have been wrong about an institution which, today, nearly all mankind agrees is evil.  I wanted to understand how their wrongness came to be.  Of course, this family was not alone.  Their neighbors, their churches, their doctors, their lawyers, their newspapermen, shared their views.  At the risk of gross oversimplification, it is at least roughly true that about twenty million northerners thought slavery wrong, and five or six million southerners thought it right. 

I’m not talking about related questions, like whether slavery was worth going to war over, or whether it justified secession; I’m not talking about whether there were some in the north who supported slavery, or who were racists, or whether there were individual abolitionists in the south. I’m talking about whether people thought slavery an evil that should be immediately abolished, or an economic necessity that ought to be preserved for the foreseeable future – and on that point, the people of the South showed amazing agreement with each other.  One indication of just how geographically lopsided the distribution of opinion was: the large number of Christian church denominations that split into separate northern and southern churches over the slavery question.

If every person had simply thought out the rightness and wrongness of it for himself, there’d have been a thorough mixture of opinions in every state, north and south. Differences of detail notwithstanding, the geographically lopsided distribution of opinion on the central question (the very condition that made civil war possible) convinces me that something else was going on.

How was it that nearly all the good white people lived up north, and nearly all the bad ones lived in the south? 

Okay, not really.  I know that couldn’t be true. So I wonder: how did it happen that nearly all the smart people lived up north, and nearly all the stupid ones lived in the south?

Okay, really, not that either.  While mulling this mystery over, my daughter Jen forwarded me a blog by someone I don’t know – his name is Sean Blanda – called “The ‘Other Side’ is Not Dumb.”   I think Sean is right.  On average, surely the people of the south were as good, and as smart, as their northern counterparts.  So perhaps, being “right” or “wrong” has little to do with how smart you are?  Or how good you are?

Was it self-interest, tradition and peer pressure that caused the people of the south to descend into such widespread error?  A sort of groupthink, perhaps, arising from common backgrounds and perspectives?  Fair enough.  But what, then, about the beliefs of those in the North?  Was the correct position of the north regarding slavery due to an absence of groupthink, self-interest, and peer pressure there?  Was the south riddled with conditions that contributed to southern bias, while the north was able to arrive at the “right” answer because it was free of any such influences?

Maybe so.  Maybe we could all agree about the errors and biases of the south, now that we all agree about the evils of slavery.  But what of those controversies on which we don’t yet agree?  In political election cycles, the country always seems split fairly evenly between Republicans and Democrats.  Is it possible that one side’s views are explained by cultural bias, but the other side’s views are not?  According to the Pew Research Center, about 30% of the world’s population is Christian, and a sizable portion (about 22%) is Muslim.  Is it possible that the 30% is simply better informed than the 22%?  That the 22% are smarter than the 30%?  That one view is the result of cultural biases and the happenstance of birthplace and family influence, but the other view is not?  Are the debates over gun control, abortion, global warming, vegan diets and same-sex marriage debates between smart people and stupid people?  Between the good people and the bad people?

Finally, what are the odds that, on each and every issue, it’s ME who recognizes the truth (because it really is the truth), while my opponents’ incorrectness can be explained by bias? 

In Being Wrong (Harper Collins, 2010), Kathryn Schulz writes, “Let’s say that I believe that drinking green tea is good for my health.  Let’s also say that I’ve been drinking three cups of green tea a day for twenty years, that I come from a long line of green tea drinkers, and that I’m the CEO of a family-owned corporation, Green Tea International.  An impartial observer would instantly recognize that I have three very compelling reasons to believe in the salubrious effects of green tea, none of which have anything to do with whether those effects are real…  I have powerful social, psychological, and practical reasons to believe in the merits of green tea.”

Makes sense, doesn’t it?  In the example just given, Schulz is writing about what would be obvious to an impartial observer.  But more important is what’s obvious to partial observers – to those who are convinced that the other side is wrong.  If we’re talking about people we’re convinced are wrong (like those who supported slavery), it’s natural to believe that their views are shaped by – and therefore depend on – their peculiar life experiences.  Yet when it comes to the things we have decided we’re right about, we’re unable to see that our beliefs are a function of our own life experiences in the same way.  Because we believe that the Statue of Liberty really towers above New York Harbor, we believe it is objectively real, regardless of our subjective perspective, culture, or bias. To us, everything that’s “obviously true” is like another Statue of Liberty.

“Sure, it may be that my father was a civil rights activist and my mother worked for George McGovern, but I hold my liberal views because they are objectively right…”  Or, “Sure, it may be I grew up reading the Christian Bible, but my faith in Jesus has nothing to do with that happenstance; I have faith in Jesus because he has revealed himself to me…”  When people believe that something is true, they believe it not because of anything about themselves or their own backgrounds, they believe it because – well, because it’s true.

Simultaneously, because we believe that slavery was wrong, we are quick to conclude that those who supported it only did so because of such a cultural bias.  This readiness to see bias as being the reason for the (erroneous) beliefs of others, while being unable to see that bias may explain why we ourselves believe certain things, is something professional psychologists call the “bias blind spot.” A quick Google search on “the bias blind spot” reveals a host of scientific studies regarding this phenomenon.  Many have shown it to be true: we are quick to ascribe bias (from whatever source) to those we disagree with, while denying it in ourselves.

In a May 2005 article in The Personality and Social Psychology Bulletin (Ehrlinger, Gilovich, & Ross, “Peering into the Bias Blind Spot: People’s Assessments of Bias in Themselves and Others”), the authors explored two empirical consequences of the phenomenon: First, that people are more inclined to think they are guilty of bias in the abstract than in any specific instance.  (“Sure, I recognize that I’m capable of bias; but doggone it, not when it comes to this.”)  Second, that people tend to believe that their own personal connection to a given issue is a source of accuracy and enlightenment – while simultaneously believing that such personal connections by those who hold different views are a source of bias.

I find the second point especially interesting.  Think about it:  As to the beliefs I hold most dear on some controversial subject, do I have personal experiences that are relevant?  If so, do I consider those personal experiences as giving me special insights into the matter?  Now ask the same question about the typical person on the other side of that issue.  Do the reasons for their error lie at least in part in their different experiences?  Do I not see those experiences as providing valuable insights, but as reasons to explain away their error?  Personally, I’ve been guilty of this double standard often. 

Schulz points out that when we try to understand how people can disagree with us, our first tendency is to assume they don’t have all the information we have – something Schulz calls the Ignorance Assumption. So we try to educate them.  If our efforts to educate them don’t work, if they adhere to their mistaken beliefs even after we’ve given them the benefit of our own information and experiences, then we decide they must be less able than we are to properly evaluate the evidence.  (In other words, we decide they’re just not as smart as we are – Schulz’s “Idiocy Assumption.”)  Finally, if we become convinced they’re actually smart people, we find ourselves considering them morally flawed – selfish at best, just plain rotten at worst (Schulz’s “Evil Assumption”).

At the end of the day, it might just be that I’m right about a few things.  But if so, I doubt it’s because I’m smarter, or a better person, than those on the other side.  And it’s certainly not because I have no cultural biases of my own.

I’ll end by quoting Schulz one more time: “If we assume that people who are wrong are ignorant, or idiotic, or evil – well, small wonder that we prefer not to confront the possibility of error in ourselves.”

– Joe

The Impeachment to Come

First, a series of predictions: The U.S. House of Representatives will impeach Donald Trump.  He will not resign, so the Senate will conduct a trial on whatever charges are brought against him. Over the next couple of years there’ll be plenty of talk about the meaning of “high crimes and misdemeanors.”  At the end of the day, once all the evidence is in, I will approve of President Trump’s removal from office for “high crimes and misdemeanors.”  Until then, I will try (not always successfully) to keep an open mind.  I will view some participants and spectators as sharks in a feeding frenzy.  And I will not be able to restrain myself from commenting, especially when I think the street buzz fails to appreciate nuances or fails to put today’s events in historical perspective.

Anticipating all that, and before the gavel brings the first meeting of the Impeachment Committee to order, I thought I’d ask a question intentionally broader than the eventual “high crimes and misdemeanors” question.  Namely, is Donald Trump the most independent, egotistical maverick who has ever served as president?

Perhaps he is.  Perhaps cabinet shake-ups, midnight tweets, criminal investigations and mounting criticism by members of his own party demonstrate that the man is out-of-control, a rogue who has lost all sense of attachment to the country and even to his own political party, an egotistical maverick who thinks he’s smarter than the combined wisdom on Capitol Hill and is prone to take the law into his own hands. 

But on the subject of mavericks, I thought I’d take a look at two pieces of historical data.  One of these is how often presidents have used their veto power.   An independent maverick willing to assert himself over the views of the Congress would seem likely to use the veto more often. 

The other is a president’s use of the Executive Order.  Bypassing Congress, presidents have sometimes attempted to make law by executive order, and the courts have often found that particular orders exceeded proper presidential powers.  This is certainly not true of all executive orders.  The first such order recognized by the American Presidency Project was George Washington’s order that his cabinet members report back to him on the status of matters in their respective areas of responsibility.  There’s obviously a big difference between the executive activism suggested by that order and, say, Harry Truman’s order nationalizing the country’s steel mills.  So as a measure of presidential activism, the count of a president’s executive orders may be more problematic than a count of his vetoes.  As with vetoes, a president whose party is in control of Congress might be expected to use executive orders less than a president facing an opposition party on Capitol Hill.  So there are obviously variables at play, not accounted for by the raw numbers.  Still, a president who’s apt to take matters into his own hands, who tries to control the country personally rather than letting Congress do so, might be expected to issue more executive orders than a more docile, less activist president.

My thought was that the frequency of presidential vetoes and executive orders may provide at least some insight into the degree of ego and power various presidents have attempted to wield while in office.

In the following table, from FDR through Donald Trump, I’ve included the data for all the presidents.  Before FDR, I’ve included only those presidents who set new record highs for use of executive orders or vetoes.  I’ve used the president’s months in office to convert absolute numbers to monthly rates.  Here’s what I get, using data from the American Presidency Project and the U.S. Senate.

* Figures for Donald Trump are to date, i.e.,  December of 2018.
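The normalization behind the table is simple but worth making explicit. Here’s a minimal sketch in Python; the FDR figures used are rough approximations for illustration only (the authoritative counts belong to the American Presidency Project and the U.S. Senate):

```python
# Sketch of the normalization described above: converting a president's
# absolute counts of executive orders and vetoes into monthly rates,
# so presidents with different lengths of service can be compared.
# The FDR figures below are approximations, not the table's actual data.

def monthly_rate(count: int, months_in_office: float) -> float:
    """Convert an absolute count into a per-month rate."""
    return count / months_in_office

# FDR served from March 1933 to April 1945, roughly 145 months.
fdr_months = 145
fdr_executive_orders = 3721  # approximate
fdr_vetoes = 635             # approximate

print(f"Executive orders per month: {monthly_rate(fdr_executive_orders, fdr_months):.1f}")
print(f"Vetoes per month: {monthly_rate(fdr_vetoes, fdr_months):.1f}")
```

With these approximate inputs, FDR works out to roughly 25.7 executive orders and 4.4 vetoes per month, which is why his rates dominate the table.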

The numbers above don’t tell the whole story by any means. For example, hundreds of Cleveland’s vetoes were of private pension bills for Civil War veterans. Congress wanted to grant pensions to individual, named veterans after the Pension Bureau had investigated and denied them.  The bills presented the same issue again and again, and the result drastically inflated Cleveland’s total vetoes.

So the bare counts are no doubt subject to all sorts of explanations and interpretations.  But for me, the counts suggest a couple of things worth keeping in mind.

The first is that there have been two growth spurts in presidential activism as measured by these indicators.  The first spurt came when the country was being torn apart and put back together again over the slavery question. President Pierce nearly doubled the prior record for executive orders, Lincoln advanced it again, and after Lincoln’s assassination, Johnson and Grant, while trying to put the country back together, more than doubled it again.  Meanwhile, Johnson and Grant each set new records for presidential vetoes, and did so by large margins.  It was certainly a tumultuous time.

The second spurt began with Teddy Roosevelt and ended with Harry Truman, a period spanning the Great Depression and two world wars.  That spurt is evident in both executive orders and vetoes, with FDR setting the all-time record for both, despite the fact that his party controlled both houses of Congress for his entire presidency.  More tumultuous times.

Judged by that historical observation, in this time when the country is so polarized and divided, one might expect we’d have an activist president, at least as assessed by these measures.

The second observation I would make is more subjective, but I think it important to think about even so: namely, the correlation between a president’s “executive activism,” as suggested by this data, and his reputation as a great president, as judged by history.  To me, this will be important to keep in mind as we face the impeachment proceedings to come – not to argue that Donald Trump is a great president, but to help us remember what standard we’re judging him by, and if we remove him from office, what it is we remove him for.

Putting Grover Cleveland aside, consider how history has regarded the other notables on the list:  Shortly after his election to office, President Lincoln ordered the arrest of several Maryland legislators who favored secession, right before a scheduled vote on secession, for the transparent reason of keeping Maryland from voting to secede.  (Now that was a bold display of executive activism!)  Yet history has judged that bold presidential action by all but forgetting it. 

Two years later, when Lincoln issued his most famous executive order (the Emancipation Proclamation), he took great pains to make sure it was “legal.” Lincoln disagreed with the U.S. Supreme Court’s recent decision in the Dred Scott case that, because slaves were private property under state law, the federal government had no right or power to free them.  That decision was the law of the land, but Lincoln circumvented it by asserting that he did have power to confiscate property being used in rebellion against the federal government. So rather than having Congress do it, Lincoln freed the slaves by a stroke of his executive pen.  But recognizing the Supreme Court’s ruling, he freed only those slaves in the states that were in armed rebellion against the national government.  That respect for the rule of law is something Lincoln is much criticized for today.  Current progressive thinking would probably treat him better if he had contravened the law as then decided by the Supreme Court, and used his executive power to free all the slaves.  Lincoln was a maverick, but as judged by history, possibly not maverick enough.

Nearly a hundred years later, when President Truman used an executive order to place the country’s steel mills under federal control, the Supreme Court held his order unconstitutional. Truman is also third on the list of most active vetoers in history.  Yet Truman is highly regarded for his independence today.

Theodore Roosevelt, who set new records for issuing Executive Orders and established a reputation as one of the most egotistical mavericks to ever occupy the office, got his face enshrined on Mount Rushmore.   He is often considered one of the five greatest Presidents in American history.

And Franklin Roosevelt, who tried to pack the Supreme Court when too much of his agenda was ruled unconstitutional, who set the record for issuing activist executive orders by a large margin, and who set the record for presidential vetoes even though his own party controlled Congress throughout his presidency, is widely hailed as the best president in history. He is certainly highly regarded by today’s “progressives” for his executive activism.

The point is that, as I see it, history has generally looked upon presidential activism with high regard – at least when it approves of the goals a president has pursued.

So where does President Trump fall, on these measures of  ego and executive activism?  He has used the Executive Order more frequently than President Obama, but then, Obama’s use of the Executive Order was the lowest in modern times.  When compared to other modern presidents, Trump’s rate has been comparatively low.  And as for his use of the veto power, there have been 2,574 presidential vetoes since 1789 — not one of them by Mr. Trump.

There are a lot of ways to measure a president’s ego, independence, and executive activism. If measured by midnight tweets and rash statements made to the television news media, President Trump is surely the most arrogant president in history. (That’s an easy claim to make, considering Andrew Jackson and Ulysses Grant had neither Twitter nor TV.)  But measured by such quantifiable things as frequency of executive orders and vetoes, Mr. Trump has been far less of a maverick than either of the Roosevelts, Wilson, or Truman.  And as far as I can tell, being mavericks who were not always in line with their own parties had a lot to do with why such men have been regarded well by history.

My point?  I simply hope that, as the impeachment proceedings progress, we keep in mind that impeachment was not designed to punish presidents for having policies and positions we disagree with.  Impeachment was not intended as a remedy for presidents with big egos, or even for those who run counter to the views on Capitol Hill or within their own political parties.  Let’s not impeach Donald Trump because he’s a maverick, unless we think that presidents yet to come who are cut from the mold of Jackson, Lincoln, Wilson and both Roosevelts will deserve to be impeached for their roguishness.  Let’s think long and hard, with a sound historical perspective, about the separation of powers, the presidency, and the best meaning to give to “high crimes and misdemeanors.”

I may be wrong, but I predict I’ll have more to say in the months to come about that term.  But those are my thoughts for now.  I look forward to hearing yours.

— Joe


What do the Kavanaugh hearings, Halloween and Homer’s Odyssey all have in common?

Here’s my take on it.

  1. The Kavanaugh Confirmation Hearings

Someone recently said to me, “Joe, you were a lawyer once.  You understand evidence.  You can see that all the evidence supports my position on this.”  The person who said that to me could have been talking about the Kavanaugh hearings.  Like so much media coverage of the hearings, this fellow thought of a trial as a proceeding in which the evidence all points in one direction or the other.  My answer to him was that if I’d learned anything in thirty years of bar membership, it was that my mother was right: there are always at least two sides to a story, and the truth is generally somewhere in between.  If juries heard only one side’s witnesses and arguments, every verdict would be unanimous.  Is it any wonder that if you tell me what news source you follow, I can pretty well predict how you feel about the world?

In years of practicing law, I saw over and over again how witness testimony polarized over time.  From the plaintiff’s perspective, the size of the wrong and the depth of the injury always grew, while from the defendant’s perspective, the strength of the alibi and the sense of indignation always did likewise.  Add the way politicians and the media frame a case as pitting good against evil, and you have everyone asking which of the witnesses is lying.  In this view, it has to be one or the other.  When I said, about the Kavanaugh hearings, that I thought both witnesses were telling the truth as they saw it, people looked at me like I was some sort of crazed lunatic from outer space.  The hearings, and especially the media coverage of them, left me shaking my head about what made them so typical of polarized American politics today: namely, a complete inability to empathize with the other side.

  2. Halloween

Yesterday, I came across a piece published last year in USA Today titled “5 Halloween Myths and Urban Legends, Debunked.”  Myth Number 3 was titled “Satan is the Reason for the Season.”  While acknowledging that Halloween can be traced back to ancient Celtic harvest festivals, the article argued that the modern event has nothing to do with Satan, and never could have, as Satan is a Judaeo-Christian character who would have made no sense to the ancient Celtic polytheists who started those harvest festivals.  The article also points out that All Hallows’ Eve is the first of three days Christianity devotes to remembering the souls of the Christian faithful.  The religious origins of the modern holiday have to do with honoring the good dead, not the immortal Satan, the embodiment of evil.

But when it comes to Halloween, like the Kavanaugh hearings, people are polarized.  To many, Halloween will always be about pure evil.  For many on both sides, there’s a complete inability to empathize with the other.

  3. The Odyssey

My first exposure to the Odyssey was probably Kirk Douglas’s portrayal of the classical hero in 1954’s Hollywood version, Ulysses.  While I don’t remember much of that movie, I feel sure that Kirk Douglas’s character must have been very heroic, in the modern sense of that word – which is to say, a particularly good and capable guy fighting the good fight against evil.  My sense of the story has always been that the Cyclops, Poseidon and the suitors were monstrously bad while Odysseus wasn’t far shy of sainthood.  I want to take this opportunity to rave about the new translation I just finished reading by Emily Wilson.  It manages to be an amazingly easy and accessible read while maintaining the strict metrical qualities of the original.  For the first time, I didn’t have to “study” the epic, I could just read it, and do so at the same pace I might read John Grisham or Dan Brown.  As a result, I acquired a sense of the whole as I never have before.   I strongly recommend her translation, whether you’ve read the epic before or not.

Wilson’s excellent and engaging translation gave me several new perspectives about the story.  One is that the very name Odysseus can be translated as “hated” or at least “disliked.”  He’s easy to hate because he’s not just duplicitous, he’s multiplicitous.  There’s something for everyone to hate.  In Wilson’s words, he is “a migrant…, a political and military leader, a strategist, a poet, a loving husband and father, an adulterer, a homeless person, an athlete, a disabled cripple, a soldier with a traumatic past, a pirate, thief and liar, a fugitive, a colonial invader, a home owner, a sailor, a construction worker, a mass murderer, and a war hero.” Wilson gives much attention to how a person can be so complex and multi-faceted, at once so hated and so loved.  Her Odysseus is anything but the one dimensional champion of goodness that I grew up admiring. Perhaps we see ourselves in him.  Perhaps that’s what allows us to empathize.

It has become common to dismiss the pagan gods as amoral and often wicked libertines that no thinking person could believe were real.  Modern criticism of the Greek gods generally amounts to the argument that they are no better than we are.  Wilson points out that they are essentially powerful human beings who happen to live forever; morally and ethically, they are our equals.  This strikes me as a natural criticism of deity if you’re comparing it to a God conceived of as morally perfect and all-knowing.  But have there been unintended consequences to conceiving of God as the embodiment of perfect goodness and omniscience?  What have been the consequences of living with the aim of achieving such righteousness ourselves?  What have I done by measuring my self-worth by comparison to a single, homogeneous and absolute goodness who has revealed Himself to me?  Has it worked to make me self-righteous?

One reason I’ve always been attracted to Greek myth is that the gods DO behave like human beings.  I’ve long felt that such portrayals allow us to see the consequences of our foibles in archetypal ways that can help us avoid mistakes as effectively as a lot of sermons I’ve heard.  At its core, the modern worldview suggests that the difference between good and evil is apparent, and that life is simple: if we choose correctly, we’ll live forever in the home of the gods.  In the old pagan worldview, life is a constant struggle to sort out the difference between good and bad; even in the home of the gods, it can be hard to distinguish right from wrong; sometimes, what seems good to one person (or god) seems bad to another.  In this worldview, there isn’t any Grand Commission of Justice to tell us which is which.

There’s little doubt in my mind that most of us would choose to live in a world where good and evil are clearly defined and labelled. But is the real world more nuanced and dependent on point of view than that?  Wilson points out that Odysseus is offered a perfect and immortal life by Calypso, but turns it down, choosing instead his mortal home in his mortal world.  Is that why we can love him and hate him at the same time?  There are good reasons the Bible has stood the test of time.  I think there are good reasons the Odyssey has too.

So: What similarities do I see between the Kavanaugh hearings, Halloween, and the Odyssey? For me, all three tell us something about the extent to which Platonic thinking about absolutes has changed the world.  In the pre-Platonic, polytheistic world of Odysseus, we could celebrate diverse and multiple perspectives; in the modern world, there must be a single and absolute truth, distinguishable by its righteousness.  In the Christian Era, we’re used to hearing the gods of Greek myth dismissed as either “immoral” or “amoral.”  But in the Odyssey, Zeus is the god of justice and of hospitality toward strangers.  One of the epic’s most constant themes is that the gods will not approve of mistreating strangers.  It’s not that the Homeric gods don’t care about what’s good and right, but that (just like people) they don’t share a singular and unchanging view of what “goodness” consists of.

Of the many epithets applied to Odysseus (apart from being godlike), most begin with the prefix “poly-,” meaning multiple.  Odysseus is “poly-tropos” (multiply turning), poly-phron (multiply-minded), poly-mechanos (employing multiple devices), poly-tlas (multiply enduring), poly-penthes (multiply-pained), poly-stonos (multiply-sorrowed) and poly-aretos (multiply prayed-for).  In a sense, this multiplicity makes him all things to all people.  It’s a big part of why he’s hated.  He is also incredibly adaptable, assuming different guises and characteristics in different situations.  His understanding of right and wrong is neither absent nor irrelevant – it is simply changing.

All our modern religious and political instincts tell us to condemn such inconstancy.  We’re trained to think in terms of Platonic absolutes, of clear and perfect Goodness on one side and clear and perfect Evil on the other.  We’re told we can identify the Truth and that we’re bound to adhere to it.  If Professor Ford was telling the truth as she saw it, then Judge Kavanaugh had to be lying, as he saw it.  If Halloween is not a glorification of the Judaeo-Christian God, it must be the work of Satan.  If Odysseus is inconsistent from one day to the next, he must represent an inferior state of being because perfect people have to be constant, unchanging and right.

But is there a difference between being constant, unchanging and right, and being rigid, intolerant, and set in our ways?

I’m not advocating for a rudderless, amoral view of the world.  Goodness is certainly worth striving for.  But how can I know for certain I’ve found it, when others disagree with me about what’s good?  Once again, I’m reminded of Alexander Solzhenitsyn’s words:

“If only there were evil people somewhere insidiously committing evil deeds and it were necessary only to separate them from the rest of us and destroy them.  But the line between good and evil cuts through the heart of every human being.  And who’s willing to destroy a piece of his own heart?”

I recently read Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion. The book is worth a read for many reasons, but the concept I found most thought-provoking was Haidt’s view on the evolutionary origins of human reason.  The traditional view is that the capacity for reason and logical analysis evolved in human beings as a tool for reaching the best conclusions.  In reality, Haidt suggests, human beings wouldn’t have survived unless they could form immediate judgments about things without reasoned analysis.  (You can’t conduct a reasoned analysis of whether to run from a saber-toothed tiger.)  But we are also social animals whose early survival depended on the ability to work together in teams, and to act as a team, we needed coordinated approaches.  Haidt says our social survival depended on leaders able to persuade others to follow their judgments.  According to Haidt, reason and logical analysis arose about the same time as language did, and they evolved for much the same social purposes: that is, not as tools of decision-making to help an individual determine what’s right, but as tools of persuasion to help convince others to go along with our judgments.  (In the process, we convince ourselves that our judgments are right, too, but that’s a result, not a cause.)

In this view, all of human reasoning has its origins in persuading others, in post-hoc justification of judgments already formed.  If Solzhenitsyn and Haidt are right, then all the arguments between Professor Ford and Justice Kavanaugh, Democrats and Republicans, Christians and atheists, NPR and Fox News, Halloween enthusiasts and its detractors, and indeed, between you and me, have to do with persuasion, not with what either one of us has always revered as “reason.”

In this sense, maybe Ford’s and Kavanaugh’s truths are similar.  Last year, I blogged about liking Halloween because it invited us to try out the worldview of a character we normally think of as strange, monstrous, or even evil.  Maybe it isn’t bad that we put ourselves in the shoes of terrible others on Halloween.  Maybe it’s okay to change our understanding of right and wrong at times, to try out new perspectives, just as Homer’s Odysseus did.  Maybe multiplicity helps us empathize.

After listing the many (contradictory) traits her Odysseus exhibits, Emily Wilson writes, “immersing ourselves in his story, and considering how these categories can exist in the same imaginative space, may help us reconsider the origins of Western literature, and our infinitely complex contemporary world.”

Maybe she’s on to something there?

– Joe


In 1595, the early English explorer and colonist John Davys wrote in The Worlde’s Hydrographical Discription (Thomas Dawson, London):

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

In the 1850’s, the Reverend Augustus Longstreet – president of a leading American university and minister of the Lord – wrote to his son-in-law regarding the unreasonable behavior of his slaves:

“The creatures persistently refuse to live together as man and wife, even after I have mated them with all the wisdom I possess, and built them such desirable homes.”

About the same time, the famous case of the slave, Dred Scott, wound its way up to the Supreme Court of the United States.  On its way, the Supreme Court of the state of Missouri found that one of the key issues before it was whether African slavery really did exist for the benefit of the slaves.

Of course, we’ve come a long way since then.  In 1997, Robert Hendrickson wrote, in The Facts on File Encyclopedia of Word and Phrase Origins (Checkmark Books):

“cretin.  Our pejorative cretin, for “an idiot,” began as a kindly word.  In the Middle Ages many deformed people with a low mentality lived in the Alpine regions, their condition resulting from a thyroid condition now known as myxedema, which was possibly caused by a deficiency of iodine in their drinking water.  These unfortunates were called Chrétiens, “Christians,” by the Swiss, because the word distinguished human beings like these people from brutes, and they believed these childlike innocents were incapable of actual sin.  But the kindly word went into French as cretin, meaning “idiot,” and retained the same meaning when it passed into English.”

It leaves me wondering: if our best navigators, University presidents, and supreme courts can be such cretins, where does that leave the rest of us?

Honk if you love word origins.

– Joe

Top Ten Blunders – Your Nominations

A month ago, I asked for your thoughts about the greatest blunders of all time.  I was thinking of blunders from long ago, especially “a list that considers only past human blunders, removed from the passions of the present day.”  I observed, “My special interest lies in blunders where large numbers of people… have believed that things are one way, where the passage of time has proven otherwise.  I believe such a list might help remind us of our own fallibility, as a species…”

I got only five nominations.  (I imagine the rest of you are simply reluctant to nominate your own blunders.  But hey.  All of us have done things we’d rather our children not hear about.)  As for those of you who did respond, I’m grateful for your nominations, even if they do imply that blame lies elsewhere than with ourselves.  The five I received are certainly food for thought.

One was, “Founding Fathers missed huge by not imposing term limits.”  According to a recent Rasmussen opinion poll, 74% of Americans now favor term limits, with only 13% opposed.*  One could argue the jury is in: the verdict being that the founding fathers should have imposed term limits.  That said, with the average length of service in the U.S. House being 13.4 years, we evidently elect many of our representatives to seven or more consecutive terms.  And Michigan voters have sent John Dingell back to Congress for over fifty-seven years, even longer than his father’s decades of service before him.  Do they feel differently about term limits in Michigan?  If the founding fathers’ failure to impose term limits was a great blunder, don’t the American voters make a far greater blunder every two years when they send these perennial office holders back to Washington, year after year?  I mean, it’s at least arguable that the Founding Fathers were right in failing to impose term limits.  But who can deny the hypocrisy when an electorate that favors term limits (that means us, folks) does what they themselves would prohibit?  Millions of Americans today are either wrong in favoring term limits, or wrong in re-electing the same Congressmen over and over again – and surely wrong by doing both simultaneously.  At least if measured by the number of people involved, the blunder we commit today strikes me as greater than that committed by a handful of wigged men in 1789.

A second nomination: “Y2K has to be in the top 20?”  That one sure brings a smile to my face.  You remember the “experts’” predictions of the global catastrophe we’d see when all those computers couldn’t handle years starting with anything but a 1 and a 9.  Then, when the time came, nothing happened.  I don’t know of a single problem caused by Y2K.  If judged by the certainty of the so-called experts, and the size of the gap between the predicted calamity and what actually transpired, Y2K clearly deserves recognition.

But compare Y2K to other predictions of doom.  There can be no predicted calamity greater than the end of existence itself.  Wikipedia’s article, “List of Dates Predicted for Apocalyptic Events,” includes 152 dates that have been predicted for the end of the world.  And they haven’t been limited to the freakish fringes of society.  Standouts include Pope Sylvester II’s prediction that the world would end on January 1, 1000, Pope Innocent III’s that it would end 666 years after the rise of Islam, Martin Luther’s prediction that it would end no later than 1600, and Christopher Columbus’s that it would end in 1501.  (When that year ended successfully, he revised his prediction to 1658, long after he’d be dead; he apparently didn’t want to be embarrassed again.)  Cotton Mather’s prediction of 1697 had to be amended twice.  Jim Jones predicted the end in 1967, and Charles Manson in 1969.  My favorite on Wikipedia’s list dates from May 19, 1780, when “a combination of smoke from forest fires, a thick fog, and cloud cover” was taken by members of the Connecticut General Assembly as a sign that the end times had arrived.  (It’s my favorite because it may help explain why the founding fathers saw no need for term limits.)  But fully half of the Wikipedia list consists of predictions made since 1900.  Over twenty-five have come since the Y2K blunder.  The recent predictions include one from a former Presidential candidate, Pat Robertson, who predicted the world would end in 2007.  And though not yet included by Wikipedia, last month’s solar eclipse brought out yet more predictions of the end of the world – never mind that only a tiny fraction of the earth’s surface was in a position to notice it.  (Would the world only end across a thin strip of North America?)

We can laugh at Christopher Columbus, but what of the fact that the list of doomsday prophecies continues to grow, despite how often the doomsayers have been wrong?  Measured by the enormity of the subject matter and the apparent widespread lack of concern about being “twice bitten,” man’s fondness for predicting when the world will end as a result of some mystical interpretation of ancient texts strikes me as a bigger blunder than Y2K – and unlike Y2K, it shows no sign of going away.

A third nomination: “The earth is flat.”  The archetypal human blunder.  Months ago, while struggling to think of other blunders as egregious, I was led by Google to a Wikipedia article on “the flat earth myth,” which I assumed was exactly what I was looking for.  But to my dismay, I read that the “flat earth myth” is not the old belief that the world was flat; rather, it is the current, widely-held belief that people in the Middle Ages believed the earth to be flat!  I’d spent a lifetime feeling proudly superior to the ignorant medieval masses.  Was it me, after all, who was wrong?

My discovery reminded me of the difficulty of ranking human error.  The article asserted that throughout the Middle Ages, the “best minds of the day” knew the earth was not flat.  The “myth” was created in the 17th Century, as part of a Protestant campaign against Catholic Church teachings, accelerated by the fictional assertion in Washington Irving’s popular biography of Christopher Columbus that members of the Spanish court questioned Columbus’s belief that the earth was round.  Gershwin’s unforgettable, “They all laughed at Christopher Columbus…” etched the myth forever in our minds.  The article quotes Stephen Jay Gould: “[A]ll major medieval scholars accepted the Earth’s roundness as an established fact of cosmology.”  The blunder wasn’t a relic of the Middle Ages, but an error of current understanding based on a post-enlightenment piece of popular fiction!

Meanwhile, the Flat Earth Society lives on to this day.  Their website “offers a home to those wayward thinkers that march bravely on with REASON and TRUTH in recognizing the TRUE shape of the Earth – Flat.”  Most of them, I think, are dead serious.  But wait.  Which is the greater blunder: that of the medieval masses who saw their world as a patchwork of coastlines, rolling hillsides, mountains, valleys, and flat, grassy plains?  Or that of the experts, the major scholars who “knew” in the Middle Ages that the earth was a sphere?  The earth is not a sphere at all, we now know, but a squat, oblate shape that bulges around the equator because of the force of its spin.  Or is that an error, too?  Need we mention that spheres don’t have mountains and valleys?  Need we mention that the surface of the earth, at a sub-atomic level, is anything but curved?  Aren’t all descriptions of the earth’s shape simply approximations?  And if we can accept approximations on the basis that they serve a practical purpose, then is the observable general flatness of the earth today any more “wrong” than a medieval scholar’s belief in sphericity?  Who really needs to know that the atoms forming the surface of the earth are mostly empty space?  The “wrongness” in our concepts of earth’s shape isn’t static, but evolving.
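For what it’s worth, the equatorial bulge is easy to quantify.  A quick sketch using the widely published WGS 84 reference-ellipsoid radii (the figures themselves are well established; the arithmetic is the point):

```python
# WGS 84 reference ellipsoid: radii of the earth, in kilometers.
equatorial_radius_km = 6378.137
polar_radius_km = 6356.752

# The "bulge": the equator sits roughly 21 km farther from
# the earth's center than the poles do.
bulge_km = equatorial_radius_km - polar_radius_km

# Flattening: about 1 part in 298 -- a sphere to within
# roughly a third of one percent.
flattening = bulge_km / equatorial_radius_km

print(round(bulge_km, 1))      # 21.4
print(round(1 / flattening))   # 298
```

A deviation of one part in three hundred: whether the earth “is” a sphere, in other words, depends entirely on how much error one is willing to tolerate.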

The oldest of the historical blunders nominated for WMBW’s top ten list have an ancient, scriptural flavor.

The first: “The number one thing that went wrong with humanity [was] when the first man said to another, ‘I think I heard god last night!’ and the other believed him.”**

The second comes from a different perspective: “The greatest blunder had to be Eve eating of the fruit of the tree of knowledge, having been tempted to be like God, deciding for herself what is good and what is evil.  Every person [becomes] his own god. The hell of it is, everyone decides differently, and we’re left to fight it out amongst ourselves.”**

The other three nominators thought that Y2K, belief in a flat earth, and failure to impose term limits should be considered for a place somewhere on the top ten list.  (Actually, Y2K’s sponsor only suggested it belonged somewhere in the top 20.)  But the two “religious” nominations were each called the biggest blunder of all.  (One was “the number one thing,” while the other “had to be” the greatest blunder.)   What is it about belief in God that prompts proponents and opponents alike to consider the right belief so important, and holding the wrong one the single greatest blunder of all time?

If you believe in God, though He doesn’t exist, you’re in error, but I don’t see why that error qualifies as the greatest blunder of all time, even when millions suffer from the same delusion.  I remember seeing an article in Science Magazine a few years ago, surveying the research that has attempted to determine whether believers tend to act more morally than non-believers.  Most of the studies showed little or no difference in conduct between the two groups.  For those who don’t believe in God, isn’t it one’s conduct, not one’s belief-system, that is the best measure of error?  For them, why does belief even matter?

If you don’t believe in God, though He does exist, you face a different problem.  If you believe as my mother did – that believing in God (not just any God, but the right God, in the right way) means you’ll spend eternity in Heaven rather than Hell – it’s easy to see why being wrong would matter to you. If roughly half the people in the world are headed to eternal damnation, that’s at least a problem bigger than term limits.

But there is a third alternative on the religious question.  If you’ve looked at the WMBW Home Page or my Facebook profile, you may have noticed my description of my own religious views – “Other, really.”  One of the main reasons for that description is pertinent to this question about the greatest blunders, so I will identify its relevant aspect here: “If God exists, He may care about what I do, but He’s not so vain as to care whether I believe in Him.”  My point is not to advance my reasons for that belief here, simply to point out that it may shed light on why many rank error on the religious question so high on the list of all-time blunders, while I do not.  Many believers, I think, believe it’s critically important to believe, so they try hard to get others to do so.  Non-believers react, first by pointing to examples of believers who’ve been guilty of wrongdoing, and eventually by characterizing the beliefs themselves as the reason for the wrongdoing.  In any case, the nominations concerned with religious beliefs were offered as clearly deserving the number one spot, while our “secular” nominations were put forward with less conviction about where on the list they belong — and that difference may have meaning, or not.

In my solicitation, I acknowledged the gray line between opinion and fact.  To some believers, the terrible consequences of not heeding the Word of God have been proven again and again, as chronicled throughout the Hebrew Bible.  To some non-believers, the terrible consequences of belief have been proven by the long list of wars and atrocities carried out in the name of Gods.  Whichever side you take, the clash of opinions remains as strong as ever.

So, what do I make of it all?  Only that I’d hoped for past, proven blunders which might remind us of our great capacity for error.  Instead, I discover evidence of massive self-contradiction on the part of the current American electorate; a growing list of contemporaries who, as recently as last month, are willing to predict the imminent end of the world; my own blunder, unfairly dismissive of the past, fooled by a piece of Washington Irving fiction; and a world as divided as ever regarding the existence of God.

To this observer, what it all suggests is that there’s nothing unique about the ignorance of past ages; and that an awfully large chunk of modern humanity is not only wrong, but likely making what some future generation will decide are among the greatest blunders of all time.

Sic transit gloria mundi.



** I’ve done some editing of punctuation in both of these nominations: I apologize if I’ve thereby changed the submitter’s meaning.


The Top Ten Blunders of All Time

For several months now, I’ve been plagued by the thought that certain ways of “being wrong” are different from others.  I’ve wondered whether I’ve confused anything by not mentioning types of error and distinguishing between them.  For example, there are errors of fact and errors of opinion.  (It’s one thing to be wrong in thinking that 2 + 2 = 5, or that Idaho is farther south than Texas, while it’s quite another to be “wrong” about whether Gwyneth Paltrow or Meryl Streep is the better actor.)  Meanwhile, different as statements of fact may be from statements of opinion, all such propositions have in common that they are declarations about present reality.  Not so a third type of error – judgmental errors about what “ought” to be done.  Should I accept my friend’s wedding invitation?  Should I apologize to my brother?  Should we build a wall on the Mexican border?  I might be wrong in my answers to all such questions, but how is it possible to know?

Is there a difference between matters of opinion (Paltrow is better than Streep) and ethics (it’s wrong to kill)?  Many would say there’s an absolute moral standard against which ethics ought to be judged, quite apart from questions of personal preference; others would argue that such standards are themselves a matter of personal preference.  I’ve thought a lot about how different types of error might be distinguished.  But every time I think I’m getting somewhere, I wind up deciding I was wrong.

One of the ways I’ve come full circle relates to the distinction between past and future.  It’s one thing to be wrong about something that has, in fact, happened, and another to be wrong about a prediction of things to come.  Right?  Isn’t one a matter of fact, and the other a matter of opinion?  In doing the research for my recent novel, Alemeth, I came across the following tidbit from the Panola Star of December 24, 1856:

The past is disclosed; the future concealed in doubt.  And yet human nature is heedless of the past and fearful of the future, regarding not science and experience that past ages have revealed.

Here I was, writing a historical novel about the divisiveness that led to civil war.  I was driven to spend seven years on the project because of the sentiment expressed in that passage: that we can, and ought to, learn from the past.  (Specifically, we should learn that when half the people in the country feel one way and half the other, each side labeling the other stupid or intentionally malicious, an awful lot of people are likely wrong about the matter in question, and the odds seem pretty close to even that any given individual – each of us included – is one of the many in the wrong.  And importantly, the great divide wasn’t because all the smart people lived in some states, or all the bad people in others: rather, people tended to think as they did because of the prevailing sentiments of the people around them.  Hmnnn…)

Then, an interesting thing happened in the course of writing the book.  Research began teaching me how many pitfalls there are in thinking we can really know the past.  We have newspapers, and old letters, and other records, but how much is left out of such things?  How many mistakes might they contain?  Indeed, how many were distorted, intentionally, by partisan agendas at the time?  The more I came across examples of each of those things, the less sure I became that we can ever really know the past.  I often can’t remember what I myself was doing ten minutes ago; how will I ever be able to reconstruct how things were for tens of thousands of people a hundred years ago?  Indeed, the more I thought about it, the more I began to circle back on myself, wondering whether the opposite of where I’d started was true:  Because the past has, forever, already passed, we’ll never be able to return to it, to touch it, to look it directly in the eye, right?  Whereas we will have that ability with respect to things yet to come.  If that’s true, the future just might be more “verifiable” than the past.  I get dizzy just thinking about it.

Anyway, an idea I’ve been kicking around is to ask you, WMBW’s readers, to submit nominations for the ten greatest (human) blunders of all time.  I remain extremely interested in the idea, so if any of you are inclined to submit nominations, I’d be delighted.  But the reason I haven’t actually made the request before now stems from my confusion about categories of wrong.  Any list of “the ten greatest blunders of all time” would be focused on the past and perhaps the present, while presumably excluding the future.  But I’m tempted to exclude the present as well.  I mean, I feel confident there are plenty of strong opinions about, say, global warming – and since the destruction of our species, if not all life on earth, may be at stake, sending carbon into the air might well deserve a place on such a list.  Your own top ten blunders of all time list might include abortion, capitalism, Obamacare, the Wall, our presence in Afghanistan, our failure to address world hunger, etc., depending on your politics.  But a top ten list of blunders based on current issues (that is, based on the conviction that “the other side” is currently making a world class blunder) would surprise few of us.  It seems to me the internet and the daily news already make the nominees for such a list clear.  What would be served by our adding to it here?

My interest, rather, has been in a list that considers only past human blunders, removed from the passions of the present day.  I believe such a list might help remind us of our own fallibility, as a species.  I for one am constantly amazed, when I research the past, at our human capacity for error.  Not just individual error, but widespread cultural error, or fundamental mistakes in accepted scientific thinking.  My bookshelves are full of celebrations of the great achievements of mankind, books that fill us with pride in our own wisdom, but where are the books which chronicle our stupendous errors, and so remind us of our fallibility? How could nearly all of Germany have got behind Hitler?  How could the South have gone to war to preserve slavery?  How could so many people have believed that disease was caused by miasma, or that applying leeches to drain blood would cure sickness, or that the earth was flat, or that the sun revolves around the earth?

What really interests me is not just how often we’ve been wrong, but how ready we’ve been to assert, at the time, that we knew we were right.  The English explorer John Davys shared the attitude of many who brought European culture to the New World, before Native Americans were sent off to reservations:

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

History is full of such declarations.  In researching the American ante-bellum South, not once did I come across anyone saying, “Now, this slavery thing is a very close question, and we may well be wrong, but we think, on balance, that…”  In the days before we knew that mosquitos carried Yellow Fever, scientific pronouncements asserted as fact that the disease was carried by wingless, crawling animalcula that crept along the ground.  This penchant for treating our beliefs as knowledge is why I so love the quote (attributed to various people) that runs, “It ain’t what people don’t know that’s the problem; it’s what they do know, that ain’t so.”

My special interest lies in blunders where large numbers of people – especially educated people, or those in authority – have believed that things are one way, where the passage of time has proven otherwise.  My interest is especially strong if the people were adamant, or arrogant, about what they believed.  Consider this, then, a request for nominations, if you will, especially of blunders with that sort of feel.

Yet be forewarned.  There’s a reason I haven’t been able to come up with a list of my own.  One is that, while I’m not particularly interested in errors of judgment or opinion, I’m not sure where the dividing line falls between fact and opinion.  Often, as in the debate over global warming, the very passions aroused are over whether the question is a matter of fact or opinion.  Quite likely, what we believe is fact; what our opponents believe is opinion.

The other is the shaky ground I feel beneath my feet when I try to judge historical error as if today’s understanding of truth will be the final word.  Remember when we “learned” that thalidomide would help pregnant women deal with morning sickness?  Or when we “learned” that saccharin causes cancer?  That red wine was good for the heart (or bad?  What are they saying on that subject these days?)  What about when Einstein stood Newton on his head, or the indications, now, that Einstein might not have got it all right?  If our history is replete with examples of wrongness, what reason is there to think that we’ve gotten past such blunders, that today’s understanding of truth is the standard by which we might identify the ten greatest blunders of all time?  Perhaps the greater blunder would be to confidently identify, as a top ten false belief of the past, something which our grandchildren will discover has been true all along…

If this makes you as dizzy as it does me, then consider this: The word “wrong” is akin to Old English wrenc, a twisting; it’s related to Old High German renken, to wrench, which is why the tool we call a wrench is used to twist things.  This is all akin to the twisting we produce when we wring out a wet cloth, for when such a cloth has been thoroughly twisted, wrinkled, or wrung out, we call it “wrong.”  Something is wrong, in other words, when it’s gotten so twisted as to be other than straight.

But in an Einsteinian world, what is it to be straight?  The word “correct,” like the word “right” itself, comes from Latin rectus, meaning straight.  The Latin comes, in turn, from the Indo-European root reg-, the same root that gave us the Latin word rex, meaning the king.  Eric Partridge tells us that the king was so called because he was the one who kept us straight, which is to say, in line with his command.  The list of related words, not surprisingly, includes not only regular and regimen, not only reign, realm and region, but the German word Reich.  If the history of language tells us much about ourselves and how we think, then consider the regional differences in civil war America as an instance of rightness.  Consider the history of Germany’s Third Reich as an instance of rightness.  It seems we’ve always defined being “right” as a matter of conformity, in alignment with whatever authority controls our and our neighbors’ ideas.

Being wrong, on the other hand?  Is it destined to be defined only as the belief not in conformity to the view accepted by those in charge?  Sometimes I think I’ve got wrongness understood, thinking I know what it is, thinking I’m able to recognize it when I see it.  But I always seem to end up where I began, going around in circles, as if space itself is twisted, curved, or consisting of thirteen dimensions.  I therefore think my own nomination for the Ten Greatest Blunders of All Time has to go to Carl Linnaeus, for calling us Homo sapiens.

If you have a nomination of your own, please leave it as a comment on this thread, with any explanation, qualification, or other twist you might want to leave with it.

I’m looking forward to your thoughts.



The Tag Line

WMBW’s tagline is “Fallibility>Humility>Civility.”  It’s punctuated to suggest that one state of being should lead naturally to the next.  Because the relationship between these three concepts is central to the idea, today I’ve accepted my brother’s suggestion to comment on the meaning of the words.

Etymology books tell us that “fallibility” comes from the Latin fallere, a transitive verb that meant to cause something to stumble.  In the reflexive form, Cicero’s me fallit (“something caused me to stumble”) bestowed upon our concept of fallibility the useful idea that when one makes a mistake, it isn’t one’s own fault.  As Flip Wilson used to say, “the devil made me do it.”

This is something I adore about language – the way we speak is instructive because it mirrors the way we think.   Therefore, tracing the way language evolves, we can trace the logic (or illogic) of the way we have historically tended to think, and so we can learn something about ourselves.  Applying that concept here leads me to conclude that denying personal responsibility for our mistakes goes back at least as far as Cicero, probably as far as the origins of language itself, and perhaps even farther.  “I did not err,” our ancient ancestors taught their children to say; “something caused me to stumble.”

I also think it’s fun to examine the development of language to see how basic ideas multiply into related concepts, the way parents give rise to multiple siblings.  And so, from the Latin fallere come the French faux pas and the English words false, fallacy, fault, and ultimately, failure and fail.  While I’ve heard people admit that they were at fault when they stumbled, it’s far less common to hear anyone admit responsibility for complete failure.  If someone does, her friends tell her not to be so hard on herself.  Her psychiatrist is liable to label her abnormal, perhaps pathologically so: depressed, perhaps, or at least lacking in healthy self-esteem.  The accepted wisdom tells us that a healthier state of mind comes from placing blame elsewhere, rather than on oneself.  Most interesting.

Humility, meanwhile, apparently began life in the Indo-European root khem, which spawned similar-sounding words in Hittite, Tokharian, and various other ancient languages.  All such words meant the earth, the land, the soil, the ground – that which is lowly, one might say; the thing upon which all of us have been raised to tread.  In Latin the Indo-European root meaning the ground underfoot became humus, and led to English words like exhume, meaning to remove from the ground.  Not long thereafter, one imagines, the very ancient idea that human beings came from the ground (dust, clay, or whatever) or at least lived on it led to the Latin word homo, a derivative of humus, which essentially meant a creature of the ground (as opposed to those of the air or the sea).  From there came the English words human and humanity.  Our humanity, then, might be said to mean, ultimately, our very lowliness.

From the Latin, homo and humus give us two rather contrary sibling words.  These siblings remain in a classic rivalry played out to this day in all manner of ways.  On the one hand, homo and humus give us our word “humility,” the quality of being low to the ground.  We express humility when we kneel before a lord or bow low to indicate subservience.  In this light, humility might be said to be the very essence of humanity, since both embody our lowly, soiled, earth-bound natures.  But our human nature tempts us with the idea that it isn’t good to be so low to the ground.  To humiliate someone else is to put them in their place (to wit, low to the ground, or at least low compared to us).  And while we share with dogs and many other creatures of the land the habit of getting low to express submissiveness, some of our fellow creatures of the land go so far as to lie down and bare the undersides of their necks to show submission.  Few of us are willing to demonstrate that degree of humility.

And so the concept of being a creature of the ground underfoot gives rise to a sibling rivalry.  There arises what might be called the “evil twin” of humility: the scientific name by which we distinguish ourselves from other land-based creatures.  The perception that we are the best and wisest of them gives rise to Homo sapiens, the wise land-creature.  As I’ve pointed out in an earlier blog, even that accolade wasn’t enough to satisfy us for long: now our scientists have bestowed upon us the name Homo sapiens sapiens, or the doubly wise creatures of the earth.  I find much that seems telling in the tension between our humble origins and our self-congratulatory honorific.  As for the current state of the rivalry, I would merely point out that not one of our fellow creatures of the land, as far as I know, has ever called us wise.  It may be only we who think us so.

And now, I turn to “civility.”  Eric Partridge, my favorite etymologist, traces the word back to an Indo-European root kei, meaning to lie down.  In various early languages, that common root came to mean the place where one lies down, or one’s home.  (Partridge asserts that the English word “home” itself ultimately comes from the same root.)  Meanwhile, Partridge tells us, the Indo-European kei morphed into the Sanskrit word siva, meaning friendly.  (It shouldn’t be hard to imagine how the concepts of home and friendliness were early associated, especially given the relationship between friendliness and propagation.)  In Latin, a language which evolved in one of the ancient world’s most concentrated population centers, the root kei became the root ciu- seen in such words as ciuis (a citizen, or person in relation to his neighbors) and ciuitas (a city-state, an aggregation of citizens, the quality of being in such an inherently friendly relationship to others).  By the time we get to English, such words as citizen, citadel, city, civics and civilization, and of course civility itself, all owe their basic meaning to the idea of getting along well with those with whom we share a home.

In the olden days, when one’s home might have been a tent on the Savannah, or a group of villagers occupying one bank of the river, civility was important to producing harmony and cooperation among those who lay down to sleep together.  Such cooperation was important for families to work together and survive.  But as families became villages, villages became cities, and city-states became larger civilizations, we have been expanding the reach of people who sleep together.  (And I mean literally – my Florida-born son, my Japanese-born daughter-in-law, and my grandson, Ryu, who even as I write is flying back from Japan to Florida, remind me of that fact daily.)  Our family has spread beyond the riverbank to the globe.

Given the meanings of all these words, I would ask: how far do our modern senses of “home” and “family” extend?  What does it mean, these days, to be “civilized”?  What does it mean, oh doubly-wise creatures of the earth, to be “humane”?  And in the final analysis, what will it take to “fail”?

— Joe