The Impeachment to Come

First, a series of predictions: The U.S. House of Representatives will impeach Donald Trump. He will not resign, so the Senate will conduct a trial on whatever charges are brought against him. Over the next couple of years, there’ll be plenty of talk about the meaning of “high crimes and misdemeanors.” At the end of the day, once all the evidence is in, I will approve of President Trump’s removal from office for “high crimes and misdemeanors.” Until then, I will try (not always successfully) to keep an open mind. I will view some participants and spectators as sharks in a feeding frenzy. And I will not be able to restrain myself from commenting, especially when I think the street buzz fails to appreciate nuances or fails to put today’s events in historical perspective.

Anticipating all that, and before the gavel brings the first meeting of the Impeachment Committee to order, I thought I’d ask a question intentionally broader than the eventual “high crimes and misdemeanors” question.  Namely, is Donald Trump the most independent, egotistical maverick who has ever served as president?

Perhaps he is. Perhaps cabinet shake-ups, midnight tweets, criminal investigations and mounting criticism by members of his own party demonstrate that the man is out of control, a rogue who has lost all sense of attachment to the country and even to his own political party, an egotistical maverick who thinks he’s smarter than the combined wisdom on Capitol Hill and is prone to take the law into his own hands.

But on the subject of mavericks, I thought I’d take a look at two pieces of historical data.  One of these is how often presidents have used their veto power.   An independent maverick willing to assert himself over the views of the Congress would seem likely to use the veto more often. 

The other is a president’s use of the Executive Order. Bypassing Congress, presidents have sometimes attempted to make law by executive order. The courts have often found that executive orders exceeded proper presidential powers. This is certainly not true of all executive orders. The first such order recognized by the American Presidency Project was George Washington’s order that his cabinet members report back to him on the status of matters in their respective areas of responsibility. There’s obviously a big difference between the executive activism suggested by that order and, say, Harry Truman’s order nationalizing the country’s steel mills. So as a measure of presidential activism, the count of a president’s executive orders may be more problematic than a count of his vetoes. As with vetoes, a president whose party is in control of Congress might be expected to use executive orders less than a president facing an opposition party on Capitol Hill. So there are obviously variables at play, not accounted for by the raw numbers. Still, a president who’s apt to take matters into his own hands, one who tries to control the country personally rather than letting Congress do so, might be expected to issue more executive orders than a more docile, less activist president.

My thought was that the frequency of presidential vetoes and executive orders may provide at least some insight into the degree of ego and power various presidents have attempted to wield while in office.

In the following table, I’ve included the data for every president from FDR through Donald Trump. Before FDR, I’ve included only those presidents who set new record highs for use of executive orders or vetoes. I’ve used each president’s months in office to convert absolute numbers to monthly rates (a simple conversion, sketched below). Here’s what I get, using data from the American Presidency Project and the U.S. Senate.

* Figures for Donald Trump are to date, i.e.,  December of 2018.
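For anyone who wants to reproduce the arithmetic behind those monthly rates, here is a minimal sketch of the conversion. The names and counts in it are illustrative placeholders I’ve made up, not the actual figures from the table (which come from the American Presidency Project and the U.S. Senate).

    # Minimal sketch (in Python) of the rate calculation described above:
    # dividing a president's total executive orders and vetoes by his months
    # in office. The names and counts below are illustrative placeholders,
    # NOT the figures from the table.
    presidents = [
        # (name, months_in_office, executive_orders, vetoes) -- hypothetical
        ("Example President A", 96, 480, 120),
        ("Example President B", 22, 55, 0),
    ]
    for name, months, orders, vetoes in presidents:
        print(f"{name}: {orders / months:.2f} orders/month, "
              f"{vetoes / months:.2f} vetoes/month")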

The numbers above don’t tell the whole story by any means. For example, hundreds of Cleveland’s vetoes were of private pension bills for Civil War veterans. Congress wanted to grant pensions to individual, named veterans after the Pension Bureau had investigated and denied them.  The bills presented the same issue again and again, and the result drastically inflated Cleveland’s total vetoes.

So the bare counts are no doubt subject to all sorts of explanations and interpretations.  But for me, the counts suggest a couple of things worth keeping in mind.

The first is that there have been two growth spurts in presidential activism as measured by these indicators. The first spurt came when the country was being torn apart and put back together again over the slavery question. President Pierce nearly doubled the prior record for executive orders, Lincoln advanced it again, and after Lincoln’s assassination, Johnson and Grant, while trying to put the country back together, more than doubled it yet again. Meanwhile, Johnson and Grant each set new records for presidential vetoes, and did so by large margins. It was certainly a tumultuous time.

The second spurt began with Teddy Roosevelt and ended with Harry Truman, a period spanning the Great Depression and two world wars. That spurt is evident in both executive orders and vetoes, with FDR setting the all-time record for both, despite the fact that his party controlled both houses of Congress for his entire presidency. More tumultuous times.

Judging by that historical pattern, in this time when the country is so polarized and divided, one might expect we’d have an activist president, at least as assessed by these measures.

The second observation I would make is more subjective, but I think it important to think about even so: namely, the correlation between a president’s “executive activism,” as suggested by this data, and his reputation as a great president, as judged by history. To me, this will be important to keep in mind as we face the impeachment proceedings to come – not to argue that Donald Trump is a great president, but to help us remember what standard we’re judging him by, and if we remove him from office, what it is we remove him for.

Putting Grover Cleveland aside, consider how history has regarded the other notables on the list:  Shortly after his election to office, President Lincoln ordered the arrest of several Maryland legislators who favored secession, right before a scheduled vote on secession, for the transparent reason of keeping Maryland from voting to secede.  (Now that was a bold display of executive activism!)  Yet history has judged that bold presidential action by all but forgetting it. 

Two years later, when Lincoln issued his most famous executive order (the Emancipation Proclamation) he took great pains to make sure it was “legal.” Lincoln disagreed with the U.S. Supreme Court’s recent decision in the Dred Scott case that, because slaves were private property under state law, the federal government had no right or power to free them. That decision was the law of the land, but Lincoln circumvented it by asserting that he did have power to confiscate property being used in rebellion against the federal government. So rather than having Congress do it, Lincoln freed the slaves by a stroke of his executive pen. But recognizing the Supreme Court’s ruling, he freed only those slaves in the states that were in armed rebellion against the national government. That respect for the rule of law is something Lincoln is much criticized for today. Current progressive thinking would probably treat him better if he had contravened the law as then decided by the Supreme Court, and used his executive power to free all the slaves. Lincoln was a maverick, but as judged by history, possibly not maverick enough.

Nearly a hundred years later, when President Truman used an executive order to place the country’s steel mills under federal control, the Supreme Court held his order unconstitutional. Truman is also third on the list of most active vetoers in history.  Yet Truman is highly regarded for his independence today.

Theodore Roosevelt, who set new records for issuing Executive Orders and established a reputation as one of the most egotistical mavericks to ever occupy the office, got his face enshrined on Mount Rushmore.   He is often considered one of the five greatest Presidents in American history.

And Franklin Roosevelt, who tried to pack the Supreme Court when too much of his agenda was ruled unconstitutional, who set the record for issuing activist executive orders by a large margin, and who set the record for presidential vetoes even though his own party controlled Congress throughout his presidency, is hailed by many as the best president in history. He is certainly highly regarded by today’s “progressives” for his executive activism.

The point is that, as I see it, history has generally looked upon presidential activism with high regard  — at least when it approves of the goals a president  has pursued.  

So where does President Trump fall on these measures of ego and executive activism? He has used the Executive Order more frequently than President Obama, but then, Obama’s use of the Executive Order was the lowest in modern times. Compared to other modern presidents, Trump’s rate has been relatively low. And as for his use of the veto power, there have been 2,574 presidential vetoes since 1789 — not one of them by Mr. Trump.

There are a lot of ways to measure a President’s ego, independence, and executive activism. If measured by midnight tweets and rash statements made to the television news media, President Trump is surely the most arrogant President in history. (That’s an easy claim to make, considering Andrew Jackson and Ulysses Grant had neither Twitter nor TV.) But measured by such quantifiable things as frequency of executive orders and vetoes, Mr. Trump has been far less of a maverick than either of the Roosevelts, Wilson, or Truman. And as far as I can tell, being mavericks who were not always in line with their own parties had a lot to do with why such men have been regarded well by history.

My point?  I simply hope that, as the impeachment proceedings progress, we keep in mind that impeachment was not designed to punish presidents for having policies and positions we disagree with.  Impeachment was not intended as a remedy for presidents with big egos, or even for those who run counter to the views on Capitol Hill or within their own political parties.  Let’s not impeach Donald Trump because he’s a maverick, unless we think that presidents yet to come who are cut from the mold of Jackson, Lincoln, Wilson and both Roosevelts  will deserve to be impeached for their  roguishness.  Let’s think long and hard, with a sound historical perspective, about the separation of powers, the presidency, and the best meaning to give to “high crimes and misdemeanors.”

I may be wrong, but I predict I’ll have more to say in the months to come about that term.  But those are my thoughts for now.  I look forward to hearing yours.

— Joe

Multiplicity

What do the Kavanaugh hearings, Halloween and Homer’s Odyssey all have in common?

Here’s my take on it.

  1. The Kavanaugh Confirmation Hearings

Someone recently said to me, “Joe, you were a lawyer once.  You understand evidence.  You can see that all the evidence supports my position on this.”  The person who said that to me could have been talking about the Kavanaugh hearings.  Like so much media coverage of the hearings, this fellow thought of a trial as a proceeding in which all the evidence points in one direction or the other.  My answer to him was that if I’d learned anything in thirty years of bar membership, it was that my mother was right: there are always at least two sides to a story, and the truth is generally somewhere in between.  If juries heard only one side’s witnesses and arguments, every verdict would be unanimous.  Is it any wonder that if you tell me what news source you follow, I can pretty well predict how you feel about the world?

In years of practicing law, I saw over and over again how witness testimony polarized over time.  From the plaintiff’s perspective, the size of the wrong and the depth of the injury always grew, while from the defendant’s perspective, the strength of the alibi and the sense of indignation always did likewise.  Add the way politicians and the media frame a case as pitting good against evil, and you have everyone asking which of the witnesses is lying.  In this view, it has to be one or the other.  When I said, about the Kavanaugh hearings, that I thought both witnesses were telling the truth as they saw it, people looked at me like I was some sort of crazed lunatic from outer space.  The hearings, and especially the media coverage of them, left me shaking my head about what made them so typical of polarized American politics today: namely, a complete inability to empathize with the other side.

  2. Halloween

Yesterday, I came across a piece published last year in USA Today titled “5 Halloween Myths and Urban Legends, Debunked.”  Myth Number 3 was titled, “Satan is the Reason for the Season.”  While acknowledging that Halloween can be traced back to ancient Celtic harvest festivals, the article argued that the modern event has nothing to do with Satan, and never could have, as Satan is a Judaeo-Christian character that would have made no sense to the ancient Celtic polytheists who started those harvest festivals.  The article also points out that All Hallows’ Eve is the first of three days Christianity devotes to remembering the souls of the Christian faithful.  The religious origins of the modern holiday have to do with honoring the good dead, not the immortal Satan, the embodiment of evil.

But when it comes to Halloween, like the Kavanaugh hearings, people are polarized.  To many, Halloween will always be about pure evil.  For many on both sides, there’s a complete inability to empathize with the other.

  3. The Odyssey

My first exposure to the Odyssey was probably Kirk Douglas’s portrayal of the classical hero in 1954’s Hollywood version, Ulysses.  While I don’t remember much of that movie, I feel sure that Kirk Douglas’s character must have been very heroic, in the modern sense of that word – which is to say, a particularly good and capable guy fighting the good fight against evil.  My sense of the story has always been that the Cyclops, Poseidon and the suitors were monstrously bad while Odysseus wasn’t far shy of sainthood.  I want to take this opportunity to rave about the new translation I just finished reading, by Emily Wilson.  It manages to be an amazingly easy and accessible read while maintaining the strict metrical qualities of the original.  For the first time, I didn’t have to “study” the epic; I could just read it, and do so at the same pace I might read John Grisham or Dan Brown.  As a result, I acquired a sense of the whole as I never have before.  I strongly recommend her translation, whether you’ve read the epic before or not.

Wilson’s excellent and engaging translation gave me several new perspectives about the story.  One is that the very name Odysseus can be translated as “hated” or at least “disliked.”  He’s easy to hate because he’s not just duplicitous, he’s multiplicitous.  There’s something for everyone to hate.  In Wilson’s words, he is “a migrant…, a political and military leader, a strategist, a poet, a loving husband and father, an adulterer, a homeless person, an athlete, a disabled cripple, a soldier with a traumatic past, a pirate, thief and liar, a fugitive, a colonial invader, a home owner, a sailor, a construction worker, a mass murderer, and a war hero.” Wilson gives much attention to how a person can be so complex and multi-faceted, at once so hated and so loved.  Her Odysseus is anything but the one dimensional champion of goodness that I grew up admiring. Perhaps we see ourselves in him.  Perhaps that’s what allows us to empathize.

It has become common to dismiss the pagan gods as amoral and often wicked libertines that no thinking person could believe were real.  Modern criticism of the Greek gods generally amounts to the argument that they are no better than us human beings.  Wilson points out they’re essentially the same as powerful human beings except that they live forever, but morally and ethically, they’re no better than us.  This strikes me as a natural criticism of deity if you’re comparing it to a God conceived of as morally perfect and all knowing.  But have there been unintended consequences to conceiving of God as the embodiment of perfect goodness and omniscience?  What have been the consequences of living with the aim of achieving such righteousness ourselves?  What have I done by measuring my self-worth by comparison to a single, homogeneous and absolute goodness who has revealed Himself to me?  Has it worked to make me self-righteous?

One reason I’ve always been attracted to Greek myth is that the gods DO behave like human beings.  I’ve long felt that such portrayals allow us to see the consequences of our foibles in archetypal ways that can help us to avoid mistakes as effectively as a lot of sermons I’ve heard.  At its core, the modern worldview suggests that the difference between good and evil is apparent, and that life is simple: if we choose correctly, we’ll live forever in the home of the gods.  The old pagan worldview suggests that life is a constant struggle to sort out the difference between good and bad; that even in the home of the gods, it can be hard to distinguish right from wrong; that sometimes, what seems good to one person (or god) seems bad to another.  In this worldview, there isn’t any Grand Commission of Justice to tell us which is which.

There’s little doubt in my mind that most of us would choose to live in a world where good and evil are clearly defined and labelled. But is the real world more nuanced and dependent on point of view than that?  Wilson points out that Odysseus is offered a perfect and immortal life by Calypso, but turns it down, choosing instead his mortal home in his mortal world.  Is that why we can love him and hate him at the same time?  There are good reasons the Bible has stood the test of time.  I think there are good reasons the Odyssey has too.

So: What similarities do I see between the Kavanaugh hearings, Halloween, and the Odyssey? For me, all three tell us something about the extent to which Platonic thinking about absolutes has changed the world.  In the pre-Platonic, polytheistic world of Odysseus, we could celebrate diverse and multiple perspectives; in the modern world, there must be a single and absolute truth distinguishable by its righteousness.  In the Christian Era, we’re used to hearing the gods of Greek myth dismissed as either “immoral” or “amoral.”  But in the Odyssey, Zeus is the god of justice and of hospitality toward strangers.  One of the most constant themes is that the gods will not approve of mistreating strangers.  It’s not that the Homeric gods don’t care about what’s good and right, but that (just like people) they don’t share a singular and unchanging view of what “goodness” consists of.

Of the many epithets applied to Odysseus (apart from being godlike), most begin with the prefix “poly-,” meaning multiple.  Odysseus is “poly-tropos” (multiply turning), poly-phron (multiply-minded), poly-mechanos (employing multiple devices), poly-tlas (multiply enduring), poly-penthes (multiply-pained), poly-stonos (multiply-sorrowed) and poly-aretos (multiply prayed-for).  In a sense, this multiplicity makes him all things to all people.  It’s a big part of why he’s hated.  He is also incredibly adaptable, assuming different guises and characteristics in different situations.  His understanding of right and wrong is neither absent nor irrelevant – it is simply changing.

All our modern religious and political instincts tell us to condemn such inconstancy.  We’re trained to think in terms of Platonic absolutes, of clear and perfect Goodness on one side and clear and perfect Evil on the other.  We’re told we can identify the Truth and that we’re bound to adhere to it.  If Professor Ford was telling the truth as she saw it, then Judge Kavanaugh had to be lying, as he saw it.  If Halloween is not a glorification of the Judaeo-Christian God, it must be the work of Satan.  If Odysseus is inconsistent from one day to the next, he must represent an inferior state of being because perfect people have to be constant, unchanging and right.

But is there a difference between being constant, unchanging and right, and being rigid, intolerant, and set in our ways?

I’m not advocating for a rudderless, amoral view of the world.  Goodness is certainly worth striving for.  But how can I know for certain I’ve found it, when others disagree with me about what’s good?  Once again, I’m reminded of Alexander Solzhenitsyn’s words:

“If only there were evil people somewhere insidiously committing evil deeds and it were necessary only to separate them from the rest of us and destroy them.  But the line between good and evil cuts through the heart of every human being.  And who’s willing to destroy a piece of his own heart?”

I recently read Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion. The book is worth a read for many reasons, but the concept I found most thought-provoking was Haidt’s view on the evolutionary origins of human reason.  The traditional view is that the capacity for reason and logical analysis evolved in human beings as tools for reaching the best conclusions.  In reality, Haidt suggests, human beings wouldn’t have survived unless they could form immediate judgments about things without reasoned analysis.  (You can’t conduct a reasoned analysis of whether to run from a saber-toothed tiger or not.)  But we are also social animals whose early survival depended on the ability to work together in teams.  And to act as a team, we needed coordinated approaches.  Haidt says our social survival depended on leaders able to persuade others to follow their judgments.  According to Haidt, reason and logical analysis arose about the same time as language did, and they evolved for much the same social purposes: that is, not as tools of decision-making to help an individual determine what’s right, but as tools of persuasion to help convince others to go along with our judgments.  (In the process, we convince ourselves that our judgments are right, too, but that’s a result, not a cause.)

In this view, all of human reasoning has its origins in persuading others, in post-hoc justification to support judgments already formed.  If Solzhenitsyn and Haidt are right, then all the arguments between Professor Ford and Justice Kavanaugh, Democrats and Republicans, Christians and atheists, NPR and Fox News, Halloween enthusiasts and its enemies,  and indeed, between you and me, have to do with persuasion, not with what either one of us has always revered as “reason.”

In this sense, maybe Ford’s and Kavanaugh’s truths are similar.  Last year, I blogged about liking Halloween because it invited us to try out the worldview of a character we normally think of as strange, monstrous, or even evil.  Maybe it isn’t bad that we put ourselves in the shoes of terrible others on Halloween.  Maybe it’s okay to change our understanding of right and wrong at times, to try out new perspectives, just like Homer’s Odysseus did.  Maybe multiplicity helps us empathize.

After listing the many (contradictory) traits her Odysseus exhibits, Emily Wilson  writes, “immersing ourselves in his story, and considering how these categories can exist in the same imaginative space, may help us reconsider the origins of Western literature, and our infinitely complex contemporary world.”

Maybe she’s on to something there?

– Joe

Cretins

In 1595, the early English explorer and colonist John Davys wrote in The Worlde’s Hydrographical Discription (Thomas Dawson, London):

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

In the 1850s, the Reverend Augustus Longstreet – president of a leading American university and minister of the Lord – wrote to his son-in-law regarding the unreasonable behavior of his slaves:

“The creatures persistently refuse to live together as man and wife, even after I have mated them with all the wisdom I possess, and built them such desirable homes.”

About the same time, the famous case of the slave Dred Scott wound its way up to the Supreme Court of the United States.  On its way, the Supreme Court of the state of Missouri found that one of the key issues before it was whether African slavery really did exist for the benefit of the slaves.

Of course, we’ve come a long way since then.  In 1997, Robert Hendrickson wrote, in The Facts on File Encyclopedia of Word and Phrase Origins (Checkmark Books):

“cretin.  Our pejorative cretin, for “an idiot,” began as a kindly word.  In the Middle Ages many deformed people with a low mentality lived in the Alpine regions, their condition resulting from a thyroid condition now known as myxedema, which was possibly caused by a deficiency of iodine in their drinking water.  These unfortunates were called Chrétiens, “Christians,” by the Swiss, because the word distinguished human beings like these people from brutes, and they believed these childlike innocents were incapable of actual sin.  But the kindly word went into French as cretin, meaning “idiot,” and retained the same meaning when it passed into English.”

It leaves me wondering: if our best navigators, University presidents, and supreme courts can be such cretins, where does that leave the rest of us?

Honk if you love word origins.

– Joe

Top Ten Blunders – Your Nominations

A month ago, I asked for your thoughts about the greatest blunders of all time.  I was thinking of blunders from long ago, especially “a list that considers only past human blunders, removed from the passions of the present day.”  I observed, “My special interest lies in blunders where large numbers of people… have believed that things are one way, where the passage of time has proven otherwise.  I believe such a list might help remind us of our own fallibility, as a species…”

I got only five nominations. (I imagine the rest of you are simply reluctant to nominate your own blunders.  But hey.  All of us have done things we’d rather our children not hear about.)  As for those of you who did respond, I’m grateful for your nominations, even if they do imply that blame lies elsewhere than ourselves.  The five I received are certainly food for thought.

One was, “Founding Fathers missed huge by not imposing term limits.”  According to a recent Rasmussen opinion poll, 74% of Americans now favor term limits, with only 13% opposed.*  One could argue the jury is in: the verdict being that the founding fathers should have imposed term limits.  That said, with the average length of service in the U.S. House being 13.4 years, we obviously elect half of our representatives to seven or more consecutive terms.  And Michigan voters have sent John Dingell back to Congress for over fifty-seven years, even longer than his father’s decades of service before him.  Do they feel differently about term limits in Michigan?  If the founding fathers’ failure to impose term limits was a great blunder, don’t the American voters make a far greater blunder every two years when they send these perennial office holders back to Washington? I mean, it’s at least arguable that the Founding Fathers were right in failing to impose term limits.  But who can deny the hypocrisy when an electorate that favors term limits (that means us, folks) does what it would itself prohibit?  Millions of Americans today are either wrong in favoring term limits, or wrong in re-electing the same Congressmen over and over again – and surely wrong in doing both simultaneously.  At least if measured by the number of people involved, the blunder we commit today strikes me as greater than that committed by a handful of wigged men in 1789.

A second nomination: “Y2K has to be in the top 20?”  That one sure brings a smile to my face.  You remember the “experts’” predictions of the global catastrophe we’d see when all those computers couldn’t handle years starting with anything but a 1 and a 9.  Then, when the time came, nothing happened.  I don’t know of a single problem caused by Y2K.  If judged by the certainty of the so-called experts, and the size of the gap between the predicted calamity and what actually transpired, Y2K clearly deserves recognition.

But compare Y2K to other predictions of doom.  There can be no predicted calamity greater than the end of existence itself.  Wikipedia’s article, “List of Dates Predicted for Apocalyptic Events,” includes 152 dates that have been predicted for the end of the world.  And they haven’t been limited to freakish fringes of society.  Standouts include Pope Sylvester II’s prediction that the world would end on January 1, 1000, Pope Innocent III’s that it would end 666 years after the rise of Islam, Martin Luther’s prediction that it would end no later than 1600, and Christopher Columbus’s that it would end in 1501. (When that year ended successfully, he revised his prediction to 1658, long after he’d be dead; he apparently didn’t want to be embarrassed again.)  Cotton Mather’s prediction of 1697 had to be amended twice.  Jim Jones predicted the end in 1967, and Charles Manson in 1969.  My favorite on Wikipedia’s list dates from May 19, 1780, when “a combination of smoke from forest fires, a thick fog, and cloud cover” was taken by members of the Connecticut General Assembly as a sign that the end times had arrived.  (It’s my favorite because it may help explain why the founding fathers saw no need for term limits.)  But fully half of the Wikipedia list consists of predictions made since 1900.  Over twenty-five have come since the Y2K blunder.  The recent predictions include one from a former Presidential candidate (Pat Robertson), who predicted the world would end in 2007.  And though not yet included by Wikipedia, last month’s solar eclipse brought out yet more predictions of the end of the world – never mind that only a tiny fraction of the earth’s surface was in a position to notice it.  (Would the world only end across a thin strip of North America?)

We can laugh at Christopher Columbus, but what of the fact that the list of doomsday prophecies continues to grow, despite how often the doomsayers have been wrong?  Measured by the enormity of the subject matter and the apparent widespread lack of concern about being “twice bitten,” man’s fondness for predicting when the world will end as a result of some mystical interpretation of ancient texts strikes me as a bigger blunder than Y2K – and unlike Y2K, it shows no sign of going away.

A third nomination: “The earth is flat.”  The archetypal human blunder.  Months ago, while struggling to think of other blunders as egregious, I was led by Google to a Wikipedia article on “the flat earth myth,” which I assumed was exactly what I was looking for.  But to my dismay, I read that the “flat earth myth” is not the old belief that the world was flat; rather, it is the current, widely-held belief that people in the Middle Ages believed the earth to be flat!  I’d spent a lifetime feeling proudly superior to the ignorant medieval masses.  Was it me, after all, who was wrong?

My discovery reminded me of the difficulty of ranking human error.  The article asserted that throughout the Middle Ages, the “best minds of the day” knew the earth was not flat.  The “myth” was created in the 17th Century, as part of a Protestant campaign against Catholic Church teachings, accelerated by the fictional assertion in Washington Irving’s popular biography of Christopher Columbus that members of the Spanish court questioned Columbus’s belief that the earth was round.  Gershwin’s unforgettable, “They all laughed at Christopher Columbus…” etched the myth forever in our minds.  The article quotes Stephen Jay Gould: “[A]ll major medieval scholars accepted the Earth’s roundness as an established fact of cosmology.”  The blunder wasn’t a relic of the Middle Ages, but an error of current understanding based on a post-enlightenment piece of popular fiction!

Meanwhile, the Flat Earth Society lives on to this day.  Their website, at theflatearthsociety.org, “offers a home to those wayward thinkers that march bravely on with REASON and TRUTH in recognizing the TRUE shape of the Earth – Flat.”  Most of them, I think, are dead serious.  But wait.  Which is the greater blunder: that of the medieval masses who saw their world as a patchwork of coastlines, rolling hillsides, mountains, valleys, and flat, grassy plains?  Or that of the experts, the major scholars who “knew” in the Middle Ages that the earth was a sphere?  The earth is not a sphere at all, we now know, but a squat, flattened shape that bulges around the equator because of the force of its spin.  Or is that an error, too?  Need we mention that spheres don’t have mountains and valleys?  Need we mention that the surface of the earth, at a sub-atomic level, is anything but curved?  Aren’t all descriptions of the earth’s shape simply approximations?  And if we can accept approximations on the basis that they serve a practical purpose, then is the observable general flatness of the earth today any more “wrong” than a medieval scholar’s belief in sphericity?  Who really needs to know that the atoms forming the surface of the earth are mostly empty space?  The “wrongness” in our concepts of earth’s shape isn’t static, but evolving.

The oldest of the historical blunders nominated for WMBW’s top ten list have an ancient, scriptural flavor.

The first: “The number one thing that went wrong with humanity [was] when the first man said to another, ‘I think I heard god last night!’ and the other believed him.”**

The second comes from a different perspective: “The greatest blunder had to be Eve eating of the fruit of the tree of knowledge, having been tempted to be like God, deciding for herself what is good and what is evil.  Every person [becomes] his own god. The hell of it is, everyone decides differently, and we’re left to fight it out amongst ourselves.”**

The other three nominators thought that Y2K, belief in a flat earth, and failure to impose term limits should be considered for a place somewhere on the top ten list.  (Actually, Y2K’s sponsor only suggested it belonged somewhere in the top 20.)  But the two “religious” nominations were each called the biggest blunder of all.  (One was “the number one thing,” while the other “had to be” the greatest blunder.)   What is it about belief in God that prompts proponents and opponents alike to consider the right belief so important, and holding the wrong one the single greatest blunder of all time?

If you believe in God, though He doesn’t exist, you’re in error, but I don’t see why that error qualifies as the greatest blunder of all time, even when millions suffer from the same delusion.  I remember seeing an article in Science Magazine a few years ago, surveying the research that has attempted to determine whether believers tend to act more morally than non-believers.  Most of the studies showed little or no difference in conduct between the two groups.  For those who don’t believe in God, isn’t it one’s conduct, not one’s belief-system, that is the best measure of error?  For them, why does belief even matter?

If you don’t believe in God, though He does exist, you face a different problem.  If you believe as my mother did – that believing in God (not just any God, but the right God, in the right way) means you’ll spend eternity in Heaven rather than Hell – it’s easy to see why being wrong would matter to you. If roughly half the people in the world are headed to eternal damnation, that’s at least a problem bigger than term limits.

But there is a third alternative on the religious question.  If you’ve looked at the WMBW Home Page or my Facebook profile, you may have noticed my description of my own religious views – “Other, really.”  One of the main reasons for that description is pertinent to this question about the greatest blunders, so I will identify its relevant aspect here: “If God exists, He may care about what I do, but He’s not so vain as to care whether I believe in Him.”  My point is not to advance my reasons for that belief here, simply to point out that it may shed light on why many rank error on the religious question so high on the list of all-time blunders, while I do not.  Many believers, I think, believe it’s critically important to believe, so they try hard to get others to do so.  Non-believers react, first by pointing to examples of believers who’ve been guilty of wrongdoing, and eventually by characterizing the beliefs themselves as the reason for the wrongdoing.  In any case, the nominations concerned with religious beliefs were offered as clearly deserving the number one spot, while our “secular” nominations were put forward with less conviction about where on the list they belong — and that difference may have meaning, or not.

In my solicitation, I acknowledged the gray line between opinion and fact.  To some believers, the terrible consequences of not heeding the Word of God have been proven again and again, as chronicled throughout the Hebrew Bible.  To some non-believers, the terrible consequences of belief have been proven by the long list of wars and atrocities carried out in the name of Gods.  Whichever side you take, the clash of opinions remains as strong as ever.

So, what do I make of it all?  Only that I’d hoped for past, proven blunders which might remind us of our great capacity for error.  Instead, I discover evidence of massive self-contradiction on the part of the current American electorate; a growing list of contemporaries who, as recently as last month, are willing to predict the imminent end of the world; my own blunder, unfairly dismissive of the past, fooled by a piece of Washington Irving fiction; and a world as divided as ever regarding the existence of God.

To this observer, what it all suggests is that there’s nothing unique about the ignorance of past ages; and that an awfully large chunk of modern humanity is not only wrong, but likely making what some future generation will decide are among the greatest blunders of all time.

Sic transit gloria mundi.

–Joe

*http://www.rasmussenreports.com/public_content/politics/general_politics/october_2016/more_voters_than_ever_want_term_limits_for_congress

** I’ve done some editing of punctuation in both of these nominations: I apologize if I’ve thereby changed the submitter’s meaning.

 

The Top Ten Blunders of All Time

For several months now, I’ve been plagued by the thought that certain ways of “being wrong” are different from others.  I’ve wondered whether I’ve confused anything by not mentioning types of error and distinguishing between them.  For example, there are errors of fact and errors of opinion.  (It’s one thing to be wrong in thinking that 2 + 2 = 5, or that Idaho is farther south than Texas, while it’s quite another to be “wrong” about whether Gwyneth Paltrow or Meryl Streep is a better actor.)  Meanwhile, different as statements of fact may be from statements of opinion, all such propositions have in common that they are declarations about present reality.  Not so a third type of error – judgmental errors about what “ought” to be done.  Should I accept my friend’s wedding invitation?  Should I apologize to my brother?  Should we build a wall on the Mexican border?  I might be wrong in my answer to all such questions, but how is it possible to know?

Is there a difference between matters of opinion (Paltrow is better than Streep) and ethics (it’s wrong to kill)?  Many would say there’s an absolute moral standard against which ethics ought to be judged, quite apart from questions of personal preference; others would argue that such standards are themselves a matter of personal preference.  I’ve thought a lot about how different types of error might be distinguished.  But every time I think I’m getting somewhere, I wind up deciding I was wrong.

One of the ways I’ve come full circle relates to the distinction between past and future.  It’s one thing to be wrong about something that has, in fact, happened, and another to be wrong about a prediction of things to come.  Right?  Isn’t one a matter of fact, and the other a matter of opinion?  In doing the research for my recent novel, Alemeth, I came across the following  tidbit  from the Panola Star of December 24, 1856:

The past is disclosed; the future concealed in doubt.  And yet human nature is heedless of the past and fearful of the future, regarding not science and experience that past ages have revealed.

Here I was, writing a historical novel about the divisiveness that led to civil war. I was driven to spend seven years on the project because of the sentiment expressed in that passage: that we can, and ought to, learn from the past.  (Specifically, we should learn that when half the people in the country feel one way, and half the other, both sides labeling the other stupid, or intentionally malicious, an awful lot of people are likely wrong about the matter in question, and the odds seem pretty close to even that any given individual (that includes each of us) is one of the many in the wrong.  And importantly, the great divide wasn’t because all the smart people lived in some states, or all the bad people lived in others: rather, people tended to think as they did because of the prevailing sentiments of the people around them. Hmmm…)

Then, an interesting thing happened in the course of writing the book.  Research began teaching me how many pitfalls there are in thinking we can really know the past.  We have newspapers, and old letters, and other records, but how much is left out of such things?  How many mistakes might they contain?  Indeed, how many were distorted, intentionally, by partisan agendas at the time?  The more I came across examples of each of those things, the less sure I became that we can ever really know the past.  I often can’t remember what I myself was doing ten minutes ago; how will I ever be able to reconstruct how things were for tens of thousands of people a hundred years ago?  Indeed, the more I thought about it, the more I began to circle back on myself, wondering whether the opposite of where I’d started was true:  Because the past has, forever, already passed, we’ll never be able to return to it, to touch it, to look it directly in the eye, right?  Whereas we will have that ability with respect to things yet to come.  If that’s true, the future just might be more “verifiable” than the past.  I get dizzy just thinking about it.

Anyway, an idea I’ve been kicking around is to ask you, WMBW’s readers, to submit nominations for the ten greatest (human) blunders of all time.  I remain extremely interested in the idea, so if any of you are inclined to submit nominations, I’d be delighted.  But the reason I haven’t actually made the request before now stems from my confusion about categories of wrong.  Any list of “the ten greatest blunders of all time” would be focused on the past and perhaps the present, while presumably excluding the future.  But I’m tempted to exclude the present as well.  I mean, I feel confident there are plenty of strong opinions about, say, global warming – and since the destruction of our species, if not all life on earth, may be at stake, sending carbon into the air might well deserve a place on such a list.  Your own top ten blunders of all time list might include abortion, capitalism, Obamacare, the Wall, our presence in Afghanistan, our failure to address world hunger, etc., depending on your politics.  But a top ten list of blunders based on current issues (that is, based on the conviction that “the other side” is currently making a world class blunder) would surprise few of us.  It seems to me the internet and daily news already make the nominees for such a list clear.  What would be served by our adding to it here?

My interest, rather, has been in a list that considers only past human blunders, removed from the passions of the present day.  I believe such a list might help remind us of our own fallibility, as a species.  I for one am constantly amazed, when I research the past, at our human capacity for error.  Not just individual error, but widespread cultural error, or fundamental mistakes in accepted scientific thinking.  My bookshelves are full of celebrations of the great achievements of mankind, books that fill us with pride in our own wisdom, but where are the books which chronicle our stupendous errors, and so remind us of our fallibility? How could nearly all of Germany have got behind Hitler?  How could the South have gone to war to preserve slavery?  How could so many people have believed that disease was caused by miasma, or that applying leeches to drain blood would cure sickness, or that the earth was flat, or that the sun revolves around the earth?

What really interests me is not just how often we’ve been wrong, but how ready we’ve been to assert, at the time, that we knew we were right.  The English explorer John Davys shared the attitude of many who brought European culture to the New World, before native Americans were sent off to reservations:

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

History is full of such declarations.  In researching the American ante-bellum South, not once did I come across anyone saying, “Now, this slavery thing is a very close question, and we may well be wrong, but we think, on balance, that…”  In the days before we knew that mosquitos carried Yellow Fever, scientific pronouncements asserted as fact that the disease was carried by wingless, crawling animalcula that crept along the ground.  This penchant for treating our beliefs as knowledge is why I so love the quote (attributed to various people) that runs, “It ain’t what people don’t know that’s the problem; it’s what they do know, that ain’t so.”

My special interest lies in blunders where large numbers of people – especially educated people, or those in authority – have believed that things are one way, where the passage of time has proven otherwise.  My interest is especially strong if the people were adamant, or arrogant, about what they believed.  Consider this, then, a request for nominations, if you will, especially of blunders with that sort of feel.

Yet be forewarned.  There are reasons I haven’t been able to come up with a list of my own.  One is that, while I’m not particularly interested in errors of judgment or opinion, I’m not sure where the dividing line falls between fact and opinion. Often, as in the debate over global warming, the very passions aroused are over whether the question is a matter of fact or opinion.  Quite likely, what we believe is fact; what our opponents believe is opinion.

The other is the shaky ground I feel beneath my feet when I try to judge historical error as if today’s understanding of truth will be the final word.   Remember when we “learned” that thalidomide would help pregnant women deal with morning sickness?  Or when we “learned” that saccharin causes cancer?  That red wine was good for the heart (or bad?  What are they saying on that subject these days?)  What about when Einstein stood Newton on his head, or the indications, now, that Einstein might not have got it all right?  If our history is replete with examples of wrongness, what reason is there to think that we’ve gotten past such blunders, that today’s understanding of truth is the standard by which we might identify the ten greatest blunders of all time?  Perhaps the greater blunder may be when we confidently identify, as a top ten false belief of the past, something which our grandchildren will discover has been true all along.…

If this makes you as dizzy as it does me, then consider this: The word “wrong” comes from Old English wrenc, a twisting; it’s related to Old High German renken, to wrench, which is why the tool we call a wrench is used to twist things.  This is all akin to the twisting we produce when we wring out a wet cloth, for when such cloth has been thoroughly twisted, wrinkled, or wrung out, we call it “wrong.” Something is wrong, in other words, when it’s gotten so twisted as to be other than straight.

But in an Einsteinian world, what is it to be straight?  The word “correct,” like the word “right” itself, comes from Latin rectus, meaning straight.  The Latin comes, in turn, from the Indo-European root reg-, the same root that gave us the Latin word rex, meaning the king.  Eric Partridge tells us that the king was so called because he was the one who kept us straight, which is to say, in line with his command.  The list of related words, not surprisingly, includes not only regular and regimen, not only reign, realm and region, but the German word Reich.  If the history of language tells us much about ourselves and how we think, then consider the regional differences in civil war America as an instance of rightness.  Consider the history of Germany’s Third Reich as an instance of rightness.  It seems we’ve always defined being “right” as a matter of conformity, in alignment with whatever authority controls our and our neighbors’ ideas.

Being wrong, on the other hand?  Is it destined to be defined only as the belief not in conformity with the view accepted by those in charge?  Sometimes I think I’ve got wrongness understood, thinking I know what it is, thinking I’m able to recognize it when I see it.  But I always seem to end up where I began, going around in circles, as if space itself were twisted, curved, or made up of thirteen dimensions.  I therefore think my own nomination for the Ten Greatest Blunders of All Time has to go to Carl Linnaeus, for calling us Homo sapiens.

If you have a nomination of your own, please leave it as a comment on this thread, with any explanation, qualification, or other twist you might want to leave with it.

I’m looking forward to your thoughts.

Joe

 

The Tag Line

WMBW’s tagline is “Fallibility>Humility>Civility.”  It’s punctuated to suggest that one state of being should lead naturally to the next.  The relationship between these three concepts being central to the idea, today I’ve accepted my brother’s suggestion to comment about the meaning of the words.

Etymology books tell us that “fallibility” comes from the Latin fallere, a transitive verb that meant to cause something to stumble.  In the reflexive form, Cicero’s me fallit (“something caused me to stumble”) bestowed upon our concept of fallibility the useful idea that when one makes a mistake, it isn’t one’s own fault.  As Flip Wilson used to say, “the devil made me do it.”

This is something I adore about language – the way we speak is instructive because it mirrors the way we think.   Therefore, tracing the way language evolves, we can trace the logic (or illogic) of the way we have historically tended to think, and so we can learn something about ourselves.  Applying that concept here leads me to conclude that denying personal responsibility for our mistakes goes back at least as far as Cicero, probably as far as the origins of language itself, and perhaps even farther.  “I did not err,” our ancient ancestors taught their children to say; “something caused me to stumble.”

I also think it’s fun to examine the development of language to see how basic ideas multiply into related concepts, the way parents give rise to multiple siblings.  And so, from the Latin fallere come the French faux pas and the English words false, fallacy,  fault, and ultimately, failure and fail.  While I’ve heard people admit that they were at fault when they stumbled, it’s far less common to hear anyone admit responsibility for complete failure.  If someone does, her friends tell her not to be so hard on herself.  His psychiatrist is liable to label him abnormal, perhaps pathologically so: depressed, perhaps, or at least lacking in healthy self-esteem.  The accepted wisdom tells us that a healthier state of mind comes from placing blame elsewhere, rather than on oneself.  Most interesting.

Humility, meanwhile, apparently began life in the Indo-European root khem, which spawned similar-sounding words in Hittite, Tokharian, and various other ancient languages.  All such words meant the earth, the land, the soil, the ground – that which is lowly, one might say; the thing upon which all of us have been raised to tread.  In Latin the Indo-European root meaning the ground underfoot became humus, and led to English words like exhume, meaning to remove from the ground.  Not long thereafter, one imagines, the very ancient idea that human beings came from the ground (dust, clay, or whatever) or at least lived on it led to the Latin word homo, a derivative of humus, which essentially meant a creature of the ground (as opposed to those of the air or the sea).  From there came the English words human and humanity.  Our humanity, then, might be said to mean, ultimately, our very lowliness.

From the Latin, homo and humus give us two rather contrary sibling words.  These siblings remain in a classic rivalry played out to this day in all manner of ways.  On the one hand, homo and humus give us our word “humility,” the quality of being low to the ground.  We express humility when we kneel before a lord or bow low to indicate subservience. In this light, humility might be said to be the very essence of humanity, since both embody our lowly, soiled, earth-bound natures.  But our human nature tempts us with the idea that it isn’t good to be so low to the ground.  To humiliate someone else is to put them in their place (to wit, low to the ground, or at least, low compared to us). And while we share with dogs and many other creatures of the land the habit of getting low to express submissiveness, some of our fellow creatures of the land go so far as to lie down and bare the undersides of their necks to show submission.  Few of us are willing to demonstrate that degree of humility.

And so the concept of being a creature of the ground underfoot gives rise to a sibling rivalry — there arises what might be called the “evil twin” of humility, and it is the scientific name by which we distinguish ourselves from other land-based creatures: the perception that we are the best and wisest of them gives rise to Homo sapiens, the wise land-creature.  As I’ve pointed out in an earlier blog, even that accolade wasn’t enough to satisfy us for long: now our scientists have bestowed upon us the name Homo sapiens sapiens, or the doubly wise creatures of the earth.  I find much that seems telling in the tension between our humble origins and our self-congratulatory honorific.  As for the current state of the rivalry, I would merely point out that not one of our fellow creatures of the land, as far as I know, has ever called us wise.  It may be only us who think us so.

And now, I turn to “civility.”  Eric Partridge, my favorite etymologist, traces the word back to an Indo-European root kei, meaning to lie down. In various early languages, that common root came to mean the place where one lies down, or one’s home. (Partridge asserts that the English word “home” itself ultimately comes from the same root.)  Meanwhile, Partridge tells us, the Indo-European kei morphed into the Sanskrit word siva, meaning friendly.  (It shouldn’t be hard to imagine how the concepts of home and friendliness were early associated, especially given the relationship between friendliness and propagation.) In Latin, a language which evolved in one of the ancient world’s most concentrated population centers, the root kei became the root ciu- seen in such words as ciuis (a citizen, or person in relation to his neighbors) and ciuitas (a city-state, an aggregation of citizens, the quality of being in such an inherently friendly relationship to others).  By the time we get to English, such words as citizen, citadel, city, civics and civilization, and of course civility itself, all owe their basic meaning to the idea of getting along well with those with whom we share a home.

In the olden days, when one’s home might have been a tent on the savannah, or a group of villagers occupying one bank of a river, civility was what produced harmony and cooperation among those who lay down to sleep together.  Such cooperation was essential if families were to work together and survive.  But as families became villages, villages became cities, and city-states grew into larger civilizations, the circle of people who lie down together has kept expanding.  (And I mean that literally – my Florida-born son, my Japanese-born daughter-in-law, and my grandson, Ryu, who even as I write is flying back from Japan to Florida, remind me of the fact daily.)  Our family has spread beyond the riverbank to the globe.

Given the meanings of all these words, I would ask: how far do our modern senses of “home” and “family” extend?  What does it mean, these days, to be “civilized”?  What does it mean, oh doubly-wise creatures of the earth, to be “humane”?  And in the final analysis, what will it take to “fail”?

— Joe

MLK and the Dream

I had a dream last night; I woke up this morning thinking about it.  My train of thought went from there to Martin Luther King’s dream.  Remembering the late civil rights leader led me to contemplate a sort of ironic coincidence: last Monday – the 16th, the Martin Luther King Holiday – was the very day I made the final revisions to my novel, Alemeth, and began formatting it for the printing company.

Completion of the novel is the fulfillment of a dream.  I could trace its origins back to the early 1960s, possibly even to the very year of King’s famous 1963 speech, when my grandmother first showed me some of the letters my great uncle Alemeth had written home from the front lines during the Civil War.  Or I could trace its origins to a dinner that Karen and I had with our friends Roger and Lynda ten years ago, when a lively discussion got me thinking about a novel that explored (or even tested) the differences between fiction and non-fiction.  Or I could trace it back seven years, to when I chose to write Alemeth’s life story.  No matter how far back I date its origins, the novel has been many years in the making.  Somewhere along the way, a novel based on Alemeth’s life became a dream, and it seemed ironic that the dream was finally fulfilled on the Martin Luther King Holiday.

But the coincidence seemed ironic for a reason deeper than the fact that the novel had been a sort of dream for me: the themes of King’s famous “I Have a Dream” speech and the themes of Alemeth are so closely related.

For King’s dream, we need scant reminder.  “[O]ne day… even the state of Mississippi… will be transformed into an oasis of freedom and justice.” “[M]y four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character…” My great uncle, Alemeth Byers, the title character of my novel, was the son of a cotton planter in Mississippi.  The family owned sixty slaves when the Civil War began.   In calling Mississippi “a state sweltering with the heat of injustice, sweltering with the heat of oppression,” Martin Luther King had been talking about my family.

Early in my research into Alemeth’s life, I began to confront something that, for me, was terribly unsettling.  My grandparents were among the kindest, most “Christian,” most tolerant people I knew.  But as I grew older, my research into their lives, and into their parents’ lives, revealed more and more evidence of racial bigotry.  In old correspondence, these prejudices pop up often; and, most alarming of all, when I looked honestly at the historical record, I saw those prejudices being passed down from generation to generation.

In one respect, I felt I was confronting a paradox of the highest order.  My mother was kind and loving, and my sense was that her kindness owed much to the kindness of her own parents.  My instincts applied the same presumption to their parents, as if “loving and kind” were a trait carried down in the genes, or at least in the serious study of Christian Scripture.  (My grandmother was a Sunday School teacher; my childhood visits to her house always included Bible study.)  Presuming that my great grandparents were as kind and loving as my grandparents, and knowing that they, too, had been devout Christians, I found it paradoxical that all this well-studied and well-practiced Christianity not only tolerated racial bigotry but, in great uncle Alemeth’s day, was used to justify a war to preserve human bondage.  Frankly, it made no sense.  I wondered: how did these people square their Christian beliefs with their ownership of so many slaves?  With their support for a war intended to preserve their “property rights” in these other people?

It was even more unsettling, then, to realize how the “squaring” had occurred.  George Armstrong’s The Christian Doctrine of Slavery (Charles Scribner, New York, 1857) made a fascinating read.  That work expounded, in argument after argument, based on scripture after scripture, how God had created the separate races, given Moses Commandments that made no mention of slavery, instructed the Israelites to make slaves of their heathen enemies (Leviticus 25:44-46), sent a Son to save us who never once condemned slavery though he lived in its midst, and inspired Saint Paul to send the slave Onesimus back to Philemon with instructions to be a good, obedient slave to his master.  Armstrong’s work was perhaps the most influential, but it was by no means an isolated view.  My research uncovered source after source showing how the slave owners of the antebellum South squared their support of slavery with their Christianity: they interpreted Christian Scripture as supporting the institution.  Indeed, some sermons of the day argued that being a good Christian required a commitment to the defense of slavery, because civilized white people had a Christian duty to care for their “savage” African slaves.  In the end, of course, they were so convinced they were right that they were willing to go to war and fight (and die) for it.  (Their cause being a righteous one, the killing of people in its support met all the requirements of a “Just War” as traditional Christian doctrine expounded it.)

For me, it was an eye-opener to realize that southern Christians based their support of slavery squarely on Christian scripture.  It was also an eye-opener to see how the beliefs and attitudes of the community were shared, both horizontally and vertically.  By horizontally, I mean how family members, neighbors, newspapers, courts, elected representatives, school teachers and preachers all worked together to homogenize the Southern attitude toward slavery.  (It was rare to find a voice of dissent; the compelling conclusion is that the few dissenters tended to keep their opinions to themselves, for fear of being run out of town, as those considered “unsound on the slavery question” generally were.)  By vertically, I mean how attitudes and beliefs were passed down from one generation to the next, most strongly within immediate families, but also within whole communities and cultures.  My research extended back in time to the racism of our national heroes, Washington and Jefferson, and forward in time through my grandparents, my parents, and –

Indeed.  What about myself?  Historical research shows again and again how, once accepted in a family or community, “wrong” attitudes and beliefs can be passed down easily from one generation to the next.  Is it possible I could be exempt from such influences?  Somehow free to form my opinions entirely on reason and logic, safe from any familial or cultural biases?  All my historical research has led me to conclude that we are most prone to be blind to the wrongness within whatever is most familiar; if that’s true, what are the ramifications for my own attitudes and beliefs?  How much of the racism inherent in my family history clings to my own way of thinking?  I hope none of it, of course, but how likely is it that some of it persists?

I will repeat a quote from The Gulag Archipelago, one I mentioned in a prior WMBW post and managed to squeeze into Alemeth as well.  Alexander Solzhenitsyn expressed a wish for an easier world:

If only there were evil people somewhere insidiously committing evil deeds and it were necessary only to separate them from the rest of us and destroy them. But the line between good and evil cuts through the heart of every human being. And who’s willing to destroy a piece of his own heart?

I’ll have more to say in later posts about what psychologists call the “bias blind spot.”  For now, suffice it to say that much as I share King’s dream of a day when prejudice will be a thing of the past, I fear that as long as we have families, as long as parents teach their children, as long as such a thing as “culture” exists, we will all have our prejudices.  Many of them, I believe, will have been inherited from our parents and grandparents.  Others will have come from school teachers, preachers, news sources, national heroes, or friends.  A rare few, perhaps, we will have created entirely on our own.  But they will be there.  And others who see this the same way I do have suggested an idea that makes a great deal of sense to me: that to set out on the path toward a more just world, we’d do well to begin by trying (as best we can) to identify what our own biases are.

In Alemeth, I have tried to take a step in that direction.  Early in the evolution of the novel, I found myself asking whether it was I who was creating Alemeth, or Alemeth who had created me.  It’s a novel about my family – about the culture that began the process of making me what I am – and it’s not an entirely pretty picture.  But the dream that inspired it, and the research and thought given to the project, are also largely responsible for the existence of something else.  I don’t think I’ll be giving too much away with a hint: the last four words of the novel are “we may be wrong.”

— Joe