News, Thoughts and Opinions

Top Ten Blunders – Your Nominations

A month ago, I asked for your thoughts about the greatest blunders of all time.  I was thinking of blunders from long ago, especially “a list that considers only past human blunders, removed from the passions of the present day.”  I observed, “My special interest lies in blunders where large numbers of people… have believed that things are one way, where the passage of time has proven otherwise.  I believe such a list might help remind us of our own fallibility, as a species….”

I got only five nominations.  (I imagine the rest of you are simply reluctant to nominate your own blunders.  But hey.  All of us have done things we’d rather our children not hear about.)  As for those of you who did respond, I’m grateful for your nominations, even if they do imply that the blame lies elsewhere than with ourselves.  The five I received are certainly food for thought.

One was, “Founding Fathers missed huge by not imposing term limits.”  According to a recent Rasmussen opinion poll, 74% of Americans now favor term limits, with only 13% opposed.*  One could argue the jury is in: the verdict being that the Founding Fathers should have imposed term limits.  That said, with the average length of service in the U.S. House standing at 13.4 years (nearly seven consecutive two-year terms), we clearly keep sending the same representatives back.  And Michigan voters sent John Dingell back to Congress for over fifty-seven years, even longer than his father’s decades of service before him.  Do they feel differently about term limits in Michigan?  If the Founding Fathers’ failure to impose term limits was a great blunder, don’t the American voters make a far greater blunder every two years when they send these perennial office holders back to Washington?  I mean, it’s at least arguable that the Founding Fathers were right in failing to impose term limits.  But who can deny the hypocrisy when an electorate that favors term limits (that means us, folks) does what it would prohibit?  Millions of Americans today are either wrong in favoring term limits, or wrong in re-electing the same Congressmen over and over again – and surely wrong in doing both simultaneously.  At least if measured by the number of people involved, the blunder we commit today strikes me as greater than that committed by a handful of wigged men in 1789.

A second nomination: “Y2K has to be in the top 20?”  That one sure brings a smile to my face.  You remember the “experts’” predictions of the global catastrophe we’d see when all those computers, which stored years as just two digits, couldn’t handle a year that didn’t begin with a 1 and a 9.  Then, when the time came, nothing happened.  I don’t know of a single problem caused by Y2K.  If judged by the certainty of the so-called experts, and the size of the gap between the predicted calamity and what actually transpired, Y2K clearly deserves recognition.
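(For readers who never met the bug face to face, here is a minimal sketch, mine and not the nominator’s, of the two-digit-year arithmetic that had everyone worried.  The function names are hypothetical, purely for illustration.)

```python
# A minimal sketch of the Y2K problem: many legacy systems stored the year
# as two digits, implicitly assuming the century was "19".

def years_elapsed(start_yy, end_yy):
    """Naive two-digit-year subtraction, as many legacy programs did it."""
    return end_yy - start_yy  # fine until the century rolls over

# An account opened in 1998 ("98"), checked in 2001 ("01"):
print(years_elapsed(98, 1))   # -97: time appears to run backwards

# One common remediation, "windowing," inferred the century from a pivot year:
def expand_year(yy, pivot=50):
    """Interpret two-digit years >= pivot as 19xx, the rest as 20xx."""
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(98), expand_year(1))   # 1998 2001
```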

But compare Y2K to other predictions of doom.  There can be no predicted calamity greater than the end of existence itself.  A glance at Wikipedia’s article, “List of Dates Predicted for Apocalyptic Events,” turns up 152 dates that have been predicted for the end of the world.  And they haven’t been limited to freakish fringes of society.  Standouts include Pope Sylvester II’s prediction that the world would end on January 1, 1000, Pope Innocent III’s that it would end 666 years after the rise of Islam, Martin Luther’s prediction that it would end no later than 1600, and Christopher Columbus’s that it would end in 1501.  (When that year ended successfully, he revised his prediction to 1658, long after he’d be dead; he apparently didn’t want to be embarrassed again.)  Cotton Mather’s prediction of 1697 had to be amended twice.  Jim Jones predicted the end in 1967, and Charles Manson in 1969.  My favorite on Wikipedia’s list dates from May 19, 1780, when “a combination of smoke from forest fires, a thick fog, and cloud cover” was taken by members of the Connecticut General Assembly as a sign that the end times had arrived.  (It’s my favorite because it may help explain why the founding fathers saw no need for term limits.)  But fully half of the Wikipedia list consists of predictions made since 1900.  Over twenty-five have come since the Y2K blunder.  The recent predictions include one from a former Presidential candidate, Pat Robertson, who predicted the world would end in 2007.  And though not yet included by Wikipedia, last month’s solar eclipse brought out yet more predictions of the end of the world – never mind that only a tiny fraction of the earth’s surface was in a position to notice it.  (Would the world only end across a thin strip of North America?)

We can laugh at Christopher Columbus, but what of the fact that the list of doomsday prophecies continues to grow, despite how often the doomsayers have been wrong?  Measured by the immensity of the subject matter and the apparent widespread lack of concern about being “twice bitten,” man’s fondness for predicting when the world will end on the strength of some mystical interpretation of ancient texts strikes me as a bigger blunder than Y2K – and unlike Y2K, it shows no sign of going away.

A third nomination: “The earth is flat.”  The archetypal human blunder.  Months ago, while struggling to think of other blunders as egregious, I was led by Google to a Wikipedia article on “the flat earth myth,” which I assumed was exactly what I was looking for.  But to my dismay, I read that the “flat earth myth” is not the old belief that the world was flat; rather, it is the current, widely-held belief that people in the Middle Ages believed the earth to be flat!  I’d spent a lifetime feeling proudly superior to the ignorant medieval masses.  Was it me, after all, who was wrong?

My discovery reminded me of the difficulty of ranking human error.  The article asserted that throughout the Middle Ages, the “best minds of the day” knew the earth was not flat.  The “myth” was created in the 17th Century as part of a Protestant campaign against Catholic Church teachings, and accelerated in the 19th by the fictional assertion in Washington Irving’s popular biography of Christopher Columbus that members of the Spanish court questioned Columbus’s belief that the earth was round.  Gershwin’s unforgettable “They all laughed at Christopher Columbus…” etched the myth forever in our minds.  The article quotes Stephen Jay Gould: “[A]ll major medieval scholars accepted the Earth’s roundness as an established fact of cosmology.”  The blunder wasn’t a relic of the Middle Ages, but an error of current understanding based on a post-Enlightenment piece of popular fiction!

Meanwhile, the Flat Earth Society lives on to this day.  Their website, at theflatearthsociety.org, “offers a home to those wayward thinkers that march bravely on with REASON and TRUTH in recognizing the TRUE shape of the Earth – Flat.”  Most of them, I think, are dead serious.  But wait.  Which is the greater blunder: that of the medieval masses who saw their world as a patchwork of coastlines, rolling hillsides, mountains, valleys, and flat, grassy plains?  Or that of the experts, the major scholars who “knew” in the Middle Ages that the earth was a sphere?  The earth is not a sphere at all, we now know, but an oblate spheroid, squashed at the poles and bulging around the equator because of the force of its spin.  Or is that an error, too?  Need we mention that spheres don’t have mountains and valleys?  Need we mention that the surface of the earth, at a sub-atomic level, is anything but curved?  Aren’t all descriptions of the earth’s shape simply approximations?  And if we can accept approximations on the basis that they serve a practical purpose, then is the observable general flatness of the earth today any more “wrong” than a medieval scholar’s belief in sphericity?  Who really needs to know that the atoms forming the surface of the earth are mostly empty space?  The “wrongness” in our concepts of earth’s shape isn’t static, but evolving.

The two oldest of the historical blunders nominated for WMBW’s top ten list have an ancient, scriptural flavor.

The first: “The number one thing that went wrong with humanity [was] when the first man said to another, ‘I think I heard god last night!’ and the other believed him.”**

The second comes from a different perspective: “The greatest blunder had to be Eve eating of the fruit of the tree of knowledge, having been tempted to be like God, deciding for herself what is good and what is evil.  Every person [becomes] his own god. The hell of it is, everyone decides differently, and we’re left to fight it out amongst ourselves.”**

The other three nominators thought that Y2K, belief in a flat earth, and failure to impose term limits should be considered for a place somewhere on the top ten list.  (Actually, Y2K’s sponsor only suggested it belonged somewhere in the top 20.)  But the two “religious” nominations were each called the biggest blunder of all.  (One was “the number one thing,” while the other “had to be” the greatest blunder.)   What is it about belief in God that prompts proponents and opponents alike to consider the right belief so important, and holding the wrong one the single greatest blunder of all time?

If you believe in God, though He doesn’t exist, you’re in error, but I don’t see why that error qualifies as the greatest blunder of all time, even when millions suffer from the same delusion.  I remember seeing an article in Science Magazine a few years ago, surveying the research that has attempted to determine whether believers tend to act more morally than non-believers.  Most of the studies showed little or no difference in conduct between the two groups.  For those who don’t believe in God, isn’t it one’s conduct, not one’s belief-system, that is the best measure of error?  For them, why does belief even matter?

If you don’t believe in God, though He does exist, you face a different problem.  If you believe as my mother did – that believing in God (not just any God, but the right God, in the right way) means you’ll spend eternity in Heaven rather than Hell – it’s easy to see why being wrong would matter to you. If roughly half the people in the world are headed to eternal damnation, that’s at least a problem bigger than term limits.

But there is a third alternative on the religious question.  If you’ve looked at the WMBW Home Page or my Facebook profile, you may have noticed my description of my own religious views – “Other, really.”  One of the main reasons for that description is pertinent to this question about the greatest blunders, so I will identify its relevant aspect here: “If God exists, He may care about what I do, but He’s not so vain as to care whether I believe in Him.”  My point is not to advance my reasons for that belief here, but simply to point out that it may shed light on why many rank error on the religious question so high on the list of all-time blunders, while I do not.  Many believers, I think, hold it critically important to believe, so they try hard to get others to do so.  Non-believers react, first by pointing to examples of believers who’ve been guilty of wrongdoing, and eventually by characterizing the beliefs themselves as the cause of the wrongdoing.  In any case, the nominations concerned with religious beliefs were offered as clearly deserving the number one spot, while our “secular” nominations were put forward with less conviction about where on the list they belong – and that difference may have meaning, or not.

In my solicitation, I acknowledged the gray line between opinion and fact.  To some believers, the terrible consequences of not heeding the Word of God have been proven again and again, as chronicled throughout the Hebrew Bible.  To some non-believers, the terrible consequences of belief have been proven by the long list of wars and atrocities carried out in the name of gods.  Whichever side you take, the clash of opinions remains as strong as ever.

So, what do I make of it all?  Only that I’d hoped for past, proven blunders which might remind us of our great capacity for error.  Instead, I discover evidence of massive self-contradiction on the part of the current American electorate; a growing list of contemporaries who, as recently as last month, are willing to predict the imminent end of the world; my own blunder, unfairly dismissive of the past, fooled by a piece of Washington Irving fiction; and a world as divided as ever regarding the existence of God.

To this observer, what it all suggests is that there’s nothing unique about the ignorance of past ages; and that an awfully large chunk of modern humanity is not only wrong, but likely making what some future generation will decide are among the greatest blunders of all time.

Sic transit gloria mundi.

–Joe

*http://www.rasmussenreports.com/public_content/politics/general_politics/october_2016/more_voters_than_ever_want_term_limits_for_congress

** I’ve done some editing of punctuation in both of these nominations; I apologize if I’ve thereby changed the submitters’ meaning.



Asking the Ad Hominem Question

I generally wince when someone debating Topic X starts talking about his opponents, giving reasons he thinks his opponents believe as they do, trying to discredit their position by psycho-analyzing the reasons they hold it, or expressing his disapproval of “the sort of people” who hold such positions.  It’s typically a variant of a thought analyzed well by Kathryn Schulz in Being Wrong: “I think the way I do because all evidence and logic support me; the only reason you think the way you do is because you suffer from… [here, fill in the blank.]”  As I see it, such ad hominem arguments are often the resort of those unable to make a good argument on Topic X itself.  Moreover, by making the debate personal, the ad hominem debater usually comes across as insulting, and that’s a sure-fire recipe for things to get ugly quickly.

I think it’s quite different to pose an ad hominem question to oneself.  Asking ourselves why we believe what we do, when others don’t agree with us, can be a mind-opening exercise.  (In case it’s not clear, “I believe what I believe because it’s true, and others disagree because they’re stupid” doesn’t count.)

Allow me to offer an example.  Having gotten some flak from readers for my thoughts about Charlottesville, I decided to ask myself the ad hominem question in an effort to understand why I favor removing statues of Confederate generals from public squares, when others don’t.  What is it about my background that causes me to favor such removal?

I’m pretty sure it was my career as an employment lawyer, a capacity in which I was often asked to advise employers on diversity issues and strategies for legally maintaining a dedicated, harmonious, loyal (and therefore productive) workforce.    Many of my clients experienced  variations on a problem I’ll call cultural conflict in the workplace, by which I don’t mean conflict between employer and employees, but among employees themselves.

The conflict involved was often racial, religious or gender-based.  For example, one company piped music from a radio station into its warehouses, only to discover that one group wanted to listen to a country music station, another a Latin station, and a third an R&B, Hip-Hop or Motown station.  Each group claimed it was being discriminated against if it didn’t get its way.  Another variant of the problem arose when assembly line workers came to work wearing T-shirts that other employees found offensive — one T-shirt featured a burning cross; another the picture of a person wearing a white sheet and hood while aiming a gun at the viewer; another featured the “N” word; still others featured raised fists and the words “Black Power,” or implied a revolution against “white rule.”

A frequent variant on the “culture conflict” problem involved office environments where employees shared cubicles and wanted to decorate them with words or images that their neighbors or cubicle-mates found offensive.  In one case, a Christian employee began to hang skeletons, ghouls, devils and demons all over a shared cubicle, beginning in August, in preparation for Halloween.  Her Christian cubicle-mate believed that celebrating Halloween at all was the work of the devil, and countered by hanging crucifixes, pictures of Jesus, manger scenes and Bible quotations on the shared cubicle wall, saying that devil worshipers would go to Hell.  A third resident of that same cubicle corner — the one who actually complained to management — had religious convictions that prohibited the celebration of any holidays or the use of any religious imagery at all, on the grounds that all of it was idol-worship; she wanted it all removed.

Perhaps the most common variant of the culture conflict was in workplaces where male employees wanted to hang calendars or other pictures of naked (or scantily clad) women, while  women (and some men) objected on the grounds the working environment was made illegally offensive as a result.

In one case, there was already a racially charged atmosphere: a group of white ‘good ole boys’ always ate at one lunch table while blacks ate at another.  There’d been some mild taunting back and forth, but nothing too serious, when one day, several of the white employees started “practicing their knot tying skills” by making rope nooses in plain view of the blacks at the other table.  The blacks saw an obvious message which the whites of course denied.

In all such cases, the employer was left to decide what to do.  There were difficult legalities to deal with.  Some employers tried to address the problem by declaring that employees could post no personal messages on company property (like cubicle walls), but could post what they wanted on their own property (their lunch boxes, tool boxes, T-shirts, etc.).  But the public/private property distinction didn’t end the problem.  Someone who brings a swastika and a “Death to All Jews” decal to work on his lunchbox is an obvious problem for workplace harmony, regardless of what the law says about it.

Surely, my background in this area shapes my views about cultural conflict regarding statues in public squares.  And I think what decided my position on statues was that such problems arose among my clients scores of times, yet never once was it the employer itself that wanted to post the material, wear the T-shirt, celebrate the holiday, practice tying knots, or whatever.  It was always a question of playing referee in the conflict between opposing groups of employees.

I believe it’s by analogy to that situation that I instinctively consider the problem faced by a government body deciding what or who to memorialize in the public square.  I don’t claim it’s an easy task.  But if a company or city had ever asked me if I thought it ought to hang crucifixes in its cubicles, display a picture of the devil in its lunchroom, hang a Confederate flag or a Playboy centerfold in the conference room, or have its managers fashion nooses during an employee meeting, I’d have been flabbergasted.  It’s not that Robert E. Lee is like Satan, or Jesus, or a Playboy centerfold, if we’re talking moral qualities, or what OUGHT to be offensive.  Rather, it’s the fact that, in my experience, all that mattered to the employers was that some of their employees considered the displays offensive.  When a display was controversial, it was viewed as a problem.  And without exception, my clients took pains not to introduce controversial images themselves.

In abstract theory, I can imagine that some symbols or ideas might be so important to the common good that an employer (or city council) should celebrate them, despite their being divisive.  (A statue of the sitting President?)  But in the case of Confederate generals who fought to preserve an institution that has been illegal for 150 years now, my own cultural background — including my work experience — gives me no clue as to what their countervailing importance might be.

Anyway, I really do wince when people make ad hominem arguments against their opponents, but I like asking the ad hominem question of myself.  Whatever you think about Confederate generals, I’d love to hear from you if you’ve given the thought-experiment a try, especially if it has helped you understand differences in points of view between yourself and others.

— Joe


Thoughts About Charlottesville

Last week’s tragedy in Charlottesville  has touched close to home here in Richmond, the capital of the old Confederacy.  Lt. H. Jay Cullen was one of two police officers killed in the effort to restore peace.  His viewing is tonight; his funeral is tomorrow.    My optometrist is attending because she serves as a delegate to the state house.  My daughter is attending because she’s a former co-worker and friend of Lt. Cullen’s wife.  Amidst the grief and mourning, the firestorm of what passes for debate regarding the whole affair cries out for a WMBW perspective.

A few months back, when the removal of four confederate statues in New Orleans was in the news, my own thinking distinguished among the statues.  I thought the removal of some made sense, but not others.  I was struck by the fact that no one else seemed to consider them as separate cases.  Everyone seemed to have adopted an all-or-nothing posture: either you were for, or against, the “removal of the statues,” as if alignment with one side or the other mattered more than considering the merits of each statue on its own.  Was I the only one in my circle who saw a middle ground?  I still worry about a group-think tendency to align entirely with one side or the other.  Such polarizing alignment seems to me precisely what led to the Civil War in the first place.   But in the meantime, Charlottesville has caused me to consider the matter anew – and I’ve decided I was wrong about the statues in New Orleans.

I approved of the removal of most of the New Orleans statues, but felt otherwise about the statue of Robert E. Lee.  My opposition was on the ground that Lee was a good (if imperfect) man and that to remove his statue did an injustice, both to history and to him personally.  Now, I believe that I was wrong about the Lee statue, and I’m moved to explain why.

First, let’s consider what history tells us about the man in relation to slavery.  While historians disagree on certain details, it seems clear that Lee personally ordered the corporal punishment of slaves who resisted his authority.  Today, all but the most extreme white supremacists can agree that this was wrong.  Of course, Lee’s treatment of his slaves was not remarkably better or worse than that of thousands of other white men who owned slaves in those days; he apparently believed what most white Southerners (as well as many in the North) believed: that the Bible made it their Christian duty to “look after” African Americans.  And for Lee, as for most slave-owners, this paternalistic attitude included both kindness (especially as a reward for loyalty and good work) and infliction of severe corporal punishment (as a deterrent to disobedience).  These days, it’s extremely hard to understand how so many people could have been so wrong, but hundreds of thousands of Lee’s white peers thought and acted as he did in their relation to black Americans.  This conduct was widespread; it shames us all.  While being widespread doesn’t justify what Lee did, it makes it a lot easier for me to recognize that Lee was much like the rest of us: i.e., capable of well-intended conduct that future generations may condemn as fundamentally, grievously wrong.

My admiration for Lee – which continues — comes despite his participation in the injustice of slavery.  I also admire slave-owners like George Washington and Thomas Jefferson (despite the fact that Jefferson described African Americans as having “a very strong and disagreeable odor,” a capacity for love limited to “eager desire” more than sentiment, and a capacity for reason he insisted was far inferior to that of the white man.) I admire these white men for the good that they did, despite my recognition that they were so grievously wrong about African Americans and slavery.

Lee, the man, was more than a participant in the repugnant institution of slavery.  He was a great military strategist.  He was a man who sacrificed his personal welfare for what he saw as his duty to his country.  And perhaps most importantly for me, he became a significant force for reconciliation after the war.  When the government of the Confederacy collapsed and its armies surrendered, many Southerners wanted to continue the fight for slavery, on an underground, guerrilla-warfare basis.  This stubborn, “never-say-die” sentiment led to formation of the K.K.K., and to the worst atrocities of the Reconstruction era.  Indirectly, it led to the current existence of hate groups like the Nazi group that marched in Charlottesville.   In the face of such atrocities, Robert E. Lee advocated against continued resistance, calling repeatedly for reconciliation with the north, for fair and decent treatment of the freedman, respect for the law, and the putting aside of past hatreds in order to restore unity, harmony, and civility.  True, he didn’t support giving blacks the right to vote, and I fault him for that, but he lived in an era when that was viewed as an extremist position.  And if one looks at his postwar record as a whole, he was primarily a peacemaker.   A man who sought a better life for blacks and whites alike, and had to swallow a great deal of personal pride to do so.  Indeed, I think he might have been an early supporter of WMBW, had it been around in his day!  Chiefly for that reason, I count myself as a fan and supporter of the man.

And because I admire the man, I was, until recently, opposed to the removal of statues honoring him.

But what now?  Sadly, it sometimes takes jarring events, close to home, to get us to change our minds.  And in this case, I have changed mine.  Many of those opposed to removal of Lee’s statues say that removal is an affront to history.  That was my own thinking just a couple of months ago.  But now I think I was wrong.  Whatever we may think of a particular person, good or bad, is there not room to distinguish between our sentiments about the person and the reasons to erect (or take down) a statue?  And aren’t the reasons for erection or removal of a statue important?  Let’s consider, carefully and dispassionately, the possible reasons for removing Lee’s statue.

First, if Lee’s participation in the atrocities of slavery obliges us to take down his statue, it seems to me consistency would require us to demolish the Washington Monument and the Jefferson Memorial.  Can a consistent standard for statues limit us to memorializing only those leaders who were entirely free from wrong?  Limiting statues to perfect people may put a lot of sculptors out of work…

What about the assertions of Anderson Cooper, the Los Angeles Times, and others, who claim that Lee’s statue should be taken down because Lee (unlike Washington and Jefferson) was a traitor to his country?  Anderson Cooper asserted that Washington fought for his country, and Lee against his.  The Los Angeles Times’ headline said that Washington’s ownership of slaves was not the equivalent of Lee’s “treason.”  But didn’t George Washington lead an army against his mother country (England)?  And didn’t Lee lead an army that defended his homeland (the Commonwealth of Virginia) against an army that had invaded it and was bearing down on its capital?  In labeling Lee a traitor, or Washington a patriot, I believe it important to distinguish between today’s perspectives and those of the time these men lived.  Washington and Jefferson were subjects of the British Crown, and readily admitted that they were engaged in rebellion against their government, for which they’d be found guilty of treason if they lost.  Washington and Jefferson were among the rebels who embedded slavery in the Constitution in the first place.  And by the time Lee threw in his lot with Virginia, the Supreme Court of the United States had upheld slavery as the law of the land.  Today, we think of our primary patriotic allegiance as belonging to the United States, which we regard as a single, unitary country.  But our pledge of allegiance to such a unified country resulted from the Federal victory in the Civil War (which is to say, in large part, from Lee’s willingness to give up the fight to preserve slavery, and to accede to the prevailing egalitarian view).  Prior to that war, we had been a confederated union of sovereign states.  The very words “commonwealth” and “state” reflect the idea of nationhood to which patriotism was generally believed to adhere.  As but one example of the perceived sovereignty of the individual states: when Meriwether Lewis went west in the early 1800s, meeting with Native American tribes who knew nothing of the new government in Washington, he gave the same prepared speech to all of them, a speech that referred to President Thomas Jefferson as the great chief of “seventeen nations” (the number of sovereign states then comprising the U.S.).  I believe strongly that Lee’s sense of allegiance to his homeland, the Commonwealth of Virginia, was an honorable, patriotic position – more so, by far, than Washington’s taking up of arms against England.  As I understand history, it was Washington who was the traitor to his country; Lee was the dutiful servant of his.

Is the difference, then, that the 1776 war for American independence was a morally “just war,” and the war to preserve the southern Confederacy and slavery an unjust one?   A lot of historians question whether the atrocities allegedly committed by England really were sufficient to warrant rebellion against them, but assuming you think Lee’s war relatively wrong, compared to Washington’s, I see Lee’s status, in this regard, as similar to the status of thousands of soldiers whose names appear on the Vietnam Memorial.  As a people, we have adopted the principle of honoring servicemen who have fought for their sovereign government, even when the war in which they served is judged by history to have been wrong.  If we remove the statue of Lee because he served his homeland in an unjust war, what are we to do with the Vietnam War Memorial?

What about the argument that most attracted me, that to remove the statue of Lee is to rewrite history?  It’s undeniable that Lee played an important part in history, but so did Lord Cornwallis, John Wilkes Booth, and Lee Harvey Oswald.  To have no statues honoring them is not to rewrite history, nor to deny the place of these individuals in it.  It is simply to recognize the difference between preserving history and the reasons to honor praiseworthy individuals by erecting public memorials to them.  A public statue is a symbol, intended to celebrate an idea.    If we ignore what the subjects of our statues symbolize, we risk celebrating the wrong things.  So, the right question, I think, is not whether Lee was a great general, or played an important role in history, or owned or punished his slaves, or was a traitor or a dutiful servant.  The right question to ask, I would suggest, is what does his memorial symbolize?

In considering that question, I think the key consideration is that while Washington was a traitor to his country, he did fight for ours.  And while Lee was a loyal, dutiful patriot who fought for his homeland, he did fight against the unified country that arose from that war.  It is not to demean the man’s character, or his service, or history itself, to recognize that he fought to divide what has (since) become the nation to which we now pledge allegiance.  If our public memorials are intended to remind us of our public principles, then it is the principle of unity, as a nation, that seems especially in need of attention these days – not the division for which Lee fought.  I have no idea whether Lord Cornwallis owned any slaves.  And he may have been a fine and honorable man, even a role model.  But we Americans don’t erect public statues to honor him.  In one sense, Lee symbolizes the opposition to the current American government every bit as much as Cornwallis does.  I see no loss in failing to memorialize either man.

As for the many arguments in the nature of “If we remove this statue, what next?” I believe there are matters of institutional purpose to consider.  I doubt the NAACP will ever erect a statue to George Washington, nor should it: as a white slave-owner, he is inimical to the interests of that organization.  The racist Louis Agassiz’s name has, thankfully, been removed from schools named to honor him, but I believe his name properly remains on Agassiz Glacier, in Glacier National Park, because Agassiz remains respected for his pioneering scientific work on glaciers.  As abhorrent as I consider the Nazis to be, if they want to erect a statue of Adolf Hitler on their own property, they’d have the right to do so.  As for Washington and Lee, I do not believe that the college bearing their names should feel compelled to change its name or remove the statues I presume exist on its campus to remember them.  Washington saved the school with his financial support; as the college’s president, Lee greatly expanded the school; I believe the college should honor these men for that institution-specific history, and if that includes maintaining statues to both men, I support that.  In that context, Washington and Lee would symbolize, and be accorded honor for, their service to that institution.  In 1962, the United States Military Academy named one of its barracks after Lee.  I think that appropriate, because Lee was a brilliant military strategist and because he had served as that school’s commander.  And I think George Washington should (and will) properly remain on our dollar bills, and be honored in our national capital, because despite his racism and ownership of slaves, and despite his being a traitor to his sovereign country, he was still instrumental in the establishment of this country.  By this reasoning, even a statue of John Wilkes Booth might be appropriate at Ford’s Theatre.  My point is that there’s a proper role for institutional purpose in the choice of whom an institution recognizes through its memorials.  Even if we get to the point of tearing down his memorial in Washington, a statue of Thomas Jefferson will always be appropriate at Monticello, and a statue of Lee appropriate at Stratford Hall.  To remove some statues of Robert E. Lee does not require the removal of all of them, and certainly doesn’t mean to erase him from history; much depends, I think, on the institution and its purposes.

So where does that leave us?  The City Councils of New Orleans and Charlottesville are institutions, and institutions of a particular type: they have been elected to represent all their citizens.  They should be celebrating the current government (American, not British; the USA, not the CSA).  And they should be choosing memorials that symbolize the current ideals of the people they represent – the ideals of a diverse nation that has come together in peace. In these divisive times, it is as important as ever that they choose symbols of tolerance and inclusion.   By virtue of his position as opposition commander in an effort to divide the union, Robert E. Lee necessarily symbolizes opposition to the national government that won the war.  He symbolizes a divided country, one in which the north would have been free to abolish slavery as long as the south was free to continue it.  That’s not an ideal any government in the United States should want to memorialize.   It is past time to stop celebrating it, or anyone who represents ethnic, racial or ideological division.

Right or wrong, those are my views.  But this week, as I watched our president, and our news media, address the issue from opposite sides, I was struck again by the all-or-nothing positioning on both sides.  Trump and the media both talked about “the two sides” – those for, and those against, removal – sometimes as if all who opposed removal were white supremacists or Neo-Nazis.  Are we no longer capable of a more nuanced analysis?  Must every individual be vilified by association with the worst of the people on the other side?  Must people classify me as a Nazi if I utter a single word of respect for a man like Robert E. Lee, or as a liberal destroyer of history if I support the removal of his statue?

Changing people’s minds will only happen when people start listening to each other.  These days, it seems, no one is listening to anybody; people seem interested in knowing whether you’re for them or against them, and that’s it – not your reasons, not the finer points of what you have to say, or the reasoning behind it.  The scary thing is, it’s remarkably like the situation in 1860, when the polar opposites took their corners and came out fighting, leaving hundreds of thousands of casualties in their wake.  In my view, the only way to avoid a repeat of such violence is to be alert to the possible faults in ourselves, and to be willing to continue looking for the good in people even after we see the bad in them.  We have to be willing to learn from those we think are wrong.  Otherwise, I believe, we will all share responsibility for the violence to come.

So though I join the call for removing his statues from public places, I still think we can learn from Robert E. Lee.  In an 1865 letter to a magazine editor, he wrote, at the end of the war, “It should be the object of all to avoid controversy, to allay passion, give full scope to reason and to every kindly feeling. By doing this and encouraging our citizens to engage in the duties of life with all their heart and mind, with a determination not to be turned aside by thoughts of the past and fears of the future, our country will not only be restored in material prosperity, but will be advanced in science, in virtue and in religion.”

How I wish that Lee himself had been in Charlottesville last week, to make that point to all those present.   I wonder if any of those whose acts led to violence had any idea of that side of Robert E. Lee.  Or did both “sides” simply think of him as a symbol of an era in which white supremacy was the law of the land, and align themselves accordingly?

The next fight close to home will no doubt involve the statues of all the confederate generals lining Monument Avenue here in Richmond.  The very short video attached, courtesy of our local TV station, offers a message I think Robert E. Lee would have approved of.

https://www.facebook.com/CBS6News/videos/10154980841367426/

–Joe


The Top Ten Blunders of All Time

For several months now, I’ve been plagued by the thought that certain ways of “being wrong” are different from others.  I’ve wondered whether I’ve confused anything by not mentioning types of error and distinguishing between them.  For example, there are errors of fact and errors of opinion.  (It’s one thing to be wrong in thinking that 2 + 2 = 5, or that Idaho is farther south than Texas, while it’s quite another to be “wrong” about whether Gwyneth Paltrow or Meryl Streep is the better actor.)  Meanwhile, different as statements of fact may be from statements of opinion, all such propositions have in common that they are declarations about present reality.  Not so a third type of error – judgmental errors about what “ought” to be done.  Should I accept my friend’s wedding invitation?  Should I apologize to my brother?  Should we build a wall on the Mexican border?  I might be wrong in my answer to all such questions, but how is it possible to know?

Is there a difference between matters of opinion (Paltrow is better than Streep) and ethics (it’s wrong to kill)?  Many would say there’s an absolute moral standard against which ethics ought to be judged, quite apart from questions of personal preference; others would argue that such standards are themselves a matter of personal preference.  I’ve thought a lot about how different types of error might be distinguished.  But every time I think I’m getting somewhere, I wind up deciding I was wrong.

One of the ways I’ve come full circle relates to the distinction between past and future.  It’s one thing to be wrong about something that has, in fact, happened, and another to be wrong about a prediction of things to come.  Right?  Isn’t one a matter of fact, and the other a matter of opinion?  In doing the research for my recent novel, Alemeth, I came across the following  tidbit  from the Panola Star of December 24, 1856:

The past is disclosed; the future concealed in doubt.  And yet human nature is heedless of the past and fearful of the future, regarding not science and experience that past ages have revealed.

Here I was, writing a historical novel about the divisiveness that led to civil war.  I was driven to spend seven years on the project because of the sentiment expressed in that passage: that we can, and ought to, learn from the past.  (Specifically, we should learn that when half the people in the country feel one way, and half the other, both sides labeling the other stupid or intentionally malicious, an awful lot of people are likely wrong about the matter in question, and the odds seem pretty close to even that any given individual – and that includes each of us – is one of the many in the wrong.  And importantly, the great divide wasn’t because all the smart people lived in some states and all the bad people lived in others: rather, people tended to think as they did because of the prevailing sentiments of the people around them.  Hmnnn…)

Then, an interesting thing happened in the course of writing the book.  Research began teaching me  how many pitfalls there are in thinking we can really know the past.  We have newspapers, and old letters, and other records, but how much is left out of such things?  How many mistakes might they contain?  Indeed, how many were distorted, intentionally, by partisan agendas at the time?  The more I came across examples of each of those things, the less sure I became that we can ever really know the past.  I often can’t remember what I myself was doing ten minutes ago; how will I ever be able to reconstruct how things were for tens of thousands of people a hundred years ago?  Indeed, the more I thought about it, I began to circle back on myself, wondering whether the opposite of where I’d started was true:  Because the past has, forever, already passed, we’ll never be able to return to it, to touch it, to look it directly in the eye, right?  Whereas, we will have that ability with respect to things yet to come.  If that’s true, the future just might be more “verifiable”  than the past.   I get dizzy just thinking about it.

Anyway, an idea I’ve been kicking around is to ask you, WMBW’s readers, to submit nominations for the ten greatest (human) blunders of all time.  I remain extremely interested in the idea, so if any of you are inclined to submit nominations, I’d be delighted.  But the reason I haven’t actually made the request before now stems from my confusion about categories of wrong.  Any list of “the ten greatest blunders of all time” would be focused on the past and perhaps the present, while presumably excluding the future.  But I’m tempted to exclude the present as well.  I mean, I feel confident there are plenty of strong opinions about, say, global warming – and since the destruction of our species, if not all life on earth, may be at stake, sending carbon into the air might well deserve a place on such a list.  Your own top ten blunders of all time list might include abortion, capitalism, Obamacare, the Wall, our presence in Afghanistan, our failure to address world hunger, etc., depending on your politics.  But a top ten list of blunders based on current issues (that is, based on the conviction that “the other side” is currently making a world-class blunder) would surprise few of us.  It seems to me the internet and the daily news already make the nominees for such a list clear.  What would be served by our adding to it here?

My interest, rather, has been in a list that considers only past human blunders, removed from the passions of the present day.  I believe such a list might help remind us of our own fallibility, as a species.  I for one am constantly amazed, when I research the past, at our human capacity for error.  Not just individual error, but widespread cultural error, or fundamental mistakes in accepted scientific thinking.  My bookshelves are full of celebrations of the great achievements of mankind, books that fill us with pride in our own wisdom, but where are the books which chronicle our stupendous errors, and so remind us of our fallibility? How could nearly all of Germany have got behind Hitler?  How could the South have gone to war to preserve slavery?  How could so many people have believed that disease was caused by miasma, or that applying leeches to drain blood would cure sickness, or that the earth was flat, or that the sun revolves around the earth?

What really interests me is not just how often we’ve been wrong, but how ready we’ve been to assert, at the time, that we knew we were right.  The English explorer John Davys shared the attitude of many who brought European culture to the New World, before native Americans were sent off to reservations:

“There is no doubt that we of England are this saved people, by the eternal and infallible presence of the Lord predestined to be sent into these Gentiles in the sea, to those Isles and famous Kingdoms, there to preach the peace of the Lord; for are not we only set on Mount Zion to give light to all the rest of the world? *** By whom then shall the truth be preached, but by them unto whom the truth shall be revealed?”

History is full of such declarations.  In researching the American ante-bellum South, not once did I come across anyone saying, “Now, this slavery thing is a very close question, and we may well be wrong, but we think, on balance, that…”  In the days before we knew that mosquitos carried Yellow Fever, scientific pronouncements asserted as fact that the disease was carried by wingless, crawling animalcula that crept along the ground.  This penchant for treating our beliefs as knowledge is why I so love the quote (attributed to various people) that runs, “It ain’t what people don’t know that’s the problem; it’s what they do know, that ain’t so.”

My special interest lies in blunders where large numbers of people – especially educated people, or those in authority – have believed that things are one way, where the passage of time has proven otherwise.  My interest is especially strong if the people were adamant, or arrogant, about what they believed.  Consider this, then, a request for nominations, if you will, especially of blunders with that sort of feel.

Yet be forewarned.  There are reasons I haven’t been able to come up with a list of my own.  One is that, while I’m not particularly interested in errors of judgment or opinion, I’m not sure where the dividing line falls between fact and opinion.  Often, as in the debate over global warming, the very passions aroused are over whether the question is a matter of fact or opinion.  Quite likely, what we believe is fact; what our opponents believe is opinion.

The other is the shaky ground I feel beneath my feet when I try to judge historical error as if today’s understanding of truth will be the final word.   Remember when we “learned” that thalidomide would help pregnant women deal with morning sickness?  Or when we “learned” that saccharin causes cancer?  That red wine was good for the heart (or bad?  What are they saying on that subject these days?)  What about when Einstein stood Newton on his head, or the indications, now, that Einstein might not have got it all right?  If our history is replete with examples of wrongness, what reason is there to think that we’ve gotten past such blunders, that today’s understanding of truth is the standard by which we might identify the ten greatest blunders of all time?  Perhaps the greater blunder may be when we confidently identify, as a top ten false belief of the past, something which our grandchildren will discover has been true all along.…

If this makes you as dizzy as it does me, then consider this: The word “wrong” comes from Old English wrenc, a twisting; it’s related to Old High German renken, to wrench, which is why the tool we call a wrench is used to twist things.  This is all akin to the twisting we produce when we wring out a wet cloth, for when such cloth has been thoroughly twisted, wrinkled, or wrung out, we call it “wrong.” Something is wrong, in other words, when it’s gotten so twisted as to be other than straight.

But in an Einsteinian world, what is it to be straight?  The word “correct,” like the word “right” itself, comes from Latin rectus, meaning straight.  The Latin comes, in turn, from the Indo-European root reg- – the same root that gave us the Latin word rex, meaning the king.  Eric Partridge tells us that the king was so called because he was the one who kept us straight, which is to say, in line with his command.  The list of related words, not surprisingly, includes not only regular and regimen, not only reign, realm and region, but the German word Reich.  If the history of language tells us much about ourselves and how we think, then consider the regional differences in civil war America as an instance of “rightness.”  Consider the history of Germany’s Third Reich as an instance of “rightness.”  It seems we’ve always defined being “right” as a matter of conformity, in alignment with whatever authority controls our and our neighbors’ ideas.

Being wrong, on the other hand?  Is it destined to be defined only as the belief not in conformity with the view accepted by those in charge?  Sometimes I think I’ve got wrongness understood, thinking I know what it is, thinking I’m able to recognize it when I see it.  But I always seem to end up where I began, going around in circles, as if space itself were twisted, curved, or made up of thirteen dimensions.  I therefore think my own nomination for the Ten Greatest Blunders of All Time has to go to Carl Linnaeus, for calling us Homo sapiens.

If you have a nomination of your own, please leave it as a comment on this thread, with any explanation, qualification, or other twist you might want to leave with it.

I’m looking forward to your thoughts.

Joe



The Tag Line

WMBW’s tagline is “Fallibility>Humility>Civility.”  It’s punctuated to suggest that one state of being should lead naturally to the next.  Because the relationship among these three concepts is central to the idea, today I’ve accepted my brother’s suggestion to comment on the meaning of the words.

Etymology books tell us that “fallibility” comes from the Latin fallere, a transitive verb that meant to cause something to stumble.  In the reflexive form, Cicero’s me fallit (“something caused me to stumble”) bestowed upon our concept of fallibility the useful idea that when one makes a mistake, it isn’t one’s own fault.  As Flip Wilson used to say, “the devil made me do it.”

This is something I adore about language – the way we speak is instructive because it mirrors the way we think.   Therefore, tracing the way language evolves, we can trace the logic (or illogic) of the way we have historically tended to think, and so we can learn something about ourselves.  Applying that concept here leads me to conclude that denying personal responsibility for our mistakes goes back at least as far as Cicero, probably as far as the origins of language itself, and perhaps even farther.  “I did not err,” our ancient ancestors taught their children to say; “something caused me to stumble.”

I also think it’s fun to examine the development of language to see how basic ideas multiply into related concepts, the way parents give rise to multiple siblings.  And so, from the Latin fallere come the French faux pas and the English words false, fallacy, fault, and ultimately, failure and fail.  While I’ve heard people admit that they were at fault when they stumbled, it’s far less common to hear anyone admit responsibility for complete failure.  If someone does, her friends tell her not to be so hard on herself.  Her psychiatrist is liable to label her abnormal, perhaps pathologically so: depressed, perhaps, or at least lacking in healthy self-esteem.  The accepted wisdom tells us that a healthier state of mind comes from placing blame elsewhere, rather than on oneself.  Most interesting.

Humility, meanwhile, apparently began life in the Indo-European root khem, which spawned similar-sounding words in Hittite, Tokharian, and various other ancient languages.  All such words meant the earth, the land, the soil, the ground – that which is lowly, one might say; the thing upon which all of us have been raised to tread.  In Latin the Indo-European root meaning the ground underfoot became humus, and led to English words like exhume, meaning to remove from the ground.  Not long thereafter, one imagines, the very ancient idea that human beings came from the ground (dust, clay, or whatever) or at least lived on it led to the Latin word homo, a derivative of humus, which essentially meant a creature of the ground (as opposed to those of the air or the sea).  From there came the English words human and humanity.  Our humanity, then, might be said to mean, ultimately, our very lowliness.

From the Latin, homo and humus give us two rather contrary sibling words.  These siblings remain in a classic rivalry played out to this day in all manner of ways.  On the one hand, homo and humus give us our word “humility,” the quality of being low to the ground.  We express humility when we kneel before a lord or bow low to indicate subservience.  In this light, humility might be said to be the very essence of humanity, since both embody our lowly, soiled, earth-bound natures.  But our human nature tempts us with the idea that it isn’t good to be so low to the ground.  To humiliate someone else is to put them in their place (to wit, low to the ground, or at least, low compared to us).  And while we share with dogs and many other creatures of the land the habit of getting low to express submissiveness, some of our fellow creatures go so far as to lie down and bare the undersides of their necks to show submission.  Few of us are willing to demonstrate that degree of humility.

And so the concept of being a creature of the ground underfoot gives rise to a sibling rivalry; there arises what might be called the “evil twin” of humility: the perception that we are the best and wisest of the land-creatures, enshrined in the scientific name by which we distinguish ourselves from the rest of them – Homo sapiens, the wise land-creature.  As I’ve pointed out in an earlier blog, even that accolade wasn’t enough to satisfy us for long: now our scientists have bestowed upon us the name Homo sapiens sapiens, the doubly wise creatures of the earth.  I find much that seems telling in the tension between our humble origins and our self-congratulatory honorific.  As for the current state of the rivalry, I would merely point out that not one of our fellow creatures of the land, as far as I know, has ever called us wise.  It may be that only we think us so.

And now, I turn to “civility.”  Eric Partridge, my favorite etymologist, traces the word back to an Indo-European root kei, meaning to lie down.  In various early languages, that common root came to mean the place where one lies down, or one’s home.  (Partridge asserts that the English word “home” itself ultimately comes from the same root.)  Meanwhile, Partridge tells us, the Indo-European kei morphed into the Sanskrit word siva, meaning friendly.  (It shouldn’t be hard to imagine how the concepts of home and friendliness came to be associated early on, especially given the relationship between friendliness and propagation.)  In Latin, a language which evolved in one of the ancient world’s most concentrated population centers, the root kei became the root ciu- seen in such words as ciuis (a citizen, or person in relation to his neighbors) and ciuitas (a city-state, an aggregation of citizens, the quality of being in such an inherently friendly relationship to others).  By the time we get to English, such words as citizen, citadel, city, civics and civilization, and of course civility itself, all owe their basic meaning to the idea of getting along well with those with whom we share a home.

In the olden days, when one’s home might have been a tent on the savanna, or a group of villagers occupying one bank of the river, civility was important to producing harmony and cooperation among those who lay down to sleep together.  Such cooperation was important for families to work together and survive.  But as families became villages, villages became cities, and city-states became larger civilizations, we have steadily expanded the circle of people who sleep together.  (And I mean literally – my Florida-born son, my Japanese-born daughter-in-law, and my grandson, Ryu, who even as I write is flying back from Japan to Florida, remind me of that fact daily.)  Our family has spread beyond the riverbank to the globe.

Given the meanings of all these words, I would ask: how far do our modern senses of “home” and “family” extend?  What does it mean, these days, to be “civilized”?  What does it mean, oh doubly-wise creatures of the earth, to be “humane”?  And in the final analysis, what will it take to “fail”?

— Joe


My Favorite African Photo

I got back from an African safari vacation last night, very jet-lagged, having not slept for about 43 hours.  When I woke up this morning, I was anxious to start organizing the photographs from my trip.  Sitting down at the PC to do so, I found an e-mail from my erstwhile roommate, John, reminding me to send him photos of the wildlife I’d seen.  (John is an avid outdoorsman who once tried to make a living as a wildlife photographer.)  Having not yet gone through the photos myself – having not yet cropped, nor cut, nor selected any of them – I wasn’t ready to give John the full-blown “Here Are the Pics of My African Vacation” slideshow.  But I decided I’d send him just one of them, both because it was my sentimental favorite of all those I’d taken, and because I knew that John, of all people, would appreciate it.

Now, the reason John would appreciate this particular photo was not just that he’s an erstwhile wildlife photographer; almost all the photos I’d taken were of African wildlife.  But the year that John and I spent as college roommates, many decades ago, was marked by regular discussions of deep philosophical issues, and this photograph had become my favorite due to its philosophical implications – implications I felt sure John would appreciate.

As I learned on the Shamwari game preserve, most wild baboon troops in South Africa quickly run away at the approach of human beings.  But on the day this photograph was taken, I had come to the extreme southwestern tip of the African continent, a rocky, mountainous formation that rises high above sea level like the prow of a sailing ship projecting above the ocean waves.  In fact, here is a photograph – taken from the Wikipedia article on the Cape of Good Hope – which shows the general topography of the place.

View at Cape Point

(Photo by Thomas Bjørkan - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=21923168)

Naturally enough, given the impressive topography, the Cape has become a tourist attraction.  One result is that the native baboons of the Cape have lost their fear of human beings.  In the Cape Point parking lot, they were nearly as plentiful as the people, ready to pounce on anyone foolish enough to walk by with a sandwich in hand.  They were sitting on the roofs of cars.  They were scouting for half-open windows through which to steal picnic lunches.  They were on the rocks, in the bushes, outside the souvenir shop, intermingling fearlessly with us, their more advanced cousins.

I took the photograph in question – the photograph I wanted to send to John because it had become my favorite – while standing on the Cape, looking south like some fifteenth century Portuguese explorer from the bow of his ship, gazing across thousands of miles of ocean toward the south pole.  The Atlantic Ocean was to my right, the Indian not far to my left, and the Antarctic somewhere far in the distance in front of me. To my immediate left, on the summit of the mountain peak, a lighthouse had been built to guide ships rounding the Cape. Because of my fear of heights, I had not attempted the funicular or the steep climb from the funicular to the summit, but as I looked at the rocky cliff, with the triple-ocean breeze blowing into my face and the triple-ocean surf crashing into the old, unmoving rocks below, I noticed movement high up on the cliff’s stony face.  Tapping into the unconscious (but ingrained) ability of one primate to recognize the movements of another, I was drawn to it, a twitch on the horizon, a dark profile silhouetted against a bright sky.  He was maybe fifteen hundred feet away from me and several hundred feet higher than me, but I could see him settle onto one of the highest, most southerly rocks of the cliff side, clearly fixing his gaze southward, looking out over the oceans just as I’d been doing — except, of course, that he was braver and more agile than me, having dared to climb out onto the virtual bowsprit of the continent, where I would not.  I wondered why he wasn’t in the parking lot, with the rest of his kind, ready to pounce on a tourist; wondered why he had gone off on his own, to gaze across the oceans toward the vast unknown.

Like all primates, baboons are an intelligent species.  Scientists have recently discovered that they can acquire orthographic processing skills which form part of the ability to read.*  I wondered if this solitary philosopher was more intelligent than his fellows in the parking lot.  I imagined the thoughts he was having about other lands, far away.  I imagined him capable of evolving into another Bartolomeu Dias someday.  Gazing across the ocean and into the unknown, I wondered: wasn’t it possible he had seen ships pass, and wondered how he might build a ship of his own, to go exploring, some day?  I maximized the camera’s zoom and got the best picture of the contemplative creature I could.

The sight so impressed me, in fact, that for the rest of my time in Africa, I told people about it.  Last night – my first night home – I told my wife, and my daughters, and my grandson Jacob, about it.  Jacob in particular was wide-eyed as I promised to show him the photograph when he comes over this afternoon.  The profiled creature has become my hero; the photograph of him looking out across the ocean has stuck with me, and I haven’t been able to get it out of my head – more than the photographs of lions, cheetahs, elephants, and giraffes I took, more even than those of the elegant springbok herd, the pod of dangerous hippopotami, or the solitary, rare and elusive black rhinoceros.  It is my favorite photograph, despite the fact that, shot at full zoom and without a tripod, it came out slightly blurry.  It is my favorite not for its technical quality, but because of its fascinating philosophical implications.  And as I composed my e-mail to John this morning, he seemed the perfect person to appreciate those implications.

Anyway, this morning, as I composed my e-mail to John, I described the photograph I was sending him and why it had become my favorite, much as I had to Jacob last night, and much as I have to you here.  As I was finishing my description, my grandson Evan walked into the room.  I invited him to come take a look at the photograph of the contemplative baboon.  I fetched it from the camera’s SD card and displayed it on my monitor.  Evan and I shared still more deep, philosophical observations about our most intriguing subject.  Finally, after Evan departed, I embedded the photograph into my e-mail to John, as I now do here:

You can see the solitary baboon toward the top of the picture, squatting on all fours, his tail raised behind him, dreaming of building his own ship and exploring the oceans on three sides of him.

Alternatively, you can do as I did.  To wit: as I embedded the photograph into my e-mail to John, I realized that I could blow it up even larger, digitally, than I’d been able to do through the zoom setting on the camera.  With the wonders of modern technology – my virtual icon in the shape of a magnifying glass with a plus sign – I was able to enlarge the photo enough to see the image at a level of detail not revealed by the camera’s telephoto lens.  Glints of sunlight on the rock, the baboon’s tail, his haunches.  Magnifying the image even more, I thought I might even have captured the contemplative expression on the creature’s face.  But the more I enlarged it, the more the baboon’s haunches looked like a torso, his legs hidden behind the rock; and the more his tail looked like a backpack.  With a final enlargement, I could see how close this baboon had come to being able to read – he was wearing a pair of glasses.

As you’ve figured out by now, the fascinating, contemplative creature was actually a tourist, just like me (only without the fear of heights). The only baboon in the picture had been on my side of the lens.  What will I tell my grandchildren now?  (At least until now, a few of them still look up to me.)  Is that Jacob, coming up the stairs now?

Still, the photograph remains my favorite wildlife photograph.  And the reason hasn’t changed, either: although it’s still a bit blurry, the photograph has deep, philosophical implications for the species it portrays.

— Joe

*Jonathan Grainger; Stéphane Dufau; Marie Montant; Johannes C. Ziegler; Joël Fagot (2012). “Orthographic processing in baboons (Papio papio)”. Science. 336 (6078): 245–248. PMID 22499949. doi:10.1126/science.1218152.

The Rather Large Ant

Thanks to my now-37-year-old son Daniel for today’s illustration of one reason we’re so easily and often wrong.  His e-mail to his mother, in thanks for his birthday present:
“Thanks mom, I love the shorts.  We recently changed to casual dress at work and all my old ones were probably bought when I was in junior high. I’ve been meaning to upgrade but I HATE clothes shopping, so this is a truly useful gift.
“Now on to the bad part. While I do enjoy the chocolates you gifted, I think we need to make those [chocolates] in-person gifts only from here on out. They actually held up to the temperature surprisingly well; they weren’t melted at all. However…
“As I was opening the package I noticed a rather large ant on my arm. I swatted it and continued opening the package.  I saw another ant on my leg.  As I went to send it to meet its brother, from the corner of my eye I saw two others skittering along the outside of the package.
“I figured a handful of ants had decided to check the package out.  While picking through the shorts I began to realize I should have taken this endeavor outside, as one after another appeared.  But I continued methodically, ensuring no ant escaped as I carefully separated each layer of the shorts, still blissfully overconfident in my ability to handle whatever lay ahead. When I got to the final pair I was a bit stressed out – I don’t like killing things, not even ants.  But this is my house now, and I gotta let them know.
“I lift that last pair up, and I see the expected few newly-disturbed ants run across its pockets. I’m happy this whole thing is almost over.  I reach toward one of the final survivors. He loses his footing, perhaps in fear as he sees that God has now chosen him, and falls to the bottom of the expected-to-be-empty box. Only it wasn’t. There is something truly awful about a swarm of anything, and ants are no exception. The silver lining is that millipedes would have been infinitely worse. I had the pride not to scream, but I jumped back in horror and revulsion. Right now the box is sitting in the backyard, to be dealt with in the safety of daylight.
“Thank you for the shorts and nightmares, mom.
“I love you more than you know.”
Dan’s e-mail was not composed with an eye toward appearing in this blog, but I offer it (with his permission) because it seems to me an excellent illustration of how good we are at persisting in error.  Central to this phenomenon, in my view, are the roles played by focus, expectation, and confidence.  In Dan’s case, the appearance of a single ant — a “rather large” ant, in fact — created a perception that it was a solitary intruder.  On the strength of that initial perception, the appearance of three other ants caused no more than the minimally necessary adjustment to the initial theory — a “handful” of ants was in the process of checking out the box.  And having adopted a careful strategy to deal with that belief — the uncovering and execution of every single threatening intruder — his very carefulness, his focused determination to execute that strategy, led to expectation, and to confidence in the result expected.  That very focus and confidence is what blinded him to the truth.

He’s a chip off the old block, alright — and so are the rest of us, I think: confirmation bias and WYSIATI (as Daniel Kahneman calls it) — What You See Is All There Is — are cognitive traits we’ve all inherited from common ancestors.  In the world in which we live, individual ants — the things we see — seem large.  Taken individually, each new ant simply confirms what we already believe; it takes a sudden swarm, discovered too late, to shock us into awareness of the way things really are.

Humility is the mother of wisdom, I think.  So my wish for today is that we can look at every ant we come across with wonder, knowing that behind every little thing we see, there’s a great many more — some of which are far larger — that we don’t.

Looking For Heresy in All the Wrong Places

People have different views about the causes of heresy.  There are many causes, I suppose.  But have you ever considered the role mere words play in causing heresy?

By heresy, of course, I don’t mean a departure from the One True Religion.  (Not knowing what the One True Religion is, I wouldn’t know a departure from it if it hit me square between the eyes.)  Rather, I mean heresy in the original Greek sense of the word – meaning choice.  Have you ever seen two strangers who agreed right off that God exists, but who, after discussing their ideas of deity long enough, discovered areas of disagreement?  Have you ever seen how, if they talk about God long enough, those disagreements sometimes fester?  How they sometimes lead to argument and to accusations of heresy, with schism and holy war not far behind?  If so, have you wondered how much of the difference comes down to a difference in words?

Consider Jupiter and Zeus, for example.  To the ears of a modern Christian, Muslim, or Jew, the words Jupiter and Zeus likely produce similar distaste: after all, Jupiter and Zeus were gods of the Romans and Greeks; which is to say, pagan; which is to say, heathen, or false.  Anyone who worships Jupiter or Zeus today would be considered a heretic.  But the Greek word Zeus is simply a different spelling of the Latin word Deus, meaning god.  (The Greeks didn’t pronounce their word as one syllable, Zoose, as we do; they pronounced it as two syllables, Dzeh-oos.  You might even say it out loud: pronounced with historic authenticity, its relationship to the Romans’ two-syllable De-us is easier to hear.)  The word Jupiter, in turn, was sometimes spelled Diuspiter or Diuspater, and is simply a Romanized spelling of the Greek word Zeuspater.

These differences in spelling are ultimately due to differences of pronunciation.  Since the birth of language itself, differences in pronunciation have resulted from geographic dispersion.  Jack and Jill pronounce words the same way, but after Jack emigrates to the mountains, within a few generations his progeny are pronouncing things differently from the folks in the valley.  If we could roll back time, we could see that an ancient Greek pagan on bended knee to Zeuspater was worshiping the same god as the ancient Roman worshiping Deuspater – or was at least using the same words, just pronouncing them differently.  Furthermore, and most importantly, if we put pronunciation aside (along with its stepchild, spelling), we can see that both Roman and Greek alike were worshiping God the Father.  That is, even today we use the same words to describe God that the ancients did – “God the Father” – we just pronounce them differently.

Consider another example: the word Jove.  When Henry Higgins said, in My Fair Lady, “By Jove, I think she’s got it!”, his exclamation referred to the pagan god Jove, right?  As you likely know, in ancient Rome, Jove was another name for Jupiter.  (In fact, the Romans used Jovis as the genitive case of Jupiter.)  But have you ever considered how the word Jove would have been pronounced in that ancient world?  I’m asking as a student of phonetics, the way Henry Higgins studied the pronunciation of Eliza Doolittle.

First of all, the Romans pronounced their J and I like a long E.  (Julius, as in Caesar, was pronounced Ee-ooh-lee-us.)  We speakers of English sometimes pronounce our i’s the same way – like the i in media, or the second i in idiotic.  When the Romans pronounced Jove, then, the word began with the sound of our long E, or “ee.”

Moving to the second letter, the o of Jove: just as in English, the Romans had both a long and a short o, pronounced much as the long and short o’s are in English.  But while we’ve come to pronounce the word Jove with a long o, the Romans pronounced their original with a short one: the same short o sound we use in the words hot, shot, and not.  (I’ll spell that sound here as ah.)  The first two sounds of the Roman pronunciation of Jove, then, would have been Ee and ah.

Next we come to Jove’s letter v.  If you ever studied Latin, you know that the Romans pronounced their v’s like we pronounce w’s.  (Anatomically, their top teeth didn’t rest on their lower lips.)  That is, they pronounced Caesar’s Veni, vidi, vici as Way-nee wee-dee wee-kee.  They’d have pronounced the v in Jove as if it were written with an English w.

This brings us to Jove’s final e.  The Romans knew of no such thing as the English “silent e” at the end of a word.  A final e was always pronounced.  It could be long or short.  If, as in the ablative form Jove, it were short, it would have been pronounced eh, as in the English word bed.  If long, it would have been pronounced ay, like the e of pâté or the Spanish que.

Putting those four facts of Roman pronunciation together, we find that the Roman pronunciation of “Jove” would not have been anything like the way Professor Higgins pronounced it.  Julius Caesar would have pronounced J-O-V-E as Ee-ah-w-eh or Ee-ah-w-ay.  Try it yourself, if you like: say Ee-ah-w-eh out loud.  As many times as he played the recording over and over again, Professor Higgins would have had a heckuva time distinguishing the sound of Jove from the sound of Yahweh.
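For readers who like to tinker, the four rules above are mechanical enough to capture in a few lines of code.  What follows is a purely playful sketch in Python; the little rule table is my own oversimplification, good for this one word only, and no serious model of classical phonology:

# A toy transliterator for the four rules discussed above:
# consonantal J/I sounds like "ee", short O like "ah",
# V like an English W, and a final short E like "eh".
RULES = {"j": "ee", "i": "ee", "o": "ah", "v": "w", "e": "eh"}

def roman_say(word):
    """Map each letter of a Latin word to its rough English sound."""
    return "-".join(RULES.get(letter, letter) for letter in word.lower())

print(roman_say("Jove"))  # prints "ee-ah-w-eh"

Run it, and the machine sides with Caesar: ee-ah-w-eh.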

Imagine what an argument might have sounded like, when an ancient Roman and ancient Jew debated whether Jove or Yahweh was the real God the Father!

“It’s Yahweh,” says the Jew.

“No it’s not!  It’s Ee-ah-w-eh!” says the Roman.

Sadly, I think, genocides, crucifixions, and jihads have come from differences not much more substantial than that.

Which brings me to my final word for today:  Ignosticism.  It should not be confused with agnosticism.  It is a philosophic concept which maintains that it isn’t possible to “believe in God” without first having a clear idea of what the word “God” means.  It goes to the heart of the difference between saying that we know something about a subject and knowing everything about it.

To illustrate: if Jack’s God were good and Jill’s bad, we’d likely say they did not believe in the same god.  If Jack’s god created the world, but Jill’s didn’t, likewise.  If Jack’s were omnipotent and everlasting, and Jill’s wasn’t, likewise.  But how far down that path should we go?  If Jack’s god threw Lucifer out of Heaven, and Jill’s didn’t, are they the same god?  If Jack’s turned water into wine, and Jill’s didn’t, are they the same god?  If Jack’s god tells us to pray toward Mecca, or to cut off our foreskins, or not to eat meat on Fridays, and Jill’s doesn’t, can they be talking about the same God?  At what point does it make sense to feel confident that two people, each of whom shouts from the mountaintops that “I believe in God,” actually believe in the SAME god?

I mean, if you know a guy named Abe Lincoln who was assassinated in 1865, and I know a guy by that same name who is very much like yours – a lawyer, an Illinois Republican who once argued cases for the railroads, with a wife named Mary Todd, etc. – but my guy is still alive in Honolulu, then we’re talking about two different Abe Lincolns, right?  If your car is a 2014 Toyota Avalon with a beige leather interior and XM radio, and so is mine, we’re still talking about two totally different cars unless they share precisely the same Vehicle Identification Number inscribed on the body, right?  When we’re talking about real people or things, we’re used to thinking that either they share precisely the same histories, behaviors, and other characteristics, or they’re two distinct things, right?  Conceived that way, do two people ever believe in the same God?
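Programmers will recognize the distinction I’m drawing here as the difference between two objects that are structurally equal – matching in every described feature – and two objects that are one and the same thing.  A minimal sketch in Python, with a made-up Car class purely for illustration:

from dataclasses import dataclass

@dataclass
class Car:
    model: str
    interior: str
    vin: str  # the one field that makes a car *this* car

def same_specs(a, b):
    """Structural comparison: do the descriptions match, feature for feature?"""
    return (a.model, a.interior) == (b.model, b.interior)

jacks = Car("2014 Toyota Avalon", "beige leather", "VIN-0001")
jills = Car("2014 Toyota Avalon", "beige leather", "VIN-0002")

print(same_specs(jacks, jills))  # True: the descriptions match...
print(jacks == jills)            # False: ...but still two different cars

And the more attributes we insist on comparing, the harder it becomes for any two of them ever to come out equal – which is roughly the ignostic’s worry about the word “God.”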

I believe that early in human history, there were many who shared a similar feeling that the sky was like a father who lived above mere mortals and, when so disposed, fertilized the soil below.  At some point, people who shared that vague analogy – or who had experienced the same Creator in the Garden of Eden – began using the ancient equivalent of words like “father god.”  As our numbers grew, some of us moving across mountains, others sailing across seas, the words with which we shared the analogy evolved in different ways no less than the tortoises of the Galapagos.  As we focused on nuances and details, we eventually developed detailed systems of words to define our beliefs and practices.  Vaguely-defined analogies to a heavenly father became debates over methods and the time of creation; analogies to biology became arguments over the possibility of a virgin birth; analogies to fatherhood led to schisms over whether Jesus of Nazareth was a son, and if so, an “only begotten” one; and belief in a common father-god led to holy wars among peoples, all of whom believed in the One True God of Abraham, but who adopted very different traditions about which of the descendants of Abraham was an incarnation of that God, and which His mere prophet.

If I proclaim “I believe in God,” my dear mother (may she rest in peace) may rejoice for me.  At the same time, my agnostic friends may wonder what purple Kool-Aid I’ve gotten into.  Among my friends who are believers, some may express happiness at my new-found faith, but may wonder if I believe in the same god they do.  To all of them, I can only express that I am, if anything, a heretic.  I am a heretic in the word’s original sense: I have made a choice regarding the words I use to express my sense of awe and appreciation.  I choose the words I do carefully and with great reserve, because it seems to me that words get in the way. Indeed, it seems to me that unless I’m willing to say that my god exhibits all the same inherent characteristics, engages in all the same activities, and condones all the same behaviors, rituals, and forms of worship that someone else’s god does, then I don’t know what it is I’d be trying to communicate by saying “I believe in God.”  Does the assertion have any meaning if we have no common understanding of what “God” is? Ignosticism holds that it does not.

I share (with most of humankind, I think) a sense of awe at Creation, an intuition that there’s something more powerful than myself, a sense of reverence, even thanks, for that which brought all this to be.  I share with many a sense of trust that, in the end, it’s all a good thing.  But when a government form asks me what religion I am, I fear that “ignostic” will cause someone to think I mean agnostic.  If I put “heretic,” I’ll risk being ostracized by those who mean something else by that word, something that at times has caused people to be burned at the stake.  And in the end, I think that if we want to find the root causes of heresy, we may have been looking in all the wrong places.  Words, I believe, are among the root causes of heresy, because words are our own fickle creations, laden with all the nearsightedness and subjectivity of which we ourselves are made.

Even as the Curtains Rise

Tomorrow morning, Easter morning, I will choose a spot of the earth in which to lay our dear sister’s ashes.  Taking a last look at her earthly remains, I will say goodbye to those ashes I’ll never see again, a farewell that is, even now, crying out to be dramatized as ‘the last’ goodbye.  But it will not be that; for whatever place I choose will be a place I will look upon often, and where, whenever I look, I will be sure to see her.

As I sit at my computer, typing these words, thinking these thoughts, I realize this is the chair in which I sit for so many hours of the day and night.  This is the window I look through more than any other.  What I see out this window is our garden, and beyond it, in the woods, ‘Friend’s Theater,’ where children’s imaginations are loosed to a freedom full of joy.  I think how much she loved theater, ever since she was a child and had that role as the child in Madame Butterfly, and I think how much she might have liked Friend’s Theater, and how much the theater’s tag line “where friends do things for friends” is my reply to a lesson that she taught me, as much as anyone alive.

She’d have felt joy in the smiles of the children who now take the stage there, as she once did.  She’d have felt joy in the smiles of the children who dance, as she once could, and in the smiles of those who simply sit, unable to stand, as she was unable to stand, and especially in the smiles of those who simply listen, unable to speak, or simply watch, unable to hear.  For day after day, our sister taught me how to participate in the joys of others, no matter how many ways their bodies tried to keep them from their happy flights.

Yes, I think I will lay her in that place.  I can see it from my window.  It will be the perfect place for her – a place where I’ll never again think of saying goodbye, but always, again and again, a smiling ‘Hello.’  A place where I can sit with her – my little sister, my lifelong friend, my wisdom teacher – where I can share in her delight, anticipating and smiling even as the curtains rise.

— Joe


Snapshot of a Child

Readers of WMBW may know that Doctor Paul Czaja, one of our co-founders, has spent his life as a Montessori educator.  Today’s blog is something we believe he wrote in a recent newsletter to his current community at the Island Village Montessori School:

To inspire me every day now to continue on this my life’s journey as a caring servant of every child whose company I share, I have placed on my desk a photograph of me at the peak of my career. It shows my very beginning as a person smiling in the spring sunshine of the Bronx. My twin brother, Peter, and I are sitting with our Mom there on a small bench placed before the small hedged garden right in front of our red brick home. Sunshine and shadows dance on our three happy smiling faces.

We were still kids; looks like we were just two going on three years of age. I say it depicts me clearly at the peak of my career for it is obvious that I was not a “me” yet, and a sacred innocence was plainly visible in my face. This old photograph reveals very obviously that my primary sense then was my seeing – my simple yet profound perceiving the sacredness of ordinary life right there in front of me, and you can see there in my face the overwhelming happiness I was feeling for all the wonderful things right there in that now of my life as a little child.

Much later, when I had grown up and had become an existentialist philosopher and poet — still a twenty-something-year-old graduate student at Fordham University in the Bronx — I was encouraged by a professor to investigate just what the modern artists were striving to reveal to us.  I dutifully went to the Metropolitan Museum of Art in Manhattan and began to study the artistic renderings within their second floor galleries dedicated to Modern Art.  I entered one small room in which was hung a single very large painting — very simply, a beautiful blue painted canvas with one diagonal swatch of red paint boldly and dramatically there — I almost heard it as a triumphant shout!

I sat on the bench there trying to grasp what this artist wanted to share with me, and suddenly I realized that I too had made just such a painting when I was only a toddler.  I remembered crawling with a red crayon in my hand into our living room and there, with great delight, reaching out and making this upward very personal mark on the clean white wall next to the couch.  I thrilled in the seeing of this singular contrast of my red crayon line and its sensuous waxy presence there on the flat pale wall.  I do not recall if our mother was as pleased as I was, but this memory revealed in a flash just what this modern artist was wanting to convey: There is great meaning in one’s becoming a child again — there is a richness in recapturing the innocent epiphanies of first experiences — so sacred, for you are still at that time not yet a self-conscious “me” but totally open to what you are creating — to what you are communicating — to what you are actually living, sharing that moment.  The existentialist philosopher Buber observed: “What counts is to know and to believe what one experiences and believes so directly that it can be translated into the life one lives.”  That is who I was as a child — and that is what I wanted always to be, all my life.

I offer a bi-weekly seminar in existential ethics for our middle school students.  By round-table discussions evoked from selected case studies of real-life situations in which a personal decision must be made about the right thing to do, I foster an awareness in them that they each possess an innate potentiality to become a virtuous person by deciding to act kindly and not selfishly, to be wise and not foolish.  Having mastered the “Three Rs,” they are now ready, as they enter young adulthood, to discover that becoming an empowered person requires the “Three Cs”: namely, the uniquely human act of honest communication that enables the creation of the almost mystical communion in which two “me’s” wondrously become the kind of a “we” our hearts have always yearned for, and from that fruitful bonding of daily shared kindness is born a community that flourishes with the love and with the caring intelligence and with the creation of new beauty that brings true meaning to our whirling, silent, yet glorious universe.

So I meditate on the little boy that I was then when that snapshot was taken so many years ago, and I can see all these personal potentials that were already being actualized as I drank in life with my innocent seeing – and then as I turn to the children before me in this here and now, I strive all the more to join with our faculty and students in the creation of a true culture of compassion where we each come away from our daily encounters better and happier — which will be so evident by the look of kindness in our faces, by the gleam of joy in our eyes, and by the simple yet profound goodness of our greetings. Worthy of another snapshot!

— Paul Czaja
