Three Recent Docu-dramas

     Three recent docu-dramas on Netflix have left me thinking about how our species deals with wrongness. The first was a miniseries; the other two were feature-length films; all three were based on real events. Warning: plot spoilers ahead, though I’ll try to keep them limited.

     The Ripper is a four-episode miniseries about a series of murders in Yorkshire, England, in the 1970s. A woman walks alone in the wee hours of the morning, through a neighborhood of late-night bars believed to be a favorite hangout for hookers. Essentially, due to sexist attitudes that seem hard to believe fifty years later, the all-male detectives who become the series’ bumbling anti-heroes are insensitive to the possibility that a single mother in the modern world might have occasion to get a drink after work and (quite sensibly) choose to walk home rather than drive under the influence. They become increasingly convinced that a woman walking alone in such a neighborhood at that hour of the morning must be a prostitute. When other women are killed in similar ways, they realize there’s a serial killer at work, but their profile of the killer as a man who targets prostitutes becomes more than a theory. As it gets shared with the detectives’ superiors, with the press, and with the public, their careers and the authority of their superiors in law enforcement become invested in it. As the press hounds law enforcement for theories, those theories become part of “the truth known by all.” Reputations, and the public’s confidence in law enforcement, in the press, and indeed in government itself, are at stake. When evidence mounts that some victims were not prostitutes, the cops don’t abandon their profile of the prostitute killer; they theorize that the killer must have mistakenly thought the victims were prostitutes. (Someone else was wrong, not us.) Their attachment to their theory grows stronger and stronger as the murders mount, to the point that – well, there’s no need to ruin the whole plot. It’s a fascinating exposé of the power of confirmation bias, with extreme consequences we’d like to hope are rare – but which, methinks, probably aren’t.

The second Netflix offering that left me thinking is Made You Look. One promo description of it reads, “A woman walks into a New York gallery with a cache of unknown masterworks. Thus begins a story of art world greed, willfulness and a high-stakes con.” When the woman (Glafira Rosales), with no personal credentials and no proof of provenance for the artwork, walks into the ultra-prestigious Knoedler Art Gallery with a single painting by a modern art master, Ann Freedman, Director of the gallery, is naturally skeptical. But Rosales has a credible story, so Freedman submits the painting to various experts for their opinions about its authenticity. One after another, they vouch for the beauty and authenticity of the piece. Some time after Freedman sells it for a substantial sum, Rosales comes to her with another painting, which Freedman also has evaluated for authenticity, and which, after the experts vouch for it, she also sells. Over a period of some seventeen years, the process is repeated many times, until finally an evaluator questions the authenticity of a piece. An investigation ultimately results in Rosales confessing that all the paintings she has sold to Knoedler, and which Knoedler has re-sold to wealthy art aficionados for a total of eighty million dollars, are forgeries.

In its review of the documentary, the LA Times calls Freedman “unrepentant,” a description I’ll get back to in a moment.  What stands out to me about the film is the harshness of Freedman’s critics as they accuse her of complicity in the fraud.  Many of the “experts” who originally vouched for the art quickly back off (once Rosales admits the fraud), claiming that their opinions as to authenticity were never really that at all. (Their reputations as experts being on the line, they strive to explain their own words as anything but the words of people who’ve been duped.)  Meanwhile, the wealthy art lovers who paid millions of dollars for fakes are acid in their condemnation of Freedman, certain that she was in on the fraud, clamoring for her imprisonment, ruination and (one imagines, if it were possible) her permanent exile from planet Earth.  Neither the experts nor the buyers who’ve been taken in seem “repentant” for their own gullibility, but that’s understandable:  human nature fights to preserve our sense of self-worth, and if that self-worth is imperiled by evidence we were duped, we naturally seek to pin the blame on others – in this case, on the evil, cunning, inexcusable co-perpetrator Ann Freedman, whose intentional trickery must bear the responsibility for our own mistakes.

One of Freedman’s harshest critics is New York Times reporter M.H. Miller, who is repeatedly featured in the film. Miller is completely convinced that Freedman was in on the hoax. He argues his case like a prosecutor, pointing to all the “obvious” reasons that Freedman must have known the works were fake. Unlike the certainty of those who were duped, Miller’s certainty can’t be explained by the psychological needs of a duped victim. Rather, it must be explained by the persuasive power of what Miller and The Times might call objective facts. Of course, at the forefront of those facts is the eventual confession of Rosales. Miller applies his expertise as an objective news reporter with all the benefit of hindsight. He “knows the truth” – that the pieces were fakes – from the outset of his own understanding of the facts, and he condemns Freedman as complicit in the fraud, refusing to believe that she didn’t see things as clearly as he does.

I submit that when you “know the truth from the outset,” people who saw things differently along the way inevitably look blind to it. In any case, back to the L.A. Times review calling Freedman “unrepentant”: the description seems to me to suggest the paper’s implicit conclusion that Freedman was in on the hoax. On the West Coast, just as on the East, then, objective reporters with the benefit of hindsight seem to share an underlying belief in objective truth. If you’re seeking a Pulitzer Prize for investigative reporting – as journalists these days all seem dedicated to doing – is there, perhaps, an essential requirement that you believe that objective truth not only exists, but that it can and should be convincingly exposed for what it is (by you)? When one has the benefit of hindsight, clear logic and evidence must make Truth irrefutable, no? Evidence that points to any other conclusion must not really be evidence at all, if properly understood. And people who see the evidence as pointing in any other direction must, like Freedman, be guilty of wrongdoing. And therefore “unrepentant.”

But the art experts who vouched for the authenticity of the fakes defended their reputations by trying to explain away their prior words. (Unrepentant?) The wealthy but duped buyers never admitted their gullibility, defending their purchases by pointing to the reputation of the Knoedler Gallery and the sheer deceptive skill of its con-artist Director, Freedman. (Unrepentant?) In fact, Freedman is the only character in the drama who says she was duped, the only character willing to acknowledge that, despite her life’s work as an art dealer, she made some stupid mistakes. Sure, she liked the idea of being in on the discovery of previously unknown works of masters, and of bringing them to the world’s attention. If they were real, she profited; if not, she didn’t. Her self-interest and sense of self-worth therefore blinded her to the evidence of forgery. But as I see it, Freedman acknowledges as much, and no one else involved comes close to the depth of her own acceptance of responsibility. Everyone else tries to cast blame elsewhere.

     The third Netflix offering that got my attention, Spotlight, is an account of the Boston Globe’s investigation into Catholic Church pedophilia in Boston. Like The Ripper and Made You Look, it ultimately paints a picture of eventually-known truth, on the one hand, and years of widespread blindness to that truth, on the other. It, too, is a study in the human capacity for being wrong. As the extent of pedophilia in the church becomes gradually clearer, it isn’t only the Church itself that tries to cover it up. Parishioners deny it. Law enforcement denies it. Family members of victims deny it. Lawyers and prosecutors deny it. Each element of the population with a role in the drama has its own self-serving psychological influences – I won’t call them “reasons” – for being blind to the extent of the problem. I don’t think it’s a better docu-drama than the other two, but as I see it, Spotlight does us a favor. The best part, for me, is that one storyline within Spotlight deals with The Globe’s own culpability, its own past blindness to the extent of the problem. As everyone else asks how complaints could be ignored and abuse excused for so long – by others – Spotlight depicts The Globe as asking the same question of itself.

 The Globe, it seems to me, was complicit, but now, to its great credit, is portrayed as acknowledging as much. I might even say that the Globe comes across as “repentant.”  But Mr. Miller, of the New York Times, seems convinced that anyone with a brain should have seen what he sees, while the L.A. Times calls Ms. Freedman – and only Ms. Freedman – “unrepentant” when she alone has admitted to her wrongness.  Alas, I often feel that such self-righteous condemnation of the motives and beliefs of others is – how best to put it – a “sign of the times”?

All three are worth watching, both as studies of our human capacity for wrongness, and of how we respond when our ideas are tested by contrary evidence. All in all, a good mix to remind us of the ways we may be wrong.

The Broad Brush

Breaking many weeks of silence on this blog, I feel compelled to say that a conversation I had yesterday led to a question I found interesting.

My friend was upset about the attack on the Capitol – as was I. But I was also upset about the mainstream media’s coverage of the event. When this difference became apparent to us both, my friend asked which was worse – an illegal, violent and destructive attack on the Capitol, or (assuming I was right that it existed) a certain form of media bias in the coverage of the event.

My friend offered several reasons for thinking the attack on the Capitol was worse. It was planned. Premeditated. Illegal and destructive. It resulted in four deaths. It was a direct attack on the very focal point of American democracy.

I couldn’t disagree with any of that; the attack, I believe, deserves complete condemnation.  Yet I have to say that I think the particular sort of media bias I perceive is worse, if measured by the degree of threat I believe it poses to the freedoms our democracy stands for.

Before I explain, allow me to share something another friend recently sent to me: a broadsheet, self-published by one M. Carey in 1815, appealing for civility in the aftermath of the War of 1812. “The Olive Branch,” it was titled. “Faults on Both Sides.” “A Serious Appeal on the Necessity of Mutual Forgiveness and Harmony.” It contained a quote from Polybius, which I now share:

“If we pay a proper regard to truth, we shall find it necessary not only to condemn our friends upon some occasions, and commend our enemies, but also to commend and condemn the same persons, as different circumstances may require.”

I have rarely encountered a piece of wisdom I value more.  Yet I am hard-pressed to find evidence of its survival in the world today.

Polybius went on to explain:

“For just as it is not to be imagined that those who do great things should always be pursuing false ideas, so neither is it probable that their conduct can always be exempt from error.”

When was the last time I heard a politician, or a news anchor, or a Facebook poster, condemn any aspect of something they favor, or some person with whom they are allied?  When was the last time I heard anyone commend any aspect of an opponent’s behavior?  I fear we have lost the wisdom of Polybius.

To forestall any charge that I myself fail to commend the good deeds of my enemies, permit me to depart from past practice and state directly a few of my beliefs about Donald Trump.

  • Personally, I approve of his picks for the Supreme Court and his choice of William Barr as his attorney general; I think efforts to discredit those appointees have often been politically motivated and disingenuous. 
  • Personally, I think Trump may have been right to leave so much of the fight against the COVID virus to the states; not that his approach didn’t cause problems, but because I fear that centralized control out of Washington may have created even worse problems.  Ultimately, I think it’s very hard to assess how things might have played out with a more federally-mandated approach.
  • Personally, I think Trump was right when he said, about the tragedy in Charlottesville, that there were good and bad people on both sides, a statement for which he has been widely criticized.
  • Personally, I think it perfectly acceptable that the U.S. Census would ask respondents to indicate whether they are citizens of the United States, another position for which Trump has been widely criticized.

Having voiced such approvals for some things Trump has done, am I now anathema to Trump-haters?   Do I risk being attacked by them as “the enemy”?  Sadly, I think I do. 

Sadly, therefore, I think I need to add my opinion that Trump has proved to be the worst President the country has ever had.  I think I need to add my opinion that while I might not support his impeachment for incitement to riot, I would support his impeachment for divisiveness, and especially for being derelict in his duty to facilitate an efficient transition of power.  Personally, I am thankful and relieved that in just a few more days, we will finally have someone else as our President.

But now, of course, I will be attacked by Trump’s supporters because, as one Trump supporter recently told me, to express such views proves that I have been “duped” by the liberal media. 

The wisdom of Polybius – that in the search for truth, we have to be able to find both good and bad in the same people – seems unacceptable today, on both sides of the political divide. 

In my view, Mr. Trump’s arrogance has facilitated the growth of an arrogance among his opponents, an attitude I’ve taken to thinking of as “liberal McCarthyism”: the effective blackballing of anyone who questions liberal wisdom. As I see it, many liberals have come to behave like sharks aroused by the presence of blood, with carte blanche to criticize everything about Trump and to dismiss out of hand anyone who agrees with him about anything, because the man is so obviously wicked that agreeing with him about anything proves you’re wicked too. Anyone who might ever have supported or agreed with him on any point, no matter how small, is now responsible for every harm the man has ever caused. In the last several days, there’ve been calls to expel from Congress those duly elected representatives who dared to object to the votes in the electoral college. There’ve been repeated references in the mainstream media to the “violent mob of 40,000 Trump supporters” who participated in the rally in Washington, because some of them attacked the Capitol. Last night’s local news reported that a man was fired because of his participation in the pro-Trump rally, the explanation being that “participating in the rally” was inimical to the organization’s values.

Where would we be if every outbreak of illegal acts by Vietnam war protestors caused the mainstream media, and the country, to dismiss the views of the thousands who protested peacefully?  Where would we be if the riots of 1967 had caused the country to turn its back on the civil rights movement? 

Thankfully, the media have gone to great lengths to point out that the destruction and looting that have occurred during otherwise peaceful Black Lives Matter protests should not be attributed to the many who had nothing to do with such behavior. But in the past several days, I’ve heard anchors, field reporters and the so-called experts they’ve interviewed decry and condemn the “mob of pro-Trump protestors” who gathered in Washington. What has happened to the distinction between the peaceful majority and the destructive few? The mainstream media and many anti-Trumpers seem, from my perspective, to be caught up in a frenzy of condemning all pro-Trumpers, painting them all with the broad brush of riot, destruction, treason and death.

“No,” they will say to me. “You’re wrong.  Don’t you understand that to have anything to do with the protest in D.C. was to lend support to the violence?”

“No,” I will answer.  “I don’t understand that at all – and there was a day that you didn’t either.”

So, back to the question I find interesting:  Which is worse, the abhorrent, intentional, destructive attacks by those who broke into the Capitol, or the condemnation of everyone who is of a particular opinion, or who participates in a rally in support of that opinion, because of the abhorrent acts of a few?

My own opinion on the question flows from my assessment of the degree of threat posed by the two wrongs for the future of democracy. 

Whatever the number of people who broke into the Capitol Building, their vile and despicable behavior actually worked to their disadvantage. As a direct result of their conduct, Republicans who’d planned to object to electoral votes decided not to do so. Some of them have been arrested and charged. One can hope that all will be punished. But as I see it, their actions never posed a threat to the future of the country as a whole; in hindsight, they did more to unify the country against them, their cause, and their criminal manner of protest than anything Chuck Schumer or Nancy Pelosi has ever done.

Meanwhile, however, it seems to me that the mainstream media and much of the country are now condemning “the mob of pro-Trump protestors who gathered in Washington” for engaging in the worst sort of conduct. At least one commentator specifically put the blame on “all 40,000” of them. Expel a congressman for objecting to a ballot? Label participation in a peaceful protest “treasonous”? Say that all those who went to Washington to participate in the rally were responsible for the tragic deaths that occurred? Fire people because their attendance at the rally is not consistent with your organization’s values? I’ve heard every one of these assertions made, some repeatedly – and not by deranged fanatics, but by the so-called “experts” and respected news media types who guide much of American public opinion.

     If even a million people feel that way – much less fifty or a hundred million, as I fear – then in my view, because of the prevalence of their attitude and their allies’ acceptance of it, they constitute a greater risk to the country’s future than several hundred widely-condemned lawbreakers. That many people, painting with that broad a brush, broadcasting country-wide and enjoying each other’s approbation, potentially empower and nourish each other. I already hesitate to express some of my opinions publicly, because I fear I will be attacked and demonized for entertaining them. For the past twenty years or more, I’ve been growing ever more left-leaning in my views; I’ve been what Democratic candidates consider a key swing voter, one who might well be persuaded to vote for them – and indeed, increasingly, I have. But at times like these, I fear a new McCarthyism of the left, one that doesn’t welcome me for the many issues on which I agree with them, but condemns me for the few respects in which I don’t conform to their agenda. As a result, I feel pressured to take refuge on the right.

Whether or not Mr. Trump attempts some sort of ongoing role in public life, he is all but gone as our president. His departure marks the time to search for ways to heal the divisiveness he helped to create. Painting with a broad brush, condemning the many for the conduct of the few, is the fodder on which divisiveness can only continue to feed and grow.

Polybius understood that if we really want truth, we have to recognize that our friends are capable of wrong, and our enemies capable of good.  We have to stop thinking “you’re either for us or against us.”  We have to put away the broad brush of condemnation for the abhorrent conduct of the few.   Or at least, if we feel forced to paint with that broad brush, we need to recognize how much we contribute to divisiveness, how many in the middle we may drive into the opposing camp, and how much we, too, may share blame for the next death that occurs.


After feeling profoundly embarrassed for our country since election day, I am feeling thankful today. First, for the fact that much of the Earth is still green and the sky still blue; second, for my family and friends; and last, for the privilege of living in a country where most elected officials graciously submit to rule by those they disagree with.

Four Billion Years

Most of you have probably recognized my obsession with matters of scale – the idea that our brains are wired to prevent us from being able to grasp orders of magnitude.  Whether it be atomic particles measured in Planck lengths, galaxies measured in light years, or even our difficulty estimating the number of grains of sand on a beach, I’ve been pondering the impossibility of genuine comprehension. Can we only deceive ourselves into thinking we can comprehend such numbers and distances? 

As a child visiting the American Museum of Natural History in New York, I was impressed by the timeline of life on Earth laid out as a wall mural, and by the fact that mankind occupied such a small section of it, compared to the history of all life. But then I noticed that the scale had been changed to give humanity more space than its proportionate share, as if children couldn’t handle the relative insignificance of humanity if they saw the real thing. I don’t know whether that mural has been redone since the 1960s, but the Museum still offers for sale what it calls “The Big History Timeline Wallbook” for $19.95. Its website describes the six-foot-long timeline as “divided into 12 sections covering both natural history as well as the history of human civilizations.” As I see it, giving human civilization one twelfth of a timeline of all existence is bound to leave kids with an inflated sense of the importance of humanity.

I’ve been trying to make things (spreadsheets, videos, books, diagrams) which don’t employ collapsing scales in order to appease our need to see the importance of ourselves. 

For example, I recently tried to get a feel for the height of Mt. Everest compared to the size of the planet as a whole. I’d seen plenty of diagrams in books and on the internet showing mountains on the surface of the Earth, but whenever the whole Earth was represented, the images were labeled with the caveat, “Not Drawn to Scale.” The reason, of course, is that the diameter of the Earth is so much greater than the height of Everest that if an image of the Earth were to fit on one page of a book, Mt. Everest would only be as high as a human hair is wide. Such problems drive illustrators to resort to deception. If the people at the American Museum of Natural History didn’t resort to collapsing scales, their six-foot-long diagram of history since the big bang would leave less than two feet for the history of all life on Earth, about 1/1,000th of an inch for the history of Homo sapiens, and about 1/100th of that for what we think of as human “civilization.”
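For readers who want to check the fixed-scale arithmetic, here is a minimal sketch. The durations are rough, commonly cited figures of my own choosing (13.8 billion years since the big bang, four billion years of life, roughly 200,000 years for our species), not the museum’s numbers.

```python
# A fixed-scale, six-foot timeline of everything since the big bang:
# how many inches would each era actually get?
TIMELINE_INCHES = 72.0      # the six-foot wallbook
UNIVERSE_YEARS = 13.8e9     # time since the big bang (rough figure)

LIFE_YEARS = 4.0e9          # history of life on Earth (rough figure)
SAPIENS_YEARS = 2.0e5       # Homo sapiens (one traditional estimate)

inches_per_year = TIMELINE_INCHES / UNIVERSE_YEARS
print(f"all life on Earth: {LIFE_YEARS * inches_per_year:.1f} inches")
print(f"Homo sapiens: {SAPIENS_YEARS * inches_per_year:.5f} inches")
```

At this scale, all of life gets about twenty-one inches, and our entire species about a thousandth of an inch.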

Every picture of Earth’s cloud-studded atmosphere I’ve seen has made it seem to extend quite far above the surface of the Earth – far higher, say, than Everest. And we’ve all wondered why, as the Earth hurtles through space at sixty-seven thousand miles per hour, its atmosphere isn’t torn to shreds. But when I looked, I couldn’t find an image showing the thickness of the atmosphere compared to the thickness of the Earth. It occurred to me that an Excel spreadsheet might be a handy tool for giving me a better understanding of what I wanted to grasp. So I made such an image myself, using Excel. Starting at the center of the Earth, I let each row of the spreadsheet represent one kilometer, dragging the increment down the page the requisite number of kilometers. The image so drawn enabled me to “see” the picture that resulted – but only sort of. Holding my finger down on the scroll bar continuously, the kilometers from the Earth’s center to its surface sped by in a blur that lasted about twenty seconds. Then, suddenly, Everest and the atmosphere flashed by – so fast I couldn’t see them at all.
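The proportions behind that blur can be sketched in a few lines. The one-row-per-kilometer scheme and the twenty-second scroll come from the description above; the 100 km “edge of space” is my own assumption (the Kármán line, one common definition), not a figure from the spreadsheet.

```python
# One spreadsheet row per kilometer, from the Earth's center outward.
EARTH_RADIUS_KM = 6_371     # mean radius, rough figure
EVEREST_KM = 8.8            # height of Everest above sea level
KARMAN_LINE_KM = 100        # one common definition of "edge of space"

SCROLL_SECONDS = 20         # observed time to scroll past the whole radius
rows_per_second = EARTH_RADIUS_KM / SCROLL_SECONDS

# Everything above the surface -- Everest plus the atmosphere -- scrolls by in:
above_surface_seconds = (EVEREST_KM + KARMAN_LINE_KM) / rows_per_second
print(f"{above_surface_seconds:.2f} seconds")  # roughly a third of a second
```

Twenty seconds of blur for the rock, a third of a second for everything we live in.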

Never again will I forget that our atmosphere is a thin film barely coating the surface of the planet. The Earth’s hurtling through space has about as much chance of disrupting our atmosphere as a pitch toward home plate has of wiping the grass stains off a baseball.

In any case, a couple of years ago, I became determined to figure out a way to represent the history of life on Earth, giving mankind no more than our due space within the whole.  I wanted a timeline that did not vary in scale.  I wanted to express the place held by humanity not only with the “knowing” that our brains are capable of, but through the more real “experiencing” that I believe gets us closer to genuine understanding of very large numbers.  How might I use an experience to enhance my intellectual understanding?

One early idea was to use ping pong balls threaded together in a very long string, with a person having to actually walk the length of the string to experience how long it would be.  I wanted each ping pong ball to represent a length of time that a human being might actually understand.  I thought perhaps a thousand years – the time since William the Conqueror – might be such a number, to be represented by a single ping pong ball.  The math was easy – at that scale, I would only need four million ping pong balls to represent the four billion years of life on Earth.  But the idea of someone walking the length of the string came to an end when I realized that, leaving a 10 mm space between each 40 mm diameter ball, a string of four million such balls would be more than a hundred and twenty-four miles long.   Few people would be willing to walk such a length to “experience” that amount of time, so I felt compelled to change my approach. 
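The arithmetic that killed the ping pong ball idea is easy to check; the sizes below are the ones given above (40 mm balls, 10 mm gaps, one ball per thousand years).

```python
# One ping pong ball per thousand years of the ~4-billion-year history of life.
YEARS_OF_LIFE = 4_000_000_000
YEARS_PER_BALL = 1_000
BALL_MM, GAP_MM = 40, 10            # 40 mm ball plus a 10 mm space

balls = YEARS_OF_LIFE // YEARS_PER_BALL      # 4,000,000 balls
length_mm = balls * (BALL_MM + GAP_MM)       # 200,000,000 mm = 200 km
length_miles = length_mm / 1_609_344         # millimeters per mile
print(f"{balls:,} balls, about {length_miles:.1f} miles of string")
```

Two hundred kilometers of string – about 124 miles – just to walk past the history of life a millennium at a time.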

Marbles would be too tough to string together.  Lead fishing weights would be smaller and easy to crimp onto a long fishing line. But the combined weight of four million fishing weights became a prohibitive factor. 

So I turned from ping pong balls and fishing weights to the idea of making a sound or video recording. But it quickly became apparent that an hour-long video couldn’t express the concept meaningfully. What if such a video counted off the history of life on Earth in increments of a thousand years each? It would be pointless if the numbers on the screen were simply a blur; each would have to be visible for, say, half a second, even to be recognizable. But even at that pace – two thousand years per second – the four million numbers would take more than three weeks of continuous, around-the-clock watching; and a video that counted off single years at that pace would run for over sixty years. The video idea went the way of the ping pong balls.
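The viewing-time arithmetic works out as follows, assuming the half-second-per-number pace suggested above.

```python
YEARS_OF_LIFE = 4_000_000_000
SECONDS_PER_NUMBER = 0.5    # each number visible for half a second

# Counting in thousand-year increments: four million numbers to show.
millennia_seconds = (YEARS_OF_LIFE / 1_000) * SECONDS_PER_NUMBER
print(f"thousand-year steps: {millennia_seconds / 86_400:.0f} days of nonstop viewing")

# Counting every single year instead: four billion numbers.
yearly_seconds = YEARS_OF_LIFE * SECONDS_PER_NUMBER
print(f"single years: {yearly_seconds / (86_400 * 365.25):.0f} years of nonstop viewing")
```

Thousand-year steps make a twenty-three-day video; single years would take more than a sixty-year lifetime of continuous watching.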

The more ideas I considered, the more I felt myself inching closer to understanding the length of time involved, but the more it seemed that real understanding of such huge durations was impossible, as a practical matter.  We can say the words, “Life has been on Earth for four billion years,”  and we can multiply and divide numbers, but does any of that mean we can really appreciate how long four billion years is? By writing characters on a blackboard, we can manipulate decimal points and zeros to deceive ourselves into a false sense of “understanding” such large numbers.  We have also developed whole systems of math that make perfect logical sense of the so-called “imaginary” numbers like the square root of negative two.  We make calculations and even practical use of them, but none of us really “understands” them. I think big numbers are the same.  Math helps us deceive ourselves into a feeling of understanding when no real understanding exists. 

Still, we probably get by more efficiently in this real world precisely because our brains can’t comprehend such immensity. 

I remain keen on distinguishing between mere intellectual “understanding” (aka self-deception) and true experiential understanding. Obviously, we can’t actually experience a duration as long as the four-billion-year history of life on Earth. But try counting down that history, even a million years at a time, from four billion years ago, and see how far you get. “Four billion. Three billion, nine hundred and ninety-nine million. Three billion, nine hundred and ninety-eight million. Three billion…” Those four thousand numbers would be like singing yourself all the way through “100 Bottles of Beer on the Wall” forty times in a row. If we can’t bring ourselves even to say the numbers, even counting a million years at a time, then how can we think we really understand them?

In any case, while my string of ping pong balls will never see the light of day, my effort to put the life span of humanity into perspective has finally resulted in something real – a book titled “It’s Been Four Billion Years: The Story of Life on Earth a Million Years at a Time.”

It’s at the printer’s now, and will be available from online retailers in September. The retail price is $19.95 – the same as the History Museum’s deceptively skewed timeline. (Needless to say, my book maintains a constant scale, all the way through.)

The best part of it is that in appreciation for your having subscribed to this blog, I am willing to send you a free copy when they come out next month.  If you send me an e-mail with your snail-mail address, I’ll be able to send you one.

In the meantime, I’m thinking of doing another book, trying to express extremes of distance and size….

Until then, peace to you all.

The Meaning of Large and Small

I continue to have difficulty comprehending the very large and the very small.

Yesterday, thinking about the word “small” itself, I got to wondering what I mean when I call something small.  I wondered how I would phrase a definition of the word, if I were assigned the task of creating one for a dictionary.  For example, I could say “small” means the same as “little.” But what would that add, say, to the understanding of someone who spoke only Chinese, or Martian?  My dictionary in fact defines “little” as “small in size.” Could I define “small” other than by simply using an English synonym for it? If you’re a word nerd like me, you might try doing this yourself. If you do, I’d be interested in hearing what you come up with. 

I think my big American Heritage Dictionary (Houghton Mifflin) struggled with the same problem. In that dictionary, as noted, “little” is defined as “small in size,” while “small” is defined as “being below the average in size or magnitude.” Fair enough, I thought, until I considered some other definitions in the same book, where “size” is defined as “the physical dimension, magnitude or extent” of something, “magnitude” is defined as “greatness in size or extent,” and “extent” is defined as “the range, magnitude or distance over which a thing extends.”

Considering all these definitions together,  I imagined my Martian visitor persuaded that abstractions like “small” and “little” mean the same thing, but having no idea what that is. When the words are only defined in terms of each other, how can anyone tell what they really mean?

Though I felt I was going in circles, I kept trying.

“Great,” I learned, is “Very large in size,” while “large” is “of greater than average size or extent.”  So great means large and large means great.  Great! But if I didn’t already have an idea of big and small, where would that get me?

Of course, linguists have long recognized this circularity of language.  The problem isn’t just defining “small” by using a synonym like “little.”  It’s more general than that, and it ultimately comes from the absurdity of trying to define words using other words.  If we want to define what a word means by saying that word A is equal to words B, C, and D, the problem is that no matter how many words we go through, every set of words becomes equivalent to nothing but other sets of words.  B, C and D are defined by E, F, and G, and those by H, I and J, but H, I and J are defined by A, B and C.  Even in a language of 50,000 words, that fixed set of things is limited – a closed loop, explainable only by itself.  Every word, sooner or later, can only be defined by reference to itself or to words that it has helped to define.  And in any such closed system, entropy sets in.
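For readers who like to see things made concrete: the closed loop can be sketched as a tiny directed graph of definitions.  The mini-dictionary below is entirely made up for illustration (it is not the actual American Heritage text), but following any word’s definition eventually leads back to the word itself.

```python
# A made-up mini-dictionary: each word maps to the words used to define it.
# (Illustrative only -- not the actual American Heritage definitions.)
defs = {
    "small": ["little"],
    "little": ["small"],
    "large": ["great"],
    "great": ["large"],
}

def definition_chain(word, steps):
    """Follow the first defining word repeatedly, recording the path taken."""
    path = [word]
    for _ in range(steps):
        word = defs[word][0]
        path.append(word)
    return path

print(definition_chain("small", 4))
# → ['small', 'little', 'small', 'little', 'small']
```

However long the chain, it never escapes the loop: every word is eventually defined by a word it helped define.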

The definitions of “small” and “large” above both  make use of the concept of “average,” which might seem helpful, because “average” is a concept which takes me from the world of words into the world of mathematics.  If small is less than average and large is greater than average, then that should prove helpful – provided I know what “average” means.  But what do I mean by “average”?

My mathematical concept of “average” requires a finite set of numbers to consider.  I can say that the average of two 12’s, one 17 and one 19 is 15, but only because I know how many of each number I have for my calculation.  I’m dealing with a known, fixed, quantifiable set.  I might even be forgiven if I say that the average (in size) of one golf ball, one tennis ball, and one soccer ball is (more or less) a baseball, because, again, I’m dealing with a known set of data.  But what data set — what objects, and how many of them — should I use to compute an average, on my way toward understanding that small is below average, and large is above average?  The average size of all things?  If I take the smallest things I know, like quarks, and the largest, like the whole universe, don’t I still need to know how many quarks there are, and how many stars of various sizes, before I can compute an average size of things, and therefore know what it means to be above or below the average of all things, and therefore, inherently large or small?

Meanwhile, in order to count my dictionary, say, toward the average size of all things, do I count it as a single thing, about 14 inches in length and weighing a few pounds, or as a thousand smaller things, called pages, or several billion even smaller things, called molecules?  Is my car just a single car, or is it an engine, a body, a chassis, and four wheels?  Obviously, if I count myself as one person of my size, I have a very different impact on the “average” of all things than if I count myself as a few billion cells of far smaller size.  With such questions pervading every thing and every size, I submit, it is impossible to formulate a data set capable of yielding any meaningful concept of an “average” in the size of all things — yet Houghton Mifflin has no problem saying that small things are things smaller than “average,” and large things larger.

(By the way, I submit that it makes no difference if we think in terms of medians.  Using medians, I suspect our calculation would yield something only slightly larger than a quark, and virtually everything else would then be considered very, very big by comparison.  And if we used the halfway point between the size of a quark and the universe, we’d get something half the size of the universe, and everything else would be very, very small.  Can our feeling that we understand what’s big and what’s small be so dependent on different mathematical ways of thinking about averages?)
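For the mathematically inclined, this sensitivity to the choice of statistic is easy to demonstrate.  The numbers below are purely illustrative stand-ins (a thousand “quark-scale” things and one “universe-scale” thing), not real physical measurements:

```python
import statistics

# Illustrative stand-in sizes: a thousand tiny things plus one huge thing.
sizes = [1e-18] * 1000 + [1e26]

mean = statistics.mean(sizes)      # dominated entirely by the one huge value
median = statistics.median(sizes)  # sits among the tiny values

print(f"mean: {mean:.3g}, median: {median:.3g}")
```

Judged against the mean, everything but the one huge item is very, very small; judged against the median, nearly everything is very, very big.  Same data, opposite verdicts.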

Pulling out that big dictionary again, I wonder, what makes it big?  At first glance, it doesn’t seem nearly as big as my car, yet I call it big while I call my car small.  Surely, I mean that my dictionary is big because it has greater magnitude – more pages, and more words — than other things to which I tend to compare it (roughly speaking, those other things I also call books).  I call my Toyota  small because it has less trunk space and passenger seating than my daughter’s SUV.  Could size, then, be a concept that is relative?  It seems so – but relative to what?

I find this last question intriguing.  I think a book big when I compare it to other books, and a car large (or small) when I compare it to other cars.  That concept of relative size seems easy. But if I think for a second that a star can only be thought large in comparison to other stars, I quickly retreat from my relativistic comfort zone.  Surely  stars are always, and absolutely, larger than books, and surely  books are always, and absolutely,  larger than quarks.  If so, surely there is something about size that is not relative to its own group of similar objects – something absolute which enables me to feel quite strongly  that one group of things is inherently larger than some other group of things.   And so, once again, I’m back to square one, wondering what makes one thing large and another thing small.

In desperation, I consult the dictionary again.  This time, instead of “large” or “small,” I look up the word “big.”  (After all, what could a big dictionary be better at defining?)

“Big” is defined by the folks at Houghton Mifflin as “Of considerable size, magnitude or extent; large.”  Size, magnitude, extent, large – nothing new here.  Big is large, and large is big.  For a moment, I’m disappointed.  But wait.  (There’s more!)  I look up “considerable.”  The first definition of “considerable” is “large in amount, extent, or degree.”  (Arghhh!  Large means considerable, and considerable means large.  I feel like I’ve been here before.)  For a moment, I consider looking up the new words “amount” or “degree,” but I decide that effort won’t likely be useful.  Then my eyes fall on the second definition of “considerable.”

“Worthy of consideration.” 

Ah!  We’ve left the world of physical dimensions for some place outside the closed loop of size words.  Am I finally on to something?  I look up “worthy.”  I find, “Useful.  The quality that renders something desirable, useful, or valuable.”

I think I’ve found the answer I’ve been looking for.  Something is “considerable” if it is worthy of consideration, and it is worthy of consideration if it is useful.  Size is indeed relative, but relative, primarily, to what I find useful.

I recently watched Season Six of the survival series “Alone.”  (Synopsis: Ten people competing to survive in Ice Age conditions.) In that world, a moose was important, both because, unlike a squirrel,  it could kill you, and because, if you could kill it, it could feed you for a very long time.  The series contestants considered thirty-pound Northern Pike or lake trout more valuable than ten-pound rabbits, which were in turn more valuable as food than even smaller things like mice.  The closer in size something was to the contestants, the more nutrition it brought.  The more “worthy of their consideration” it was. 

The contestants on “Alone” embraced the value of living as our primitive ancestors did, and I find myself reflecting that it was this ancestral way of life that shaped our species’ understanding of words like “big” and “small.”  Pigs and cows and grizzly bears were more important than, say, mosquitoes.  As human beings evolved, those who paid most attention to things about their own size — things between, say, the size of spiders and mastodons — survived and reproduced better than those who paid attention to grains of sand or the vastness of the Universe.  I conclude that, as we generally use the terms “small” and “large,” absent a context which suggests a different relative comparison (a car being small compared to other cars), the default meaning is not really relative to some incalculable “average,” but relative to ourselves.  That is, smaller or larger than we are.  I myself create my sense of the “average” size of things.  Things smaller than me are small, and things larger than me are large.  Things are large or small relative to me.  And from an evolutionary perspective, it is the things closest to my own size that are (subjectively) important to me.

But are pigs and grizzly bears really more important than mosquitoes, objectively?  Exploding supernovae and supermassive black holes are not only extremely large.  Astronomers and cosmologists now tell us that if it weren’t for them, we wouldn’t exist, as they create the very elements from which we’re made.  Those who study life’s origins tell me all complex forms of life began when bacteria became essential parts of our cells, so we wouldn’t exist were it not for bacteria.  And the importance of bacteria is not just historical.  If, today, things like plankton and bacteria stopped being available as food for larger things, moving up the food chain, we’d have nothing to feed on ourselves.  And all the time, quantum physicists remind us that without things as small as quarks, we wouldn’t exist either.

So it isn’t really true that lions, tigers and bears are most important to my existence.  Nor were they, in fact, most important to our ancestors’ existence.  From an evolutionary perspective, we succeeded by paying attention to things our own size, not because such things were more important to us, but because we could actually do something about them if we paid attention to them.  Evolution proved that paying attention to them was useful to our survival.

But if the issue is usefulness to me – whether I can put my understanding of something to use, to help me eat it or to keep me safe from it – which should I consider more worthy of consideration, more considerable in size, to my current life in the 21st century – a black hole, or a virus?  If the answer is that I can do more with my understanding of viruses than I can with my understanding of black holes, why do I think a black hole more worthy of my consideration – more considerable in magnitude – bigger – than a virus?

Our notions of smallness and bigness come from a time in our past when we could not deal effectively with things far smaller or larger than ourselves, a time when things our own size – the moose, the cow and grizzly bear – were most worthy of our consideration.  We could not concern ourselves in those days with virology or pandemics, with things as small as molecules of CO2 or as large as ozone layers or the acidity of the oceans. Thinking about viruses rather than grizzly bears would have been fatal in those days. But such things, both the very large and the very small, are beginning to enter our sphere of influence.  As science continues to broaden our understanding of the world, our ability to prosper (or not) in the face of things we previously thought too large, or too small, to matter, changes. Is it time, now, to revise our thinking about the meaning of words like “large” and “small”?

A Season for Everything

     Hunkered down now, I think I’m like most of us these days: nervous, on edge, and mindful of worst-case scenarios.  My own playlist seems stuck on the last days of Pompeii, the last days of the dinosaurs, and the last hours of 1999, when we took one last deep breath of life before experiencing Y2K.  Each tells me a lot about the dangers of predicting the future.

    I’ve spent much more time trying to understand the past than the future, and that habit has led me here, writing about how we may be wrong.  Whether it’s an effort to understand what life was like in ancient Egypt or what my wife said to me just five minutes ago, I am constantly reminded how hard it is to reconstruct the past.  The past has a way of slipping through our fingers, gone forever, impossible to revisit in order to test it, or measure it, or take any more photographs of it, leaving us with only the scattered few relics that somehow found their way into our attics.  I’ve often thought that in one sense, at least, it’s actually easier to predict the future.  If we say that the world will end tomorrow, that’s something we can actually test.  When tomorrow comes, we can not only agree upon, but know, with relative certainty, whether we were right or wrong.  The past is not so easy.

     But whether we’re looking forward or backward, we can’t know, now, if we’re right about the conclusions we reach.  Predictions about upcoming election results, about stock market performance, about the future course of global pandemics, can only be based on comparable situations in the past.  We extrapolate from the known we’ve experienced to the unknown that looms ahead.  But in so doing, we assume a repetitiveness that may be misleading, especially when our ideas are based on the experience of mere lifetimes (like the surprised citizens of Pompeii), but even when they’re based on a broader historical record (like those among the dinosaurs who’d studied the Cambrian explosion – I imagine them sitting around, contemplating how far and well they had come since those days, at the moment the asteroid hit).  Predicting the future always carries with it a bias in favor of the past, and past experience is very poorly suited to predicting the unprecedented.

     Y2K teaches us that doomsayers may be wrong.  The eruption of Vesuvius that wiped out Pompeii and the Chicxulub impactor that wiped out the dinosaurs teach us that calamity may strike even when no one’s predicting it.  It’s too late to hope that COVID-19 will be the dud that Y2K turned out to be.  There’s still time to hope it won’t be the end of life as we know it.  It is, of course, a time for diligence, not panic.  But within all the precautions we take to fight this invisible enemy, I like to remember that the poor souls who died at Pompeii would have been dead for nearly two thousand years now anyway, even if Vesuvius hadn’t erupted.  And even more, I like to remember that if an asteroid hadn’t wiped out the dinosaurs, mammals would never have thrived, and humanity never have existed.  From our limited perspective, the Chicxulub disaster was the best thing ever.  And from the perspective of those who will inherit this planet from us – the ones we often say we care so much about – we just don’t know how they will view the pandemic of 2020.  Perhaps they’ll see it as the beginning of great new things.

     It is in that spirit that while I hunker down at home, wiping off door handles with my sanitizer, wondering if it would do any good to start praying again, I remind myself that I will be dead two thousand years from now, one way or another, and that perhaps the demise of us baby-boomers will save the social security system for our grandchildren.  Perhaps the crisis which forces us to stay home will lead to a world of less extended travel, more stay-at-home work, more locally-sourced foods, and ultimately, a just-in-time rescue of the world from global warming.  We just don’t know, and with uncertainty comes not only bad stock markets but room for hope.

      And here it is, springtime after all.  As I hunker down, I see birds building nests, I see squirrels and rabbits in the yard, and most comforting of all, I hear people talking about “us” – about coming together for each other, about our responsibility toward each other, about the sacrifices that health care workers and others are making for us.  As Pete Seeger reminded us, there’s a season for everything.  By my former calendar, this particular season should be bringing me nightly news of Republicans and Democrats insulting each other, modeling animosity and disrespect for our grandchildren.  I KNOW that as a result of COVID-19, I haven’t had to listen to quite so much of that recently.  Perhaps COVID-19 is ushering in a new season, with a new calendar.  And that, my friends, strikes me as a very good thing.

Being of Two Minds

                Nearly fifty years ago, I read Julian Jaynes’s book, the one with the imposing title, The Origin of Consciousness in the Breakdown of the Bicameral Mind.  Immediately one of my favorites, it remains so to this day.  Drawing on ancient literature, archaeology, neuroscience and other sources, Jaynes focused on the nature of consciousness, theorizing (largely on the basis of evidence of “auditory hallucinations” in early mankind) that consciousness arose when the two hemispheres of the brain first started “talking to each other” across the corpus callosum.

                Jaynes’s theories were extremely popular at the time; then they were attacked and called all wrong; then they made somewhat of a comeback, with a society formed in Jaynes’s honor.  I’m not sure I want to know where his reputation stands today.  I loved the idea, and I wouldn’t want to be saddened once again to learn that his theories are all wrong, knowing that in another thirty years, they might be accepted again.  Thanks to Jaynes, I will go to my grave remembering and enjoying the image of the bicameral mind, and of the two halves of it talking to each other, as Jaynes suggested.

                “Hey there, stranger.”

                “What?  Did somebody say something?”

                “Yeah.  It’s me.”

                “What?  Who are you?”

                “I’m you, dummy.  The other half of you, anyway… It’s really time we started recognizing each other, and thinking of ourselves as one.  Don’t you think?”

                Quite often, I catch myself thinking of Jaynes’s bicameral mind.  How, when a thought passes through my consciousness, it’s as if I’m both a speaker and a listener. 

                “Should I post this thought on my blog this morning?” asks the speaker.

                “Sure, why not,” answers the listener.

                To me, all thoughts seem like conversations between the two halves of my brain.

                Now, I know that all brain phenomena can’t be explained by this two-brain theory.  Memory, for example, doesn’t seem to reside on one side of the brain, the subject of a search by the other.  You’ve got the name of your fifth grade art teacher on the tip of your tongue.  (Well, of course it’s not really on the tip of your tongue; we all know that memories are stored in the brain – but where in the brain?)  It sure seems that recollections are made up of elements scattered here and there – perhaps the audio track here and the video track there, but more likely, different elements strewn like the loose pieces of construction paper always scattered around Mrs. What’s-Her-Name’s floor.  Still, even if the physical locations of the elements aren’t confined to one side of the brain or the other, the conversation that goes on in the effort to retrieve the name could be a conversation between the two halves.

                R: “She was the one with the dark brown hair, right?”

                L:  “Yeah.  Auburn, maybe.  With a splash of gray above one ear.”

                R: “Did her name start with a B?”

                L: “No, I don’t think so.  Seems to me it began with an S.”

                R: “S – T maybe?  Stubbs?  Staub?  Straughan?”

                From the many times we’ve been frustrated by the inability to recall things, we often share a sense that even if they don’t reside on opposite sides of the corpus callosum, the things we’re searching for reside in parts of our brains invisible to the part that’s on the hunt.

                As it happens, I’m content to let the mysteries of memory remain unsolved.  For at least one more day, I can simply accept that what we call memory can be in our brains, somewhere, theoretically retrievable but temporarily unknown to the conscious mind.

                What I can’t accept, even for one more day, is the mystery of the dream state.  And I’m thinking of a particular type of dream, a particular aspect of the dream state.  I’m thinking of this aspect because of the dream I was having less than five minutes before I started this post this morning.  The origins of this morning’s dream go back to Penny, a woman I last worked with over seventeen years ago.  Last month, I happened to return to my former place of employment for a meeting with my former boss.  As I sat in the lobby waiting, Penny walked in.  I immediately recognized her and said, “Hi, Penny, how’ve you been?” There’d been several hundred people who’d worked in that building when I last did, seventeen years earlier, and having never worked with Penny closely, I was rather impressed with myself that I could pull her name right out of the air like that.

                But then, this morning, there was this dream.  In the dream, there was Penny again.  And I recognized her face, and I knew who she was, but my former boss was asking me to remember her name – and I couldn’t.  It took me a long time, and a lot of help from my boss, but in the dream, I finally remembered it.

                Now, remember that I’d remembered Penny’s name so well for seventeen years that I could retrieve it instantaneously when, unexpectedly, I saw her last month.  It didn’t seem to be hidden away in the cobwebs somewhere.  If it had been so quickly retrievable for seventeen years, is it possible that, during the dream, part of my brain was fully aware of the name, and was scripting this dream like a stage play, while another part was playing the part of a brain that couldn’t remember?  Had my brain somehow divided itself, for story-telling purposes, into a part that remembered and a part that didn’t?

              Anyone who’s ever had difficulty recalling something for a second or two may be inclined to feel that my dream this morning represented nothing more than the usual process of working to retrieve a memory, beginning with an inability to recall her name, then employing whatever processes the mind usually employs in its efforts to recall, and ending with success in the effort.  If this is what was going on in the dream, the dream could have ended the way waking efforts to remember things often do – with failure.  Nothing unusual here.  The dream state is subject to the same difficulty remembering things as the waking state is, and its efforts to remember things utilize the same or very similar strategies.

             But is it possible that my dreaming mind this morning was divided into two parts: a part that did know the name, and another part that didn’t?  A story-telling part that wanted to go on a ride through the process of remembering something, and chose the story of Penny because it wanted a successful outcome, and knew that with Penny the outcome would be successful, because that part – the story-teller part – knew the woman’s name was Penny, and planned all along to end the dream with that revelation?

  And I actually think this may be closer to what really happens in at least some dreams, and my reasons are rooted in a similar, though slightly more elaborate, dream I had three or four months ago. Unlike my dream about Penny, that dream was longer, consisting of numerous scenes.  And in that dream, too, I was trying to identify something, starting from ignorance and ending up satisfied by understanding.  Early in it, I’d been told by an agent behind the counter of a rental car agency that the car I’d reserved had been taken, earlier that day, by a relative of mine.  When I asked who, he said the name had included the letter O.  I thought of names beginning with O, but there were no Ozzies or O’Briens in the family.  I thought of my cousins Joe and Lorin and Bobby, but no, said the man behind the counter, it wasn’t them.  After a while, another man told me that the name also included a G.  I had no relatives named Ogden, so I told the man it must have been one of my many cousins whose middle or last names were Logan. Once again, however, I was informed that I was wrong. Eventually, other people appeared in the dream supplying the letters N, U and Y, and by the end of the dream, I realized that the man I’d been trying to identify was a second cousin named Wendell, whose last name was Young. 

                In the dream, the revelation took me by surprise.  But what had me puzzled for days, and still has me wondering, is how the dream was even possible.  As the dreamer, I had no idea where the dream was headed when it began. Not until it ended did the clues make any sense.  Yet, as the spinner of the tale, as the “writer of the story,” so to speak, some part of my brain had to know where everything was headed from the outset.  Back when the man behind the counter was telling me it was a relative with an O in his name, the “writer of the story” knew, even if I did not.

                The reader of a mystery novel is ignorant at first, puts together clues, and finally connects the dots somewhere along the way – if not, he’s given the answer at the end, by the writer.  But mystery novels aren’t written that way.  The writer has typically known “who done it” since the first clue was inconspicuously mentioned back in Chapter One.  I understand how this works with mystery novels, because you have two different minds at work – the mind of the writer and the mind of the reader.  But is the same true in dreams?  How was it possible, in my dream, for that man behind the counter to know that my relative’s name included an O, at the beginning of the dream, unless he already knew the end of the dream?  And if he knew the end in advance, why didn’t I?

                The only explanation I can think of is that the dreaming mind is really two minds, the mind of the writer and the mind of the reader.  That when we dream, we see ourselves walking (or flying?) through a world with less than complete understanding, a world in which a lot more is known by a different mind which, though presumably also resident in our brain, knows far more than we do about the world – perhaps even both the “real” world and the one in which the dream takes place.  This “writer,” who knows more than we, the reader, know, is intentionally giving us only part of what we see in the dream, the same way a mystery writer does, doling out information at the right time, to enhance the story.

                Some may think of this as evidence of God.  Part of me wonders that too.  But more often, such phenomena make me think of my love for Jaynes’s theory about the origin of consciousness in the breakdown of the bicameral mind.

                I guess you could say I’m of two minds about it, eh?

                Yeah. I think so.

The Meaning of Meaning

Years ago, my brothers and I started debating the existence of absolute truth.  My brothers defended its existence and knowability.  I questioned its knowability, if not its very existence.   After decades retracing the same ground, our dialogue began to seem less like a path to enlightenment than a rut.  My brothers still believed in absolute, objective truth, and that it’s possible to know at least some of it, while I stuck to my subjectivist guns.

My subjectivism included the matter of language.  I see words as arbitrary patterns of sound waves without inherent meaning, which is to say, lacking any meaning until two or more people reach agreement (or at least think they’ve reached agreement) on what idea those sound waves represent.  The word “fruit” is not inherent in apples or oranges.  Not only the sound f-r-u-i-t but the very concept of fruit exists only in the mind.  A “fruit” is not a real thing, but a category, a label for an idea.  And ideas, as we all know, exist only in the mind. 

Having agreed that early ancestors of McIntosh and Granny Smith had enough in common to be called “apples,” and that the ancestors of navels and clementines had enough in common to be called “oranges,” we then went further and called them both “fruit.”  Slicing and dicing with our verbal Ginsu knives, we label some fruit as “citrus.”  We group fruit with legumes and call them both plants.  We add ourselves and elephants as mammals, then add plants and viruses and call us all “living things.”  All the while, scientists debate the very idea of what living things are, including and excluding actual things from what is, I maintain, just a concept.  Importantly, the things themselves are not affected by what the scientists call them.  A rose remains a rose, by whatever name we call it.

And so language, I say, remains subjective.  We attempt to group and classify real things by using conceptual labels.  We distinguish between a gallop and a trot, but we ignore the difference between the way I “walk” and the way a thoroughbred horse does, or a camel or a duck.  Arbitrarily, subjectively, we call them all the same thing: “walk.”  Why not distinguish between a walk and a shawk and a mawk?  It’s all very arbitrary.  What constitutes a “walk” is obviously an idea – and ideas exist only in the mind.

Comfortable in my subjectivist philosophy of language, I recently came across the work of the late Hilary Putnam, former Harvard professor and president of the American Philosophical Association.  Putnam famously claimed that meanings “just ain’t in the head.”  In his papers “Meaning and Reference” (1973) and “The Meaning of ‘Meaning’” (1975), he used a thought experiment to demonstrate that the meanings of terms are affected by factors outside the mind.

Essentially, Putnam did this by asking us to imagine a world that is a perfect twin of Earth in every way but one: its lakes, rivers, and oceans are filled not with H2O but with XYZ.  Everything else is identical, including people, and their tongues, and their languages – so that both Earth’s Frederick and Twin-Earth’s Froderick use the identical word “water” to refer to the stuff that fills the oceans on their respective planets.  Since Frederick and Froderick are physically indistinguishable, and since their words “water” have different meanings, those meanings cannot be determined solely by what is in their heads.

So said Putnam.

The idea that meanings are rooted in real things, not just in subjective minds, became known as “semantic externalism.” It was credited with bringing about an “anti-subjectivist revolution” in philosophy, a revolution that threw into question the very “truth” of subjective experience.[1]

Yikes!  Was I wrong yet again?  Did I have to rethink my whole philosophy of language?  Did I have to concede to my brothers that there is such a thing as objectivity, at least in the meaning of words?

Not so fast.

Putnam’s Twin Earth thought experiment had me worried.  But at the end of the day, I decided it suffers from the common logical fallacy of begging the question: its conclusion is contained in its premise.  The real question, I believe, boils down to one that Putnam may have had in mind when he titled one of his works The Meaning of Meaning.

If language is as subjective as I suppose, and if words can mean different things to different people, who’s to say what a word really means?  I don’t believe there’s an objective answer, though perhaps Dr. Putnam did.  I think it may come down to what we mean by the word “meaning.”  When faced with such questions, I’ve often sought the judgment of etymology, the history of words.  I find it instructive to retrace the way words (and their meanings) change over time.  And so I set out to unearth the etymological path by which the word “meaning” came to have meaning.

According to my research, the word is related to the ancient root men– (“to think”) that underlies Greek and Latin words for the mind, and from which English words like mental and mentor have derived.  It came into Old English as the verb maenan, meaning to state one’s intention, to intend, to have something in mind.  And much later, the verb “to mean” led to the formation of the verbal noun, “meaning.”

From an etymological perspective, I would argue that meaning is therefore subjective, by definition.  If to “mean” something means to “have it in mind,” then there cannot be meaning independent of someone’s mind.  Definitionally, it is the idea formulated in the mind.  The person whose tongue pronounces the word’s sound is trying to convey the meaning in her mind.  And when the listener who hears the sound uses it to form an idea in her mind, “meaning” happens again.  To “mean” something is, always, to have an idea in mind.

I find it interesting to imagine the day, within the past few hundred years, on which two people were watching a meteor shower, or a lightning storm, or a two-headed snake – some occurrence that struck them as an omen of sorts – and one of them first asked the question, “What does it mean?”

It’s a question we’ve all asked at some point – if not about an omen, then about a symbol, a gesture, or some other mindless thing. The question has become an accepted expression in modern English.  But what a momentous event, the first time it was asked!  Here we had a word – to “mean” something – which (at the time) meant that a speaker had some concept “in mind” and “intended” to convey that concept to another.  That is, as then used, the word clearly referred to a subjective concept.  You’d long been able to ask what a person meant, intended, or “had in mind.” But when the question was first asked, “what does it mean?” referring to a lightning bolt, an earthquake, or a flood, the one asking the question was implicitly asking another, broader question – whether, perhaps, the “it” – the burning bush, the flood, the electrical discharge – could have “something in mind.” 

Alternatively, they were asking if the thing or event had been endowed with a meaning by virtue of having been in the “mind” of some superconscious deity that had caused the event.  If the “meaning” was that which had been in the mind of such a deity, it was arguably still subjective, i.e., still dependent on the idea that existed in a particular mind.  But if the meaning had originated in the thing or event itself – in the rock, or the flame, or the electrical discharge – then the conclusion would have to be that “meaning” can exist independent of a mind.

At any rate, it seems to me that whoever first asked the question, “What does it mean,” was expanding the very idea of “meaning.” Until that moment, to “mean” something meant to have it in mind.  To think it.  Until that moment, as I see it, everyone understood that “meaning” is entirely subjective.  To ask what “it” means was a misuse of the word.

And so, on the basis of etymology, I stand my ground.  “Meanings,” by definition, are ideas that form in the mind.  The idea of fruit.  The idea of walking.  Even Mr. Putnam’s theory of semantic externalism – that meaning “ain’t just in the head” – is an idea that, like all ideas, is just in the head.

[1] Davidson, Donald, Subjective, Intersubjective, Objective, Oxford University Press, 2001.

Impeachment Again

     While I may be wrong, I believe there are good grounds for impeaching presidents.  I just don’t think the House has chosen wisely in its effort to define what they are.

     Consider the second proposed article of impeachment.  It essentially charges the president with “obstructing Congress” by refusing to comply with Congressional subpoenas. My problem here is that I don’t think a President is required to do whatever Congress orders him to do. As I see it, refusal to comply with a subpoena is a perfectly valid way of contesting its legality.  As best I recall, it is not uncommon for a party in litigation to refuse to comply with a subpoena, as one of the ways of getting a court to decide whether the subpoena is legitimate.  And it seems to me that in cases involving the separation of powers, it’s similarly legitimate for a president to refuse to comply with a subpoena, anticipating that Congress would then have to go to court to seek to enforce it.

    By way of analogy, in order to challenge the validity of Jim Crow laws, Rosa Parks had to “violate the law,” triggering her arrest for refusing to give up her seat on the bus.  This was risky, but a legitimate way to get judicial review of the constitutionality of the law in question.  In order to get the courts to consider his status as a conscientious objector, Muhammad Ali had to “violate the law” by refusing to submit to the military draft.  Risky again, but legitimate.  The courts have developed a doctrine of “standing,” a doctrine designed to prevent just anybody from asking the courts to decide purely hypothetical questions.  “Standing” means that to challenge a law, you have to be actually affected by it.  For reasons of “standing,” violating a law is sometimes required in order to get a court to consider its validity.  If you want to challenge a local zoning law in court, you may have to violate the law (as interpreted by the zoning board) or you won’t have standing.  If you think a provision of the Internal Revenue Code is unconstitutional, you’ll probably have to violate the I.R.S.’s interpretation of the law, getting assessed taxes and penalties, before you’ll have standing to challenge that law in court.  There were various examples of this in my own practice of employment law.  Often, it’s risky.  If you lose such challenges, you suffer the consequences.  But if you win such challenges, the ultimate prize is a finding that you were actually within your rights all along – in effect a ruling that, like Rosa Parks and Muhammad Ali, you were never really in violation of the law in the first place.

     If Congress were King, I’d favor the impeachment of presidents for refusing to comply with its subpoenas.  But Congress is not King.  In our system of law, it is the Courts that are the arbiter of what is and isn’t against the law.  It seems to me that impeaching a president for refusing to comply with Congressional subpoenas that haven’t been considered and approved by the Judiciary turns the separation of powers on its head. If Congress starts removing presidents just because those presidents don’t submit to its orders, I fear for the balance of power that is the cornerstone of our system of government. 

     Consider next the first article of impeachment.  In it, the House charges the president with abuse of power – specifically, with pressuring a foreign government to take an action that would interfere in the U.S. electoral process.  Now, I favor impeaching presidents for anything that would interfere with the U.S. electoral process, but I see an important distinction between things that would interfere in the process and things that could affect the outcome.  Specifically, I find it helpful to distinguish among three types of conduct that might be considered potential interference.

     The first type I’ll call “direct” interference in the electoral process itself.  Impeding access to the polls.  Casting fraudulent ballots.  Bribing election officials.  Falsifying results.  I think pressuring a foreign government to engage in such direct interference surely ought to be grounds for removal from office.  But such direct interference is not what the House is alleging.

    Rather, the House alleges that the president pressured a foreign government to take action that could be expected to influence some U.S. voters, and thus, the election outcome.  In my view, the conduct charged raises serious questions about when and why actions taken on the world stage that could affect election outcomes constitute “interference” with the electoral process.  If the president succeeded in pressuring Iran to cease its nuclear weapons development, there’s little doubt that such action could affect the election outcome in the president’s favor, but I can’t see it as interference in the process.  Would pressuring Saudi Arabia to investigate the murder of Jamal Khashoggi amount to “interference” in our elections if it affected the outcome?  Would pressuring North Korea to investigate the treatment of U.S. student Otto Warmbier, if such an investigation benefited the incumbent president?  In my view, we want our presidents to pressure foreign governments, and it makes no difference to me that, if the pressure works, the result might influence voters in favor of the president or his party. 

     Two of the words I find most troubling in the Article proposed by the House are the little words “that would.”  The President is not even accused of soliciting action “for the purpose of” influencing the election.  He is accused of seeking action “that would” influence the election, i.e., the election outcome.  One might argue that Lincoln saw the Emancipation Proclamation as something “that would” help him win re-election.  One might argue that FDR saw the New Deal as something “that would” help him win re-election.  One might argue that Lyndon Johnson saw the Warren Commission’s investigation into the assassination of JFK as something “that would” help him win re-election.  Parties and candidates are always doing things for political purposes, i.e., doing things that will enhance their prospects for re-election.  I just can’t conceive of impeaching presidents because their actions would “interfere with elections” simply by having an impact on election outcomes.

     My view does not change simply because the target of the requested investigation is a political opponent or relative of a political opponent.  Many Presidents, from Abraham Lincoln to Jimmy Carter and Ronald Reagan, have been embarrassed by the conduct of close relatives.  Imagine that, in some future election cycle, evidence surfaces that suggests that Opponent O’s cousin may be conspiring with foreign companies to import drugs into the U.S.  Obviously, announcement of an investigation into such a possibility might embarrass Opponent O and thereby affect the election outcome.  Do we want to discourage President P from soliciting a foreign country to undertake an investigation of the matter, because such an investigation would amount to interference with the election?  I think not.

     I would suggest that there is a third category of arguable election “interference” – and I think many of those who favor the impeachment of Mr. Trump may be motivated by the belief that his conduct falls into this third category.  I’ll call it the Fake News Category.  Impeachable offenses in that category might include, say, doctoring a videotape of one’s political opponent to make it appear she said something she really didn’t.  Photoshopping an opponent’s face onto a picture of someone doing something despicable.  Making up fake news stories for the purpose of influencing votes.  In my view, this sort of conduct – widely acknowledged to be on the rise, and widely predicted to become even more common in the future – is not direct interference with the electoral process.  But, to me, it is still problematic, even though it is designed to affect election outcomes rather than election processes.  In my view, the creation of such fraudulent news poses a threat to the integrity of our electoral outcomes every bit as serious as direct interference with processes, like stuffing ballot boxes.  I could favor articles of impeachment that directly accuse an incumbent president of intentionally fabricating such fake news for the purpose of affecting election outcomes.  And I suspect that Trump’s opponents believe that the President’s solicitation of Ukraine was tantamount to fabricating fake news.  But the Article the House is now considering does not accuse the president of fabricating fake news.  Rather, it accuses him of soliciting an investigation that would influence U.S. voters.

     Nowhere is free speech more important than in the political and electoral process.  Charges of fabricating “fake news” are essentially charges of intentional fraud on the electorate.  An essential element of fraud is a misstatement of fact, known to be false when made, and made for the purpose of inducing someone to rely on the false statement to their detriment.  Intentionally creating fake news to mislead the electorate amounts to such fraud, and should not be tolerated.  But calling for an investigation into smoke is not the same as asserting the existence of fire when one knows there is in fact no fire.  If a President thinks she sees smoke, even about a political opponent, calling for an investigation to determine whether there is fire strikes me as a very legitimate use of power – and one we should want to encourage in our presidents, not despite a possible impact on the outcome of elections but because of it: such investigations help bring out facts, and the electorate can then assess the evidence and decide how it affects their votes.  Even now, members of the House are calling for an investigation of the President, anticipating that it will affect the outcome of upcoming elections.  Should that turn their very votes for impeachment into impeachable offenses themselves?  Do we want a world in which all our elected representatives risk impeachment any time they call for investigations into their opponents?

     Some, I suspect, would say that Trump’s calling for an investigation of Hunter Biden was tantamount to a fraudulent falsification of fact because allegations of impropriety by Biden have already been “discredited.”  But Ukraine is a country with a history of corruption.  The prior investigation I’m aware of found only that there was no evidence of a violation of Ukrainian (not U.S.) law.  Was the prior investigation thorough?  Unbiased?  Not itself the result of corruption?  Might a new investigation unearth evidence of a violation of U.S. law, or simply information the U.S. electorate might find relevant to its voting in an upcoming election?  Investigations are meant to dig deeper into the truth.  In my view, calling for them does not come close to the kind of manufacture of fake news that I would consider good grounds for impeachment.

     For these reasons, I am not a fan of the House’s articles of impeachment, as drafted.  That said, there are other grounds for impeachment I would not mind seeing the House approve.  If Mr. Trump is suspected of fabricating false statements in order to affect election outcomes, I say charge him with fraud on the electorate.  If the evidence supports the charges, I say remove him from office because of it.  In fact, I’ll go even further.  Just as I believe that impeaching for bribery will tend to discourage bribery and impeaching for cover-ups will tend to discourage cover-ups, I believe that impeaching for eating hamburgers will tend to discourage eating hamburgers.  What constitutes good ground for impeachment is a political question, not a legal one.  And I believe the grounds chosen can be expected to have an in terrorem effect on the behavior of future presidents, discouraging them from engaging in whatever type of conduct is seen as grounds for impeachment – even if it’s eating hamburgers.  

     As a result, while I oppose impeaching presidents for refusing to comply with Congressional subpoenas, and I oppose impeaching presidents for pressuring foreign governments to conduct investigations that could affect U.S. election outcomes, I would LOVE to see Congress impeach this president (and several of its own number) for “Fomenting National Divisiveness.”  As I see it, the particulars of such articles might include such things as “Making public statements and otherwise manifesting such extreme disrespect for others as to exceed the bounds of propriety in a pluralistic society.”  Evidence in support of such charges could certainly include fabrication of false news stories, calling for investigations of opponents in bad faith, etc. – but the gist of such charges would be the disrespect and divisiveness involved.  If presidents (and members of Congress) were to fear impeachment for “fomenting national divisiveness,” I believe they would be influenced to call for greater harmony; that they would tend to manifest greater respect for those who disagree with them; that political rhetoric would soften; and that civility in political debate would increase.  In my view, those would be very good results – not for one party or the other, but for the country as a whole.

Primates and Praise

Early in the Christian churches, bishops and archbishops came to be called “primates.”  The word was not intended to evoke images of orangutans or macaques.  (It would be another five hundred years before Carl Linnaeus classified Homo sapiens as a member of that order.)  Rather, even in Latin, the word for “first” had been used to mean a superior, a leader, or a most excellent person, and the Christians had no problem designating their spiritual leaders with the term as well.

There are many things I like about my Christian heritage.  If Christians today preached what I believe the historical Jesus preached, I’d readily identify as a Christian.  But as I see it, modern Christianity gets Jesus wrong in a number of respects. 

When I was only eight, I was invited to spend the weekend in the countryside with a friend.  Since I’d have to miss Sunday mass, I made a phone call to ask for permission to do so.  My friend’s family got quite a laugh when, after the call, they discovered I hadn’t been calling home, but the church rectory. The “Father” they’d heard me addressing was not my biological father, but the parish priest.

I had already been taught to call all priests “Father,” and even when I talk to priests today, I use the term of respect I was taught as a child.

But it wasn’t long after the parish priest told me it would be a sin to miss Mass  that I came across Matthew 23:9, where Jesus is said to have told his followers “to call no man Father, for one is your Father, which is in Heaven.”  Given that scripture, I never understood how Christians developed the practice of calling their priests “Father” – especially in an age when fathers demanded so much respect – except, of course, that the priests had taught them to.

It’s easier for me to understand why hierarchies arose as church memberships and treasuries grew – and why words like “bishop” (from the Greek epi-skopos, meaning to watch over) came into use.  And it seems almost inevitable that as such growth continued, layers of rank would have to be added for practical, administrative reasons.  So by the time the Bishops of Canterbury, York, Armagh and St. Andrews had become powerful, it isn’t entirely surprising that these leaders came to be called “primates.”  But the primates were always first among “fathers,” and I still had a hard time squaring that with Matthew 23:9.

Nor was it that particular scripture alone.  According to Matthew 12:50, Jesus instructed his followers, “Whosoever shall do the will of my Father, which is in Heaven, the same is my brother, and sister, and mother.”  Jesus preached, “Blessed are the meek; for they shall inherit the earth” (Matt. 5:5) and “Whosoever therefore shall humble himself as this little child, the same is greatest in the kingdom of heaven” (Matt. 18:4).  I read of a Jesus who washed the feet of his disciples, of a Jesus who frequently dismissed those who treated him with special reverence, of a Jesus who said to a man who addressed him as Good Master, “Why callest thou me good? There is no one good but one, that is God” (Matt. 19:17).  I read of a Jesus who, when asked if he was King, replied only, “You said it” (Matt. 27:11), as if to disavow the title himself.  In fact, Jesus taught, in the Sermon on the Mount, that his followers should pray to the Father (for His was the power and the glory).  And, if we believe Matthew 7:22-23, Jesus chastised those who would honor him, warning, “Many will say to me in that day, ‘Lord, Lord, have we not prophesied in thy name? and in thy name have cast out devils? And in thy name done many wonderful works?’ And then will I profess to them, I never knew you: depart from me, ye that work iniquity.”

One reason I haven’t been to church but a few times in the last fifty years is my lack of comfort with heaping praise on this man who fought so hard to avoid it.  Last month, I went to a Catholic mass for the first time in many years.  One of the first hymns sung was To Jesus Christ, Our Sovereign King.

“To Jesus Christ, our sovereign king, who is the world’s salvation, all praise and homage do we bring, and thanks, and adoration. Christ Jesus, victor!  Christ Jesus, Ruler! Christ Jesus, Lord and Redeemer! Your reign extend, O King benign, to every land and nation; for in your kingdom, Lord divine, alone we find salvation.  To you and to your church, great King, we pledge our hearts’ oblation – until, before your throne, we sing in endless jubilation.”

Homage? Kingdom?  Reign? Throne?  I was taught the theology behind this hymn.  But for me, the theology fails to justify adoration of a man who shunned adoration, who deflected all praise to God, his father in heaven.  To my way of thinking, Jesus would not have approved of such a hymn.

Meanwhile, whatever may be said in defense of praising Jesus, I have even greater trouble with adoration of mankind.

Consider this passage from Pope John Paul II’s Gospel of Life, Evangelium Vitae.  I can’t read it without thinking of Jesus’ teaching that the meek shall be blessed.

52. Man, as the living image of God, is willed by his Creator to be ruler and lord. Saint Gregory of Nyssa writes that “God made man capable of carrying out his role as king of the earth … Man was created in the image of the One who governs the universe. Everything demonstrates that from the beginning man’s nature was marked by royalty… Man is a king. Created to exercise dominion over the world, he was given a likeness to the king of the universe; he is the living image who participates by his dignity in the perfection of the divine archetype.”

I hope that my thoughts are not taken as an attack upon those who sing the hymn, or upon Pope John Paul II for his thoughts about mankind.  I mean no disrespect, and God knows, I may be wrong.  But as Christians prepare this month to celebrate Jesus and his birth, I’m moved to point out my inability to buy into these aspects of modern Christianity.  As I like to think of it, “I prefer the original.”  Father, Primate, Pope, Homo sapiens sapiens.  Clearly, we are prone to bestow honor on ourselves.  I don’t know whether we inherited this tendency from other primates, but the Jesus I believe in warned us against it.