Horton Hears a Whom?

December 31, 2014
Posted by Jay Livingston

In discussions of language and grammar, the word correct should usually be in quotes.  Either that or it should be amended to “currently correct.” That goes for pronunciation and spelling too. The trouble is that language prescriptivists seem to think that what is currently correct has always been so and always will be. They’re wrong.

NPR recently asked listeners for their language gripes – “the most misused word or phrase.” Topping the list was “I” and “me.”  (The full list is here.)

Strictly speaking, “the gift is for you and I” is wrong. We have objective pronouns (me) and subjective pronouns (I). Putting a couple of words between the preposition (for) and the pronoun doesn’t change that. If you wouldn’t say, “The gift is for I,” then don’t say, “The gift is for you and I.”*



Strictly speaking, it should be “between you and me.”  But we don’t speak strictly. Language changes. Yesterday’s solecism becomes today’s standard usage. I don’t like “between you and I,” but wishing people would stop using it is like wishing they’d stop texting. (Need I point out that text as a verb did not exist until very, very recently?)

At #9 on the prescriptivists’ list is
Saying someone “graduated college” instead of “graduated from college.”
They don’t have too much to worry about. Their preferred form is ten times more common.


But not too long ago, “he graduated from college” was itself a grammatical error. NPR, in the very next sentence, says,
A college graduates a student, not the other way around. The "from" makes a big difference.
But while NPR sees why this makes “he graduated college” incorrect, it fails to note that by this same logic, “he graduated from college” is also wrong.  If it’s the college that graduates the students, we should say “he was graduated from college.” And in fact, we did say it that way.



Imagine a newspaper in 1900 asking the NPR “most misused word or phrase” question. High on the readers’ list of grammar gripes: “Even our best educated are now saying, ‘I graduated from Harvard,’ rather than the correct, ‘I was graduated from Harvard.’”

“Was graduated from” was never the most popular way of saying it, but it held its own up until about 1950. Since then, “I graduated from” has become the clear winner and is now, at least among the NPR complainers, the “correct” form.

Coming in at #5 on the list is
Ongoing confusion over “who” vs. “whom.”
The confusion is easily cleared up: get rid of whom. Reserve it for a few special occasions.  In fact, that’s what’s been happening.



The graph from Google Ngrams shows the frequency in books, i.e., formal writing. Misuse of whom can escape copy editors even at the Times:
The defenders of the interrogation program say little about two men whom are portrayed especially harshly by the Senate report

Surely whom is fading even faster in everyday speech. I’m surprised that NPR could find even a few dozen people who mourn its passing. I am certainly not among them. (Or is it amongst them?)

---------------------
* The root of the I/me problem is that English lacks a disjunctive pronoun. The French, thanks to moi, toi, etc., never make these mistakes.

Freaks and Civilization

December 30, 2014
Posted by Jay Livingston

Whatever happened to freaks?

I saw the musical “Sideshow” on Saturday. It’s based on the story of the Hilton sisters, Violet and Daisy, conjoined twins.  When the musical opens, they are in Texas with a traveling sideshow of freaks – the bearded lady, the dog boy, the half-man/half-woman, a midget couple, a three-legged man, and others. The storyline of the show traces the girls’ escape from the exploitative sideshow operator, who in effect owns them, and their relationship with two men who teach them to sing and dance and who eventually develop them into vaudeville stars of the 1920s.


Vaudeville is long gone, and so is the freak show.  We still have the staples of vaudeville – singers and dancers and comedians.  And you can still find, in clubs or circuses or late night TV, magicians and ventriloquists, jugglers and fire-eaters, contortionists and animal acts. But no freaks. 

I don’t mean the performers – the glass eater, the sword swallower, the human pin cushion, the geek. [Language note: Until very recently, the term geek referred to the sideshow guy who bit the heads off live chickens, and I am curious as to how geek came to mean something much less specific and much less deviant. The word freak too lost its bite starting in the 1960s with speed-freaks and acid-freaks. As unconventionality became more stylish, freak might mean nothing more than enthusiast. 

(Frequency of freak in books, as per Google nGrams.)

The term allowed an utterly ordinary person the fantasy of metamorphosis into someone offbeat and interesting.  Freakonomics – need I say more? The characters on the 2000s TV series “Freaks and Geeks” were neither, at least not according to the definitions of only a few decades earlier. They would not have qualified for the sideshow. On the other hand, many people walk around in the conventional world today so extensively tattooed that they would have easily been sideshow material a century ago.]

The performers who had developed unusual skills were examples of what we sociologists might call “achieved” deviance. The freaks I wonder about are those whom one of the characters in “Sideshow” calls “nature’s mistakes.”  They seem to have disappeared from sight. My friends who grew up in New York used to go to Hubert’s Dime Museum on 42nd St., a sideshow collection of freaks and acts that ran through the mid-1960s.  The closest that today’s Disneyfied Times Square comes is Madame Tussaud’s Wax Museum, just a few doors down the street. But Hubert’s and the like have not reopened anywhere. No doubt there are places on the Internet showing all sorts of physical anomalies, but I suspect the audience for these sites is smaller and more secretive than the freak show audience was.

The freak show has fallen victim to a normative shift that has taken two forms. First, freaks are less abnormal.  We have become more accepting of people who are different. They are no longer the objects of fascination and horror that they once were. Our normative circle has expanded, spreading now to include many of the “differently abled” who might previously have been excluded. As the boundary has broadened, even those who are really different are no longer so distant. Consequently, they are not so deviant.  We have defined their deviance down. 

Second, as norms have become more accepting of physical difference, they have also become less tolerant of those who haven’t gotten the message, the unenlightened rabble who would belittle, tease, laugh, or gawk. We must teach them restraint and kindness.  It’s not nice to point and stare at others’ deformities.

This sounds a bit like Norbert Elias’s Civilizing Process, which traces how Europeans came to throw the heavy cloak of manners over bodily functions, violence, dining, and speech.  Elias was writing about a transition that began with the medieval aristocracy and filtered down through the bourgeoisie of later centuries. By the time the book was published (the mid-20th century), the civilizing process seemed like something that had reached its peak in the 19th century.  The strictures of Victorian norms were loosening. We were less uptight about bodies, and that was groovy. 

Maybe, but apparently the civilizing beat goes on. If there is a message to “Sideshow,” it is that the freak show – exploiting its cast while egging on its audience, daring them to stare – was a shameful spectacle, one that we like to think we have relegated to the bin of history. 

Uncertainty and Foreboding. Are Things Really Falling Apart?

December 26, 2014
Posted by Jay Livingston

Don’t be fooled by the stories in the headlines or on the evening news, says Steven Pinker in an article for Slate (here).  Those stories are about death and devastation, and they reinforce a popular but incorrect picture of a world in chaos.

I think Robert McNamara was the first government official to use the quote from Yeats that has now become a cliche in this regard:

Things fall apart; the centre cannot hold;
    Mere anarchy is loosed upon the world,*

Or as Times columnist Roger Cohen said just two months ago (here),

Many people I talk to . . . have never previously felt so uneasy about the state of the world. . . . The search is on for someone to dispel foreboding and embody, again, the hope of the world.

A few weeks later, in his year-end summary (in verse, no less), Cohen repeated the same idea: “The world has never seemed more fragile.”

Never? Nonsense, says Pinker. As a nation and as a world, we’ve never had it so good.  And unlike the journalists reviewing their headlines and ledes, Pinker backs up his never-better diagnosis with data from the last quarter-century.  Murder, rape, war, mass killing, genocide, dictatorships – all down. Democracy – up. 



As the graphs show, things look pretty good. So why the despair? Pinker, a cognitive scientist, has basically one answer: the availability heuristic:

As long as violence has not vanished from the world, there will always be enough incidents to fill the evening news. And since the human mind estimates probability by the ease with which it can recall examples, newsreaders will always perceive that they live in dangerous times. All the more so when billions of smartphones turn a fifth of the world’s population into crime reporters and war correspondents.


True, but there’s something else, and I’m surprised that Pinker misses it: we are uneasy about the world today because of its uncertainty.  It seems worse than anything that’s gone before because we know how those things in the past turned out. 

The trouble with all the current problems that we can so easily think of – ISIS, climate change, global recession, and the rest – is not just that they’re bad but that they might get worse, and in ways that we can only imagine. (Of course we can only imagine them. They haven’t happened yet.) Hence, Cohen’s “foreboding” and “uneasy” feelings. 

Bad stuff happened in the past – recessions and crime and wars and global threats. But we survived them all, those of us who are still alive. Some things turned out terribly (Rwanda, Cambodia, Chernobyl, etc.), but they are over now, so we need not feel any sense of foreboding. In most cases, we don’t even feel much afterboding.  Even when the underlying problem remains, if we live with it long enough, it becomes familiar. So as long as it doesn’t get much worse, we learn to live with it.

By definition, what is familiar cannot be uncertain, so it causes less anxiety.  Back in the high-crime years of the 1960s and 70s, surveys found that people felt safer in their own neighborhoods than in unfamiliar neighborhoods – even when their own neighborhoods had a much higher crime rate. I remember phoning a guy for directions to his party in some NYC neighborhood I didn’t know. “Is it safe?” I asked. “Of course it’s safe,” he said indignantly.  When people asked the same question about my neighborhood, I’d give the same answer. “Of course it’s safe.” I lived across from Needle Park, and I would sometimes see junkies on the nod, standing in a stupor on the sidewalk. There were murders in Riverside Park two blocks away. But I had not been personally victimized, and the junkies became part of the taken-for-granted landscape.

My cognitions were adapting locally, but globally we do the same thing. The 1964 movie “Dr. Strangelove” is about a nuclear confrontation between the US and Russia. People worried about that back then.  Today, both those countries still have more than enough nuclear warheads to blow up the world, and there have been some Strangelovian close calls. But the uneasiness, fear, and uncertainty of the 1960s have passed. Or as the full title of the movie says, “How I Learned to Stop Worrying and Love the Bomb.” It’s not that we love the bomb, but we have stopped worrying. 

When Roger Cohen and other handwringers look back at 2014 from the distance of a decade or probably less, they won’t see it with unease. Today’s problems won’t seem so threatening. They will instead be something that we lived through. And maybe, just maybe, they will also look at the data on long-term trends.

-----------------------
* I recall some journalist reporting that he overheard McNamara use this quote during a dinner party conversation. McNamara, Secretary of Defense for both JFK and LBJ, was one of the most important among the folks who brought us Vietnam. So I doubt that his quoting of Yeats extended to the next line of the poem, the one about “The blood-dimmed tide.”  My memory of this whole thing could be faulty. I’ve searched using Google and the Times index but can find no reference to it.

Whose Anecdote Is This Anyway?

December 22, 2014
Posted by Jay Livingston
(This is a revised post. The original version was different in tone.)

How much can we trust the memory of a memoirist?

In Cheryl Strayed’s memoir Wild (now a movie starring Reese Witherspoon), a man she meets on the road tells her a very unusual anecdote. A few days later, she will read that same anecdote in a book.  The echo cannot be coincidence. The anecdote is too special.

According to the jacket flap, Wild: From Lost to Found on the Pacific Crest Trail is “A powerful, blazingly honest memoir: the story of an eleven-hundred-mile solo hike.”

Strayed leaves the trail at times to check back in to civilization or to circumvent stretches of the trail locked in by snow. After one such detour about halfway through her journey, she is hitching back to the Pacific Crest Trail (PCT). An old Ford Maverick stops to offer a ride – a woman and man in the front seat, another man and a dog in the back. She accepts.

The man sharing the back seat with Strayed is Spider – “his dark hair woven into a thin braid. He wore a black leather vest without a shirt underneath and a red bandanna tied biker-style on the top of his head.”

“What are you doing on the road anyway?” Lou asked from the front seat.

I went into the whole PCT shebang, explaining about the trail and the record snowpack and the complicated way I had to hitchhike to get to Old Station. They listened with respectful, distant curiosity, all three of them lighting up cigarettes as I spoke.

After I was done talking, Spider said, “I’ve got a story for you, Cheryl. I think it’s along the lines of what you’re talking about. I was reading about animals a while back and there was this motherfucking scientist in France back in the thirties or forties or whenever the motherfuck it was and he was trying to get apes to draw these pictures, to make art pictures like the kinds of pictures in serious motherfucking paintings that you see in museums and shit. So the scientist keeps showing the apes these paintings and giving them charcoal pencils to draw with and then one day one of the apes finally draws something but it’s not the art pictures that it draws. What it draws is the bars of its own motherfucking cage. Its own motherfucking cage! Man, that’s the truth ain’t it? I can relate to that and I bet you can too, sister.”

“I can,” I said earnestly.

“We can all relate to that, man,” said Dave, and he turned in his seat so he and Spider could do a series of motorcycle blood brother hand jives in the air between them.

Twenty pages later, Strayed is reading a book. Before she started her journey, she mailed packages to herself, addressed to post offices along the Pacific Crest Trail.  The packages contained replenishment of food, supplies, and books.  On the trail, Strayed would tear out and burn the pages as she read them – no sense carrying around the extra weight – and start a new book at the next postal station.

A few days after her ride with Spider, she picks up one such package. “I sat for hours reading the book that had come in my box – Vladimir Nabokov’s Lolita – while waiting for my boots to arrive.”

Strayed doesn’t mention it, but at the end of Lolita is an afterword, “On a Book Entitled Lolita,” that Nabokov added for the US edition (Lolita had originally been published in France.)  Here, in part, is the third paragraph:

The first little throb of Lolita went through me late in 1939 or early in 1940, in Paris . . . . As far as I can recall, the initial shiver of inspiration was somehow prompted by a newspaper story about an ape in the Jardin des Plantes, who, after months of coaxing by a scientist, produced the first drawing ever charcoaled by an animal: this sketch showed the bars of the poor creature's cage.

Here is the sequence according to Strayed. Spider tells her the ape-cage/art-bars story. A few days later she reads Lolita, which, though she does not mention it, contains this same story.  Did she really encounter this anecdote twice? Whose memory is speaking – Spider’s or Nabokov’s?

Spider, despite the “I was reading about animals” intro, doesn’t seem like someone who has read much literature or zoology.  Maybe in writing her memoir fifteen years later, Strayed remembers the ape parable, probably because it so perfectly reflects her state of mind at the time.  In her memory, the story sits in the heat and the mountains, someplace near the Trail. In a hike of three months and 1100 miles, her memory is off by only a few days and a hundred miles.  But that’s enough for her to confuse her sources. She gives the story to Spider and rewrites it in his idiom.

At first I thought that Strayed might be deliberately copying Nabokov, appropriating his remembered throb and translating it into the voice of one of her characters. Maybe she did. But the passage certainly does not seem like an homage to Nabokov or evidence of his influence or inspiration.* Besides, if she had been consciously ripping off the master’s material, wouldn’t she fear that some readers might notice? 

Till now, apparently nobody has.
-----------------------
* I’ve mentioned this problem before (here) in connection with a Kate Walbert story that appeared in the New Yorker.  Lorrie Moore’s 2012 story “Referential” very clearly references Nabokov’s “Signs and Symbols.” (My post on that is here.)