History and Horse Races - Run Only Once

July 5, 2011
Posted by Jay Livingston

Here’s a sentence from a New York Times op-ed by David Greenberg last week.
Modern Republican isolationism began with the 1919 battle over joining the League of Nations, when Senate Republicans, led by so-called Irreconcilables like William Borah of Idaho, killed the deal - even though without American guidance, European affairs were doomed to explode again.
I’ve read sentences like this hundreds of times – I’ve probably written some myself. But last week, I had been reading Everything is Obvious* (*Once You Know the Answer) by Duncan Watts, and I got stuck on doomed. “European affairs were doomed to explode again.”

Were they? Was the explosion inevitable? And was the absence of US guidance a necessary cause of the explosion?

It certainly seems that way. We know that Europe exploded in war twenty years after that 1919 decision by the US to stay out of the League. But we know that only in retrospect. At the time, there may have been several other equally plausible outcomes.

It’s like saying that the 2011 Belmont Stakes was “doomed” to be won by Ruler on Ice. True, he did win. But before the start of the race, there were many other plausible outcomes, most of them more likely. If the same race were held tomorrow – the same twelve horses running on a similarly sloppy track – would Ruler on Ice inevitably win? I doubt it. (If he were again 24-1, I might be tempted to put a small bet on his nose, but I wouldn’t be at all confident of collecting.)

The trouble with history is that, like the 2011 Belmont, it’s run only once. And after it’s run, historians and op-ed writers do the same thing that horseplayers do after a race: they go back to the Racing Form, the past performances, and pick out the information nuggets, often pebble-sized, that account for the results. Sometimes they even add their own speculation as fact. A horseplayer might say that Ruler on Ice would also have won the Kentucky Derby if only he’d been entered, a statement for which we have zero observations. An op-ed writer might say that the US presence in the League of Nations would have prevented World War II, a speculation based on a similar number of observations.

Watts suggests a different way of looking at history. Here, for example, is his take on the surge in Iraq. In the fall of 2007, the US upped its Iraq force by 30,000 troops. By the next summer, violence had substantially decreased. Conclusion: the surge worked. It reduced the violence.

Or did it?
Many other things happened between the fall of 2007 and the summer of 2008 as well. Sunni resistance fighters, seeing an even greater menace from hard-core terrorist organizations like Al Qaeda than from American soldiers, began to cooperate with their erstwhile occupiers. The Shiite militia – most importantly Moktada Sadr’s Mahdi Army – also began to experience a backlash from their grassroots, possibly leading them to moderate their behavior. And the Iraqi Army and police forces, finally displaying sufficient competence to take on the militias, began to assert themselves, as did the Iraqi government. Any one of these factors might have been at least as responsible for the drop in violence as the surge. Or perhaps it was some combination. Or perhaps it was something else entirely.
We can’t know because history is run only once.

Ideally, we should be able to go back, examine our variables, look at the possible scenarios, and assign each outcome a probability. Maybe the outcome that did happen would have a lower probability than others, and we would wind up saying that what happened was a fluke or at least improbable. But that’s not what we do.

Rather than producing doubt, the absence of “counterfactual” versions of history tends to have the opposite effect – namely that we tend to perceive what actually happened as having been inevitable.
As I was reading this part of Watts’s book, I kept thinking of Hans J. Morgenthau, the great political scientist. I once heard him speak – this was many years ago, perhaps during Vietnam – and after his analysis, typically thoughtful and insightful, someone in the audience asked him what we could expect as the outcome. In his elegant, German-accented English, Morgenthau said,
Well, the answer to that is that I am a professor, not a prophet. I cannot tell you what will happen. I can only tell you why what did happen was absolutely bound to happen.

Will Ruler on Ice beat those same horses again? Will another surge reduce violence? Maybe, but don’t bet on it.

Whose Declaration Is It Anyway?

July 4, 2011
Posted by Jay Livingston

Mark Kleiman complains that the old Fourth-of-July custom of reading the Declaration of Independence “seems to have virtually disappeared.”

In a post three years ago, I reported that the custom was still alive and well in liberal Lenox, Mass. (Here’s the photo, the full post is here.)


Patriotism has usually been a value waved about by conservatives, especially when they are in power. But this was 2008, and after seven years of Bush-GOP rule, the liberals were starting to play Capture the Flag, a game they had mostly been sitting out for decades.

When they are out of power, conservatives put their dissent-equals-disloyalty rhetoric back in the closet. The Declaration is an anti-government document. So I expect that we will see it read at many gatherings of the Tea Partistas. They have been calling Obama a tyrant for a while now, so they will be whistling and stomping their feet at every mention of tyranny and taxation. They’ll especially like the line about the “multitude of new offices.”

I wonder if they will feel uncomfortable (as the Lenox liberals did) at the negative tone in the references to “domestic insurrections” (i.e., slave rebellions) and “merciless savages” (Native Americans).

Hush Little Baby

July 1, 2011
Posted by Jay Livingston

The parenting style of Mrs. Scavolini (not her real name), our next-door neighbor, was different from that of the other moms in the upper-middle-class, mostly WASP suburb where I grew up.

“I’m gonna kill you,” we would hear her scream at her kids, who for their part were often screaming at one another as well. “I’m gonna kill you.”

“But she never does,” sighed my mother.

It’s the recent flap* over Go the F**k to Sleep that takes me back across the decades to Mrs. Scavolini, with her shrill voice and her orange hair (and this was in an era when nobody had orange hair).

From the CNN website**

(Click on the image for a larger view.)
"Imagine if this were written about Jews, blacks, Muslims or Latinos," says Dr. David Arredondo. He is an expert on child development . . .
The irony, says Arredondo, is that the people buying the book are probably good parents.
It’s not an irony. It’s the whole point.

I haven’t read Go the F**k to Sleep, but it seems like one of those books you don’t have to actually read. The title says it all. The book is a satire of middle-class niceness and civility. It reveals the inconvenient truth that despite that niceness, kids can sometimes be a pain, and even good, kind, thoughtful, nice parents run out of patience. They’d like to explode at their kids, but they know they shouldn’t, just as they know they shouldn’t denigrate Jews, blacks, Muslims, or Latinos. So they don’t flip off their kids and tell them to go the fuck to sleep. But they’d still like to, and maybe they even think it might be justifiable. That’s why the book is a best-seller.
Forbidden truth informs good satire’s jest –
What oft is thought but has to be suppressed.
OK, it’s not Pope, but you get the idea. You have to be what Dr. Arredondo calls a “good” parent to know the tension that makes the book title funny.

Mrs. Scavolini would not have gotten the joke.

* See the post and comments at Scatterplot for example.

** If you like the on-screen juxtaposition of this book and the caption about fungal disease among bats, you’ll find similar ironies here. (On many of these, an ad is covering the caption, so you have to click on the “x” to close it.)

The Fashion Report – Names Edition

June 27, 2011
Posted by Jay Livingston
(Cross-posted at Sociological Images)

In Sunday’s Times, David Leonhardt, who usually patrols the economics beat, looks at fashions in baby names (here). His primary focus is the rapid decline in old-fashioned names for girls. The “nostalgia wave” of Emma, Grace, Ella, and other late-nineteenth-century names, he argues, is over.

(Click on the image for a larger view.)

Well, yes and no. Sarah and Emma may be in decline, but the big gainer among girls’ names is Sophia, an equally nostalgic name that was last popular at the turn of the twentieth century. Isabella, too (third largest gain), follows the same trend line. Besides, the nostalgia for old names was selective. Emma and Grace may have come back, but many other old-fashioned names never became trendy. One hundred years ago and continuing through the 1920s, one of the most popular girls’ names in the US was Mildred. (You can trace the popularity of baby names at the Social Security Administration’s website.)

“The lack of recent Jane Austen movies has probably played a role,” says Leonhardt, though he’s probably joking. Not only is Emma still in the top five, but I suspect that films of that persuasion appealed more to the prejudices and sensibilities of post-childbearing women. But the media do have an impact. In Freakonomics, Levitt and Dubner showed how fashions in names often trickle down. The Sophias and Isabellas become stylish first among the upscale and educated; it may be several years, even decades, before they become more widely popular. But the media/celebrity channel can bypass that slow trickle. As Leonhardt says, how else to explain the boom in Khloe?

(Click on the image for a larger view.)


Similarly, Addison, the second biggest gainer, may have gotten a boost from the fictional doctor who rose from “Grey’s Anatomy” to her own “Private Practice.” In the first year of “Grey’s Anatomy,” the name Addison zoomed from 106th place to 28th. The name is also just different enough from Madison, which had been in the top ten for nearly a decade and whose stylishness was fading fast among the fashion-conscious.

Madison herself owed her popularity to the media. She created a big “Splash” soon after the film came out. As Tom Hanks says in the scene below, “Madison’s not a name.” [The clip will start at the beginning of the relevant part of the scene. For purposes of this post, it should stop at 3:23, after the punch line (“Good thing we weren’t at 149th Street.”). But I couldn’t figure out the code to make it stop.] (Update: Disney has forced YouTube to remove this clip. See the footnote* for a transcript.)




At the time, the Hanks character was correct. Before “Splash” (1984), Madison was never in the top 1000. The next year, she was at 600. Now she has been in the top ten for nearly fifteen years, and at number two or three for half those years. (There have not yet been any Madisons in my classes. I suspect that will change soon.)

Boys’ names seem governed by somewhat different rules, with less overall variation, though recent trends are towards names with a final “n” (four out of the five big gainers in the chart above) and Biblical names.

These recent changes in girls’ names aren’t about nostalgia. Name trends, like fashion trends, come and go. And, like fashion, name trends can be media driven, especially now that media can short-circuit the slower class diffusion process.

* Transcript of the relevant segment of the Splash clip:

Hanks: I'm going to have to call you something in English, because I can't pronounce -
Hannah: What - what are English names?
Hanks: There's millions of them, I guess. Jennifer, Joanne, Hillary . . . Names, names. Linda, Kim. Where are we? [Cut to close-up of Madison Ave. street sign] Madison. Elizabeth, Samantha --
Hannah: I like Madison.
Hanks: Madison's not a name. [a beat] Well, all right. Madison it is. Good thing we weren't at 149th Street.