The First Derivative of the Wisdom of Crowds

February 2, 2019
Posted by Jay Livingston

If this is Superbowl weekend, then the Socioblog’s fancy must be turning to thoughts of the wisdom of crowds vs. the smart money. It’s a question I have returned to several times since the first year this blog was on the field. (See, for example, this post about the 2010 Superbowl.)

The “wisdom of crowds” is like the Ask-the-Audience option in “Who Wants to Be a Millionaire.” The “smart money” is like Phone-a-Friend — a friend who knows a lot about the subject.

The trouble with the wisdom of crowds is that sometimes the crowd is wrong, as it was in the 2007 NFC championship game between the Bears and the Saints that I blogged about at the time (here).

Now, a trio of academics — John McCoy (marketing), Dražen Prelec (management), and H. Sebastian Seung (neuroscience) — has a variation that allows you to derive the right answer from the crowd even when the crowd is wrong. You might call it the first derivative of crowd wisdom.

Is Philadelphia the capital of Pennsylvania? 

Suppose you don’t know, and you ask the crowd.

The correct answer is no. The capital is Harrisburg. But many people think it is, because Philadelphia is a large, populous city. Most people know about Philadelphia. When you ask that question to a crowd of people, as we did with MIT students, only about a third of the crowd gets the correct answer.*

Yes is the popular answer. The crowd, by two-to-one, says Yes, Philadelphia is the capital. The crowd is wrong. The capital of Pennsylvania is Harrisburg. So much for the wisdom of crowds.

Wait, not so fast, say McCoy and his colleagues. Let’s also ask another question: “What percent of people do you think will answer No to this question?” The average estimate is 23%. But in fact, 33% answer No. This makes No a “surprisingly popular” answer, surprising in that more people than expected say No. It’s as though you are taking the first derivative of crowd wisdom rather than the wisdom function itself.

If you go with the popular answer, you’ll say Yes and be wrong. But if you go with the derivative — the “surprisingly popular” answer — you’ll get it right.
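The rule is simple enough to sketch in a few lines. This is my own minimal illustration, using the percentages cited above — not the authors’ actual implementation.

```python
# A minimal sketch of the "surprisingly popular" rule, using the numbers
# from the Philadelphia example in this post. The function is my own
# illustration, not code from McCoy, Prelec, and Seung.

def surprisingly_popular(actual_share, predicted_share):
    """Pick the answer whose actual support most exceeds its predicted support."""
    return max(actual_share, key=lambda ans: actual_share[ans] - predicted_share[ans])

# Two-thirds answer Yes, but 33% answer No while respondents predict only 23% will.
actual    = {"Yes": 0.67, "No": 0.33}
predicted = {"Yes": 0.77, "No": 0.23}

print(surprisingly_popular(actual, predicted))  # prints "No" — the correct answer
```

Yes is still the majority answer; what the rule rewards is the gap between actual and predicted support, which is why No wins here.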

McCoy sees applications of this to all kinds of forecasts — the market for some product, the price of gold, voting. He doesn’t mention the Superbowl. Right now, about 25% of bettors think that the Rams will win or that they will lose by 2 points or less. But suppose we asked all bettors, “What fraction of people do you think are betting the Rams?” If they guessed that only 10% of them are backing the Rams, then the Rams would be the “surprisingly popular” choice, and you would be a fool not to put down a grand to win $1250. Alas, I know of no such surveys. Besides, I don’t trust Belichick.

Two other thoughts:
First, McCoy makes the concept harder to understand by choosing an example where No is right. “Is No the correct answer?” “Yes, No is right.”

Second, I was stunned that two-thirds of MIT students did not know the capital of the fifth most populous state in the country. Look, people, we’re not asking about Pierre or Carson City. This is not rocket science. And now I get the feeling that at MIT a question about rocket science might have gotten a higher proportion of correct answers.

-----------------------------
*From an interview with McCoy on a Wharton School podcast. An article by Prelec, Seung, and McCoy in Nature is here behind a paywall.

Michel Legrand — 1932 - 2019

January 28, 2019
Posted by Jay Livingston

As a callow youth, I dismissed Michel Legrand as a guy who wrote pretty tunes and scores. In fact, he did write 200 or more scores for movies and TV. Then I heard Legrand Jazz (1958) with his arrangements for three different groups — 1. a big band, 2. a group centered on Ben Webster and four trombones, and 3. the core of Miles Davis’s 1958 sextet (Miles, Coltrane, Bill Evans, Paul Chambers), with Phil Woods instead of Cannonball, plus a few other instruments including flute and harp.

The tunes were eleven jazz classics, from Dixieland (Louis Armstrong’s “Wild Man Blues”) through swing (Benny Goodman, “Stompin’ at the Savoy”) to bebop (Dizzy’s “Night in Tunisia”). The arrangements were “far out” for the time, and even 60 years later, they hold up well.

Legrand later described the recording session with Miles.

Everyone said to me: “Miles will come to the meeting and stand near the door, keeping his trumpet in his closed case. He will listen for five minutes, and if he likes music, he will sit down, open his case, and play. If he does not like, he will leave and he will never again contact you.” I was so afraid that I had flare-ups of sweat! I started rehearsing with the orchestra. The door opened, and Miles listened by the door for five minutes. Then he sat down, opened his case and began to play. After the first catch, he asked me, “Michel, is my game [playing] suitable?” That is how it all began.


Here is that group playing Fats Waller’s “Jitterbug Waltz.”



The record went out of print and out of sight. I rarely met anyone, jazzers included, who knew of it. It was something of a collector’s item. Somewhere along the line, I lost my copy and wound up paying the equivalent today of about $75 for a used copy. Eventually, the album was re-issued as a CD, and now you can stream it anytime, anywhere.

Flashback Tuesday — MLK - AOC Edition

January 22, 2019
Posted by Jay Livingston

Alexandria Ocasio-Cortez was on Colbert last night. Colbert asked her about accusations that she was divisive. Said AOC:

Today is Martin Luther King Day, and people called Martin Luther King divisive in his time. We forget that he was wildly unpopular when he was advocating for the Civil Rights Act.

She was right, sort of. King’s “unfavorables” were in the 40-45% range in the years AOC is talking about, much lower than Trump’s “unfavorables” today. But the favorable/unfavorable ratio did not turn positive until after he had been assassinated.

Here is part of a blog post I did seven years ago. The latter part of the post, about Reagan and Buckley, is not exactly relevant, but I’m leaving it in as a corrective to possible inaccuracies about them too.

*     *     *     *     *     *     *    


In August, Gallup (here) published some of their polling from the 1960s. The contrast with opinions today, when only 4% are unfavorable, is remarkable.

(Click on the chart for a larger view.)

(Note: these results include all races. The data for Whites only would surely show a higher percent unfavorable and a lower percent favorable.)

Except for 1966, the total favorable and unfavorable are fairly close.  (The change in 1966 is a result of King’s opposition to the Vietnam war.  He was right about that too.)  But of those with strong opinions, the “highly unfavorables” always outnumber the “highly favorables.” 

The unfavorables weren’t just those rabid Southern whites so familiar from the historical news footage. The same ideas could be found among seemingly temperate, sophisticated, and intellectual conservatives. Affable Ronald Reagan opposed the Civil Rights Act of 1964 and the Voting Rights Act of 1965. 

In 1957, William F. Buckley, Jr. supported the suppression of black votes in the South:
The central question that emerges . . . is whether the White community in the South is entitled to take such measures as are necessary to prevail, politically and culturally, in areas in which it does not predominate numerically? The sobering answer is Yes – the White community is so entitled because, for the time being, it is the advanced race.  (The full article is excerpted by Brad DeLong here.)
That was before the rise of the Civil Rights movement.  Six years later, when Dr. King had come to prominence, a black church in Birmingham was firebombed.  Four young girls died.  Here is how Buckley’s National Review responded.
The fiend who set off the bomb does not have the sympathy of the white population in the South; in fact, he set back the cause of the white people there so dramatically as to raise the question whether in fact the explosion was the act of a provocateur – of a Communist, or of a crazed Negro. Some circumstantial evidence lends a hint of plausibility to that notion, especially the ten-minute fuse (surely a white man walking away from the church basement ten minutes earlier would have been noticed?). And let it be said that the convulsions that go on, and are bound to continue, have resulted from revolutionary assaults on the status quo, and a contempt for the law, which are traceable to the Supreme Court’s manifest contempt for the settled traditions of Constitutional practice. [emphasis added]
The suggestion that the firebombing was committed by “a communist or a crazed Negro” is a fantasy of pure desperation and wish-fulfillment.  Note also NR’s concern for “the cause of white people.”  As for the church bombing, the beatings, the tortures, the murders, and other acts of terrorism (“convulsions” as the NR calls them), committed against blacks and civil rights workers, just blame it all on the Supreme Court. 

All this would be laughable if the events were not of such grave importance and if the commentary were from some obscure, racist corner.  But National Review, then as now, was the main voice of intellectual conservatism. 

Eugene Volokh, in an appreciation of Buckley (here), notes that it wasn’t until the late 1960s, after the passage of the major civil rights laws and probably after the King and RFK assassinations, that Buckley and NR finally gave up defending segregation.  Volokh also says, approvingly,
Buckley tried very hard to create a genial and friendly image for conservatism as opposed to one that projected anger, intolerance, and rage.
Michael Harrington put it somewhat differently:
William Buckley is an urbane front man for some of the most vicious emotions in this country.

Confidence Games

January 19, 2019
Posted by Jay Livingston

Timing is crucial in comedy. It can be important in survey research as well. If you ask about satisfaction with government, and you take your survey at a historical moment when the Republican party controls the government, don’t be surprised if Republicans are more satisfied than Democrats. But also don’t write up your findings to imply that this means that Republicans have a deep and abiding faith in American institutions.

We’ve been here before, not with “satisfaction,” but with something similar — happiness. People who make claims about the relation between happiness and political views — people like Arthur Brooks, for example — often don’t bother to look at which party was holding sway at the time the survey they’re using was done. But that context matters a lot, especially now that the country has become so partisan and polarized, with people remaining loyal to their party the way sports fans are loyal to their team. In a post two years ago inspired by a Brooks column, I put it this way:

When you’re talking about the relation between political views and happiness, you ought to consider who is in power. Otherwise, it’s like asking whether Yankee fans are happier than Red Sox fans without checking the AL East standings. [The full post is here.]

I had a similar reaction to a recent thread on Twitter about who has lost confidence in American institutions. The answer is: everybody. But some more than others.  Patrick Egan of NYU looked at the “confidence” items in the General Social Survey and created these graphs showing the average confidence in twelve different institutions.

(Click on an image for a larger view.)

Confidence has dropped among all categories. But the steepest decline has come among non-college Whites. Their overall level of confidence is the lowest of any of these groups. They are also the strongest supporters of Donald Trump. This reinforces the image of the core Republican constituency — Trump’s staunchest supporters — as dissatisfied, even resentful. They have lost confidence in traditional American institutions, and they acclaim the strong outsider who could bring sweeping changes.

In response, Joshua Tucker posted a link to a report he co-authored — the American Institutional Confidence Poll (AICP) from the Baker Center for Leadership &amp; Governance at Georgetown University. The AICP found that demographic characteristics didn’t make much difference. Politics did. Here is AICP’s Number One Key Finding:


Why the discrepancy between the GSS data and the AICP conclusions? I wondered if it might be the sample. It wasn’t.

The interviews were conducted online from June 12 to July 19, 2018, by the survey firm YouGov. The sample includes 3,000 respondents from the U.S. general population. Additionally, the poll includes samples of 800 African-Americans, 800 Latinx Americans, and 800 Asian Americans.

Their sample, as they note elsewhere, is larger than that of most political surveys, and it oversamples the smaller populations they want good data about. No problem there.

But what about the timing? We know that on November 1, 2016, Democrats were much more likely than were Republicans to say that the economy looked good. Two weeks later, those positions were reversed. The economy did not change in those two weeks. The occupancy of the White House did.

The AICP survey was done last summer, months before the midterm elections, when the GOP controlled the White House, the Senate, the House, and the Supreme Court. That seems like kind of an important fact, but to find it, you have to scroll down to the methodology notes at the end of the report. 

Even in the GSS graphs, Egan has drawn a trend line that smooths out these shifts that are possibly caused by electoral changes. Egan also has lumped together twelve institutions. Separating them into categories (e.g., government, non-government) might allow us to see even sharper demographic differences.

The AICP, on the other hand, does report about confidence in specific institutions, twenty in all. The authors conclude that “confidence in institutions is largely driven by party affiliation.” They neglect the corollary: who has confidence in which institution can shift quickly when an election changes the party in power. This volatility makes it a bit misleading to talk about confidence in “institutions” as though people were thinking about them in the abstract. For example, the authors say, “The executive branch is the institution in which Democrats have the least confidence, while Republicans rank it the fourth highest.” Surely this difference is not about what people think of “the executive branch.” It’s about Donald Trump. These days, isn’t everything?

“Cold War” — A Love Story

January 15, 2019
Posted by Jay Livingston

Like “Ida,” his 2013 film, Pawel Pawlikowski’s new film “Cold War” is visually stunning. Like “Ida,” it is shot in black and white, often with high contrast, and in the old 4:3 aspect ratio. And like “Ida,” it departs from movie conventions we’ve come to take for granted. (My post on “Ida” is here.) The storytelling is elliptical. It skips over long periods of time, and the characters rarely explain their choices. The audience must fill in the gaps.

“Cold War” is set in Europe — mostly between Communist Poland and Paris, mostly in the 1950s. That’s half the implication of the title. It’s a love story, but not the kind we’re used to. That’s the other half. In “Cold War,” love is a powerful force of attraction between the couple, Wiktor and Zula. But while it brings them together, it brings them little joy. The main publicity picture for the film (the freeze frame below) shows a moment of happiness and tenderness between them. But in the film, such moments are rare.

The film spans fifteen years. For much of that time, the lovers are apart, in different countries having joyless, passionless affairs and marriages with other people. Yet when Wiktor and Zula are together, their relationship is marked by conflict, anger, and betrayal. They separate, sometimes for years, but they cannot escape the force that makes each reunion passionate and painful. That force could be called love, but it’s far different from the love played out in most American films.

In our movies, love makes sense. It brings together two people who should be together. It infuses their relationship with passion, warmth, contentment. Conflicts may arise, but love can overcome them. Usually those conflicts are internal — the person’s own thoughts or problems that prevent him or her or both from realizing their love for the other person. Or the problem may be external — another man or woman trying to win one of them away. Usually this person is flawed, acting on some selfish motive. The main character eventually sees through it all, frees himself or herself from whatever hold this person has on them, and returns to the one who was right for them all along. If the movie is a drama and not a comedy, the lovers might not wind up together in the end. One or both may die. They may go their separate ways. But they’ll always have Paris.

In “Cold War,” Wiktor and Zula have Paris, where the freedom of the West allows them to develop their music (he’s a pianist, she a singer). The folk tunes that in Poland they had to transform into large choral numbers and then hymns to Stalin become, in Paris, sultry, smoky jazz songs, notably the one in the trailer.


You can imagine how this would play out in an American film. With artistic freedom, the lovers blend their Polish traditional culture with jazz, find success, and live happily ever after. In “Cold War,” the transition from Poland to France brings little comfort. The translation of that song from Polish to French falls flat. (I assume the symbolism is deliberate.) The words are meaningless. “The pendulum killed time.” Worse, the translator is a woman Wiktor is sleeping with. Nor is Zula exactly faithful. (“Michel is great,” she tells Wiktor, “he fucked me six times in one night.”)*

Love in “Cold War” also has a decidedly un-American relation to the larger forces of government and society. In our films, these forces may keep the lovers apart for a while, but either the lovers fight against these constraints and win, or they escape. Even if they die, their spirit is undaunted. In “Cold War,” both Wiktor and Zula, in different ways, compromise themselves — or rather the state, personified by Communist bureaucrat Kaczmarek, forces compromise upon them. The solutions that are almost a cliche in movies set in the West are unimaginable in a totalitarian state, not even in the movies.

Bleak, yes, but well worth seeing. The film does not yet have wide distribution. It may not be coming soon to a theater near you. Currently it’s playing in only six theaters, three of them in New York. But if you have the chance, see it while you can.

----------------------------

* The only other clip from the film that I could find on the Internet also gives the same sense of their relationship in Paris. Wiktor and Zula sit at the bar in a club. Wiktor talks to a man seated next to him, Zula looks bored. “Rock Around the Clock” comes up on the jukebox. Zula gets up and dances with one man after another and finally gets up on the bar, dancing solo, flouncing her skirt, while Wiktor looks on with what seems to be a combination of resignation and distaste. The video is here.

Mrs. Maisel — Expletives Then and Now

January 13, 2019
Posted by Jay Livingston

When I watch “The Marvelous Mrs. Maisel,” the words that usually catch my attention are the anachronisms (see earlier posts here and here). On Episode 7, which I watched last night, handsy, skill set, poster boy, and a few others sounded jarringly modern. But I also noticed a word that people in 1959 really would have used – goddam. The word stood out because on the show, it’s so rare.


The writers on “Mrs. Maisel” far prefer the word fucking. In fact, in the above scene, Susie’s brother-in-law has just said, “Give me the fucking chips.” The episode has just one other goddamn, but characters say fucking a total of sixteen times. That’s not unusual. Here are the totals for the series.


In 1959, when educated, middle-class people wanted an expletive, fucking was not their go-to negative intensifier — especially among women and especially in mixed company. Think of Salinger’s Franny and Zooey, stories set and written in the mid-1950s. (The Glass family lives just across Central Park from the Weissman apartment we see so much of in TMMM.) I found an extensive collection of excerpts from the Salinger stories (here) – thirty goddamns and not a single fuck. Google nGrams searches all books and finds something similar.
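Tallies like these are just word counting. Here is a toy version of the exercise — the kind of count one might run across a script or the Salinger excerpts. The sample line and the stems are my own invention, not data from the show.

```python
# Toy expletive tally: count words starting with each stem, the way one
# might tally a script. The sample sentence is invented for illustration.

import re

def tally(text, stems=("goddam", "fuck")):
    """Count words beginning with each stem (catches goddam/goddamn, fucking, etc.)."""
    words = re.findall(r"[a-z]+", text.lower())
    return {s: sum(w.startswith(s) for w in words) for s in stems}

sample = "Give me the fucking chips. The goddam chips, I said. Goddamn it."
print(tally(sample))  # {'goddam': 2, 'fuck': 1}
```

Matching on stems rather than whole words is the point: it folds goddam, goddamn, and goddamned into one count, as the nGrams comparison of “variants” does.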


In 1959, goddamn and variants appear ten times as often as fucking. (The fucking boom that begins in 1965 continues. The lines cross in 1970, and by 1990 fucking is three times as frequent as goddamn.)


(The above graph goes only through 1990. In 2019 fuck was more than ten times as frequent. In a graph from 1920 to 2019, the dominance of fuck would be so great as to make the differences in earlier years practically invisible.)

Apparently, goddamn, like damn and hell, is an intensifier that has lost its intensity. Two years ago, I speculated (here) that these words derive their power from the power of the religion they blaspheme. As religion fades as a dominant force in American life, so do religion-based swear words. As I say, I am just guessing. What the heck do I know about it?

I’ve Just Met a Face

January 3, 2019
Posted by Jay Livingston

Each month, the Harvard Business Review has a feature called “Defend Your Research.” I confess, I am not a regular HBR reader, but as I was searching for something else, a serendipitous click whisked me to an episode of “Defend Your Research” that was about names, something I am interested in. The researcher, Anne-Laure Sellier, had found that people look like their names. More specifically, people shown a photo of a stranger can make a better-than-chance guess as to what that person’s name is. (The HBR article is here.)


I was a tad skeptical. Hadn’t we been through something like this before with men named Dennis choosing to become dentists and women named Florence living in Florida? At least that research had a theory to explain the supposed connection — “implicit egotism” — even if the data turned out to be less than what met the researchers’ eye.* And now we have people named Charlotte choosing to look like a Charlotte?

Plausible or not, the empirical findings about faces and names were interesting, and I was curious to try my luck. Conveniently, Sellier had provided HBR two examples.


George, Scott, Adam, Bruce. Which could it be? “What if it's just that the other names on the list were rarer and less likely?” asks Scott Berinato, the HBR interviewer.

We controlled for that by offering only choices that were as popular as the actual name, based on the frequency of use. We controlled for most things we could think of, including ethnicity, name length, and the socioeconomic background of the subjects and of the people in the photos.

Any good researcher would control for these things. Everyone knows that. But “Bruce”? My spider sense suggested that the names Bruce and Scott are not really equivalent in popularity. To check, I went to the Social Security database on names.

The guy on the left looks like he’s about 40, the one on the right, early 30s. The HBR article came out in 2017. I guessed that the research was done a couple of years earlier. So I looked up the numbers for boy baby-names in 1975 for the older guy, 1983 for the younger. Here are the results.

And what are the answers to the name-that-face quiz? The man on the left is Scott. The man on the right is James. The correct name is two to three times more frequent than the second-most popular name on the list. It’s possible that Sellier’s subjects were putting together their estimate of the man’s age and their intuitive knowledge of name popularity. A better design might have been to show people four pictures of men roughly the same age and ask, “Which one is Scott?”
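The alternative explanation in that paragraph can be made concrete: a guesser who ignores the face entirely and knows only the man’s rough age and which names were common in his likely birth year can beat chance. The counts below are placeholders for illustration, not actual Social Security figures.

```python
# A face-blind baseline guesser: estimate the birth year from the apparent
# age, then pick the option most common in that cohort. The birth counts
# here are invented placeholders, NOT real Social Security data.

births_1975 = {"Scott": 18000, "George": 6000, "Adam": 9000, "Bruce": 3000}

def popularity_guess(options, cohort_counts):
    """Ignore the face; just pick the name most common for that birth cohort."""
    return max(options, key=lambda name: cohort_counts.get(name, 0))

print(popularity_guess(["George", "Scott", "Adam", "Bruce"], births_1975))  # Scott
```

If the correct answer is also the cohort’s most popular name two to three times over, this no-face strategy alone produces “better than chance” guessing — which is why matching options within the subject’s own birth-year cohort would be the cleaner design.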

Maybe Sellier just picked the wrong examples to illustrate her point. After all, she says that she and her fellow researchers did this study in the US, France, and Israel and got positive results in all three countries. And they do have a theory — that people change their appearance so as to conform with the cultural stereotype of their name. “In America people presumably share a stereotype of what a Scott looks like. . . and Scotts want to fit that stereotype.”

I haven’t looked at Sellier’s publications. All I know is what I see in the HBR. Maybe, knowing that the HBR interviewer was named Scott, she picked a couple of photos — one Scott, one not-Scott — just for this occasion and selected Bruce and the other names on the spur of the moment. Still, I assume that a researcher being interviewed for a feature called “Defend Your Research” would bring examples that best illustrate her ideas. If this is the best she’s got, I’m afraid I remain unconvinced.

-------------------------------

* For more on Dennis the dentist, see this 2018 post by Andrew Gelman (here) and follow the links.

The Charter School Advantage

December 31, 2018
Posted by Jay Livingston

In his Wall Street Journal op-ed that I criticized last week (here), Jason Riley begins with the story of a father who was desperate to get his kid into a charter school.

I thought he was going to tell me that the charter school had smaller classes or better graduation rates. Instead, he wanted to talk about something most parents take for granted when they send Johnny and Susie off to school each morning: physical safety.

He didn’t take it for granted. He told me the atmosphere at the old school had been chaotic, that bullying was rampant, and that his son, a sixth-grader at the time, had become terrified of the place. One day the boy was attacked by other students in the school lavatory, and the father got a call to pick him up from the hospital. It was the final straw. “I didn’t know anything about charters,” said the father. “I was just looking for an escape.” After the new school assured him his child would not have to worry each day about being assaulted by his classmates, he was sold.

Riley uses this anecdotal evidence to support the decision by Trump’s Secretary of Education Betsy de Vos to rescind Obama administration efforts to reduce the disparity between discipline imposed on Black schoolkids and White schoolkids.

But this anecdote also speaks to another controversial issue in education — whether charter schools, compared with traditional public schools, do a better job of educating kids. On that question, the scorecard is mixed. In most studies that compare charters with similarly situated publics, there’s little difference in students’ test scores. In some places, the publics come out better. And in some cities — New York, for example — some charters consistently outperform public schools.

Charter school boosters claim that charter students do better because their schools are unencumbered by the teachers’ unions and educational bureaucracies that hobble public schools. But critics point out that charter schools have one way of improving their test averages that is not available to the public schools, and it has nothing to do with unions or regulations: charter schools can get rid of bad students. If you can force out the low scorers, the school average will be higher not because the school does a better job of teaching but just because of the way an average is calculated.
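The way-an-average-is-calculated point is worth seeing in numbers. The scores below are invented for illustration; the mechanism is just arithmetic.

```python
# The critics' arithmetic point: drop the low scorers and the school's
# average rises, even though no student's score improved. Scores invented.

scores = [45, 55, 70, 75, 80, 85]
mean = lambda xs: sum(xs) / len(xs)

print(round(mean(scores), 2))                # 68.33 — everyone enrolled
print(mean([s for s in scores if s >= 60]))  # 77.5 — same teaching, low scorers gone
```

The nine-point jump comes entirely from changing who gets counted, not from anyone learning more — which is the critics’ objection in a nutshell.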

That’s true. But the expulsion option has an impact far beyond the math. Difficult and disruptive kids don’t just bring down the class average because of their individual low scores. They affect the general atmosphere of the class and the school. As Riley’s anecdote illustrates, troublemakers make it harder for other kids to learn and harder for teachers to teach.

I wrote about this back in 2012 (here), but I was reminded of it a few weeks ago in a conversation with an expert on educational testing and measurement who also had once taught in a middle school. We were talking about rating teachers on the basis of student test scores. Disruptive kids in the classroom, he said, can undermine the efforts of a teacher. Even the good teacher who gets one kid like that is not going to score well on these measures. With more than one, the problem grows almost exponentially.

The atmosphere in the public school in Riley’s WSJ op-ed was chaotic not because of the UFT and not because of the Board of Ed. The “bureaucratic” regulation responsible for it was the law that requires public schools to find a place for all kids, even the very difficult ones.

“Other Than That, Merry Christmas”

December 28, 2018  
Posted by Jay Livingston 

Some countries have a ceremonial head of state — a person who stands above or at least apart from partisan politics and who therefore can more easily be seen as representative of the whole country. The UK has the Queen. It is the Queen, not the prime minister, who delivers the Christmas message.

When surveys ask Brits for the person they most admire, the Queen always wins. This year, Theresa May, the head of the government, didn’t even make the top five.

In the US, both roles — political/governmental leader and ceremonial head of state — fall to the president. The overlap can get tricky, but most US presidents, on ceremonial occasions, have tried to avoid politics and to appeal to widely shared values and symbols. Their Christmas messages, for instance, project warmth and hope. Even if they mention problems (the suffering of those who are ill, poor, homeless, bereaved), they emphasize the American spirit that helps us overcome setbacks.

Donald Trump seems incapable of playing that role for more than a minute. The pre-recorded Christmas message from the Trumps (Donald and Melania) stayed true to the genre. But on Christmas day, Trump quickly returned to the spirit of Christmas Trump — belittling and combative. On Twitter, he wrote, “I hope everyone, even the Fake News Media, is having a great Christmas!” And speaking to reporters he concluded with, “It’s a disgrace what’s happening in this country. But, other than that I wish everybody a Merry Christmas.”*

Since 1946, the Gallup Poll has been asking Americans “What man that you have heard or read about, living today in any part of the world, do you admire most?” Nearly every year, the most admired man is the president or president-elect. In the graphic below, the names in red are most-admired men who were not the sitting president.

(Click for a larger view.)

When a president is not the most admired, it’s because of policy failures (Truman and Korea, LBJ and Vietnam plus domestic strife, Carter and stagflation) or personal failure (Nixon and Watergate). But with Trump, it’s something else. In most of the years in the chart, the president was not really doing anything unusually admirable. The admiration was directed to him not as a person or politician but as the symbol of the nation. For better or worse, he is our Queen. What has kept Trump from the top of the list for both years of his presidency is his unwillingness or inability to play that symbolic role.

(Earlier blogposts about our lack of a Queen are here and here )

---------------------
* I doubt that anyone was surprised that Trump lumped together this supposed national disgrace and the national holiday. After all, at his very first ceremonial occasion, the inauguration, he spoke of “this American carnage.” (In that speech, he assured us that the carnage would “stop right now.” That was two years ago. Apparently, the carnage has not been stopped but merely transformed into disgrace.)

The Ferguson Effect Goes to School

December 27, 2018
Posted by Jay Livingston

The “Ferguson Effect” has disappeared from the headlines. It doesn’t come up much in political discussions. But now, conservatives are pushing the same idea applied to schools.

In case you’ve forgotten, proponents of the Ferguson Effect sketch out this scenario:
  • A White cop kills a Black person, usually an unarmed Black person.
  • Black people protest.
  • The government, dominated by liberals, pressures police to be less aggressive, especially towards African Americans. Sometimes cops who have killed Black people are prosecuted.
  • Cops, to avoid being exposed to prosecution and accusations of racism, withdraw from proactive policing.
  • Crime in Black neighborhoods increases.
  • Conclusion: A policy intended to reduce racism winds up hurting Black people.
The fault, according to this model, is not in our cops but in our liberals.

For the schools version, just substitute teachers and administrators for police; substitute disruption/violence/bullying for crime. The villain remains the same — liberal government policies. The equivalent of consent decrees forced on police departments is an Obama-era policy that threatened schools with loss of funds for disproportionately punishing Black kids.

Betsy DeVos, Trump’s Secretary of Education, is rescinding that policy, and conservatives are cheering. Here is Jason L. Riley in the Wall Street Journal:

Racial parity in school discipline, regardless of who was being disruptive. . . is as silly as demanding racial parity in police arrests, regardless of who’s committing crimes.

If the Obama policy means that Black kids are less likely to be punished for an offense, then Black kids’ misbehavior will increase. The losers will be the other kids in their schools. And since US schools are racially homogeneous, the anti-racism policy will wind up hurting Black people. According to Riley, this Ferguson Effect has already happened since the Obama policy went into effect in 2014.

The result is that more schools have been disciplining fewer students in order to achieve racial balance in suspension rates and stay out of trouble with the federal government. . .  In Oklahoma City, principals told teachers not to request a suspension “unless there was blood.”

The “blood” thing is a great quote, but if you are making generalizations about a nationwide policy, Oklahoma City is a very small n. Elsewhere in his article, Riley cites the report by the National Center for Education Statistics (here), a national survey, so that’s where I went for a broader view. The NCES asks teachers whether misbehavior is undermining their teaching.

(Click on a graph for a larger view.)


The graphs show no sharp changes after 2014. Misbehavior that interfered with teaching began to rise in 2007-2008 and continued to rise at about the same rate. Enforcement of school rules showed no change.

What were the effects of this supposed pullback in punishment? More bad stuff. Here’s Riley again.

After school districts in Los Angeles and Chicago softened their policies to curb suspensions, teachers reported more disorder, and students reported feeling less safe. Following a similar move in Philadelphia, truancy increased and academic achievement fell. Schools in Wisconsin that followed the guidance also saw subsequent reductions in math and reading proficiency.

Riley gives us three cities and one state, each with its own negative outcomes. It’s possible that these outcomes are related — more students feel unsafe so they stay away from school, and achievement falls. But Riley doesn’t tell us whether Los Angeles students, with their lower feelings of safety, also scored worse on tests of achievement. Or whether in Wisconsin, where achievement scores dropped, students also felt less safe. He mentions “disorder” but not actual crime or even bullying. Nor does he tell us the magnitude of these changes.

Were these cities and outcomes representative, or were they merely a few unusually juicy cherries that Riley picked? To get the more general picture, I went back to the NCES survey. Had 2014 brought in a new era of fear?


Fear decreased in the 1990s, leveled off around 2010, and did not rise appreciably after that. There is no discernible effect of the 2014 policy. Bullying shows a roughly similar pattern.


In private schools, less affected by the Obama rules, bullying declined from 2013 to 2015. In public schools, it remained unchanged — hardly the effect Riley claims.

Finally, there is actual victimization. (The data is from the National Crime Victimization Survey.)


Victimization at school increased from 2010 to 2013. In 2014, the year when the new policy was introduced, victimization declined and has not risen since. So what can we say about the unintended consequences of the Obama policy? Where are those bad outcomes claimed by conservatives? On average nationwide, schools have not seen an increase in violence, crime, bullying, or fear.

This doesn’t mean that Riley is totally wrong. In some schools and some cities, decreased punishment of Black kids may have had the effects he claims. But it’s also possible that in some schools, the Obama policy had the good effects its proponents hoped for — Black schoolkids feeling less alienated, less resentful, and more positive towards school. At the very least, the policy did not lead to the nationwide crisis that conservatives would predict.

For the next two years (and perhaps more), thanks to DeVos-Trump, school staff will once again be free to punish whom they wish, how they wish, without having to worry about charges of racism and without having to worry about federal pressure. If conservatives are right, bad things (bullying, crime) will decrease, and good things (attendance, learning) will increase, especially for Black kids.

Will that happen? No doubt, in 2020, our president will claim that because of this policy, schools are now beautiful, the best they have ever been in US history. The Wall Street Journal will publish cherry-picked success stories. The rest of us will have to wait for more systematic evidence.

Social Nostalgia and Myths of Decline, Part I: The Loneliness Fascination

December 12, 2018
Posted by Jay Livingston

  I. The Epidemic That Wasn’t

A couple of weeks ago, Arthur Brooks, in the New York Times, told us that an “epidemic of loneliness” was “tearing America apart.”  Brooks, citing a Cigna survey, brought us the bad news: “Most Americans suffer from strong feelings of loneliness and a lack of significance in their relationships. Nearly half say they sometimes or always feel alone or ‘left out.’”

I blogged my skepticism (here). That number — nearly half — was way out of line with what other repeated surveys like the GSS have found. Last week, Pew issued a “Facttank” report about loneliness. The Pew survey found, as had previous studies, that loneliness went hand-in-hand with feelings of dissatisfaction with family, work, and community. No surprise there. But the estimate of the scope of the problem was much smaller. Did nearly half the population suffer from these feelings? Hardly.

(Click to enlarge.)

Overall, one in ten Americans say they are lonely. Not having a partner makes loneliness more likely. So does not having money. (Nobody knows you when you’re down and out. Or rather, nobody knows 16% of you when you’re down and out, which is really not all that many — nowhere near the nearly 50% Brooks cites, though it is more than the mere 6% among people with higher incomes.)

“Calling Claude Fischer,” I said in that blog post, because for years, Claude has been debunking these claims about loneliness epidemics, comparing them against the available evidence from social science. On Sunday, the Times included his response to the Brooks article.

Loneliness is a serious social problem, but there is no good evidence that it has spiked over the last couple decades or so. . . . We have no current epidemic of loneliness, but we do have periodic epidemics of alarm about loneliness.


The Times published several other letters on this topic (here). Claude’s was the only one expressing any doubt about the loneliness panic.

Even among sociologists, he is in the minority.  The plague-of-loneliness idea and its corollary, the demise of community, have been at the core of important sociology books going back a half century or more.



More tellingly, these three books — David Riesman’s The Lonely Crowd, Philip Slater’s Pursuit of Loneliness, and Robert Putnam’s Bowling Alone — are part of a small, select group: serious sociology books that sold well outside of academia. These books were bought and read even by people who weren’t going to be asked about them on the final. Apparently, Americans like reading about loneliness.

(Continued in the next post.)

Social Nostalgia and Myths of Decline, Part II: Turtles All the Way Down

December 12, 2018
Posted by Jay Livingston
(Continued from the previous post.)

II. A Myth of Decline for Everyone

Obviously, loneliness cannot have been increasing at epidemic levels every year since 1950. Nor could the sense of community have been similarly decreasing. If they had, we would be at 100% loneliness and 0% community. Yet each generation looks to the past as having been a time of greater community and less isolation. What makes this idea so irresistible?

My hunch is that the persistent appeal of this idea of a communitarian past has the same roots as another popular myth of decline — the authoritarian past. According to the authoritarian myth, parental authority has all but disappeared, and kids today are far less obedient than their counterparts of a generation ago. But of course, a generation ago, adults were saying the same thing about their kids, as were the adults of the generation before that about theirs, and so on. Turtles all the way down.

Nearly twelve years ago, I suggested (here) that these myths resemble the 19th century idea in evolution that “ontogeny recapitulates phylogeny” but in reverse. They project the experience of the individual onto the entire society.  In that post, I imagined the man who says, “The things kids say and do today — I could never have gotten away with that with my old man.”

He pictures his own father as much more powerful than he, the speaker, is now. But that’s only because he is remembering his father from the perspective of a child. When he was a child, his father really was much more powerful than he was — so much bigger and stronger, it seemed the father could do whatever he wanted. But when that child grows up and thinks about himself today, he is not looking up from the viewpoint of his own small children. Instead, he sees himself from his own place in the larger world. He knows that he is certainly not the biggest or strongest person around, he knows that his actions are limited by all sorts of constraints that are largely invisible to children. He sees that he cannot control all aspects of his children’s lives.

This perception generalizes to the idea that adults a generation ago were more powerful vis-à-vis children than are adults today.

The same logic underlies the idea of the decline of community. The world of the child is warm, nurturing, and personal; dependence on others is taken for granted. Compared with the world of grown-ups, life is simple. (Of course the child does not make that comparison; grown-ups do.) Adults, by contrast, move among a complicated diversity of separate settings where feelings count for less, where dependence is less tolerated, and where interactions are based on people trying to accomplish their own goals. Childhood is Gemeinschaft, or as that word is usually translated, community.  Then, as we grow up, the Gemeinschaft share of our lives dwindles, leaving us with a nostalgia for those simpler times. Mentally transposing that personal experience to the society at large takes us from “my childhood” to “the good old days,” you know, the time when people knew one another and cared about one another, when life was simpler, and nobody was lonely — just like when we were kids. But of course, when they were kids, their parents were similarly mourning the loss of the good old days, as were their parents. Turtles all the way down. 

There’s an interesting difference between these two myths of decline. The myth of the authoritarian past appeals mostly to those who find authoritarianism appealing. But the decline-of-community finds adherents across the political and cultural spectrum. It’s not just liberal sociologists who patrol the loneliness-community axis. The Brookses at the New York Times who write about it (Arthur and David) are politically conservative but culturally liberal. But go way over to the right, and you’ll hear Hannity, O’Reilly, Glenn Beck, and others mourning the loss of a more Gemeinschaft-like world. From left to right, these observers disagree about just what has caused the crisis (smartphones and social media are the latest villains), but they are united in their assumptions, despite the shakiness of the evidence.

Space and Time

December 9, 2018
Posted by Jay Livingston

Thanks to a link in the Times review of the new season of The Marvelous Mrs. Maisel, my post about the show’s language anachronisms has become the most viewed page on this blog. I hadn’t intended to write another post along similar lines, but then I watched the first episode of the new season. We are still in the late 1950s. Midge (Mrs. Maisel for you non-fans) has separated from her husband Joel, but she still loves him. She calls him from Paris. But he is not so keen on getting back together.


I had just seen folksinger-songwriter Christine Lavin (along with several other old folkies) at a 50th birthday celebration for the radio show “Woody’s Children.” And I recalled the title of one of her songs: “If You Want Space, Go to Utah.” It appeared on her album “The Bellevue Years.” That album was released in 2000. But when had “space” become part of the psychobabble lexicon? Probably the 1990s. That would have been early enough to allow it to become familiar by 2000 yet recent enough to still merit Lavin’s satirical take. My guess was that “space” came out of the EST training that became popular in the late 1970s and spread from there.

I checked Google nGrams using a phrase I thought would capture the idea of emotional space and exclude the more literal meaning — “need some space.”



The curve rises in the late 70s and shoots upward through the rest of the century. But in 1960, when Joel is talking on that rotary phone, the space people had was something that could be measured in square feet.

Someone on Twitter suggested that maybe Joel meant closet space. Could be. Nobody in New York has enough closet space – not now, and not in 1960.

Tom Waits

December 7, 2018
Posted by Jay Livingston

Tom Waits is 69 years old today.

I don’t remember how I found my way to Tom Waits, though it happened fairly late in my listening life, or who showed me the way. Was it the jazz station DJ who played “Emotional Weather Report” early one morning as I was driving to New Jersey? Or my step-brother, the huge Dylan fan? Or was it the friend who sent me a mix tape with the Tori Amos cover of “Time”? (Waits’s songs do not lend themselves to covers. But Amos’s “Time” is an exception. And of course there’s Springsteen’s “Jersey Girl.”)

Waits’s lyrics, like Dylan’s, shine with novel imagery of the familiar world.

You’re east of East St. Louis
And the wind is making speeches
And the rain sounds
Like a round of applause.


But Waits, also like Dylan, often stays in his own room, inviting us in to look at the striking but puzzling pictures on the wall.



Oh and things are pretty lousy
For a calendar girl.
The boys just dive right off the cars
And splash into the street
And when they’re on a roll
She pulls a razor from her boot
And a thousand pigeons
Fall around her feet

Anyway, here’s the original, just Waits (voice and guitar) and an accordion sounding more like a concertina.



Doctor My Eyes

November 29, 2018
Posted by Jay Livingston

You could have seen it coming. A little over a year ago, the University of Wisconsin board of regents passed a Free Speech resolution. The intent, supposedly, was to guarantee “all members of the university community the broadest possible latitude to explore ideas and to speak, write, listen, challenge and learn.”

A fine principle, free speech. Everybody likes free speech, so that’s what the regents had to go with. After all, they couldn’t very well pass a resolution that protected only conservative speech. But that seems to have been their intent. That part about “the broadest possible latitude” — just kidding.

So when a communications professor at UW-La Crosse had author and former porn star Nina Hartley give a talk during “free speech week,” the university system president and the board sent him a letter of reprimand. His “poor judgment,” as judged by the board, will affect his salary adjustment, though the board doesn’t say exactly how much he will have to pay for free speech.

What struck me was not the obvious hypocrisy. As I say, that was predictable (the Inside Higher Ed story (here) has some of the more mealy-mouthed quotes). It was this gem in an op-ed written by one of the regents, Bob Atwell:

Most of us don’t need science to know how devastating pornography is to the mental, physical and social health of those enslaved by it. We can see it in the sad and empty eyes of millions of boys and young men whose zest for life is being sucked into their smart phones.

I was having double déjà vu. First, “we don’t need science.” Back in February, Ross Douthat said pretty much the same thing, though not quite so blatantly. In fact, when prodded, he acknowledged that rape, pregnancy, and abortion had all decreased as porn became more and more widespread. He thought porn made people unhappy, though he allowed that the evidence linking porn with unhappiness was flawed. Nevertheless, he persisted. Porn was just plain bad.

Years before, Irving Kristol, a founding father of neo-conservatism, writing in the Wall Street Journal had argued in language very similar to regent Atwell’s: “we don’t really need social science to confirm what common sense and common observations tell us to be the case. Can anyone really believe that soft porn in our Hollywood movies, hard porn in our cable movies, and violent porn in our ‘rap’ music is without effect?” (For more detail, see my earlier blog post ).

Then there were those “sad and empty eyes” and the lost “zest for life.” Where had I heard that before? I searched my files and found it.

This is a very degrading and destructive habit. There is probably no vice which is more injurious to both mind and body, and produces more fearful consequences than this. . . When the evil has been pursued for several years, there will be an irritable condition of the system; sudden flushes of heat over the face; the countenance becomes pale and clammy; the eyes have a dull, sheepish look.

Back when I taught deviance, I would sometimes read a longer version of this passage to students and ask them to guess. Weed and cigarettes were the usual suspects, but even after I identified the source and date — Our Family Physician published in 1885 — nobody got it. Nor did it help when I would tell them the title of the chapter — “Onanism.”

I’m not all that familiar with the actual research on how porn (or masturbation) affects young men (or women). Its enduring effects on older conservatives seem clearer — a tendency to reject science and replace it with “common sense” and a deep look into the eyes of the afflicted.

Randy Newman

November 28, 2018
Posted by Jay Livingston

Randy Newman is 75 today.

Nearly 20 years ago, I was in a movie theater watching Toy Story 2. It may have been someone’s birthday party. I don’t remember. What I do remember is the song “When She Loved Me.”



As the song ended, I thought: here I am, a grown man surrounded by a bunch of eight-year-olds, and I’m practically in tears because of a song that a cartoon toy doll just sang about a cartoon girl.

If this song does not win an Academy Award, I thought, there is no justice. It didn’t and there wasn’t. The Oscar went to Phil Collins.

The song has none of the irony that pervades Newman’s non-Pixar songs. In those songs, the voice we hear is a flawed character, an unreliable narrator, like the voice in his biggest hit “Short People.” (Some unimaginative listeners, unable to see the satire and irony, took Newman literally and condemned the song.)

The ambivalence haunts even Newman’s love songs, like “Marie,” which seems merely beautiful until you listen to the lyrics and realize that this guy is an abusive drunk, someone Marie would be better off without.

    And I'm weak and I'm lazy
    And I've hurt you so
    And I don't listen to a word you say
    When you're in trouble I just turn away

And yet, his feeling is real.


(I made similar observations in this 2008 blog post after seeing Newman in concert at Carnegie Hall.)

All The Lonely People . . . Are There Really More of Them Than Before?

November 27, 2018
Posted by Jay Livingston

Loneliness seems to have an irresistible appeal. Last weekend, it was Arthur Brooks in the New York Times (here) bringing us the bad news: “America is suffering an epidemic of loneliness.”

The consequences of this loneliness are serious, says Brooks. Riffing off Sen. Ben Sasse’s (R Nebraska) recent book, he lists suicide, drug overdoses, the mail bombs to Trump’s opponents, the mass killing in Squirrel Hill, and above all, political polarization. The title of Sasse’s book is Them: Why We Hate Each Other — and How to Heal.  The title of Brooks’s op-ed is “How Loneliness is Tearing America Apart.” We now live, he says, in “a country suffering from loneliness and ripped apart by political opportunists seeking to capitalize on that isolation.”

Brooks’s other source of information besides Sasse is a report (here) issued last May by Cigna, the insurance company, based on an online survey of 20,000 Americans. It shows, as Brooks says, that “most Americans suffer from strong feelings of loneliness and a lack of significance in their relationships. Nearly half say they sometimes or always feel alone or left out.”

Brooks is not the first loneliness spotter to cry “epidemic.” Back in April, a month before the Cigna report was released, the Times’s other Brooks, David, warned that “Facebook and other social media companies are feeding this epidemic of loneliness and social isolation.” Psychology Today ran an article “Epidemic of Loneliness” in 2009. The term has cropped up in the popular press for decades. Google nGrams shows the phrase first appearing in books in the early 1960s, taking a giant leap and fall in 1980, but holding steady since then.



But every so often, a Brooks or a Sasse runs in breathless with news of a dangerous loneliness epidemic (the nation's “number one health crisis” according to Sasse) —  all apparently unaware that sentries on the loneliness watch ten, thirty, and even sixty years earlier had issued the same alarm.

True, loneliness and social isolation are bad for your physical and mental health, as the Cigna report and much previous research confirms. But Brooks is claiming something else — that the increase in political polarization has been caused, at least in part, by an increase in loneliness. The only trouble with this idea is that there is no evidence that loneliness has been increasing.

Calling Claude Fischer. For years, with each rediscovery of a loneliness epidemic, he has added historical and methodological information in an attempt to calm the waters, usually to no avail. Nevertheless, he persists. As he says in a blog post (“Loneliness Scare Again… and Again… and…” ) inspired by one of the Brookses, it’s a dirty job, but someone has to do it. And in fact, Fischer is no longer a solitary voice crying in the wilderness. Yes, journos on deadline and Senators on the make ignore him, but now more official sources are sometimes echoing what Fischer has been saying. An article in CQ Researcher, an offshoot of Congressional Quarterly, cites sociologists Fischer and Eric Klinenberg, both skeptical about any increase in loneliness. And Sasse’s Senate colleague Mike Lee (R Utah), or whoever is doing the research and writing on his Webpage, says, “It is not at all clear that loneliness has increased over the last several decades.”

Still, we get articles like the one by Arthur Brooks, and Brooks is a man who respects sociological research. Why, in spite of all the evidence, does it seem as though Americans are getting lonelier and lonelier? I have an idea, which I will leave for a later post.

About Joni Mitchell

November 7, 2018
Posted by Jay Livingston

Joni Mitchell is 75 today.

Fifty years ago, liking her music was so cool. But by the end of the century, that had changed, as I painfully realized when I saw “About a Boy.” She had become the punch line to a joke.

It’s not that Joni herself changed, though she did, nor that her music changed, though it did. But what had changed was the liking of her music. It has followed a cycle roughly similar to what Jenn Lena in Banding Together calls “genres,” from “avant-garde” to “traditionalist.”

The boy “About a Boy” is about is Marcus, a twelve-year old who lives with his mother Fiona.

Marcus knew he was weird, and he knew that part of the reason he was weird was because his mum was weird . . . she didn't want him to watch rubbish television, or listen to rubbish music, or play rubbish computer games (she thought they were all rubbish), which meant that if he wanted to do any of the things that any of the other kids spent their time doing, he had to argue with her for hours.

She likes Joni Mitchell, and so does he. The two of them sing Joni Mitchell songs together. The scene in the movie — mother and son in the kitchen, singing not especially well — is painful to watch.

The political and cultural preferences Marcus has adopted from his mother do not do him much good outside the home, especially at his new school.

If he tried to tell Lee Hartley — the biggest and loudest and nastiest of the kids he'd met yesterday — that he didn't approve of Snoop Doggy Dogg because Snoop Doggy Dogg had a bad attitude to women, Lee Hartley would thump him, or call him something that he didn't want to be called.

Into their life comes Will (Hugh Grant in the movie), who makes it his mission to separate Marcus culturally from his mother, to transform Marcus into someone the other kids will not bully. He introduces Marcus to music that is more generationally appropriate, as in this clip.  (I’d embed it here, but the clip is Mystikal, and this post is supposed to be about Joni Mitchell.)

In the end Will is successful. The final lines of the book are reminiscent of the “K-Mart sucks” ending of “Rain Man.”

Will decided to give Marcus a little test. “Hey Fiona. Why don’t you get your music and we can all sing a Joni Mitchell song?”...

But Will was watching Marcus’s face carefully. Marcus was looking really embarrassed. “Please, Mum. Don’t.”

“But Marcus, you love singing. You love Joni Mitchell.”

“I don’t. Not now. I hate Joni Mitchell.”

Will knew then, without any doubt, that Marcus would be OK.

We Still Need a Queen — Now More Than Ever

October 31, 2018
Posted by Jay Livingston

As Durkheim noted long ago, the function of a ritual, regardless of its specific content, is to heighten group solidarity. So the important symbols in a ritual represent the group as a whole. Those symbols are objects, but they are also people — usually the group’s leader. That’s why America needs a queen. Or someone like her.

When Trump announced that we would go to Pittsburgh, the mayor asked him not to come. Many Jewish leaders said he should not come. Thousands of people signed a petition asking Trump to stay away from Pittsburgh. So did leaders of the Tree of Life Synagogue.

Nevertheless, he persisted. Thousands of people took to the streets in protest. The mayor and “the top four Republican and Democratic congressional leaders who were invited to join [Trump] all declined.” Not all of Pittsburgh’s tens of thousands of Jews opposed the visit. The Times reports (here) that “more than 40 ‘members of the Jewish community’” signed a letter welcoming Trump because they like his stance on Israel. Wow, more than forty.

If only we had a queen. Back in 2007, I wrote a blogpost with this same suggestion. I had just watched the movie “The Queen.”

Most European countries, with their long histories of monarchy, have retained a nonpolitical figure as symbolic ruler of the country. In some countries (England, the Netherlands, Norway, Spain, etc.) it’s an actual monarch; in others, it’s a president, who has only ritual duties, while the actual business of running the country falls to the elected prime minister. But in the US, we have this strange system where a partisan politician is also our ceremonial head of state.

The “partisan politician” at the time was George W. Bush. Today “partisan” seems like too weak a word. Trump rarely tries to accommodate the entire nation. He likes winning. . . . and gloating about winning, waving his triumph in the loser’s face. And when he does try to be accommodating, he’s not very good at it.

The family of Daniel Stein, a victim of the attack who was buried on Tuesday, explicitly told inquiring federal officials that they did not want to meet with the president. They cited Mr. Trump’s comments immediately after the shooting that the Tree of Life should have had an armed guard. “It was just a worthless thing to say,” said Steven Halle, Mr. Stein’s nephew. “When something tragic has happened, you don’t kick people when they are down. There should have been an apology.”

“You don’t kick people when they are down.” Well, Mr. Halle, maybe you don’t.

One other observation from that 2007 post now strikes me as quaintly amusing.

An early scene in “The Queen” shows Tony Blair coming to Buckingham Palace. He has just won the election in a landslide, but he will not be prime minister until he kneels before the Queen and is officially requested by her to form a government. As historian Robert Lacey says in his commentary track on the DVD, “People feel it’s good that these politicians have to kneel to somebody to be reminded that they are our servants.”

The president, going before someone who symbolically represents the entire nation, and kneeling. Imagine that, if you can.

Why Not the Vest?

October 28, 2018
Posted by Jay Livingston

Yesterday, Trump blamed the Dodger loss on the manager, Dave Roberts. He shouldn’t have taken out the pitcher.

Trump also blamed the slaughter in Pittsburgh on the Tree of Life synagogue. They should have had an armed guard.

Unlike many of Trump’s statements, these are not lies or untruths. They are counterfactuals about a single event; there is no evidence that can tell us whether they are accurate. It’s unlikely that a similar baseball situation will soon arise. And if, in some future fifth inning, a pitcher who is pitching well tells the manager that he’s tiring, and the manager thinks about replacing him, will anyone remember this game?

Mass shootings are different. There will be more of them in our future — this is America after all — so we will continue to search for policies to reduce the carnage. The armed guard idea is very popular these days, especially among gun lovers — the people who want to increase the sale of guns. After every mass shooting now, they tell us that the only solution is armed guards.

I suppose it’s worth noting that the police who arrived at the synagogue were armed, heavily armed. They were also trained, well trained. Their training and weaponry exceeded that of any guard a synagogue might have had. Yet four of the officers were wounded. Two are still in the hospital. Were it not for their bulletproof vests, police officers too might have been among the fatalities. And therein lies the answer — bulletproof vests.

The assumption behind armed-guard policy is that we cannot do anything about the shooters. We cannot change their psychology, and we certainly cannot — must not — do anything to limit their access to extremely deadly guns. In that spirit, and using the same assumption, I am offering this modest proposal: All schoolchildren, all worshippers, all those who attend concerts or popular clubs, all spectators at movie theaters and sporting events — they should all wear bulletproof vests.

When you go into a synagogue, they usually have a large box so that you can pick out a yarmulke and tallit if you haven’t brought your own. Imagine if Tree of Life synagogue had also had a box of bulletproof vests. Or if Steve Scalise and those other Republican legislators had had the good sense to wear bulletproof vests when they went out on the field to play softball. Think of the death and injury that would have been prevented. At clubs, the person giving you the little bracelet or stamping your hand could also give you a bulletproof vest. Schoolchildren would have a bulletproof vest at home to put on as they leave the house for the school bus.

The NBVA membership would burgeon. States would pass laws promoting the manufacture and sale. Think of the variety as fashion designers get into the arena. Bulletproof vests for all occasions, in all colors. Cute, pink vests for girls to match the cute, pink AR-15s they can now buy (I am not making this up).

Yes, some people may choose to walk around unvested. But hey, some people disable their car airbags and don’t use the seatbelt. If these risk-takers get shot, we will make the same argument about bulletproof vests that our president makes about armed guards.

A vested society is a safe society. That will be one of the slogans. Or, “You can’t stop a bad guy with a gun, so make yourself less vulnerable.” OK, I admit it lacks the macho fantasy element of the good guy with a gun, but that’s true of “shelter in place” and other parts of “active shooter drills.” Anyway, the goal is the same, and the vests will be more effective.

I have seen the future, and it is bulletproof vests for everyone. What a country.

Make America Great  — and Safe! — Again.

“The Deuce” — Old Porn, New Language

October 16, 2018
Posted by Jay Livingston

If you’re old enough, it’s easy to spot language anachronisms in period TV dramas like “Mad Men” and “The Marvelous Mrs. Maisel.” I’m old enough. I notice the terms that we now take for granted but were nowhere to be heard in the wordscape of a few decades ago. (Earlier posts on these shows are here and here.) It’s much harder to remember the opposite — words and phrases from the period that have since disappeared, words that place the scene firmly in its historical context.

I’ve been watching “The Deuce” on HBO. It’s set in the world of West 42nd Street circa 1970, with its pimps and hookers, strippers and porn merchants, cops and gangsters, and assorted others who plied their trade in that neighborhood. Nothing in season one seemed out of place, maybe because the episodes were written or overseen by people old enough to have been bar-mitzvahed by 1970.

In Season Two, Candy (Maggie Gyllenhaal) has gotten into porn as a way to escape the dangers of life as a street hooker. She has gone from being on camera to writing scripts. In Episode 4, we see her at a shoot where an actor complains about his lines, and others support him. The script is bad even by porn standards, they say. Candy agrees.

“I’m gonna try to tweak it,” she says.

No, no, no. In 1977, people didn’t tweak scripts. They didn’t tweak much of anything, but if they did, it was an actual thing you could pinch with your fingers. In porn, it might have been a nipple. Anywhere else, it was most likely a nose. Nothing had changed in the 370 years since Hamlet.* It was only towards the end of the 20th century that people began tweaking less tangible things like systems, colors, or designs.

(Click on the image for a larger view. The graphs show the last few years of each period and the most frequent completions of the phrase “tweak the” for the entire period.)

Candy has ambitions beyond grinding out low-budget, poorly written fuck films. She wants to produce a film with a multi-layered story, with characters, and with a woman’s point of view. She has come up with the idea — a porno version of Little Red Riding Hood — but she realizes that she doesn’t have the talent to write the script. So she meets with a writer. When she reveals what the film will be, she fears that he’ll reject the project. But she’s wrong. “It’s genius,” the writer says.

The trouble is that in 1977 (when the writer of this episode was in second grade), genius was not an adjective. It was a noun and only a noun. Even today, Merriam-Webster online does not recognize genius as an adjective.

I know what people did not say in 1977. But what did they say? What is the language equivalent of the disco suit, cool then but now forgotten? The only thing I can think of is groovy. Yes, there was a brief period — a few weeks back in the late 1960s — when people actually spoke the word without a trace of irony. But what else?

--------------------------------
* Who calls me “villain”? Breaks my pate across?
Plucks off my beard and blows it in my face?
Tweaks me by the nose? (II, ii)