Tom Lehrer – “Sociology”


December 18, 2022
Posted by Jay Livingston

Tom Lehrer has put all his songs online and has ceded all copyright protection.

Performing and recording rights to all of my songs are included in this permission. Translation rights are also included. In particular, permission is hereby granted to anyone to set any of these lyrics to their own music, or to set any of this music to their own lyrics, and to publish or perform their parodies or distortions of these songs without payment or fear of legal action. [The full statement and the songs are here.]

In the movie White Christmas, Danny Kaye sings a song called “Choreography.” It’s not the most famous Irving Berlin song from this movie (guess what is). You’ve probably never heard of it. It’s gently satirical — a commentary on the pretentiousness of hoofers in the dance biz. Once, they simply spoke of “dancing”; now they prefer the inflated term “choreography.”

Lehrer used Berlin’s melody and the structure of the song to do a similar skewering of quantitative social science. His target, as he explains in the introduction to this video, was really political science, but you can’t swap out “choreography” in the Berlin song and replace it with “political science.” “Sociology,” on the other hand, is a perfect fit.

Can We Talk?

December 8, 2022
Posted by Jay Livingston

Molly Worthen begins her column in the Sunday Times Opinion section by quoting a student who said that if she had known her intro sociology class required oral exams, “I’m not sure I would have taken the class.”  Worthen goes on at length (2500 words) in favor of oral exams.  

I think she’s right in principle, though I cannot speak from experience. I had no oral exams as an undergraduate — Worthen is talking mostly about undergrad courses — and even for the Ph.D., my department did not require an oral defense.

“American universities tend to infantilize students,” says Worthen, “taking attendance in class, employing fleets of student affairs bureaucrats to tend to their needs.” She neglects to mention the most infantilizing and bureaucratic practice of all – multiple-choice exams. Bureaucratic because in the interests of efficiency and universalism (objectivity) multiple-choice exams force students to minimize the information they present. Infantilizing because multiple-choice exams treat students as though they are incapable of complex thought. To take a multiple-choice exam, you don’t have to be able to think about and discuss ideas and evidence. You don’t even have to know the material, though it helps.

Multiple-choice exams replace the original goal of education — learning — with the ability to answer simple questions. My favorite example of the difference is again from grad school, in this case the foreign language requirement. The idea underlying this requirement is that not everything relevant in your field is written in English, especially work that is more recent and not written by superstars like Bourdieu or Foucault.

My French at the time was so feeble that I doubt I could have read a newspaper, certainly not Le Monde, probably not even the French counterpart of the New York Post. But to fulfill the language requirement, all I had to do was get a #2 pencil and pass the standardized test from (if I recall correctly) ETS. I don’t know how low the bar was set, but I passed.

A friend who had gotten his degree at Brandeis told me what the language exam there was like. “You go see Hughes [Everett C. Hughes] and he gives you a piece of paper with a citation for an article in a foreign journal. ‘Go read this, come back Wednesday, and we’ll talk about it.’”

As Worthen says, “The most empowering thing a teacher can do for her students has nothing to do with constant surveillance of their academic engagement . . . . It is to simply talk with them, face to face, as fellow thinkers.”

When Chappelle Says It, It’s Funny

November 18, 2022
Posted by Jay Livingston

Dave Chappelle, in his SNL monologue, offered an insight about language that I’ve used a few times in this blog. It’s about adding the definite article “the” to a demographic category.



Here’s how I put it in a blog post seven years ago after candidate Donald Trump (remember those good old days? they’re back) had told an interviewer, “I’d be phenomenal to the women.”

When you add “the” to a demographic group and speak of “the women” or “the Blacks,” you are separating them from the rest of society. Without the definite article, they are included. To say, “In our society we have Blacks, Jews, women. . . . .” implies that they are all part of our group. But, “We have the Blacks, the Jews, the women . . . .” turns them into separate, distinct groups that are not part of a unified whole.

Chappelle got more laughs.

In another post a year later (here),  I quoted linguist Lynne Murphy on the same topic.

“The” makes the group seem like it’s a large, uniform mass, rather than a diverse group of individuals. This is the key to “othering:” treating people from another group as less human than one’s own group.


Turning those individuals into “a large, uniform mass” not only allows for “othering”; it’s also the precondition for paranoid conspiracy theories. Even if, as Chappelle suggests,* there are a lot of Jews in Hollywood, you can still see them as individuals, as Jews trying to turn out successful movies and TV shows. To see them as a cabal conspiring against Kanye or Christians or America, it helps to think of them as “the Jews.”

---------
* See also Joel Stein’s 2008 article “Who Runs Hollywood? C’mon” (here).




Poll Problems — the Wisdom of Crowds or Pluralistic Ignorance

November 6, 2022
Posted by Jay Livingston

In the last few elections, the pre-election polls have gotten it really wrong. Partly that’s because cell phones and low response rates have made sampling difficult. But it also might be that pollsters are not asking the right question. Maybe the usual question — “Who are you going to vote for?” — is not the best way to predict election results.

The most recent episode of NPR’s Planet Money explored this question and in the end tried a less direct approach that some polls are now using. They went to the Marist College poll and got the directors to insert two questions into their polling on local House of Representatives races. The questions were:

  • Who do you think will win?
  • Think of all the people in your life, your friends, your family, your coworkers. Who are they going to vote for?

At the time, on the direct question “Who will you vote for?” the split between Republicans and Democrats was roughly even. But these two new questions showed Republicans way ahead. On “Who will win?” the Republicans were up 10 points among registered voters and 14 points among the “definitely will vote” respondents. On the friends-and-family question, the corresponding numbers were Republicans +12 and +16.

Planet Money sees these results as an example of “the wisdom of crowds” — the idea that the best estimate comes not from the experts but from the collective judgment of everyone with an opinion on the matter at hand. The idea goes back to Galton at the Fair – statistician Francis Galton at the Plymouth (UK) farmers’ fair in 1906.

At the fair, Galton noticed people submitting their guesses on the weight of an ox. Galton the statistician kept track of all the guesses — some 800 in all — and computed the group mean. Galton the eugenicist assumed that the guesses of the ignorant would detract from the overall accuracy, while the guesses of farmers and butchers would be closer. The mean of the group was 1197 pounds; the ox’s weight, 1198 pounds. The group did better than the most expert individual.

That’s from one of the many blogposts I have done on the topic (here). I’ve looked at predictions in the Super Bowl, the Oscars, and securities trading. In some cases, notably the speculation that led to the financial crisis of 2008, the crowd has not been wise.
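
To make the arithmetic behind the wisdom-of-crowds idea concrete, here is a minimal simulation sketch in Python (my illustration, not Galton’s data or Planet Money’s method). It generates noisy individual guesses around a true value and compares the crowd’s mean with the typical individual error; the 800 guesses, the 1,198-pound ox, and the spread of the errors are illustrative assumptions.

import random
import statistics

# Illustrative assumption: 800 fairgoers guess the weight of a 1,198-pound ox,
# each guess being the true weight plus independent error (sd = 75 lbs).
random.seed(1906)
TRUE_WEIGHT = 1198
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(800)]

crowd_mean = statistics.mean(guesses)
crowd_error = abs(crowd_mean - TRUE_WEIGHT)

# Typical individual error, for comparison with the crowd's collective estimate.
median_individual_error = statistics.median(abs(g - TRUE_WEIGHT) for g in guesses)

print(f"Crowd mean: {crowd_mean:.0f} lbs (off by {crowd_error:.1f} lbs)")
print(f"Typical individual guess is off by {median_individual_error:.1f} lbs")

The point of the sketch: when individual errors are independent and point in no particular direction, they cancel in the mean. When everyone’s error leans the same way, as with social desirability or pluralistic ignorance, adding more guesses does not help, which is where the polling story gets complicated.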

Planet Money thinks that the crowd — the people being polled — is wiser than the experts doing the polling and analysis and that Republicans are going to win big.

But there are two other ideas from social science that can also explain the discrepancy between the responses to the questions.
  1. Pluralistic ignorance. This is the cognitive error where people mistakenly think they are in the minority. For example, college students may think that just about everyone else on campus is having great sex and having it frequently when in fact most of their fellow students are in the same unfulfilled boat that they are.

  2. Social desirability. When asked questions in a survey, people avoid answers they think will make them look bad. Ask “How many books have you read this year?” and you’ll probably get an overcount.

If Republicans — the politicians, the right-wing media, Trump, the MAGA hats, et al. — are making the most noise and generally dominating the political discourse, supporting the Democrats may just seem wrong or at least not the sort of thing you want to bring up. If Democrats then are keeping their preferences to themselves, even they will perceive Republicans as the dominant party, and that’s what they’ll tell the Marist pollster on the phone asking who’s going to win. They may also think that most others in their social world are going Red.

It’s complicated. The people you call, even the few who don’t hang up, might give answers that are inaccurate — about what others think and even about what they themselves think. That may always have been true, but in what Planet Money calls “the Golden Age of polling,” roughly from the seventies to 2014, pollsters could make the necessary adjustments. Since then, poll problems have been sort of like Covid — you manage to solve one, and then a new variant comes along.

Mona Lisa — Becoming Great

August 8, 2022
Posted by Jay Livingston

Carol Gillot, at her Paris Breakfasts blog, had a post about visiting the Louvre. When she was last there, she sketched some of the art works and added, “It was very common back in the day to copy paintings at the Louvre.” As evidence, she included this 1833 painting of a man (lower left) and a woman (center) each copying one of the many renaissance paintings in the room.

One of the commenters on the blog noted how difficult it was now to see the Mona Lisa. She was right.

Of course. The Mona Lisa is the greatest painting in the world, or at least one of them, and certainly the most famous.  

But take another look at that 1833 painting. Look at the lowest row of paintings, especially the one in the middle of the canvas.

Yes, it’s the Mona Lisa. Two hundred years ago, it wasn’t the greatest painting in the world. It was just another very good renaissance painting, good enough to merit a place in the Louvre. But it was not as great as the Titian portrait of Francis I, which has a position two canvases higher and closer to eye level.

And now she sits in her own separate room, roped off from the masses who flock to see her beauty and to experience the greatness of the painting. In two centuries, Mona Lisa has raised her game considerably.

Of course that’s ridiculous. The painting didn’t change. But what did? The conventional explanation is that the greatness was always there but that art critics and ordinary people came to perceive and appreciate that greatness only later.

Aside from the arrogance — assuming that we are better at art appreciation than were people in the 19th century — this explanation ignores the social component of tastes and evaluations. Duncan Watts, in Everything Is Obvious ... Once You Know the Answer, argues brilliantly and convincingly that the Mona Lisa’s rise to the top depended on two things – luck and cumulative advantage. Luck — in 1911, the painting was stolen from the Louvre. When it was recovered two years later, it was shown all over Italy, and its arrival back at the Louvre was widely covered in the media (or, as it was called then, the press).

As a result, critics turned their attention to the painting, pointing out all the qualities that made it great and that made the theft and recovery so important. Other people would read these accounts and see for themselves how great the painting was. The snowballing cycle of fame and attention, what social scientists call cumulative advantage, raised Mona Lisa’s position on the charts in much the same way that a song becomes a hit. As it becomes more popular, it gets more air play, and that air play makes the song more familiar and popular, further pushing it up the charts.
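
Cumulative advantage is easy to see in a toy simulation; the sketch below is my own illustration, not Watts’s model. Ten identical “songs” start with zero plays, and each new listener picks a song with probability proportional to its current plays plus one, so an early run of luck compounds even though no song is intrinsically better than the others.

import random

random.seed(42)  # change the seed and rerun to see different winners

NUM_SONGS = 10
NUM_LISTENERS = 100_000

# All songs start identical; plays accumulate only through the rich-get-richer rule.
plays = [0] * NUM_SONGS

for _ in range(NUM_LISTENERS):
    # A song's chance of being chosen is proportional to (current plays + 1),
    # so small early leads snowball into large final gaps.
    weights = [p + 1 for p in plays]
    choice = random.choices(range(NUM_SONGS), weights=weights)[0]
    plays[choice] += 1

for rank, (song, count) in enumerate(
        sorted(enumerate(plays), key=lambda x: -x[1]), start=1):
    print(f"#{rank}: song {song} with {count:,} plays")

Change the seed and run it again: which song ends up on top varies from run to run, but some song always pulls far ahead of the pack. That is luck plus cumulative advantage, with no differences in quality anywhere in the code.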

*          *        *        *

This is much too brief a treatment of Watts’s essay. His tour of artistic successes has stops at the Billboard charts and Harry Potter, all with the same insight. It’s not the qualities inherent in a book, song, or painting that account for its success. There are lots of similar works, indistinguishable in quality, that we’ve never heard of. It’s the lucky break and cumulative advantage that take it from just another painting to GOAT.



Only “Guys and Dolls” in the Building

July 21, 2022
Posted by Jay Livingston

I saw on a local news site that Nathan Lane is moving into the Dorilton, an elegant building on New York’s Upper West Side just a few blocks from where I live. Lane and his husband are paying $4.1 million for the seven-room apartment. 

I have been inside the Dorilton only once, and it was the site of one of the more embarrassing moments in my life.

In May of 1992, my son was invited to a birthday party for twins who were turning two. They were a half year older than he was. We knew them and their parents from the nearby playgrounds, mostly the Elephant Playground in Riverside Park but in cold weather an indoor playground, a large open space on the upper floor of a church. The family had an apartment in the Dorilton. There were only two apartments on that wing of the building. When you got off the elevator, if you turned right, you were in their apartment; if you turned left, you were in the other.

I knew some of the other people at the party — playground parents — but certainly not all.  At one point, I was passing through the foyer, and I came face to face with a man who I was sure I had seen before —  good looking, dark hair, 35-40. “You look familiar,” I said. “Do I know you maybe from the playground?” I thought he might have been an uncle who sometimes took the twins.

“No,” he said, “I just live across the hall.”

“But I think I’ve seen you someplace,” I said.

“Well, I’m an actor, so maybe that’s it.”

Maybe so, but where had I seen him? On TV? A commercial?  New York is full of actors, and most of them are, to put it euphemistically, between roles — waiting tables and going to auditions. So not wanting to embarrass him, I asked as tactfully as I could what he was doing these days.

“I’m in the new production of ‘Guys and Dolls.’”

I was too embarrassed to admit that I knew nothing about this production — the staging, the stars, etc. — except that it had opened just a few weeks before to rave reviews. I guess this guy had gotten lucky and landed a part. I didn’t want to reveal my ignorance, but I did know the show pretty well, so I asked, “What role do you have?”

“I’m Sky Masterson.”

Oh my god. He was the star of the show – well, one of the four stars: Peter Gallagher. He looked familiar because two or three years earlier, I had seen him in the movie “Sex, Lies, and Videotape,” where he played one of the four main characters. I said something and slinked away. OK, it probably didn’t look like slinking. It looked like moving on, mixing, coffee in hand. But it felt like slinking. In the same way, the questions I’d asked him probably didn’t seem offensive or denigrating to him, but in my mind, I knew that I was treating a Broadway star as though he were merely one of the thousands of unsuccessful hopefuls.

Eventually, the Gallagher family moved out of the Dorilton and went back to Los Angeles. But here is where we come full circle. The other important male character in “Guys and Dolls” is Nathan Detroit, and in that 1992 production, the part was played by Nathan Lane.


--------------------
1. Here is the sociology I cropped out of the above narrative and have relegated to this long footnote:

Why was that incident embarrassing?

 Embarrassment, says Goffman in his famous essay on the topic, is about identity. “Identity” may be too grand a term here – “being a certain kind of person” would do — but “identity” is the term Goffman uses.

In a social situation, people must act in accordance with the identity they claim so that others will ratify that identity. If there’s a glitch on either side, you get embarrassment. Often, embarrassment disrupts a situation when a person does something that casts doubt on their “projected identity.” It’s hard to project an identity as a person who knows the norms of dress and decorum if you’re standing there with your fly not zipped.

But embarrassment also happens when we unwittingly fail to acknowledge or ratify someone else’s identity. This includes mistaken identity, like greeting someone warmly who turns out to be a total stranger, or making a remark to the “wrong” person. It also includes not knowing the relevant aspects about the other person’s identity, like the fact that they are the star of the biggest Broadway hit of the season.

2. Peter Gallagher made an appearance in this blog a few years ago (here) in a post with a video of him making the cast recording of the show.
 
3. The title of this post alludes to the Hulu TV series “Only Murders in the Building.” Nathan Lane will be a regular in the cast next season. The fictional building in that show is The Arconia. In real life, across Broadway from the Dorilton and two blocks north is a building called The Ansonia. It appears in several Hollywood films. Walter Matthau lives there in “The Sunshine Boys” as do Jennifer Jason Leigh and Bridget Fonda in “Single White Female.” In real life, I live there.

Gun Possession Law: I, the Jury

July 15, 2022
Posted by Jay Livingston

I thought that the Supreme Court decision in the recent gun-possession case was a bad decision. I still do, but as I listened to a recent podcast — Lulu Garcia-Navarro’s “First Person” (here) — my thinking about it changed. 

Sharone Mitchell Jr. is a public defender. He’s Black. He grew up on the South Side of Chicago. He knows the dangers of guns. When he was in middle school, a classmate of his was shot. Another kid he was friendly with had a gun.

And yet, as he says on the podcast, he supports the Supreme Court’s recent decision that allows people to carry guns. His reason is that gun-possession laws are used mostly against Black men who are not criminals and who are merely trying to protect themselves.

There is just the random threat of violence growing up in the neighborhood, right, this idea of gangs and getting jumped, getting jumped for your Jordans or getting jumped for your Starter jacket. I think people had an interest in keeping themselves safe. My friend who showed me my first gun was of that same mindset. Like this is what I’m going to do to make sure that I protect myself.

Mitchell’s experience as a public defender provided him with more evidence to support that view.

As I became a more experienced attorney, more and more of our cases, my cases, involved guns. . . actually gun possession cases. They are people who are accused of illegally possessing a firearm. That was the vast majority of my cases.

Lulu Garcia-Navarro: And just to be clear, that’s the only crime these people are being charged with a quarter of the time, just having a gun?

Sharone Mitchell Jr.: Illegally having a gun, yeah.

I remembered my own encounter with gun laws. In the 1980s, I served as a juror on a gun-possession case. The defendant was a young Black man, and the case sounded very much like the ones Sharone Mitchell defended.

Lulu Garcia-Navarro: So can you talk me through the circumstances in which people are being arrested for gun possession not involving another crime? What typically draws the attention of the police in the first place?

Sharone Mitchell Jr.: Typically a search, you know, an encounter. Police could pull people over in a car. They could stop people on the street. We know that certain communities are policed very heavily. There’s lots of contact folks will have with police. So we’re talking young Black men in very particular neighborhoods.

That was my case all right. Harlem, one a.m. Three Black men in a gypsy cab, two in the back seat, one (the eventual defendant) in the front passenger seat. The driver too is Black. They make a brief stop for cigarettes, and as they pull out from the curb, the cops pull them over. They say that the cab had pulled out from the curb illegally, but that was clearly a pretext. The real reason was what Mitchell says: Black men in a Black neighborhood.

The driver, the DA’s main witness, testifies that when the man in the passenger seat saw the flashing light, he said, “Oh, shit,” and put something under his seat. The cops searched the car and under the seat they found a gun.

It seemed like a strong case, and I wondered why it hadn’t been pleaded out to some lesser charge. I knew that in New York illegal possession carried a mandatory one-year minimum sentence. So my first assumption was that our defendant was a bad guy, a criminal well-known to the police, a guy with a string of arrests and maybe a few stretches in prison, but that the gun charge was what they would get him on. At least he’d be off the streets for a year.

Instead, he seemed much more to fit Sharone Mitchell’s typical case. He took the stand in his own defense, and while the DA could not bring in any criminal history, she could ask about employment history. But our defendant had no long gaps between jobs that would indicate prison time. Even the DA allowed that the defendant was carrying the gun for protection. He had been mugged recently, and he was going to Harlem at one in the morning. But the law is the law.

In the jury room several people asked pretty much the same question that Mitchell raises: Hey, this is New York City.  Murders, robberies, assaults. Bad people doing bad things. Why are they wasting our time with a case like this?

We deliberated for about an hour.

Not guilty.

Majority Rule and the School Curriculum

June 4, 2022
Posted by Jay Livingston

Should schools teach things that a lot of parents don’t like even if those things are true? If most parents are creationists, should the science curriculum nevertheless teach about evolution or mention that the earth is four billion years old rather than 10,000? Or should the curriculum avoid anything that contradicts the views of parents?

When it comes to the facts that children are taught, Reihan Salam wants majority rule.

And here the big challenge is whether or not you have public institutions that are advancing curricula and ideas that are basically going against freedom of conscience. Do we have situations in which people are being compelled to adhere to certain ideas, certain controversial ideas?

(From his recent interview with Ezra Klein.)

I don’t know where Reihan went to school, but in the schools I went to you were “compelled” to learn what was taught. Well, you weren’t compelled. But you would do better on the exams if you did learn it. Even if your conscience told you that Aristotle’s four elements theory was spot on, your chem teacher went against your freedom of conscience and compelled you to learn about the periodic table. The exams didn’t have any questions about earth, air, fire, and water.

Reihan goes on:

And I think that that’s something where it is good and healthy to have transparency in school curricula. I think it is good and legitimate to believe that parents should have access to good, reliable information about what is being taught and about the ideological content of what is being taught. This is obviously a very contentious issue, but I think that it goes part and parcel with our general belief in educational pluralism. That is the idea that schools tend to work best when they’re broadly aligned with the values and sensibilities of families.

It sounds so good. Who could be against freedom of conscience, against transparency or good, reliable information, or pluralism, or families?

Of course, Reihan isn’t talking about creationism or phlogiston theory. He’s talking about Critical Race Theory, but the issue of who designs the curriculum is basically the same.  What Reihan is offering is a calmer and more reasonable-sounding version of what louder right-wing politicians are saying. Much like the con man in “The Music Man,” they are telling the parents, “You got trouble my friends, with a capital T, and that rhymes with C and that stands for Critical Race Theory.” Twenty years ago, politicians like these were railing and legislating against “Sharia law,” which they claimed posed an imminent threat to freedom, family, and apple pie. Now it’s CRT. Plus ça change.

Should schools teach about the crucial place that race, especially the treatment of African Americans, has in American history and in the American present? It’s not a pleasant story, and it does not show White people as being predominantly noble. Not surprising that it makes some people feel uncomfortable or that they would prefer the “patriotic education” promoted by Donald Trump.

And if that’s what they want, in Reihan’s book that’s what they should have. He doesn’t say exactly what he means by “educational pluralism.” But it certainly suggests that “the values and sensibilities of families” should shape the curriculum. In many cases, that means that kids will get a Whitewashed version of US history, a version that does not make the majority feel uncomfortable.

But wait. Aren’t we past that? Don’t we acknowledge that slavery was bad and that the secession in order to preserve slavery was wrong? What about all those statues that have been torn down?

The same day that the Reihan Salam interview appeared, sociologist Peter Moskos tweeted a photo he had taken of a sign in Greenville, SC, a city of 70,000, 70% White, 23% Black. (The Greenville County population is 533,000, 76% White, 18% Black.) The sign was next to a statue of Robert E. Lee.

“Dedicated in reverence and admiration for their courage and integrity to the five signers Ordinance of Secession from Greenville County.”

Here’s the text Peter tweeted below the photo:
I really did not expect to see this. My naiveté. "In reverence and admiration"? These aren't founding fathers. These were literally traitors.

I don’t know what the history curriculum in the Greenville schools compels students to learn, but I would guess that, like the sign and statue, it is, as Reihan says, “broadly aligned with the values and sensibilities of families.” Well, the 70% of families that are White, not the 23% that are Black.


Guns — How Addiction Makes Sense, and Doesn’t

May 25, 2022
Posted by Jay Livingston

A useful definition of addiction: trying to solve a problem by doing more of what caused the problem in the first place.

That’s the definition that came to mind when I read the response of politicians like Sen. Ted Cruz and Texas Lt. Gov. Dan Patrick to the massacre in a Uvalde, TX elementary school. They want more guns.

It’s the good-guy-with-a-gun solution, and it seems to make sense. Once you have created a world where guns are a big problem because every bad guy can arm himself with military assault rifles, more guns seems like a logical solution.

I studied compulsive gamblers for my dissertation research. The men I hung out with certainly seemed to be in the grip of an addiction that kept them from thinking clearly about their problem. There wasn’t a lot of scholarly literature on the topic then (a big time-saver if you’re a grad student writing a dissertation). The Freudians talked about unconscious desires, and the behaviorists talked about reinforcement schedules. This was long before the days when brain activity would light up the fMRI screens.

What nobody considered was that compulsive gambling makes sense.
 

Does it make sense, is it rational, for a man to bet twice his weekly salary on a basketball game? From a distance, that seems crazy. But suppose he is already thousands of dollars in debt and has payments falling due soon — rent, phone bills, loan sharks. He needs a lot of money fast. He can’t get it from friends, loan companies, and banks any more. He has already used up his credit with them. A bet on the right team though could solve a lot of these problems or at least give him some breathing room. Is it possible? Of course it’s possible. He knows that he has made winning bets before. He knows that thousands of people will make that bet tonight and win. Of course, other thousands will lose, but all he has to do is be in the first group. And if he loses, well, the financial pressure he’s under is already so great that losing another few thousand will not substantially change his life.

Most of the gamblers I knew were, to varying degrees, in Gamblers Anonymous, a program that promotes abstinence as the only solution. Many of the men (there were no GA women in those days) said that initially they didn’t think they could ever get out of debt on just their regular income, but gradually they had done so. None of them had ever gambled his way out of debt, nor had any of their gambling buddies. They recognized the dream solution of the big score as a tempting but dangerous fantasy.

Many of them had made a big score occasionally. They took delight in recounting these, like the guy who spent ten minutes telling me how he had once handicapped the exact order of finish of the eight dogs in a race at Raynham. You can get a nice payout when you hit an exacta or trifecta. But he wound up in deep debt and eventually in GA.

The NRA has a similar storehouse of good-guy-with-a-gun stories. Like the big-score stories of the men in GA, these frame “more guns” as a rational solution to the problem of gun violence.* Texas Governor Greg Abbott, back in 2015, touted more guns as a worthwhile goal.


Thousands of Texans got the message. This year, one of them was Salvador Ramos. 

------------
* In “Arise Heroes” (here) Seth Brady Tucker, who grew up in Wyoming with guns and served in the first Persian Gulf War, explains why the good-guy-with-a-gun is largely a fantasy.

Making “I Won” the Default

May 19, 2022
Posted by Jay Livingston

“Dr. Oz should declare victory. It makes it much harder for them to cheat with the ballots they ‘just happened to find.’” So posted Donald Trump on his social media platform Truth Social.*

He may be right. By declaring victory, you make that the default outcome. You put the burden of proof on the other side. That was Trump’s strategy in 2020. He started claiming victory before the election. Thus, his claims of victory after the election were merely a continuation of an established “fact,” even though that fact was established only by Trump’s repeatedly asserting it. That made it easier for his supporters to remain convinced that he won and to believe all his claims about fraudulent vote counts. It also apparently has raised doubts even among those who were not ardent Trump supporters.

In the pre-Trump era, a candidate in Dr. Oz’s position would say something like, “Well, it’s very close, and we’ll have to wait for all the absentee ballots. But when all the votes are counted, I’m sure that we will have won.” That is in fact the situation that exists.

Or he could play the Trump card and declare victory – loudly and frequently, on TV and on Twitter.  If the final tally shows McCormick winning, that result will seem to go against an established fact. And even if courts and recounts uphold the result, Dr. Oz will avoid being labeled a loser.

Maybe this same strategy would work in other areas. I imagine Mark Cuban, owner of the Dallas Mavericks, declaring on Tuesday that a Mavericks victory the next night was certain. Then, after the game, which Golden State won 112-87, he could claim that there was basket fraud – that many of the Warriors’ points were “fake baskets.” He could get Dinesh D’Souza to make a film showing nothing but Mavericks’ baskets and the Warriors’ misses. He could call up the scorekeeper and tell him to “find me just 26 more points.”

OK, maybe we’re not there yet in basketball. But in politics this is another area where Donald Trump may have a lasting influence.  I expect that more politicians will use the strategy of declaring victory and then claiming voter fraud. The gracious concession speech will become a rare event.

----------------------
* I think that they call these posts “truths.” Twitter has Tweets; Truth Social has Truths. I don’t think they have yet come up with a verb equivalent to Tweeting. “Dr. Oz should declare victory,” Donald Trump truthed?

“Julia” — Serving Up Words Before Their Time

May 4, 2022
Posted by Jay Livingston

“The Marvelous Mrs. Anachronism” (here) is the post in this blog with by far  the most hits and comments.  And now we have “Julia,” the HBO series about Julia Child and the creation of her TV show “The French Chef.” It’s set in roughly the same time period, the early 1960s. And like “Mrs. Maisel,” it offers a rich tasting menu of anachronisms.

I don’t know why the producers don’t bother to check their scripts with someone who was around in 1962 – a retired sociologist, say, who is sensitive to language – but they don’t. Had they done so, they would have avoided the linguistic equivalent of a digital microwave in the kitchen and a Prius in the driveway. They would not have had a character say, “I’m o.k. with it.” Nor would an assistant assigned a task say, “I’m on it.” Nobody working with Julia would be excited to be on the front lines of “your process.” “Your method” perhaps or “your approach” or even “all that you do,” but not “your process.”

If you’re a TV writer, even an older writer of fifty or so, these phrases have been around for as long as you can remember, so maybe you assume they’ve always been part of the language.

But they haven’t. Sixty years ago, people might have asked how some enterprise made money or at least made ends meet. But they would not have asked it the way Julia’s father asks her: “What's the business model down there? Does public television even* have a business model?” 

In her equally anachronistic reply, Julia says, “Nothing's a done deal yet.” That one too sounded wrong. I don’t recall any done deals in 1962.

To check my memory, I went to Google nGrams. It shows the frequency of words and phrases as they occur in books. Most of the phrases that seemed off to my ear did not appear in books until the 1980s. A corpus of the language as spoken would have been better, and there’s a lag of a few years before new usages on the street make it to the printed page. But that lag time is certainly not the twenty years that nGrams finds.


In another episode, we hear “cut to the chase,” but it was not till the 80s that we skipped over less important details by cutting to the chase. (Oh well, at least nobody on “Julia” abbreviated a narrative with “yada yada.”) Or again, a producer considering the possibilities of selling the show to other stations says, “This could be a game changer.” But “game changer” didn’t show up in books until four decades after “The French Chef” went on the air.

“This little plot is genius,” says Julia’s husband. It may have been, but in 1962, genius was not an adjective. An unusual solution to a problem might be ingenious, but it was not simply “genius.” Even more incongruous was Julia’s telling the crowd that shows up for a book signing in San Francisco, “I'm absolutely gobsmacked by this turnout.”  Gobsmacked originated in Britain, but even in her years abroad, Julia would not have heard the term. Brits weren’t gobsmacked until the late 1970s, with Americans joining the chorus a decade or so later.

I heard other dubious terms that I did not know how to check. “The Yankees are toast,” says one character, presumably a Red Sox fan. It’s not just that in 1962 the Yankees were anything but toast, winning the AL pennant and the World Series; I doubt that anyone was “toast” sixty years ago.

The one that bothered me most was what Julia’s friend Avis says after making a small play on words. She adds, “See what I did?” I’m pretty sure this is a very recent usage and was not around in 1962. I’d just as soon not have it around today.

Finally, in the latest episode, which I just now saw and which inspired me to write this post, we have the anachronism that nobody notices — “need to” instead of “should” or “ought to” or other words that carry a hint of what is right or even moral. In “Julia,” a young couple meet for lunch at a diner. It’s a blind date, and as they talk, it becomes clear that they are a good match. They talk some more, and we cut to a different plot line. When we come back to the diner, the couple are still there, still talking, but they are now the only ones left in the place. The waitress comes to the table and tells them patiently, “You need to go.”

What she means of course is that she needs for them to go. In 1962, she would not have phrased it in terms of their needs. She would have said, “You have to go.”

-------------------------------
* “Even” as an intensifier in this way may also not have come into use until much later in the century. See this Language Log post on “What does that even mean?”





Robert Morse, 1931 - 2022

April 21, 2022
Posted by Jay Livingston

The opening sentence of the Times obit for Robert Morse mentions his roles in both “How to Succeed in Business Without Really Trying” in 1961 and “Mad Men” forty-six years later. Those Morse roles were linked in subtler ways — the characters’ career trajectories and their clothing choices, as I pointed out in a 2010 post which I am hauling out of the archives on this Throwback Thursday.

The post was mostly about America’s concern with “conformity,” but Morse’s performance in the video from “How to Succeed” is worth two minutes and fifty-three seconds of your time even if you’re not considering the cultural-historic questions.

*******************

July 1, 2010
Posted by Jay Livingston

“How to Succeed in Business Without Really Trying” was on TCM Tuesday night in honor of the centenary of Frank Loesser’s birth. The Broadway show opened in 1961, sort of a musical comedy version of William H. Whyte’s 1956 best-seller The Organization Man.


Loesser’s musical was light satire; Whyte’s book was sociology. But the message of both was that corporations were places that demanded nearly mindless conformity of all employees. Or as Mr. Twimble tells the ambitious newcomer (J. Pierpont Finch), “play it the company way.”
FINCH: When they want brilliant thinking / From employees
TWIMBLE: That is no concern of mine.
FINCH: Suppose a man of genius / Makes suggestions.
TWIMBLE: Watch that genius get suggested to resign.
Conformity was a topic of much concern in America in those days, in the popular media and in social science (as in the Asch line length experiments). Today, not so much.
the Organization Man, if he ever existed, is dead now. The well-rounded fellow who gets along with pretty much everyone and isn’t overly brilliant at anything sees his status trading near an all-time low. And all those brilliant screwballs whose fate Whyte bemoaned are sitting now on top of corporate America.
So wrote Michael Lewis in Slate in 1997.

That’s one version. I don’t really know if the corporate climate is different today (where’s an OrgTheorist when you need one?). No doubt, “brilliant screwballs” can find safe haven in corporations, at least in areas that require technical brilliance, and some may wind up at the top. But I wonder how such quirkiness survives in other areas like sales. Barbara Ehrenreich, in her recent book Bright-Sided, looks at corporations today – with their motivational speakers and “coaches” – and sees the same old demand for cheerful, optimistic obedience, especially in this era of outsourcing and downsizing.
The most popular technique for motivating the survivors of downsizing was “team building” – an effort so massive that it has spawned a “team-building industry” overlapping the motivation industry. . . .
The literature and coaches emphasize that a good “team player” is by definition a “positive person.” He or she smiles frequently, does not complain, is not overly critical, and gracefully submits to whatever the boss demands.
Or as Frank Loesser put it,
FINCH: Your face is a company face.
TWIMBLE: It smiles at executives then goes back in place.
Here’s the whole song from the 1967 film version:



The movie has another uncanny resemblance to today. The costumes and even the sets look like “Mad Men” – not surprising since both are set in the New York corporate world of the early 1960s. But there’s more. In the Broadway show and then the 1967 film of “How to Succeed,” Robert Morse (Finch) rises to become head of advertising. Fifty years later, in “Mad Men,” Robert Morse (Bert Cooper) is the head of an advertising agency. (And he’s still wearing a bow tie.)


I asked my son, a “Mad Men” watcher, to look at the 1967 movie and try to identify the actor playing Finch. He couldn’t, at least not without a hint or two.

Baby Names and the Value on Distinctiveness

March 15, 2022
Posted by Jay Livingston

Namerology, the former Baby Name Voyager (here), is a great resource for anyone interested in graphs showing trends in baby names in the US. It uses the data from the Social Security Administration, but its graphs are much better than those you can create on the SSA Website. For instance, it allows you to compare names.

I wanted to explore the idea that the diversity of names is increasing. The most popular names today are not nearly as dominant as popular names in the past. It’s like TV shows. The ratings or share-of-audience of today’s most popular shows are numbers that twenty years ago would have marked them for cancellation. Compare the most popular name for girls born in the 1990s,  Emily, with her counterpart in the 70s, Jennifer.


 Jennifer’s peak was three times higher than Emily’s.

Jennifer also stacks up well against the top name of the sixties (Lisa) and of the eighties (Jessica).


Jennifer’s popularity was extraordinary. Jessica was at the top for nine of the eleven years from 1985 to 1995. And Lisa held top spot for eight years, 1962 - 1969. But Jennifer was number one for fifteen years, 1970-1984. We will probably never see her like again.

But then along comes Mary. The numbers for Mary back in the day dwarf those for Jennifer at the height of her popularity.

I read this graph to mean that the way we think about names has changed. Today, we just assume that you don’t want to give your kid the same name that everyone else has. You want something that’s different, but not too different. But a hundred years ago, distinctiveness was not an important criterion for parents choosing a name. Year in, year out, the girls’ name most often chosen was the same — Mary.

Names may be only part of a more general change in ideas about children. Demographer Philip Cohen (here) speculates that compared with parents in the early 1900s, parents in the latter half of the twentieth century saw each child as a unique individual. After all, children were becoming scarcer. From 1880 to 1940, the average number of children per family declined from 4.2 to 2.2. And while Mary remained the most popular name throughout that period, its market share declined from over 30,000 per million to about 20,000 per million.

The real shift starts in the 1960s. It may have been part of the general rejection of old cultural ways. But this was also the end of the baby boom. With new birth-control (the pill), having children became more a matter of choice. Family size declined even further. Each child was special and was deserving of a special name.
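
For anyone who wants to reproduce per-million figures like these, here is a minimal sketch in Python. It assumes you have downloaded the Social Security Administration’s national baby-name files (names.zip, one yobYYYY.txt file per year, each line reading name,sex,count); the directory path and the example years below are placeholders. The SSA files omit names given to fewer than five babies in a year, so the denominator slightly undercounts total births, but for popular names that barely matters.

import csv
from pathlib import Path

# Assumption: the SSA national files (yob1880.txt, yob1881.txt, ...) from
# names.zip have been unzipped into ./names/ (adjust the path as needed).
DATA_DIR = Path("names")

def share_per_million(name: str, sex: str, year: int) -> float:
    """Births given `name` per million recorded births of that sex in `year`."""
    total = 0
    name_count = 0
    with open(DATA_DIR / f"yob{year}.txt", newline="") as f:
        for row_name, row_sex, count in csv.reader(f):
            if row_sex != sex:
                continue
            n = int(count)
            total += n
            if row_name == name:
                name_count = n
    return 1_000_000 * name_count / total if total else 0.0

# The years here are arbitrary examples; pick any years the SSA files cover.
for year in (1920, 1974, 1999):
    print(f"{year}: Mary {share_per_million('Mary', 'F', year):.0f}, "
          f"Jennifer {share_per_million('Jennifer', 'F', year):.0f}, "
          f"Emily {share_per_million('Emily', 'F', year):.0f} per million girls")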

On Becoming a Beatles Listener

March 5, 2022
Posted by Jay Livingston

In the spring of 1964, I was getting a haircut in a barbershop in Tokyo. In the background a radio was playing American rock and pop. The sound was familiar even though I didn’t recognize any of the songs. I didn’t know any of the latest hits because I had spent the previous seven months in a small town up in the mountains. The family I was living with may have had a radio, but I cannot recall ever hearing it or what it played. The music I would hear on the variety shows on TV was all Japanese pop or sometimes Japanese versions of American hits. To this day, there are certain songs that were popular then — “Devil in Disguise” or “Bye-bye Birdie” — which in my mind’s ear I still hear in Japanese rather than English.

As I sat there, not really paying attention to the music, I realized that the song now playing was repeating the words “Yeah, yeah, yeah.” So this is it, I thought. This must be the Beatles that I’ve been reading about.  The Japan Times, the English-language newspaper that came daily to the house, had run stories about them. It also showed the Billboard Top-20 each week, and I would see Beatles songs in several of the top slots. But in that barbershop that day, to me they sounded like the rest of the music that had been coming from the radio — conventional rock and roll.

I thought of that moment last month as I was reading David Brooks’s New York Times piece “What the Beatles Tell Us About Fame” (here).  “How did the Beatles make it?” Brooks asks, and he gets the answer right. Partly. He sees that it’s not just about the music. Whether that music gets heard — recorded, distributed, played on the radio — depends on lots of non-musicians.

But hearing is not the same as liking. So how do the people who heard this music decide that they liked it, and liked it a lot? Brooks has a simplistic model for this process. “If a highly confident member of your group thinks something is cool, you’ll be more likely to think it’s cool,” as though the Beatles happened because influencers (they weren’t called that in 1963) were at work promoting them. But is that how people form their judgments of music? Surely we don’t think “Cool people like this so I’ll like it too.”

To understand how so many people come to share the idea that something is really great, we need a model more along the lines of Howie Becker’s “On Becoming a Marijuana User.”  In that famous article, Becker identifies three necessary steps: learning the technique of smoking weed, learning to identify the effects, and learning to define those effects as pleasurable.

Of course, listening to rock and roll doesn’t require any special technique. But what about identifying the effects? As my barbershop experience illustrates, recognizing the Beatles is not automatic. Just as Becker’s marijuana users had to learn to perceive the effects of weed,* listeners had to learn to distinguish the Beatles sound from other music. That wasn’t the explicit goal of the people who listened to Beatles songs over and over, but it was an important side effect.

As for defining what we are hearing as great, the influence of others is not nearly so evident as it was among Becker’s pot smokers. In the diffusion of popularity, it doesn’t seem like anyone is learning or teaching. People around us are grooving to the Beatles, and so are we. Besides, millions of others have pushed these songs to the top of the charts, confirming our judgment that this stuff is the best. Popularity cascades upon itself. The more that the music becomes popular, the more of it you hear. The more familiar it becomes, the better it sounds. The process is less like instruction, more like contagion.

In November of 1963, my social geography — living in my small town in the Japan Alps — had quarantined me from the emotions that flooded Americans when Kennedy was assassinated. I did not feel what I would have felt if I had been in the US. (My 2013 post about that experience is here.) Five months later in that barber shop, I was listening to the Beatles, but I had not yet become a Beatles listener.

-----------
* Becker was doing his research among musicians in Chicago in the late 1940s and early 50s. Marijuana back then had nothing like the potency of today’s cultivars. Yet even now, other more experienced users are important in showing the neophyte user how to ingest the drug and how to appreciate the effects. Maureen Dowd’s famous unaccompanied fling with edibles (here) is a negative case in point.

Cotton-picking — Real and Metaphorical

February 24, 2022
Posted by Jay Livingston

Words change. Usually the change in the literal meaning is so gradual that it’s hard to see. More visible may be the change in the political and emotional meanings that surround a word, and those meanings often depend on who is using it.

“Get your cotton-picking hands off that” is what the substitute teacher at Farmington High in Michigan said. The Black student she said it to couldn’t believe his ears. “What did you say?” She repeated it and explained. “This kind of comment was a very common comment. And it was a very innocent comment. It was not meant to be offensive in any way.”

The verb tense is important. The adjective “cotton-picking” was very common. I don’t know how old the substitute teacher is, but on the video, she does not sound young. (You can hear her on the video posted to TikTok.) She probably thinks of the word the way it was used in the previous century. Fifty years ago, cotton-picking was a word intellectuals might use to make a statement seem down-to-earth. Milton Friedman in 1979: “before government and OPEC stuck their cotton-picking fingers into the pricing of energy.” A character on a sitcom  — a White, non-Southern character — might say, “Are you out of your cotton-picking mind?” It was funnier that way, believe me. It was a way of expressing disapproval but in a friendly, joking manner. White people used it as way to sound folksy and informal, perhaps in the way some well-educated, non-Southern people in this century have adopted “y’all.”

Google nGrams is not a good source on this one, but for what it’s worth, it shows cotton-picking as an adjective increasing till 1940 and declining steadily thereafter.

I suspect that nearly all of those instances from before 1950 were literal — things like references to cotton-picking machines. The metaphorical, disparaging cotton-picking came later. You can see this in the line for cotton-pickin’ since the dropped-g version would not have been used to talk about farm equipment. The earliest use the OED could find for this meaning was in this more colloquial spelling. It appeared in 1958, in the New York Post, which was then a liberal newspaper.* “I don’t think it’s anybody’s cotton-pickin’ business what you’re doing.”

Of course, using cotton-picking this way worked only for people whose lives and world lay far from the actual picking of cotton.  That was the world of the Michigan substitute teacher, and she used the word without ever thinking about its origins, in the same etymology-ignoring way we all speak. But for the Black kid, the word evoked the history of slavery and post-bellum racial exploitation in the Jim Crow South. And there was nothing friendly, funny, or folksy about it.

The teacher later said that she now realized that the term was offensive, but she maintained that her motives and intentions were innocent if ignorant. If only she had been — what’s the word here? If only there were a word that means aware of racist aspects of US history, aware of how privilege even today has a racial component, and sensitive to the ways those things might look to Black people. Wait, there is a word — “woke.” Or maybe I should say that there was a word. The political and emotional connotation has changed rapidly, and so have the people who use it. The people who use woke now are White, and they are waving it about as something to be rejected.
--------------

*The joke back then was that a front-page weather story in the Post might run with the headline: Cold Snap in City. Negroes, Jews Hardest Hit.


Applied Sociology, the Zeitgeist, and Why I Am Not Rich

February 20, 2022
Posted by Jay Livingston

I wonder how  “Sex as Work” happened.  (For those who don’t follow this blog regularly, i.e., everybody, I discussed this 1967 Social Problems article by Lionel Lewis and Dennis Brissett in the previous post.)

I  imagine one of the authors mentioning, after a second or third beer one evening, that he had read a “marriage manual” not too long ago, whereupon his colleague confesses that he has too, though not the same one. “Not much fun in there that I could find.” “Not in the one I read either. What a letdown. I wonder if there are any others.” Thus are research studies born.

I’m surprised that “Sex as Work” ever got published. It has no statistical analysis, no quantitative data, not much data at all, just their take on fifteen “marriage manuals.” It reads more like something from a stand-up comic of the “observational” type. (“And what’s up with all this working on your technique? I mean, does anybody ever get off practicing scales on the piano?”) That and a really good title. In short, my kind of sociology. Yet it was the lead article in the flagship journal of the SSSP. Hey, it was the sixties.

Well, I said to myself when I had finished reading the article, that’s interesting and probably true, and it fits with other thoughts I have about American culture. And I closed the journal. 

But what anyone with half a brain — the half with the money-making lobe  — would have done is to call on their inner applied sociologist. And then they would have called on a publisher or literary agent. Here’s the elevator pitch:

The only sex-instruction books around are from the fifties or have a fifties mentality. We’re now in the sixties. This is the decade that began with the pill. People in the book-buying classes are having more sex with more partners, and they’ve stopped kidding themselves about marriage. They’re having sex younger and getting married older. And there are lots more divorces. Is anyone really going to buy A Doctor’s Marital Guide for Patients? (Yes, that’s one of the books in the “Sex as Work” bibliography).

What they would buy is a book whose attitude towards sex is that it’s fun, a book without a medicinal smell, a book that doesn’t turn sex into goal-achievement through dogged technical mastery, a book that instead offers a tasting menu of all sorts of sexual activities.

Alas, I did not make that pitch, I did not write that book or suggest it to a publisher, and I did not get rich. But not long after, someone did. Alex Comfort. The book was The Joy of Sex, and it was in the top five books on the New York Times best-seller list for about a year and a half.


The cover, inspired by The Joy of Cooking, is just plain text. The table of contents includes entries like “g-string,” “bondage,” “foursomes and moresomes,” and “soixante-neuf,” as well as more traditional topics covered in “marriage manuals.”

I’m not a big believer in the Zeitgeist, but The Joy of Sex was a book whose time had come. Actually, its time had come a few years earlier, around the time that some sociologists were writing “Sex as Work” in an academic journal and other sociologists were reading it.

Sex and the Work Ethic

February 18, 2022
Posted by Jay Livingston

“Climax as Work,” the Gender and Society article I discussed in the previous post, caught my attention for the obvious reason. But I had another immediate reaction, one that I suspect is unique.

What the title called to my mind was another article, one published in the sociology journal Social Problems in 1967, before the authors of “Climax as Work” were born. The title: “Sex as Work” by Lionel Lewis and Dennis Brissett.* It was a content analysis of fifteen “marriage manuals,” as they were called at the time, published in the 1950s and early 60s.

The authors start from the observation that “fun” in American culture had become a requirement. Americans judged themselves and others on the basis of  what psychologist Martha Wolfenstein dubbed the “fun morality.” The irony is that making something a required part of the Protestant ethic largely takes the fun out of it. (See this post from sixteen years ago on how organizing kids’ sports inevitably crushes the fun.)  The authors quote Nelson Foote: "Fun, in its rather unique American form, is grim resolve. . . .We are as determined about the pursuit of fun as a desert-wandering traveler is about the search for water.” As the title of the article implies, when it comes to sex, these marriage manuals see work as an absolutely necessary prerequisite for fun.

The work ethic in these books first of all emphasizes technical skill. The word technique appears frequently in the text, the chapter headings, and one of the book titles — Modern Sex Technique. Learning technique requires work. The books give cautions like “Sexual relations are something to be worked at and developed” and “Sex is often something to be worked and strained at as an artist works and strains at his painting or sculpture.”

Work to acquire the requisite technique means study and preparation. One book refers to “study, training, and conscious effort.” Another: “If the two of them have through reading acquired a decent vocabulary and a general understanding of the fundamental facts listed above, they will in all likelihood be able to find their way to happiness.” Is this going to be on the midterm?

Like work, sex must proceed on a bureaucratic schedule. This means establishing a specific time for sex. But the manuals also break the sexual encounter into components, much like an assembly line or a schedule of work activities, sometimes even specifying the time allotted to each. “Foreplay should never last less than fifteen minutes even though a woman may be sufficiently aroused in five.” Lewis and Brissett don’t mention it, but the scheduling mentality was also the basis for what some of the manuals saw as the ideal product — simultaneous orgasm. The partners here are much like workers who must coordinate their separate activities to arrive at the same place at the same time.

Lewis and Brissett also fail to mention other things that now seem obvious. First, these books are “marriage manuals,” not “sex manuals.” They imply not only that sex is limited to married couples but also that it is an obligation stipulated in the marriage contract.

Second, these books frame sex not just as technical and bureaucratic but as medical. Ten of the fifteen books have authors with M.D. after their name; others have Ph.D. Only three authors are uncredentialed. The M.D. or Ph.D. speaks from a position of authority, authority based on their own technical expertise. This too seems at odds with any notion of fun or pleasure. We rarely think of consulting a doctor as “fun,” perhaps even less so for consulting a Ph.D.

In any case, Lewis and Brissett had spotted the most important aspect of these sex books, one that nobody else seemed to have noticed. The insights of “Sex as Work” pointed to an obvious next step. Or maybe it wasn’t so obvious. If it had been, I might be rich. But I’ll leave that for the next post.

------------------------
* The full title of the article is “Sex as Work: A Study of Avocational Counseling,” Social Problems, Vol. 15, No. 1 (Summer 1967), pp. 8-18.

Minding the (Orgasm) Gap

February 16, 2022
Posted by Jay Livingston

“Climax as Work.” The title of this Gender and Society article by Nicole Andrejek, Tina Fetner, and Melanie Heath is almost like the sign that says “SEX” in large letters, and then “Now that we’ve got your attention . . . .”

Yes, the article is about sex. But it uses and illustrates the more general perspective of the social construction of reality. We rarely think that we are actively working to maintain a particular reality, a more or less arbitrary way of looking at the world. But each time we make use of those taken-for-granted truths, we are reinforcing that reality. Or as we used to say back in the sixties, if you’re not part of the solution, you’re part of the problem.

The problem at issue is the “orgasm gap.” Among the men in the Andrejek/Fetner/Heath survey, 86% said they had had an orgasm in their most recent sexual encounter; for the women the proportion was only 62%. That’s consistent with the results of most other studies on the topic. The authors start from this finding and move to two related points, one about the “work” in the title, the other about “labor,” or more specifically “gender labor.”

“Work” is a term that interviewees, both the men and the women, used in talking about women’s orgasm. While they saw the man’s orgasm as a matter of more or less doing what comes naturally, the woman’s orgasm took work. As one man put it, “It’s definitely easier for the male, that’s for sure. I think [for the] female, it takes more work and certain things have to be done, where a male is good for anything.” Similarly, a woman said that she enjoys, “if the man is behind me and he is able to pleasure me with his hands [but it takes] a lot more to work. It takes a lot more for me to get to that point where I’m going have an orgasm.”

It’s all about the clitoris. Attention must be paid. Ignore it, and the woman will be far less likely to come. There’s no mystery about it. Yet here we are, nearly a half-century after The Hite Report, a quarter-century after “Sex and the City,” and still a substantial segment of the population hasn’t gotten the message.

Our participants craft narratives that define regular sex as only penile–vaginal intercourse and sexual behaviors that prioritize clitoral stimulation, such as oral sex, vibrators, or manual stimulation, as “alternative” sexual practices. These alternative sexual practices to regular sex are depicted as more time-consuming labor and extra work for couples.

To avoid realizing the importance of the clitoris, or in the face of that realization to find reasons for not acting on that knowledge — that takes some mental effort. It is this effort that the authors, borrowing a term coined by Jane Ward, see as an example of “gender labor.” Of course the labor is mostly unconscious. We rarely think of ourselves, in bed or out, as laboring to, as A/F/H put it, “create a sex life that conforms to dominant narratives of ‘normal’ sexuality.” Even when we know that the sex could be better, especially for the woman, we don’t think of our explanations as reinforcing patriarchal hegemonic masculinity. We are just calling on “commonsense understandings of what constitutes sexual pleasure.”

I came away with the impression that the authors are calling for a revolution in sexual consciousness. The orgasm gap is not going away all by itself. Nor is it likely to disappear one clitoris at a time. “Our findings demonstrate the need to challenge the shared heteronormative meanings of what counts as sex.” We are left wondering about just how new meanings and ideas can diffuse through a population, especially when those meanings and ideas concern something that is not a topic of frequent, wide, or even audible discussion.

Valentines and Sentiment — Particularism vs. Universalism

February 14, 2022
Posted by Jay Livingston

For Valentine’s day when I was in kindergarten, we had to bring a Valentine’s card for every other kid in the class. Many years later, in my intro classes, I often used this as an example of universalism and particularism. We usually think of love as particularistic, something that depends on the particular people involved. We treat the other person according to the special aspects of the person and the relationship, not according to some universal rules that apply equally to everyone. But in Miss Carmen’s kindergarten, everyone said Be My Valentine to everyone else.

Maybe the same rule applied in first and second grade or beyond. I can’t remember. But at some point, we learn that a Valentine’s card and the sentiment it represents are for “that special someone.”

With friendships and friendliness, however, we Americans are still kindergartners. Or at least that’s how non-Americans see it. When they come to the US, they are often pleasantly surprised at how friendly and welcoming Americans are. Perfect strangers treat you so warmly. But after a while they are frustrated, for what passes as friendship here seems superficial and temporary, so unlike friendships in their native countries. As a student from France told anthropologist Cathy Small,

Sure I have friends. It’s so easy to meet people here, to make friends. Well, not really friends. That’s the thing. Friendship is very surface-defined here. It is easy to get to know people, but the friendship is superficial. We wouldn’t even call it a friendship. In France, when you’re someone’s friend, you’re their friend for life

The other way I had of explaining particularism and universalism hinged on the idea of what something is worth. Usually, we measure that in the universalistic terms of money. A dollar is a dollar no matter whose wallet it’s in. But I would glance around the room looking for a girl wearing a ring or necklace, one that looked special. An engagement ring was the ideal. “Where’d you get that?” I would ask, and often the answer was the kind I was looking for. “My boyfriend gave it to me,” or “It was my grandmother’s.”

We would come to some assessment of what its dollar value might be, and I would then ask if she would sell me the ring for double that. The answer was always No. I would then ask others in the class, “If you had bought this ring for $200 and I now offered you $400, would you take it?” Yes, of course. You could go back to the store, buy one just like it, and pocket the extra $200. But to the girl wearing that ring, its value is particularistic, based on the particular people involved.

I would sometimes bring in the example from “Breakfast at Tiffany’s” — the ring from a box of Crackerjacks, worthless to everyone except the two lovers — even though I knew that most students would not be familiar with it.* But it’s such a good example.


I was reminded of this by a segment of the Valentine’s episode of the Planet Money podcast. One of their economics reporters, Mary Childs, said that she would give a Valentine to her favorite website, where entire estates are auctioned. She loves it because it is a perfect example of “price discovery” — finding out how much money something is worth. She also seems pleased that discovering the price has the sobering effect of deflating the particularistic value.


It turns out, a lot of this stuff is basically worthless. There’ll be a lot of lots that go for like two dollars or five dollars. . . . When we’re alive we imbue all our possessions with all this importance and all of this meaning. . . . But in the end it turns out that all this stuff — your precious stuff — is just stuff.

As Oscar Wilde said, “What is an economist? A man who knows the price of everything and the value of nothing.” (OK, he didn’t say “economist”; he said “cynic.” But the difference may be hard to perceive here, just as it is with the economic view of the “deadweight loss” of Christmas presents.)

HAPPY VALENTINE’S DAY

To Everyone

-------------------------
* Not all students were unfamiliar with it. One semester, when I referred to “the movie based on the Truman Capote short story,” I heard a girl off to my right mutter sotto voce, “novella.”


Did They Really Say That in 1882?

February 12, 2022
Posted by Jay Livingston

Language anachronisms in “The Marvelous Mrs. Maisel” and “Mad Men” often came through loud and clear, at least to my ears. The shows were set in 1960, a time when I was alive — speaking and listening. (See earlier posts here and here.) “The Gilded Age” on HBO is set in 1882, before my time. Still, some of the language in this week’s episode, “Face the Music,” sounded more recent. Julian Fellowes, who created the show and did much of the writing, came in for some criticism (here, for example) for the language anachronisms in his “Downton Abbey.” I can just see him chuckling now as he waves the title “Face the Music” to lure in the language police and then swats them back by having Mr. Russell say, “To employ a modern phrase, I’m afraid you must face the music.”

OK, “face the music” was not a phrase before its time. But in 1882 it wasn’t exactly modern either. My own memory does not extend back to 1882. That’s why we (and that includes Mr. Fellowes) have the Oxford English Dictionary, and according to the OED, this “modern phrase” has one example from a newspaper fifty years before the Gilded Age and another from 1850: “There should be no skulking or dodging...every man should ‘face the music’.”

If you’re not watching the show, know that Mr. Russell is the nouveau-riche businessman. His adversaries who must face the music are the establishment wealthy. They have connived to ruin him financially, but Russell outwits them, using his own wealth to put them on the verge of financial ruin. He will have his revenge. “I didn't see this coming. I admit it. I thought you were honorable men. Not too honorable to miss the chance of a fat buck, of course, but not greedy, dirty thieves.”

He adds, “I thought I was the one who might throw a curveball.”

Curveball? By 1882, pitchers had been throwing curveballs for a decade. But they were literal curveballs. Metaphorical curveballs didn’t come into play for another half-century.

In an earlier scene, Russell’s daughter uses the phrase “the thing is.”

This too sounded modern to my ears, even if she did not use the double “is” that many people today add, as in “The thing is is that it’s very recent.” I may have been wrong. The OED finds Matthew Arnold using it in 1873: “The question [of a state church]..is..so absolutely unimportant! The thing is, to recast religion.” I’m not sure that this is exactly the way we use it. The first clear example of that in the OED is from John Galsworthy in 1915: “Look here, old man, the thing is, of course, to see it in proportion.”

Finally, there was “identify.” Miss Scott has submitted her short stories to a newspaper. They are, the editor tells her, “beautifully constructed and executed.” The problem is that Miss Scott is Black and so is the main character in the story under consideration. The editor tells her that some adjustments will be necessary.

“The little colored girl would need to be changed to a poor white child.”

Why, she asks.  

“Our readers will not identify with a colored girl's story of redemption.”

I was mostly wrong about this one. Identify in this sense goes back at least to the early 1700s. But until the mid-20th century there was always a pronoun like himself or oneself between identify and with. What the editor should have said is “Our readers will not identify themselves with a colored girl’s story.” In 1882, the reflexive pronoun was still required. Today, it has been absorbed into the word identify.