Trump — Working the Crowd, Working the Refs

February 25, 2020
Posted by Jay Livingston

When people criticize or oppose Trump, or even provide information that contradicts him, his main strategy is to try to discredit them, to challenge their legitimacy. They are, he claims, unfair and biased against him. So when Justice Sotomayor noted the unprecedented number of times the Court’s conservative majority had acceded to Trump administration requests to fast-track cases, Trump, unsurprisingly, issued a typical tweet. I found it in this NPR story today.

(Click on an image for a larger version.)

Here Trump was trying to discredit specific judges. He has done it before. But many on the left fear that Trump is also trying to discredit American institutions. That’s because he often puts it that way. He attacks not just this or that journalist but “the fake-news media,” by which he means all media except Fox News. He attacks not just a judge but entire courts. “The 9th Circuit is a complete & total disaster.”   

Is the Trump strategy having the effect that progressives fear? It’s hard to know. The GSS shows a decline in confidence in the courts in 2018, but since the previous rates are from 2008, we can’t know when that change occurred.


The Gallup Poll finds a small decrease in 2018 for confidence in the Supreme Court, but generally the percent of those who have great confidence in SCOTUS has not changed much in the last decade, fluctuating between 30% and 40%.  Before that, going back to 1973, confidence in SCOTUS stayed above 40%.


Last May, I posted an audio clip from an interview with Michael Lewis, who had just launched a podcast about attacks on the legitimacy of all kinds of judges, not just those in courtrooms. Here is the relevant excerpt from that blogpost. (The entire post is here.)
----------------------------

Lewis says that one inspiration for the series was what happened after a close play at home in a softball game played by nine-year-old girls. It happened ten years earlier. But it can easily be an allegory for tactics and a tactician of the present moment.



The story continues (to hear the rest of it, get the entire episode and push the slider to about 12:40), but the excerpt here is sufficient. It shows a winning-obsessed and angry man using his position of power to bully an impartial judge. I chose to end the clip at the point where the angry bully says, “You’re fired.” (We’re not long on subtlety here at the Socioblog.)

The Times Wedding Announcements They Are a-Changin’

February 21, 2020
Posted by Jay Livingston

Here are some more trends dredged up from Wedding Crunchers, the New York Times corpus of words in its wedding announcements. As I noted in the previous post about brides keeping or changing their name (here), these announcements are not a representative sample of couples. And while they are not even representative of couples in the Times’s corner of US society, I think they point to some general trends in that elitish slice of the world.

Take grade inflation. This well-documented trend is reflected in Times weddings as well.

 (Click on an image to enlarge it.)
From 1981 to 2019, the proportion of announcements with a summa more than doubles (from 4% to 10%), as does the magna rate (9% to 19%). It’s possible, though unlikely, that the Times has raised its bar for putting your announcement in the paper. Or maybe today’s couples really were better students in their college days. We do know that more of them are going on to post-BA programs. But which ones?


In the 1980s, we saw the rise of the MBA, the Wall Street “masters of the universe” in Tom Wolfe’s Bonfire of the Vanities, or in real life the recently pardoned Michael Milken (MBA Wharton 1979, Pleasanton Federal Prison 1993). The 1990s were for lawyers. (I recall a New Yorker cartoon, which I cannot now find online, showing a young woman and a man at a cocktail party. She is saying, “How did I know you’re a lawyer? Everyone’s a lawyer.”)

As we head into the 21st century, two other phrases start turning up — “hedge fund” and “start up.”

The numbers are small (never more than one announcement in 25 includes either phrase), but starting about ten years ago, start-ups began to replace hedge funds as the choice of the adventurous and ambitious (and perhaps avaricious).

The other newcomer to these pages is the dating app. The steep increase starts in 2013 or 2014. In only 5-6 years, about 20% of the wedding couples announce that they met via a dating app.


Finally, remarriage in the Times seems to run parallel with national trends.



The US divorce rate peaked in 1980, and since most remarriages occur on average five years after divorce, we should expect the downward slope that begins in 1984. More curious are the upward trend from 1996 to 2004 and the decline after that. Of course, remarriage in the Times is somewhat rare — the rate ranges from about 7% to 13% — so maybe we shouldn’t make too much of these fluctuations.

If you’re curious and want to explore your own key words, go to weddingcrunchers.com.

Brides and Names, New York Times Edition

February 20, 2020
Posted by Jay Livingston

“She’s keeping her name,” a friend said the other day. We were talking about a girl we know who got married last year. Is that still a thing, I wondered, keeping your name. What I really meant was: how much of a thing is it? Then I remembered Wedding Crunchers, the corpus of all words in New York Times wedding announcements — sort of like Google nGrams but with a much narrower focus and far fewer filters for researchers. 

Unfortunately, the database goes back only to 1981, so we can’t know when the name-keeping trend started. It was underway by the eighties. By 2000, more than 20% of Times brides announced that they were keeping their names, so many that several of those who were changing their names felt it necessary to proclaim their traditionalism in the announcement.

(Click for a larger view.)

I’m not sure what happened in 2015. Maybe that was the year that the Times instituted the current policy, which finesses the politically tinged proclamations of the keepers and the changers. Instead, the Times puts the maiden name in the headline and the married name in the text. Finding out who’s keeping and who’s changing requires a closer reading, but those who are interested will figure it out.

Here are two weddings from Sunday’s paper. (I edited out the photos to save space.)

(Click for a larger and clearer view.)

Adrienne is becoming Mrs. Adams. Elle will remain Ms. O’Sullivan.

There’s another change in the language, though you have to go back to the eighties to see it. Adrienne graduated from UNM, Elle from UCSB. In fact, all brides and grooms these days “graduate from” their schools. But in the old days, a student “was graduated from” the school. The Times, and many of the people whose wedding announcements it accepted, were traditionalists.


Even as late as 1980, nearly 60% of the wedding announcements included someone who “was graduated from” a school.*

The wedding announcements in the New York Times are hardly a representative sample of anything. But they do offer a glimpse into the world of the elite. For more on that, see Todd Schneider’s excellent post from 2013. As for those at the other end of the social spectrum, graduating from college is not so much an issue, and as marriage rates decline, neither are wedding announcements in the newspaper or the question of whose name to use.

--------------------------
*Nowadays, you sometimes hear, “I graduated college in 2015,” much to the dismay of language prescriptivists, who insist that the correct expression is “I graduated from college.” They don’t realize that their prescriptivist counterparts of 150 years ago would have been just as appalled and in despair for the language because people were not saying, “I was graduated from college.”

School Structure and Superficial Friendships — Russia and the US

February 16, 2020
Posted by Jay Livingston

American schools teach kids the ideal of universalism. Treating everyone equally without favoritism squares perfectly with our value on equality. But then what about unique personal relationships? If you treat everyone alike, no one can be special. That was the gist of the previous post about the superficiality of American friendships, at least as non-Americans perceive them, and the rules of Valentine’s Day in American classrooms.

Two days after posting that, I happened to listen to a conversation from last August between American economist Tyler Cowen and Masha Gessen, a journalist who came to the US from Russia at age 14, lived here for ten years, and then returned to Russia. In 2013, she moved back to the US because of the threat that the government might take her adopted son from her on the grounds that she is gay. (The full podcast is here.)

Cowen asks two questions, one right after the other, the first about friendships, the second about schools. He doesn’t explicitly say that one affects the other. Neither does Gessen. Maybe they don’t see the connection.

Cowen asks, “Why do Russians purge their own friends so often?” He refers to “loyalty cycles.” Gessen is puzzled, maybe because of the words purge and loyalty. Cowen explains that Russian friendships end in total breaks, “whereas Americans will drift apart.”

Gessen answers that if Cowen is right (and she seems not totally convinced that he is), it’s because friendships between Russians are much more profound.




Here is a slightly edited transcript:

Russian friendships are much more emotional and intense than American friendships. When I moved back to this country five and a half years ago, it was like a sense of whiplash, because I had friends here, I had lived here for twenty years. And I would get together with my friends, and then two hours later the get-together would be over. And [I would think] What was the point of that? Was that just to let each other know that we still exist?

Because you don’t really get into a conversation till about four hours in, right? And a number of bottles of alcohol. If you’re going to really get down, it’s a 3 a.m., 4 a.m. proposition. You can’t just have dinner and go home.
 

Maybe you’re just referring to the intensity of Russian friendship. It’s like lovers, even in this country, don’t drift apart usually. You have to break up. You can’t really just stop calling. You can’t go from talking every day to talking every few weeks and then forget about each other’s existence.

Cowen’s next question is about the way Russian schools group children.




COWEN: Russian grade school – you sit in the same seats and next to the same people year after year after year. Is that a good system or a bad system?

GESSEN: My older kids were educated partly in Russia and partly here, and my youngest son is now in elementary school here. I find it disorienting that every year Americans shuffle their classes and put kids in a new social situation. 

There’s something amazing to having gone through life from the time you’re six or seven with the same people. I think it can foster really incredible friendships. It can also foster awful dynamics obviously.

Gessen’s answers suggest a strong relation between the personal (friendships) and the structural (classroom groupings). Oddly, neither Gessen nor Cowen mentions the possible link between the two. But even if they did see that sociological connection, they would see it from different sides of the table. Cowen is saying, in effect, “Russia takes away a kid’s choice over who to associate with. As a result, they wind up with these screwed-up friendships, so that the Russian word for ‘friend’ is ‘future enemy.’”

From Gessen’s point of view, it’s not that our American way of friendship is the right and normal way. Instead she sees American friendships as superficial (What’s the point?). After all, when the group of kids a child sees every day lasts only nine or ten months, when kids are forced to form new relationships every year, you really can’t expect them to develop deep and long-lasting friendships when they’re older. The Russian system can produce friendships that are “incredible” and “amazing.”

Valentine’s Day the American Way

February 14, 2020
Posted by Jay Livingston

Foreigners often comment that while Americans are generally very friendly and open, friendships are often superficial, especially compared with friendships in their home countries. In class, I would use the example of the elementary school version of Valentine’s Day. (The picture below turned up in my Twitter feed today. I have edited out the personal information.)


In the US, universalism rules. A Valentine’s card denotes affection and friendship, but American kids  have to treat all their classmates alike — no special preferences lest any kid feel left out. You have to extend this token of friendship to every kid in the class.

I’ve never been near a French primary school on February 14th, but I doubt that they follow this custom. I would imagine that if French kids give Valentine cards, they do so on the basis of particularism. It matters very much who the other person is, and for each kid, the number of those special friendships is small.


Happy Valentine’s Day to all readers of the SocioBlog.

(A follow-up about Russian and US schools and friendships is here.)

Ya Got Trouble, My Friends

February 11, 2020
Posted by Jay Livingston

The Attorney General of the US, when he isn’t using his office to make sure Trump remains in power, tells us that the country is going to hell in a handbasket. In a speech at Notre Dame last year, he claimed that  “Virtually every measure of social pathology continues to gain ground.”

He’s wrong, of course. Most of those measures show that things are getting better, though there are some troubling numbers (suicide, drug addiction and overdose).  Claude Fischer has the data  at his Made In America blog. He also looks at Barr’s explanation for the nonexistent downward trend — “the steady erosion of our traditional Judeo-Christian moral system.” That trend too, says Fischer, isn’t all it’s cracked up to be.  We’re not losing our religion, and certainly not our religiousness; we’re losing our church affiliations.

Barr might also be wrong about the effects of religion or its absence. Fischer’s data seems to suggest that when it comes to social pathology, losing our religion might not be such a bad thing. States with higher levels of “religiosity” also tend to have higher levels of pathologies — things like violence and sexually transmitted disease. Even drug overdoses, historically the province of more urbanized and less religious states, have now become a big problem in the religious heartland.

(Click for a larger view.)

Of the variables in the graph, the one that stands out is Incarceration. Tell me what proportion of a state’s population is in jail, and I can make a very good guess as to how religious it is. But that doesn’t mean that religious states have more crime and criminals. It does mean that their laws and policies are more punitive. They favor harsh punishment for people who have broken the law. Not all laws and not all people, just those who by their actions and personal characteristics can be judged as outside the society — as not one of Us.

This preference for punishment is especially popular among fundamentalists and other religiously conservative Protestants. It is that same kind of fundamentalism that underlies Barr’s view of social problems. His speech reads like a sermon from a fire-and-brimstone revival preacher. (Barr’s actual delivery from the podium may have been different in style.) The odd thing is that Barr himself is a Catholic, and he was speaking at a Catholic university. This seeming contradiction puzzled Fischer too.

The speech sounds much more like one of the jeremiads Puritan ministers unleashed on their congregants centuries ago than it does like a Catholicism traditionally more tolerant of human failing. The underlying individualism in Barr’s account is also very Puritan Protestant. Social ills emerge from individual willfulness. The role of the community is to instill fear of God to check that willfulness.

In this way, Barr is one more data point in a historical narrowing of differences between Protestants and Catholics. As Catholics became more similar to Protestants, as the social, economic, and geographic spaces they occupied grew more diverse, their views on social and political issues also became more varied. So it’s not surprising that in the 21st century we have a Catholic attorney general channeling Cotton Mather.*

-------------------------
* In the 1980s, another Catholic cabinet member, Education Secretary William Bennett, made similar noises. Like Prof. Harold Hill in “The Music Man” (and like Barr), Bennett told us, “Ya Got Trouble.” Hill blamed the pool table, Barr blamed secularism, and Bennett blamed the decline of virtue. That was convenient since he then went on to sell his Book of Virtues for parents to read to their kids just as Harold Hill sold trombones. Right now the only thing Barr seems to be selling is Donald Trump.

UPDATE: As I said in the fourth paragraph above, moralizers like Barr are especially fond of punishment when the offender is one of The Others. When the offender is One of Us, their sternness melts into compassion. Later on the day I wrote this, Mr. Barr’s Justice Department recommended a lighter sentence for Roger Stone, a friend of Trump who had been convicted on a variety of charges. In doing so, the DOJ was tearing up the recommendation of a longer sentence from the prosecutors who worked on the case.

Uncut Gems Gamblers

February 6, 2020
Posted by Jay Livingston

As I sat through the movie “Uncut Gems” (see the previous post), I kept thinking of the compulsive gamblers I studied (hung around with, really) decades ago. I had gone to the movie thinking that it would provide an inside view of the 47th St. diamond district, a business world that is probably not much different from what it was a century ago. It has not been taken over by private equity and MBAs with spreadsheets. If you’re looking for modern, rationalized corporate structures and procedures, go elsewhere. Here, personal relationships count for much; deals are sealed with a handshake, not a contract.

That was the movie I wanted to see. After all, the Safdie brothers, who made the film, had grown up hearing diamond district stories from their father, who worked there. But instead of showing us that world, the film focuses relentlessly on a single figure, Howard Ratner, played by Adam Sandler. And although Ratner may not be typical of jewelry merchants, he is typical of gamblers, especially compulsive gamblers, though with Ratner the more appropriate adjective would be impulsive.

Ratner and the gamblers I knew had two important things in common. First, their lives are centered on the problem of getting money — a lot of it and quickly. And second, their relationships with family are thin and brittle. That’s not surprising. Since their money problems crowd out other matters, close relationships are at best a distraction or an interference, at worst a threat. And yet Ratner, like many of the gamblers I knew, still thinks of himself as a good husband and father, and he remains blissfully unaware of how he is seen by the people whose needs he is slighting.

He even thinks that his wife, who is divorcing him, might reconsider. She clarifies her position (“If I had my way, I would never see you again”), and Ratner still doesn’t get it.



(She actually does convincingly fake a punch, hence the noise and laughter in the final seconds of this clip.)

There’s an old gamblers’ joke about the horseplayer who, like Sandler’s character, lives on Long Island. That’s great for horseplayers, because Aqueduct and Belmont are not far. But in August, New York racing moves up to Saratoga.

The horseplayer complains to a friend. “It’s terrible. I have to get up before six to get the train in to Grand Central, get over to Penn Station to get the bus by nine so I can get to the track in time for the daily double. The races end at 5:30 or 6, and then I gotta do the same thing in reverse. I don’t get back to the house until around eleven, and I get right to bed so I can get up the next morning.”
“You know what you should do,” says the friend, “Rent a room in Saratoga Springs. You’ll be near the track, you can sleep late. . . .”

“What?” says the horseplayer, “And neglect my family??”

OK. Jokes are not evidence. But blogposts are not journal articles. And the joke does capture the gambler’s distorted picture of domestic tranquility.

“Uncut Gems”

February 1, 2020
Posted by Jay Livingston

I saw “Uncut Gems” last week. It does not pass the Bechdel test. It does not have
  • two women who . . .
  • have a conversation . . .
  • about something other than a man.
The film doesn’t even have a conversation between two men who are not Adam Sandler (Howard Ratner). He is there in nearly every scene. Nor does it have a conversation about something other than money.
   
Even “conversation” is misleading. Usually, the men shout. The camera is in tight on most shots, so you feel as though the film is grabbing you by the lapels, pushing its face into yours, and shouting about money. Also, the men say fuck, fucked, and fucking a lot, never in the literal sense.

“Uncut Gems” is basically an action movie, a film where the protagonist struggles against threatening forces in his quest for some tangible goal. It’s all problem-solving. Thoughtful introspection is out of the picture. Ratner thinks only about money. He needs money to pay his gambling debts (he’s a sports bettor), and he needs money to gamble still more. That’s what the film is about.

As a motivation, this obsession with money can lead to complicated actions, but as psychology, it couldn’t be simpler. Ratner and the movie itself see all problems as external. Or really, there’s only one problem — how to get money. All relationships with other people, including family, are purely instrumental — how to use them to get money or avoid them if they are trying to get their money back. The film even has the cliche scene where the parent goes to see his kid in the school play but has to leave in the middle. In this case, Ratner has to duck out to deal with his own money-based problems.

People interested in non-money relationships might as well be speaking a foreign language which Ratner does not understand and does not see the point in learning. As adept as he is at knowing what will motivate Kevin Garnett to have a great game, he is utterly unaware of what his own wife is thinking or how she sees him. They are getting a divorce, but Ratner still thinks it’s possible that she might scrap that idea. With a twinkle in his eye, he suggests that they might stay together. Her response: “I think you are the most annoying person I have ever met. I hate being with you. I hate looking at you. And if I had my way, I would never see you again.”

It’s not too much of a spoiler to note that in the end, she gets her way.

(A follow-up to this post is here.)

Lobster Reconsidered

January 27, 2020
Posted by Jay Livingston

I was at the fish counter in Citarella, trying to decide what to get for dinner. I did not consider the lobster.


Eighty dollars a pound is a bit out of my usual price range.

Lobster, as David Foster Wallace mentions in passing in his famous essay,* was not always a delicacy. In the early days of the Massachusetts Bay Colony, lobster was trash food. It was fed to prisoners. Two hundred fifty years later, the social status of lobster hadn’t improved. In the 1870s, indentured servants sued, successfully, so that their masters could feed them lobster no more than three times a week.

Several accounts I found online say that lobster became a delicacy in the 1950s, but I’m not so sure. When I noticed that $80/lb price tag, I remembered a 1953 New Yorker article by St. Clair McKelway that the magazine had recommended not too long ago as retro reading. The main figure is Pearl, a salesgirl in a New York department store.

For a while, she lived with her mother and her stepfather in Brooklyn, but as soon as she got a job—as a salesgirl in a department store—she moved to a furnished room all her own on the upper West Side of Manhattan. . . . She made friends quickly with many of the salesgirls at the store and lunched at a soda fountain every day and dined in a cafeteria almost every night with large groups of them.

I picture her as much like the Rooney Mara character in “Carol,” the Todd Haynes movie set in the early 1950s.



And what did Pearl have for lunch?

Her favorite lunch was African-lobster-tail salad and Coca-Cola, followed by a junior banana split. Her favorite dinner was chicken potpie with mushrooms, pecan pie with whipped cream, and coffee.


If shopgirls were eating lobster — even canned lobster — for lunch, how much of an upscale delicacy could it have been? Besides, the price of lobster did not begin to rise until a few years later [source].

(Click on an image for a larger view.)


Besides the rise in prices after the 1950s, the chart also shows a steady decline in price from about 1975 to 1990. Funny, but I didn’t notice. I guess I wasn’t paying attention. Since then, there has been a steady increase in production accompanied by a seemingly paradoxical rise in price as well. That’s because of increased demand from China. That trend was interrupted by the global financial crisis but has now returned. It may be a while before I haul out my recipe for the lobster mousse that I once served to dinner guests.

-------------------
* “Consider the Lobster” is the title piece in DFW’s 2005 collection of essays. Wallace is concerned mostly with the ethics of boiling lobsters. That and footnotes.

Gary Burton, b. Jan 23, 1943

January 23, 2020
Posted by Jay Livingston

Sometime in the early 1970s, I was listening to the radio and heard Gary Burton’s recording of the great Jobim tune “Chega de Saudade” (in English, “No More Blues”). It sounded like this. Go ahead, click and listen to at least the first 16 bars (15 seconds).


If you’ve never heard this recording before, you probably are thinking what I thought: That can’t be one person playing vibes. He’s overdubbing, accompanying himself, like Bill Evans on the “Conversations With Myself” album released ten years earlier.

But no, it’s just Burton by himself. “Alone At Last” as the title says. No overdubs, no tricks. Here’s a live version. You can see him holding the four mallets, sometimes playing chords, sometimes rapid single-note lines.


Burton revolutionized jazz vibraphone. Before Burton, jazz vibists had used only two mallets. Even if they used four to play chords when comping behind a soloist, when it came time for their own solo, they would lay two mallets aside. Burton even invented a different way of holding two mallets in each hand, now called the “Burton grip,” that allowed for an easier adjustment of the interval between the mallets in each hand. 

What had seemed an incredible feat nearly 50 years ago has now become a standard part of the vibes repertoire. On YouTube you can find a 22-year-old Austrian kid playing Burton’s “Alone At Last” version note for note (here), and an 18-year-old American girl playing her own Burton-inspired arrangement of the same tune (here; the familiar part starts at about 0:55).

Burton is also one of the few openly gay jazz musicians. He came out during a Fresh Air interview in 1994.

Abortion Rights and Motherhood — That Was Then, It’s Also Now

January 20, 2020
Posted by Jay Livingston

I’ll tell you what’s wrong with these women, especially the ones in this #MeToo movement. They’re over sensitive. They get offended by any little comment. Men have to walk on eggshells or they get accused of being sexists. These women want to make their issues a big deal in this election, and now more of them are running for office, as though that’s going to make things better. Guess what.* It isn’t. Not for the country, not for men, and not for women.

It’s easy to imagine who would applaud this statement and who might want to wring its neck. It’s also easy to imagine how those people would divide on the issue of abortion. But why? The abortion debate usually divides on the status of an embryo. The pro-life side argues that an embryo is a baby, with all the rights and protections that babies have, especially the right not to be killed. Pro-lifers often equate abortion with infanticide.

That’s the audible part of the debate. The usually unspoken part is not about embryos. It’s about women. The #MeToo movement is not about embryos. It was a response to rape, sexual assault, and sexual harassment, especially by men in positions of power. Yet only 23% of people who oppose abortion have a favorable view of #MeToo, compared with 71% of those who favor the right to abortion in most or all cases.

(Click for a larger view.)

(The chart is from a survey of likely voters done last summer by PerryUndem and housed at the New York Times (here). I wasn’t aware of it at the time; it popped up yesterday in my Twitter feed.)

Thirty-five years ago Kristin Luker reported this same correlation among pro-life and pro-choice activists. I don’t recall whether she said explicitly that attitudes about the role of women shape ideas about the status of the embryo. Conceivably it’s the other way round: if you believe that an embryo is a person, you won’t think highly of #MeToo. But she gave her book the title Abortion and the Politics of Motherhood, implying that the primary issue is the role of women, specifically their role as mother, and that ideas about embryos derive from ideas about gender roles.

Luker interviewed leaders in the movements for and against abortion rights, so we don’t know whether their rank-and-file supporters also shared their respective ideas about motherhood. On most issues, not just abortion, activists have more politically consistent sets of views than do ordinary people who are less involved. But however those views lined up in the early 1980s, today the thinking of ordinary pro-life and pro-choice voters resembles that of the leadership.
The PerryUndem survey did not have a question explicitly about motherhood. But it did ask about something directly related to the decision of when and if to become a mother — birth control. Three-quarters of pro-choice voters agreed that access to birth control contributed to women’s equality. Only slightly more than one-quarter of pro-life voters thought so. Why should pro-lifers discount the importance of birth control? The idea common to both issues is not the protection of innocent human life. If the condom, LARC, IUD, or other contraception works, there is no innocent life in the picture. Instead, the link is the question of how important it is that a woman becomes a mother.

Luker was right that motherhood and the role of women are the real issue in the abortion debate. They still are. She also predicted that the issue was going to remain contentious rather than becoming settled by civility, compromise, and moderation. She was right about that too.


-------------------
* “Guess what” gets a hat tip to Jim Jordan (R-OH). If you didn’t catch him at the House impeachment hearings — he was on both committees — just Google his name and that phrase.

Jeopardy II: Audiences — à la Goffman and ABC-TV

January 14, 2020
Posted by Jay Livingston

A Montclair professor who saw yesterday’s post about my having been on Jeopardy asked me how we could have known each other for decades without her knowing this about me. My answer is that it’s not the sort of thing you bring up. You don’t meet someone and say, “Hi, I’m Jay Livingston, and I was on Jeopardy.” It wasn’t a peg to hang even a small part of my identity on. I wasn’t even particularly proud of it. In Goffman’s terms, it was not a piece of “information” that was part of my “presentation of self” for the Montclair “audience.” Besides, that was a long time ago. I had a beard. I had hair. I had a suit with very wide lapels.

Here I am, between Mary, the woman from Virginia, and Pam, from Bloomfield, NJ. Italian American, mother of five. Poor Pam — already $40 in the hole at this early stage.  She finished in the red, and at Final Jeopardy was represented by an empty desk.


Even at the time, I didn’t tell people at work that I had been on the show. As I said when the host Art Fleming asked if I’d told my students, “No, but I expect word will get out.” But after the episodes were broadcast, nobody at the college said anything to me.

Fast forward eight years. The first day of the school year, a warm day in early September. I do my usual first-day routine — have students fill out 3" x 5" index cards (name, phone, major, etc.), go over the syllabus, talk about grading, including my standard pitch about class participation. It doesn’t count towards your grade, I say, but if I’m the only one here who talks, it’s going to be a very long semester.

And don’t be reluctant to ask a question, I add emphatically. In fact, here’s your first lesson in sociology. We think of our thoughts and feelings as internal and individual. But we’re less unique than we think. Our reactions are also social; they’re part of the situation. You all share the same situation — this class — so if there’s something you didn’t get or aren’t sure of, I guarantee that there are others here sharing this same situation who had the same reaction. And they’ll be very grateful if you ask about it.

Class ends. I’m putting my papers together. A girl comes up. She is short, with black hair. In those days, the ethnic make-up of Montclair was a bit different from today. Or as I used to say, half the girls were named Cathy. The K-Kathys were Irish, the C-Cathys were Italian. This was a C-Cathy.

 “Can I ask you a question?”

Goddamit, girl. Why the hell didn’t you ask during class? Didn’t you hear what I just said about asking questions? That if it’s not clear to you, then several other people also didn’t get it? Now I’m going to have to answer it for you and then, if I remember, answer it for the whole class next time.

That’s what I was thinking. What I said was, “Sure.”

“Were you on Jeopardy once?”

I was stunned. How had she discovered this fact that nobody else at Montclair knew? “Yes,” I say, “but that was years ago. How did you know?”

“My mom was on that show.”

I looked at her again and remembered — the woman from Bloomfield, the next town over from Montclair. “Oh, that’s right,” I said, and added sympathetically, “She didn’t do very well, did she?”

Sometimes a student’s question is unique. And sometimes, we cannot control which audience sees which performance . . . and remembers it.

Not Ken Jennings, But . . .

January 13, 2020
Posted by Jay Livingston

With Jeopardy running its Big Three Showdown (Jennings, Holzhauer, Rutter) last week, people were telling their own Jeopardy stories. Here’s mine.

In 1972, I had just moved to New York. Most of the game shows were still here, and there were a lot of them — Jeopardy, Pyramid, Match Game, and others. Two friends from college had taken the test for Jeopardy. So I called the show. A few weeks later, I was sitting in a room in a nondescript midtown building with forty other people taking the test — paper and pencil, fifty questions, fill in the blank. It reminded me of high school. The only question I recall now was one that I knew I had missed — the capital of Wyoming. I looked it up later. Cheyenne.

In late January they called and told me to show up on February 9.

The host in those pre-Trebek years was Art Fleming, and the contestants, instead of standing, sat behind desks. The dollar amounts were 1/20th the current rate — $10 to $50 in round one, $20 to $100 in Double Jeopardy.

The board was mechanical, not electronic. The dollar amounts and questions were on square placards, almost like the scoreboard at Fenway, where guys behind the board replace the 0 tile with a 1 when a team scores. When you selected a category and amount (“History for $30,” for example), the $30 square would be mechanically (and often audibly) yanked up to reveal the question on the card underneath.

Most important, you could ring in at any time. You didn’t have to wait for Art to finish reading the entire question. But finish it he would. So even if you rang the bell two seconds into the question, you would have the full reading time to think of the answer.

They taped a week’s worth of shows in a day.  My episode was a Friday, the last show they would do that day. The returning champion was a woman from Virginia. I had the middle seat, and to my left the other challenger, an Italian-American woman from Bloomfield, New Jersey.

I did well. I was ringing in quickly and getting most of the questions right. I even had a couple of lucky guesses on questions I wasn’t at all sure of.  At the end of the first round, I had $420, the champ had $40, and the woman from Bloomfield was at minus $10. (Remember, $100 then is like $2000 today.)

During the long commercial break before the Double Jeopardy round, assistants came out to adjust our make-up and give advice. “Try to ring in faster,” one of them said, trying to encourage the woman from Bloomfield. To me they said, “Could you try to smile a little bit more? People watching you win all this money want to think that you’re happy about it.”


(Please excuse the less-than-ideal photography. My girlfriend took pictures of the television.)

Going into Final Jeopardy, I was still way ahead — $880 to $160.  ($17,600 to $3200 in today’s Jeopardy dollars.) The woman from Bloomfield had rung in on only a few questions, and had gotten more wrong than right. She finished in the red. So it was just the two of us. Neither of us knew the Final answer (Joseph Lister), and I finished as the winner with $760.

I returned a week later as defending champion. “Did you tell your students?” asked Art in our 20-seconds of human interest. “No,” I said, “but I expect that word will get out.” I was wrong.

The competition was tougher this time, mostly in the person of Mary, born and raised in Oklahoma and now living in Pelham. Going into Final Jeopardy, she had $740 to my $560. (Again the third player had finished below zero.) The category was “state capitals,” but the question was really about theater. “The Western state capital that figures prominently in the musical ‘The Unsinkable Molly Brown.’”  

I had no idea. But I remembered the state-capital question I’d missed months before on the qualifying test. Maybe the Jeopardy producers had a thing for Wyoming. So I guessed Cheyenne. Mary also guessed — Denver, “the only Western capital I could think of,” she said later. I was wrong. She was right. Thus ended my career on Jeopardy.

That’s not quite the end of the story. There’s a sociological coda, which I hope to get to in the next post.

NFL Playoffs — FiveThirtyEight vs. the Bookies

January 10, 2020
Posted by Jay Livingston

The 49ers are a 7-point favorite tomorrow over the Vikings. According to FiveThirtyEight’s Elo rating, they’re only five points better.

Elo — named for Arpad Elo, the physicist who devised it to rate chess players — is a “power rating,” which takes into account a team’s record, the records of the teams it played, the margin of victory, and some other team variables. Power ratings are common in sports. Elo, as you would expect from something at FiveThirtyEight, is a bit more statistically complicated. For example:

We created a multiplier that gives teams (ever-diminishing) credit for blowout wins by taking the natural logarithm of their point differential plus 1 point. This factor also carries an additional adjustment for autocorrelation . . . the tendency of a time series to be correlated with its past and future values. [Source]
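
To make that damping concrete, here is a quick sketch of the multiplier in Python (my own illustration, not FiveThirtyEight’s code; their autocorrelation adjustment is left out because the quote doesn’t spell it out):

    import math

    def mov_multiplier(point_diff):
        # Diminishing credit for blowouts: the natural log of the
        # winner's point differential plus 1, per the passage above.
        # (FiveThirtyEight's autocorrelation adjustment is omitted.)
        return math.log(point_diff + 1)

    for diff in (1, 7, 28):
        print(f"{diff}-point win -> multiplier {mov_multiplier(diff):.2f}")
    # 1-point win -> 0.69, 7-point win -> 2.08, 28-point win -> 3.37

A 28-point blowout earns less than five times the credit of a 1-point squeaker.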

Can you use Elo to make money betting on the NFL? Is that two-point difference tomorrow enough to warrant a bet on the Vikings getting seven points? Below is a chart showing wins and losses for each level of difference between the Elo spread and the actual betting line. I set the minimum difference at 1½ points.

For example, in the last week of the regular season, the bookies had the Broncos as 4-point favorites over the Raiders. Elo rated them 5½ points better. So according to my system, that 1½-point difference means bet the favorite. As it turned out, the Broncos won, but by only one point. So we Eloists and other Broncos bettors lost.

So far this season, there has been an Elo-vs-bookmakers difference of exactly 1½ points in 37 games. If you had bet accordingly, you would have won 19 bets and lost 18. Unfortunately, most bets require the bettor to give 11-10 odds. You bet $110 to win $100. So on these thirty-seven 1½-point games, you would have lost a little (4%).
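
If you want to check that arithmetic, here is the bookkeeping in Python (a sketch; the 19-18 record is from the paragraph above):

    def net_at_11_to_10(wins, losses, stake=110, payoff=100):
        # Flat bets laying 11-10 odds: risk $110 to win $100.
        return wins * payoff - losses * stake

    print(net_at_11_to_10(19, 18))  # 19*$100 - 18*$110 = -$80
    print(round(110 / 210, 3))      # breakeven win rate at 11-10: 0.524
    print(round(19 / 37, 3))        # actual win rate: 0.514

Winning 19 of 37 is 51.4%, just under the 52.4% you need to overcome the bookmaker’s 11-10 vigorish — hence the small loss.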

(Click for a slightly larger view.)


As the chart shows, Elo as a betting guide does not improve as the rating differential increases. If you had bet $110 on each game where the Elo rating differed from the betting line by 1½ points or more, you would have wound up winning 70, losing 81 — a net loss of $1100. Increasing the size of your bet as the differential increased might have made a small improvement.

In any case, here’s how Elo and Las Vegas see the games this weekend.

Favorite     Las Vegas Line     Elo     Underdog
49ERS              7             5      Vikings
RAVENS            13                    Titans
CHIEFS            11                    Texans
PACKERS                                 Seahawks
Elo likes the Vikings as underdogs. In the other three games, it thinks that the betting line is underestimating the strength of the favorite. That’s not an accident. As their methodological post at FiveThirtyEight says, “We found that, in the NFL playoffs, favorites tend to outplay underdogs by a wider margin than we’d expect from their regular-season ratings alone.”

UPDATE: Here are the Elo picks and their outcomes in the actual games.

1. Elo liked the Vikings + 7. They lost by 17 (27 - 10). A loss for Elo.

2. Elo liked the Ravens - 9½. They lost on the field. Badly (12-28). Nobody saw that one coming. A loss for Elo.

3. Elo liked the Chiefs - 11½. They won and covered (51-31). A win for Elo.

4. Elo liked the Packers - 4½. They won by 5 (28-23). A win for Elo.

Two up, two down. With Elo as a guide, if you had bet $110 to win $100 on each game, your net would be -$20.

Impostor Syndrome, an Idea Whose Time Has Come . . . Again

January 8, 2020
Posted by Jay Livingston

The impostor syndrome is the dancing partner of pluralistic ignorance. That was the point of the previous post. In pluralistic ignorance, each person thinks that others are doing better at living up to cultural ideals. On campuses where “hook-up culture” rules, most students think that everyone else is having more sex and better sex than they are. Two axioms from Goffman account for this misperception. First, norms require that people present more or less idealized versions of themselves and keep contradictory, self-damaging information to themselves. Second, absent any contradictory information, we accept and ratify the self that the other person presents. These norms make it easy for real impostors to go unchallenged for so long.*

Unless we are playing Humiliation (see the previous post or David Lodge’s novel Changing Places), we don’t tell our colleagues which classics in our field we haven’t read. Instead, when the conversation turns to Weber’s Economy and Society, we nod and keep our silence, assuming that most of the others in the room have read it at least once. Maybe we make a tangential comment or ask a general question, and the others, for their part, observing our wisdom, assume that we too have made our way through all 1700 pages. We, meanwhile, feel like impostors. (And by we in this paragraph, I mean me. And maybe you.)

The phrase in the seminal 1978 article** was “impostor phenomenon,” but “impostor syndrome” quickly became the more popular choice. It was a phrase just waiting to be coined.


(Click on an image for a larger view.)

The sharp increase of “impostor syndrome” in print did not reflect an increase in the thing itself. It was already widespread; it was just hiding in plain sight. It was the feeling that could not speak its name. But once someone did speak its name, people were seeing it everywhere.

Did impostor syndrome have a precursor? It did indeed. In the 1920s and 30s, “inferiority complex” followed a similar trajectory.


The person most responsible for putting the concept in play was the psychoanalyst Alfred Adler. He used the term “inferiority feeling,” but in the same way that phenomenon gave way to syndrome with impostor, Adler’s feeling was soon swamped by complex. Wikipedia describes it as “feelings of not measuring up to standards, a doubt and uncertainty about oneself, and a lack of self-esteem.” Which sounds a lot like impostor syndrome. The “inferiority complex” was popular in the 1940s and 50s, but nowadays it’s hard to find anyone who has one, sort of like a Studebaker.

From the sociological perspective, the trouble with both these versions is that they emphasize the individual. That’s obvious with “inferiority feelings,” which assumes that these are a matter of individual psychology. But even the studies that look at gender, class, or race take these as permanent characteristics of the individual. What these ignore are the structural, situational conditions that make the feelings more likely or less likely.  And these conditions may matter far more than the psychological or demographic characteristics of the individual.

I hope to explore this idea in a later post.

-----------------------
* Frank Abagnale, the impostor played by Leonardo DiCaprio in “Catch Me If You Can,” once got a university adjunct position in sociology. He said he was a sociologist, and nobody tried to prove him wrong. He moved on after a semester, probably to keep ahead of the law. But apparently his student evaluations were good. The department chair asked him to stay for another semester.


** Pauline R. Clance and Suzanne A. Imes, “The Impostor Phenomenon in High Achieving Women: Dynamics and Therapeutic Intervention,” Psychotherapy: Theory, Research & Practice 15(3), 1978: 241–247.

Impostor Syndrome and Cultural Rules

January 6, 2020
Posted by Jay Livingston

Many years ago, I was talking with a successful business consultant. He had a PhD in business from a prestigious school, was on the faculty of another prestigious business school, and for several years corporations had been paying him large sums to come in for a day or two. Still, he didn’t feel secure. “I keep thinking, What if they find out?”

I remembered that conversation when a thread on impostor syndrome showed up in my Twitter feed. What set people off was this tweet in response to someone who had tweeted about her own feelings of being an impostor.


True, it’s the individual’s problem, not society’s. Society doesn’t suffer if you feel like an impostor. But that doesn’t mean that social factors are irrelevant. What if impostor syndrome is more prevalent and more persistent among women than men? That would certainly suggest that the causes are social and not just psychological. In fact, many of the responses to this tweet argued that a person’s race, class, gender, age and other social variables might affect the probability that they would feel like an impostor. Or as Nathalie Olah in the Guardian (here) put it, “what seems more likely is that impostor syndrome is a rather natural reaction of anyone from a working-class, disadvantaged or minority background to the various biases they face on a daily basis.”

The studies on these variables are far from unanimous, if only because there’s no standardized way of measuring the crucial variable of feeling like an impostor. But I’d like to throw in one more variable — culture. My guess is that impostor syndrome is more prevalent and more deeply affecting in societies that emphasize success, societies like, oh I don’t know, the United States.

The British, by contrast, seem not so obsessed by success. In some instances, not being a success and not having the right stuff can put a person one up on the others in the room. I’m thinking here of Humiliation, a party game played by literature professors in Changing Places by British novelist David Lodge. Each person has to name a literary work, preferably one in the canon, that they have never read. You get a point for every other player who in fact has read it. The winner will be the one who has not read the classics that a lit professor would be expected to be familiar with — in other words, the biggest impostor.

Presumably, for the British and for less success-obsessed Americans the game is just a bit of fun. But for Howard Ringbaum, a success-driven American professor, the game creates an unresolvable conflict. “He has a pathological urge to succeed and a pathological fear of being thought uncultured, and this game set his two obsessions at war with each other, because he could succeed in the game only by exposing a gap in his culture.”

Ringbaum and Humiliation are fictions invented by a novelist. But Kate Fox is an anthropologist, and in Watching the English: The Hidden Rules of English Behaviour, she describes a similar game of “competitive self-deprecation” when real Brits talk about their home-improvement projects.


When showing visitors the results of your DIY efforts . . . a strict modesty rule applies. Even if you are highly skilled, you must always play down your achievements, and if possible play up your most embarrassing mistakes and blunders... 

‘I managed to burst three pipes just laying the carpet!’ 


‘We bought an expensive carpet, but I ruined it by cutting it four inches short, so I had to build some bookcases to cover the gap.’


‘You think that’s bad: it took me an hour and three cups of tea to put up a coat-hook board, and then I found I’d hung it upside-down!’


Here’s more British fun: In a dinner-party scene in the film “Notting Hill,” the competition involves not a specific area like literature or home-improvement but more or less the total self. Except for the newcomer in their midst, a famous actress (played by Julia Roberts), the guests all know one another well, and as the host says, “I've long suspected that we really are the most desperate lot of under-achievers.” At dessert time, there’s one extra brownie, and the host says he will award the brownie to “the saddest act here.”

The video is nearly four minutes long, but it’s worth watching.   



I can’t think of anything similar in American novels or films. Maybe such scenes exist, and in any case, the connection between cultural fictions and real life is not always clear. But the larger point is that impostor syndrome depends on the rules of the game. Where the rules of everyday life allow for the disclosure of personal flaws, people will be less likely to feel like an impostor, and that feeling will be less salient.

I Do Not Like Thee, Dr. Phil

January 4, 2020
Posted by Jay Livingston

I’ve had that title in my head for years now, but I’ve never written the blog post that goes with it. That’s partly because I don’t think that many people know this nursery rhyme.

I do not like thee, Dr. Fell,
The reason why I cannot tell;
But this I know, and know full well,
I do not like thee, Doctor Fell.

Exactly right. I do not like Dr. Phil, yet I would be hard pressed to articulate the reason why. In order to do so, I’d have to watch a lot of his TV segments, and since I don’t like him, I’d find that unpleasant, so I don’t. Which is the other reason I never wrote that post.

But now we have this photo of the dining room in Dr. Phil’s house.

(Click for a larger view.)

Would you refer a friend to a therapist whose dining room featured cartoonish animal statuary and a wall display of a dozen military-style guns? It’s not a decision you’ll have to make. Dr. Phil has not been licensed to practice since the earlier part of this century.

Also, there are some minor factual quibbles, which William Hughes at AVClub (here) raises and dismisses.

McGraw does not currently live in the California estate, which is now owned by the family trust and occupied at present by McGraw’s son. And there’s no evidence that he had any impact on its decoration, beyond, presumably, footing at least some of the cash for its eclectic collection of guns, “FUCK” paintings, and big giant lips. And yet, we have to assume that McGraw has at least stood in this building in its current state, muttering approvingly of the dining room gun wall, the gaudy Star Wars art, and, in what has quickly become our favorite touch, the legion of small Lego-ish bears scattered around the home, including two who are dressed up like Batman and the Joker, because hey, fuck it, that’s fun.

The larger point is first that sometimes your initial impression of a person, immediate and without rational thought, is right; and second, that you may have to wait a long time — until you see their dining room decor — to know that you were right.

Mrs. Maisel Gets One Right

December 26, 2019
Posted by Jay Livingston

Since the new season of Mrs. Maisel dropped not long ago, my post from nearly two years ago about its language anachronisms (here) has been getting some action. It’s still the most frequently viewed and commented-on item on this blog, and some of the newer comments made it clear that the anachronisms were still in bloom.

I watched the first couple of episodes recently, and sure enough, in Episode 2, “It’s the Sixties, Man!” we got contextualize.



In a blog post (here) about trendy words at the ASA meetings, I  cited contextualize, but that was a report from academia in 2013, not New York family life in the early 1960s.

To the show’s credit, it did have a period-perfect language moment. Joel has been speaking with the older Chinese couple who own the Chinatown space he has rented, planning to turn it into a club. He discovers that the space includes a Chinese gambling parlor. Worried about trouble from the police, he meets with the owners. After some back-and-forth to deal with Joel’s fears, the couple — through their translator,* the young med-student Mei — allude to a murder, significantly raising Joel’s level of anxiety.

After the Chinese couple leave, Joel is discussing the matter with Mei. What about the murder, he asks.


Talk of the “put-on” and “putting you on” came on the scene in the late 1950s, starting among young, hip people like Mei and eventually spreading throughout the society. I thought that its use had declined in the 21st century, but Google nGrams shows that at least until 2000, it was still found in books.


Still, my impression is that we rarely refer to “putting people on” these days. But what has replaced it?

--------------
* Another anachronism for anyone still keeping score — a language anachronism of sorts: the owners are speaking Mandarin. In the early 1960s, the language spoken in Chinatown was Cantonese. Immigration from Mandarin-speaking areas of China did not begin until the 1970s at the earliest.

Acting and Reacting as an Agent of Culture — Moi?

December 21, 2019
Posted by Jay Livingston

A long time ago I heard Margaret Mead speak, and one of the only things I remember her saying was this: “A society has to teach its cultural ideas to its children in terms so clear that even a social scientist can understand them.”

I am, allegedly, a social scientist, but only an encounter with something very unusual can jar me into seeing my own culture. Like most people, I usually take it for granted, like the air we breathe. That was the point of the previous post, where a psychologist was urging dog owners to give their dogs more choice. It took this extending of human culture to canines to remind me of the great emphasis American culture gives to individual independence and choice. All those times that I had heard parents, me included, ask their kids if they wanted Cheerios or Chex, it had never occurred to me that we were drilling a particular American value into the minds of our little tots. I thought we were just being parents.

I had a similar cultural-blindness experience a few years ago. A student born and raised in Turkey came for his obligatory meeting with his advisor — me. He was a grown man in his forties. “What courses I should take?” he asked. I explained about the core requirements and recommended he take the first in the sequence. “And then there are the electives” I said and showed him the list.

“Which courses I should take?”

I explained that these were electives. He could take any of the ones we were offering that semester. If you’re interested in family, you could take that course, I said. If you’re interested in religion, we have that elective.

“Yes, but which ones I should take.”

I found it incredibly frustrating. What was so complicated about the concept of electives? It did not occur to me that our differences were cultural. I was so thoroughly an American that I could not imagine anyone rejecting the freedom to make their own choice of courses. Who would not seize that opportunity? Only someone who did not understand.

In retrospect, I now think that he did in fact understand. He just didn’t think it was such a great idea that the choice should be made by him rather than by a professor — department chair no less — who knew much more about the courses and the instructors. Maybe he was right.

There’s something else to be said for his approach. It creates a personal link between the advisor and the student in a way that the independent-choice model conveniently avoids. When he was asking me to choose courses for him, the thought crossed my mind that I could tell him to sign up for some of the low-enrolled courses that were in danger of being cancelled — courses students were avoiding because of the reputation of the course, the instructor, or both. That certainly would have made things easier for me as department chair. But I now felt that I had to look out for his best interests as well. I felt an obligation that was different and perhaps stronger than what I would feel towards other students.

As I say, when all this was happening, I didn’t think about the underlying cultural differences. I just felt uncomfortable. I will leave for another post the time when he presented me with a large assortment of almonds, figs, pistachios, etc., while I tried to explain to him the university rules about gifts.

Raise Your Dog to be an American

December 19, 2019
Posted by Jay Livingston

My local website WestSideRag today ran an article with the title “Barnard Researcher Has Studied The Minds of Upper West Side Dogs, and They’re Way More Complicated Than You Think.”

I don’t have a dog, but I started reading.* And as I did, I saw that while the article was about dogs, it was more importantly a document about American culture, particularly our values and beliefs about Choice. We value individual choice as inherently good. We also believe that choice is beneficial and that denying people the freedom to choose will in some way harm them. So we insist that people make their own choices.

Recognizing the wonderfulness of choice is not something that comes naturally. You have to be carefully taught. And it’s never too early to start. It’s just that most of the time, we don’t think that we are hammering American cultural ideas into our kids’ psyches. We just think we’re raising them the right way.

In My Freshman Year, an ethnography of an American university, Rebekah Nathan** includes a chapter (“As Others See Us”) about the perceptions of the foreign students. A Korean student tells her:

Everything here is: “What do you want?” “What do you think?” “What do you like?” Even little children have preferences and interests in this country. I hear parents in restaurants. They ask a three-year-old child, “Do you want French fries or potato chips?” Every little kid in this country can tell you, “I like green beans but not spinach, I like vanilla but not chocolate, and my favorite color is blue.”

If we think it’s good for three-year olds to make their own choices, why not dogs?

All dog owners should allow their dog to make certain choices, according to Horowitz, who strongly believes that giving dogs choices increases their welfare. . . . Owners should “allow the dog to make their own choice as opposed to your definition of the walk.” She recognizes that people want to feel in control, but points out “what we are in control of is to let the dog have a life in which the dog is partly choosing. This is something we want to give to anyone we love.”

WestSideRag has a relatively small readership — we’re not talking Slate.com — and an article extending our ideas about choice to dogs is extreme. But often the extreme case can call attention to the less extreme versions that are widely taken for granted and unnoticed. In America, even those with a more authoritarian outlook find it hard to refute arguments based on the idea of choice. It’s not just liberals who ask their kids what kind of cereal they want. 

-------------------------
* What originally drew me to the article was the opening paragraph, which contained a pun that I am nearly certain was unintended.

(Click on the box for a larger and clearer version.)

** “Rebekah Nathan” is a nom de plume. The author, Cathy Small, probably wanted to remain anonymous since she was writing about the school where she teaches. The ruse did not work for very long.