Methodological Trees and Forests

December 12, 2019
Posted by Jay Livingston

The units of analysis that researchers choose usually constrain the explanations they come up with. Measuring variables on individuals makes it harder to see the effects of larger units like neighborhoods.

For example, much research has found a correlation between female-headed households and crime. Most explanations for this correlation focus on the households, with much talk about the lack of role models or the quality of parent-child interaction. But these explanations are looking at individual trees and ignoring the forest. The better question is not “What are the effects of growing up in a single-parent home?” It’s “What are the effects of growing up in a neighborhood where half the households are headed by single mothers?”

In the early 1990s, I wrote a criminology textbook, and one of the things that differentiated it from others was that it took seriously the idea of neighborhoods and neighborhood-level variables.

That was then. But now, Christina Cross in a recent Times op-ed makes a similar argument. Research generally shows that it’s better for kids to grow up with two parents rather than one. That fits with our assumptions about “broken homes” even if we now call them “single-parent households.” But Cross’s research finds a crucial Black-White difference in the importance of this one dimension of family structure.

Looking at educational outcomes, she finds that White kids from two-parent families do much better than their single-parent counterparts. But for Black kids, the advantage of a two-parent home is not so great.
living in a single-mother family does not decrease the chances of on-time high school completion as significantly for black youths as for white youths. Conversely, living in a two-parent family does not increase the chances of finishing high school as much for black students as for their white peers.

 Why does a two-parent family have less impact among Blacks? Cross looks at two explanations. The first is that the effect of a very low-income neighborhood (“socioeconomically stressful environments”) is so great that it washes out most of the effect of the number of parents inside the home. For a kid growing up in an area with a high concentration of poverty, having a father at home might make a difference, but that difference will be relatively small, especially if the father is unemployed or working for poverty-level wages.

The other explanation is that having other relatives close by mitigates the impact of having only one parent in the home. Cross says that her data supports this idea, but the extended-family network explanation is not nearly as powerful as the neighborhood-poverty explanation.

For policy-makers, what all this means is that the traditional conservative, individual-based solutions miss the point. Exhorting people to stay married (and providing costly government programs along the same lines) isn’t going to have much impact as long as we still have racially segregated neighborhoods with high levels of unemployment and poverty.

The message for researchers is similar: if you confine your thinking or your variables to individuals, you risk ignoring more important variables.

Whatever Happened to “Broken Homes”?

December 11, 2019
Posted by Jay Livingston

“Just think about the last time you heard someone use the term ‘broken home’ or ‘single-parent household’ to explain the misbehavior or misfortune of a person in your social circle.” That’s from a Times op-ed (here) by Christina Cross, a sociology post-doc at Harvard.

I think the last time I heard “broken homes” was before Ms. Cross was born. It’s so 1950s, with its judgmental pronouncement on families that didn’t look like “Ozzie and Harriet” or “Father Knows Best.” In the 70s, as more middle-class people were getting divorced, we needed a less value-laden term. Enter “single parent.”

(Google nGrams shows the frequency of words in books, so the change in the use of these terms in the media and everyday talk probably happens a year or two earlier.)


“Single-parent” is not as blatantly stigmatizing as “broken homes,” but when we hear it, we still think that something is wrong. The more important point that Ms. Cross makes is that broken homes — the harmful outcomes they bring — may be much more consequential for Whites than for Blacks. I hope to get to that in a later post. But for now, I’ll just point out that the sharp decline in mentions of “single-parent” starting in the early 90s tracks with the decline of teen crime and teen pregnancy in the same period.

Alumn. . .us / a / i / ae / x?

December 9, 2019
Posted by Jay Livingston

Wesley Yang, a writer by trade, must have been browsing messages from his alma mater, Rutgers, when this caught his attention. He posted it to Twitter.

(Click on an image for a larger view. I have added the blue oval for emphasis.)

English has a problem with words that are imported from Latin and Romance languages. What do we do about the gender and number markings that give Latin four different forms of the word? Alumnus, alumna, alumni, alumnae. That’s just too many to keep track of, especially since the plurals, masculine and feminine, sound alike.

The simple solution is to use a single form of the word to cover everyone, and that’s what many people have done. For that single form, there are two choices. One is alum, plural alums, which conveniently gets rid of those gendered endings in the Latin versions.

The other is the all-purpose alumni, which many English speakers now use indiscriminately for either gender and for singular or plural. It must drive Latin scholars up the ivy-covered wall. Both of these have been getting more popular lately. I searched for them on Nexis-Uni’s database of news sources.


(Some of the increase in these numbers may be attributable to the increased size of the Nexis database. But I doubt that it has grown since the 1990s by the factor of 20-30 that we see for “an alum” and “an alumni.”)

A Google search for “she is an alumni of” gets 177,000 results, slightly fewer than the 191,000 for “she is an alumna of.” For men, “an alumnus” still outscores “an alumni” by a factor of 10, but that includes a lot of old sources. I would bet that the ratio decreases with time.

Personally, I would avoid the problem completely and go with graduates or, in less formal settings, grads.

The Rutgers Linguistics department has a different solution to the problem of gender: the very recent coinage alumnx, the word that inspired Yang to tweet. It’s a different sort of solution. The ungendered alum and the all-purpose alumni seem to have cropped up unplanned and without any ax to grind. But alumnx is a deliberate effort to change the language. What’s interesting here, as one of Yang’s commenters points out, is that as linguists the department members are descriptivists, more interested in describing how people actually use language than in telling people which words to use. But here they are prescribing alumnx as the correct way to avoid the less woke Latin forms.

My guess is that these x-words will have a short life. Most people don’t care much about the politics of speech and have little interest in changing. Remember “Freedom Fries”? Worse, alumnx and Latinx don’t resemble real words in English or in any other language.* It’s one thing to replace the “man” in policeman or fireman. “Police officer” and “fire fighter” were already part of the language well before the feminists of the 1960s called our attention to the sexism of the more frequently used terms.

Maybe I’m wrong. Maybe these x-words will become as much a part of the language as Ms. In any case, I expect that the members of the department will, as descriptivist language researchers, monitor how well they are doing as prescriptivist language changers. (Or is it changerx?)

------------------
* Latinx has similar problems. It’s not very popular among the people it is meant to designate, probably because neither English nor Spanish has words in this form. Terry Blas suggests (here) that a better gender-neutral solution would be an “e” rather than an “x” — Latine and Latines.

My So-Called War Crimes

December 3, 2019
Posted by Jay Livingston

When I first saw this item in my Google News feed, I thought it must be from The Onion or Andy Borowitz.

But no, it was real, from the Daily Beast, not a parody site. “If Donald Trump gets his wish, he’ll soon take the three convicted or accused war criminals he spared from consequence on the road as special guests in his re-election campaign.”

Why would war criminals be an asset for Trump? Trump’s base, of course, will not abandon him no matter what he does. But what is the gain? What virtues do these men embody that will pump up the enthusiasm and perhaps attract others?

First, let’s meet the war criminal who Trump has supported most strongly.

Chief Petty Officer Edward Gallagher and other Navy SEALs gave the young captive medical aid that day in Iraq in 2017, sedating him and cutting an airway in his throat to help him breathe. Then, without warning, according to colleagues, Chief Gallagher pulled a small hunting knife from a sheath and stabbed the sedated captive in the neck. A week later, Chief Gallagher sent a friend in California a text with a photo of himself with a knife in one hand, holding the captive up by the hair with the other. “Good story behind this, got him with my hunting knife,” he wrote. [NYT ]

Gallagher was acquitted of murder when another SEAL claimed to have killed the victim first, cutting off his breathing tube as an act of mercy. But apparently Trump and those who share his views think there’s something noble about knifing a dead man in the neck. Certainly Chief Gallagher thinks it’s something to brag about.

One SEAL sniper told investigators he heard a shot from Chief Gallagher’s position, then saw a schoolgirl in a flower-print hijab crumple to the ground. Another sniper reported hearing a shot from Chief Gallagher’s position, then seeing a man carrying a water jug fall, a red blotch spreading on his back. Neither episode was investigated and the fate of the civilians remains unknown.

The Times is being extremely careful here, saying that the fate of the civilians Gallagher shot “remains unknown.” But take a wild guess.

At trial, the SEALs were found not guilty.* But while the evidence may not have been sufficient for the military jury to convict the men, it was enough for the Navy to seek their removal from the SEALs. Trump intervened and forced the Secretary of the Navy to resign, effectively allowing Gallagher and the others to remain as SEALs.

What makes Gallagher so appealing to Trump, his followers, and Fox News? They deny the accusations, of course, but even if the men had been convicted, the case elicits ideas and emotions that are essential elements of much of the conservative worldview in the US these days.

To begin with, supporting the SEALs requires a strong sense of tribalism. This tribalism goes far beyond the “loyalty” Jonathan Haidt sees as a “moral foundation” of conservative thinking. Tribalism sees the world as Them against Us. We are under constant threat from Them. This view obviously pervades domestic politics, where Trump’s go-to strategy has been to claim that Democrats are out to get him.** In foreign policy, it means that anything We do to Them is justified. Anything. Trump has voiced his preferences for torture, putting children in cages, alligator-filled moats, and nuclear bombs, all on the grounds that these protect America from its various enemies. The willful killing of civilians easily fits into the list.

Therefore, We must defend the actual people who carry out these actions — the protectors of the country (the military, border guards, ICE) and of the social order (police). In practice, this means that there are no restraints on what they can do to people who are not Us or to people who dissent from or threaten the social order. So long as soldiers and cops are ostensibly doing their job, anything goes.

This defense of those who protect our tribe has an obvious corollary: American autonomy, an innocuous term for the idea that we should not be involved in any relationship of mutual regulation or obligation with other countries. “World government” has long been a trigger for the right, and they remain suspicious of the UN and other international pacts. What are “war crimes,” after all, except the product of international law, a concept which to US conservatives is illegitimate? So on Fox News, the shooting of a little girl and the knifing of a wounded prisoner in the neck are “so-called war crimes.” The law that criminalizes these actions, in the America-first view, is not legitimate and is therefore null and void. No law, no crime.

Finally, the Trumpists see Gallagher as an example of the conservative ideal of masculinity. I have gone on too long already, and this really needs no further explanation. It’s enough to note that Trump refers to him admiringly as a “warrior.”
----------------------------------

*For most of the charges, the prosecution had no hard evidence, only the testimony of other SEALs.  The court convicted Gallagher on one count — the taking of a picture of the dead ISIS fighter. Kind of hard to ignore that photo he sent around. As for sniping at civilians, when Gallagher was shooting at the young girl, he didn’t take any selfies.

** This “out to get us” strategy was also part of Gallagher’s defense. His lawyers argued that the men who testified against him were motivated by a desire to get rid of him.

Cred — Authorship and Authenticity

November 25, 2019
Posted by Jay Livingston

Marjoe Gortner was a child prodigy Evangelical preacher. The 1972 documentary “Marjoe” includes home movies of him preaching at age four. The film, made when he was in his late twenties, shows him still at it, preaching to large crowds.  The spiritual and religious uplift he gives is palpable. But the film also shows Marjoe offstage saying to the filmmakers, “I can’t remember a time when I actually believed in God.” 

If a man does not practice what he preaches, must we ignore the content of the sermon? It might be a very good sermon. It might do what a sermon is supposed to do — cause many in the congregation to become more religious and more virtuous. Should we tell them to instead cover their ears?

Yesterday, I saw this tweet.

I recently peer reviewed a feminist article that cited Michael Kimmel’s work on how to be a “good man.” To state the obvious, this man has lost all credibility as an expert on this topic. Stop citing him. #MeToo  #MeTooPhD #MeTooSociology #AcademicChatter #AcademicTwitter
Advice on how to be a good man, much like a sermon, is prescriptive. Once we learn that the preacher is an atheist, we probably won’t go to the next revival meeting. But what about works that are descriptive? Does the author’s behavior invalidate his expertise and  accuracy?

Not that long ago, Jack-Alain Léger, a White Frenchman, wrote books under the Arab-sounding name Paul Smaïl.

The first Smaïl novel, “Vivre Me Tue” (“Living Kills Me”), was published in 1997, and its author was received as a genuine voice of the Beur community. [Beur is a slang term for North Africans living in France.] North Africa specialists were as fooled as anyone else. Sales were strong; a successful movie adaptation was produced; subsequent Smaïl novels were published; and people wrote dissertations on the work of Paul Smaïl. [Christopher L. Miller, Impostors: Literary Hoaxes and Cultural Authenticity]

When the hoax was revealed, many people were outraged. The book was derided as inauthentic, incompetent, and racist. But as Miller says, “Léger was delighted to point out that until he emerged as the real author the books had seemed Beur enough to everybody.” In the same way, until the accusations against Kimmel became public, his writings seemed feminist enough to everybody.

But Kimmel is a social scientist. Smaïl’s novels were fiction. Are the rules stricter for social science? Should we require a higher level of authenticity, of congruence between the qualities and character of the writer and the content of the publications?
 
In a recent blog post (here), Andrew Gelman asks this question, putting it the same way we might ask it of fiction: “Does authorship matter at all?”*

Gelman blogs frequently about seriously flawed research, including outright fraud, and I expected that his answer would be “No. The data and analysis speak for themselves.” I was wrong.

Information about the authors can give a paper some street-cred. For example, remember that paper claiming that single women were 20 percentage points more likely to support Barack Obama during certain times of the month? That paper had both male and female authors. If all the authors were male, I wonder if it would’ve been considered too silly or too offensive to publish or to promote. [emphasis added]

Gelman writes mostly about quantitative research, so he is concerned with “any misrepresentation of data and metadata, which includes authorship as well as details of how an experiment was carried out, what steps were done in data processing and analysis, and so on.” So if we learn that the authors were not who they claimed to be, we should look very carefully at the details.

The accusations against Kimmel have cost him whatever street cred he might have had as a feminist and perhaps as a sociologist. I imagine he is now having a hard time getting his work published.

But read “Raise Your Son to Be a Good Man, Not a ‘Real’ Man” (here), published just a few months before the accusations became public, and see if you think that its advice should be cancelled.

-------------

* It was Gelman’s post that led me to Léger/Smaïl. Gelman had been reading Louis Menand’s New Yorker article, which was based largely on Miller’s Impostors: Literary Hoaxes and Cultural Authenticity.

Brought to You by the Number 九十二

November 23, 2019
Posted by Jay Livingston

We were at 79th and Broadway, and I wanted to tell the French couple that they had to go up to 98th St. I had overheard them looking at their phone and puzzling about directions. I don’t get much chance to speak French, so I asked, in French, if I could help them.

“Il faut aller jusqu’au . . .” I started, but it took me an extra moment to remember how to say “98” in French. “Au quatre-vingt-dix-huitième.”

I remember that my brother, a statistician by trade, once commented that France has had a disproportionate number of noted mathematicians, and he wondered if the difference might have something to do with how kids learn to count. Compared with English, counting in French involves more sophisticated mathematical operations. Once you get past 69, you can no longer use the base-10 template that worked for numbers in the 20s, 30s, and so on. Seventy is soixante-dix (sixty-ten); seventy-nine is soixante-dix-neuf (sixty-ten-nine).*

After that you have to throw in some multiplication. Eighty is quatre-vingt (four twenty), and ninety-eight is quatre-vingt-dix-huit (four twenty ten eight) — 4 × 20 + 10 + 8.
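
Since the French names really are little arithmetic expressions, here is a minimal sketch in Python (my own illustration, nothing from the research mentioned below) that builds the names for 60 through 99 out of exactly those operations. It skips the orthographic fine points (the “et” in 71, the plural “s” on a bare 80), because the point is only the 60 + 10 and 4 × 20 structure.

# A toy sketch of the arithmetic buried in French number names for 60-99.
# Spelling niceties are ignored; only the additive and multiplicative
# structure matters here.

UNITS = ["", "un", "deux", "trois", "quatre", "cinq", "six", "sept", "huit", "neuf"]
TEENS = ["dix", "onze", "douze", "treize", "quatorze", "quinze",
         "seize", "dix-sept", "dix-huit", "dix-neuf"]

def french(n):
    """Name an integer from 60 to 99 the French way."""
    if 60 <= n <= 69:                 # soixante + unit
        return ("soixante-" + UNITS[n - 60]).rstrip("-")
    if 70 <= n <= 79:                 # 60 + teens: 79 = 60 + 19
        return "soixante-" + TEENS[n - 70]
    if 80 <= n <= 89:                 # 4 x 20 + unit
        return ("quatre-vingt-" + UNITS[n - 80]).rstrip("-")
    if 90 <= n <= 99:                 # 4 x 20 + teens: 98 = 4*20 + 10 + 8
        return "quatre-vingt-" + TEENS[n - 90]
    raise ValueError("this sketch only handles 60-99")

print(french(79))   # soixante-dix-neuf
print(french(98))   # quatre-vingt-dix-huit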

In a recent BBC article (here), Anand Jagatia discusses the idea that how we count affects our ability in math. English, French, Dutch, Welsh all have slightly different ways of naming numbers. The biggest contrast is between Western systems and those of East Asia. 
 
In Mandarin, 92 is written jiǔ shí èr, which translates as “nine ten two”. Japanese and Korean also use similar conventions. . . . Psychologists call systems like these “transparent”, where there is an obvious and consistent link between numbers and their names. There’s growing evidence that the transparency of a counting system can affect the way we process numbers.

The point is clearer if you use numbers rather than words — not “nine ten two” but “9 10 2.” To translate the Western “92” into math, you have to know about the tens place and the ones place. The Asian “9 10 2” shows more simply how the larger number is constructed from the smaller ones.
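
For contrast, here is an equally rough sketch (again mine, purely illustrative) of the “transparent” East Asian pattern, using tone-free pinyin syllables as nothing more than digit labels. The name of 92 simply reads off its base-10 decomposition, 9 × 10 + 2.

# A toy sketch of "transparent" number naming for two-digit numbers:
# the name is just the base-10 decomposition read aloud.

DIGITS = ["ling", "yi", "er", "san", "si", "wu", "liu", "qi", "ba", "jiu"]

def transparent(n):
    """Name an integer from 1 to 99 in the tens-then-units pattern."""
    tens, units = divmod(n, 10)
    parts = []
    if tens:
        if tens > 1:                  # "shi" alone means 10; tens of 2-9 get a digit prefix
            parts.append(DIGITS[tens])
        parts.append("shi")           # shi = ten
    if units:
        parts.append(DIGITS[units])
    return " ".join(parts)

print(transparent(92))   # jiu shi er   ->  9*10 + 2
print(transparent(42))   # si shi er    ->  4*10 + 2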

Does it make a difference?

Children who count in East Asian languages may have a better understanding of the base-10 system.

In one study, first-grade children were asked to represent numbers like 42 using blocks of tens and units. Those from the US, France or Sweden were more likely to use 42 individual unit blocks, while those from Japan or Korea were more likely to use four blocks of ten and two single-unit blocks, which suggests that the children’s early mental representation of numbers may have been shaped by their language. [emphasis added]


I’m not sure what the evidence is for the stereotype that Asian students do better in math than Western students. But if there is any factual basis, maybe the language of numbers accounts for some of the difference.

------------
* Belgians speak French, but they have simplified the numbers. Seventy and ninety are, respectively, septante and nonante — yet another reason for the French to look down on les belges. For some reason, eighty remains quatre-vingt.

Edward Shils

November 11, 2019
Posted by Jay Livingston

Fabio linked to this recent tribute to Edward Shils by critic Joseph Epstein in Commentary. Shils was a brilliant man, a polymath. As Epstein says, “His writing . . . often aimed at a high level of generality in the German social-scientific tradition of Max Weber and Georg Simmel.” He was on the faculty at the University of Chicago, in both the Sociology Department and the Committee on Social Thought.

Two passages in Epstein’s piece caught my attention. In one, Epstein writes approvingly of Shils’s willingness to use the autocratic power of the university administration to stifle dissent.

When graduate students occupied the university’s administration building during the 1960s student protests, Levi [the president of the university], on Edward’s advice, told them to evacuate the building or be removed from the university. Those who chose to remain were summarily expelled, their principles intact but (in many cases) their academic careers ruined.

(One of the characterological hazards of being a conservative is smugness. Instead of  comforting the afflicted and afflicting the comfortable, conservatives often find pleasure in the afflicting of the afflicted. The final clause in that passage is a good example. Ah, those ruined careers. Serves ’em right. Epstein, in case it wasn’t clear, is a long-time conservative who regularly writes for conservative publications like Commentary and the Wall Street Journal.)

In another passage, Epstein speaks of Shils’s forthrightness (“he was a person who knew his mind and spoke it without looking over his shoulder”). For example,

He told me that at a dinner party he once queried a married woman who spoke admiringly of Philip Roth about what must be her concomitant admiration for adultery, since that was one of the specialties in Roth’s fiction.

This is an incredibly stupid remark. It’s like saying that readers who admire Agatha Christie must also admire actual murder. (And by the way, since few of Roth’s protagonists are married, there isn’t all that much adultery in his novels. Deception is the most notable exception.) Yet here is Shils bragging to Epstein about insulting this woman as though his remark were a gem of Oscar Wilde-like wit. So yeah, Shils was brilliant, but not all the time.

Even dumber is that Epstein, in this remembrance of his “dear friend,” chooses to include Shils’s bon mot. Much of Epstein’s writing over the years has been about literature (other topics as well, but mainly literature), and he himself has written many short stories. Surely he must appreciate the difference between fiction and reality. Yet he repeats without comment Shils’s conflation of the two.

I was tempted to give this post the title “When Smart People Say Stupid Things.” Nil nisi bonum and all that, but Shils died 24 years ago, and Epstein is still alive and writing.

Proclaiming an Idealized History

November 6, 2019
Posted by Jay Livingston

“These people don’t have mothers and fathers. They have Mother’s Day and Father’s Day.”

I read Roger Brown’s excellent textbook Social Psychology at least four decades ago, but I still remember that sentence. It’s from the chapter on the authoritarian personality. Most people, when asked about their parents, give more or less objective assessments. But those who score high on measures of authoritarianism paint a highly idealized portrait.

That preference for seeing only the ideal may apply not just to the home but to the homeland.

The sentence came to mind when I was reading a WaPo story today about the Presidential Proclamation making November National American History and Founders Month. In case you hadn’t heard, President Trump issued that proclamation last week. NAH&FM is a new one, sharing November with, among others, National Family Caregivers Month and Heart Month, which Trump also proclaimed, as did his predecessors. But those presidents, since Bush 41, have also proclaimed November as National Native American Heritage Month. Last week, that proclamation did not appear.

Some people jumped to the conclusion that Trump was substituting the Founding Fathers for Native Americans. Not true. The Native American Heritage Month proclamation did appear on the White House website, though not till  yesterday and backdated to Oct. 31. But the larger point remains: Trump and his hardcore conservative supporters refuse to acknowledge any flawed motives in anything that the US — or Trump — has ever done. That includes the heritage of Native Americans, which on its face certainly raises questions about the motives and behavior of White men in America.

National Native American Heritage Month is a tacit acknowledgment of past sins, as if to say, “Yes, we may have stolen your land and slaughtered your people by the tens of thousands in the process, but we’ll give you a piece of November each year to make up for it.”  Trump’s proclamation does not, of course, mention any of that. Instead, in typical Trump fashion — “this isn’t about you, it’s about me” — it advertises all the wonderful things “my Administration” (the phrase appears five times in five short paragraphs) is doing for Native Americans.

Why add National American History and Founders Month? The proclamation explains. “To continue to advance liberty and prosperity, we must ensure the next generation of leaders is steeped in the proud history of our country.” That sounds nice, but immediately the critics chimed in. “Some historians slammed the statement for an oversimplified and glorified portrayal of a national history that is far more complex.” Well, what do they expect — complexity? From a Presidential Proclamation? From Donald Trump?

Still, the criticism speaks to an idealized version of America promoted by conservatives, and not just in proclamations at WhiteHouse.gov. Red staters who protest the removal of statues of Confederate heroes, for example, and who continue to display the Confederate flag prefer a history where secession had no trace of tainted motives — motives like racism. In a similar way, conservatives find no impure intent in what White people did in the westward expansion. Or if they do allow that some bad things happened, they see these in a “balanced” way, much like Trump’s view of the White nationalist rally in Charlottesville (“good people on both sides”).

Here for example is the conclusion to a long article in the right-wing magazine Commentary:

In the end, the sad fate of America’s Indians represents not a crime but a tragedy, involving an irreconcilable collision of cultures and values. Despite the efforts of well-meaning people in both camps, there existed no good solution to this clash.

Our hearts and our homeland are ever pure. A happy November to all.

How to Lie About Statistics — “Steady” vs. “Strong”

November 5, 2019
Posted by Jay Livingston

Evangelicals support Trump so strongly not because he promotes Christian values or beliefs and certainly not because he embodies or practices those values. They support him because he symbolizes the dominant position of White Protestants in the US. That was the gist of the previous post.

Just to make sure that this was about group identity and not Evangelical religious principles, I checked the Internet for information on Trump support among Black Evangelicals. If Trump’s appeal is tied to religious values, then Blacks should support Trump as strongly as do Whites. Sure enough, I found this headline in an article that appeared last March in the Washington Examiner. The article is reporting the results of a Pew survey.



It certainly sounds as though Trump is popular among Black Protestants. But the Examiner leans heavily to the right, so it’s best to look at the graphs of the Pew data.


Only 12% approved of Trump, and that percent was unchanged from a year earlier. So the Black Protestant support for Trump was “steady.” You could even say it was “firm.” It’s not a lie; it’s just misleading.

The headline could just as accurately have been, “Decline in White Evangelical Support for Trump.”

Evangelicals for Trump — It’s Not About Religion

November 4, 2019
Posted by Jay Livingston

Evangelicals remain unwavering in their support for Trump, much to the puzzlement and consternation of those on the left. On Friday, Josh Marshall tweeted, “this is basically the most profound insult to christianity i have ever heard.” The insult was delivered by Robert Jeffress, an Evangelical megachurch pastor and frequent guest on Fox.

We’re going to talk about lobbying for those values that the President embraces. . . . Never in the history of America have we had a president who was a stronger warrior for the Judaeo-Christian principles upon which this nation was founded than in Donald J. Trump. . . . The effort to impeach President Trump is really an effort to impeach our own deeply held faith values. [The tweet and a video of the quote are here.]

The Fox host, as far as I know, did not ask which Judaeo-Christian principles the pastor had in mind. There’s abortion, of course. But what principles apply to Trump’s other achievements — tax cuts for corporations and the wealthy (or anyone else, for that matter); barring immigrants; reducing regulations on business; or raising tariffs?

Evangelical support for Trump isn’t about policies, and it isn’t about religion or principles. It’s about “status politics” or what we now call “identity politics.” In status politics, the question is not which policies will prevail. Those policies are important not for their practical outcomes but for their symbolic value. The real question is “Whose country is this?”

Ten years ago, people like Pastor Jeffress and his followers opposed Obamacare not so much because of its effects on healthcare but because the change symbolized a lowering of their status. It was saying that people like them — White, Protestant, non-urban — were no longer the dominant group in the nation. (See this earlier post about healthcare and Prohibition as status politics.)

In that post, I said, “the election of Obama and now the possibility that he will enact a real change confronts them with the reality of their loss of dominance. That’s why they see health care in such apocalyptic terms.”

Today, these same people have tied their status not to any issue or policy but to a single person — Trump. They see the specter of Trump being removed from office, whether by impeachment or an election, as a huge threat. But what is threatened is not their “deeply held faith values” as the pastor says. It’s their status position of dominance.

Remembering Clifford

October 30, 2019
Posted by Jay Livingston

Clifford Brown, the brilliant jazz trumpet player, would have been 89 today. He died at the age of 25 in an automobile accident on the Pennsylvania Turnpike. It’s a poignant irony that one of his earliest jazz recording dates was with J.J. Johnson and included J.J.’s tune “Turnpike.”

Here is his best-known tune and recording — “Joy Spring.” Learning to play Brownie’s solo  (you can follow along with the transcription below) is part of the education of any serious jazz trumpet player. Ask Fabio.



After Brownie’s death, Benny Golson wrote a tune in tribute, “I Remember Clifford.” It is part of the repertoire of every trumpeter. Every trumpeter. There’s an old jazz joke:

A small combo — rhythm section and trumpet — has a gig, and at the last minute the trumpet player has to bow out. So they quickly get the first trumpeter they can find. The guy shows up with his horn, and as they’re talking about what they might play, he says that he only knows three tunes.
   
That’s OK, they say (they’re desperate). We can play them in different keys and different tempos, and somehow we’ll get through the night. What are the tunes?

“The Star Spangled Banner,” “My Country ’Tis of Thee,” and “I Remember Clifford.”

Not All Small-Town, Working-class Business Owners

October 22, 2019
Posted by Jay Livingston

Cone-E Island, Catskill, NY last Saturday.

(Click for a larger view.)

The sign says Fall Hours are 12 to 9 p.m. (you can read it if you click to expand the picture), but even though this was a beautiful autumn day, Cone-E Island was closed.

“Wanna buy it?” called out a raspy voice. As I was taking pictures, a pick-up truck had driven up and stopped. The driver was a man of sixty or so, fat and wearing a t-shirt. I walked over and asked the obvious question. “Three-fifty,” the man said.

A chocolate brown dog that looked to be part pit bull poked her nose through the half-open window and sweetly licked my offered hand. “Her name’s Mocha.”

Catskill is changing. Once a working-class town, it now has a tattoo parlor, a micro-brewery with its own beer garden, stores selling quirky things like LPs or old film cameras from the 1950s. Artisans priced out of Brooklyn are moving to the area. The New York Restaurant on Main street serves truffle Parmesan Brussels sprouts and salmon with miso honey, ginger steamed rice, and blistered edamame.

Mr. Cone-E Island had owned other businesses in the area. He seemed like the epitome of the working-class Joe trying to make it on his own rather than work for someone else. I thought about him again two days later when the Times ran an op-ed by Florida journalist Darlena Cunha about how the impeachment story is playing in her state.

Working-class Republicans in Alachua County see Donald Trump as a white businessman who made a lot of money. They like to think that could be them. The only thing standing in the way of achieving that dream, they tell me, are policies that elevate people of color, immigrants and poor people without health care. In their eyes, Mr. Trump is a patriotic man doing the best he can, and those who go against him are traitors to the country.

Although Trump is rich and these Republicans are not, they still identify with him because they are thwarted by the same forces. They have the same enemies.

Republicans here can equate these “witch hunts” to things that have happened to them in their own lives. Just like they, unfairly, have not been able to move up in the world, so too is Mr. Trump, unfairly, being hunted down, his words and motives twisted to suit the needs of that same enemy. The investigations only strengthen their kinship with him.

I wasn’t in central Florida. But Mr. Cone-E Island’s girth, his dog, his pick-up truck — I wondered if he had a gun in the cab — plus the demographic (older, White, male, small town) all suggested that I shouldn’t be swinging the conversation to politics. I’d stick to business. “This town is going upscale,” I said. “In a couple of years . . . .”

“By then it’ll be four-fifty,” he said, then added, “if this idiot doesn’t ruin the whole economy.” He went on. He wondered how many millions of our tax dollars went to Trump’s golfing trips, to the floors of Trump Tower the government had to rent from Trump. “Trump’s a businessman,” I said. I was going to add, “like you,” but I didn’t have to.

“What kind of businessman?” he said. “He stiffs his suppliers, his contractors, his creditors.” He could have gone on.

Well Mocha, I thought, I guess we’re not in Kansas. Or Florida.

No More Nigels

October 21, 2019
Posted by Jay Livingston

Calvin Trillin once proposed that Americans and the English have a name exchange. English people would start naming their kids things like Sonny and LeRoy. American babies would be Cyril or Cedric.
“Think of how proud the English would be on the first year that every single linebacker in the National Football League all-star team is named Nigel.”
Trillin wrote this a while ago, and the NFL still has no Nigels. But neither does English professional soccer. Well, there might be one — Nigel Reo-Coker, a midfielder who Wikipedia identifies as currently a “free agent.”

Don’t look for Nigels to start popping up on British rosters any time in the future. In 2016 in the UK, no babies were named Nigel. None. In 2017, there were eleven, and last year, eight. You can still find Nigels walking around in England, but they are getting long in the tooth. Brexiteer Nigel Farage, probably the best known, is 55. And while there are no footballer Nigels, elsewhere in sport, over at the snooker table, you’ll find Bond, Nigel Bond, though his ranking has fallen to 99th and he’s roughly the same age as Farage.

This quintessentially English name has gone the way of the shilling and half-crown. And as with other names that have fallen from favor, it’s very hard to say how or why.

Quote TK

October 19, 2019
Posted by Jay Livingston

Peter Navarro is an economist who now works in the White House as an adviser on trade. You can find his books in the non-fiction section of the bookstore, though that label may now include an asterisk.

In his 2011 book Death By China, Navarro quotes an expert on China, Ron Vara, on how nasty and dangerous the Chinese are as trading partners: “Only the Chinese can turn a leather sofa into an acid bath, a baby crib into a lethal weapon, and a cellphone battery into heart-piercing shrapnel.”

It’s a great quote. The only problem is that Ron Vara is fictional.  Navarro made him up (the name is an anagram of Navarro). Ron Vara has made appearances in other Navarro books. I haven’t read these, but I would guess the purpose is the same — to include a really strong quote, so strong that for Navarro to acknowledge it as his own would reveal him as a very biased non-fiction writer.

Navarro claims it’s all in good fun, a “whimsical device.” Honest journalists who play by the rules see it as “making stuff up” or more simply “lying.”

But what Navarro did is not all that different from the legitimate journalistic technique of searching out someone who will give you the quote you want, the quote that expresses your own views but that you can legitimately attribute to someone else. “Quote TK” (quote to come) in the draft of a story means that the writer needs a little more time to find someone who will express a particular opinion. Honest writers may have to go deep into their contact list, but eventually they usually get something usable.

Navarro’s method of making stuff up has great advantages over honest non-fiction writing:
  • It results in quotes that are much sharper and that are guaranteed to express precisely the opinions you want expressed.
  •  It’s much less work.
  •  And as the NPR story notes, it’s perfectly compatible with the current occupants of the White House.

Hypocrisy and Intended Consequences

October 17, 2019
Posted by Jay Livingston

It’s one thing to be puzzled, surprised, or dismayed by unintended consequences. But when the consequences are intended, those reactions are either self-delusion or flat-out hypocrisy.

Yesterday, a boxer died of brain injuries a few days after his opponent scored a tenth-round knockout. (I’m not going to go into the details. You can read some of them here.)

An AR-15 rifle is designed to kill a lot of people in a few seconds. Most people who own an AR-15 do not use it for that purpose, so we pretend to be surprised when a civilian does use the weapon to do what it was designed to do. We ask, how could such a thing happen?

Most boxing matches do not end in death or serious brain damage. But the goal of boxing, unlike that of other sports, is to pound the other person into unconsciousness, usually by hitting them in the head with as much force as possible. Sometimes boxers suffer brain injury. Sometimes they die. And as with guns, we pretend to be surprised and dismayed when the outcome of the boxing match is precisely what the sport was designed to do. 

Philip Rieff — Moralist and Plagiarist

October 15, 2019
Posted by Jay Livingston

In the 1960s, Freud: The Mind of the Moralist by Philip Rieff was an important book.

The original paperback edition. I have added the
red asterisk for a reason discussed below.

Freudian ideas were still influential back then, not just in clinical psychology but more generally in liberal intellectual and academic circles. University bookstore shelves were stacked with required books like Eros and Civilization (Marcuse), Love’s Body (N.O. Brown), Childhood and Society (Erikson), heavily steeped in Freud, along with Civilization and Its Discontents.

Now, an article by Len Gutkin in the latest Chronicle questions the authorship of Freud: The Mind of the Moralist. The subhead asks “Did Susan Sontag’s husband steal credit for her first book?” The husband in question is Philip Rieff. They met when Rieff was teaching at the University of Chicago. She was seventeen, an undergraduate. He was 28. They married ten days later. The marriage lasted eight years.

Sontag as the author of the book is not a new idea. I’d first heard this rumor in 1966 when I was a graduate student at Penn, where Rieff taught the required course on theory. Most of us were willing to accept the rumor. As Benjamin Moser, whose recent book on Sontag is the source for the information in the Chronicle piece, says (here):

In his department at Penn, colleagues and students who saw past the presumptuous veneer that overlaid his interactions with them came away with the impression that there was something unearned about his eminence. The slum kid who dressed like a British grandee had something of the scam artist about him.

Moser got it right. “Presumptuous veneer . . . Dressed like a British grandee” and with an undertaker’s lack of color — charcoal gray or black suits, double-breasted or with a vest, shirt always white, necktie solid, striped, or patterned but always gray. As one of my professors at Brandeis said (Rieff had been on the faculty there), “all so that nobody would think he was Rieff the butcher’s son from Chicago.”

And then there was the comb-over. A broad ribbon of hairs carefully drawn across the front of his forehead, from one side to the other, never quite covering the baldness just behind them.

He told us that he did not want to be the students’ “friend” — he said the word as though he were holding a worm at arm’s length — not that there was any chance of that. His lectures were uninterrupted monologues with many names dropped in — Saint-Simon, Le Maistre, Aristotle, and on and on — to show his erudition and our lack of it. Sometimes I would keep a list, writing down each name as Rieff dropped it, just to keep my mind from wandering.*

Most of the lectures were talking versions of parts of the book he was working on, The Triumph of the Therapeutic, which Gutkin calls “a dyspeptic polemic against modernity in the guise of a study of post-Freudian psychoanalytic theory.” Rieff seemed to think that his ideas were original and brilliant. The thing is that on those occasions when he would talk in depth about a specific book or social theorist — no name dropping, none of his own pet terms or coinages — he was actually good. I transferred after my first year.

So did Sontag write the book? The Chronicle headline seems like another example of Betteridge’s Law, which says (I’m amending it slightly) that when an article headline is in the form of a question, the author wants you to think that the answer is Yes, but the more accurate answer is No.

But in this case, the author seems ambivalent, and the correct answer is mostly Yes. My impression is that Rieff had accumulated notes and fragments over the years, including the years before he met Sontag, but it was Sontag, still in her early twenties, who organized the material, added her own thoughts and sources that Rieff had not considered, and did the actual writing. Moser suggests that Sontag, in the acrimonious divorce negotiations, gave up any claims to authorship in return for Rieff giving up any custody claims on their son.

Freud: The Mind of the Moralist was the basis for Rieff’s career. A year or two after it was published, he was offered a position at Penn, where he stayed till he retired. The Times obit refers to the title as “paradoxical” because Freud’s ideas “had a corrosive effect on Western morality and culture.” The other paradox — or is it irony? — is that a man so apparently concerned with morality and its corrosion would put his name on a book written by someone else.

---------
* The Times obit had a slightly different take on Rieff’s lectures: “Dr. Rieff often dazzled and occasionally puzzled students with multilayered but always authoritative lectures that blended philosophy, theology, economics, history, literature, psychology and dashes of poetry and Plato like ingredients in a sociological mulligatawny.”

Art Blakey Centennial

October 11, 2019
Posted by Jay Livingston

Art Blakey, the great jazz drummer, was born one hundred years ago today in Pittsburgh.

There are only two drummers who I could identify in a blindfold test. Art Blakey is one of them. The other is Max Roach, who said of Blakey:

Art was an original. He’s the only drummer whose time I recognize immediately. And his signature style was amazing; we used to call him ‘Thunder.’ When I first met him on 52d Street in 1944, he already had the polyrhythmic thing down. Art was perhaps the best at maintaining independence with all four limbs. He was doing it before anybody was. And he was a great man, which influenced everybody around him. [quoted in the Times obituary, October 1990]

He kept the Jazz Messengers going for thirty-five years. He would find talented young players who would, in a couple of years, become famous (well, jazz-famous) and go off on their own (Wynton Marsalis joined the group when he was seventeen). Blakey would then replace them with new talent, and the cycle would repeat.

His best-known album is probably “Moanin’,” released in 1959, an incredibly rich year for jazz. (See the daily entries at The 1959 Project.) The video below begins with one of the tunes from that album, not the best-known — that distinction goes to the title tune by pianist Bobby Timmons — but “Along Came Betty” by the sax player Benny Golson, who wrote many other tunes for the Messengers and basically functioned as the group’s musical director. The video is from 1988 with a completely different cast, except for Blakey.

As the tune ends (at about 7:00), Blakey takes a one-minute drum solo followed by “I Get a Kick Out of You” in the rhythmically complicated Clifford Brown - Max Roach arrangement from 1954 with a minute and a half of pure Blakey at the end.

Health and Self-Denial — The (Coastal) American Ideology

October 9, 2019
Posted by Jay Livingston

As an undergrad, I took Deviance with Irv Zola, a wonderful man whose main research area was medical sociology. The two topics were related, he said. In his Medical Sociology course, he asked the students to keep health journals where they would make note of any health-related matters in their own lives. What he found was that students often framed their health in terms of morality. They got sick because they had done something wrong or had failed to do what was right.

I was reminded of this when I read this passage from Barbara Ehrenreich’s essay “Why I’m Giving Up on Preventative Care” (here).*

Most of my educated, middle-class friends . . . undertook exercise or yoga regimens; they filled their calendars with upcoming medical tests and exams; they boasted about their “good” and “bad” cholesterol counts, their heart rates and blood pressure. Mostly they understood the task of aging to be self-denial, especially in the realm of diet.

In matters of health, and especially food, we are puritanical moralists. If we stick to our vows of health-chastity, if we steadfastly resist temptation, we will be rewarded with eternal life, or at least very long life.

But who is “we”? Ehrenreich seems to think that it’s the people Joseph Henrich in 2009 (here) labeled as WEIRD — Western, Educated, Industrialized, Rich, Democratic.

In the health-conscious mind-set that has prevailed among the world’s affluent people for about four decades now, health is indistinguishable from virtue, tasty foods are “sinfully delicious,” while healthful foods may taste good enough to be advertised as “guilt-free.” Those seeking to compensate for a lapse undertake punitive measures like fasts, purges, or diets composed of different juices carefully sequenced throughout the day.

Even a quick glance around the country will tell you that in wide swaths of the geographical and social territory, this abstemious ethos has not taken root. All You Can Eat. At Applebee’s (and lots of other places), when it comes to fatty foods, gluttony is a virtue.


In other WEIRD cultures, even the cosmopolitan elite may not conflate pleasure and sin. Foods which in the US are “sinfully delicious” may be merely delicious elsewhere. France, for instance. In a 2013 post (here) on “Guilty Pleasures,” I compared the pastry scene in the Judd Apatow film “This is 40” with a similar scene in the French film “Cousin Cousine.”

In both films, the overload of desserts is a guilty pleasure, but in the French movie the emphasis is almost entirely on the pleasure, while the American film focuses on the guilt. The French lovers slowly feed each other one dessert after another; the scene is almost erotic. But Pete and Debbie [in the American film] seem like children, giggling and trying to eat as much as they can before they get caught. Both scenes mingle sex and pastry, but in the French movie the common theme is sensuality; “This Is 40” plays both for laughs.

Unfortunately, I cannot find even a still shot from “Cousin Cousine,” but here is the scene from “This Is 40.”


The whole film in fact is an exposition of the mindset that Ehrenreich identifies. No sugar, no gluten, a personal trainer, less screen time, salads without dressing, tofu. In scene after scene the film shows how difficult it is to keep to this regime. That’s the basis for most of its humor. But neither the characters nor the film itself can abandon the notion that self-denial is the ideal.

------------------
* Ehrenreich’s essay appeared at Literary Hub in April 2018, but I just found it yesterday, probably via a Twitter link. I cannot remember what the tweet was about, nor do I have any idea why the essay appeared at LitHub, a Website devoted mostly to fiction, poetry, and literary criticism.

$350K — Still Just Enough For the City

October 2, 2019
Posted by Jay Livingston

MarketWatch is taking some flak on Twitter and elsewhere for this story:


Here’s what should have been the pull-quote:
The thing is, that kind of income, while relatively huge, is barely enough, according to Dogen, for a family to lead a comfortable life in coastal counties — where almost half of the nation’s population calls home.
One reader of this blog reminded me that I’d posted something about this nine years ago, complete with a parody verse based on Stevie Wonder’s “Living for the City.” That 2010 post was occasioned by a Chicago law professor’s complaint that he could barely get by on his current income, which was probably a bit more than $350,000. (The original post is here.)

Coming In In the Middle

October 2, 2019
Posted by Jay Livingston

In the previous post, I said that up until the 1950s, it wasn’t uncommon that moviegoers would come into the theater halfway through the film. After The End, they could stay in their seats, wait for the movie to start again — after the previews, newsreel, and cartoon — and, when the film reached the part they’d already seen, leave.

It’s hard to imagine now, when everyone is in their seat by the time the feature starts. (A very few people may be late, but only by a couple of minutes.) The only historical evidence I could offer was Roger Angell’s memoir Let Me Finish. As a twelve-year-old, Angell would go to the movie theater right after school, and it was rare that the movie showtimes coincided with school dismissal.

There’s also this: Danny Kaye’s big breakthrough came in his first film, “Up in Arms,” in 1944. His tour de force in that movie became known as “The Lobby Number.” Kaye and friends are in the lobby of a large movie theater, and he tries to dissuade them from going in to see the musical they’d come for. These musicals are all alike, he says, and launches a parody of the genre, starting with the credits and the MGM lion’s roar. It’s Kaye at his manic best. And after about five minutes, as he is singing an up-tempo song, he stops suddenly and says calmly,
So here we are, back in Fresno, California.
And this is where you came in.
But do not fret my friend.                                                           
[singing] This is a picture that ends in the middle
For the benefit of the people who came in in the middle.
This, this is the end.
You can hear the whole thing. Or just push the slider to 5:10.*


If you can base the final joke on it, then people walking into the theater when the film is halfway through must have been, as we now say, “a thing.”

--------------------
*There’s a YouTube clip (here) from the movie itself, and it gives you a better sense of the context for The Lobby Number. Unfortunately, the clip ends before the final line.

This Is Where We Came In

October 1, 2019
Posted by Jay Livingston

Here’s a bit of cultural history — movie history — that you’re probably not aware of, even if you’re a cineaste (unless, perhaps, you’re a cineaste who’s eligible to collect Social Security). It’s about what a movie is, or more accurately, it’s about the place movies occupy in our social and cultural lives.

When you go to the movies, it’s annoying when people arrive after the film has started. They crabwalk to their seats as everyone else in that row knee-twists to one side to let them by without spilling the popcorn. Even if you’re not in that row, your focus is unavoidably drawn away from what’s happening on the screen and toward the latecomers. It doesn’t happen often, what with the twenty minutes of trailers, and besides, nobody wants to miss any of the film.

But it hasn’t always been that way. It wasn’t until the mid- to late-1950s (just a guess, I have no actual historical data) that things began to change. Before that, it was not unusual for moviegoers to arrive well into the picture. In his memoir Let Me Finish, Roger Angell, who was born in 1920 and grew up in New York City, writes that when he was twelve or so, he started regularly ditching his after-school recreation program and sneaking off to the movies.

Mostly, I would turn up at the Orpheum or the 86th Street Garden while the second feature was in progress . . . Walking into the middle of movies was the common American thing during the double-feature era, and if one stayed the course, only minimal mental splicing was required to reconnect the characters and the plot of the initial feature when it rolled around again. The absence of the double bill has done away with this knack and has also expunged “I think this is where we came in” from the language — a better phrase, all in all, than “déjà vu,” and easier to pronounce.

I had forgotten. But reading that paragraph opened a childhood memory — not detailed and Proustian, just a moment in the dark theater with my family, hearing my mother or father whisper, “This is where we came in, isn’t it?” and all of us getting up and making our way out.

Why did things change? Angell blames it on the demise of the double-feature, and he implies that latecoming occurred mostly in the B-movie,* usually an uninspired genre picture. But I would guess that the norm of tolerating latecoming spanned the entire program, even when only one picture was on the bill (though that bill also included one or two previews, a newsreel, and a cartoon).

Here’s another guess: Television, or rather the absence of television. Today, movies are special. They have had to keep one step ahead of TV. When TV was black-and-white, movies had color. Even today, movies have sex, violence, and language not allowed on broadcast TV. And even cable can’t produce the sound and screen size of the movies or, until very recently, the special effects and high-priced actors.

Before the mid-1950s, movies occupied some of the space now taken by television — everyday, ordinary entertainment. Today, in our homes we might turn on the TV to “see what’s on television,” not to see a particular show at a particular time. If it’s ten past the hour and we turn on the TV mid-program, that’s OK. (This was even more the case in the years before on-demand and the DVR.)

The movies were like that in the pre-television decades. People were less picky about what they saw. They often went “to the movies” rather than to a particular movie, especially if there was only one theater nearby.  And if they didn’t get to the theater exactly on time, that was OK.

It’s not that television allowed movies to become Art rather than Entertainment. Most people at the tenplex today aren’t thinking of what they’re seeing in terms of artistic categories. But even if movies are still entertainment, they fit into people’s lives in a way that’s different from that of the 1930s and 40s. And different as well are the norms of going to the movies.

---------------------

* I once asked my students if they’d seen or heard of Jerry Seinfeld’s “Bee Movie.” Several people raised their hands. Then I asked them if they understood the double meaning in the title. Nobody raised a hand.

You Read It Here First

September 26, 2019
Posted by Jay Livingston

Today, the Facebook group Nerds With Vaginas posted this:


(Note the number of Likes, Comments, and Shares the post had already gotten in the first five hours.)

Four years ago, in a blogpost about swear words (here), I cited the work of Jack Grieve, who had been using Twitter data to discover regional and historical variations. Here is the final paragraph of that post.

You can find maps for all your favorite words at Grieve’s Website (here), where you can also find out what words are trending (as we now say) on Twitter. (“Unbothered” is spreading from the South, and “fuckboy” is rising). Other words are on the way down (untrending?). If you’re holding “YOLO” futures, sell them now before it’s too late. [Emphasis added.]

Oldsmobile, or Why I Am Not a Genius

September 26, 2019
Posted by Jay Livingston

A Montclair State professor, Jeffrey Alan Miller, has been awarded a Genius Grant, also known as the MacArthur Fellowship. Four years ago, he discovered the earliest known draft of the King James Bible.


The Times (here) reported the story at the time.

Professor Miller discovered the manuscript last fall, when he was in the archives at Sidney Sussex College in Cambridge . . . He came across an unassuming notebook about the size of a modern paperback, wrapped in a stained piece of waste vellum and filled with some 70 pages of Ward’s nearly indecipherable handwriting.

As Professor Miller tried to puzzle out which passages of the Bible it concerned, he realized what it was: a draft of parts of the King James Version of the Apocrypha, a disputed section of the Bible that is left out of many editions, particularly in the United States.

The true scholar who learns of Prof. Miller’s discovery will immediately think of its implications not just for the history of the most widely read book in English literature but also for the history of the English language itself, the history of England, and the Anglican church.

My reaction, alas, was different. My first thoughts — and still my only thoughts — turned to Woody Allen’s 1974 essay on “The Scrolls.”

Scholars will recall that several years ago a shepherd, wandering in the Gulf of Aqaba, stumbled upon a cave containing several large clay jars and also two tickets to the ice show. Inside the jars were discovered six parchment scrolls with ancient incomprehensible writing . . . .

The authenticity of the scrolls is currently in great doubt, particularly since the word Oldsmobile appears several times in the text.

Shameless

September 24, 2019
Posted by Jay Livingston

No, not the Showtime series. The president.

In a “news analysis” piece in the Times today (here), Peter Baker, who has been reporting on the Ukraine story, says:

Even for a leader who has audaciously disregarded many of the boundaries that restrained his predecessors, President Trump’s appeal to a foreign power for dirt on former Vice President Joseph R. Biden Jr. is an astonishing breach of the norms governing the American presidency. [emphasis added]

In a series of posts two years ago, I explained why I was not a big fan of the “breaching” assignments that many instructors use in the unit on norms in Sociology 101. Important lessons can be learned from these assignments, I said, but to learn them, we have to shift our focus away from the reactions of others, which is what the assignment usually calls for, and onto the reactions of the norm violator.

Lesson #2: When we think about breaking a norm, our anticipatory anxiety is highly exaggerated and not rational. When you ask people why they can’t, just can’t, break the norm, they imagine consequences far out of proportion to what might happen. When Stanley Milgram told his students to go into the NYC subway and ask people for their seats, one student said, “You want to get us killed?” When I’ve asked students about doing the breaching experiment, they imagine offended strangers raining mayhem upon them. But even as they say it, they know that it’s preposterous. Which leads to . . .

 Lesson #3: We follow the norms not out of some rational cost-benefit calculation. We follow them because we have internalized them. Society is not just “out there”; it’s “in here,” inside us, as well. [The entire post is here.]

Those internalized norms are what create the feeling of shame, the feeling that comes from knowing that other people around us strongly disapprove. Without that sense of shame, our only consideration would be the rational cost-benefit calculation. To the shameless, the disapproval of others matters only if it can be transformed into some sanction with real consequences. Most of the time, it can’t.

Years ago, I went into one of those narrow news stores, the kind that sell newspapers, magazines, lottery tickets, and cigarettes. A man was standing there paging through a skin magazine. (This was way before the Internet, before you could get free porn by just tapping your phone.) “Hey, fella,” the man behind the counter said, “you want to buy the magazine?” The reader ignored him. Maybe he even put down the Playboy and picked up a Penthouse. “Hey, this ain’t a reading room. Buy it or get out.” The man went on reading for another minute or two despite the repeated demands from the man behind the counter.

I was amazed at his brazenness. On the shame spectrum, he was at the opposite pole from Woody Allen in this scene from “Play It Again, Sam.” (That film was made in 1972. The final line in that scene, given what we now know, sounds very different today.)

Our president has demonstrated just how flimsy our norms are. The Times article quotes Richard Ben-Veniste, a former Watergate prosecutor, referring to Trump’s “profound disregard for presidential norms.” But this disregard has brought no meaningful sanctions. Of course, sanctions are less likely to be imposed on norm violators who have some power. As Trump said in connection with his disregard for other, non-presidential norms, “When you’re a star . . . you can do anything. Grab ’em by the pussy. You can do anything.” If you’re a star and, he forgot to add, if you are shameless.

As Gwenda Blair, a biographer of the Trump family, put it: “What he’s learned is you can get away with just about anything if you’re willing to gamble and you have zero shame.”

John Coltrane, b. Sept. 23, 1926

September 23, 2019
Posted by Jay Livingston

My first year at college, I shared a tiny room with a quiet, somewhat strange guy from Denver whose choice in records seemed to be based on how impressive the music was as audio. (“High fidelity” records and equipment were still relatively new in those days.) The Soviet Army chorus, E. Power Biggs playing some world-famous organ, an “1812 Overture” with lots of cannons, that sort of thing.

But he also had a copy of Soultrane. I had plenty of Coltrane on my Miles Davis records, including Milestones, which it turns out was recorded only three days before Soultrane. But this was the first Coltrane-as-leader record I’d encountered. I listened to it over and over.

This Coltrane birthday post should probably feature “Giant Steps,” or “My Favorite Things,” or “A Love Supreme” — recordings that clearly mark him as perhaps the most important jazz figure of the 1950s and 60s. But I’m going with side one, track one of Soultrane, Tadd Dameron’s tune “Good Bait.”



Blog-mitzvah

September 18, 2019
Posted by Jay Livingston

This blog is turning thirteen this month. It was originally supposed to be a group blog. That didn’t quite work out.

At our first department meeting in September 2006, we were thinking of ways to get more majors and to keep the ones we had. “How about a department blog?” I said. “We can post about things we see in our everyday lives but that we can relate to sociological ideas. That way, students will see that we’re just people with ordinary lives, and maybe they’ll see how sociological ideas can be useful.”

Everyone thought this was a good idea. So I set up the blog with posting privileges for all department members. After eleven or twelve years, I began to notice that with maybe three or four exceptions, all the posts were by me. So I changed the settings to make it my blog, though I kept the Montclair name just for the sake of continuity. I didn’t even change the name now that my connection to Montclair has become the thin thread of emeritus status.
          
                    *                   *                *               *

In the past year, the post that has gotten the most attention is one from the previous year — the one about language anachronisms in “The Marvelous Mrs. Maisel” (here). In his Times review of the show last December, James Poniewozik mentioned these anachronisms in passing but with a link to my post. That brought a slew of visits, and the hits just keep on coming. A week or so ago, the page views for that post passed the 12,000 mark, an extraordinary number for this blog.

Besides that, here are some posts from the past year that I thought were worth revisiting.

1. Two posts about “Nostalgia and the Myth of Social Decline” (here and here) got me twenty minutes of air time on the Sociology Annex Podcast. That was fun, though in retrospect I quickly realized that I could have stated my ideas much more clearly.

2. Along similar lines, “The Past Is Never Uncertain” looks at the idea that things today are more “uncertain” than things in the past. But the past is more certain only because now we know what happened.

3. People had different reactions to Brett Kavanaugh at his confirmation hearings. But all of these reactions, for and against, seemed to share the same assumptions about “character” and about what a person is. This post (“A Different Person”) tries to show the limitations of those assumptions.

4. Aside from Mrs. Maisel, the post that got more views than any other in the past year was “Suicide and Well-Being. SOC 101, Week 1.” Were Soc 101 instructors assigning it?

It used current data but the same Durkheimian idea (and one of the same jokes) I’ve been using since I started teaching this stuff. The main point: rates are a property of a group or society, not of individuals. Variables that explain individual cases (happiness, well-being) don’t seem to work so well at explaining rates.



Do the Poor Suffer From Elite Ideas?

September 14, 2019
Posted by Jay Livingston

People in the lower class and working class are more likely to do things that violate middle-class standards. They drop out of school, have children out of wedlock, take drugs, don’t have a job, and commit crimes, all at higher rates than their middle-class counterparts. Traditional conservative explanations for these shortcomings focus on the individual. These people fail to live middle-class lives because they lack virtue.

In modern times, conservatives have pinned that lack of virtue on the policies of liberals —  policies like not punishing criminals severely enough, not punishing idleness, giving poor unwed mothers assistance for themselves and their children, and other programs that encourage the irresponsibility of the undeserving poor. 

Starting a half-century or so ago, conservatives began to indict liberals not just for their social policies but for their ideas about things like happiness and freedom. James Q. Wilson, for example, attributed the 1960s increase in crime in part to the ideology of self-expression and “do your own thing.” “This attitude of radical self-indulgence had affected a significant fraction of the population, and this weakened the ordinary social constraints that were operating on people.”

Of course, the people who were tuning in to these messages of self-indulgence (or, as they might have styled it, “self-actualization”) were largely young, White, and middle-class. Wilson never traced the paths of this diffusion of ideas. He just left us to assume that muggers, rioters, and welfare mothers in the cities had come together with the Whiter, less urban Woodstock generation, and they were all listening to Tim Leary, reading Fritz Perls or Abe Maslow, and putting those ideas into practice. Those practices — self-actualization among the middle class, crime among the poor — might have looked very different on the surface, but in Wilson’s view they were all based on the same ideas.

A recent version of this theory — that the poor and uneducated have absorbed the ideas of affluent liberals and are worse off because of that — comes to us from Rob Henderson (here) in his catch-phrase “luxury beliefs” —  “ideas and opinions that confer status on the rich at very little cost, while taking a toll on the lower class.” He even claims that these beliefs explain the increase in economic inequality. “These beliefs . . . produce real, tangible consequences for disadvantaged people, further widening the divide.”

Take, for example, ideas about the causes of success.

Then there’s the luxury belief that individual decisions don’t matter much compared to random social forces, including luck. This belief is more common among many of my peers at Yale and Cambridge than the kids I grew up with in foster care or the women and men I served with in the military. The key message is that the outcomes of your life are beyond your control. This idea works to the benefit of the upper class and harms ordinary people. [emphasis added]


As I said in a previous post, most of Henderson’s assertions are hard to test against actual data. But for the last 45 years, the GSS has in fact asked people about the importance of luck.
GETAHEAD: Some people say that people get ahead by their own hard work; others say that lucky breaks or help from other people are more important. Which do you think is most important?

I have broken the sample down into three educational categories: those who finished college, those who never finished high school, and those in between (a high school degree and possibly some college).  If Henderson is right, we should see a steady upward trend in the percent who say that Luck is important. The trend should begin among the most educated. If their ideas are filtering down through the class system, the less educated should also be trending upward but with a lag time of a few years.
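
For readers who want to poke at the numbers themselves, here is a minimal sketch of how that breakdown might be computed in Python with pandas. This is not the code behind the graph below; the file name (gss_extract.csv), the column names, the GETAHEAD response coding, and the education cut points are all assumptions that would need to be matched to your own GSS extract.

import pandas as pd

# Hypothetical extract of the GSS with columns "year", "getahead",
# and "educ" (years of schooling); the file and codings are assumptions.
gss = pd.read_csv("gss_extract.csv")

# Three educational categories: no high school diploma, high school /
# some college, college degree or more (cut points are an assumption).
gss["ed_group"] = pd.cut(
    gss["educ"],
    bins=[-1, 11, 15, 20],
    labels=["< high school", "HS / some college", "college grad"],
)

# Count a respondent as citing Luck if they chose the "luck or help"
# response to GETAHEAD (the label used here is an assumption).
gss["cites_luck"] = (gss["getahead"] == "luck or help").astype(int)

# Percent citing Luck, by survey year and education group.
trend = (
    gss.groupby(["year", "ed_group"], observed=True)["cites_luck"]
       .mean()
       .mul(100)
       .unstack("ed_group")
)
print(trend.round(1))

Plotting each column of that table against the survey year gives the kind of trend lines shown in the graph below.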

[Graph: percent saying luck or help is most important for getting ahead, by survey year and education level.]

Henderson does not specify the time period for the trends he’s talking about, but if he’s thinking about very recent history, the graph offers some support. Among those with a college degree, the percent citing Luck rose in the two most recent iterations of the GSS — from about 8% in 2012 to 17% in 2018. Is that a trend? I don’t know. Prior to 2016, the percent fluctuates in no discernible pattern.

More relevant for Henderson’s claims, this fashion for Luck among the educated has no apparent effect on those with less education. Since the mid-80s, among those who never finished high school, the belief that success depends mostly on luck does not follow the fluctuations of the college educated; instead it trends slightly downward.

It does not look as though the less educated are adopting the ideas of those who finished college. More tellingly, the GSS data also raises the question of whether beliefs about luck affect behavior. Henderson says that the well educated (“my peers at Yale and Cambridge”) are more likely to believe in the importance of luck and “random social forces.” Yet they behave in a contradictory way. They work hard. Henderson also seems to be implying that the less educated do not work so hard. That’s why they’re poor. Unlike the Yalies, they are acting on their belief about luck and winding up worse off for it.

But what the graph shows is that these ideas have not changed much.  If anything, the dropouts believe in luck less now than in the past. And yet, their incomes have left them farther and farther from the well-educated. Maybe economic inequality has less to do with virtuous ideas and more to do with the economy.