Personal Needs and Public Morality

May 6, 2021
Posted by Jay Livingston
    

Victoria Eng updated the page [a GoFundMe for the victim] on Wednesday to say that her grandmother is recovering well after surgery.

“These Asian hate crimes need to stop,” she wrote. “San Francisco is my home and my Grandma’s home. We need to feel safe where we live and not in constant fear.”

Over the last sixty years, we have largely abandoned the language of moral imperatives for the language of psychological well-being. We don’t say what should or must happen or what people ought to do. Instead, we talk about what is needed.

I have noted this change before (here and here, for example). But I couldn’t pass up the above quote in yesterday’s NPR news story.  It so perfectly uses both the new and the old sense of need.

The second need — “We need to feel safe. . .” — would have been as unremarkable in the 1960s as it is today. It’s about needs, specifically the needs of Asians.

But “These Asian hate crimes need to stop” is about morality. It says what should or must happen. But we no longer use words like should or must. We don’t tell people what they ought to do. “You ought to stop drinking so much.” That would be imposing an external morality, and morality is always about what’s good for others and for the society as a whole, not the individual. Instead we phrase it in terms of the person’s own needs; we tell them what is in their own interest and will benefit them. “You need to stop drinking so much.”

This use of need has expanded to the point that we now talk about the needs of Asian hate crimes.

Of course, the meaning of the sentence is clear. It’s Asians or the society as a whole that needs for these hate crimes to stop. But because of the change in language, we now phrase it in a way that syntactically makes no sense.

Ron Carter, b. May 4, 1937

May 4, 2021
Posted by Jay Livingston

Ron Carter’s Downbeat Blindfold Test was the best I’ve ever read. In most of these, the musician tries to guess the identity of the performer, adds some evaluative comments or personal recollections, and then gives the track a rating of one to five stars. I don’t recall how accurate Carter was in identifying the musicians. But instead of focusing on who the musicians were, he told you what they were doing. Not the soloist so much as the rhythm section, the part of the performance that most people’s ears are not going to pick up.

It was the only Blindfold Test where you actually learned something about the music, and I told him as much when I happened to see him in Fairway one morning a few weeks later. (I think this was about fifteen years ago.)

In an interview posted yesterday, he does something similar. Carter was the bassist in Miles’s second great quintet, the group of the mid-sixties. The rhythm section — Carter, Herbie Hancock, and Tony Williams — created something new in jazz, a sound very different from that of the late-1950s quintet with Paul Chambers, Red Garland, and Philly Joe Jones. They were kids then. Ron and Herbie were in their twenties, Tony Williams was barely twenty. Miles was nearly forty.

The interviewer asked whether the rhythm section ever talked specifically — with one another or with Miles — about what they were doing. The answer is, not very much; they just listened to one another and learned.

But Carter’s anecdote goes beyond that generality to focus on a specific and non-intuitive note (B natural) that he played in “Autumn Leaves.”



It goes without saying that Carter is one of the greatest bass players of all time. At 84, he’s still going strong and eager to get back to work when the clubs and concert halls reopen.

Abbie Hoffman – A Personal Memory

April 25, 2021
Posted by Jay Livingston

I’ve just watched “The Trial of the Chicago Seven.” Sacha Baron Cohen’s Abbie Hoffman brought back memories of my own brief contact with Abbie. Maybe it’s getting into tl;dr territory, and what follows is certainly less sociology than it is, in Chris Uggen’s phrase, self-indulgery.

I met Abbie Hoffman before he became Abbie Hoffman, the Abbie Hoffman everyone knows, the Abbie Hoffman of “The Trial of the Chicago Seven.” It was two years before the Chicago convention, the summer of 1966. I was 22 and about to enter graduate school. Abbie was 29.

I had just finished college and was working in the Democratic primary campaign of an anti-war candidate for U.S. Senate in Massachusetts. His name was Thomas Boylston Adams, and he came by it honestly. I am not sure about the Boylston strand, but the Adams part went directly back to John and John Quincy. More important, he was the only candidate who opposed the war in Vietnam.

His opponents were Endicott Peabody, governor of the state, and John Collins, mayor of Boston. We knew we had no chance to win against establishment Democrats. We were in it to get out the anti-war message. But because the vehicle for that was an election, we had to do what you do in electoral politics.

Much of my work, as I recall, consisted of “canvassing” — handing out literature and trying to get people to sign petitions to get Adams on the ballot. We would canvass in Boston one day, Brockton the next, Fall River the next. One hot day, the campaign manager sent a group from the Boston office out to Worcester to canvass there. He told us that we’d get more information from the campaign coordinator for that part of the state — Abbie Hoffman.

Abbie met us, assigned us to different parts of the city, and told us a little bit about the neighborhoods we would be canvassing. Then we were on our own, and I didn’t see him again that day.

At the time, his biography was much like that of many of the other people in the campaign who had worked for liberal and left-wing causes. Yippies did not yet exist. It seems odd now to think of Abbie Hoffman as regional manager, directing conventional political work like canvassing for a candidate who looked, sounded, and acted every bit the Boston Brahmin.

I didn’t see Abbie again until late August, towards the end of the campaign, when the candidate invited everyone for a picnic at his summer home on the South Shore near Quincy. It was a modest, wooden house on several acres of land. Most of the Boston-area people went. A few of the workers from the regional offices also came in for the event. All told, we numbered no more than two dozen.

At some point in the afternoon, several of us went to play softball, and I found myself walking next to Abbie. The field for the game lay on the other side of a slight ridge. When we got to the top and looked out at the large open area below us, Abbie stretched out his arm and made a sweeping gesture. “Comes the revolution, my son,” he said in a fake Russian accent, “all of this will be yours.”

Except for the picnic, most of us rarely had a chance to speak with the candidate himself — an arrangement that was probably for the best. About the only thing most of us had in common with him was a general opposition to the Vietnam war. But while Adams’s views on Vietnam made him the most liberal person running for office in Massachusetts, he was several steps to the right of just about everybody in his own campaign, including, of course, Abbie. The campaign workers were an assortment of 1960s activists. Some were students from SDS. Others had been recruited from past electoral campaigns for other non-mainstream candidates. Some came out of the civil rights movement, having only a few years earlier worked on voter registration in the South with SNCC and CORE. Nobody on the staff had illusions about winning a Senate seat. Instead, people spoke of the campaign as “educational” (i.e., to educate the public about the war). Many of them, like Abbie, also saw it as a way to build a foundation for future political organizing, whether for local or national issues. That was their job. They were political organizers.

One other scene has stayed in my mind from that warm, August day. I do not remember how we all got back to Boston or how it was that Abbie and I were the only ones from the group taking the MTA back to Cambridge, but that is where the memory begins—near sunset at the end of a long day, me standing in an MTA car, talking with Abbie Hoffman.

We were both tired. The picnic had, in effect, marked the end of the campaign. The September primary was only a week or two away. The candidate, we all knew, would get only a handful of votes, and the student workers like me would go back to school. But I wondered about the “older” people.

“What do you think you’re going to do now?” I asked.

“I don’t know,” Abbie said, “I really don’t.” He talked about other political issues that needed people, but it all seemed vague, as though he were tired of it all.

Then he said, “A lot of people I know are getting jobs in the poverty program. It’s a steady job, and you can do pretty well.” Remember, this was the hopeful era of the Great Society, of OEO programs that needed workers and administrators. “I guess I can always get something there too.”

He paused, and for an instant the twinkle returned to his eyes. He shook his head slightly. “But I don’t think I could do that.”

Herbie Hancock, b. April 12, 1940

April 12, 2021
Posted by Jay Livingston

Herbie Hancock turns eighty-one today. I felt I had to post something, but what? Herbie has recorded so much in such a variety of genres, but this was my first — “Dolphin Dance” from the Maiden Voyage album, 1965. It’s innovative in the melody, chords, and structure. Herbie’s playing encompasses funk and the post-bebop “out” style. And while I usually don’t care much for the idea of “program” music, you can easily imagine standing on the shore, looking out at dolphins on a calm ocean.



The Wikipedia article quotes Herbie saying that when he was first getting into jazz in the 1950s, he learned a lot from the Hi-Los and their arrangements by Clare Fischer. That surprised me. The Hi-Los were four white guys, as is Fischer, and their sound has none of the funk or bluesy quality that Herbie has always had. But the arrangements are indeed interesting. (Here is their version of Jerome Kern’s “Yesterdays.”)

One summer in the early 1970s, I was hanging out at the tennis courts. One of the afternoon regulars there was a trumpet player. I asked him if he knew the tune. He did. I asked him if he could tell me the changes. He could. (The Real Book did not yet exist.) I managed to borrow a pencil and a scrap of paper, and he dictated the changes to me bar by bar, in the piano key, not the trumpet key. As I said, the changes are unusual, not easy to learn and commit to memory, and if you forget a chord, it’s hard to guess at. For years, when I went to play “Dolphin Dance,” I put that same scrap of paper up on the piano.

Coda: How could I have missed the obvious choice of tunes — “Eighty-One”? Ron Carter wrote it when he and Herbie were in Miles’s second great quintet. Herbie was 25, Carter was a few years older. Drummer Tony Williams could not drink legally in many states. I doubt that any of them were thinking of eighty-one as an age.

Herbie recorded it with that quintet on the E.S.P. album, 1965. (here)
And again in 1994 with the same quintet but with Wallace Roney replacing Miles. (here)


Memory and Identity

 April 2, 2021
Posted by Jay Livingston

Was it just a coincidence that this week both The New Yorker and This American Life included pieces on the same case of alleged child sexual abuse? Neither mentioned the recent HBO documentary series Allen v. Farrow, but the cases are similar. A parent in a custody battle is accused of inserting a finger into the vagina of a six- or seven-year-old girl. The accused parent suggests that the other parent has coached the child and implanted a false memory.

The point of the less famous case — it involves a woman named Nicole Kleumper — is that memory is fallible. Most of us don’t like to admit that. We think that if we remember something, then it must have happened. Oh, we might forget unimportant details, but the details that do stand out in our memory are  facts.

But that’s not how memory works. Psychologist Elizabeth Loftus has devoted a lifetime of research to revealing the unreliability and malleability of memory, especially when it involves eyewitness testimony in criminal cases. She was especially skeptical of “recovered memories” — memories of traumatic events that do not come to mind until long after the fact.

In 1997, psychologist David Corwin published a paper documenting what seemed like a clear case of recovered memory. In a custody battle, the father of six-year-old Nicole claimed that the mother had sexually abused her. There were videotapes of a psychologist interviewing Nicole. The father won custody, and the girl did not see her mother again. But ten years later, in speaking with Corwin, Nicole could not remember why she had become estranged from her mother. After her father died, she reunited with her mother and wondered if her father had gotten her to lie, for she had no memory of the alleged abuse.

But Corwin showed her the videotape — six-year-old Nicole saying that her mother was “rotten” and had put a finger in her vagina. For Nicole at age 17, the video triggered a sort of memory. “I remember it happening, that she hurt me. I was getting a bath, and I don’t remember anything specific until I felt that pain.”

Here, Corwin claimed, was a clear case of a memory that had been repressed and then recovered. Elizabeth Loftus was skeptical and set about debunking this case. She was dogged about it. Corwin had thrown a cloak of anonymity over Nicole, but Loftus sniffed out clues, eventually tracked down Nicole’s identity, and then set about casting doubt on the recovered memory. (Nicole once referred to herself as “a survivor of Elizabeth Loftus.”)

What’s amazing and admirable about Nicole Kleumper is that over time she took Loftus’s ideas seriously and in the end came to question her own memory. What had been the firm footing of memory was now soft. And she has come to accept this uncertainty.

I'm never going to know. I'm never going to know. And even after all these years, I think I still thought that at some point I would come to a solid decision, yes or no. And really, really, I'm never going to know. And that just has to be OK.

How many of us would do that? Not Dylan Farrow or her many supporters, including the filmmakers who did the HBO documentary. They all but admit that they wanted to create a one-sided case for the prosecution and had no interest in presenting Allen’s side.

Towards the end of the This American Life segment, Nicole Kleumper says something very perceptive. She recognizes that with some things, when remembered “facts” meld into uncertainty, it is not just a matter of our ability to remember; it can be a matter of who we are. The interviewer asks, “How disorienting was it to feel like you had the truth, and then you lost it?”

Disorienting is a good word, but I don't think it fully captures. It goes to my identity. It really goes to the heart of who I am, and who I thought I was, and who I think I am. The most important, the key memory on which I rebuilt and then rebuilt again my identity has now been called into question

For those who have built a public identity around the “fact” that Woody Allen sexually molested his adoptive daughter, uncertainty* would be intolerable.

-------------------------

* Unfortunately, the only person whose testimony that he did it would be convincing is Allen himself. Symmetrically, the only person who could give convincing testimony that Mia Farrow coached her daughter to make a false accusation is Mia herself.

For a thorough statement of skepticism about the HBO series, see Cathy Young at Quillette.


Trauma and Therapy-Speak

March 30, 2021
Posted by Jay Livingston

It’s nice to have your perceptions ratified so that you can stop asking yourself, “Is it just me that’s noticing this?”  Lately, it seemed that I was hearing more talk about trauma — and for some things that didn’t seem especially traumatic. Katy Waldman heard the same thing. “Around every corner, trauma, like the unwanted prize at the bottom of a cereal box. The trauma of puberty, of difference, of academia, of women’s clothing.” Women’s clothing? Oh well, Waldman is a staff writer at The New Yorker and presumably more plugged in to the zeitgeist than I am. That sentence is from her article “The Rise of Therapy-Speak” (here).

Google nGrams confirms our suspicions. Mentions of both trauma and harm rose starting about 1970.

But trauma’s market share increased.

The important difference is that while both trauma and harm injure a person, trauma implies long-lasting psychological damage.  
                                        
Waldman can’t decide whether therapy-speak is really a recent development. The title of the article (“The Rise of . . .”) implies that it is, and she says that “the language of mental health is burgeoning.” But she also quotes a psychologist who tells her that “the language of the therapist’s office has long flooded popular culture.” I agree. The specific words that are in fashion come and go — trauma is on the rise, inferiority complex and midlife crisis are relics of the past — but the process remains the same. So does the criticism. Waldman takes aim at “therapy-speak”; forty years ago the same target was “psychobabble.”

Psychotherapeutic discourse usually remains inside the gated city of the educated liberal elite. I imagine that on Fox News there’s about as much talk of “toxic” relationships or emotional “triggers” as there is of “mindfulness.” Those outside this world can find therapy-speak and its attendant world-view annoying. Waldman speaks of “irritation that therapy-speak occasionally provokes,”

the words suggest a sort of woke posturing, a theatrical deference to norms of kindness, and they also show how the language of suffering often finds its way into the mouths of those who suffer least.

Therapy-speakers are annoying partly because they are parading their self-absorption. As Lee Rainwater said a half-century ago, “the soul-searching of middle class adolescents and adults,”  when compared with the problems of the poor, “seems like a kind of conspicuous consumption of psychic riches.” Nobody likes a show-off.

In one important way, trauma talk is different from earlier therapy-speak. Among the people Waldman is writing about and their counterparts in earlier generations (those who suffer least), therapists, neuroses, depression, anxieties, etc. have long been part of the conversation. These are, after all, the people who went to Woody Allen films. The trauma frame shifts the focus to some external source. To some extent that has always been true of psychoanalytic ideas, with their emphasis on childhood experiences with parents. But calling it trauma puts it in the same bin as the post-traumatic stress disorder suffered by soldiers who have been in combat. Besides magnifying the harm of these more mundane forms of suffering, it also implies that the harm was done by others, whether by intent or inadvertently. Imagine if Philip Larkin had written, “They traumatize you, your mum and dad.”

-----------------
* I thought that “therapy-speak” might be Waldman’s own coinage. An Internet search turned up only one instance of this term, in a 2019 article at Slate.
                                                                                

The Filmmaker — Bertrand Tavernier (1941-2021)

March 25, 2021
Posted by Jay Livingston

“The Clockmaker” was Bertrand Tavernier’s first feature film. I saw it in 1978 when it came out, maybe because one of the two theaters where it opened was only a few steps from where I lived. What the film taught me — and I’m sure this was not Tavernier’s intent — was that so many movie tropes that I had assumed were universal aspects of film story-telling were merely American. But that’s what the movie does, mostly by avoiding those tropes or cliches. The dining table looks familiar — the plates and glasses and flatware — but the meal that’s served is very different.

Here’s the movie’s set-up. A young man, still in his teens, has disappeared from his job at a factory. So has his girlfriend, who also worked there. Somebody murdered the factory boss, an unpleasant man who hit on female workers. The police suspect the young man and are trying to track him down. The head police inspector brings in the boy’s father, tells him that the son has committed murder, and asks the father (Michel, a clockmaker) to help in the search.

You know how this will play out. The father will start an investigation of his own, but he will be constantly thwarted by the police, who continue to pursue their theory that the son is the killer. As the father gets closer to solving the case, the police will threaten to jail him on one or another pretext. In the end the father will find the real killer and expose the incompetence or corruption of the police. There may even be a final gunfight where the father has to dodge bullets from both the bad guy and the police before finally outwitting everyone and killing the bad guy.

None of that happens because this is not an American film. It’s “L’Horloger,” based on a Simenon novel. In an American film the hero would focus almost entirely on solving external, practical problems — outwitting the killer and the police. But in “The Clockmaker,” there’s no mystery to solve. The son killed his boss. Instead, the film shows Michel coming to terms with that reality and coming to a better understanding of his son as, over the course of the film, the son is found in the North, brought back to Lyon for trial, convicted, and sentenced to twenty years. The film is also about the relationship that develops between Michel and the police inspector, who also comes to a better understanding of both Michel and the son.

The film differed from American films in other ways that I came to see were typical. First, the protagonist is not physically attractive. Michel (Philippe Noiret) is pudgy, with thinning hair and a weak chin. Nor is he physically active. This is not Liam Neeson pursuing his daughter’s kidnappers.

Second, in American films, children are superior to parents. They are more capable, more competent, and more moral. Even when the older character (an actual parent or a parent-like figure) is a good guy, he must be saved from his own incompetence by the younger person. In French films, by contrast, it is the parents who must suffer and deal with the missteps of their children. The parent-child, older-younger pattern also appears as more powerful vs. less powerful — in this case, police and civilian. In American films, the character we admire is rarely an agent of the government.

Third, in both French and American films, larger forces — “society” or the government — may be unfair. American films are about the protagonist’s struggle against injustice, a struggle that is usually successful, if not entirely then at least in some small personal way. French films are more likely to follow the protagonist’s inner struggle in coming to understand the reality of those larger forces even if they cannot be changed.

I have seen other Tavernier films, notably “Round Midnight,” but the one that has stayed with me is “The Clockmaker.”*

--------------------------
* A trailer, without subtitles, is here.


Could Anything Ever Outweigh Gun Rights? Let’s Ask Megan McArdle

March 20, 2021
Posted by Jay Livingston

“He bought his gun legally, so there’s nothing that could have been done to stop it.” Yes, you do hear this argument posed against the obvious truth that if the Atlanta killer hadn’t been able to get a gun, he could not have committed these murders.

The response of course is that his purchase was legal because the laws are so lax. In other places with other laws, that purchase would have been illegal.

The defense of the current law is the Second Amendment, which the gunslingers interpret as absolute (except of course for that pesky preamble about a well-regulated militia). The carnage, to use the Steve Bannon - Donald Trump turn of phrase, is the price we pay for our liberty and freedom.

Here is what Megan McArdle, a thoughtful and reality-based conservative, said in a podcast discussion a day or two after the Atlanta shootings:

As with any other civil liberty, curtailing [Second Amendment rights] has costs as well as benefits, and those have to be weighed. I think that I would place a different weight on the liberty than [gun-restricting liberals] would.
The killer walked into a gun store and walked out a few minutes later with a 9 mm gun. So McArdle is speaking for those who weigh the killer’s convenience in buying a very deadly weapon against the lives of his victims, and her scale tips in favor of the killer.

So here’s the question for her and all those others who talk about “the price we pay for our liberty.” Is there any “price” that’s high enough to warrant restricting guns?

  • the eight killed in Atlanta — pennies
  • the 20 children slaughtered in Sandy Hook — what a shame, still a bargain
  • the 49 killed in the Pulse nightclub massacre — cheap
  • the 60 dead, 400 wounded, and another 400 injured in the panic in the Las Vegas shooting — still a small price to pay for a big liberty.

These are just the mass shootings, the headline grabbers. They are far outnumbered by shootings with only one or two victims, shootings often done with guns that were bought illegally. Our gun laws, such as they are, make buying those guns about as easy as it was to buy marijuana back when that was illegal.

The NRA answer is obviously that no “price” — no number of bullet-ridden bodies — outweighs the right of anybody to buy any gun. But what about less doctrinaire conservatives like McArdle? She usually takes an economist-like approach, weighing costs and benefits. So is there any price she would find too high? If so, what is it?


When “Legends” Fail

March 1, 2021
Posted by Jay Livingston

There’s a difference between liberals and conservatives, not just in their policy preferences or their views of Trump. They also differ in how they react to bad stuff in their own camp. Liberals are far more willing to recognize these inconvenient truths and to do something about their flawed leaders. Conservatives rally to the defense.

But you wouldn’t know that from reading Ross Douthat. It’s hard for conservatives like Douthat — thoughtful, principled, horrified by Trump — to say something good about Republicans these days. So instead, he goes after liberals. In his column yesterday, “The Twilight of the Anti-Trump Idols,” Douthat is at pains to show the folly and error of liberals for making heroes of Andrew Cuomo and the Lincoln Project. Both of these parties turned out to have feet of foul-smelling clay.

. . . in the substitution of figures who ended up exposed as corrupt or just incompetent, we can see once again the importance of thinking about how we got Trump in the first place. Our society’s sickness may be particularly acute in Trump worship, but the affliction is more general. The stink of failure hangs over the liberal and cosmopolitan as well as the populist and provincial.

See, Douthat says, liberals are just like conservatives. They share the same moral failings; they both suffer from the same “general affliction.”

Well, no. Of the “legends” Douthat mentions, only one is an actual Democrat — Cuomo. The other legends include the Lincoln Project, a group of anti-Trump Republicans; Mitt Romney, also a Republican who occasionally opposes Trump; and “Europeans,” whose countries seemed to be doing better on controlling Covid-19.

Republicans in government and media have stood by Trump and his administration through all the lies, corruption, impropriety, cruelty, and incompetence. The strongest criticisms from official Republican organizations and Fox News have been aimed at those who dared criticize Trump. They are even hard pressed to find anything bad to say about the insurrectionists who invaded the Capitol.

Democrats and other liberals, by contrast, are hardly coming out in support of Gov. Cuomo. Nor have I heard them laud the Lincoln Project and Mitt Romney lately, though I don’t pay as close attention to these things as does Ross Douthat. As for the Europeans, have liberals been dismissing troubling numbers as “fake data”? Do liberals circle the wagons when one of them has stumbled? Ask Al Franken.

I’m not sure how to account for this difference. Is it just Trump? Or do conservatives generally run no risk of losing support when they shoot someone in the middle of Fifth Avenue or the Capitol building?  I’m skeptical about Jonathan Haidt’s “moral foundations” as causes of behavior rather than as after-the-fact justifications, as when conservatives use “loyalty” as an ideal to support their choices. But in this case maybe their stronger emphasis on loyalty leads them to defend their “legends” even when those paragons have done things unbecoming a legend.


Singing Badly — Farce and Tragedy

March 1, 2021
Posted by Jay Livingston

Marx said that history repeats itself, first as tragedy, then as farce. Maybe, but sometimes it’s the other way round.

The woman who opened the CPAC meeting in Orlando with her rendition of the National Anthem* chose to do it a cappella. As you can hear, that’s probably because the true pitch of an instrument would only accentuate her notes that fall somewhere in between the keys of a piano. Besides, no accompanist could possibly keep up with her unpredictable key changes.

Did she have a precursor? She did indeed. “Darlene Edwards,” a parody of a nightclub singer, was a character created in the 1950s by Jo Stafford, a pop singer with classical training. Darlene performed overwrought versions of well-known songs like “I Love Paris” and “Autumn Leaves.” She would hit off-key notes and add or drop beats in a measure, all the while accompanied by “Jonathan Edwards,” her real-life husband Paul Weston, playing the florid style of piano you might hear in cocktail bars of the time. Here is how they destroy “Take the A Train.”

Jonathan and Darlene Edwards were clearly farce. The humor is based on the proposition that “this is not us.” And in fact they were talented musicians, and you get a sense that what they’re doing to the pitch and meter is far more difficult than a straight performance.

The CPAC singer’s two minutes on the stage is probably not tragedy, at least not according to literary definitions. But it is sad. There is no distance between the performer and the role. She even seems to think that she’s doing a fine job. **

There’s an obvious parallel to be drawn between her and the most important performer at CPAC, a man who apparently really does believe that everything he has done has been perfect. His performance too appeared to be farce, and it was easy to laugh at. Eventually, however, it became clear that this was no laughing matter.

-----------------------
* I still have no idea who she is. My searches on Google and Twitter turned up nothing. A Facebook friend said that she was the daughter or niece of someone who gives a lot of money to CPAC.

** I’m not sure where Florence Foster Jenkins fits here. According to Wikipedia, “The question of whether ‘Lady Florence’ . . . was in on the joke, or honestly believed she had vocal talent, remains a matter of debate.”


Can We Talk? – Redux

February 19, 2021
Posted by Jay Livingston

Is talking a concession?

This exchange turned up in my Twitter feed this morning.

In case the screenshot is not legible, Robert Wright is responding to a paragraph from a WSJ story, “U.S. Says It Would Meet for Nuclear Talks With Iran, Other Powers” (here, behind the WSJ paywall).

The plan was denounced by a key congressional Republican. “It is concerning the Biden administration is already making concessions in an apparent attempt to re-enter the flawed Iran deal,” said Rep. Michael McCaul of Texas, senior Republican on the House Foreign Affairs Committee.
Wright disagrees:
The “plan” being denounced by this “key” Republican is literally just to sit down and talk to Iranian officials. He's opposed to talking, which he considers a “concession”.
But McCaul is not the only one who considers talking a concession. We all do, at least when the talkee is someone we strongly disagree with. But should we? The tweet took me back to this post from 2006, when this blog was a mere toddler not even three months old. It was called “Can We Talk?” It seems as relevant today as it did then.

            *                    *                    *                    *

The news today is that North Korea has agreed to sit down in talks about their nuclear bomb. North Korea leader Kim Jong-il (son of former leader Kim Il Sung) had previously demanded that the US talk with North Korea one-to-one, but US leader George Bush (son of former leader George Bush) had refused. Lil' Bush refused direct talks and insisted that four other countries had to be there.  Lil' Kim eventually caved, probably because China was threatening to cut off its oil.  

North Korea isn’t the only country we won’t talk to directly. Syria, Iran, maybe others. As with North Korea, if we’re going to communicate with them at all, we need other countries as intermediaries to relay the messages.

When I was a kid, I would sometimes have a dispute with one of my brothers, and we’d get so angry, we’d refuse to talk to each other. At the dinner table, I’d say something like, “Tell Skip that if he doesn’t give back my racer, I’m not going to tell him where I hid his airplane.” My mother would dutifully turn to her right and repeat the message, as though my brother hadn’t been right there to hear it. Then she’d do the same with his answer. You see similar scenes in sitcoms and movies. Maybe it happened in your family too.

In real life, at least in my house, it never lasted long. Everyone would see how stupid it was, how impossible to sustain, and usually we’d wind up dissolving in laughter at how ridiculous we were.

I imagine our ambassador turning to the Chinese representative and saying, “You tell North Korea that we aren’t going to give it any food unless they stop making bombs.” China turns to North Korea, just as my mother turned to my brother, and repeats the same message. North Korea says to China, “Yeah, well you tell the US . . . .” and so on. That’s pretty much what these countries have been doing anyway, though without actually sitting down in the same room.

When people insist on this “I’m not talking to him” charade, we call it childish and silly. When nations do it, we call it foreign policy.

(Full disclosure: I think I may be borrowing — i.e., stealing— this observation from something I heard Philip Slater say many years ago.)




We Didn’t Talk About Healing and Unity in the 60s. Why now?

January 22, 2021
Posted by Jay Livingston

Now that the inauguration has finally settled the question of who is president, the calls for “unity” and “healing” will probably taper off. But for a while, you couldn’t turn on the TV without hearing those words. I wouldn’t have been surprised if, after the debacle also known as the Steelers-Browns playoff game, Mike Tomlin had said that the Steelers needed a time for healing.

In past times of national division, healing and unity were not part of the political discourse. They have become popular only recently, sort of like Liam and Olivia. In the 1960s, nobody named their kid Liam or Olivia. The 1960s was also, you may recall, a period of political conflict and division over civil rights and the war in Vietnam. Riots in the cities, assassinations of political leaders, killings and terrorism by White supremacists who were sometimes also cops and sheriffs. And yet, there wasn’t a lot of talk about healing and unity.


Unity actually declines in the sixties. Healing is just beginning its rise, and I suspect that much of the healing talk in those books was about personal rather than political healing. The crossover into politics does begin in the sixties, but the rise was nothing like what happened a quarter-century later.

Google nGrams, the source of the above graphs, counts words in books, so it lags behind the actual change in fashions. For something more up-to-the-minute I tried the Nexis-Uni tally of words in news publications. The graphs I could get quickly are not as nuanced, not as granular (speaking of fashionable words), but they show the same trends. The concern with healing a divided nation doesn’t set in until very late in the 20th century.

Why were we not talking about unity in the 1960s? My guess is that the difference between then and now is that although the nation was divided, it was not polarized. Certainly, the two major parties were not as polarized. The news media were also more concentrated, less divided. The most trusted man in America was a TV news anchor, something unimaginable today.

As for healing, its popularity is part of the more general diffusion of the language of psychotherapy into all areas of life, including politics. The therapy-based “issues,” as in “he has commitment issues,” has replaced the more secular “problems.” Decades ago, if I said, “Houston, we have an issue,” I would get a smile of recognition. Now, most people would think it was an accurate quote. We also talk about what someone “needs” to do rather than what they “should” do — the therapy language of personal needs replacing the morality language of right and wrong.* It’s a tribute to what might be called the triumph of the therapeutic that in a time when an actual disease has killed hundreds of thousands of Americans and infected millions, our talk of healing is all about politics.

---------------

* I’ve said this before in somewhat greater detail in earlier posts: here (Mad Men — Language Ahead of Its Time) and here (Needs — One More Time).




Dissing Hunter-Gatherers

January 20, 2021
Posted by Jay Livingston

I was listening to the podcast “Think Like an Economist” this morning, the episode called “Economic Growth — Improving Our Lives.” About two minutes in, I nearly choked on my coffee when I heard Betsey Stevenson say this.  

For pretty much the last million years, people were hunter-gatherers living a hand-to-mouth existence. The main focus of life was finding enough food to eat.

Now Betsey Stevenson and her partner in podcasting and life, Justin Wolfers, are widely respected economists. But what they say here about hunter-gatherers is flat out wrong.

Of course, we can’t be certain how foragers of 100,000 years ago actually lived. But the accounts that we do have of contemporary foraging societies paint a picture far different from the image of grim hunter-gatherers toiling unhappily for long hours to avoid starvation. Foragers spend far less time working than do people in agricultural or industrial societies. In fact, they don’t really have the concept of “work” since they do not separate work and the rest of life. And the basis of that life is involvement with other people, often in a manner we would call playful.

Immediately after the statement about foragers, Stevenson and Wolfers tell us what happened next.

Things got a little better when people started farming about 12,000 years ago. People went from spending most of their time finding food to growing food to stay alive. Unfortunately though, starvation was still common. There were innovations, but they rarely led to sustained economic change because political systems were designed to keep any extras in the hands of an elite few.

No, things didn’t get better, they got worse. Wolfers implies as much in the next sentences. Agriculture wiped out the freedom and equality that foragers take for granted. And yes, it did bring starvation. Even when people in agrarian societies weren’t starving, they had a much poorer diet than that of foragers, who ate a wide variety of plants and animals.

Little wonder then that foragers are also happier than people in more “advanced” societies. They are happy, but, as James Suzman (here) says of the Bushmen, they don’t have a word or concept for “happiness.”

Bushmen have words for their current feelings, like joy or sadness. But not this word for this idea of “being happy” long term, like if I do something, then I'll be “happy” with my life long term.

Perhaps Stevenson and Wolfers have this incorrect picture of life before the agricultural revolution because they are economists, and economics is about scarcity. In fact, one definition of economics is that it is the study of the allocation of scarce resources. But foragers like the Bushmen live in a world of abundance relative to their wants and needs. Scarcity was something imposed by economic growth.

As the title of the podcast episode implies, economists take it for granted that economic growth improves our lives. But does it? I think we need to ask two other questions first: “Compared to what?” And “How do we measure how ‘good’ our lives are?” Economists are not comparing us to the Bushmen, nor is the economists’ idea of a “good” life one that foragers would have. In other words, the economists’ vigorous cheerleading for economic growth requires that we ignore the evidence from most of the history of our species.*

----------------------------------

* Given that for more than 90% of our history on the planet we humans were hunter-gatherers, you’d think that social scientists would not base their ideas about “human nature” on only the most recent sliver of that history. But they do. See these posts from a decade or so ago — one about virginity, the other about private property.

Like a Virgin — Whatever That Was

Sandbox Sociology — Sharing and Human Nature


Grow Up

January 19, 2021
Posted by Jay Livingston

Ever since I watched the events at the Capitol on January 6, I have been trying to pin down just why it all seemed so childish. Childish with serious consequences — people were injured and killed — but childish nevertheless. Trump’s childishness is easy to see. His name calling and trash talking; his casting of everything in absolute terms — things are a “disgrace” or a “disaster” if Obama did them, “fantastic” if he did them; his sense that he is the victim of unfair treatment; his refusal to do anything that might benefit others but not himself; his petulance (his refusal to attend the inauguration being the latest and most obvious example).

His supporters, too, often seem childish. I’m not talking about those who are motivated by real issues — rich people who want a huge tax cut, businesses that wish to avoid environmental regulation, people who think that abortion is murder. I’m talking about those whose support for Trump is less a matter of issues and more a matter of identity. I’m talking about the insurrectionists at the Capitol. Maybe what seems child-like is their refusal to separate play and “reality.”

They had a specific real-world purpose – “stop the steal.” Maybe they had managed, with Trump’s help, to convince themselves that this was possible. But by breaking into the halls of Congress? Get real.

But the insurrectionists were not about getting real. They were playing — playing at being 1776 patriots, playing at being soldiers and commandos with their camos, their climbing gear, and their zip-tie handcuffs. It was like playing some combination of paintball and capture the flag.  As in play, there was no real external goal. The goal was to capture the Capitol. Once they had succeeded in breaking into the building, they were like the dog that catches the car he’s been chasing.  Videos show them uncertain of what to do, wandering around like tourists, taking selfies, making videos. Sometimes they remember to chant their slogans (“USA,” “America First,” “Trump”), like fans at a football game. A video shows a group in the Senate chamber using their smartphones to photograph documents lying on desks. But it’s clear the men have no idea what they are photographing or why.

Yes, there was real violence. But that too seems to have had the same purpose — getting in. And a small number may have had actual plans to kidnap members of Congress. But I wonder what they would have done if they had managed to find any. As Fabio Rojas tweeted, “A bunch of cosplaying MAGA nerds won't topple the Federal government.”

                                  *                          *                       *                          *

“It’s time to grow up,” said President-elect Biden. His remarks were occasioned by the refusal of Republicans in Congress to wear masks even when they were confined in close quarters during the insurrection, even when a Congresswoman repeatedly offered them masks, and even though a House rule requires them to wear masks.

As I was listening, I thought: what a change this is — to have an actual grown-up in the White House asking that elected officials, and by implication, the nation, act like grown-ups.

Capitol Losses

January 10, 2021
Posted by Jay Livingston

A friend asked on Facebook, “There is something that is kind of mystifying me. Trump clearly crossed some kind of line with the Capitol riot, but what line was it exactly, and why wasn't it crossed before?”

We don’t really know yet how Trump’s supporters and other Republicans reacted. The surveys currently available give different estimates. Generally, Republicans did not think highly of the rioters. Their views of Trump, however, were more charitable. Most thought he was not at all to blame for what the mob did. Go figure.

In contrast to the street Trumpists, Republican officials — legislators, White House staff — are now criticizing Trump. But why? Why is this outrage different from Charlottesville, the Ukraine shakedown, the pardons to Manafort and Roger Stone, or any of Trump’s other breaches of norms and laws?

At times like this, I turn to religion — or rather to Durkheim’s ideas about religion. In Durkheim’s view, religion is all about the group. Whatever benefit the individual may draw from it, religion, including American “civil religion,” is of the group, by the group, and for the group. Its symbols — a cross, a flag — are symbols of the group, the entire group. The Stars and Stripes represents the US as a whole, and Americans get very upset by someone setting one of these flags on fire.

The Capitol is the same kind of symbol. The word included in so many of the public statements — by members of Congress, in news editorials — was sacred. “Sacred things,” says Durkheim, “are simply collective ideals that have fixed themselves on material objects.” [emphasis added] So on Wednesday, when Trump’s army was desecrating the Capitol, it was hard for most Americans to see him as a patriot, as someone who loved America and revered its symbols. (It was especially hard for those who were hiding inside, fearful that the mob Trump had sent after them might do them real harm.) Trump was symbolically attacking our country.

So were his supporters. They called themselves patriots, and they carried American flags. But some of the mob carried the flag of the Confederacy, a country that fought a war against the United States of America. And many of the flags had no American symbolism at all; they were purely Trump banners, suggesting that the mob’s loyalty was not to the country but to a single man.


Durkheim would have a similar take on Trump’s refusal to attend the inauguration. The inauguration is a ritual, and for Durkheim, rituals are the most important element of religion. If sacred symbols embody the group in an object, then rituals embody it in the behavior of the members.

Rituals mark the group’s boundaries. They tell you who is in the group and who is not. By attending, you identify yourself as a member. By not participating, by not joining the group, you define yourself as not a member, as not one of “us.” So the answer to “Why do I have to go to mass?” is not any utilitarian one, but rather one of defining group membership: “Because you’re a Catholic.” When someone says, “Because you're a . . . .” that last word that completes the sentence (a Catholic, an American, a member of this family, etc.) tells you which group the ritual is defining, and it implies a commonality with everyone else who participates in that ritual and is therefore a member of that group.

Ex-presidents come to the inauguration. They are living symbols of the continuous line of American history from 1789 to the present. By absenting himself, Trump is saying that he is not part of this history and not part of the group whose history it is. If Trump goes to Scotland, as is rumored, if he leaves the country, he only adds to this image that he stands outside of American history and outside of America.

The attack on the Capitol too was a ritual — that’s obvious — and like other rituals, it drew a boundary between “us” and “them.” But in this case, the Trumpists — and by implication Trump himself — were drawing a line between themselves and  the nation as a whole.

I expect that Trump’s base will remain loyal. Whether he shoots someone in the middle of Fifth Avenue or pushes a mob to attack the Capitol, he’s got them. But people who do not share that deep and emotional attachment to Trump may move farther away. The line that Trump crossed is the boundary of the nation. National unity is a myth of course, not part of our everyday reality. It is a “collective ideal” that we bring out for ceremonies of our civil religion. In attacking his enemies, Trump also attacked symbols of the nation as a whole.


Ring in the New — With a Bit of the Old

January 5, 2021
Posted by Jay Livingston

Last week, I learned a new word, skeuomorphism — elements of a new technology that look like those of the old technology, elements that are now unnecessary. Virtual sticky notes for your computer screen or a push-button phone that looks like a rotary phone. The examples I found were all in design. But maybe skeuomorphism can apply to language as well.

Many of the stories about Trump’s recent phone call to Georgia referred to “tape.”

But there was no actual tape.

The call was recorded on some sort of electronic drive.

Radio journalists too use this term. They talk about “getting good tape” — audio quotes that will sound good. A 2019 Columbia Journalism Review article had the title “For the record: 18 journalists on how—or whether—they use tape recorders.” Some of those journalists refer to “recording” on a “device” or iPhone. But some use “tape” to mean 21st-century recording.

Is there a different word for this — using a word from an old technology even though that bit is no longer in use? And are there any other examples? I can think of only one. Musicians still sometimes talk about making “a record.” Not a recording, not an album, but a record.

Does  “cc” qualify? People still say, “I’ll cc you on that memo,” knowing full well that “cc” is an abbreviation for “carbon copy” and that there is no carbon paper involved. But “cc” has become a stand-alone term, now free of its finger-smudging origins. Nobody says, “Send me a carbon copy.”

The YouTube logo is an example of skeuomorphism in both design and language.


The image is shaped like the screen of an old television set, not a modern flat-screen. Those old TVs also used cathode-ray tubes. By the time YouTube came along (2005), nearly all TVs were flat-screens with square corners. And gone were the days when you might hear someone refer to “the tube” (or “the boob tube”). Yet YouTube chose to retain both the word and the image of the old technology.

The Lack of a Need for Unclear Negative Writing Can Never Be Underestimated

January 4, 2021
Posted by Jay Livingston

Twelve years ago in this blog, I wrote a post with the title, “Accentuate the Positive; Eliminate the Negative.”*  I was borrowing the title of a hit song of the 1940s. The post, unlike the song, was about language and writing, not general philosophical outlook. My point was simply that the more negatives a writer loads into a sentence, the harder it is for readers, and sometimes the writer, to understand the meaning of that sentence.

Those negative words include more than just no, not, and never. My made-up example was a parody of those newspaper summaries of Supreme Court decisions. “The court failed to overturn a lower-court ruling that denied a request to reverse . . .”

Do real writers write like that? Yes, sometimes to the point that they lose track of their multiple negations and write something whose literal meaning is the opposite of what they intended. Here’s economist Noah Smith at Bloomberg a few days ago (here). He was arguing that Texas, if it is to succeed in its effort to become a technology hub, will have to do something about “noncompete” clauses that prevent employees from leaving one firm and taking a job with a competing firm.

Banning noncompetes would be inconsistent with Texas’ principles and reputation as a defender of free markets. Noncompete agreements are restrictions on the free movement of labor; they gum up markets.


 Banning, noncompetes, inconsistent. Smith nails the triple. 

Smith is an economist (and a Texan). He likes free markets and competition. If noncompetes gum up markets, then banning them would free up those markets. So banning them would be consistent with principles of free markets. What he meant was, “Noncompetes are restrictions on the free movement of labor; they gum up markets. Banning them would be consistent with Texas’ principles of free markets.”

------------------------------------
*Other examples noted in this blog include a Financial Times piece by someone  “widely considered to be one of the world's most influential writers on economics” (“No, No, a Thousand Times No”) and even the GSS ( “The Power of Positive Phrasing”).

A Cold War Joke and US Healthcare

December 23, 2020
Posted by Jay Livingston

There was a joke Republicans liked to tell about Soviet Russia back during the Cold War. Republicans then, unlike Republicans now, were highly critical of the Russian government and its leaders. The joke was about an American visitor getting the official tour. He is taken to a factory, where he gets an interview with a worker. The worker proudly tells him how, working at this wonderful job in this state-owned factory, he has saved enough money over the years to be able to buy a car.

“And what are you saving for now?” the American asks.

“A pair of shoes.”
   
                           *                       *                        *                   *

I was talking to a friend last night. “Emma took a job at UPS,” she said. Emma is her daughter-in-law. “It’s back-breaking work, and she can only do it part time. And you know why she took the job? Because at UPS, even part-time workers get health coverage.” My friend added that her son, Emma’s husband, gets medical benefits that cover only him, not the family. They looked at the available insurance plans, and to get anything decent, it would cost them $1500 a month.

“So,” said my friend, “they’re saving as much as they can so that Emma can quit the UPS job and they can buy health insurance.”

                           *                       *                        *                   *

The Soviet worker didn’t know that in other countries, people don’t have to scrimp and save to buy a pair of shoes. He didn’t realize that his unscripted answer revealed deep flaws in the Soviet system.

My friend’s son and daughter-in-law know what healthcare is like in other countries — many Americans don’t. They know that their predicament reveals deep flaws in our healthcare system. They just can’t do anything about it.

Doctor? My Eye.

December 14, 2020
Posted by Jay Livingston

“Madame First Lady — Mrs. Biden — Jill — kiddo: a bit of advice... Any chance you might drop the ‘Dr.’ before your name? ‘Dr. Jill Biden’ sounds and feels fraudulent, not to say a touch comic.” So begins Joseph Epstein’s WSJ op-ed of this weekend “Is There a Doctor in the White House? Not if You Need an M.D.”
   
Many on the left got upset. They disliked the tone. Smugness has long been a chronic, perhaps unavoidable, flaw among right-wing intellectuals, and Epstein is not as bad as most of them on that score. His opening descent down the ladder of formality of terms of address seems more friendly than condescending. The article is, after all, about what to call the First Lady, and he’s trying out several possibilities.  Even so, he seems to be trying to trigger the libs, and triggered they were.

Some people accused Epstein of sexism. You wouldn’t have done that if it were a man — that sort of thing. Counterfactuals like this are hard to prove, but the critics may be right. Epstein’s main argument against “Dr.” Biden is that this honorific should be reserved for medical doctors (“A wise man once said that no one should call himself ‘Dr.’ unless he has delivered a child.”) Jill Biden has no medical degrees. She has an Ed.D.

However, six years ago, Epstein wrote, “One teacher I do remember fondly was Dr. Branz, a German émigré who taught a course called Commercial Law. He must have been a refugee from Hitler, with a doctor of laws degree...” Yes, Dr. Branz, a law professor. I doubt that this juris doctor had ever moonlighted as an obstetrician.

The WSJ has no objections to using “Dr.” for non-physicians in the White House — among others, Dr. Condoleezza Rice and of course non-M.D. Henry Kissinger, who insisted on being called “Dr. Kissinger.” As far as I know, Epstein never gave Rice or Kissinger the same friendly advice he’s offering Jill Biden.
                                                                
Why shouldn’t we use the same honorific for advanced degrees both medical and academic? Is it confusing? Or does calling our teachers “doctor” cheapen the value of medical doctors? Epstein implies that it’s both. Equating physicians and professors does not fit with a value system that accords teachers much less prestige than they might have in other cultures.

Once long ago, I taught English for one semester in a high school in a small town (pop. 3000) in Japan. My students addressed me as Jay-sensei, sensei being the Japanese word for teacher. I lived with a Japanese family. One day, I had some stomach problems. My family insisted that I go to the doctor — Kimura-sensei. Hmm, I thought, we call our teachers “doctor”; the Japanese call their doctors “teacher.” A commentary on their values?

Doctor originally referred to theologians, explainers of doctrine — closer to teachers than to physicians. Dr. Webster explains:
    The word doctor comes from the Latin word for “teacher,” itself from docēre, meaning “to teach.”

    The 14th century was the birth of the Renaissance, and lots of teaching and learning was afoot. By the century’s end, the word doctor was being applied not just to a select few theologians, but also to qualified and/or accomplished academics and medical practitioners.

I don’t know the history of sensei. Maybe in Japan, as we speak, some Epsteinian curmudgeon is complaining about all these physicians who insist on being called sensei even though they have never taught even one student how to interpret a multiple regression.

Trump's Defense Fund — Fleecing the Rubes

December 7, 2020
Posted by Jay Livingston

What Trump learned at Trump University was the value of fleecing the rubes. Most of the other victims of his salesmanship had been big-money people and institutions — banks and other supposedly wise investors. The $3000 that Trump U students lost — and some got ripped off for tens of thousands — is serious money for the individual but small potatoes for Trump.

He made it up in volume. The scam grossed at least $40 million.

Now he’s at it again. So far, his Election Defense Fund has raised over $200 million.

I can understand the Trump U victims. They thought that their money would buy them the secrets to real estate success. They would be repaid a hundredfold. But what do the Election Defense Fund contributors get?

First, and most obviously, they think that the money will get them four more years of Trump. This belief requires a prior belief that is axiomatic among Trump loyalists:  believe Trump, not the so-called experts.

From the start, the consensus among legal scholars was that Trump’s cases would not win. As one lawyer I heard on (I think) NPR said, “His chances are slim and none, and slim just left town.” But if Trump and his lawyers say that they have a ton of evidence of massive fraud, it must be true, and these other voices must be fake legal opinion.

It has now become clear that the experts were correct. Trump has lost one legal challenge after another. So much losing. But still the donations roll in.

For Trump, of course, everything is, to use the current term, “transactional.” The fine print on the Election Fund website says that for donations of less than $5000, nothing will go to the legal fight. Instead, 25% will go to the Republican National Committee, and 75% to “Save America,” a Trump PAC. There are restrictions on what Trump can do with that money. For instance, the rules say that he can’t use it to pay off his campaign debts and certainly not his personal debts.

But, as Brendan Fischer, Federal Reform Program director at the Campaign Legal Center, said, "Although Trump cannot use campaign funds to pay himself or his family members excessive salaries, or to buy enough copies of Don Jr.'s book to land it on the bestseller list, he might try to use leadership PAC funds for such purposes.”

Remember, we’re talking about Donald Trump. If nobody can or will effectively enforce the rules, the rules don’t matter. Even if the rules are enforced, breaking them might have been worth it. After all, Trump U was fined $25 million, probably less than half of what it brought in. And in the end it wasn’t Trump who paid the judgment. It was some billionaire supporter. 

Even if Trump did use the money for whatever he wanted, including paying off personal debts, he probably wouldn’t lose much support. For his followers, a donation is not transactional. The rewards it brings are intangible: it strengthens their identity as members of the group, and it reaffirms the value of that group. Those functions are especially important for those who perceive their group as being under attack. And Trump supporters, correctly or incorrectly (mostly incorrectly), see themselves as being discriminated against because of their race (White), their religion (Christian), and their views on gender roles (“traditional”). Trump embodies their ideas, and he perfectly and loudly expresses their resentments against those who are supposedly discriminating against them. You can’t put a price on that.

Trump Claimed Vote Fraud in 2016. What’s Different This Time?

December 4, 2020
Posted by Jay Livingston

I was having an e-mail conversation with a Christian conservative. He still believes Trump’s claims of massive voter fraud. I’m willing to accept the conclusion of all those judges (including some Republicans and at least one Trump appointee), governors, secretaries of state, and election officials.

But Trump still has a large core of believers who, despite Trump’s losing all those court cases, still believe his claims that the election was rigged. That seems new. If McCain or Romney, or, on the Democratic side, Hillary, had made similar claims that had similarly failed in court, would their supporters have continued to believe them? Would they have made the kind of high-volume protests that we now hear? Probably not. But why, I asked. Why is 2020 different?

Usually, I find myself on the structural side of these questions and shy away from explanations based on individuals and individual-level variables. But this time, it was my conservative correspondent who looked first at structural changes. (He is not a sociologist, not a political scientist, not an academic.) He wrote:

    It’s hard to compare this 2020 election with any before it, at least in my lifetime, with the preponderance of mail-in ballots. It was shocking to me to see the vote counting stopped simultaneously in these key states.

For me, the big difference was the relation between Trump and his supporters. It wasn’t Trump himself. The similarities between Trump 2020 and Trump 2016 are too obvious to ignore. In 2016:
    1. Trump lost the popular vote (about 3 million in 2016, 7 million this year).
    2. Trump claimed that the vote was rigged and that there was fraud.
    3. He appointed a commission to investigate and prove him right. The person he appointed to head the commission, like his lawyers this time around, was an ardent supporter, who echoed Trump’s claim of fraud even when other Republicans were silent on the matter.
    4. The commission could not find evidence of widespread voter fraud.
    5. Trump disbanded the commission but still insisted that Clinton’s numbers were due to massive fraud.
    6. Trump continued to lie about the commission and the election.

Trump 2020 was perfectly predictable. He did the same thing when he lost at the Emmy Awards — insisted that the voting was rigged. The important difference in 2020 is not the mail-in ballots. It’s the reaction of several Republican politicians and millions of Trump voters. Their continued belief is partly a simple matter of confirmation bias — recognizing only the evidence that confirms your idea, even when that evidence has been discredited, and finding reasons to reject disconfirming evidence. But given how extreme their reactions have been — death threats against election officials and their families — there has to be something else going on. But that’s a matter for another post.


Can “Up” Make Masks Masculine?

November 6, 2020
Posted by Jay Livingston

A lot of people in this country still refuse to wear masks. In most places, no laws require masks, so anti-maskers will have to be persuaded. But how, especially now that those same people associate masks with weakness and femininity? Appeals to altruism run up against American individualism. As the subway rider in New York said this week when other riders repeatedly asked him to wear a mask and even held out masks for him to take, “I live by my principles. . . I don’t wear a muzzle.” (The full story is here.)

One strategy that seems to have caught on is “Mask Up.”


These campaigns remind me of “Don’t Mess With Texas.” Now it’s an all-purpose slogan, but it originated in 1985 as a campaign to get “Bubba,” the stereotypical Texas truck-driving male, to stop tossing beer cans and other litter onto the highway. Highway beautification had the same problems as masks. It required that the individual inconvenience himself for the sake of a goal that benefited only the general society, not himself, and in a way that was not immediately visible. In addition, the goal of highway beautification reeked of flowers and femininity.

A slogan like “Let’s Keep Our Roads Beautiful” wasn’t going to cut it. But “Don’t Mess With Texas,” with TV ads featuring Dallas Cowboys linemen, combined masculine toughness with state chauvinism.*

Covid is a far more serious problem than litter, but the strategy is the same — masculinity and local patriotism. The pugnacious “Philly Never Backs Down. Mask Up” seems too similar to “Don’t Mess With Texas” to have been a coincidence. But it’s the “up” that I find most interesting. “Mask Up,” to my ear at least, sounds more masculine than “Wear a mask.” I’m not sure why. Maybe the “up” implies a bold action, like an athlete suiting up for the big game, a game for which he is also amped up, revved up, and even juiced up.

“Listen up,” says the coach in the locker room just before he gives his speech to get the team psyched up. To “listen up” is more active than to merely “listen.”

The Texas campaign reduced highway litter by 72%. Unfortunately, I don’t know of any research showing the effect of “Mask Up.”

-------------------
* See my 2009 post Lone Star Litter and Values.