Showing posts with label Language and Writing. Show all posts

Mrs. Maisel Yet Again — 1961 Talking Like It’s 2011.

April 18, 2023
Posted by Jay Livingston

Five years ago, I posted here about the language anachronisms in “The Marvelous Mrs. Maisel.” That post (here) remains the most visited page in this blog, the number of views now approaching 20,000. Last night I watched the first two episodes of Season Five, and I have the same question that I had five years ago with Season One: How can they let these obvious anachronisms make it into the final script?

In Season Five, the year is 1961, and Midge has taken a job in the writers’ room for a TV show much like The Tonight Show. She was reluctant to take the job – she’s a performer, not a writer — and on her first day, she calls her manager Susie to ask for advice.

“Fake it till you make it” may be good advice, but nobody used that phrase in 1961. Nobody. Nor did anyone talk about “going rogue.” But in Episode One, Midge tells Susie, “We had a plan, then I went rogue.”

I ran these phrases by Google nGrams. Here are the results:

(Click on the image for a larger view.)

These phrases did not exist in 1961. Yes, the nGrams data is from books, and trendy phrases turn up in speech — in conversations, on television — before they appear in print, so we should allow for some lag time. A few years perhaps, but not a few decades.

There were others you can find on nGrams: out of the loop, track record, not on the same page. These too arrived much later in the century. Other words in Season Five just sounded wrong — at least to my ears, and my ears were around in 1961 — but I had no quick way to check them: not gonna happen, and suck meaning to be of poor quality.

In that 2018 post, I quoted Amy Sherman-Palladino, co-creator and chief writer of the show, saying that she hired a “delightful researcher who has like twelve masters degrees in everything in the world” and who questions things that don’t sound right. Sherman-Palladino herself says, “The last thing I want to do, when everyone is making sure that the piping on the wall and the colors are all correct, is . . . come in and throw in a bunch of dialogue that’s not appropriate.”

But the glaring anachronisms remain, and I’m still puzzled.

When Chappelle Says It, It’s Funny

November 18, 2022
Posted by Jay Livingston

Dave Chappelle, in his SNL monologue, offered an insight about language that I’ve used a few times in this blog. It’s about adding the definite article “the” to a demographic category.



Here’s how I put it in a blog post seven years ago after candidate Donald Trump (remember those good old days? they’re back) had told an interviewer, “I’d be phenomenal to the women.”

When you add “the” to a demographic group and speak of “the women” or “the Blacks,” you are separating them from the rest of society. Without the definite article, they are included. To say, “In our society we have Blacks, Jews, women. . . . .” implies that they are all part of our group. But, “We have the Blacks, the Jews, the women . . . .” turns them into separate, distinct groups that are not part of a unified whole.

Chappelle got more laughs.

In another post a year later (here), I quoted linguist Lynne Murphy on the same topic.

“The” makes the group seem like it’s a large, uniform mass, rather than a diverse group of individuals. This is the key to “othering:” treating people from another group as less human than one’s own group.


Turning those individuals into “a large, uniform mass” not only allows for “othering”; it’s also the precondition for paranoid conspiracy theories. Even if, as Chappelle suggests,* there are a lot of Jews in Hollywood, you can still see them as individuals, as Jews trying to turn out successful movies and TV shows. To see them as a cabal conspiring against Kanye or Christians or America, it helps to think of them as “the Jews.”

---------
* See also Joel Stein’s 2008 article “Who Runs Hollywood? C’mon” (here).




“Julia” — Serving Up Words Before Their Time

May 4, 2022
Posted by Jay Livingston

“The Marvelous Mrs. Anachronism” (here) is the post in this blog with by far the most hits and comments. And now we have “Julia,” the HBO series about Julia Child and the creation of her TV show “The French Chef.” It’s set in roughly the same time period, the early 1960s. And like “Mrs. Maisel,” it offers a rich tasting menu of anachronisms.

I don’t know why the producers don’t bother to check their scripts with someone who was around in 1962 – a retired sociologist, say, who is sensitive to language – but they don’t. Had they done so, they would have avoided the linguistic equivalent of a digital microwave in the kitchen and a Prius in the driveway. They would not have had a character say, “I’m o.k. with it.” Nor would an assistant assigned a task say, “I’m on it.” Nobody working with Julia would be excited to be on the front lines of “your process.” “Your method” perhaps or “your approach” or even “all that you do,” but not “your process.”

If you’re a TV writer, even an older writer of fifty or so, these phrases have been around for as long as you can remember, so maybe you assume they’ve always been part of the language.

But they haven’t. Sixty years ago, people might have asked how some enterprise made money or at least made ends meet. But they would not have asked it the way Julia’s father asks her: “What's the business model down there? Does public television even* have a business model?” 

In her equally anachronistic reply, Julia says, “Nothing's a done deal yet.” That one too sounded wrong. I don’t recall any done deals in 1962.

To check my memory, I went to Google nGrams. It shows the frequency of words and phrases as they occur in books. Most of the phrases that seemed off to my ear did not appear in books until the 1980s. A corpus of the language as spoken would have been better, and there’s a lag of a few years before new usages on the street make it to the printed page. But that lag time is certainly not the twenty years that nGrams finds.
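For anyone who wants to replicate this kind of check, the data behind an nGrams graph amounts to a list of yearly frequencies for each phrase. Here is a minimal sketch, in Python, of the test described above: find the first year a phrase registers at all. The frequency figures below are invented for illustration; they are not actual Ngram data.

```python
# Sketch of the check described above: given yearly frequencies for a
# phrase (the kind of series the Google Ngram Viewer plots), report the
# first year the phrase shows up at all. The numbers here are invented
# placeholders, not real Ngram figures.

def first_attested_year(years, freqs, threshold=0.0):
    """Return the first year whose frequency exceeds threshold, or None."""
    for year, freq in zip(years, freqs):
        if freq > threshold:
            return year
    return None

years = list(range(1950, 2001, 10))  # 1950, 1960, ..., 2000
done_deal = [0.0, 0.0, 0.0, 1.2e-8, 5.0e-8, 2.1e-7]  # hypothetical values

print(first_attested_year(years, done_deal))  # prints 1980
```

On this (made-up) series, “done deal” would first be attested in 1980, two decades after the scene in which Julia utters it.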

(Click on an image for a larger view.)

In another episode, we hear “cut to the chase,” but it was not till the 80s that we skipped over less important details by cutting to the chase. (Oh well, at least nobody on “Julia” abbreviated a narrative with “yada yada.”) Or again, a producer considering the possibilities of selling the show to other stations says, “This could be a game changer.” But “game changer” didn’t show up in books until four decades after “The French Chef” went on the air.

“This little plot is genius,” says Julia’s husband. It may have been, but in 1962, genius was not an adjective. An unusual solution to a problem might be ingenious, but it was not simply “genius.” Even more incongruous was Julia’s telling the crowd that shows up for a book signing in San Francisco, “I'm absolutely gobsmacked by this turnout.”  Gobsmacked originated in Britain, but even in her years abroad, Julia would not have heard the term. Brits weren’t gobsmacked until the late 1970s, with Americans joining the chorus a decade or so later.

I heard other dubious terms that I did not know how to check. “The Yankees are toast,” says one character, presumably a Red Sox fan. It’s not just that in 1962 the Yankees were anything but toast, winning the AL pennant and the World Series; I doubt that anyone was “toast” sixty years ago.

The one that bothered me most was what Julia’s friend Avis says after making a small play on words. She adds, “See what I did?” I’m pretty sure this is a very recent usage and was not around in 1962. I’d just as soon not have it around today.

Finally, in the latest episode, which I just now saw and which inspired me to write this post, we have the anachronism that nobody notices — “need to” instead of “should” or “ought to” or other words that carry a hint of what is right or even moral. In “Julia,” a young couple meet for lunch at a diner. It’s a blind date, and as they talk, it becomes clear that they are a good match. They talk some more, and we cut to a different plot line. When we come back to the diner, the couple are still there, still talking, but they are now the only ones left in the place. The waitress comes to the table and tells them patiently, “You need to go.”

What she means of course is that she needs for them to go. In 1962, she would not have phrased it in terms of their needs. She would have said, “You have to go.”

-------------------------------
* “Even” as an intensifier in this way may also not have come into use until much later in the century. See this Language Log post on “What does that even mean?”





Cotton-picking — Real and Metaphorical

February 24, 2022
Posted by Jay Livingston

Words change. Usually the change in the literal meaning is so gradual that it's hard to see. More visible may be the change in the political and emotional meanings that surround a word, and those meanings often depend on who is using it.

“Get your cotton-picking hands off that” is what the substitute teacher at Farmington High in Michigan said. The Black student she said it to couldn’t believe his ears. “What did you say?” She repeated it and explained. “This kind of comment was a very common comment. And it was a very innocent comment. It was not meant to be offensive in any way.”

The verb tense is important. The adjective “cotton-picking” was very common. I don’t know how old the substitute teacher is, but on the video, she does not sound young. (You can hear her on the video posted to TikTok.) She probably thinks of the word the way it was used in the previous century. Fifty years ago, cotton-picking was a word intellectuals might use to make a statement seem down-to-earth. Milton Friedman in 1979: “before government and OPEC stuck their cotton-picking fingers into the pricing of energy.” A character on a sitcom — a White, non-Southern character — might say, “Are you out of your cotton-picking mind?” It was funnier that way, believe me. It was a way of expressing disapproval but in a friendly, joking manner. White people used it as a way to sound folksy and informal, perhaps in the way some well-educated, non-Southern people in this century have adopted “y’all.”

Google nGrams is not a good source on this one, but for what it’s worth, it shows cotton-picking as an adjective increasing till 1940 and declining steadily thereafter.

I suspect that nearly all of those instances from before 1950 were literal — things like references to cotton-picking machines. The metaphorical, disparaging cotton-picking came later. You can see this in the line for cotton-pickin’ since the dropped-g version would not have been used to talk about farm equipment. The earliest use the OED could find for this meaning was for this more colloquial spelling. It appeared in 1958, in the New York Post, which was then a liberal newspaper.* “I don’t think it's anybody’s cotton-pickin’ business what you’re doing.”

Of course, using cotton-picking this way worked only for people whose lives and world lay far from the actual picking of cotton.  That was the world of the Michigan substitute teacher, and she used the word without ever thinking about its origins, in the same etymology-ignoring way we all speak. But for the Black kid, the word evoked the history of slavery and post-bellum racial exploitation in the Jim Crow South. And there was nothing friendly, funny, or folksy about it.

The teacher later said that she now realized that the term was offensive, but she maintained that her motives and intentions were innocent if ignorant. If only she had been — what’s the word here? If only there were a word that means aware of racist aspects of US history, aware of how privilege even today has a racial component, and sensitive to the ways those things might look to Black people. Wait, there is a word — “woke.” Or maybe I should say that there was a word. The political and emotional connotation has changed rapidly; so have the people who use it. The people who use woke now are White, and they are waving it about as something to be rejected.
--------------

*The joke back then was that a front-page weather story in the Post might run with the headline: Cold Snap in City. Negroes, Jews Hardest Hit.


Did They Really Say That in 1882?

February 12, 2022
Posted by Jay Livingston

Language anachronisms in “The Marvelous Mrs. Maisel” and “Mad Men” often came through loud and clear, at least to my ears. The shows were set in 1960, a time when I was alive — speaking and listening. (See earlier posts here and here.) “The Gilded Age” on HBO is set in 1882, before my time. Still, some of the language in this week’s episode, “Face the Music,” sounded more recent. Julian Fellowes, who created the show and did much of the writing, came in for some criticism (here, for example) for the language anachronisms in his “Downton Abbey.” I can just see him chuckling now as he waves the title “Face the Music” to lure in the language police and then swats them back by having Mr. Russell say, “To employ a modern phrase, I'm afraid you must face the music.”

OK, “face the music” was not a phrase before its time. But in 1882 it wasn’t exactly modern either. My own memory does not extend back to 1882. That’s why we (and that includes Mr. Fellowes) have the Oxford English Dictionary, and according to the OED, this “modern phrase” has one example from a newspaper fifty years before the Gilded Age and another from 1850: “There should be no skulking or dodging...every man should ‘face the music’.”

If you’re not watching the show, know that Mr. Russell is the nouveau-riche businessman. His adversaries who must face the music are the establishment wealthy. They have connived to ruin him financially, but Russell outwits them, using his own wealth to put them on the verge of financial ruin. He will have his revenge. “I didn't see this coming. I admit it. I thought you were honorable men. Not too honorable to miss the chance of a fat buck, of course, but not greedy, dirty thieves.”

He adds, “I thought I was the one who might throw a curveball.”

(Click on an image for a larger view.)

Curveball? By 1882, pitchers had been throwing curveballs for a decade. But they were literal curveballs. Metaphorical curveballs didn’t come into play for another half-century.

In an earlier scene, Russell’s daughter uses the phrase “the thing is.”

This too sounded modern to my ears, even if she did not use the double “is” that many people today add, as in “The thing is is that it’s very recent.” I may have been wrong. The OED finds Matthew Arnold using it in 1873. “The question [of a state church]..is..so absolutely unimportant! The thing is, to recast religion.” I’m not sure that this is exactly the way we use it. The first clear example of that in the OED is from John Galsworthy in 1915. “Look here, old man, the thing is, of course, to see it in proportion.”

Finally, there was “identify.” Miss Scott has submitted her short stories to a newspaper. They are, the editor tells her, “beautifully constructed and executed.” The problem is that Miss Scott is Black and so is the main character in the story under consideration. The editor tells her that some adjustments will be necessary.

“The little colored girl would need to be changed to a poor white child.”

Why, she asks.  

“Our readers will not identify with a colored girl's story of redemption.”

I was mostly wrong about this one. Identify in this sense goes back at least to the early 1700s. But until the mid-20th century there was always a pronoun like himself or oneself between identify and with. What the editor should have said is “Our readers will not identify themselves with a colored girl’s story.” In 1882, the reflexive pronoun was still required. Today, it has been absorbed into the word identify.




Trends in the Word Market

February 10, 2022
Posted by Jay Livingston

Kieran Healy tweeted recently about his 2017 paper “Fuck Nuance.”

(Click on an image for a larger view.)

I wondered again, as I wondered when I heard Kieran present this paper at the ASA meetings,* is nuance itself a recent thing, or is it just that the word has become fashionable? The Nexis-Uni database of news finds only seven instances of the word before 1975, the first coming in 1969. Before then, there was no nuance to fuck.

The word seems to have been put into play by theater critics. But surely there must have been performances in earlier decades that critics of the 70s and beyond would have called “nuanced.” Praise for scholarly writings as nuanced happens a decade or so later. But how might an earlier take on those same performances or writings have phrased it?

I don’t know.

Then there’s “sustainable.” How I wish I had bought stock in Sustainable in 1980. It would have been like buying Bitcoin in 2010. But in this case, I have a good idea of what word sustainable replaced: viable.

I associate the word with the Kennedy administration. It seemed that government higher-ups were always talking about “viable options.” Today we would call them “sustainable options.” For example, today’s Inside Higher Ed (here) quotes someone saying of a colleague, “the demands of both his role here and his elected position are not sustainable.” He means that the colleague can’t fulfill the demands of both roles. Or to put it in the language of 1965, continuing in both roles is not a viable option.

Perhaps “nuance” will no longer be viable. It will see the fading of its cachet, and I will look back and wonder why I didn’t sell my Nuance shares as soon as I heard Kieran present that paper.

 ------------------------
* The title was the first slide in Kieran’s presentation, and it remained on the screen as Kieran took care of technical matters at the podium. Then he clicked to the second slide, which, if memory serves, was “No, seriously. Fuck it.”

Being the Ricardos — Who’s Gaslighting Who?

January 17, 2022
Posted by Jay Livingston

The most frequently viewed post on this blog by far is the original one about language anachronisms on “The Marvelous Mrs. Maisel” (here). Google “Maisel anachronism,” and this blog will be near the top of the list. “Mad Men” too sometimes dotted the 1950s landscape with twentieth-century language. (Blog posts are here and here).

This month, television once again took us back to the 1950s with “Being the Ricardos,” and once again the script has language that sounds much too new. We’re not talking about TV fluff where historical precision hardly matters — sitcoms like “The Goldbergs,” set in the 1980s but with a writers’ room stocked with writers who in that decade were barely toddling. But “Being the Ricardos,” written and directed by Aaron Sorkin, asks to be taken seriously, and Sorkin has a great ear for dialogue.

Yet he gives us this moment in the writers’ room when Madelyn, one of the writers, has suggested a story line that involves Ricky cheating on his taxes. Desi, who is honest about his own taxes, says that his TV character too would never cheat on his taxes. Another writer, Bob, thinks the cheating plot element would work. “It’s very relatable. Everyone does.”

In 1955, things were not “relatable.”  

A few seconds later, Sorkin’s script has this:

Her process? People did not have processes in the 1950s. They just did things the way they did them. They weren’t relatable either. Here are the graphs from Google nGrams, which tallies the frequency of words in books. Both these terms come into wide use only well after the 1950s. True, it takes time for a trendy word to go from everyday talk to a published book, but the lag time is not forty years.

Then there’s gaslighting. Gas as a way of lighting streets and rooms came in around 1800, and that was the gaslight referred to in the 1944 movie, which was set in the late 19th century. Gaslight was a noun. The current usage — as a verb meaning to try to make someone doubt their own true perceptions — didn’t appear until the 21st century.

In “Being the Ricardos,” although Desi does not cheat on his taxes, he may be doing another kind of cheating. Lucy suspects, Desi denies and suggests that she is unreasonably suspicious, that the problem is in her mind.

In the 1950s, people talked about lying and cheating. Men might say their wives were imagining things, might even suggest that they see a psychiatrist, and wives might see all that as a baseless ploy. But nobody called it gaslighting.

Is Sorkin trying to get us to think that 65 years ago people talked about their process and whether something was relatable? Is Sorkin gaslighting us?



When Less Is More . . . More Correct

August 12, 2021
Posted by Jay Livingston

“Fewer than one in four people who are pregnant are vaccinated,” said Noel King on NPR this morning. Fewer? Really? Why not less?

I suppose that Ms. King or whoever wrote the script was thinking of the individual people. Grammarly.com, an online source, offers the same prescription.

To decide whether to use fewer or less with a percentage, you will have to look at the bigger picture and ask yourself, “What is this a percentage of? Is it countable?”

Fewer than eight percent of the world’s people have blue eyes.

Although counting the world’s people would be an unenviable task, it is possible to count individual people. Therefore, eight percent of the world’s people is countable and we use the word fewer.

Even if you think you should use fewer when talking about separate, countable things, the NPR lede makes no sense. The only number “fewer” than one in four is zero in four. That would mean that no pregnant people are vaccinated.

Dollars are countable, but we don’t talk about “people whose income is fewer than seventy thousand dollars a year.” The same goes for many other things. We don’t say, “I weigh eight pounds fewer than I did in March,” or “Stop for gas. We have fewer than two gallons left in the tank.”  

All of these statements — vaccinations, incomes, gas tanks — are not about individual people or things; they are about a level or rate. And when you are talking about levels, it makes more sense to use less.

In the NYT last month, the print edition had a story about Covid rates in counties where “fewer than 40 percent” of residents had been vaccinated. The online version corrected this to “under 40 percent.” I guess the copy editor didn’t have the confidence to change it to less.


I seem to be hearing this kind of fewer more and more. (I wish I had some actual data to show the trend, but I don’t.) Those contests from my childhood where you were asked to write something in “twenty-five words or less” would now be “twenty-five words or fewer.”

What’s wrong with less? My guess is that fewer sounds like what educated people say. Fewer is more sophisticated; less sounds so ordinary. It’s like using fortuitous rather than fortunate. The words sound alike, and in many instances, both could apply — things that are fortunate may also happen unexpectedly by chance. So why not use the one that sounds like something a person with a large vocabulary would say? Of course, I’m fighting a losing battle here. I expect that in a few years, if it hasn’t happened already, dictionaries will tell us that the meaning of fortuitous has now expanded to cover both. But to my ear, it’s like being served the salad course at dinner and asking someone to “pass the dressage.”


I’m So Excited

July 27, 2021
Posted by Jay Livingston

The French, says Julie Barlow (here), don’t show excitement. They don’t even have a word for it, or if they do, it’s not our word. Je suis excité implies arousal that is physical, not emotional. In France, it’s difficult to say you’re excited.

Not so in the US. Barlow quotes a bilingual American in France who says that the French can in fact express excitement. It’s just that most of the time they prefer not to.

The American public, he says, has been trained “to have a fake, almost cartoonish view on life, in which superficial excitement and false happiness are the norm.” By comparison, he notes, in France, “excitement is typically shown only when it is truly meant.”
      
Excitement is indeed the norm to the point that it looks like excitement inflation. Where people once might have been “glad” to do or say something, they are now excited. Three years ago, my university e-mail brought this message from IT:
The Web Services team, in collaboration with groups across the University, is very excited to announce the latest round of completed projects in support of our ongoing comprehensive redesign of the montclair.edu website.
The trend may be more noticeable to us older folks whose language still belongs to the era before excitement inflation. I doubt that anyone else who saw this e-mail wondered about the excitement sweeping through IT. Or maybe they just didn’t notice.

Each year, at our first college meeting, department chairs who have been lucky enough to get a line or two introduce their new faculty. When I did this in my last year as chair, introducing Tim Gorman, I began something like this:
I don’t know if you’ve been on a search committee and read applications lately, but one of the things that struck me this time was that most of the applicants say they’re excited. “I’m excited to be applying to Montclair State.” “I’m very excited to be applying for the position . . . “ A lot of them began like that.

And all I could think was that either these people lead very dull lives [this got some quiet laughter] or else they know something about this place that I, in my four decades here, have yet to discover. [more laughter, which is really all that I cared about]

So when I read Tim’s letter and it began, “I’m applying for the position” or something like that, I thought, now here is a man with reasonable sense of proportion.
  
I don’t have any systematic data on this inflation of excitement, but the laughter of the faculty at that meeting tells me that I was onto something.

Gender and the C-word

May 26, 2021
Posted by Jay Livingston


The C-word is back in the news thanks to a retweet by Rep. Marjorie Taylor Greene. Greene had likened vaccination and mask policies to the forced wearing of yellow stars by Jews under the Nazis. Republican House leader Kevin McCarthy had said that her comparison was wrong. A Greene supporter came to her defense on Twitter:


Look you moron, nobody supported Israel in their recent conflict with Hamas more than MTG. Her analogy may not have been perfect but you seriously need to get a grip you feckless c**t. Pelosi is the villain here.

@AsimplePatriot was alluding to Samantha Bee’s use of “feckless cunt” three years ago to describe Ivanka Trump. At the time, I blogged about the different ways that cunt is used in the US and the UK. For Brits, cunt does not explode the conversation the way it does here. In the US, uttering (or tweeting) cunt changes the question from who is a moron to who used that word.

What’s different this time is that an American used it to characterize a man (Kevin McCarthy). As I pointed out in the earlier post, Brits have been calling men cunts for at least a few decades. As an example, I embedded a Monty Python sketch from the 1960s. It’s still funny. (The post is here.)

We Americans have imported some Britishisms. One-off is now common, and I see gobsmacked coming through customs. But I doubt that @AsimplePatriot’s tweet signals the start of a trend towards making cunt less taboo and less gender-specific.*

-------
* I remember a dorm-room discussion long ago, mostly Jewish guys in the room, where someone characterized a girl as a schmuck. “You should never call a girl a schmuck,” said another guy, and after a pause added, “Unless she’s a real schmuck.”

Personal Needs and Public Morality

May 6, 2021
Posted by Jay Livingston
    

Victoria Eng updated the page [a GoFundMe for the victim] on Wednesday to say that her grandmother is recovering well after surgery.

“These Asian hate crimes need to stop,” she wrote. “San Francisco is my home and my Grandma's home. We need to feel safe where we live and not in constant fear.”

Over the last sixty years, we have largely abandoned the language of moral imperatives for the language of psychological well-being. We don’t say what should or must happen or what people ought to do. Instead, we talk about what is needed.

I have noted this change before (here and here, for example). But I couldn’t pass up the above quote in yesterday’s NPR news story.  It so perfectly uses both the new and the old sense of need.

The second need — “We need to feel safe. . .” — would have been as unremarkable in the 1960s as it is today. It’s about needs, specifically the needs of Asians.

But “These Asian hate crimes need to stop” is about morality. It says what should or must happen. But we no longer use words like should or must. We don’t tell people what they ought to do. “You ought to stop drinking so much.” That would be imposing an external morality, and morality is always about what’s good for others and for the society as a whole, not the individual. Instead we phrase it in terms of the person’s own needs; we tell them what is in their own interest and will benefit them. “You need to stop drinking so much.”

This use of need has expanded to the point that we now talk about the needs of Asian hate crimes.

Of course, the meaning of the sentence is clear. It’s Asians or the society as a whole that needs for these hate crimes to stop. But because of the change in language, we now phrase it in a way that syntactically makes no sense.

Trauma and Therapy-Speak

March 30, 2021
Posted by Jay Livingston

It’s nice to have your perceptions ratified so that you can stop asking yourself, “Is it just me that’s noticing this?”  Lately, it seemed that I was hearing more talk about trauma — and for some things that didn’t seem especially traumatic. Katy Waldman heard the same thing. “Around every corner, trauma, like the unwanted prize at the bottom of a cereal box. The trauma of puberty, of difference, of academia, of women's clothing.” Women’s clothing? Oh well, Waldman is a staff writer at The New Yorker and presumably more plugged in to the zeitgeist than I am. That sentence is from her article “The Rise of Therapy-Speak” (here).

Google nGrams confirms our suspicions. Mentions of both trauma and harm rose starting about 1970.

But trauma’s market share increased.

The important difference is that while both trauma and harm injure a person, trauma implies long-lasting psychological damage.  
                                        
Waldman can’t decide whether therapy-speak is really a recent development. The title of the article (“The Rise of . . .”) implies that it is, and she says that “the language of mental health is burgeoning.” But she also quotes a psychologist who tells her that “the language of the therapist’s office has long flooded popular culture.” I agree. The specific words that are in fashion come and go — trauma is on the rise, inferiority complex and midlife crisis are relics of the past — but the process remains the same. So does the criticism. Waldman takes aim at “therapy-speak”;* forty years ago the same target was “psychobabble.”

Psychotherapeutic discourse usually remains inside the gated city of the educated liberal elite. I imagine that on Fox News there’s about as much talk of “toxic” relationships or emotional “triggers” as there is of “mindfulness.” Those outside this world can find therapy-speak and its attendant world view annoying. Waldman speaks of “irritation that therapy-speak occasionally provokes,”

the words suggest a sort of woke posturing, a theatrical deference to norms of kindness, and they also show how the language of suffering often finds its way into the mouths of those who suffer least.

Therapy-speakers are annoying partly because they are parading their self-absorption. As Lee Rainwater said a half-century ago, “the soul-searching of middle class adolescents and adults,” when compared with the problems of the poor, “seems like a kind of conspicuous consumption of psychic riches.” Nobody likes a show-off.

In one important way, trauma talk is different from earlier therapy-speak. Among the people Waldman is writing about and their counterparts in earlier generations (those who suffer least), therapists, neuroses, depression, anxieties, etc. have long been part of the conversation. These are, after all, the people who went to Woody Allen films. The trauma frame shifts the focus to some external source. To some extent that has always been true of psychoanalytic ideas, with their emphasis on childhood experiences with parents. But calling it trauma puts it in the same bin as the post-traumatic stress disorder suffered by soldiers who have been in combat. Besides magnifying the harm of these more mundane forms of suffering, it also implies that the harm was done by others, whether by intent or inadvertently. Imagine if Philip Larkin had written, “They traumatize you, your mum and dad.”

-----------------
* I thought that “therapy-speak” might be Waldman’s own coinage. An Internet search turned up only one instance of this term, in a 2019 article at Slate.
                                                                                

We Didn’t Talk About Healing and Unity in the 60s. Why now?

January 22, 2021
Posted by Jay Livingston

Now that the inauguration has finally settled the question of who is president, the calls for “unity” and “healing” will probably taper off. But for a while, you couldn’t turn on the TV without hearing those words. I wouldn’t have been surprised if, after the debacle also known as the Steelers-Browns playoff game, Mike Tomlin had said that the Steelers needed a time for healing.

In past times of national division, healing and unity were not part of the political discourse. They have become popular only recently, sort of like Liam and Olivia. In the 1960s, nobody named their kid Liam or Olivia. The 1960s was also, you may recall, a period of political conflict and division over civil rights and the war in Vietnam. Riots in the cities, assassinations of political leaders, killings and terrorism by White supremacists who were sometimes also cops and sheriffs. And yet, there wasn’t a lot of talk about healing and unity.

(Click on an image for a larger view.)

Unity actually declines in the sixties. Healing is just beginning its rise, and I suspect that much of the healing talk in those books was about personal rather than political healing. The crossover into politics does begin in the sixties, but the rise was nothing like what happened a quarter-century later.

Google nGrams, the source of the above graphs, counts words in books, so it lags behind the actual change in fashions. For something more up-to-the-minute I tried the Nexis-Uni tally of words in news publications. The graphs I could get quickly are not as nuanced, not as granular (speaking of fashionable words), but they show the same trends. The concern with healing a divided nation doesn’t set in until very late in the 20th century.

Why were we not talking about unity in the 1960s? My guess is that the difference between then and now is that although the nation was divided, it was not polarized. Certainly, the two major parties were not as polarized. The news media were also more concentrated, less divided. The most trusted man in America was a TV news anchor, something unimaginable today.

As for healing, its popularity is part of the more general diffusion of the language of psychotherapy into all areas of life, including politics. The therapy-based “issues,” as in “he has commitment issues,” has replaced the more secular “problems.” Decades ago, if I said, “Houston, we have an issue,” I would get a smile of recognition; the actual line, of course, is “Houston, we have a problem.” Now, most people would think it was an accurate quote. We also talk about what someone “needs” to do rather than what they “should” do — the therapy language of personal needs replacing the morality language of right and wrong.* It’s a tribute to what might be called the triumph of the therapeutic that in a time when an actual disease has killed hundreds of thousands of Americans and infected millions, our talk of healing is all about politics.

---------------

* I’ve said this before in somewhat greater detail in earlier posts: here (Mad Men — Language Ahead of Its Time) and here (Needs — One More Time).




Ring in the New — With a Bit of the Old

January 5, 2021
Posted by Jay Livingston

Last week, I learned a new word, skeuomorphism — elements of a new technology that look like those of the old technology, elements that are now unnecessary. Virtual sticky notes for your computer screen or a push-button phone that looks like a rotary phone. The examples I found were all in design. But maybe skeuomorphism can apply to language as well.

Many of the stories about Trump’s recent phone call to Georgia referred to “tape.”

But there was no actual tape.

The call was recorded on some sort of electronic drive.

Radio journalists too use this term. They talk about “getting good tape” — audio quotes that will sound good. A 2019 Columbia Journalism Review article had the title “For the record: 18 journalists on how—or whether—they use tape recorders.” Some of those journalists refer to “recording” on a “device” or iPhone. But some use “tape” to mean 21st-century recording.

Is there a different word for this — using a word from an old technology even though the technology itself is no longer in use? And are there any other examples? I can think of only one. Musicians still sometimes talk about making “a record.” Not a recording, not an album, but a record.

Does  “cc” qualify? People still say, “I’ll cc you on that memo,” knowing full well that “cc” is an abbreviation for “carbon copy” and that there is no carbon paper involved. But “cc” has become a stand-alone term, now free of its finger-smudging origins. Nobody says, “Send me a carbon copy.”

The YouTube logo is an example of skeuomorphism in both design and language.


The image is shaped like the screen of an old television set, not a modern flat-screen. Those old TVs also used cathode-ray tubes. By the time YouTube came along (2005), nearly all TVs were flat-screens with square corners. And gone were the days when you might hear someone refer to “the tube” (or “the boob tube”). Yet YouTube chose to retain both the word and the image of the old technology.

The Lack of a Need for Unclear Negative Writing Can Never Be Underestimated

January 4, 2021
Posted by Jay Livingston

Twelve years ago in this blog, I wrote a post with the title, “Accentuate the Positive; Eliminate the Negative.”*  I was borrowing the title of a hit song of the 1940s. The post, unlike the song, was about language and writing, not general philosophical outlook. My point was simply that the more negatives a writer loads into a sentence, the harder it is for readers, and sometimes the writer, to understand the meaning of that sentence.

Those negative words include more than just no, not, and never. My made-up example was a parody of those newspaper summaries of Supreme Court decisions. “The court failed to overturn a lower-court ruling that denied a request to reverse . . .”

Do real writers write like that? Yes, sometimes to the point that they lose track of their multiple negations and write something whose literal meaning is the opposite of what they intended. Here’s economist Noah Smith at Bloomberg a few days ago (here). He was arguing that Texas, if it is to succeed in its effort to become a technology hub, will have to do something about “noncompete” clauses that prevent employees from leaving one firm and taking a job with a competing firm.

Banning noncompetes would be inconsistent with Texas’ principles and reputation as a defender of free markets. Noncompete agreements are restrictions on the free movement of labor; they gum up markets.


Banning, noncompetes, inconsistent. Smith nails the triple.

Smith is an economist (and a Texan). He likes free markets and competition. If noncompetes gum up markets, then banning them would free up those markets. So banning them would be consistent with principles of free markets. What he meant was, “Noncompetes are restrictions on the free movement of labor; they gum up markets. Banning them would be consistent with Texas’ principles of free markets.”

------------------------------------
*Other examples noted in this blog include a Financial Times piece by someone  “widely considered to be one of the world's most influential writers on economics” (“No, No, a Thousand Times No”) and even the GSS ( “The Power of Positive Phrasing”).

Doctor? My Eye.

December 14, 2020
Posted by Jay Livingston

“Madame First Lady — Mrs. Biden — Jill — kiddo: a bit of advice... Any chance you might drop the ‘Dr.’ before your name? ‘Dr. Jill Biden’ sounds and feels fraudulent, not to say a touch comic.” So begins Joseph Epstein’s WSJ op-ed of this weekend “Is There a Doctor in the White House? Not if You Need an M.D.”
   
Many on the left got upset. They disliked the tone. Smugness has long been a chronic, perhaps unavoidable, flaw among right-wing intellectuals, and Epstein is not as bad as most of them on that score. His opening descent down the ladder of formality of terms of address seems more friendly than condescending. The article is, after all, about what to call the First Lady, and he’s trying out several possibilities.  Even so, he seems to be trying to trigger the libs, and triggered they were.

Some people accused Epstein of sexism. You wouldn’t have written that if it were a man — that sort of thing. Counterfactuals like this are hard to prove, but the critics may be right. Epstein’s main argument against “Dr.” Biden is that this honorific should be reserved for medical doctors (“A wise man once said that no one should call himself ‘Dr.’ unless he has delivered a child.”). Jill Biden has no medical degree. She has an Ed.D.

However, six years ago, Epstein wrote, “One teacher I do remember fondly was Dr. Branz, a German émigré who taught a course called Commercial Law. He must have been a refugee from Hitler, with a doctor of laws degree...” Yes, Dr. Branz, a law professor. I doubt that this juris doctor had ever moonlighted as an obstetrician.

The WSJ has no objections to using “Dr.” for non-physicians in the White House — among others, Dr. Condoleezza Rice and of course non-M.D. Henry Kissinger, who insisted on being called “Dr. Kissinger.” As far as I know, Epstein never gave Rice or Kissinger the same friendly advice he’s offering Jill Biden.
                                                                
Why shouldn’t we use the same honorific for advanced degrees both medical and academic? Is it confusing? Or does calling our teachers “doctor” cheapen the value of medical doctors? Epstein implies that it’s both. Equating physicians and professors does not fit with a value system that accords teachers much less prestige than they might have in other cultures.

Once long ago, I taught English for one semester in a high school in a small town (pop. 3000) in Japan. My students addressed me as Jay-sensei, sensei being the Japanese word for teacher. I lived with a Japanese family. One day, I had some stomach problems. My family insisted that I go to the doctor — Kimura-sensei. Hmm, I thought, we call our teachers “doctor”; the Japanese call their doctors “teacher.” A commentary on their values?

Doctor originally referred to theologians,  explainers of doctrine — closer to teachers than to physicians. Dr. Webster explains:
<table align="center" border="1" cellpadding="10" nbsp="" style="width: 450px;"><tbody>
<tr><td bgcolor="#F8FCFC" style="text-align: left;">
The word doctor comes from the Latin word for “teacher,” itself from docēre, meaning “to teach.”

The 14th century was the birth of the Renaissance, and lots of teaching and learning was afoot. By the century's end, the word doctor was being applied not just to a select few theologians, but also to qualified and/or accomplished academics and medical practitioners.
</td></tr>
</tbody></table>

I don’t know the history of sensei. Maybe in Japan, as we speak, some Epsteinian curmudgeon is complaining about all these physicians who insist on being called sensei even though they have never taught even one student how to interpret a multiple regression.

Can “Up” Make Masks Masculine?

November 6, 2020
Posted by Jay Livingston

A lot of people in this country still refuse to wear masks. In most places, no laws require masks, so anti-maskers will have to be persuaded. But how, especially now that those same people associate masks with weakness and femininity? Appeals to altruism run up against American individualism. As the subway rider in New York said this week when other riders repeatedly asked him to wear a mask and even held out masks for him to take, “I live by my principles. . . I don’t wear a muzzle.” (The full story is here.)

One strategy that seems to have caught on is “Mask Up.”

(Click on the image for a larger view.)

These campaigns remind me of “Don’t Mess With Texas.” Now it’s an all-purpose slogan, but it originated in 1985 as a campaign to get “Bubba,” the stereotypical Texas truck-driving male, to stop tossing beer cans and other litter onto the highway. Highway beautification had the same problems as masks. It required that the individual inconvenience himself for the sake of a goal that benefited only the general society, not himself, and in a way that was not immediately visible. In addition, the goal of highway beautification reeked of flowers and femininity.

A slogan like “Let’s Keep Our Roads Beautiful” wasn’t going to cut it. But “Don’t Mess With Texas,” with TV ads featuring Dallas Cowboys linemen, combined masculine toughness with state chauvinism.*

Covid is a far more serious problem than litter, but the strategy is the same — masculinity and local patriotism. The pugnacious “Philly Never Backs Down. Mask Up” seems too similar to “Don’t Mess With Texas” to have been a coincidence. But it’s the “up” that I find most interesting. “Mask Up,” to my ear at least, sounds more masculine than “Wear a mask.” I’m not sure why. Maybe the “up” implies a bold action, like an athlete suiting up for the big game, a game for which he is also amped up, revved up, and even juiced up.

“Listen up,” says the coach in the locker room just before he gives his speech to get the team psyched up. To “listen up” is more active than to merely “listen.”

The Texas campaign reduced highway litter by 72%. Unfortunately, I don’t know any research showing the effect of “Mask Up.” 

-------------------
* See my 2009 post Lone Star Litter and Values.



Watching Your Language — Gerunds and the Fantasy Echo

August 27, 2020
Posted by Jay Livingston

Gabriel Rossman has a very funny Twitter thread today detailing the mistakes he found when he reviewed the transcription of his lectures made by UCLA software (Kaltura). One lecture included a reference to the Trojan War and the Greek warrior Diomedes.


Similar human mis-hearings (officially “mondegreens”) are so common in rock music that they fill countless webpages. Many of these mondegreens — e.g., “’Scuse me while I kiss this guy”  — make perfect sense. So does diabetes. They’re just out of place.

But Kaltura also transcribed “emergent from norms of gerontocracy” as “emergent from ruins of gerund talk receipt,” which makes no sense.

But is Kaltura so much more ignorant than the students? How many undergraduates would recognize the name Diomedes? And how many would have a good idea of just what “norms of gerontocracy” are? Or even what a gerontocracy is?

I was reminded of a story from my undergraduate days* — so we’re going way back before transcription software or, to be honest, 8-track tape. The poet Allen Grossman was grading the final exam of his course on (I think) modern poetry, modernism defined as beginning around 1890. In one of the first blue books he read through, he was struck by the phrase “fantasy echo.” What a striking coinage for an undergrad to come up with, and yet it captured the feel of some early modern poetry.

But then the same phrase appeared in the essay of another student and then another. They couldn’t all have separately invented the same unusual metaphor. He thought back over the readings and his lectures. No fantasy echoes there. But then he realized that he had spoken frequently about the fin-de-siècle, and he had given the term his best French pronunciation. I don’t know whether in subsequent semesters he resigned himself to “turn of the century.”

And now I can’t stop thinking about “ruins of gerund talk receipt.” I hear a fantasy echo of grumbling, of the crumpling of a receipt, strewn on the ground by a language student who has passed the orals, or at least the part on gerunds.

------------------
*The story has become something of an urban legend, ascribed to various teachers on various campuses. The OG prof may well be George Mosse, with the phrase cropping up in a history course he taught at the University of Wisconsin in 1964. (See here.) The Brandeis version I heard dates back to roughly the same time, so that’s my narrative (comme on dit), and I’m sticking to it.

“Black People” or “The Black People”

May 31, 2020
Posted by Jay Livingston

There’s a subtle but important difference between talking about “Black people” and talking about “the Black people.” Here’s Trump yesterday. (I’m using Sarah Cooper’s version because she’s physically so much more expressive than Trump playing his invisible accordion.)



“By the way, they love African American people. They love Black people. MAGA loves the Black people.”

Does anybody really believe that Trump was being sincere? Or accurate? Does team MAGA love “the Black people”? The definite article, that the, gives him away.

During the 2016 campaign, when some suggested that Trump presidency would not be good for women, Trump said, “I’d be phenomenal to the women.”

At the time, I wondered how “I’d be phenomenal to the women” is different from just “I’d be phenomenal to women.” The blog post (here) continued:

when you add “the” to a demographic group and speak of “the women” or “the Blacks,” you are separating them from the rest of society. Without the definite article, they are included. To say, “In our society we have Blacks, Jews, women. . . .” implies that they are all part of our group. But, “We have the Blacks, the Jews, the women. . . .” turns them into separate, distinct groups that are not part of a unified whole.

This construction using the definite article fits well with the MAGA notion that America is their country. In their view, they are, as Sarah Palin put it, “the real America.” Republicans, when they are out of office, talk a lot about “taking back our country,” as though the Democratic party were a bunch of foreign usurpers. (See this post from when Obama was in office and running for a second term.) Now that they have taken back the country, they may allow others — the Blacks, the women, and others — to live in it.

Brides and Names, New York Times Edition

February 20, 2020
Posted by Jay Livingston

“She’s keeping her name,” a friend said the other day. We were talking about a girl we know who got married last year. Is that still a thing, I wondered, keeping your name. What I really meant was: how much of a thing is it? Then I remembered Wedding Crunchers, the corpus of all words in New York Times wedding announcements — sort of like Google nGrams but with a much narrower focus and far fewer filters for researchers. 

Unfortunately, the database goes back only to 1981, so we can’t know when the name-keeping trend started. It was underway by the eighties. By 2000, more than 20% of Times brides announced that they were keeping their names, so many that several of those who were changing their names felt it necessary to proclaim their traditionalism in the announcement.

(Click for a larger view.)

I’m not sure what happened in 2015. Maybe that was the year that the Times instituted the current policy, which finesses the politically tinged proclamations of the keepers and the changers. Instead, the Times puts the maiden name in the headline and the married name in the text. Finding out who’s keeping and who’s changing requires a closer reading, but those who are interested will figure it out.
Here are two weddings from Sunday’s paper. (I edited out the photos to save space.)

(Click for a larger and clearer view.)

Adrienne is becoming Mrs. Adams. Elle will remain Ms. O’Sullivan.

There’s another change in the language, though you have to go back to the eighties to see it. Adrienne graduated from UNM, Elle from UCSB. In fact, all brides and grooms these days “graduate from” their schools. But in the old days, a student “was graduated from” the school. The Times, and many of the people whose wedding announcements it accepted, were traditionalists.


Even as late as 1980, nearly 60% of the wedding announcements included someone who “was graduated from” a school.*

The wedding announcements in the New York Times are hardly a representative sample of anything. But they do offer a glimpse into the world of the elite. For more on that, see Todd Schneider’s excellent post from 2013. As for those at the other end of the social spectrum, graduating from college is not so much an issue, and as marriage rates decline, neither are wedding announcements in the newspaper, nor is the question of whose name to use.

--------------------------
*Nowadays, you sometimes hear, “I graduated college in 2015,” much to the dismay of language prescriptivists, who insist that the correct expression is, “I graduated from college.” They don’t realize that their prescriptivist counterparts of 150 years ago would have been just as appalled and in despair for the language because people were not saying, “I was graduated from college.”