Mrs. Maisel Gets One Right

December 26, 2019
Posted by Jay Livingston

Since the new season of Mrs. Maisel dropped not long ago, my post from nearly two years ago about its language anachronisms (here) has been getting some action. It’s still the most frequently viewed and commented-on item on this blog, and some of the newer comments made it clear that the anachronisms were still in bloom.

I watched the first couple of episodes recently, and sure enough, in Episode 2, “It’s the Sixties, Man!” we got contextualize.



In a blog post (here) about trendy words at the ASA meetings, I  cited contextualize, but that was a report from academia in 2013, not New York family life in the early 1960s.

To the show’s credit, it did have a period-perfect language moment. Joel has been speaking with the older Chinese couple who own the Chinatown space he has rented, planning to turn it into a club. He discovers that the space includes a Chinese gambling parlor. Worried about trouble from the police, he meets with the owners. After some back-and-forth to deal with Joel’s fears, the couple — through their translator,* the young med-student Mei — allude to a murder, significantly raising Joel’s level of anxiety.

After the Chinese couple leave, Joel is discussing the matter with Mei. “What about the murder?” he asks.


Talk of the “put-on” and “putting you on” came on the scene in the late 1950s, starting among young, hip people like Mei and eventually spreading throughout the society. I thought that its use had declined in the 21st century, but Google nGrams shows that at least until 2000, it was still found in books.


Still, my impression is that we rarely refer to “putting people on” these days. But what has replaced it?

--------------
* Another anachronism, for anyone still keeping score — a language anachronism of sorts: the owners are speaking Mandarin. In the early 1960s, the language spoken in Chinatown was Cantonese. Immigration from Mandarin-speaking areas of China did not begin until the 1970s at the earliest.

Acting and Reacting as an Agent of Culture — Moi?

December 21, 2019
Posted by Jay Livingston

A long time ago I heard Margaret Mead speak, and one of the only things I remember her saying was this: “A society has to teach its cultural ideas to its children in terms so clear that even a social scientist can understand them.”

I am, allegedly, a social scientist, but only an encounter with something very unusual can jar me into seeing my own culture. Like most people, I usually take it for granted, like the air we breathe. That was the point of the previous post, where a psychologist was urging dog owners to give their dogs more choice. It took this extending of human culture to canines to remind me of the great emphasis American culture gives to individual independence and choice. All those times that I had heard parents, me included, ask their kids if they wanted Cheerios or Chex, it had never occurred to me that we were drilling a particular American value into the minds of our little tots. I thought we were just being parents.

I had a similar cultural-blindness experience a few years ago. A student born and raised in Turkey came for his obligatory meeting with his advisor — me. He was a grown man in his forties. “What courses I should take?” he asked. I explained about the core requirements and recommended he take the first in the sequence. “And then there are the electives,” I said and showed him the list.

“Which courses I should take?”

I explained that these were electives. He could take any of the ones we were offering that semester. If you’re interested in family, you could take that course, I said. If you’re interested in religion, we have that elective.

“Yes, but which ones I should take.”

I found it incredibly frustrating. What was so complicated about the concept of electives? It did not occur to me that our differences were cultural. I was so thoroughly an American that I could not imagine anyone rejecting the freedom to make their own choice of courses. Who would not seize that opportunity? Only someone who did not understand.

In retrospect, I now think that he did in fact understand. He just didn’t think it was such a great idea that the choice should be made by him rather than by a professor — department chair no less — who knew much more about the courses and the instructors. Maybe he was right.

There’s something else to be said for his approach. It creates a personal link between the advisor and the student in a way that the independent-choice model conveniently avoids. When he was asking me to choose courses for him, the thought crossed my mind that I could tell him to sign up for some of the low-enrolled courses that were in danger of being cancelled — courses students were avoiding because of the reputation of the course, the instructor, or both. That certainly would have made things easier for me as department chair. But I now felt that I had to look out for his best interests as well. I felt an obligation that was different and perhaps stronger than what I would feel towards other students.

As I say, when all this was happening, I didn’t think about the underlying cultural differences. I just felt uncomfortable. I will leave for another post the time when he presented me with a large assortment of almonds, figs, pistachios, etc., while I tried to explain to him the university rules about gifts.

Raise Your Dog to be an American

December 19, 2019
Posted by Jay Livingston

My local online website WestSideRag today ran an article with the title “Barnard Researcher Has Studied The Minds of Upper West Side Dogs, and They’re Way More Complicated Than You Think.”

I don’t have a dog, but I started reading.* And as I did, I saw that while the article was about dogs, it was more importantly a document about American culture, particularly our values and beliefs about Choice. We value individual choice as inherently good. We also believe that choice is beneficial and that denying people the freedom to choose will in some way harm them. So we insist that people make their own choices.

Recognizing the wonderfulness of choice is not something that comes naturally. You have to be carefully taught. And it’s never too early to start. It’s just that most of the time, we don’t think that we are hammering American cultural ideas into our kids’ psyches. We just think we’re raising them the right way.

In My Freshman Year, an ethnography of an American university, Rebekah Nathan** includes a chapter (“As Others See Us”) about the perceptions of the foreign students. A Korean student tells her:

Everything here is: “What do you want?” “What do you think?” “What do you like?” Even little children have preferences and interests in this country. I hear parents in restaurants. They ask a three-year-old child, “Do you want French fries or potato chips?” Every little kid in this country can tell you, “I like green beans but not spinach, I like vanilla but not chocolate, and my favorite color is blue.”

If we think it’s good for three-year-olds to make their own choices, why not dogs?

All dog owners should allow their dog to make certain choices, according to Horowitz, who strongly believes that giving dogs choices increases their welfare. . . . Owners should “allow the dog to make their own choice as opposed to your definition of the walk.” She recognizes that people want to feel in control, but points out “what we are in control of is to let the dog have a life in which the dog is partly choosing. This is something we want to give to anyone we love.”

WestSideRag has a relatively small readership — we’re not talking Slate.com — and an article extending our ideas about choice to dogs is extreme. But often the extreme case can call attention to the less extreme versions that are widely taken for granted and unnoticed. In America, even those with a more authoritarian outlook find it hard to refute arguments based on the idea of choice. It’s not just liberals who ask their kids what kind of cereal they want. 

-------------------------
* What originally drew me to the article was the opening paragraph, which contained a pun that I am nearly certain was unintended.


** “Rebekah Nathan” is a nom de plume. The author, Cathy Small, probably wanted to remain anonymous since she was writing about the school where she teaches. The ruse did not work for very long.

“Real” Disney Princesses

December 17, 2019
Posted by Jay Livingston

What do you do if you’re a good feminist parent — you want kids to see genders as similar, not exaggeratedly different — and your daughters keep watching those Disney princess movies?

If you’re Philip Cohen, you start blogging about gender dimorphism to call attention to how unrealistically the Disney characters are drawn. The blog posts have titles like “Disney’s dimorphism, ‘Help! My eyeball is bigger than my wrist!’ edition.” You link to data about the sizes of hands, wrists, necks, etc. among real men and women.*

A post gets picked up at more widely read sites like Slate, and you get comments complaining that “The less realistic the proportions, the more endearing and charming we find the character. The closer to realistic they are, the creepier/blander they can become.”

Maybe you wonder: What if someone Photoshopped the Disney characters to make them look more like real people? Well, someone has. Jirka Vinse Jonatan Väätäinen, a graphic designer in Finland (not too far from “Frozen” territory), has shrunk those princess eyeballs, enlarged those princess wrists and necks, and posted the results on his website (here). See if you can tell which is which.



-------------------
*You can find Philip’s dimorphism posts at his Family Inequality blog here.

Sacred Interiors — Full and Empty

December 17, 2019
Posted by Jay Livingston

Different Games, Different Rules: Why Americans and Japanese Misunderstand Each Other, by Haru Yamada, has been sitting on my shelf for a while, but I just now started reading it and came across this:

This contrast between the Judaeo-Christian and Buddhist viewpoints is probably what prompted my mother to look up at the domed ceiling of a Catholic church in Florence painted with cherubs and scenes of men and women in heaven and hell, and say, “I guess the idea is to fill your mind with sacred thoughts, not to empty it.”

Okaasan (Mom) nailed it.

(Ceiling of the Duomo in Florence — Brunelleschi’s dome, completed in 1436; the frescoes were painted in the late sixteenth century.
Click on an image to enlarge.)

 (Tenryuji in Kyoto. Originally built in 1339.)

Replication Complications

December 14, 2019
Posted by Jay Livingston

Some people can tell a joke. Others can’t. Same joke. One person has everyone laughing, the other gets zilch. Does the null response mean that the joke isn’t funny?

 What we have here is a failure to replicate.

A couple of days ago, the psychology preprint archive PsyArXiv posted results showing a failure to replicate an experiment on Terror Management Theory (TMT).* Among the possible reasons for this failure, the authors say,

There was substantial nuance required in implementing a successful TMT study. . . . These nuances include how the experimenter delivers the experimental script (tone, manner). . .

I offered this same idea five years ago. I didn’t use the term “nuance.” Instead, I speculated that some experimenters knew how to “sell it” — “it” in this case being the basic manipulation or deception in the experimental set-up. You can read the whole post (here), but here’s a somewhat shorter replication. I’m copy-and-pasting it because, as we get more results from replication studies, it’s still relevant. Also, I liked it.

*     *     *     *


One of the seminal experiments in cognitive dissonance is the one-dollar-twenty-dollar lie, more widely known as Festinger and Carlsmith, 1959. Carlsmith was J. Merrill Carlsmith. The name itself seems like something from central casting, and so did the man — a mild-mannered WASP who prepped at Andover, etc. Aronson is Eliot Aronson, one of the godfathers of social psychology, a Jewish kid from Revere, a decidedly non-preppy city just north of Boston.

In the experiment, the subject was given a boring task to do — taking spools out of a rack and then putting them back, again and again — while Carlsmith as experimenter stood there with a stopwatch. The next step was to convince the subject to help the experimenter. In his memoir, Not by Chance Alone, Aronson describes the scenario.

[Merrill] would explain that he was testing the hypothesis that people work faster if they are told in advance that the task is incredibly interesting than if they are told nothing and informed, “You were in the control condition. That is why you were told nothing.”

At this point Merrill would say that the guy who was supposed to give the ecstatic description to the next subject had just phoned in to say he couldn't make it. Merrill would beg the “control” subject to do him a favor and play the role, offering him a dollar (or twenty dollars) to do it. Once the subject agreed, Merrill was to give him the money and a sheet listing the main things to say praising the experiment and leave him alone for a few minutes to prepare.

But Carlsmith could not do a credible job. Subjects immediately became suspicious.

It was crystal clear why the subjects weren't buying it: He wasn't selling it. Leon [Festinger] said to me, “Train him.”


Sell it. If you’ve seen “American Hustle,” you might remember the scene where Rosenfeld (Christian Bale) is trying to show the FBI agent disguised as an Arab prince how to give a gift to the politician they are setting up.  (The relevant part starts at 0:12 and ends at about 0:38)



Here is the script:


Aronson had to do something similar, and he had the qualifications. As a teenager, he had worked at a Fascination booth on the boardwalk in Revere, Massachusetts, reeling off a spiel to draw strollers in to try their luck.

Walk right in, sit in, get a seat, get a ball. Play poker for a nickel. . . You get five rubber balls. You roll them nice and easy . . . Any three of a kind or better poker hand, and you are a winner. So walk in, sit in, play poker for a nickel. Five cents. Hey! There’s three jacks on table number 27. Payoff that lucky winner!

Twenty years later, he still had the knack, and he could impart it to others.

I gave Merrill a crash course in acting. “You don't simply say that the assistant hasn't shown up,” I said. “You fidget, you sweat, you pace up and down, you wring your hands, you convey to the subject that you are in real trouble here. And then, you act as if you just now got an idea. You look at the subject, and you brighten up. ‘You! You can do this for me. I can even pay you.’”

The deception worked, and the experiment worked. When asked to say how interesting the task was, the $1 subjects gave it higher ratings than did the $20 subjects. Less pay for lying, more attitude shift.

 The experiment is now part of the cognitive dissonance canon. Surely, others have tried to replicate it. Maybe some replications have not gotten similar results. But that does not mean we should toss cognitive dissonance out of the boat. The same may be true for TMT. It’s just that some experimenters are good at instilling terror, and others are not.

----------------------------------
* If you’ve never heard of TMT (I hadn’t), it’s basically the idea that if you get people to think about their own mortality, their attitudes toward themselves and their group will become more defensive. Of the twenty-one replications, a few got results that supported TMT, and a few got results that contradicted it. Most found no statistically significant or meaningful differences.

Here’s the set-up for the independent variable: The subjects in the Terror condition were asked to write about “the emotions they experienced when thinking about their own death, and about what would happen to their physical body as they were dying and once they were dead.” The non-Terror subjects were asked to write about the same things, but about watching television — e.g., what happens to your physical body when you watch TV. (I am not making this up.)

Methodological Trees and Forests

December 12, 2019
Posted by Jay Livingston

The units of analysis that researchers choose usually constrain the explanations they come up with. Measuring variables on individuals makes it harder to see the effects of larger units like neighborhoods.

For example, much research has found a correlation between female-headed households and crime. Most explanations for this correlation focus on the households, with much talk about the lack of role models or the quality of parent-child interaction. But these explanations are looking at individual trees and ignoring the forest. The better question is not “What are the effects of growing up in a single-parent home?” It’s “What are the effects of growing up in a neighborhood where half the households are headed by single mothers?”

In the early 1990s, I wrote a criminology textbook, and one of the things that differentiated it from others was that it took seriously the idea of neighborhoods and neighborhood-level variables.

That was then. But now, Christina Cross in a recent Times op-ed makes a similar argument. Research generally shows that it’s better for kids to grow up with two parents rather than one. That fits with our assumptions about “broken homes” even if we now call them “single-parent households.” But Cross’s research finds a crucial Black-White difference in the importance of this one dimension of family structure.

Looking at educational outcomes, she finds that White kids from two-parent families do much better than their single-parent counterparts. But for Black kids, the advantage of a two-parent home is not so great:

living in a single-mother family does not decrease the chances of on-time high school completion as significantly for black youths as for white youths. Conversely, living in a two-parent family does not increase the chances of finishing high school as much for black students as for their white peers.

 Why does a two-parent family have less impact among Blacks? Cross looks at two explanations. The first is that the effect of a very low-income neighborhood (“socioeconomically stressful environments”) is so great that it washes out most of the effect of the number of parents inside the home. For a kid growing up in an area with a high concentration of poverty, having a father at home might make a difference, but that difference will be relatively small, especially if the father is unemployed or working for poverty-level wages.

The other explanation is that having other relatives close by mitigates the impact of having only one parent in the home. Cross says that her data supports this idea, but the extended-family-network explanation is not nearly as powerful as the neighborhood-poverty explanation.

For policy-makers, what all this means is that the traditional conservative, individual-based solutions miss the point. Exhorting people to stay married (and providing costly government programs along the same lines) isn’t going to have much impact as long as we still have racially segregated neighborhoods with high levels of unemployment and poverty.

The message for researchers is similar: if you confine your thinking or your variables to individuals, you risk ignoring more important variables.

Whatever Happened to “Broken Homes”?

December 11, 2019
Posted by Jay Livingston

“Just think about the last time you heard someone use the term ‘broken home’ or ‘single-parent household’ to explain the misbehavior or misfortune of a person in your social circle.” That’s from a Times op-ed (here) by Christina Cross, a sociology post-doc at Harvard.

I think the last time I heard “broken homes” was before Ms. Cross was born. It’s so 1950s, with its judgmental pronouncement on families that didn’t look like “Ozzie and Harriet” or “Father Knows Best.” In the 70s, as more middle-class people were getting divorced, we needed a less value-laden term. Enter “single parent.”

(Google nGrams shows the frequency of words in books, which lag behind the media and everyday talk, so the change in the use of these terms probably happened a year or two earlier than the charts show.)


“Single-parent” is not as blatantly stigmatizing as “broken homes,” but when we hear it, we still think that something is wrong. The more important point that Ms.  Cross makes is that broken homes — the harmful outcomes they bring — may be much more consequential for Whites than for Blacks. I hope to get to that in a later post. But for now, I'll just point out that the sharp decline in mentions of “single-parent” starting in the early 90s tracks with the decline of teen crime and teen pregnancy in the same period.

Alumn. . .us / a / i / ae / x?

December 9, 2019
Posted by Jay Livingston

Wesley Yang, a writer by trade, must have been browsing messages from his alma mater, Rutgers, when this caught his attention. He posted it to Twitter.

(Click on an image for a larger view. I have added the blue oval for emphasis.)

English has a problem with words imported from Latin and the Romance languages. What do we do about the gender and number that in Latin require four different forms of the word — alumnus, alumna, alumni, alumnae? That’s just too many to keep track of, especially since the plurals, masculine and feminine, sound alike.

The simple solution is to use a single form of the word to cover everyone, and that’s what many people have done. For that single form, there are two choices. One is alum, plural alums, which conveniently gets rid of those gendered endings in the Latin versions.

The other is the all-purpose alumni, which many English speakers now use indiscriminately for either gender and for singular or plural. It must drive Latin scholars up the ivy-covered wall. Both of these have been getting more popular lately. I searched for them on Nexis-Uni’s database of news sources.


(Some of the increase in these numbers may be attributable to the increased size of the Nexis database. But I doubt that it has grown since the 1990s by the factor of 20 to 30 that we see for “an alum” and “an alumni.”)

A Google search for “she is an alumni of” gets 177,000 results, slightly fewer than the 191,000 for “she is an alumna of.” For men, “an alumnus” still outscores “an alumni” by a factor of 10, but that includes a lot of old sources. I would bet that the ratio decreases with time.

Personally, I would avoid the problem completely and go with graduates or, in less formal settings, grads.

The Rutgers Linguistics department has a different solution to the problem of gender: the very recent coinage alumnx, the word that inspired Yang to tweet. It’s a different sort of solution. The ungendered alum and the all-purpose alumni seem to have cropped up unplanned and without any ax to grind. But alumnx is a deliberate effort to change the language. What’s interesting here, as one of Yang’s commenters points out, is that as linguists the department members are descriptivists, more interested in describing how people actually use language than in telling people which words to use. But here they are prescribing alumnx as the correct way to avoid the less woke Latin forms.

My guess is that these x-words will have a short life. Most people don’t care much about the politics of speech and have little interest in changing. Remember “Freedom Fries”? Worse, alumnx and Latinx don’t resemble real words in English or in any other language.* It’s one thing to replace the “man” in policeman or fireman. “Police officer” and “fire fighter” were already part of the language well before the feminists of the 1960s called our attention to the sexism of the more frequently used terms.

Maybe I’m wrong. Maybe these x-words will become as much a part of the language as Ms. In any case, I expect that the members of the department will, as descriptivist language researchers, monitor how well they are doing as prescriptivist language changers. (Or is it changerx?)

------------------
* Latinx has similar problems. It’s not very popular among the people it is meant to designate, probably because neither English nor Spanish has words in this form. Terry Blas suggests (here) that a better gender-neutral solution would be an “e” rather than an “x” — Latine and Latines.

My So-Called War Crimes

December 3, 2019
Posted by Jay Livingston

When I first saw this item in my Google News feed, I thought it must be from The Onion or Andy Borowitz.

But no, it was real, from the Daily Beast, not a parody site. “If Donald Trump gets his wish, he’ll soon take the three convicted or accused war criminals he spared from consequence on the road as special guests in his re-election campaign.”

Why would war criminals be an asset for Trump? Trump’s base, of course, will not abandon him no matter what he does. But what is the gain? What virtues do these men embody that will pump up the enthusiasm and perhaps attract others?

First, let’s meet the war criminal who Trump has supported most strongly.

Chief Petty Officer Edward Gallagher and other Navy SEALs gave the young captive medical aid that day in Iraq in 2017, sedating him and cutting an airway in his throat to help him breathe. Then, without warning, according to colleagues, Chief Gallagher pulled a small hunting knife from a sheath and stabbed the sedated captive in the neck. A week later, Chief Gallagher sent a friend in California a text with a photo of himself with a knife in one hand, holding the captive up by the hair with the other. “Good story behind this, got him with my hunting knife,” he wrote. [NYT ]

Gallagher was acquitted of murder when another SEAL claimed to have killed the victim first, cutting off his breathing tube as an act of mercy. But apparently Trump and those who share his views think there’s something noble about knifing a dead man in the neck. Certainly Chief Gallagher thinks it’s something to brag about.

One SEAL sniper told investigators he heard a shot from Chief Gallagher’s position, then saw a schoolgirl in a flower-print hijab crumple to the ground. Another sniper reported hearing a shot from Chief Gallagher’s position, then seeing a man carrying a water jug fall, a red blotch spreading on his back. Neither episode was investigated and the fate of the civilians remains unknown.

The Times is being extremely careful here, saying that the fate of the civilians Gallagher shot “remains unknown.” But take a wild guess.

At trial, the SEALs were found not guilty.* But while the evidence may not have been sufficient for the military jury to convict the men, it was enough for the Navy to seek their removal from the SEALs. Trump intervened and forced the Secretary of the Navy to resign, effectively allowing Gallagher and the others to remain as SEALs.

What makes Gallagher so appealing to Trump, his followers, and Fox News? They deny the accusations, of course, but even if the men had been convicted, the case elicits ideas and emotions that are essential elements of much of the conservative worldview in the US these days.

To begin with, supporting the SEALs requires a strong sense of tribalism. This tribalism goes far beyond the “loyalty” Jonathan Haidt sees as a “moral foundation” of conservative thinking. Tribalism sees the world as Them against Us. We are under constant threat from Them. This view obviously pervades domestic politics, where Trump’s go-to strategy has been to claim that Democrats are out to get him.** In foreign policy, it means that anything We do to Them is justified. Anything. Trump has voiced his preferences for torture, cages for border-crossing children, alligator-filled moats, and nuclear bombs, all on the grounds that these protect America from its various enemies. The willful killing of civilians easily fits into the list.

Therefore, We must defend the actual people who carry out these actions — the protectors of the country (the military, border guards, ICE) and the protectors of the social order (police). In practice, this means that there are no restraints on what they can do to people who are not Us or to people who dissent from or threaten the social order. So long as soldiers and cops are ostensibly doing their job, anything goes.

This defense of those who protect our tribe has an obvious corollary: American autonomy, an innocuous term for the idea that we should not be involved in any relationship of mutual regulation or obligation with other countries. “World government” has long been a trigger for the right, and they remain suspicious of the UN and other international pacts. What are “war crimes,” after all, except the product of international law, a concept which to US conservatives is illegitimate? So on Fox News, the shooting of a little girl or the knifing of a wounded prisoner in the neck are “so-called war crimes.” The law that criminalizes these actions, in the America-first view, is not legitimate and is therefore null and void. No law, no crime.

Finally, the Trumpists see Gallagher as an example of the conservative ideal of masculinity. I have gone on too long already, and this really needs no further explanation. It’s enough to note that Trump refers to him admiringly as a “warrior.”
----------------------------------

*For most of the charges, the prosecution had no hard evidence, only the testimony of other SEALs. The court convicted Gallagher on one count — posing for a photo with the dead ISIS fighter. Kind of hard to ignore that photo he sent around. As for sniping at civilians, when Gallagher was shooting at the young girl, he didn’t take any selfies.

** This “out to get us” strategy was also part of Gallagher’s defense. His lawyers argued that the men who testified against him were motivated by a desire to get rid of him.