That Word Again — More Taboo, Less Taboo

April 5, 2019
Posted by Jay Livingston

The latest issue of the Hasbrouck Heights High School journal, The Pilot’s Log, reports on its student survey.
  • 98% of students polled hear or see the word used on a daily basis
  • 85% of those students say the word was used in a non-derogatory manner
  • 70% of students polled admit to using the word in a friendly manner
The word, of course, is nigger, or in the Pilot’s Log version “the N-word.”

OK, this survey isn’t the GSS. The editors make no claims for their sample (n = 160) as representative even of their school. As for Hasbrouck Heights, it’s an upper middle class suburb eight miles from New York City, median family income greater than $80,000. The high school students are mostly White, with some Hispanics and fewer Asians. Less than 4% are Black.

Still, the results, whatever they’re worth, suggest contrary motion. At the same time that nigger is becoming less acceptable and more deplorable, it is also changing its meaning and becoming more widely used and accepted in places where it was once largely unspoken.

In the world controlled by grown-ups, the word is basically taboo — powerful and dangerous. It must be treated with special circumspection. Steven Pinker told the Pilot’s Log that their survey results surprised him.

In the public sphere . . . the word is more taboo than ever. . . Writers have been excoriated for simply mentioning the word as a word, commenting on how it is used . . . I notice that not even you spell out the word . . . but use the euphemism “N-word” — that is an indicator of how taboo it is.

Note the important caveat Pinker starts with: “In the public sphere . . . .” He’s talking only about the world ruled by grown-ups, the world where even when Whites are not in control they are still within earshot. In private, of course, things are different. African Americans speaking among themselves do not accord nigger a sacred/taboo quality; maybe they never did. And now, among White kids as well, the word is apparently losing its strong overtones of denigration and hostility.

I would guess that the main cause of this change in usage among young Whites is hip-hop. The historical arc of rap resembles that of earlier Black music like the blues and R&B. Those too began as Black musicians speaking to Black audiences. Eventually, White folks listened in, especially White folks who wanted to be hip or cool. That’s true of rap as well. But rap, with its wordy and uncensored narratives, gives White listeners (and maybe Black listeners) the impression that this is how Black people really talk among themselves, or when they just don’t care what White people think.

As in the past, White people, especially young White people, are adopting the sounds and rhythms and moves of Black culture. Also its language. Not just new coinages (bro, 24/7). But some words that have been around for a long time are losing their White meaning and coming to be used the way they are used among Blacks. I’m not a linguist, but my guess is that dude and bitch fall into this category. Sixty years ago, a dude was a “city slicker” — a too-nicely dressed urban dandy, the guy who showed up at a “dude ranch.” Only among Blacks was it a generic term for men.*

Now, it seems that White kids are using nigger not with its White meaning — a nasty racist epithet — but with something more like its Black meaning. I noticed this five years ago, watching a bunch of middle-class White and Hispanic girls at a Sweet Sixteen party in the Bronx.

I was impressed watching these kids recite by heart the rapid-fire lyrics, and I realized they could do the same for lots of other rap hits. Those songs too have this same taboo word. Yet there they were, these sweet sixteen- and fifteen-year-old girls, rapping along with Jay-Z about their gang of niggas. (The full blog post is here.)

I expect there may be some conflict during this evolution, some people insisting that it’s wrong for certain people to use the word this way.** But from being on the losing side of language battles too often, I expect that political arguments about what’s right will be just as ineffective as my shouting, “It’s ‘for you and me,’ not ‘for you and I,’” at the people on television.

----------------------------
* In the early 1960s, I was shampooing rugs on site with a Black co-worker. The boss, for purposes of future sales, wanted information on the homes we went to. My co-worker, filling out the form later, asked me, “How many rooms did that dude have in his crib?”

**See the famous 1975 SNL sketch (here) with Chevy Chase and Richard Pryor, where a job interview morphs into a tense battle of racial epithets.

Is Jeanine Pirro Taking Orders From the Pope?

March 18, 2019
Posted by Jay Livingston

Jeanine Pirro’s comment about Ilhan Omar (D - MN) is a perfect example of the variation on Betteridge’s Law I offered a while ago (here).
Whenever the title of a book or article is phrased as a question, two things are almost certain:
  •     The author thinks that the answer to the question is “Yes.”
  •     The more accurate answer is “No.”
In this case, the question in question is not a headline but part of her commentary about Rep. Omar.

Think about it: Omar wears a hijab. Is her adherence to this Islamic doctrine indicative of her adherence to Sharia law, which in itself is antithetical to the United States Constitution?

It’s a cheap rhetorical trick. It lets you promote an idea without having any evidence. And if challenged, you can claim that you were not making an accusation, but merely asking a question.

This time it didn’t work. Even Fox News suspended Pirro, saying that they “strongly condemn” her comments, and that the comments “do not reflect those of the network.”

My other reaction to this incident is: how soon we forget.

Pirro is a practicing Catholic. She was nine years old when John F. Kennedy ran for president. At the time, some Protestants argued that if Kennedy were elected he would be “taking orders from the Pope.” It’s the same charge that Protestant ministers made in 1928 against the country’s first Catholic presidential candidate Al Smith. Catholics, so the argument went, cannot be true to their religion and still uphold the Constitution. They are under the control of nefarious non-American religious sources.

And now a Catholic commentator is making the same accusation against a Muslim that was made, in her lifetime, against her own co-religionists.

Danny Boy — Bill Evans

March 17, 2019
Posted by Jay Livingston

For a long time, I dismissed “Danny Boy” as a treacly song that was usually crushed under the weight of too much sentiment, especially on St. Patrick’s Day, especially by tenors. Yet never did I breathe its pure serene till I heard Bill Evans’s eleven-minute exploration of it.




The story I’ve heard (but haven’t fact-checked) is that after his bassist, the incredibly talented Scott LaFaro, died in an automobile accident in July 1961, Evans went into mourning, or at least stayed out of the studio. In April 1962, Evans went into the studio alone, sat down at the piano, recorded four tunes, and walked out.

For the first few choruses, he stays very close to the melody, first in B-flat, then B-natural (!), then F. Only in the fifth chorus, back in B-flat, does he improvise single-note lines.

For more on Evans, see the documentary “Time Remembered” (available at Amazon Prime), which is also the title of the album on which this take was eventually released. If you watch it, or maybe if you just listen to this track, you will understand why I keep a picture of him on my piano.




Scientific Management and Child Rearing

March 12, 2019
Posted by Jay Livingston

Listen to the opening minutes of the 2006 episode of This American Life that was rerun this past weekend. (If you don’t want to listen, a transcript is here. The clip ends with Ira saying, “and this is the way it was for decades.”)



The John Watson mentioned in this excerpt — the psychologist who thought that kissing your child more than once a year was “overkissing” — wasn’t just president of the American Psychological Association, he was the founder of behaviorism, which dominated academic psychology in the US for much of the twentieth century. Behaviorism focused on behavior. Thoughts, emotions, desires, personal attachment — these internal states were invisible, and behavioral psychology brushed them aside as unimportant or irrelevant.

Behaviorism in psychology was a close cousin of Taylorism in business. Both had little use for these human feelings. Frederick Taylor’s “scientific management” reduced work to a carefully controlled series of steps for workers to carry out with machine-like consistency. It was an inconvenient fact for Taylorism that the workers were people, not machines. They would just have to suppress those human qualities, at least on the job.

Behaviorists usually did their experiments on animals. The lives of these lab animals, as far as the experimenters were concerned, consisted entirely of learning — learning to get food, learning to avoid electric shocks. These subjects could not convincingly protest to the psychologists that their thoughts and emotions were being ignored.

It’s only a short step from the psych lab to the nursery, from white rats to children, and the idea that raising kids was, or should be, a matter of conditioning. In both settings, but especially with children, the person in charge was imposing order and control on what in nature tended to be messy. And in bringing order to this messiness, scientific rationality was better than relying unthinkingly on what seemed natural. Scheduled feeding was better than on-demand feeding, and scientifically produced formula was better than breast milk. It may seem natural to respond to your child physically and emotionally, and in fact evolution may have made that response part of human nature, but John Watson says, “Let your behavior always be objective and kindly firm. Never hug and kiss them, never let them sit in your lap.”

As I listened to the podcast, glad that the days of behaviorist conditioning had been left behind, I suddenly remembered “Ferberizing.” In 1985, Dr. Richard Ferber published Solve Your Child's Sleep Problems. Its centerpiece was a technique that would condition your child to sleep through the night. When the child cries, do you wake up and give comfort? Bad strategy. Instead, well, here’s Wikipedia’s summary of the Ferber approach:

1. At bedtime, leave the child in bed and leave the room. 

2. Return at progressively increasing intervals to comfort the baby (without picking him or her up). For example, on the first night, some scenarios call for returning first after three minutes, then after five minutes, and thereafter each ten minutes, until the baby is asleep. 

3. Each subsequent night, return at intervals longer than the night before. For example, the second night may call for returning first after five minutes, then after ten minutes, and thereafter each twelve minutes, until the baby is asleep.

It’s pure behaviorism. Instead of rewarding the undesirable behavior, the parents “extinguish” (as behaviorists say) the crying response.

You don’t ignore the child completely. It’s OK, even good, to stand in the doorway so that the poor kid doesn’t think you’ve abandoned them. But do not go into the room and comfort the child. Let them “cry it out.”
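The progressive-waiting schedule above is easy to state as a rule: lengthen the waits within each night, and start each night with a longer first wait than the night before. Here is a minimal sketch of that rule; the specific minute values and step size are hypothetical illustrations, not Ferber’s actual tables.

```python
def ferber_intervals(night, first_wait=3, step=2, checks=5):
    """Return the waiting intervals (in minutes) for one night of
    progressive waiting. Waits lengthen within the night, and each
    night begins with a longer first wait than the night before.
    The numbers here are illustrative, not Ferber's own tables."""
    start = first_wait + (night - 1) * step
    return [start + i * step for i in range(checks)]
```

On this toy schedule, night one yields waits of 3, 5, 7, 9, and 11 minutes, and night two begins at 5 minutes, matching the general shape of the Wikipedia summary.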

It works, said my fellow parents. The nights of crying it out had been followed by uninterrupted slumber for all. So my wife and I decided to try it. I cannot remember how old our son was at the time, nor do I remember the hour when he awoke crying, maybe around eleven p.m. We went to the door of his room. I checked my watch and mentally started counting down the three minutes. Our son, seeing us through the bars of his crib, cried even harder. And why not? The parents who he knew as reliable sources of comfort were now choosing to let him suffer.

After thirty seconds or so, I knew there was no way I could last three minutes. I turned to my wife. “Dr. Ferber is saying let him cry. Half a million years of evolution is saying go pick up the kid and hold him.”

And that was our one attempt at Ferberization.

Faith and Disaster

March 9, 2019
Posted by Jay Livingston

What do groups do when they are faced with strong evidence that their core beliefs are wrong? Ever since When Prophecy Fails (1957), we’ve known the answer. They try to spread the word, both to others, whom they try to convert, and to themselves, with greater demonstrations of their faith.

The phrase “acts of God” usually refers to natural disasters — floods, hurricanes, earthquakes, tornadoes. Yet after these events, no matter how devastating, people rarely give up their belief in God as a beneficent being.


When Prophecy Fails followed a group that believed that on a given date, the world would be destroyed but that aliens in flying saucers would come and rescue them. They were not unusual. Faith often is a belief in a distant and powerful figure who will save the group from disaster. If there are two such figures, the faiths can be combined.

The idea of a God-Trump alliance may be widespread among his Christian supporters. White House press secretary Sarah Sanders stated the idea explicitly: “I think God calls all of us to fill different roles at different times and I think that He wanted Donald Trump to become president, and that’s why he’s there.”

An act of God that caused great destruction and loss of life is not going to shake the faith of Alabama Christians. As for Trump, it’s possible that his administration will come through for Alabama. But even if FEMA fails to deliver the kind of relief Alabamians expect, and even if their lives do not improve during the Trump years, they will probably maintain their belief in his goodness and blame any misfortunes on others.

----------------

Previous posts include examples of failed prophecy among liberals, the NRA, economists, and a Trump supporter in the South who is young, Black, and gay.

Suicide and Well-Being. SOC 101, Week 1

March 3, 2019
Posted by Jay Livingston

I begin the semester with Durkheim’s idea of social facts, and I use his example of suicide rates. The rate may be made up of individual cases, but that rate takes on an existence that seems separate from those cases. It is more a property of the society or the specific group. Here are the numbers of suicides and the rate per 100,000 (age-adjusted) in New Jersey for the last four years (CDC):
2014   786 (8.3)
2015   789 (8.3)
2016   687 (7.2)
2017   795 (8.3)
In three of the four years, the numbers are nearly identical, differing by only 9 suicides in a population of over eight million. So it makes sense to think of the rate as something about the state, not about the individuals that make up that rate. Rates in the other 49 states, though they vary widely from state to state, show the same kind of stability. Each year the state produces roughly the same number of suicides.
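For students who want to see where the parenthesized numbers come from, the arithmetic of a rate is simple division. The sketch below uses a rough, assumed New Jersey population of 8.9 million; the CDC’s published figures are age-adjusted, so they differ slightly from this crude calculation.

```python
def rate_per_100k(deaths, population):
    """Crude rate per 100,000 population. (The CDC's published
    rates are age-adjusted, so they come out slightly different.)"""
    return deaths / population * 100_000

# Assumed, approximate NJ population, for illustration only
nj_pop = 8_900_000
crude_2014 = round(rate_per_100k(786, nj_pop), 1)  # 8.8, vs the CDC's age-adjusted 8.3
```

The point of the exercise is not the decimal but the stability: plug in 786, 789, or 795 and the rate barely moves.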

In case students have missed the point that it’s not about individuals, I remind them, “The 789 people who killed themselves in 2015 cannot be the same 786 who killed themselves in 2014.” I add, “There aren’t many facts in social science that we’re 100% sure of, but that’s one of them.”

My second point is that while we can use individual facts to explain other individual facts, when we try to explain social facts, those same explanatory individual facts often aren’t much help. For explaining the individual suicide, it makes obvious sense to look at a variable like happiness. I’m willing to assume that people who kill themselves are not as happy as people who don’t. But happiness is little help in explaining rates. Greece’s suicide rate is roughly a third of the US rate. Are people in Greece three times as happy as Americans?

A headline in the local papers a couple of days ago looped us back to that first week of class.



In fact, New Jersey ranked 31st. The headline is referring to a recent Gallup report (here). Gallup calls its measure “well-being,” not “happiness.” Whatever. As for the happiest or wellest-being states? Here’s the map.


The map of well-being looks strikingly similar to the map of suicide that I show students in Week 1. The same states that have a lot of well-being also have a lot of suicide. Here is Gallup’s list of the top ten on well-being. I have added a column to show the ranking and rate for age-adjusted suicide.

All but two of the states highest on well-being are in the top twelve on suicide rates. Only Delaware has a lower-than-average suicide rate.

If happiness doesn’t keep suicide rates low, what does? Durkheim’s answer was “social integration.” Unfortunately, Gallup doesn’t have a variable by that name. But the Well-being index is a score made up of five components: Career, Financial, Physical, Social, and Community. The one that seems closest to Durkheim’s conception of social integration is not Community (“liking where you live, feeling safe and having pride in your community”) but Social (“having supportive relationships and love in your life”). What the scale-makers call Community does not sound a lot like Gemeinschaft. It's more an individual feeling of pride or safety. It does not require actual involvement with other people. By contrast, Social does seem to be a measure of interpersonal involvement.

Social, then, is much closer to Durkheim’s notion of social integration than is Community. So we shouldn’t be surprised that those high-suicide mountain states also rank high in Community. But mostly they are not among the highest in Social. New Jersey, with its low suicide rate, is low on Community (ranked 40th) but high on Social (9th).



There are many anomalies. Colorado, for example, comes out very well on Social and all the other sub-scales of Well-being, yet its suicide rate is 10th highest (tied with Nevada). New York ranks in the bottom half on four of the five components, including Social, and in the bottom fifth on three of them (Community, Career, Financial), yet it has the lowest age-adjusted suicide rate among the fifty states.

The Gallup numbers do support the Durkheim explanation — not overwhelmingly, but enough for the first week of class, enough to open the door to social explanations of what seems like a highly personal decision.

Let’s Write a Zeitgeist Hit

February 21, 2019
Posted by Jay Livingston

It’s easy to look back and see how a movie, TV show, or book was a massive hit because it fit perfectly with the spirit of its time. Maybe it expressed what we, all of us, were feeling, or maybe it gave us something we lacked. Think of those 1930s musicals, with Fred Astaire and Ginger Rogers, elegantly dressed and dancing their way through sets that dripped with luxury. Perhaps they were so popular because “their musicals offered the purest form of escape from the woes of the Depression, a fantasy of the 1920s seen through the darker prism of the 30s.” (John Rockwell in the New York Times, “Escaping Depression? Just Dance Blues Away.”)

The implication is that there is a zeitgeist tide in the affairs of culture which taken at the flood leads on to fortune, or at least a 15 Nielsen. They’re pretty much the same thing. All you have to do is suss out the zeitgeist.

I had thought that people in the business would be skeptical about this way of thinking. Old Hollywood hands who have a lot of experience in actually making movies and TV shows know how hard it is to create a hit, to know what the public will respond to. Try to imitate a hit by incorporating those elements in it that seem to have resonated with the audience, and you often fail miserably. As screenwriter William Goldman famously wrote in 1983, “Nobody knows anything.”

I had thought that sociologists would be skeptical about this way of thinking. They would be familiar with Wendy Griswold’s 1981 AJS article showing that the content of novels published in the US in the late 19th century may have had more to do with the economics of publishing than with a supposed cultural transformation. It was the change in copyright laws, not the feminization of American culture.

Ken Levine (rhymes with divine) is an old Hollywood pro, mostly as a writer — Cheers, M*A*S*H, Frasier, Simpsons, and so on. He also does a weekly podcast (“Hollywood and Levine”). I hadn’t listened to it in a while, but the episode title “How SEINFELD Got on the Air”  made me curious. It turned out to be a conversation with another old Hollywood hand, Preston Beckman, whose metier is scheduling. He knows not only how Seinfeld got on the air but why it was on the air Wednesdays at 9:30. He used to blog anonymously as “The Masked Scheduler.” He also has a Ph.D. in sociology from NYU.*

The entire conversation is interesting. Here’s the part that includes the word zeitgeist. They are discussing the success of “American Idol” and “24.”



Here’s a transcript, somewhat edited.

KEN LEVINE: Back in 1963, when Kennedy was assassinated, the country needed something to get out of its funk. And the Beatles came along at just the right time a couple of months later. And in a sense I always felt that “American Idol” was similar in that it was after 9/11. It came along and we needed something positive to focus on. And that became the zeitgeist hit.

PRESTON BECKMAN: I totally agree with you. I totally agree. I don’t think anybody at Fox thought it was going to be what it was going to be. And afterwards, thinking about it — and maybe my background as a sociologist before I went into this business. . . . I think the country needed something.

What was great about American Idol is that it put control of the process in the hands of the viewer. So after seeing the devastation and everything we had seen, it was like “OK, I have input into this. I have some control over this event.”

The pilot episode of “24” was completed in time for the opening of the fall TV season in 2001 but was delayed because Fox was broadcasting the baseball playoffs. Then came 9/11. 

PRESTON BECKMAN:  We actually had to edit the pilot because there was a scene of a plane being blown up. That was another situation where we didn’t know whether this was going to be rejected because of what had happened or going to be embraced, and fortunately I think the casting of Kiefer Sutherland had a lot to do with the success of the show.

The Hollywood pro and the sociologist-turned-Hollywood-pro agree: It’s the zeitgeist — what the country needed. “American Idol” offered “something positive,” something that gave viewers control at a time when people’s sense of control over their lives and their country had been shaken. It was, says Levine, a “zeitgeist hit.” The zeitgeist was there; the Idolators just figured out a way to cash in on it. Apparently, William Goldman was wrong. Somebody knew something.

But Beckman, though he seems to be unaware of it, says something that’s much closer to Goldman’s view. Nobody at Fox expected “American Idol” to be such a huge hit. The same goes for “24.” Before the show aired, “we didn’t know” if it was going to be a hit. It’s only in retrospect that Levine and Beckman can construct the zeitgeist connection. Even then, Beckman seems to give greater weight to casting decisions than to the post-9/11 zeitgeist. Only in retrospect can we look at the attributes of these shows, match them up with elements of the zeitgeist, and then “predict” their success.

--------------------------------
* The title of his 1981 dissertation is “Predicting Television Viewing: An Application of the Box-Jenkins Methodology for Time Series Analysis to Levels of Television Usage in the United States (1966-1975).”

The Sorrows of Old Brooks

February 17, 2019
Posted by Jay Livingston

In his Valentine’s Day op-ed bemoaning the supposed disappearance of romantic love, Arthur Brooks begins with Goethe’s Sorrows of Young Werther. In that novel, the title character falls in love with a woman who is already engaged to someone else. She gets married. He commits suicide. Got it?

I wouldn’t pitch this plot to Netflix today if I were you, but in 1774 in Europe, it was a huge hit. “Young men began to dress like Werther. Most alarming, the novel was said to have stimulated copycat suicides among brokenhearted lovers.” It was “Werther Fever.” And, says Brooks, it’s what we need more of in America today. I am not making this up.

What is the opposite of Werther Fever? Whatever it is, we’re suffering from it in the United States today. Particularly among young people, there is an increasing absence of romantic love.

I looked through the General Social Survey for data that would confirm Brooks’s idea about the withering of romantic love. The evidence was hardly convincing (see the previous post). But let’s suppose Brooks is right, that younger Americans are turning away from romantic love. If young Werther is our shining exemplar, maybe we should ask whether romantic love is such a good thing.

To begin with, romantic love has little connection to reality. Can one person satisfy all the emotional and erotic needs of another person? We know that this notion is unrealistic. That’s why romantic love is often likened to a dream state, with a “dream lover,” the “man of my dreams,” and so on. Even more unrealistic is the idea that only one person in the world can work this dream-like effect. For young Werther, it’s Charlotte, and if he can’t have this one person, there’s no point in living. Up close, it seems idealistic. But take a step back, and it looks pretty silly. As Philip Slater says, what would we think of a man who died of starvation because he couldn’t get any Brussels sprouts?

These stories also tell us, inadvertently, that romantic love is unsustainable. The lovers in these stories spend almost no time together. Instead, the plot focuses on the lovers’ struggles against the obstacles that separate them. Once these obstacles are overcome – or not – game over. Can these two people sustain romantic love over the long (or even not-so-long) course of a marriage? Tales of romantic love dodge that question. They end either with the death of one or both lovers (Romeo and Juliet, Young Werther) or with their union. “They lived happily ever after. The end. Don’t ask what actually happened in that ever-after.”

The “ever after” is hard to imagine because romantic love is based on fantasy. You may fall in love with and pursue the “dream lover.” You may even wind up together. But in a sustained relationship (what is still often called a marriage), you have to live every day with a real person, not a dream.

Brooks is particularly concerned about the “precipitous decline in romantic interest among young people. . . . While 85 percent of Generation X and baby boomers went on dates as high school seniors, the percentage of high school seniors who went on dates in 2015 had fallen to 56 percent.”

To which I am tempted to respond, “You say that like it’s a bad thing.” Maybe Brooks’s memories of dating in high school are sunnier than mine, but it seems to me that kids dated because there was no alternative. They felt they were required to pair off in some simulation of a romantic couple. Often, neither boy nor girl was comfortable with that arrangement — about what you’d expect with two people more or less forced together having to come up with the rules and roles in this new relationship. My impression is that for most kids, that relationship rarely achieved the romantic love that Brooks imagines.

Much more pleasant were the times I spent hanging around with groups of friends. And apparently that’s where teen-age culture is heading. Less dating, more hanging out and hooking up. It’s not perfect. The “hookup culture” among college students that Lisa Wade describes in American Hookup seems joyless and unsatisfying. But college students do go on dates, and most wind up in pair relationships. It’s just that these often develop out of and follow more casual relations and hookups.

Brooks thinks that this is a change for the worse. Me, I’m not so sure. When those baby boomers went to the high school prom, it was a date; they went as couples, two to a car, and if you didn’t have a date, if you were not paired off, you didn’t go. Today, they clamber into limousines as a group — as many as the limo will hold — some in couples, others not. I don’t know why Brooks wants to re-impose the rigidities of dating. Maybe he misses those Werther-like sorrows.