Ten Years a Blog

September 20, 2016
Posted by Jay Livingston

Many years ago, I saw Penn and Teller’s off-Broadway show in a small theater on West 43rd St.  Before his fire-eating routine, Penn explained that it wasn’t a trick. It was more of a skill. The performer really is putting a flaming torch in his mouth, and if he lets his mouth get at all dry, he can get burnt. It happens. It hurts. “So the next time you see someone eating fire,” Penn said, “the question to ask is not how. It’s why.”

The same might be said for keeping a blog going for a decade. Ten years, 1572 posts. My answer to Penn’s question is the answer Teller, if asked, would give on stage. But I will follow my custom of culling a handful of posts that I liked from the last year.


The Philosophy of the Gun in Trumpland

September 17, 2016
Posted by Jay Livingston

It seems odd for a politician to boast that many of his supporters are potential murderers, terrorists, and assassins. But then much about the Trump campaign is strange.

Trump acknowledges that his supporters believe in what I once called (here) “The Philosophy of the Gun.” Here’s the gist of that post from seven years ago.

The philosophy of the gun is simple: if someone does something you don’t like, shoot them. If you can’t shoot that person, shoot someone like them.

If you don’t like abortions, shoot an abortion doctor. If you don’t like an anti-abortion protester, shoot him. If you feel wronged by people at work, go postal. If a woman has rejected you, shoot her. If you can’t find a woman who actually rejected you, shoot several women. Don’t like the kids in your school? Shoot them. Feel you’ve been dissed by someone from another gang? Shoot them.

Gun advocates put this in terms of self-defense. If you have a gun, you can defend yourself, your property, and your loved ones from people who are doing something you don’t like. Which is just another way of saying that if you don’t like what the person is doing, shoot them. The only difference is that such shootings might be legal.


My blog usually gets few comments, but on this one, the gunslingers descended en masse,* though as I said in a post the next day (here), they mostly agreed with my basic point; they just didn’t like the way I put it.

Now Donald Trump has joined me. A few weeks ago, he hinted that if gun lovers (“Second Amendment people”) didn’t like a judge, they would take aim and assassinate the judge (or perhaps the president who appointed the judge).  Yesterday, he suggested that they would shoot Hillary Clinton. No mention of policies or judicial appointments. They would shoot her just because they don’t like her – if they could get away with it.

I think they [Clinton’s Secret Service detail] should disarm immediately. Take their guns away, she doesn't want guns. Take their— and let’s see what happens to her. Take their guns away. OK, it would be very dangerous.


Very dangerous. Trump is talking about people who adhere to the philosophy of the gun, and apparently he counts many of these people among his supporters. He is saying essentially that they will shoot the Democratic nominee for US president. He merely adds the caveat, if they can be successful, i.e., if the Secret Service cannot shoot them first.

When Clinton said that half of Trump supporters were racist, sexist, homophobic, or xenophobic, the Trump campaign and supporters took umbrage. When Trump himself suggests that many of them are potential terrorists and assassins, they seem to take it as a compliment. 

----------------------------------
* Peter Moskos, whose blog Cop in the Hood is well worth reading, once told me that sometimes when he’s feeling neglected and lonely, he’ll put up a post about guns. And very soon, he’s got lots of company.

Has Trust Gone Bust?

September 16, 2016
Posted by Jay Livingston

David Brooks was preaching again this week. His Tuesday column (here) was a jeremiad on the ills that “an avalanche of distrust” is bringing to US society.  (The phrase “avalanche of distrust” probably came from the headline writer, not Brooks. Brooks refers to “an avalanche of calumny.” Either way, the country’s being buried under a lot of bad snow.)

A generation ago about half of all Americans felt they could trust the people around them, but now less than a third think other people are trustworthy.

Young people are the most distrustful of all; only about 19 percent of millennials believe other people can be trusted. But across all age groups there is a rising culture of paranoia and conspiracy-mongering.

Brooks is partly right, partly misleading. He seems to be referring to the General Social Survey, which, since 1972, has asked regularly about trust.  The GSS has three items that pertain to trust.
  • TRUST – Generally speaking, would you say that most people can be trusted or that you can't be too careful in life?
  • HELPFUL – Would you say that most of the time people try to be helpful, or that they are mostly just looking out for themselves?
  • FAIR – Do you think most people would try to take advantage of you if they got a chance, or would they try to be fair?
The GSS data does show some decline in all three measures since 1990 – i.e., a generation ago.
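For anyone who wants to check the arithmetic, here is a rough sketch in Python of how the TRUST trend can be pulled from a GSS extract. The file name and column layout are my assumptions, not the GSS’s, though the GSS really does code TRUST as 1 = “can trust.”

```python
# A minimal sketch, assuming a GSS extract saved as "gss_extract.csv"
# with columns YEAR and TRUST (GSS coding: 1 = "can trust",
# 2 = "cannot be too careful", 3 = "depends").
import pandas as pd

gss = pd.read_csv("gss_extract.csv").dropna(subset=["TRUST"])

# Percent answering "most people can be trusted," by survey year.
pct_trusting = (gss["TRUST"] == 1).groupby(gss["YEAR"]).mean() * 100

print(pct_trusting.loc[1990])   # around 39% in 1990
print(pct_trusting.iloc[-1])    # around 30% in the most recent year
```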

Of the three variables, TRUST has declined the most. The percentage of people saying that most people could be trusted fell from 39% to 30%. Brooks’s statement about “half of all Americans” a generation ago being trusting is a bit misleading, as it implies that 50% was standard year in, year out. In fact, in only one year, 1984, did the percentage reach that level. As for the other variables relevant to the “culture of paranoia,” perceptions of other people’s helpfulness also declined; perceptions of their fairness changed little.

What about the age differences Brooks notes?  I extracted GSS data on the Trust variable at three different periods – 1972-1976, 1989-1991, and the most recent years that we have data for.

In every period, the young are the least trusting, but the difference between them and people in older age groups is much greater now than it was 25 or 40 years ago. That’s because millennials, as Brooks correctly notes, are much less trusting than were their 18- to 30-year-old counterparts in the 1970s.

But what about the rest of us? According to Brooks, “across all age groups there is a rising culture of paranoia.”  In the 1972-76 period, in the youngest group, 38% were trusting. In 1990, those people would be in their early 40s to early 50s. In that year, that age group was somewhat more trusting than they had been 25 years earlier. And 20 years later, when they were in their late 50s and up, they were still as trusting as they had been before. The same is true of the people who were 30 and up in the 1970s. Similarly, the youngest group in 1990 had the same level of trust twenty years later – about 30%.
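The cohort comparison is the same idea with the age window shifted forward by the number of years between surveys. A sketch, using the same hypothetical extract and now also assuming an AGE column:

```python
# Following one birth cohort across waves -- same hypothetical
# "gss_extract.csv" as above, now also assuming an AGE column.
import pandas as pd

gss = pd.read_csv("gss_extract.csv").dropna(subset=["TRUST", "AGE"])
gss["trusts"] = gss["TRUST"] == 1

def pct_trusting(years, age_lo, age_hi):
    """Percent trusting among respondents surveyed in `years`
    who were between age_lo and age_hi (inclusive)."""
    grp = gss[gss["YEAR"].isin(years) &
              gss["AGE"].between(age_lo, age_hi)]
    return 100 * grp["trusts"].mean()

# Roughly: 18-30 year olds in 1972-76 are in their mid-30s to
# late 40s by 1990, and about twenty years older than that by 2010.
print(pct_trusting(range(1972, 1977), 18, 30))
print(pct_trusting([1990], 34, 48))
print(pct_trusting([2010], 54, 68))
```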

Trust seems to be remarkably resilient – impervious to the vicissitudes of aging or of social and political changes. The Watergate era, the Reagan years, the terrorist attacks of 9/11, the recent recession – none of these seems to have made much difference in each cohort’s level of trust.  Go figure.

Brooks’s sermon then turns from the aggregate data to the effect on people’s souls.

The true thing about distrust, in politics and in life generally, is that it is self-destructive. Distrustful people end up isolating themselves, alienating others and corroding their inner natures.
Over the past few decades, the decline in social trust has correlated to an epidemic of loneliness. In 1985, 10 percent of Americans said they had no close friend with whom they could discuss important matters. By 2004, 25 percent had no such friend.

That finding, which made headlines a decade ago, has since been questioned if not debunked. The GSS data it’s based on contained a coding error, and other surveys have found no such drastic increase in friendlessness. Claude Fischer has an excellent blog post about this issue (here). He includes this graph based on results from the Gallup poll.


Brooks obviously is not interested in these corrections. He is, after all, crying “avalanche” in a crowded political theater. But it turns out there’s not all that much snow. The change to worry about is that over the last 25 years, each new cohort is less trusting, and this is one time when hand-wringing about what’s wrong with kids today might be appropriate. Those attitudes, once formed, are little affected even by major changes in the society and government.

Trump Unrestricted

September 15, 2016
Posted by Jay Livingston

Commas are important. Sometimes.
  • I waved to the young man who was wearing a gray suit.
“Who was wearing a gray suit” is a restrictive clause. It’s called that because it limits the subject. There could have been a lot of other young men, but I waved to the one in the gray suit.
  • I waved to the young man, who was wearing a gray suit.
With the comma added, the clause becomes nonrestrictive. The young man is the only possible one, and he was wearing a gray suit.

Today we have this statement from Donald Trump’s doctors.

We are pleased to disclose all of the test results which show that Mr. Trump is in excellent health, and has the stamina to endure — uninterrupted — the rigors of a punishing and unprecedented presidential campaign and, more importantly, the singularly demanding job of President of the United States.

With no comma, the clause “which show that Mr. Trump is in excellent health” becomes restrictive. It implies that there may be other test results which do not show Trump to be in excellent health. Was the copy editor being cagey or merely careless?

Today’s statement is a bit more specific than the previous medical records released by a Trump doctor — “If elected, Mr. Trump, I can state unequivocally, will be the healthiest individual ever elected to the presidency,” which sounds almost as if the Donald himself could have written it.

Of course Mr. Trump is in great health, the best – believe me. I know his doctors, the finest really. I admire them, and they say his stamina is great. I can’t believe that people are complaining about a comma. Don’t get me wrong. I love commas. China – by the way, a very great country – China doesn’t have commas, and we’ve let them get away with that. China is incredible in many ways. I eat Chinese food. I mean, my Chinese chef is tremendous, tremendous. But many people say that there are no commas on the menus. No commas at all. Sad.

All In the Family

September 11, 2016
Posted by Jay Livingston

“God must love the common man,” goes the quote usually attributed to Lincoln, “He made so many of them.”

Greg Mankiw must love the rich. He writes so many articles promoting policies that help them. In today’s installment in the NY Times Business section (here), he writes about the estate tax.*


Mankiw probably didn’t write that headline, and it’s slightly misleading. It suggests that Mankiw wants to get rid of all inheritance taxes without making any other changes. In fact, he proposes other ways to “make sure those at the top pay their fair share.” But the headline captures the takeaway – or at least what rich people and their advocates take away.

Not taxing inheritances is what happened when Mankiw was, as he reminds us, chief economic advisor to President George W. Bush. Bush phased out the estate tax entirely. Presumably Mankiw raised no strong objections. It’s possible that Mankiw recommended to Bush the alternate taxes he mentions in today’s Times. We don’t know. But if he did, the only item in his advice package that Bush and the Congressional Republicans paid attention to was the call to just get rid of this pesky tax on the heirs of the wealthy.

In writing against the estate tax, Mankiw pulls the same switcheroo that other opponents of the tax use. He writes about it as though the people who pay the tax are those who accumulated the fortune. They aren’t. Calling the inheritance tax the “death tax” makes it seem as though the dead are being taxed for dying. They aren’t. If you leave an estate to your heirs, you have departed this mortal plane and are safely beyond the reach of the IRS.

The people who would pay the tax are those who inherit the money. If they had gotten this money the old-fashioned way, by earning it, they would pay tax on it – and nobody objects on principle to taxing the money people get by working. But money they get because someone gave it to them gets preferential treatment. Mankiw dodges this issue by talking about “families,” as though the family were still the same as it was before the death of the one who made the money – as though the money hadn’t really changed hands.

The same logic would shield income in the paychecks that someone got by working in the family business. And who knows? Maybe Republicans will propose making income from the family business tax-free. The Trump kids would get a free ride. After all, if we don’t want a death tax, why would we allow a parenthood tax?


-------------------
*Another post on Mankiw and taxes is here. Other posts about Mankiw in this blog are here and here.

(A day after I posted this, Matt Levine made basically the same criticisms of the Mankiw piece. Levine writes for Bloomberg, so his post probably had a few more readers.)

Trigger Warnings

September 10, 2016
Posted by Jay Livingston

I posted a trigger warning last week, the first one I have ever used.

I begin the semester contrasting individual facts with social facts, and the example I use is Durkheim’s study of suicide – suicide rates and social integration as social facts. In each of the past two semesters, a student has told me weeks later that he or she (one of each) had recently experienced the suicide of someone they were close to, and the topic still upset them. I had had no idea that I was tromping around on someone else’s understandably sensitive toes. For the remainder of the course, in selecting examples to illustrate sociological ideas, I tried to avoid suicide.

This semester, before the first class meeting, I posted an announcement on Canvas (our “course management system”):


The University of Chicago does not approve. In a now-famous letter sent to incoming students last month, the Chicago Dean of Students Jay Ellison said

Our commitment to academic freedom means that we do not support so called ‘trigger warnings.’

I’m not sure why the Dean thinks it’s a good idea to spring disturbing material upon students without any advance notice. Maybe when it comes to movies he doesn’t like the MPAA warnings either. I do.

One evening long ago when I was a student, I went with some friends to see a new movie that they said had gotten good reviews and was by an important director.  Back then, before DVD, VHS, HBO, etc., if you wanted to see a movie, you had to go to the theater before the movie finished its run.
The movie was “Straw Dogs.” It’s another version of the adolescent boy’s fantasy that used to grace the inside covers of comic books.


In “Straw Dogs,” instead of the bully kicking sand in the boy’s face while the girlfriend watches, the bullies rape the girl. And instead of merely returning to punch out one bully, the hero dispatches a septet of baddies using variously a fire poker, a shotgun, boiling oil, a nail gun, and a bear trap.

Immediately after seeing the movie, I was upset – angry at the movie, even angry at my friends. It was not the stupidity of the movie that disturbed me.  I’d seen the basic plot not just in comic book ads but in many American films. We American guys just loves us some justifiable revenge violence. What upset me was that the violence was viscerally arousing. The movie was rated R, but I had seen plenty of R movies. I just hadn’t seen any that put violence on the screen so effectively.* My reaction, I realized later, was probably like some people’s reaction to sex in the movies – it’s arousing in a way that they don’t want to be aroused, at least not by a movie. (They don’t want others to be aroused by it either, but that’s a separate issue.)

If someone had told me beforehand what to expect, my reaction to and against the film would not have been as strong, nor would I have been as pissed off at my friends for selecting the film. Maybe when film classes at Chicago show “Straw Dogs,” they remove the R rating and generally keep students in the dark. Apparently Dean Ellison would prefer it that way. Me, I’d warn the students in advance and risk being scoffed at as politically correct.

-------------------------------
* Making violence arousing is something that Peckinpah is very good at. Pauline Kael famously said of “Straw Dogs” that it was “a fascist work of art.”     

Labor Day - Unions on Film

September 5, 2016
Posted by Jay Livingston

A guy I know who hates teachers unions equates them with the corrupt and violent Longshoremen’s union portrayed in “On the Waterfront.” True, the union comes off badly in this movie. When Brando, having testified against union boss Johnny Friendly, says, “I’m glad what I done to you, Johnny Friendly,” the audience is glad too. But what about other films?

Hollywood is much more likely to give us business executives than workers. The corporate biggies are usually corrupt and evil, but at least they’re up there on the screen. Workers, not so much.

I tried to think of American movies (non-documentary) where a union or even the idea of a union had an important role. The list I came up with on the spur of the moment was very short:
  • Norma Rae (1979)
  • The Grapes of Wrath (1940)

Putting the question out to my Facebook friends brought only a few more to the list (ht: Philip Cohen).
  • Matewan (1987)
  • Hoffa (1992)
  • Salt of the Earth (1954)
Googling “movies about unions” added
  • The Devil and Miss Jones (1941)                                   
  • F.I.S.T. (1978)
  • Bread and Roses (2000)
  • Blue Collar (1978)
  • Won’t Back Down (2012)
  • The Garment Jungle (1957)
  • Black Fury (1935)
Given the Hollywood depiction of corporate bosses as bad guys, I expected that movies would also portray unions as virtuous organizations helping virtuous workers.  That’s sort of true of the 40s and 50s, though obviously “Waterfront” is an exception.* But in “Hoffa,” “F.I.S.T.,” “Blue Collar,” and “Won’t Back Down,” unions don’t come off so well.  Of movies from the last 30 years, only “Matewan” is unambiguously pro-union, and it was a low-budget indie. So much for the idea that Hollywood is dominated by leftists and liberals.

My favorite nomination – one ignored even by Google – came from my cousin, who wasn’t even born till about twenty years after the movie came out: “The Pajama Game” (1957).


------------------------------
* The point of “Waterfront” was to make a virtue out of testifying to the government against the team you used to be on. Both the writer and the director, Budd Schulberg and Elia Kazan, respectively, had testified before the House Committee on Un-American Activities and had ratted out other Hollywood people – naming names and ruining careers. Kazan acknowledged the parallel – he was glad what he done to his former associates. But Schulberg denied that the movie had anything to do with HUAC investigations.

Take My Rug – Please

August 31, 2016
Posted by Jay Livingston

Two couples at dinner. The check comes.
“We’ll get it,” says one man.
“OK, thanks,” says the other.

Of course, that’s not what happens. The other person is supposed to protest, to make the same offer to pick up the whole tab. Then comes the negotiation.

I’m not concerned here with how these things get resolved or the basis of the moves and countermoves. My point is just that we’re not supposed to take that first offer at face value. It might be sincere, or it might not. That’s what’s to be discovered in the subsequent negotiation.

When it comes to this generosity gambit, Iranian culture is way ahead of us. In Farsi, it’s known as tarof. I first heard about it on a 2011 episode of “This American Life.” An Iranian-American woman, Nazanin Rafsanjani, explains.

It’s basically this social custom of never saying what you want and offering things to people that you may or may not really want to give them. . . An uncomfortable thing that would often happen to me growing up is that I’d have all these American friends coming over to our house, and it always makes me slightly uncomfortable when someone’s at my parents’ house and they compliment my parents on something.

Because the custom is, if you go to my parents’ house and you say, like, “That’s a beautiful painting on the wall,” they’ll offer it to you. They’ll just be like, “Take it. It’s yours. It’s not good here anyway. It would look better in your house. Take it. It’s not worth anything to us. It’s much more important that you have it.”

Which brings me to the “Modern Love” column in the Styles section of the New York Times a few Sundays ago. It’s by Sharon Harrigan, presumably not Iranian. She begins:

The nicest thing I own is the first thing you see when you walk into my house: a red handmade rug bought in Tehran, haggled over in Farsi and delivered, in person, to the Brooklyn apartment of the man who would become my husband.

Back then, James told me the woman who gave him the rug, a woman he had recently dated, was by then “just a friend.”


I’ll skip the friends-become-lovers-become-married story (you can read it here) and jump cut to the next appearance of the rug, later in the narrative but earlier in chronology, when the author first sees it.

Weeks later, he buzzed me up to his apartment. The door opened to reveal the most beautiful rug I had ever seen, so finely woven it was more like a tapestry. The kind of precious object that could be ruined by a few stray Cheerios crumbs.

“It’s a gift from a friend,” he said. “She bought it when she visited her family in Iran.”

“She’s trying to get you back,” I said.

“What? She’s just being kind. Don’t you love it?”

It’s possible that she’s right – that the old girlfriend is trying to get him back. It’s possible that he’s right – that the old girlfriend was just being kind. But – and I’m just guessing here, and I could be very mistaken – it’s also possible that the Iranian girlfriend, the one who haggled over the rug in Farsi, was tarof-ing her ex.  And he, not knowing any better, accepted the offer.

In the “This American Life” episode, the American producer does ask Nazanin Rafsanjani what happens if someone does just accept the offer. Rafsanjani answers: “I don't know. I mean, that just never happens.” But she means that it never happens between Iranians. They know how the game is played. But if the other person is a tarof-ignorant American, maybe an Iranian woman winds up losing her beautiful rug.*

------------------------------------

* In the episode, which you should really listen to (here) or read the transcript of (here), Rafsanjani describes how tarof happens even in retail sales. The storekeeper offers the merchandise for free.  “Take it” etc. The customer of course refuses and insists on knowing how much it costs. At some point in this negotiation, the storekeeper names a price. Now the sides switch, for the storekeeper, who had been saying that the goods were nearly worthless and that the customer should take them, asks a price which is invariably far more than the goods are worth, and the customer must haggle the price down.

Trickle-down Culture – As American As Pad Thai

August 26, 2016
Posted by Jay Livingston

Resentment against cultural elitists holds a prominent place in the populist energy driving Trump supporters. Mainstream conservatives have been playing this card since way back in the game with playground taunts like “brie-eating, chardonnay-drinking liberals.” In fact, this animus against elite culture may be what divides the pro-Trump and never-Trump conservatives, at least those who babble publicly in the media. Ann Coulter and Sean Hannity imagine themselves to be one with what another Trump supporter, Sarah Palin, called “the Real America” united against the cultural snobs.

But as Paul Krugman (here) points out, we’re all cultural elitists now. Or rather, what was once elite culture has gone mainstream.

But most of all, this kind of punditry, while ostensibly praising the Real America, is in fact marked by deep condescension. One pats the simple folk on the head, praising their lack of exposure to quinoa or Thai food — both of which can be found in food courts all across the country. Sorry, but there are no country bumpkins in modern America.

Even as recently as the early 2000s, part of the liberal stereotype mocked by conservatives was “latte-sipping.” Now NASCAR dads might well have a chai latte or venti in the cup holder of their pick-up. That didn’t just happen. Starbucks spent a lot of money opening outlets and spreading the word.

The same is true of Thai food. Americans didn’t wake up one morning with a craving for pad thai and green curry. Matt Yglesias links to an article in The Economist.

In a plan ambitiously called Global Thai, the government aims to boost the number to 8,000 by 2003. This, it is argued, will not only introduce deliciously spicy Thai food to thousands of new tummies and persuade more people to visit Thailand, but it could subtly help to deepen relations with other countries.

In the United States at least, . . .  the number of Thai restaurants has grown from 500 in 1990 to more than 2,000 now [i.e., 2002]  . . .  More modestly, the Thai government aims to make it easier for foreign restaurants to import Thai foods, to help them to hire Thai cooks and sometimes to benefit from soft loans. It has been much encouraged that Tommy Tang, a Thai chef working in the United States, has said that he plans to open 200-300 Thai restaurants there during the next five years.

Sometimes popular tastes change seemingly without anyone with a vested interest pushing things along, as when names like Barbara and Ashley go out of fashion, and Olivia and Ava become all the rage. In other areas, an entire industry – clothing for example – depends on its ability to convince people to follow the latest fashion. With food, there’s a complicated interaction between person-to-person influence within the population and a strong push from the outside by players who have a stake in the outcome. I don’t know about quinoa, but thanks in part to the efforts of the government of Thailand, Thai food may be on its way to becoming as American as pizza.*



*As a food becomes more popular, restaurateurs in the US who want more customers will find ways to make it more palatable to Americans, probably by toning down the spices and ramping up the sweetness. That’s the cue for elitists to look down on The Olive Garden and other “inauthentic” foods. (Pad Thai is to Thai cuisine roughly what chop suey is to Chinese.) The politically correct will decry the cultural appropriation in Hawaiian pizza or a college food court version of banh mi. I know: cultural appropriation, bad; Asian fusion, good. But sometimes it’s hard to tell the difference.

Lenny and Me

August 25, 2016
Posted by Jay Livingston
(No sociology, just –  to borrow Chris Uggen’s term –  self-indulgery.)

Leonard Bernstein was born on this day in 1918.

Earlier this summer, I was walking around Tanglewood on a weekday. The Koussevitzky Music Shed – the open-air concert venue – was deserted, so I walked up onto the stage and stood on the conductor’s podium where decades earlier I had seen Bernstein stand and conduct. (I’ll spare you the photo. I was not wearing my tux.) But that was not the first time our paths – Lenny’s and mine – had crossed.

In the early 1950s, Bernstein was a visiting professor at Brandeis. No doubt he felt at home in a department that was eager to go beyond the bounds of what traditional music departments did.

Some years later, when I was a sophomore at Brandeis, I had a campus job in the music building. A few days a week, at 5 p.m., I would play records (this was long before Spotify, long before CDs) for the students in the Opera course. I mean, I would play the records if any students came to the classroom in the music building, which they rarely did. I think a couple may have come when the topic was Don Giovanni; that’s the only reason I can think of that I have some familiarity with “Madamina.” We never got beyond Act I.

The classroom had a piano at the front for the instructor to use – a Baldwin baby grand – and sometimes I would sit there and do my inept version of playing piano. I’d never had a lesson, and I played a sort of jazz by ear. (I recall that Horace Silver’s “St. Vitus Dance” was one of the tunes I was trying at the time.) One day late in the semester, I noticed a small metal plate, about 2" x 3", attached at the right edge of the piano above the keyboard. I read it. It said something like, “This is the piano that Leonard Bernstein learned to play on as a child, donated by his parents. . . .” I played that piano more frequently for the rest of the semester.

Here’s the Bill Evans solo version of “Lucky to Be Me,” from “On the Town” (1944 – i.e., when Lenny was 26). Evans takes some liberties – modulating the last eight bars to A♭ instead of F the first time through. But Bernstein’s own chord changes on the bridge are incredible, as is the melody – very chromatic and hence not easy for singers.



Soundtrack of the Zeitgeist

August 22, 2016
Posted by Jay Livingston

Yesterday’s post was skeptical about a link between pop culture and the Zeitgeist. I questioned whether a change in the content of fiction or film or other products of the imagination reflected important social change. Still, when done well, Zeitgeist explanations often sound plausible.

Chris Molanphy knows a lot about popular music. He has a feature at Slate called “Why Is This Song No. 1?” where he gives a super-hit of the day a full physical exam. The performer, the producer, the studio, the way the song evolved musically, the market, the audience, the distribution – all the elements internal to the music business and the creation of songs come under his stethoscope. (See his take on “Love Yourself” or Drake (“One Dance” and “Hotline Bling”) or any of his other pieces here.)

Molanphy also appears in a regular segment on The Gist, where he and host Mike Pesca turn back the pop-music calendar to a single year. Historical hindsight allows them to align the hits with forces outside of the music itself – politics, the economy, the general spirit of the time. When you’re looking backwards, the temptation to go full-out Zeitgeist is irresistible.

Here they are discussing 1999.

PESCA: What’s the case for its importance, what’s the case for its artistic value, what’s the case for 1999?

MOLANPHY: The case for 1999 is that this is American-led pop at its absolute apex. This is the height of the Clinton era writ large in pop music. It’s empire America at its peak, very candy colored, very Total Request Live, very Britney Spears “Baby One More Time” even. . . and surprisingly a lot of artists who were hits in this TRL era of pop have proved quite enduring. Britney Spears . . . Back Street Boys . . . Christina Aguilera . . . Nsync.

If the first half of the nineties was all about rather grim-faced music – it was about grunge, it was about gangsta rap – this is the candy colored half of the 1990s.

PESCA: This is the Clinton expanded economy, the Internet, this is the years of peace and prosperity, this is the pre-9/11, pre-wakeup-call, good time.

MOLANPHY: If you watch that year’s VMAs. . .  All of the artists that you see on this list of number ones are there. Britney Spears is there, Christina Aguilera is there, the Back Street Boys are there, and all the Latin Pop stars. Ricky Martin is there. . . You see a culture that feels like the good times are going to last forever. The dot-com era is at its height, Clinton’s in the White House, unemployment’s at five percent, everybody’s got money to burn, and the good times are here again. We know what happened two years later, but we know that 1999 is a very neon colored bright and shiny year, and for that I have an odd sort of nostalgia. [Emphasis added. Listen to the entire episode here].                            


Pesca even implies that the “good time” Zeitgeist of 1999 somehow knows what will happen two years later, when it will give way to a bad-times mood descending upon the country. “This is the pre-9/11, pre-wakeup-call, good time.” To paraphrase Yogi Berra, prediction is easy, especially about the past.

Sometimes the producers of pop culture do try to come up with songs or movies or TV shows that align with the Zeitgeist as they perceive it. Usually, that means copying the most recent big success.  So we get a wave of superhero movies or doctor TV shows. (There are probably equivalents in music; I just don’t know them.) Sometimes it works; often it flops. As the screenwriter William Goldman famously said, “Nobody knows anything.” Including culture analysts who write about the Zeitgeist.    


Ideology and Memory

August 16, 2016
Posted by Jay Livingston

Political ideology shapes what we see and what we consider important, as I’ve blogged recently (here and here). Ideology also skews what we remember and how we remember it.

The worst terrorist attack on this country happened on Sept. 11, 2001. George W. Bush had taken office nine months earlier, on Jan. 20, 2001. Yesterday, Rudy Giuliani said, referring to Bush’s two terms, “Under those eight years, before Obama came along, we didn’t have any successful radical Islamic terrorist attack inside the United States.” Here’s the video.



He is not the only one to make this mistake. Bush’s former press secretary Dana Perino left the White House at the end of Bush’s term and took a job at Fox News, where in 2009 she told viewers, “We did not have a terrorist attack on our country during President Bush’s term.”  (A video is here. Push the slider to the 0:35 mark.)

I do not think that Giuliani and Perino are deliberately lying. It’s just that their political views have prevented them from seeing or remembering the facts. The belief that George W. Bush effectively prevented terrorist attacks does not square with the fact that the attacks of 9/11 happened when Bush had been in office for nine months. If the facts don’t fit the belief, too bad for the facts. They are no match against the need for cognitive consistency.

What is striking about the Giuliani/Perino view is how widespread it is. I have long thought that one of the great public-relations achievements of the Bush administration was its ability to create the impression that the attacks happened on someone else’s watch. Many people seem to believe that it was someone else’s fault, though they never get around to thinking who that might be. Maybe Obama.

Even today, few people publicly blame the Bush administration for being asleep at the switch. That is certainly true of Giuliani. He loves to recount his reaction on that day.

At the time, we believed that we would be attacked many more times that day and in the days that followed. Without really thinking, based on just emotion, spontaneous, I grabbed the arm of then-Police Commissioner Bernard Kerik, and I said to him, “Bernie, thank God George Bush is our president.”

The Bush-Cheney administration had been in office for nine months, getting regular reports from its terrorism experts like Richard Clarke warning of Al Qaeda, reports that Bush-Cheney discounted. Clarke, when he heard the news on the morning of Sept. 11, said to himself, “Al Qaeda.”
Rudy Giuliani said, “Thank God George Bush is our president.”

Given his public commitment to Bush, Giuliani could not very well publicly acknowledge any facts suggesting that Bush was at all responsible for the attacks. It seems that he cannot even acknowledge those facts to himself. And so he winds up making a statement so obviously wrong that the video instantly flies around the Internet (or at least around the leftward territories).

A Boy Named Sue Ashley

August 12, 2016
Posted by Jay Livingston

“Has anyone here ever seen the movie ‘Gone With the Wind’?” I ask my class during a discussion of names. “Do you remember that there was a character named Ashley Wilkes?” I say. “That role was played by Leslie Howard.”


Most students have not seen GWTW, and they are surprised to learn that Leslie Howard as Ashley Wilkes is the one on the left. They know that Leslie can be a boy’s name, though it’s mostly for girls. But Ashley? Yes, Ashley. Until about 1939 (the year “Gone With the Wind” was released), Ashley was unknown as a name for girls. As a name for boys it was not common – most years, fewer than 10 per 100,000 – but it was not weird, certainly not among Southern gentry.


Then starting around 1950 and accelerating in the 1960s, Ashley took off among girls, followed by a smaller boom among boys. (The y-axes on the graphs are not the same scale. Male Ashleys at their peak in 1980 numbered only about 40 per 100,000. In the peak year for girls, the rate was nearly 700 per 100,000.)

Boys’ names becoming girls’ names is not unusual. Nameberry has a list of more than sixty names that have “morphed from blue to pink.”  The reverse almost never happens. Leslie is a good example. Until 1940, it was rare among girls, fairly common for boys. Up until about 1910, it ranked in the top 100 names for boys.


In the mid-1940s, Leslie became increasingly popular for girls, increasingly unpopular for boys. These contrasting trajectories suggest a sort of “there goes the neighborhood” effect. As girls move in, boys move out. Eventually the name becomes thought of as feminine, and parents no longer consider it fit for boys.

Kelly follows a similar pattern. Before 1950, the name is unusual for boys and unheard of for girls.


Then, around 1950, the number of boy Kellys triples in a decade, though those numbers are still relatively small – only in its peak year, 1968, does it break into the top 100, and then just barely, at #97.  But following the boys by ten years or so, girl Kellys come on strong.  From ranking 904th in 1950, Kelly rose in popularity so that by 1966 she was in the top 20, where she remained for another fifteen years. The gender tipping point came in the late 1960s. Kelly became a girl’s name, and parents of boys stopped choosing it.

The unusual thing about Ashley is that it reverses this pattern. The increased popularity for boys follows the girl Ashley boom by about ten years. That is, a small but increasing number of parents continued to name boys Ashley even after the name had become established as a name for girls.
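The rates in these graphs come from the kind of calculation anyone can do with the Social Security Administration’s public baby-name files – one “yobYYYY.txt” file per birth year, each a list of name,sex,count rows. Here is a rough sketch; the directory path is my invention.

```python
# A rough sketch, assuming the SSA national baby-name files
# (yob1880.txt, yob1881.txt, ...) sit in a local "names" directory.
import pandas as pd

def name_rate(name, sex, year, path="names"):
    """Occurrences of `name` per 100,000 recorded births
    of the given sex in the given year."""
    df = pd.read_csv(f"{path}/yob{year}.txt",
                     names=["name", "sex", "count"])
    same_sex = df[df["sex"] == sex]
    n = same_sex.loc[same_sex["name"] == name, "count"].sum()
    return 100_000 * n / same_sex["count"].sum()

# The male Ashley peak (about 40 per 100,000 around 1980) against
# the far larger female boom a decade or so later:
print(name_rate("Ashley", "M", 1980))
print(name_rate("Ashley", "F", 1991))
```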

Despite this exception, the unwritten rule of naming seems to be that you can give a girl a predominantly male name; she and her name will still be accepted. You might even be in the vanguard of a trend, like the parents in the late 1940s who named their daughters Ashley. But you can’t send a boy out into the world with the name Sue.                                        

Males are more constricted by norms of masculinity than are females by the norms of femininity. And not just in naming. Girls will do boy things, but the reverse is less common. It’s more acceptable for a girl to be a “tomboy” than for a boy to be a “sissy.”  Girls will watch movies targeted at boys, but boys shy away from girl-centered films. Among adults as well, women give favorable evaluations to TV shows targeted at men,  but men are less able to appreciate shows outside their narrow band of interest. (Walt Hickey at FiveThirtyEight thinks men are “sabotaging” women’s shows by giving them low ratings.) 

The same is true in fashion, where women can choose from a wider variety of colors and styles, including those usually for men. Men’s choices  are more constrained. Men will not wear skirts, but women will wear pants and even pants suits, an item of clothing I mention only as a cheap way of getting to one final name.


It follows the usual pattern – a male name, albeit an uncommon one, declining in popularity, crosses over and becomes a name for girls. Its popularity increases rapidly. Up to a point. That point was 1993. Hillary was doing fine before that, but then for some reason, parents of daughters were no longer with her.

Who’s Shameless?

August 11, 2016
Posted by Jay Livingston

How can Donald Trump, with his 39 Pinocchios from Fact Checker, continue to make false and outrageous claims? How could he denigrate the gold star parents of a Muslim US soldier killed in Afghanistan? Why has he no sense of shame?

Trevor Noah, interviewed on Ezra Klein’s podcast, suggested that it started with bankruptcy. For most people, declaring bankruptcy is a matter of shame. It is a public admission of failure. But for a business, it’s not really so bad. American bankruptcy laws allow business persons to pick themselves up, dust themselves off, pay their creditors and suppliers a fraction of what they are owed, and start all over again. Which is what Trump has done at least four times. Even if he might have felt a slight touch of shame the first time, it quickly wore off in subsequent bankruptcies. Trump the businessman might have taken a financial hit, but Trump the public person suffered no loss of social standing.

Before looking for other explanations – surely they must be out there – I wanted to see the extent of the image of Trump as shameless. So I went to Google.



Nearly 700,000 hits. The difference between him and other politicians must be huge. For comparison, I tried the Democratic nominee.



Hillary, by this measure, is not quite so shameless as the Donald, but 500,000 seemed like a lot. Then again, her opponents could reel off a list of scandals dating back to her days in Arkansas. I tried a few successful presidential candidates.



Obama and Bush were not so far behind. The toll was high even for Ronald Reagan and Jimmy Carter, who served before shouts of “shameless” could be echoed around the Internet. Besides, Reagan and Carter, whatever you thought of their policies, seemed like decent human beings. Yet their quotient of “shameless” pages runs to hundreds of thousands. I confess I am ignorant of the ways of the Google algorithm and what those numbers actually reflect. Still, nearly half a million seems like a lot.

Maybe this is not about the politicians themselves. It’s about reactions to those politicians, especially in a polarized polity. Partisans strongly committed to their own point of view often believe that those who disagree with them are acting in bad faith. (See this earlier post about politics and perception.) They think that their own views are so obviously valid and true that a person who sees things otherwise must be denying reality and deliberately lying. These denials and lies are so blatant, so transparent, that most people would be ashamed to utter them. Who could say things that they know are factually and morally wrong?  The politician who is shameless. But the shamelessness may be mostly in the eye of the beholder.

Weber at the DNC

August 3, 2016
Posted by Jay Livingston

Like those Japanese soldiers in Southeast Asia who held out long after World War II was over, a few Bernie supporters are vowing to stay in the jungles fighting the good fight. Some are going with the Green party. The Guardian quotes one of them: “I just really strongly believe that you should always vote your conscience.”

She is voicing what Max Weber called an “ethic of conviction.” In “Politics as a Vocation” (1919), Weber distinguished between that ethic and an “ethic of responsibility.” Conviction, as the name implies (at least in this English translation*), carries a religious certainty about ultimate values. Those who operate solely on an ethic of conviction refuse to compromise those values. How could conscience let them do otherwise? They remain faithful to their values regardless of the actual consequences in the shorter term.  Weber quotes the maxim, “The Christian does rightly and leaves the results with the Lord.”

By contrast, those guided by an ethic of responsibility pay attention to the immediate effects of an action or policy. “One has to give an account of the foreseeable results of one's action.”

These two ethics seem contradictory. Yet, Weber says, those who engage seriously in politics must blend these two seemingly incompatible orientations.

The ethic of conviction and the ethic of responsibility are not absolute opposites. They are complementary to one another, and only in combination do they produce the true human being who is capable of having a “vocation for politics.”


Max Weber, meet Sarah Silverman (2016): “Can I just say to the ‘Bernie or Bust’ people: you’re being ridiculous.”

------------------
* The German term, Gesinnungsethik, has been translated as “Ethic of ultimate ends,” “Ethic of single-minded conviction,” “Ethic of absolute conviction or pure intention,” “Ethic of principled conviction,” and “Ethic of intention.”

The Social Fox Construction of Reality

August 2, 2016
Posted by Jay Livingston

Why do they love Trump? I often lean to the psychological explanation that many of the Trumpistas are voting with their emotions. It’s not about policy. It’s not about what might be best for their own lives or the country. They like Trump because he gives voice to their thoughts and feelings, especially thoughts and feelings that are unacceptable in the current cultural orthodoxy. Trump expresses their many resentments – against cultural elites, against economic elites, against immigrants, against the poor – especially the Black and Hispanic poor – against government.

I’m not quite ready to abandon that view, but Mark* at the blog West Coast Stat View (here) has a simpler explanation. Voting for Trump is rational. It is realistic – that is, it is consistent with reality. Mark doesn’t explicitly use the term “bounded rationality,” but he’s getting at something similar.  A decision may seem irrational to others (“What was he thinking?”); we ourselves may find it irrational looking back on it (“What was I thinking?”). But at the time, it  made sense given the available information and our ability to process that information. The problem was not with our rationality but with the boundaries that limited what we could see. It was consistent with our reality.

The important question in voting then is “Which reality?” And the answer Mark gives for Trump voters is: “the reality of Fox News.”

The title of Mark’s post is “Explaining Trump in Four Words,” and the four words are, “Republicans believe Fox News.”

Here is Mark’s summary of that reality:

  • Global warming is a hoax
  • The government and the media are hostile to Christians
  • Food-stamp recipients live on steak and lobster
  • While America is the most taxed nation in the world
  • The financial crisis was caused by government policies that required loans to be made to poor minority members
  • The 2008 election was probably stolen
  • President  Obama's birth records are possibly fraudulent, the product of a massive cover-up
  • President Obama is certainly anti-American
  • As are most Democrats
  • Voter fraud is rampant
  • Islamic terrorists are on the verge of major attacks on Americans
  • America is in decline

A few hours after the West Coast Stat View post appeared, Trump told an interviewer, “You have radical Islamic terrorists probably all over the place, we’re allowing them to come in by the thousands and thousands.” A dozen Pinocchios or Pants-on-Fire ratings from fact-checkers won’t matter. Those thousands of terrorists streaming into the US are a reality for Trump and for his followers.

---------------------------
* I’m not on a first name basis with Mark. I don’t know him at all. But I’ve searched WestCoastStatView and cannot find his full name or that of his co-blogger Joseph. Which is too bad because they run an excellent blog.

Race, Voting Laws, and an Old Joke

July 30, 2016
Posted by Jay Livingston

A federal appeals court yesterday overturned North Carolina’s new Voter ID laws. The judges unanimously agreed that the laws “target African Americans with almost surgical precision.”

The law, of course, did not mention race at all. Neither did its historical antecedents – laws that required a poll tax or a literacy test. Or the secret ballot.  Yes, as I was surprised to learn, the secret ballot too, in its early days, was supported and used as a tool to suppress the votes of minorities.

The voting that we now take for granted – privately marking a ballot provided by the government – did not become standard in the US until well into the 19th century. The first presidential election where secret ballots predominated was the election of 1896, which, not coincidentally, was the first year without an election-day killing. Earlier in the century, voters got their ballots from newspapers – this was back in the day when newspapers were highly partisan – and brought them to the polls. These tickets were long and brightly colored – a different color for each party – and the only way to keep them secret was to fold them up and put them in your pocket. But that was considered unmanly.

Some of the first secret-ballot laws were passed in the 1880s in states where women had won the right to vote – Massachusetts, New York – and wanted to be protected from public scrutiny and possible harm. The other states that went for the secret ballot at this time were in the South. This was the post-Reconstruction era, the era of Jim Crow laws. The secret ballot also had support from Northern states that wanted to suppress the immigrant vote.

Historian and New Yorker writer Jill Lepore, interviewed by Terry Gross on NPR’s “Fresh Air” back in February, explained:

LEPORE: It was a way to disenfranchise newly-enfranchised black men. None of them knew how to read – they’d been raised in slavery, lived their entire lives as slaves on plantations. The real success of the secret ballot as a national political institution had to do with the disenfranchisement of black men.

GROSS: So the secret ballot was a way of helping them get the vote. [Note that Terry Gross is so thoroughly modern that she doesn’t grasp what Lepore is saying.]

LEPORE: No, it was preventing them from voting. If you could cut your ballot out of the newspaper and knew you wanted to vote Republican, you didn't have to know how to read to vote. Immigrants could vote. Newly-enfranchised black men in the South could vote. It actually was a big part of expanding the electorate. But people in the North were like, hey, we don't really like when all those immigrants vote. And people in the South were like, we really don't want these black guys to vote. There were good reasons for the secret ballot too. [It was] very much motivated by making it harder for people who were illiterate to vote. It’s essentially a de facto literacy test.

[I have edited this for clarity. The full transcript is here.]
                       
If you could not read, you could still clip your ballot out of the newspaper whose views matched your own and put it in the ballot box. But if you had to go into a booth and choose candidates from a printed ballot, you were lost.

But what if some Blacks might be able to read the ballot?

Some counties in Virginia in the 1890s print some regular ballots. But then they print ballots in Gothic type – like, deep medieval Gothic type. And they give all those ballots to the black men. It’s a completely illegible ballot.



There’s a joke that I first heard during the campaign for voting rights in the 1960s. [The version I heard included a word too offensive these days to use casually, so I have censored it.]

On election day in Alabama, a Black man shows up at the polls. “Sure, you can vote, boy,” the poll watchers tell him, “but you know, you got to pass the literacy test.” The Black man nods. “Can you read, boy?” He nods again. The poll watcher hands him a newspaper – the Jewish Daily Forward.



“Can you read it, boy?”

“Well,” says the Black man, “I can’t read the small print, but I can read the headlines.”

“Yeah? What’s it say?”

“Schvartzim voten nisht in Alabama hay yor.”

The official title of the North Carolina law when it was proposed was

An act to restore confidence in government by establishing the voter information verification act to promote the electoral process through education and increased registration of voters and by requiring voters to provide photo identification before voting to protect the right of each registered voter to cast a secure vote with reasonable security measures that confirm voter identity as accurately as possible without restriction, and to further reform the election laws.


The shorter title was

Schwartzim ain’t votin’ in North Carolina this year.


Honor and Politics at the RNC

July 28, 2016
Posted by Jay Livingston

So far, the speakers at the Democratic convention have seemed much nicer than their Republican counterparts. The Republicans reveled in demonizing, insulting, and humiliating the people they disagree with. Trump, of course, is the shining example, with his insulting names for his opponents. But that style is just a nastier variation on a theme that runs through Appalachia and the South – honor.

It’s not just Trump and the Trumpistas.

Maybe it was because I’d just been reading Honor Bound, Ryan Brown’s new book about “honor culture,” that I paid attention to this clip that Guardian reporter Ben Jacobs tweeted from the Republican convention. Ted Cruz had just spoken at a contentious breakfast meeting with the Texas delegation. Trump had already sewn up the nomination, and many Texans resented Cruz’s refusal to endorse Trump. Here’s a screenshot. For the full 28-second video, click on the link.

The delegate on the left, a Cruz backer, tells the bearded delegate (Steve Toth), “You’re a coward.”

“I’m a Texan,” says Toth.

“No, you’re a coward,” says the other.

The Cruz supporter makes a reference to Trump’s statements to the effect that Cruz’s wife is ugly and that Cruz’s father was in cahoots with Lee Harvey Oswald.  “If he said that about your wife or your dad, I hope you’d do the same thing. I hope you’d have some character to stand for your family.” A few seconds later, Toth responds to the earlier accusation, “You’re calling me a coward, sir.”

If you’d asked actors to improv Southern honor culture in thirty seconds, you couldn’t get much better than this. The central element in the culture of honor is reputation. From that, the rest follows:
  • Hypersensitivity to insult, especially insult to one’s reputation and character or that of one’s family
  • Chivalrous defense of women (so long as those women are loyal)
  • Value on group loyalty
  • Formal politeness
  • Willingness to use violence to defend that reputation. (This does not make an overt appearance in this clip, but I could easily imagine that “You’re calling me a coward, sir,” being followed by, “Them’s fightin’ words.” Similarly, the Cruz supporter is implying that when a man scurrilously insults your family, you don’t then make deals with him. You challenge him. You fight him.)
The “coward, sir” line nicely embodies the aspect of the Southern culture that Dov Cohen calls “the paradox of politeness.” Cohen, along with Richard Nisbett, contributed much of the early thought and research on honor culture. Some of their experiments tested how men would react to a person who was being annoying and rude. Northerners showed their anger earlier on and increased their anger as the provocations continued. Southerners remained polite. . . up to a point. But when that point was reached, as Ryan Brown puts it, “Southerners went ballistic. Their reactions were so extreme, in fact, the researchers decided the study should be shut down.”

Honor culture extends beyond personal interactions. Its ethos gets written into laws and policies. The most obvious examples are gun laws and stand-your-ground laws. States and regions where honor culture runs deepest are least likely to restrict guns and most likely to permit their use against other people. The arguments favoring these laws are always about protecting what’s yours – your life, your property, your family – even when you might safely retreat.


Those arguments rarely mention protecting your reputation and honor. And even in Texas, if you shoot a man for calling you a coward, you’re probably going to wind up in prison. But jurors, judges, and parole boards there might be more sympathetic than those in places where people are less burdened by the idea of honor.

Less Policing, More Crime?

July 25, 2016
Posted by Jay Livingston


Does crime go up when cops, turtle-like, withdraw into their patrol cars, when they abandon “proactive policing” and respond only when called? The Ferguson effect is a variant on this idea. It adds the reason for the police retreat into “reactive policing” – criticism from citizens and politicians, usually touched off by the police killing of an unarmed person.
   
The Ferguson effect is a corollary of another idea – “Broken Windows” policing. That policy is based on the idea that if police do not enforce laws on minor “quality of life” offenses, serious crimes will increase.

The Ferguson effect has been blamed for increases in homicides and shootings in Chicago, Baltimore, and perhaps other cities. In New York too, the police, angry at the mayor, drastically cut back on “Broken Windows” policing starting in early December of 2014. The slowdown lasted through early January. This change in policing –less proactive, more reactive – gave researchers Christopher M. Sullivan and Zachary P. O'Keeffe a natural experiment for looking at the effects of Broken Windows.

First of all, they confirmed that cops had indeed cut back on enforcing minor offenses. In the graphs below, the yellow shows the rate of enforcement in the previous year (July 2013 - July 2014) when New York cops were not quite so angry at the mayor. The orange line shows the next year. The cutback in enforcement is clear. The orange line dips drastically; the police really did stop making arrests for quality-of-life offenses.


Note also that even after the big dip, enforcement levels for the rest of the year remained below those of the previous year, especially in non-White neighborhoods.
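The year-over-year alignment behind these graphs is straightforward to reproduce in outline. Here is a sketch, assuming an incident-level CSV with a date column and an offense-category column – both column names are my inventions, not the NYPD’s. The same alignment works for reported crimes as for arrests.

```python
# A sketch of the year-over-year comparison, assuming a file of
# incident-level records with "date" and "category" columns
# (hypothetical names).
import pandas as pd

df = pd.read_csv("nyc_enforcement.csv", parse_dates=["date"])
qol = df[df["category"] == "quality_of_life"].copy()

# Label each record by its July-June year, then count by ISO week
# so the slowdown weeks line up against the same weeks a year before.
qol["span"] = qol["date"].apply(
    lambda d: "2014-15" if d >= pd.Timestamp("2014-07-01") else "2013-14")
qol["week"] = qol["date"].dt.isocalendar().week

weekly = qol.groupby(["week", "span"]).size().unstack("span")
print(weekly.loc[48:52])  # the December 2014 slowdown weeks
```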

Sullivan and O’Keeffe also looked at reported crime to see if the decreased enforcement had emboldened the bad guys as the Ferguson hypothesis would predict. The dark blue line shows rates for the year that included the police cutback; the light blue line shows the previous year.


No Ferguson effect. The crime rates in those winter weeks of reduced policing and after look very much like the crime rates of the year before.

It may be that a few weeks is not enough time for a change in policing to affect serious crime. Certainly, Broken Windows theorists would argue that what attracts predatory criminals to an area is not a low number of arrests but rather the overall sense that this is a place where bad behavior goes unrestrained. Changing the overall character of a neighborhood – for better or worse – takes more than a few weeks.

I have the impression that many people, when they think about crime, use a sort of cops-and-robbers model: cops prevent crime and catch criminals; the more active the cops, the less active the criminals. There may be some truth in that model, but the New York data shows that the connection between policing and crime is not so immediate or direct.


---------------
Sullivan and O’Keeffe have written up their research in the Monkey Cage section of the Washington Post website (here). I have copied their graphs. I do not know if their work has been published in any peer-reviewed journal.

Police-Speak, Again

July 21, 2016
Posted by Jay Livingston

A recent post (here) noted that police departments often resort to contorted and vague language rather than say that a cop shot someone. “An officer-involved shooting occurred.”
The Washington Post this morning has this story.



The man sitting up is autistic. He wandered away from his assisted living facility. The Black man lying on the ground is a therapist there and was trying to bring him back. The police showed up, heavily armed. The Black man lay on the ground, hands raised, and tried to tell the autistic man to do the same. He also shouted to the cops that the autistic man was holding a toy truck, not a gun.


One of the cops shot the Black man. Or as the statement from the North Miami PD put it,

“At some point during the on-scene negotiation, one of the responding officers discharged his weapon, striking the employee of the [assisted living facility].”

As someone (OK, it was me) tweeted, “I discharged my weapon striking the sheriff, but I did not discharge my weapon striking the deputy.”

Language is one of the less important aspects of this incident, but the other important details have not yet been reported. We do know that the bullet hit the man in the leg, and that the police handcuffed him and kept him on the ground, still bleeding, according to the Post, for twenty minutes.