Take My Rug – Please

August 31, 2016
Posted by Jay Livingston

Two couples at dinner. The check comes.
“We’ll get it,” says one man.
“OK, thanks,” says the other.

Of course, that’s not what happens. The other person is supposed to protest, to make the same offer to pick up the whole tab. Then comes the negotiation.

I’m not concerned here with how these things get resolved or the basis of the moves and countermoves. My point is just that we’re not supposed to take that first offer at face value. It might be sincere, or it might not. That’s what’s to be discovered in the subsequent negotiation.

When it comes to this generosity gambit, Iranian culture is way ahead of us. In Farsi, it’s known as tarof. I first heard about it on a 2011 episode of “This American Life.” An Iranian-American woman, Nazanin Rafsanjani, explains.

It’s basically this social custom of never saying what you want and offering things to people that you may or may not really want to give them. . . An uncomfortable thing that would often happen to me growing up is that I’d have all these American friends coming over to our house, and it always makes me slightly uncomfortable when someone’s at my parents’ house and they compliment my parents on something.

Because the custom is, if you go to my parents’ house and you say, like, “That’s a beautiful painting on the wall,” they’ll offer it to you. They’ll just be like, “Take it. It’s yours. It’s not good here anyway. It would look better in your house. Take it. It’s not worth anything to us. It’s much more important that you have it.”

Which brings me to the “Modern Love” column in the Styles section of the New York Times a few Sundays ago. It’s by Sharon Harrigan, presumably not Iranian. She begins:

The nicest thing I own is the first thing you see when you walk into my house: a red handmade rug bought in Tehran, haggled over in Farsi and delivered, in person, to the Brooklyn apartment of the man who would become my husband.

Back then, James told me the woman who gave him the rug, a woman he had recently dated, was by then “just a friend.”


I’ll skip the friends-become-lovers-become-married story (you can read it here) and jump cut to the next appearance of the rug, later in the narrative but earlier in chronology, when the author first sees it.

Weeks later, he buzzed me up to his apartment. The door opened to reveal the most beautiful rug I had ever seen, so finely woven it was more like a tapestry. The kind of precious object that could be ruined by a few stray Cheerios crumbs.

“It’s a gift from a friend,” he said. “She bought it when she visited her family in Iran.”

“She’s trying to get you back,” I said.

“What? She’s just being kind. Don’t you love it?”

It’s possible that she’s right – that the old girlfriend is trying to get him back. It’s possible that he’s right – that the old girlfriend was just being kind. But – and I’m just guessing here, and I could be very mistaken – it’s also possible that the Iranian girlfriend, the one who haggled over the rug in Farsi, was tarof-ing her ex.  And he, not knowing any better, accepted the offer.

In the “This American Life” episode, the American producer does ask Nazanin Rafsanjani what happens if someone does just accept the offer. Rafsanjani answers: “I don't know. I mean, that just never happens.” But she means that it never happens between Iranians. They know how the game is played. But if the other person is a tarof-ignorant American, maybe an Iranian woman winds up losing her beautiful rug.*

------------------------------------

* In the episode, which you should really listen to (here) or read the transcript of (here), Rafsanjani describes how tarof happens even in retail sales. The storekeeper offers the merchandise for free.  “Take it,” etc. The customer of course refuses and insists on knowing how much it costs. At some point in this negotiation, the storekeeper names a price. Now the sides switch: the storekeeper, who had been saying that the goods were nearly worthless and that the customer should take them, asks a price invariably far more than the goods are worth, and the customer must haggle the price down.

Trickle-down Culture – As American As Pad Thai

August 26, 2016
Posted by Jay Livingston

Resentment against cultural elitists holds a prominent place in the populist energy driving Trump supporters. Mainstream conservatives have been playing this card since way back in the game with playground taunts like “brie-eating, chardonnay-drinking liberals.” In fact, this animus against elite culture may be what divides the pro-Trump and never-Trump conservatives, at least those who babble publicly in the media. Ann Coulter and Sean Hannity imagine themselves to be one with what another Trump supporter, Sarah Palin, called “the Real America” united against the cultural snobs.

But as Paul Krugman (here) points out, we’re all cultural elitists now. Or rather, what was once elite culture has gone mainstream.

But most of all, this kind of punditry, while ostensibly praising the Real America, is in fact marked by deep condescension. One pats the simple folk on the head, praising their lack of exposure to quinoa or Thai food — both of which can be found in food courts all across the country. Sorry, but there are no country bumpkins in modern America.

Even as recently as the early 2000s, part of the liberal stereotype mocked by conservatives was “latte-sipping.” Now NASCAR dads might well have a chai latte or venti in the cup holder of their pick-up. That didn’t just happen. Starbucks spent a lot of money opening outlets and spreading the word.

The same is true of Thai food. Americans didn’t wake up one morning with a craving for pad thai and green curry. Matt Yglesias links to an article in The Economist.

In a plan ambitiously called Global Thai, the government aims to boost the number to 8,000 by 2003. This, it is argued, will not only introduce deliciously spicy Thai food to thousands of new tummies and persuade more people to visit Thailand, but it could subtly help to deepen relations with other countries.

In the United States at least, . . .  the number of Thai restaurants has grown from 500 in 1990 to more than 2,000 now [i.e., 2002]  . . .  More modestly, the Thai government aims to make it easier for foreign restaurants to import Thai foods, to help them to hire Thai cooks and sometimes to benefit from soft loans. It has been much encouraged that Tommy Tang, a Thai chef working in the United States, has said that he plans to open 200-300 Thai restaurants there during the next five years.

Sometimes popular tastes change seemingly without anyone with a vested interest pushing things along, as when names like Barbara and Ashley go out of fashion, and Olivia and Ava become all the rage. In other areas, an entire industry – clothing for example – depends on its ability to convince people to follow the latest fashion. With food, there’s a complicated interaction between person-to-person influence within the population and a strong push from the outside by players who have a stake in the outcome. I don’t know about quinoa, but thanks in part to the efforts of the government of Thailand, Thai food may be on its way to becoming as American as pizza.*



*As a food becomes more popular, restaurateurs in the US who want more customers will find ways to make it more palatable to Americans, probably by toning down the spices and ramping up the sweetness. That’s the cue for elitists to look down on The Olive Garden and other “inauthentic” foods. (Pad thai is to Thai cuisine roughly what chop suey is to Chinese.) The politically correct will decry the cultural appropriation in Hawaiian pizza or a college food court version of banh mi. I know: cultural appropriation, bad; Asian fusion, good. But sometimes it’s hard to tell the difference.

Lenny and Me

August 25, 2016
Posted by Jay Livingston
(No sociology, just –  to borrow Chris Uggen’s term –  self-indulgery.)

Leonard Bernstein was born on this day in 1918.

Earlier this summer, I was walking around Tanglewood on a weekday. The Koussevitsky music shed – the open-air concert venue – was deserted, so I walked up onto the stage and stood on the conductor’s podium where decades earlier I had seen Bernstein stand and conduct. (I’ll spare you the photo. I was not wearing my tux.) But that was not the first time our paths – Lenny’s and mine – had crossed.

In the early 1950s, Bernstein was a visiting professor at Brandeis. No doubt he felt at home in a department that was eager to go beyond the bounds of what traditional music departments did.

Some years later, when I was a sophomore at Brandeis, I had a campus job in the music building. A few days a week, at 5 p.m., I would play records (this was long before Spotify, long before CDs) for the students in the Opera course. I mean, I would play the records if any students came to the classroom in the music building, which they rarely did. I think a couple may have come when the topic was Don Giovanni; that’s the only reason I can think of that I have some familiarity with “Madamina.” We never got beyond Act I.

The classroom had a piano at the front for the instructor to use – a Baldwin baby grand – and sometimes I would sit there and do my inept version of playing piano. I’d never had a lesson, and I played a sort of jazz by ear. (I recall that Horace Silver’s “St. Vitus Dance” was one of the tunes I was trying at the time.) One day late in the semester, I noticed a small metal plate, about 2" x 3", attached at the right edge of the piano above the keyboard. I read it. It said something like, “This is the piano that Leonard Bernstein learned to play on as a child, and is donated by his parents. . . .” I played that piano more frequently for the rest of the semester.

Here’s the Bill Evans solo version of “Lucky to Be Me,” from “On the Town” (1944 – i.e., when Lenny was 26). Evans takes some liberties – modulating the last eight bars to A♭ instead of F the first time through. But Bernstein’s own chord changes on the bridge are incredible, as is the melody – very chromatic and hence not easy for singers.



Soundtrack of the Zeitgeist

August 22, 2016
Posted by Jay Livingston

Yesterday’s post was skeptical about a link between pop culture and the Zeitgeist. I questioned whether a change in the content of fiction or film or other products of the imagination reflected important social change. Still, when done well, Zeitgeist explanations often sound plausible.

Chris Molanphy knows a lot about popular music. He has a feature at Slate called “Why Is This Song No. 1?” where he gives a super-hit of the day a full physical exam. The performer, the producer, the studio, the way the song evolved musically, the market, the audience, the distribution – all the elements internal to the music business and the creation of songs come under his stethoscope. (See his take on “Love Yourself” or Drake (“One Dance” and “Hotline Bling”) or any of his other pieces here.)

Molanphy also appears in a regular segment on The Gist, where he and host Mike Pesca turn back the pop-music calendar to a single year. Historical hindsight allows them to align the hits with forces outside of the music itself – politics, the economy, the general spirit of the time. When you’re looking backwards, the temptation to go full-out Zeitgeist is irresistible.

Here they are discussing 1999.

PESCA: What’s the case for its importance, what’s the case for its artistic value, what’s the case for 1999?

MOLANPHY: The case for 1999 is that this is American-led pop at its absolute apex. This is the height of the Clinton era writ large in pop music. It’s empire America at its peak, very candy colored, very Total Request Live, very Britney Spears “Baby One More Time” even. . . and surprisingly a lot of artists who were hits in this TRL era of pop have proved quite enduring. Britney Spears . . . Back Street Boys . . . Christina Aguilera . . . Nsync.

If the first half of the nineties was all about rather grim-faced music – it was about grunge, it was about gangsta rap – this is the candy colored half of the 1990s.

PESCA: This is the Clinton expanded economy, the Internet, this is the years of peace and prosperity, this is the pre-9/11, pre-wakeup-call, good time.

MOLANPHY: If you watch that year’s VMAs. . .  All of the artists that you see on this list of number ones are there. Britney Spears is there, Christina Aguilera is there, the Back Street Boys are there, and all the Latin Pop stars. Ricky Martin is there. . . You see a culture that feels like the good times are going to last forever. The dot-com era is at its height, Clinton’s in the White House, unemployment’s at five percent, everybody’s got money to burn, and the good times are here again. We know what happened two years later, but we know that 1999 is a very neon colored bright and shiny year, and for that I have an odd sort of nostalgia. [Emphasis added. Listen to the entire episode here].                            


Pesca even implies that the “good time” Zeitgeist of 1999 somehow knew what would happen two years later, when it would give way to a bad-times mood descending upon the country. “This is the pre-9/11, pre-wakeup-call, good time.” To paraphrase Yogi Berra, prediction is easy, especially about the past.

Sometimes the producers of pop culture do try to come up with songs or movies or TV shows that align with the Zeitgeist as they perceive it. Usually, that means copying the most recent big success.  So we get a wave of superhero movies or doctor TV shows. (There are probably equivalents in music; I just don’t know them.) Sometimes it works; often it flops. As the screenwriter William Goldman famously said, “Nobody knows anything.” Including culture analysts who write about the Zeitgeist.    

Take My Zeitgeist, Please

August 21, 2016
Posted by Jay Livingston

So many plays today, on and off Broadway, are small, character-driven dramas, the kind that let the actors show their chops. Jesse Tyler Ferguson in “Fully Committed,” Jeff Daniels and Michelle Williams in “Blackbird.” Perhaps America in the post-Bush era is becoming more inward looking, more cautious about external adventures, more attentive to problems at home. I mean, that’s the kind of bullshit interpretation favored by some op-ed writers and bloggers. They take some trend in popular culture as a reflection of an all-encompassing spirit of the times (in German, Zeitgeist). And why not? After all, popular culture is by definition popular. It must strike a responsive vibration in the psyches of lots of people.

Ken Levine has a different take. He’s a sitcom writer (Cheers, Frasier, The Simpsons . . .), who blogs (every day!) mostly about the entertainment industry. Working in the biz, he is highly sensitive to the non-Zeitgeist constraints on what does and doesn’t wind up in the cultural stream.

He wrote recently (here) about “Fully Committed,” which he saw on Broadway. He must have gotten to the theater early because apparently he read all of the Who’s Who bios in the Playbill – offstage people too, not just the cast. Maybe that’s what people in the biz just do, read the entire Playbill. Or maybe it didn’t take him all that long to read the bios for the entire cast (n = 1), so he kept reading. The bio for the writer of the show (Becky Mode) reminds Playbill readers that since 2001, this play has been “one of the ten most produced plays in the United States.” Wow. Is this ranking a clue to the Zeitgeist? Does the popularity of “Fully Committed” reflect a 21st-century concern with full commitment? Or with trendy restaurants?


Not according to Ken Levine. He thinks it’s about the economics of theater.

It’s one actor, one desk, and two phones. It also must be one of the ten cheapest plays to produce in the United States. The actor gets quite a workout, but still, it’s very doable. Especially if a theatre is planning its season and has another play that requires say...actual costumes.

The theatre scene is really run today on a tight budget. . . . The requirements today (unless you’re Tony Kushner or Tom Stoppard) are this: No more than four actors, preferably one set or just a few props that can suffice for a set, and not a lot of wardrobe or effects. I feel bad for us playwrights because that severely limits the kinds of plays we can write . . .


This reminded me of Wendy Griswold’s classic 1981 article about American novels in the 19th and early 20th centuries. Some culture analysts saw in them a “feminization” of American culture starting in the 1890s. Before then, American novels were more about masculine and uniquely American concerns (think Moby Dick, Huck Finn, Last of the Mohicans). The more sentimental novels read by Americans (mostly women) came from British authors, not Americans. But towards the end of the century, American writers began to pay more attention to domestic matters.

The feminization idea is consistent with other trends in American society. But Griswold shows that the change had much less to do with a shift in the Zeitgeist than with the enforcement of international copyright laws. Prior to 1891, American publishers did not have to pay royalties to a foreign author. They could reprint titles by British writers very cheaply (a copy of A Christmas Carol, which cost the equivalent of $2.50 in England went for six cents in the US). American authors were fully capable of writing sentimental fiction, but publishers preferred the cheaper imports. American novelists turned their efforts to subjects and genres where British writers couldn’t compete  (think Moby Dick, Huck Finn, Last of the Mohicans). Then, once copyright laws guaranteed royalties on both sides of the Atlantic, British “feminine” fiction lost its economic advantage, and publishers issued more and more sentimental work by American authors.

I don’t know. Maybe the spirit of the times in the US did change in the late 19th century, with religion and middle-class women feminizing the culture. Searching for the Zeitgeist is a game anyone can play. Or you can take Deep Throat’s advice and follow the money.

Ideology and Memory

August 16, 2016
Posted by Jay Livingston

Political ideology shapes what we see and what we consider important, as I’ve blogged recently (here and here). Ideology also skews what we remember and how we remember it.

The worst terrorist attack on this country happened on Sept. 11, 2001. George W. Bush had taken office nine months earlier on Jan. 20, 2001. Yesterday, Rudy Giuliani said, referring to Bush’s two terms, “Under those eight years, before Obama came along, we didn’t have any successful radical Islamic terrorist attack inside the United States.” Here’s the video.



He is not the only one to make this mistake. Bush’s former press secretary Dana Perino left the White House at the end of Bush’s term and took a job at Fox News, where in 2009 she told viewers, “We did not have a terrorist attack on our country during President Bush’s term.”  (A video is here. Push the slider to the 0:35 mark.)

I do not think that Giuliani and Perino are deliberately lying. It’s just that their political views have prevented them from seeing or remembering the facts. The belief that George W. Bush effectively prevented terrorist attacks does not square with the fact that the attacks of 9/11 happened when Bush had been in office for nine months. If the facts don’t fit the belief, too bad for the facts. They are no match against the need for cognitive consistency.

What is striking about the Giuliani/Perino view is how widespread it is. I have long thought that one of the great public-relations achievements of the Bush administration was its ability to create the impression that the attacks happened on someone else’s watch. Many people seem to believe that it was someone else’s fault, though they never get around to thinking who that might be. Maybe Obama.

Even today, few people publicly blame the Bush administration for being asleep at the switch. That is certainly true of Giuliani. He loves to recount his reaction on that day.

At the time, we believed that we would be attacked many more times that day and in the days that followed. Without really thinking, based on just emotion, spontaneous, I grabbed the arm of then-Police Commissioner Bernard Kerik, and I said to him, “Bernie, thank God George Bush is our president.”

The Bush-Cheney administration had been in office for nine months, getting regular reports from its terrorism experts like Richard Clarke warning of Al Qaeda, reports that Bush-Cheney discounted. Clarke, when he heard the news on the morning of Sept. 11, said to himself, “Al Qaeda.”
Rudy Giuliani said, “Thank God George Bush is our president.”

Given his public commitment to Bush, Giuliani could not very well publicly acknowledge any facts suggesting that Bush was at all responsible for the attacks. It seems that he cannot even acknowledge those facts to himself. And so he winds up making a statement that is so obviously wrong the video instantly flies around the Internet (or at least around the leftward territories). 

A Boy Named Sue Ashley

August 12, 2016
Posted by Jay Livingston

“Has anyone here ever seen the movie ‘Gone With the Wind’?” I ask my class during a discussion of names. “Do you remember that there was a character named Ashley Wilkes?” I say. “That role was played by Leslie Howard.”


Most students have not seen GWTW, and they are surprised to learn that Leslie Howard as Ashley Wilkes is the one on the left. They know that Leslie can be a boy’s name, though it’s mostly for girls. But Ashley? Yes, Ashley. Until about 1939 (the year “Gone With the Wind” was released), Ashley was unknown as a name for girls. As a name for boys it was not common – most years, fewer than 10 per 100,000 – but it was not weird, certainly not among Southern gentry.


Then starting around 1950 and accelerating in the 1960s, Ashley took off among girls, followed by a smaller boom among boys. (The y-axes on the graphs are not the same scale. Male Ashleys at their peak in 1980 numbered only about 40 per 100,000. In the peak year for girls, the rate was nearly 700 per 100,000.)
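For anyone curious where rates like “40 per 100,000” come from, they can be computed from the Social Security Administration’s public baby-name files, which list name, sex, and count for each year. Here is a minimal sketch; the sample counts below are made-up numbers chosen to echo the 1980 figures described above, not actual SSA data.

```python
# Sketch: compute per-100,000 name rates from (name, sex, count) records,
# the format of the SSA's yearly "yobYYYY.txt" files.
from collections import defaultdict

def rates_per_100k(rows):
    """rows: iterable of (name, sex, count). Returns {(name, sex): rate per 100,000 births of that sex}."""
    totals = defaultdict(int)   # total births per sex
    counts = defaultdict(int)   # births per (name, sex)
    for name, sex, count in rows:
        totals[sex] += count
        counts[(name, sex)] += count
    return {k: 100_000 * c / totals[k[1]] for k, c in counts.items()}

# Hypothetical 1980-style numbers: male Ashleys near their ~40 per 100,000 peak.
sample = [("Ashley", "M", 40), ("Michael", "M", 9_000), ("Other", "M", 90_960),
          ("Ashley", "F", 700), ("Jennifer", "F", 5_000), ("Other", "F", 94_300)]
rates = rates_per_100k(sample)
print(round(rates[("Ashley", "M")]))   # → 40
print(round(rates[("Ashley", "F")]))   # → 700
```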

Boys’ names becoming girls’ names is not unusual. Nameberry has a list of more than sixty names that have “morphed from blue to pink.”  The reverse almost never happens. Leslie is a good example. Until 1940, it was rare among girls, fairly common for boys. Up until about 1910, it ranked in the top 100 names for boys.


In the mid-1940s, Leslie became increasingly popular for girls, increasingly unpopular for boys. These contrasting trajectories suggest a sort of “there goes the neighborhood” effect. As girls move in, boys move out. Eventually the name becomes thought of as feminine, and parents no longer consider it fit for boys.

Kelly follows a similar pattern. For boys, the name is unusual; for girls it’s unheard of.


Then, around 1950, the number of boy Kellys triples in a decade, though those numbers are still relatively small – only in its peak year, 1968, does it break into the top 100, and then just barely, at #97.  But following the boys by ten years or so, girl Kellys come on strong.  From ranking 904th in 1950, Kelly rose in popularity so that by 1966 she was in the top 20, where she remained for another fifteen years. The gender tipping point came in the late 1960s. Kelly became a girl’s name, and parents of boys stopped choosing it.

The unusual thing about Ashley is that it reverses this pattern. The increased popularity for boys follows the girl Ashley boom by about ten years. That is, a small but increasing number of parents continued to name boys Ashley even after the name had become established as a name for girls.

Despite this exception, the unwritten rule of naming seems to be that you can give a girl a predominantly male name; she and her name will still be accepted. You might even be in the vanguard of a trend, like the parents in the late 1940s who named their daughters Ashley. But you can’t send a boy out into the world with the name Sue.                                        

Males are more constricted by norms of masculinity than are females by the norms of femininity. And not just in naming. Girls will do boy things, but the reverse is less common. It’s more acceptable for a girl to be a “tomboy” than for a boy to be a “sissy.”  Girls will watch movies targeted at boys, but boys shy away from girl-centered films. Among adults as well, women give favorable evaluations to TV shows targeted at men,  but men are less able to appreciate shows outside their narrow band of interest. (Walt Hickey at FiveThirtyEight thinks men are “sabotaging” women’s shows by giving them low ratings.) 

The same is true in fashion, where women can choose from a wider variety of colors and styles, including those usually for men. Men’s choices  are more constrained. Men will not wear skirts, but women will wear pants and even pants suits, an item of clothing I mention only as a cheap way of getting to one final name.


It follows the usual pattern – a male name, albeit an uncommon one, declining in popularity, crosses over and becomes a name for girls. Its popularity increases rapidly. Up to a point. That point was 1993. Hillary was doing fine before that, but then for some reason, parents of daughters were no longer with her.

Who’s Shameless?

August 11, 2016
Posted by Jay Livingston

How can Donald Trump, with his 39 Pinocchios from Fact Checker, continue to make false and outrageous claims? How could he denigrate the Gold Star parents of a Muslim US soldier killed in Afghanistan? Why has he no sense of shame?

Trevor Noah, interviewed on Ezra Klein’s podcast, suggested that it started with bankruptcy. For most people, declaring bankruptcy is a matter of shame. It is a public admission of failure. But for a business, it’s not really so bad. American bankruptcy laws allow business persons to pick themselves up, dust themselves off, pay their creditors and suppliers a fraction of what they are owed, and start all over again. Which is what Trump has done at least four times. Even if he might have felt a slight touch of shame the first time, it quickly wore off in subsequent bankruptcies. Trump the businessman might have taken a financial hit, but Trump the public person suffered no loss of social standing.

Before looking for other explanations – surely they must be out there – I wanted to see how widespread the image of Trump as shameless is. So I went to Google.



Nearly 700,000 hits. The difference between him and other politicians must be huge. For comparison, I tried the Democratic nominee.



Hillary, by this measure, is not quite so shameless as the Donald, but 500,000 seemed like a lot. Then again, her opponents could reel off a list of scandals dating back to her days in Arkansas. I tried a few successful presidential candidates.



Obama and Bush were not so far behind. The toll was high even for Ronald Reagan and Jimmy Carter, who served before shouts of “shameless” could be echoed around the Internet. Besides, Reagan and Carter, whatever you thought of their policies, seemed like decent human beings. Yet their quotient of “shameless” pages runs to hundreds of thousands. I confess I am ignorant of the ways of the Google algorithm and what those numbers actually reflect. Still, nearly half a million seems like a lot.

Maybe this is not about the politicians themselves. It’s about reactions to those politicians, especially in a polarized polity. Partisans strongly committed to their own point of view often believe that those who disagree with them are acting in bad faith. (See this earlier post about politics and perception.) They think that their own views are so obviously valid and true that a person who sees things otherwise must be denying reality and deliberately lying. These denials and lies are so blatant, so transparent, that most people would be ashamed to utter them. Who could say things that they know are factually and morally wrong? The politician who is shameless. But the shamelessness may be mostly in the eye of the beholder.

Weber at the DNC

August 3, 2016
Posted by Jay Livingston

Like those Japanese soldiers in Southeast Asia who held out long after World War II was over, a few Bernie supporters are vowing to stay in the jungles fighting the good fight. Some are going with the Green party. The Guardian quotes one of them: “I just really strongly believe that you should always vote your conscience.”

She is voicing what Max Weber called an “ethic of conviction.” In “Politics as a Vocation” (1919), Weber distinguished between that ethic and an “ethic of responsibility.” Conviction, as the name implies (at least in this English translation*), carries a religious certainty about ultimate values. Those who operate solely on an ethic of conviction refuse to compromise those values. How could conscience let them do otherwise? They remain faithful to their values regardless of the actual consequences in the shorter term.  Weber quotes the maxim, “The Christian does rightly and leaves the results with the Lord.”

By contrast, those guided by an ethic of responsibility pay attention to the immediate effects of an action or policy. “One has to give an account of the foreseeable results of one's action.”

These two ethics seem contradictory. Yet, Weber says, those who engage seriously in politics must blend these two seemingly incompatible orientations.

The ethic of conviction and the ethic of responsibility are not absolute opposites. They are complementary to one another, and only in combination do they produce the true human being who is capable of having a “vocation for politics.”


Max Weber, meet Sarah Silverman (2016): “Can I just say to the ‘Bernie or Bust’ people: you’re being ridiculous.”

------------------
* The German term, Gesinnungsethik, has been translated as “Ethic of ultimate ends,” “Ethic of single-minded conviction,” “Ethic of absolute conviction or pure intention,” “Ethic of principled conviction,” and “Ethic of intention.”

The Social Fox Construction of Reality

August 2, 2016
Posted by Jay Livingston

Why do they love Trump? I often lean to the psychological explanation that many of the Trumpistas are voting with their emotions. It’s not about policy. It’s not about what might be best for their own lives or the country. They like Trump because he gives voice to their thoughts and feelings, especially thoughts and feelings that are unacceptable in the current cultural orthodoxy. Trump expresses their many resentments – against cultural elites, against economic elites, against immigrants, against the poor – especially the Black and Hispanic poor – against government.

I’m not quite ready to abandon that view, but Mark* at the blog West Coast Stat View (here) has a simpler explanation. Voting for Trump is rational. It is realistic – that is, it is consistent with reality. Mark doesn’t explicitly use the term “bounded rationality,” but he’s getting at something similar.  A decision may seem irrational to others (“What was he thinking?”); we ourselves may find it irrational looking back on it (“What was I thinking?”). But at the time, it  made sense given the available information and our ability to process that information. The problem was not with our rationality but with the boundaries that limited what we could see. It was consistent with our reality.

The important question in voting then is “Which reality?” And the answer Mark gives for Trump voters is: “the reality of Fox News.”

The title of Mark’s post is “Explaining Trump in Four Words,” and the four words are, “Republicans believe Fox News.”

Here is Mark’s summary of that reality:

  • Global warming is a hoax
  • The government and the media are hostile to Christians
  • Food-stamp recipients live on steak and lobster
  • While America is the most taxed nation in the world
  • The financial crisis was caused by government policies that required loans to be made to poor minority members
  • The 2008 election was probably stolen
  • President Obama's birth records are possibly fraudulent, the product of a massive cover-up
  • President Obama is certainly anti-American
  • As are most Democrats
  • Voter fraud is rampant
  • Islamic terrorists are on the verge of major attacks on Americans
  • America is in decline

A few hours after the West Coast Stat View post appeared, Trump told an interviewer, “You have radical Islamic terrorists probably all over the place, we’re allowing them to come in by the thousands and thousands.” A dozen Pinocchios or Pants-on-Fire ratings from fact-checkers won’t matter. Those thousands of terrorists streaming in to the US are a reality for Trump and for his followers.

---------------------------
* I’m not on a first name basis with Mark. I don’t know him at all. But I’ve searched WestCoastStatView and cannot find his full name or that of his co-blogger Joseph. Which is too bad because they run an excellent blog.