Curiouser and Curiouser

January 25, 2009
Posted by Jay Livingston

The Curious Case of Benjamin Button – I saw it Saturday night, and the thing I found most curious was the passivity of the hero.

In case you hadn’t heard, the film is about a man who is born old and wizened and ages in reverse. As the years pass, his body grows ever more youthful, while his mind grows older in the usual way.

The film contains three stories:
  1. A love story – Benjamin and Daisy. Fated to become lovers, and when they finally get together, they know that their love is doomed. Daisy will grow older while Benjamin physically becomes a child.

  2. US History 101. The film paints Benjamin’s life, and Daisy’s, against the broader canvas of historical events – from Armistice Day to Hurricane Katrina – with some notable omissions, like the Depression.

  3. Mother-daughter. Daisy, dying in a New Orleans hospital in the present, has her daughter read Benjamin’s diary to her. The Button story is told in these flashbacks. The mother satisfies her nostalgia, but the daughter is angry. “This is how you let me know who my father really was?” Or words to that effect. It was the only real dramatic conflict in the movie.
Through it all, Benjamin is strangely passive, especially for an American hero. Most leading men in US films don’t express much emotion, except anger. But they are usually men of action. (The trailers that preceded Button were full of guys chasing, shooting, fighting, blowing things up. Even the young women in the chick-flick trailer (Bride Wars) were slugging it out.) American protagonists take steps, tackle problems, compete, outwit, outfight, etc. Benjamin, however, drifts along on the waves of history. He winds up in a naval battle, but as the bullets fly, we see him mostly lying on the floor of the tugboat while the boat ultimately destroys the German submarine.

He is passive with women as well, including the love of his life Daisy. He does go to Paris in pursuit of her, but when he finds that she has a boyfriend there, he’s very willing to take no for an answer and goes back to New Orleans. Years later, Daisy shows up and asks him to sleep with her. Here we finally see Benjamin as an active young man, riding a motorcycle, piloting a sailboat, making love. But these years, the late 1950s and the 60s, fly by in a nearly wordless montage that takes up only a few minutes in a film that lasts well over two and a half hours.

In the rest of the movie, Benjamin moves through life with a homey fatalism.
Along the way you bump into people who make a dent on your life. Some people get struck by lightning. Some are born to sit by a river. Some have an ear for music. Some are artists. Some swim the English Channel. Some know buttons. Some know Shakespeare. Some are mothers. And some people can dance.
No surprise that the screenwriter is the guy who wrote Forrest Gump. What is surprising – no, curious – is that these protagonists who passively observe life rather than trying to change it are the center of highly regarded American films – films that get nominated for Golden Globes and Oscars.

The Inaugural II - Just Another Word

January 24, 2009
Posted by Jay Livingston

Barack Obama used the word freedom three times in his inaugural speech. Presidents often invoke values in their inaugurals, so I would have thought we’d hear more about freedom. I guess my baseline expectations had been raised during the Bush years. George W. used the word 27 times in his second inaugural. That was unusually high. But three is not unusually low. By historical standards, it’s about average.

I had been thinking of freedom as one of those eternal American values. But that’s not the picture that emerges from the chart of inaugural speeches.



Freedom seems to be mostly a word of the post-War era. Several earlier inaugurals use the word not at all – among them both of Washington’s, both of Lincoln’s, and FDR’s first two.

Freedom is also favored more by Republicans than by Democrats. Combining all post-War Republicans and all post-War Democrats, we get
  • Democrats – 2.4 uses of freedom per 1000 words
  • Republicans – 4.4 uses of freedom per 1000 words
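The per-1000-words rate behind those bullets is easy to compute. Here is a minimal sketch; the sample text is a made-up stand-in, not an actual inaugural transcript (real inaugurals run well over a thousand words, so their rates come out far lower than this toy example's).

```python
# Sketch: occurrences of a target word per 1,000 words of a text.
import re

def rate_per_1000(text, target="freedom"):
    """Return uses of `target` per 1,000 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == target)
    return 1000 * hits / len(words)

sample = "Freedom is not free. We defend freedom at home and abroad."
print(round(rate_per_1000(sample), 1))  # → 181.8 (2 hits in an 11-word toy text)
```

Run that over each post-War inaugural, average by party, and you get the Democrat/Republican comparison above.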

The Inaugural I - Talking ’Bout Generation

January 22, 2009
Posted by Jay Livingston

Here’s a Wordle of President Obama’s inaugural speech. (And by the way, how does it feel to you to say that phrase, “President Obama”?)
(Click on the Wordle to see a larger version.)

The word that is strikingly present here in comparison with other inaugurals is generation. Ronald Reagan used the word not at all in his first inaugural and only once in his second – a call to protect future generations from government spending.

Other presidents have spoken of generations, but the word usually appears as part of the unity-of-history theme. The inaugural is a ritual, and rituals exist in sacred time, a time that links the present with the past. So inaugurals often refer to America “across the generations” and to our obligation to future generations.

But over this continuity-of-generations line, some presidents sound a different theme – the theme of generational change. The most notable and most quoted version is JFK’s “the torch has been passed to a new generation of Americans.” Kennedy saw that new generation as already formed. He pointed to their shared experiences – “born in this century, tempered by war, disciplined by a hard and bitter peace, proud of our ancient heritage”– and the shared values that emerged from those experiences – “unwilling to witness or permit the slow undoing of those human rights to which this Nation has always been committed.”

Obama, by contrast, sees himself and the generation that was such a crucial factor in his campaign with some uncertainty. It’s not about what they already are, it’s about what they will become. And that depends on how they respond to the crises that the previous generation has dumped on them. We are living in “a moment that will define a generation.”

Inaugural - The Benediction

January 21, 2009
Posted by Jay Livingston

I’m not much of a connoisseur of religious speech (did anyone else notice – how could you fail to notice – Obama’s shout-out to “nonbelievers”?), but I thought Rev. Lowery’s benediction closing the inaugural was perfect. OK, maybe a bit too long. But what a finish. It stayed right on topic, a serious topic, but still provided a needed smile at the end of an hour, a day, a two-year campaign of gravity and high drama.



(Full text here.) Even if you don’t listen to the whole thing, drag the time button to 4:30 and listen to the last thirty seconds. And look at Obama and the others with him smiling.
help us work for that day when
black will not be asked to get in back
when brown can stick around
when yellow will be mellow
when the red man can get ahead, man
and when white will embrace what is right.
That all those who do justice and love mercy say Amen.
There’s a weak “amen” from the crowd, so he repeats the call twice. And you get the sense that hundreds of thousands of people on the mall and millions of people across the country were saying “amen.”

I watched the inauguration in a classroom full of undergraduates. They were all attentive. I didn’t hear any chatting, and I didn’t see anyone texting on a cell phone. Most of them filed out after the speech, so there weren’t too many of us left in the room when Rev. Lowery spoke. But I’d bet that none of the students who had been in the room with me knew what he was talking about in those last lines. Too bad.

Listen for yourself.

Is War Hell?

January 20, 2009
Posted by Jay Livingston

Ann Coulter got one right. Sort of. She takes the New York Times to task for a recent article on Iraq veterans who have committed murder. (Full Coulter column here.)
The Treason Times' banner series about Iraq and Afghanistan veterans accused of murder began in January last year but was quickly discontinued as readers noticed that the Times doggedly refused to provide any statistics comparing veteran murders with murders in any other group.
She’s right about the lack of data. She’s also right that by focusing on anecdotal evidence and not using rates, the Times appears to be deliberately promoting the crazed-war-veteran stereotype.

Coulter, on the other hand, is arguing that among things that drive people to murder, a year or two patrolling the streets of Baghdad is no worse than life in these United States. Is she right?

Coulter provides some comparative stats.
From 1976 to 2005, 18- to 24-year-olds -- both male and more gentle females -- committed homicide at a rate of 29.9 per 100,000. Twenty-five- to 35-year-olds committed homicides at a rate of 15.8 per 100,000.
The Afghanistan war started in late 2001, Iraq in 2003. But Coulter uses data spanning 1976 to 2005. Using data from the Iraq war era (2003-2008) would give a somewhat lower figure, no higher than 27 per 100,000. Ideally we would adjust that by age, sex, race, and region to make it comparable to the demographics of the army.

The crucial question is: what is the rate of homicide among Iraq war veterans? To answer that, we need to know how many veterans there are and how many murders they committed. Not easy.

The Times cites 121 murders by Iraq vets, but The Times’s research on “homicides involving all active-duty military personnel and new veterans” turned up “349 cases . . . about three-quarters of which involved Iraq and Afghanistan war veterans.” And those are just the ones the Times found by searching through court records and newspapers. So 270 is a minimum estimate. Considering that the Times included the years starting with the Afghanistan invasion of late 2001, it works out to about 40 per year.

That’s the numerator. What about the denominator?

How many veterans? Coulter gives the number of troops who have served as 1.6 million, a very high-end estimate. John Hinderaker, a conservative who launches grenades at the Times article from PowerlineBlog, proposes less than half that. “For the sake of argument, let's say that 700,000 soldiers, Marines, airmen and sailors have returned to the U.S. from service in Iraq or Afghanistan.”

But should we count all of them? The war-crazed-vet hypothesis is concerned with the psychological effects of combat and the daily exposure to death, mutilation, and danger. Should we count the airmen and sailors? Should we count soldiers who serve in some support capacity and never see battle or go out on patrol?

We also need to know not just the total number of returned vets; we need to know the number for each year. That 700,000 number is cumulative. There were certainly not 700,000 returned troops in 2002 or 2003.
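Plugging the post’s rough figures into the rate formula shows just how much the answer swings on the denominator. This is strictly back-of-envelope: the ~40 homicides per year comes from the Times’s minimum count, and both veteran totals are cumulative, not per-year, so these numbers illustrate the sensitivity rather than estimate a real rate.

```python
# Back-of-envelope: implied homicide rate per 100,000 veterans per year,
# using the figures quoted in the post. Both denominators are cumulative
# totals, so these are illustrations of the denominator problem, not rates.
homicides_per_year = 40  # ~270 cases over ~7 years (late 2001 through 2008)

for label, n_vets in [("Coulter (1.6 million)", 1_600_000),
                      ("Hinderaker (700,000)", 700_000)]:
    rate = 100_000 * homicides_per_year / n_vets
    print(f"{label}: {rate:.1f} per 100,000 per year")
```

Cutting the denominator in half more than doubles the implied rate, and restricting it to troops who actually saw combat would raise it further still, which is exactly why the choice of denominator matters.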

So Ann Coulter is right, not in what she says but in the implications of what she says: to see if war is hell and whether that hell has lasting consequences on those who go there, we need good data. The trouble is that we don’t have it.

Winners

January 18, 2009
Posted by Jay Livingston

In some places these days, there’s more than one reason to celebrate.


(For those who don’t recognize this image – could there be such? – it’s Mike Tomlin, head coach of the Pittsburgh Steelers. For more on the link between the broader appeal of the Steelers and the Democrats, see this post from the early days of the Socioblog.)


I don’t know much about copyright law, but I imagine that the Obama campaign wouldn’t have wanted to copyright “Yes we can” even if they could. As for the visual, I guess you can’t copyright a style, a look, or a technique. Besides, Shepard Fairey, who created the Obama picture, says he’s not interested in enforcing the copyright, at least not against those who are using it for a worthy cause. And Mike Tomlin and the Steelers are certainly worthy.

The graphic was created by CommonWealth Press, a printing company on Pittsburgh’s South Side. If you want a t-shirt (and of course you do), go here. Pittsburghers take the Steelers seriously. The fifteen highest rated TV broadcasts of 2008 in the Pittsburgh market were fifteen Steeler games, with a 44.5 rating and a 66 share.

Privilege and Invisibility

January 17, 2009
Posted by Jay Livingston

It has been months since I felt the need to scream with a blood-curdling cry at some commie, partisan subordinate (i.e., most of the [Voting] section staff until recently). And I feel like the people I now work with are all complete professionals. What a weird change. Granted, these changes are nice in many respects, but bitchslapping a bunch of [Division] attorneys really did get the blood pumping and was even enjoyable once in a while. I think now it's all Good Cop for folks there. I much preferred the role of Bad Cop. . . . But perhaps the Division will name an award for me or something. How about the Brad Schlozman Award for Most Effectively Breaking the Will of Liberal Partisan Bureaucrats. I would be happy to come back for the awards ceremony.
That’s a memo (June 2006) from Brad Schlozman, a Justice Department official.

The Bush administration tried to turn the Justice Department into a wholly-owned subsidiary of the Republican National Committee. That’s obvious to everybody. Well, almost everybody. What’s interesting is that those most responsible for politicizing Justice seemed to think that they were being anti-political. Schlozman seems to have seen his hiring policies as getting rid of politics, taking Justice out of the hands of partisans and returning it to “real Americans.”

There’s a broader lesson here: Privilege – of race, gender, class, ideology, or anything else – works best when it’s invisible. As soon as people become aware that some groups enjoy privileges denied to others, the game is half over. To maintain their position, the privileged groups will now have to resort to obvious forms of power. It’s much easier if the system goes unquestioned.

Also, those who benefit most from privilege are usually the last to notice it. They cling to the idea that the system is neutral. Things that work in favor of the dominant group are “natural.” It’s only those who point out the privilege who are playing politics. For example, the Bush tax cuts, in the Republican view, were right and good – letting people keep their own money. To point out that the tax cuts disproportionately benefitted the wealthy was to engage in “class warfare.” Similarly, although nearly all the people Schlozman hired had ties to the Republican party or conservative groups, he saw himself as getting rid of “partisan bureaucrats” and replacing them with “complete professionals.”

We May Have Disagreed With Him on Iraq, the Environment, Torture, Tax Cuts . . . But

January 16, 2009
Posted by Jay Livingston

President Bush was not especially popular among college professors, but apparently in his final days in office, he's trying to change that.


(Full story here.)

Separated at Birth?

January 13, 2009
Posted by Jay Livingston

Chris Uggen, in a blog post on “doppelgangers,” says that he was surprised to find that there are four people who share his name.

Chris is stretching the meaning of doppelganger. It’s not about names. It usually means “any double or look-alike of a person.” The Wikipedia entry adds that seeing one’s doppelganger can be a portent of danger.

One of the co-nominals Chris finds is an orthopedic surgeon. But if Chris wants to find a real doppelganger, he should try looking in the kitchen. Of course, that ominous portent might make for a kitchen nightmare.


(That’s Chris on the left, multi-starred chef Gordon Ramsay on the right. Or is it the other way round?)

"They" Write the Books

January 12, 2009
Posted by Jay Livingston

Here’s a radical idea: textbooks are written by people. I know, it doesn’t seem alien to us academics. But in students’ thinking, textbooks and teachers represent two completely separate and different spheres. Teachers – you can take them or leave them. Textbooks are absolute and unimpeachable if often impenetrable.

Kieran Healy, in a comment at Scatterplot, recounts an exchange between a student and a professor who had just offered some ideas from a paper he was working on. The student was skeptical.
S: You mean you’re just making it up?
P: Well, in a sense, yes. But in another more important sense, no.
S: I’m not comfortable with going beyond the textbook like this.
P: Where do you think the stuff in the textbook comes from? Out of the ground in Nebraska or something?
Well, yes. To my students, the origin of textbooks is a matter of mystery and awe. The texts might be handed up out of the Nebraska ground or handed down from a sacred mountain. In either case, human authorship is out of the question. This, despite our constantly referring to books not by their titles but by their author’s name. (“For Monday, read chapter four in Newman/Stark/Tischler/Macionis/Whoever.”)

To students, the author of all textbooks is not someone with a name. It’s “They.” “They” is a windowless fortress-like factory in some remote location, spewing out books that students are forced to buy. “They” produce chemistry books, sociology books, economics books – just about everything on the bookstore shelves for course readings.

I had a vague sense of the width of this chasm, in students’ perceptions, between textbooks and teachers. But I didn’t fully catch on until one year when I was teaching criminology and used the textbook I myself had written. Several weeks into the semester, a student had a question about some point I was making or some data I was presenting. I don’t remember the topic or the issue. All I remember is that the student said, “But didn’t they say . . .” and she went on to offer some bit of information.

“They?” I asked, “What they?”
“In the book. Didn’t they say that . . . .” she repeated the information.
“They is me,” I said. “I wrote that book.”

She seemed genuinely stunned, and I sensed that many in the class shared her confusion. The book was a school textbook; therefore it must have been written by the same “They” that churned out all textbooks. Yet here was someone they knew, a very ordinary person they saw two or three days a week, claiming to have written the book, and the evidence on the cover seemed to support his claim.

I don’t think they ever truly resolved the dissonance.

I'm a Sociologist and I'm O.K.

January 8, 2009
Posted by Jay Livingston

Being a sociologist may not be the best job in the world. But, hey, eighth out of 200 ain’t bad.
That’s the news from The Wall Street Journal (I’ve always said the WSJ was a wonderful publication). On Tuesday, the Journal released the results of an evaluation by CareerCast.com of two hundred jobs.

(For a larger version, click on the image.)

Oh, sure, you could be a welder down at #194 on the list, or a lumberjack (#200), and you might be O.K. But if you’re playing the percentages, and if you can’t quite do the math, you could do a lot worse than sociology as a career.

Why did we do so well? CareerCast’s criteria were
  • environment
  • income
  • employment
  • outlook
  • physical demands
  • stress
Here's a bit of what a statistician (#3) might call anecdotal evidence.
Mark Nord is a sociologist working for the Department of Agriculture's Economic Research Service in Washington, D.C. He studies hunger in American households and writes research reports about his findings. "The best part of the job is the sense that I'm making some contribution to good policy making,” he says. “The kind of stuff that I crank out gets picked up by advocacy organizations, media and policy officials.”

The study estimates sociologists earn $63,195, though Mr. Nord, 62, says his income is about double that amount. He says he isn't surprised by the findings because his job generates little stress and he works a steady 7:30 a.m. to 4 p.m. schedule. “It's all done at the computer at my desk,” he says. “The main occupational hazard is carpal tunnel syndrome.”

On the opposite end of the career spectrum are lumberjacks. The study shows these workers, also known as timber cutters and loggers, as having the worst occupation, because of the dangerous nature of their work, a poor employment outlook and low annual pay -- just $32,124.

Hat tip: Chris Uggen

Drug Pushers

January 7, 2009
Posted by Jay Livingston

I’m in Florida visiting my mother. When I go to make out a shopping list, I notice that every pad of paper, every post-it, every pen, carries the logo and name of some drug – bounty from visits to doctors.

But that will all stop now. The drug companies aren’t admitting that the freebies they lavish on doctors have tainted the practice of medicine. But even though there’s nothing wrong with treating doctors to these gewgaws or the sandwiches for the office staff and nice dinners for the docs, Big Pharma has vowed to stop it.

NEW RULES FOR 2009
The Pharmaceutical Research and Manufacturers of America's new rules for interacting with health care professionals:
• No branded pens, pads, mugs or other “leave behinds”
• No tickets to theaters or sporting events.
• No “dine and dash” meals dropped off in doctor's offices.
• No expensive dinners out at restaurants.

Big deal. I doubt that the mugs and pens were swinging much weight. These are doctors, after all. They’re not going to toss you a Manhattan-worth of drug sales for $24 worth of trinkets. No, the real problem is the money for speeches and “research.” Not only are these fees much heftier; more important, the speeches and research don’t look like corporate PR. They keep the large hand of the drug company hidden behind the curtain. There’s no sales rep pushing a product. Instead, it’s one doctor informing other doctors, colleague to colleague.

This money from Big Pharma corrupts the practice of medicine, and it corrupts research. Take a look at the article by Marcia Angell in the current New York Review of Books, (ungated, at least for now, here). Here’s the money quote.
It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or authoritative medical guidelines. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as an editor of The New England Journal of Medicine.

An Idea Whose Time Has Come

January 5, 2009
Posted by Jay Livingston

Men – the gang that can’t shoot straight.

Long ago, I blogged (here) about public men’s rooms, particularly the mess caused by men peeing outside the box. Here was the stuff of culture. The American approach to unwanted behavior is typically to define it as sin and punish it, and I suggested (only half facetiously) that our first impulse would be to try to curb splashing by levying heavy fines.

The Dutch by contrast have a less morally absolutist approach. They see unwanted behavior as a problem to be solved. Instead of posting signs threatening punishment, they gave men a target to shoot at – a fly painted on the porcelain – and spillage decreased by 80%.

But American culture, in addition to its moralism, also has a healthy streak of pragmatism – a practical concern with results. Nothing succeeds like success, and apparently news of the effective insect has crossed the Atlantic.

I was at JFK Airport last night (the teenager formerly in residence was himself about to cross the Atlantic), and in the men’s room, every urinal still had its central fly. I had noticed something similar at the Newark airport last summer, though with a slight variation. JFK gives shooters a realistic fly to aim at. Newark uses a cartoon-like bee (a realistic bee might trigger a counterproductive startle and flinch).

(For a larger picture, click on the image.)

No doubt, other painted-on insects are coming soon to an airport urinal near you.

UPDATE, January 6. The target idea hasn't caught on everywhere. I'm blogging this from the new Jet Blue terminal at JFK, where the urinals have no small creatures in them, real or painted.

The Wisdom of Crowds vs. The Smart Money

January 3, 2009
Posted by Jay Livingston

It’s NFL playoff season, and here at the Socioblog, that means it must be time for The Wisdom of Crowds vs. The Smart Money.

The idea behind the “wisdom of crowds” is that the average guess or prediction of a large number of people will be more accurate than that of a few experts (“the smart money”). I looked at these principles two years ago, in a series of posts (here, here, and here).

Today’s game between the Atlanta Falcons and the Arizona Cardinals seems to pit the public against the insiders. The opening line had the Falcons as 3-point favorites, and the public money came in on the Falcons. Normally, the bookmakers would try to balance their books and get an equal amount of money on both sides. To encourage more people to bet on the Cardinals and discourage Falcon bets, they would raise the point spread. Falcon backers might think twice if they had to give up 3 ½ or 4 points rather than 3.

But that’s not what happened. Instead, the line went down. On Friday, Cardinal bettors were getting only 1 ½ or 1 point. Apparently, in addition to the public money coming in mostly on the Falcons, the bookies also got “smart” money on the Cards. The oddsmakers were responding not to the amount of money but to the source. If the smart money was on the Cardinals, they would lower the line to encourage the public to bet on Atlanta.

So today’s game offers a clear choice. If you believe in the wisdom of crowds, you’ll follow the public and bet the Falcons and be happy that you have to give up only a point or so. If you think the smart money is smart, you’ll bet the Cardinals. (But remember, the smart money bet early and got the Cards plus 3 points; you might only get one, or none. Even so, my money’s on the Cards.)

UPDATE Saturday Evening.
The money must have kept coming in on the Cardinals, because by game time they were favored by as much as 2 ½ points. The Cardinals won the game 30-24. The smart money got it right.
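The spread arithmetic in the update can be sketched in a few lines. This is a minimal illustration (the function name is mine, not standard bookmaking terminology): a bet on the favorite wins only if the favorite’s margin of victory exceeds the spread.

```python
# Sketch: which side of a point-spread bet wins, given a final score.
def covers(favorite_pts, underdog_pts, spread):
    """spread = points the favorite gives up. Returns the winning side."""
    margin = favorite_pts - underdog_pts
    if margin > spread:
        return "favorite"
    if margin < spread:
        return "underdog"
    return "push"  # landed exactly on the spread; bets are refunded

# Cardinals 30, Falcons 24, with the Cards favored by 2.5 at game time:
print(covers(30, 24, 2.5))  # → "favorite" — Cardinal backers win
```

Note how much the closing line mattered: the early “smart money” that took the Cards plus 3 points would have won even if Arizona had lost by a field goal.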

Sociology Blogger Scoops The New York Times

January 2, 2009
Posted by Jay Livingston

Back in 2008, I wrote about the New York Parking Violations Bureau – particularly their willingness to haggle. Get a ticket and don’t like the high number on the fine, no problem. They’ll offer you a one-third discount. (The blog post is here).

Today, the New York Times finally breaks the story on page one, above the fold.

Wanna Buy a Brett Favre Jersey?

December 29, 2008
Posted by Jay Livingston

After last season, the Jets traded away Chad Pennington in order to get the future Hall of Fame quarterback Brett Favre. Favre then spent the latter half of the season leading the Jets out of the playoffs.

Stephen Dubner of Freakonomics writes what purports to be a post about “deadweight loss” and the inefficiency of gifts but is really a disappointed fan’s angry kvetch. In between nasty digs at the great QB (all supported by data), Dubner asks the economic question:

So how do all those people who paid $80 for Favre Jets jerseys feel today? Do they wish they’d spent their money elsewhere? How much would they pay for the same jersey today? Did they derive $80 worth of pleasure from it up to this point — i.e., was the thrill of the first two-thirds of the season worth the pain of the last third?

To answer the question, I checked Craig’s List, and apparently the owners of those Favre jerseys are not rushing to unload them at any cost. I looked in Sporting Goods and Clothing. Here are the results
  • Individuals selling Favre jerseys: 2
  • Asking price: $60
(Three other sellers had the Jets #4, but they were commercial dealers, not disgruntled fans.)

For comparison, I also checked Eli Manning jerseys. (The Giants are in the playoffs, having won their division handily with a 12-4 record.)
  • Individuals selling Manning jerseys: 2
  • Asking price: $60
(Jets fans note: dealers on Craig’s List in Miami are asking $55 for Chad Pennington jerseys.)

Job Search - Parody Version

December 29, 2008
Posted by Jay Livingston

The MLA parodists, Aaron Winter and Andy Warren, are worth a detour. This year, they’ve done letters of application and recommendation.

(Click on the image for a larger view.)


The allusions are literary, but you don’t have to be an insider to get most of them. (I never took Latin, but I’m guessing that the motto of their Riverdale University, Ex limoni ut mellitus, is something like, “When you have lemons, make lemonade.”)


The full file, with links to back issues, is here.

Ideological Purity . . . and Danger

December 28, 2008
Posted by Jay Livingston

Jenn Lena blogged recently about Chip Saltsman, who wants to be head of the Republican National Committee. To woo Republicans, Saltsman released a CD of anti-Democratic song parodies like “Love Client #9” (raise your hand if you remember The Clovers. No, not The Searchers, The Clovers). OK, Spitzer is fair game, and maybe the song is actually funny.

But what’s interesting is how offensive most of the titles are: “The Star-Spanglish Banner,” “Ivory and Ebony,” and (as Dave Barry would say, I am not making this up) “Barack the Magic Negro.”

The CD title is “We Hate the USA.”

I hope the RNC picks Saltsman. It looks like he’s the man to complete the process started by George W. Bush and pushed along briefly by Sarah Palin: turning the Republicans from a majority party into a small cult of the self-righteous.

Their message is, “If you don’t agree with us, you must hate the USA.” Or as Sarah Palin implied, you are not a “real American.” That’s a good strategy for solidifying “the base.” The danger is that it drives away potential adherents. What a contrast with Obama’s message of inclusiveness.

Claiming sole ownership of virtue and truth runs against the American grain. We have an ethic of tolerance for diversity. “My way or the highway” may be O.K. for football coaches, but when national politicians start shouting it, a lot of people discover the attractive features of the broad highway with all those different cars and trucks and buses.

Apparently that includes the highway to Heaven. Most Americans, when asked which religions can “lead to eternal life,” say that the gate of Heaven is open to religions other than their own. By more than 2-1, Americans choose, “Many religions can lead to eternal life” over “My religion is the one, true faith leading to eternal life.”

Even among white evangelicals, despite the message their preachers repeat regularly, a majority thought that Jews and Catholics could make it past St. Peter’s velvet rope. And about a third of white evangelicals thought that Heaven was open to Hindus, Muslims, and people with no religious faith.
(For a larger view, click on the image.)

I draw two lessons from this:
  • People at the top are more ideologically consistent than are the rank and file.
  • The rank and file are more tolerant of diversity than are the leaders.
I suspect that the first is an axiom of organizational theory. After all, the leaders are in the ideology business. They spend a lot of time thinking about it, so they cannot ignore or deny inconsistencies. But the second may be a particularly American variation.

The data are from a Pew report. The New York Times converted Pew’s tables to graphs (including three in addition to the ones above) to accompany a nice op-ed by Charles M. Blow on the Pew study.

Taking a Mulligan on the Economy

December 27, 2008
Posted by Jay Livingston

I often use unemployment when I’m trying to explain the difference between social facts and individual facts. To explain why an individual doesn’t have a job, use individual facts – lack of education, bad work habits, etc. But when the unemployment rate rises by a few tenths of a percent, when hundreds of thousands of people who were working a few months ago are now jobless, we think not about individual characteristics but about “the economy.”

Mills uses this example in The Sociological Imagination, and it’s an easy one for intro sociology students to grasp. But maybe Mills and I are wrong.


Are Employers Unwilling to Hire,

or Are Some Workers Unwilling to Work?

By Casey B. Mulligan

Casey B. Mulligan is an economist at the University of Chicago.

The recent decrease in employment may be due less to employers’ unwillingness to hire more workers and more to workers’ unwillingness to work. . . .

Of course, people have not suddenly become lazy, but the experiment gives similar results to the actual situation in which some employees face financial incentives that encourage them not to work and some employers face financial incentives not to create jobs. [Emphasis added.]

Mulligan must be right. After all, the New York Times is publishing this (on Dec. 24, a Christmas gift to workers), and Mulligan is a professor of economics at Chicago. He must know.

Back in October, the Times published another Mulligan piece saying that “the economy doesn’t really need saving. It’s stronger than we think. . . . If you are not employed by the financial industry (94 percent of you are not), don’t worry. The current unemployment rate of 6.1 percent is not alarming.”

The unemployment rate for November was up to 6.7%, also not alarming, I suppose – just another half million people responding to those incentives not to work.

I always thought that the unemployment rate measured only people who were looking for work. Those who had given up and dropped out of the labor force were not officially “unemployed.” So I’m not sure what Mulligan means by “incentives that encourage them not to work.” Whatever. In any case, in the past year, the number of the officially unemployed in the US has risen by nearly 3 million, bringing the total to 10 million.
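The definitional point here can be made concrete with a toy calculation. This is only a sketch of the official logic, with invented numbers, not BLS data:

```python
# Simplified version of the official definition the post invokes:
# the unemployment rate counts only people actively looking for work.
def unemployment_rate(employed, unemployed_looking):
    labor_force = employed + unemployed_looking
    return unemployed_looking / labor_force

rate_before = unemployment_rate(employed=93, unemployed_looking=7)  # 7.0%

# A discouraged worker who stops looking drops out of the labor force,
# so the measured rate can fall even though nobody found a job.
rate_after = unemployment_rate(employed=93, unemployed_looking=6)   # ~6.1%
```

Which is why "dropping out" and "responding to incentives not to work" are not the same thing in the official statistics.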

That's a lot of people with no incentive to work. But I’m sticking with Mulligan. Not to worry. No cause for alarm. It’s not the economy, stupid.

“Don’t give me a book. I have a book.”

December 25, 2008
Posted by Jay Livingston

To all Sociology bloggers and blog readers.
Merry Christmas


(Tree stolen shamelessly from Magda of Ce que tu piques.)

The subject line for this post – I’m pretty sure it’s Mae West’s response when someone asked her what she wanted for Christmas. But Google though I might, I can’t find it.

A Child Is Born

December 24, 2008
Posted by Jay Livingston

France’s minister of justice is unmarried and very pregnant, yet she still refuses to identify the father of the child. Gossip on the matter includes several prominent Europeans, including the former prime minister of Spain.

I’m not sure how that scenario would play in this country, but I do remember that Bush’s first Attorney General, John Ashcroft, an Evangelical Christian, had the DOJ spend $8,000 for drapery to cover the bare breast on a statue.

But the Evangelical relation to sex and pregnancy is complicated. At first blush, it seems monolithically puritanical – no unmarried sex, no sex education, abstinence pledges. But as the evangelical reaction to Bristol Palin showed, it is also understanding and forgiving. When it was revealed that Palin, seventeen and unmarried, was pregnant, evangelicals were not the first to cast stones. Instead, they seemed to accept the pregnancy as one of those things that just happen. And since Bristol was not going to have an abortion, and since she was going to marry the father, a difficult situation would be resolved for the best. Difficult, but for evangelicals, not at all unusual. Palin’s mother Sarah seems to have taken a similar path (either that or her eldest child was several weeks premature).
As Marlys Popma, the head of evangelical outreach for the McCain campaign, told National Review, “There hasn’t been one evangelical family that hasn’t gone through some sort of situation.”

That’s from “Red Sex, Blue Sex,” by Margaret Talbot in the New Yorker last month. Evangelicals, especially teenagers, face a large gap between values and beliefs on the one hand and behavior on the other. Compared with other teens, they favor abstinence (by a wide margin), fear that partners will lose respect for them if they have sex, and do not anticipate that sex will be pleasurable. Yet on average, they start sex at an earlier age (16) and get pregnant more often. I’m not sure how they handle the cognitive dissonance.

Talbot cites the work of some sociologists (Peter Bearman and Hannah Brückner, Mark Regnerus) on factors that influence whether virginity pledges work – mostly how embedded a teen is in networks (friends, family) that support abstinence. The basic data on abstinence seem to reinforce what should by now be a sociological truism: situational forces matter far more than personal factors like character or statements of intent.

Oh to be in Finland

December 21, 2008
Posted by Jay Livingston

This is just a coincidence, right?

Which countries would you rank the highest in terms of education?
Darling-Hammond: Finland ranks the highest generally across the board.
(From a Newsweek interview with “Stanford professor Linda Darling-Hammond [who] has been the brains behind Obama's education policy over the past year as a lead education advisor on the campaign and during the transition.”)


The [survey] results were combined into an index of . . .“sociosexuality” . . . a measure of how sexually liberal people are in thought and behaviour. Most individuals scored between 4 and 65.
The country with the highest rating was Finland, with an average of 51.

(The London Times reporting on a survey of 14,000 people in 48 countries, a project headed by David Schmitt of Bradley University.)

Sending a Message - But Who's Listening?

December 20, 2008
Posted by Jay Livingston


The Republicans tried to run on symbolic issues – Rev. Wright and Bill Ayers. They cried “country first” and whined that Obama “pals around with terrorists.” You’d have thought that, once elected, Obama was going to make Al Qaeda his chief of staff.

It didn’t work. People voted for Democrats mostly because the Republicans had done so disastrously on real issues – the war and the economy.

Now it’s the left’s turn. Obama chooses Rick Warren to give the Inauguration invocation, and people on the left are up in arms, as though a 30-second prayer were the equivalent of a cabinet appointment.

Symbolic gestures like this appeal to our emotions; they make a difference in how we feel. Symbols are easy to respond to, and the response is often binary. Us vs. them, good vs. evil. Rev. Warren opposes gay marriage, therefore he’s a bad guy.

Policy is different. It’s about what actually happens on the ground, and it’s far more complex. It doesn’t lend itself to Manichaeanism (that’s one of the reasons the Bushies messed up so badly). It doesn’t require emotion, it requires thought . . . and data.

Still, the moralists must insist that symbolic issues are real. They must also claim not just that evildoers are evil, but that “if we don’t fight them over there, we’ll have to fight them here.” Iraq was no threat to the US, but the invasion would “send a message” to the terrorists. War as candy-gram.

Similarly, the anti-Warrenists insist that his half-minute as invocator-in-chief will “send a message” that anti-gay bigotry is all right. As Andrew Perrin over at Scatterplot puts it, “That message will be heard, loud and clear, and it’s quite reasonable to expect that real people’s real lives will really be affected by it.”

Now, I’ve always thought that when someone says, “It is reasonable to expect,” what they really mean is “I have absolutely no evidence to support this.” But Andrew is an honorable man, and presumably he does know of evidence. Still, I’ve been skeptical about send-a-message arguments ever since my days in the crim biz.

Back then, send-a-message was usually a call for harsh sentences in celebrated cases. The death penalty would “send a message” to potential murderers. Long and mandatory sentences would “send a message” to drug dealers, robbers, Enronistas, or whatever evildoer was currently in the headlines. Whatever this week’s crime of the century was, an acquittal or a sentence less than the maximum would send a message that this crime was O.K., a message which would be heard loud and clear, and nobody would be safe.

The trouble was that evidence of actual deterrence was hard to find, and to the extent that punishment does deter, it’s more a matter of increasing the certainty of arrest, not the severity of sentences.

The symbolic messages of celebrated cases make for good TV – the sorts of things Bill O’Reilly types get all riled up about – and they may be morally satisfying. But they have no impact on what people actually do.

If I were concerned about gay marriage, I’d be much more worried about who’s getting out the vote and who’s getting appointed to the judiciary than about who’s praying at the Inauguration.

Deflationary Psychology

December 19, 2008
Posted by Jay Livingston

Are lower prices bad? In Monday’s Times, David Leonhardt explains the dangers of consumer thinking.
There is good reason to fear deflation. Once prices start to fall, many consumers may decide to reduce their spending even more than they already have. Why buy a minivan today, after all, if it’s going to be cheaper in a few months? Multiplied by millions, such decisions weaken the economy further, forcing companies to reduce prices even more.
This seemed reasonable to me. Then I thought about all those digital cameras and flat-screen TVs and computers and flash drives and all other electronic gadgetry. People buy this stuff even though they know that in a few months they’ll be able to get either the same thing for less money or a better version for the same money.

With all the doubt cast recently on economic rationality, it would be nice to have some evidence on what really happens during deflation. Do economists have such evidence, and if so, where did they get that evidence? How many deflationary periods are there for us to sample?

Does consumer spending rise in tandem with inflation? And even if it does, there are two possible explanations. One is the flip side of the deflation mentality Leonhardt mentions: buy it now before the price goes up. The other is that inflation means higher wages, and people with increased incomes feel they have more money to spend.

I should know this, but I don't. Economic sociologists, please speak up.

Fifties Food

December 17, 2008
Posted by Jay Livingston

Jenn Lena has a link to the Gallery of Regrettable Food, a site which looks back at US food a half century ago and asks, “What were they thinking?”

At Sociological Images, eallen has a more thoughtful take. She looks at the ads with recipes for baked bean pizza or broiled spam on canned peaches and chalks up the reliance on canned or prepared food to “the Atomic Age’s fascination with technologically advanced cookery.”

We look back, and we laugh – “Spiffy Then, Hilarious Now” is the title of eallen’s post. Ah yes, we are so superior in what we eat today.

The trouble with this sort of smugness is that its ethnocentrism stops any further sociological thinking. Fifties food was laughably bad. The end. It’s like watching Mad Men and chuckling at the hair styles and habits (smoking, drinking) and boat-like automobiles, and not looking for the less visible structures that shaped work, family, gender, and consumer choices.

A little cultural relativism and conflict theory might be more helpful. Food is fashion, just like clothing. What tastes good, like what looks good, is what’s in fashion. In a few decades, we may look back at Ugg boots and chicken Caesar wraps the way we now look back on poodle skirts and Jello everything.

Also, like fashions in clothing, fashions in food don’t just happen. They are part of history, and they have an industry behind them. The fifties were the post-War era. The Spam and canned peaches were leftovers, left over from the War. More importantly, so was the industrial set-up producing them. These ads are part of the food industry’s effort to create “a peacetime market for wartime foods. . . . factories were ready to keep right on canning, freezing, and dehydrating food as if the nation’s life still depended on it.”

“What the industry had to do was persuade millions of Americans to develop a lasting taste for meals that were a lot like field rations.”

Both quotes are from Something From the Oven, by Laura Shapiro, who also has more than a few words to say about how these food fashions relate to the social constraints on the role of women. It’s kind of embarrassing when the best sociology on a topic is done by a dance critic.

Music and Violence

December 15, 2008
Posted by Jay Livingston

Music hath charms to soothe the savage breast. Or enrage it.

At Gitmo, “harsh interrogations” include Heavy Metal. We’ve known that for a while. Here’s the latest twist:
Musicians are banding together to demand the U.S. military stop using their songs as weapons. . . . groups including Massive Attack and musicians such as Tom Morello, who played with Rage Against the Machine and Audioslave.

For many Afghan detainees - where music was prohibited under Taliban rule - interrogations by U.S. forces marked their first exposure to the rhythms, played at top volume. (New York Newsday, Dec. 10)

Music as torture – good ideas like this are hard to stop from spreading.
A Colorado judge who noticed that many of the people who showed up in his courtroom for violating noise ordinances were repeat offenders has decided to quit fooling around: new offenders may find themselves sentenced to an hour of listening to Barry Manilow or the theme tune from the children's TV show ''Barney and Friends.'' (New York Times, Nov. 28)
(The judge’s cruel and unusual list also included The Platters and The Carpenters.)

In the US the music-and-violence flap has been mostly about rap. But in some places, even easy listening isn’t so easy.
International: Karaoke singer killed after hogging mic

A Malaysian karaoke enthusiast hogged the microphone for so long that he was set upon and stabbed to death.

Karaoke rage is not uncommon, especially in Asia. There have been several reported instances of singers being assaulted, shot or stabbed mid-performance, usually over how songs are sung.

In Seattle last year, a woman with an apparent aversion for Coldplay attacked a singer who had just embarked on a rendition of Yellow.

Frank Sinatra's My Way has reportedly generated such outbursts of hostility that some bars in the Philippines now no longer serve it up on the karaoke menu.


In Thailand this year, a gunman killed eight people after tiring of endless renditions of a John Denver tune.
(The Guardian)

Billy Elliot -- That Was Newcastle, This Is Flint

December 13, 2008
Posted by Jay Livingston

I saw the musical Billy Elliot last night. It sets the world of dance – joyful, hopeful, not quite real – against the grim reality of the 1984 coal strike in northern England. As the program notes say, Thatcher was “determined to crush the unions.” And she did.

At the end of the show, as the strike and the strikers have been broken, Billy’s brother, a miner, tells Billy that when he comes back from Royal Ballet School in London, there will be no more work in the mines up here in the North. In village after village, men will be without work.
“We’re dinosaurs,” he says.

He was right. Before Thatcher, the coal industry employed 300,000. Today it employs fewer than 1,000, and almost all coal burned in Britain is imported.

Despite the magic of theater, I couldn’t quite suspend my thoughts about reality (maybe because I was far away from the stage – next-to-last row, rear mezz). I kept thinking about Detroit and wondering if it was now like Yorkshire, with the US auto industry, now apparently on the brink of extinction thanks to bad decisions and high costs. It’s hard to imagine a world without Ford and Chevy, but then again, in Yorkshire in 1983 it was probably impossible to imagine an England without coal. I wonder if the people who work in the GM plants – Michael Moore’s friends in Flint – are saying to their children, “We’re dinosaurs.”

We don’t know exactly why the real dinosaurs disappeared, but it certainly wasn’t because of government policy. The miners of the NUM, however, had Maggie Thatcher and the Conservative Party, willing to destroy an industry to crush a union. But of course that wouldn’t happen here.

I turned on my computer this morning, and the top story on Google News was a link to the LA Times.
Auto bailout's death seen as a Republican blow at unions
For some Senate Republicans, a vote against the bailout was a vote against the United Auto Workers, and against organized labor in general.

Clearance Rates - Bad News?

December 11, 2008
Posted by Jay Livingston

Crime is news only when it’s bad. Most crime stories are reports of individual crimes, the worse the better. But even when the media report on general trends and statistics, they look for trouble. Good news is no news.

This week, it was the clearance rate for homicide – the percentage of murders where the police made an arrest. “More Getting Away With Murder,” was a typical headline.
Despite the rise of DNA fingerprinting and other "CSI"-style crimefighting wizardry, more and more people are getting away with murder.

FBI figures show that the homicide clearance rate, as detectives call it, dropped from 91% in 1963 - the first year records were kept in the manner they are now - to 61% in 2007. (From the Chicago Sun-Times)
The big decrease in clearance rates accompanied a big increase in murder that began, coincidentally, in 1963.

It wasn’t that a tide of incompetence was washing over homicide bureaus nationwide. The problem was that more of the murders were the kind where it’s hard to know who to arrest. The easy ones are the arguments and fights between family members and acquaintances. But much of the increase in homicide came from killings committed during robberies or between rival drug dealers, and those murders are much harder to solve.

Clearance rates fell from 91% in 1963 to 67% in 1991, the peak year for homicide. Since then, murder rates have declined dramatically. Clearance rates, too, still continued to slide, though less steeply, from 67% to 61%.

That’s the bad news. If you want the good news, look at the actual numbers of cases.


From 1991 to 2006, the number of uncleared murders declined. In 1991, about 8,000 people “got away with murder.” By 2006, that number had decreased to about 6,300. The number of cleared murders also decreased. The real news is that Americans are killing one another far less frequently than they did fifteen or twenty years ago. The clearance rate has decreased because the murders that are easy to solve have decreased more rapidly than the kind that are hard to solve.
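For anyone who wants to see the arithmetic, here is a rough sketch. The totals and rates below are my approximations of the FBI figures, not numbers taken from the post:

```python
# Rough arithmetic behind the point above. All figures are approximations
# of FBI data for illustration, not exact values.
homicides_1991 = 24_700   # approx. U.S. murders, 1991 (the peak year)
clearance_1991 = 0.67     # 67% cleared
homicides_2006 = 17_000   # approx. U.S. murders, 2006
clearance_2006 = 0.61     # approx. clearance rate, mid-2000s

uncleared_1991 = homicides_1991 * (1 - clearance_1991)  # ~8,150 unsolved
uncleared_2006 = homicides_2006 * (1 - clearance_2006)  # ~6,600 unsolved

# The clearance *rate* fell, yet the *number* of unsolved murders fell too,
# because total murders declined faster than clearances did.
```

The rate and the count can move in opposite directions whenever the denominator (total murders) falls faster than the numerator (cleared murders).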

So while “More Getting Away With Murder” has the virtue of appealing to our sense of moral outrage, it has the disadvantage of being untrue.

Dumbing Down

December 6, 2008
Posted by Jay Livingston

Quiz shows on early TV combined big brains and big money. “The $64 Question” of 1940s radio became “The $64,000* Question” of 1950s television. And the questions were difficult – questions you couldn’t possibly know the answer to. Questions that people could get only if they were incredibly smart. Or if the show was rigged, which it was.

“Jeopardy” goes more for questions that many viewers can get. Even the higher-priced questions are the kind that when the contestant gives the answer, you might snap your fingers and think: right, I knew that, and I would have remembered it, too, given a little more time.

Now there’s “Are You Smarter Than a Fifth Grader?” With “$64,000 Question,” you were far below the level of the players. With “Jeopardy,” you’re in the same neighborhood, though probably on a different street.** But Fifth Grader can give you that Jerry Springer sense of superiority (the show is on Fox, not surprisingly).


(When I first read that Sarah Palin had thought Africa was a country, I dismissed it as a canard launched by the snarky, dissatisfied guys in the McCain campaign. Now, I’m not so sure.)

One final thought. Fifth Grader also rests on the idea that children are superior to adults, a theme that suffuses most American movies and TV shows that have children in them (think “Home Alone”). On Fifth Grader, adults cheat off the kids, peeking at their answers or copying them outright.



I got the clip from Funny or Die, thanks to a tip from Wesleying.

* About a half million in 2008 dollars

** Full disclosure: I was a contestant on Jeopardy many, many years ago.

Would It Be Funny in Japan?

December 4, 2008
Posted by Jay Livingston

Chris Uggen posted this New Yorker cartoon on Monday.
The funny thing is, in Japan, nobody would get the joke. To begin with, a Japanese cartoonist probably wouldn’t even consider the idea of choice. So Chris’s “I’m in” comment on the cartoon, which I smiled at, wouldn’t be amusing in Japan. Of course you’re in.

Besides that, in Japan, the idea of work after work isn’t a comic possibility. It’s reality. The distinction between work and after work is much fuzzier, mainly one of setting. You leave the office and go out to a bar, but you’re with the same group of people that you work with. There’s more liquor and less formality, but it’s still the same work group.

The Japanese equivalent of the office party is the nomikai (飲み会), though it’s rarely held in the office. Kai is a general term for get-together, and nomikai is usually translated as “drinking party.” But “drink meeting” might better convey the idea that the drinkers are also co-workers. More to the point, co-workers often go for drinks together as a group, though not at the level of an official nomikai. It’s more like the situation in the cartoon.

Thanksgiving – False Consciousness vs. Solidarity

December 2, 2008
Posted by Jay Livingston

I guess professors at Harvard Law don’t have to read Durkheim.

Jon Hanson, Alfred Smart Professor in Law at Harvard, has a post on “Thanksgiving as System Justification.” I didn’t come across this post till after Thanksgiving, and Hanson posted it for Thanksgiving 2007 (The Situationist reposted it). Still, it’s worth mentioning.

Hanson sees Thanksgiving as an exercise in false consciousness. He doesn’t use that term, but he’s arguing that the message of Thanksgiving is, “Don’t complain, be thankful.” And when people are justifying and giving thanks for a system that’s basically screwing them, that’s false consciousness. By giving thanks for what we have, we are supporting the status quo.

Hanson quotes stuff he’s found on the Internet (I have boldfaced the key phrases):
  • your goal should be to move the spirit of Thanksgiving from a one-day event to a basic life attitude. . . . This means being thankful no matter what our situation in life. . . . Are you thankful for your job even when you feel overworked and underpaid? [implying that if you’re not thankful for your exploitative job, you should be]

  • The Pilgrims recognized that everything we have is a gift from God – even our sorrows.

  • The deeper meaning is that we have the capacity to produce such wealth and that we live in a country that affords us our right to exercise the virtue of productivity and to reap its rewards. So let’s celebrate wealth and the power in us to produce it; let’s welcome this most wonderful time of the year and partake without guilt of the bounty we each have earned.
That last one is a bit political – telling the wealthy and powerful they need not feel any guilt – and obviously written for Republicans. But Thanksgiving is inherently conservative. Its message that we should be thankful for what we have is another way of saying, “Whatever is is right.”

It’s right, as these formulations tell us, because it is the work of God. Or as President Bush said in last year’s Thanksgiving declaration, “We give thanks to the Author of Life . . . who watches over our nation every day.” If God is watching over us every day, things must be O.K.

But Hanson misses the larger, Durkheimian insight: Rituals exist for the benefit of the society (or whatever group stages them). The goal of any ritual is social solidarity, solidarity among all members of the society. Your basic religious ritual, for example, exalts God. But God, as Durkheim showed, functions as a representation of the society. So all rituals are inherently conservative; they idealize and uphold the society as a whole and promote the attachment of individuals to that whole.

The sacred world of ritual may be conservative in this sense, but elsewhere, in the profane world, change happens – change we can be thankful for. I just wonder whether godly conservatives, those who “recognize that everything we have is a gift from God” included the election of Obama as one of those gifts . . . and gave thanks for it last Thursday.

Summers School

November 29, 2008
Posted by Jay Livingston

I don’t know Larry Summers (though my father was a very good friend of his uncle), but I always sensed he was the kind of guy I wouldn’t like even when I agreed with him. And that was before I read this:

Over lunch not long after Summers took over the presidency in 2001, Ellison said, Summers suggested that some funds should be moved from a sociology program to the Kennedy School, home to many economists and political scientists. “President Summers asked me, didn’t I agree that, in general, economists are smarter than political scientists, and political scientists are smarter than sociologists?” Ellison said. “To which I laughed nervously and didn't reply.”

A major critique of Summers by faculty has been that he plays favorites with subject areas.

It’s from a Boston Globe piece in 2006, when Summers, as president of Harvard, was busy alienating much of the faculty.


Hat tip to Henry at The Monkey Cage, who is, I suppose, smarter than the average sociologist, Boo Boo.

Vegans to the Moon

November 26, 2008
Posted by Jay Livingston

One of those strange coincidences. For some reason, Monday morning I was sitting in my office thinking about Jackie Gleason in The Honeymooners, his Ralph Kramden character stomping about the kitchen in anger and frustration. “One of these days, Alice, one of these days . . . . POW. Right in the kisser.” Or he would threaten to hit her so hard, she would go “to the moon.” These were regular laugh-getters.

Here’s a collection of these threats; the first “pow” comes at 6:55 into the clip. [This clip is no longer available, but if you are not familiar with this trope, search for “Honeymooners to the moon.”]



We knew he didn’t mean it. So did Alice, who would respond – unfazed, arms akimbo, scornful – “Sure, Ralph.” (It was the fifties. “Yeah, right” didn’t yet exist.)

Even so, you couldn’t use that “joke” today, I thought. ’Taint funny, McGee.

The coincidence is that the next day, bellelettre at Scatterplot posted a link to a blogpost by a law professor, Neil Buchanan*, who asks, “How quickly can norms change?” Here’s his first example of norms that have changed:
I recently watched a rerun of the 60's sitcom "The Dick Van Dyke Show." The story revolved around a woman who was drawn to a man because he was a mean drunk, bringing out her "maternal" side. The final line of the episode had one character saying to another: "You know what we should do? Go home and hit our wives." Raucous laughter, upbeat theme music, roll credits. It goes without saying that this is shocking to us today.
Buchanan’s other examples, besides domestic violence, are smoking, the environment, and alcohol use / drunk driving. But what’s interesting is that in Buchanan’s version, these attitudes change seemingly by themselves. People just change their minds. Here’s his take on smoking:
The driving force in this social change seems to have been more a matter of deciding who had the right to force other people to do what they wanted. This may have been caused by concerns about suffering, but from my perspective it seemed to be more about attitudes toward public cleanliness. Smoking came to be seen as ugly, not dangerous (which people had known even before the surgeon general's report in the 60's).

Buchanan presents change as a strangely passive phenomenon. There’s no human action/agency involved. Smoking “came to be seen as.” You can’t do “to the moon” jokes anymore “because of a rapid and widespread public acceptance of a new norm.”

Buchanan, who is now a vegan, wants attitudes on veganism to change, and he frames the issue as a matter of the awareness of harm. Attitudes on smoking, domestic violence, and the rest changed because of a similar awareness of harm.

But how do people become aware? I guess law professors don’t know about “moral entrepreneurs.” Anti-smoking groups, MADD, NOW, etc. If veganism becomes more accepted, it will have more to do with the actions of PETA and other groups than with the outcome of Talmudic debates about the certainty of suffering.

*SocProf at Global Sociology pointed out that I incorrectly attributed the post to Dorf when in fact it was written by Buchanan posting on Dorf's blog.

Mirror, Mirror

November 24, 2008
Posted by Jay Livingston

I’m skipping the “Self and Socialization” unit this semester. The real reason is that time thieves have been at work, and the term is about two weeks too short this year. But beyond that, I’ve lost my faith. I realize how wrong I’ve been about some basic ideas. Taking the role of the other, seeing ourselves as others see us, the looking-glass self – what a crock. In fact, people don’t see themselves as others see them, and I’m not just talking about people who are clearly delusional.

A few months ago, I was interviewed for a TV show – a show you’ve never heard of for a network you’ve never heard of, unless you’ve heard of Ebru TV. Weeks later, they sent me a DVD of the show. It was painful to watch myself. Not because I said things that were wrong (though there was some of that too), but because the person in that real video looked and sounded so different from the person in the imaginary video of myself that I carry around in my head.

A looking-glass self? Maybe, but that looking-glass is flat and flattering. That’s why it’s so distressing to look in those triptych mirrors in the fitting rooms. Or to watch yourself on TV. Who was this stiff-looking guy with the ungraceful walk and a much higher forehead than I remember, this guy who looked like my brother (what’s Jack doing in this video?) and not at all like Gregory Peck?

I didn’t sound like Gregory Peck either. I knew that already, but even so, I certainly didn’t hear myself as others hear me. It wasn’t just my voice, which sounds so much more resonant from inside my head than from outside. It was all those verbal tics – “y’know” and “I mean.” I had no idea how often and how unwittingly I utter them.

Maybe the proper question is not how socialization works. The interesting questions are about the discrepancy between the image we have of ourselves and the image others have of us. Why do we so seldom become aware of the discrepancy? And given this discrepancy, how do we manage to sustain social life?