Just Wait Till You’re Older

June 29, 2007
Posted by Jay Livingston

Sometimes when I was an adolescent, I would voice some opinion about teachers or sex or drugs, and an adult would say, “Just wait till you get older and have kids.” And indeed, the opinions of the people in the room often divided along age lines. My peers and I were more liberal than the grown-ups.

In the previous post on age differences in political views, I mentioned the difficulty of knowing whether differences were a matter of age or of generation. If it’s age, then people’s opinions will change as they grow older. But if it’s generational, then the members of that generation will retain their views forever. Imagine today’s twentysomethings in fifty years, still thinking their tattoos are cool, while the youth near them on the beach shake their heads in disbelief.

A survey like the one done by the Times/CBS/MTV shows a cross-section of the population at one point in time. But what we’d really like is “longitudinal” or “time-series” data that can show us what happens over the course of time.

Statistician Howard Wainer has an example that illustrates the dangers of drawing longitudinal conclusions from cross-sectional data. If you did a cross-sectional survey on language development in Miami, you might be tempted to conclude that when Miamians are young, they speak Spanish. As they mature into their middle years, they change to speaking English. And when they get even older, they switch to Yiddish.

Hat tip to my brother Skip for relaying this example.

The Kids Are: 1. All Right, 2. All Left, 3. About the same

June 27, 2007
Posted by Jay Livingston


The New York Times published the results of a poll of young Americans — ages 17 to 29. The headline is “Young Americans Are Leaning Left.” In fact, on most issues, there’s not much difference between the young and the rest of the country, even on issues where you might expect the young to be more concerned, notably global warming.

On the topic of Iraq, the young were more sanguine than their elders, who were more likely to see the war as going very badly (47% vs. 38%).

It’s not that the young are more pro-war. They’re just more optimistic. The only way that this attitude makes them more left-leaning is that while traditional conservatives want less government and think that the private sector can do everything better, young people are more optimistic about the ability of government to do good. Health care is a good thing, and the young — 62% to 47% — are more likely to favor a government-run health care program. Similarly, winning wars is a good thing, so the young are more likely to think we can win the war.

This raises the question of whether these are true generational differences or merely differences of age. If they are age differences, then the Gen Y’ers, as they grow older, will shift their opinions more towards those of older people today. If the differences are generational, they will keep their current beliefs even as they get older and have children.

Tastes in music, for example, are strongly generational — as I’m regularly reminded when the Allman Brothers play at the nearby theater, and I see who’s lined up on the sidewalk. Not exactly the same crowd that turned out for Pink. I suspect that the young will retain some of their political views — their more libertarian view of homosexuality and marijuana (on abortion their views are almost identical to those of the country as a whole) — in the same way that they’ll save and listen to their Outkast MP3s. But other political positions like party preference may change as they grow older.

Methodological note. The poll was sponsored by the Times, CBS News, and MTV. The sample size was 659, which means a margin of error of about 4 percentage points in either direction, or a confidence interval roughly 8 points wide. You’d think that with three heavy hitters like these bankrolling the survey, they’d have had a sample at least twice that size.
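
That margin is just the standard back-of-the-envelope calculation for a simple random sample. Here’s a rough sketch of it in Python, assuming a 95% confidence level and the worst-case 50-50 split:

    import math

    n = 659                                  # sample size of the Times/CBS/MTV poll
    p = 0.5                                  # conservative (worst-case) proportion
    moe = 1.96 * math.sqrt(p * (1 - p) / n)  # 95% margin of error
    print(round(100 * moe, 1))               # 3.8 points either way, an interval about 8 points wide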

Presidents and Prime Ministers

June 24, 2007
Posted by Jay Livingston

The New York Times today publishes an op-ed piece on Tony Blair by British journalist A.A. Gill, who makes a similar point to something I posted yesterday. Interestingly, my point of departure was the scene in “The Queen” in 1997 when Blair is installed as prime minister after a landslide election victory. In a few months, his positive ratings had soared to 75%, unprecedented in British politics. Gill is writing about Blair’s departure from office yesterday with approval ratings of 28%, roughly equivalent to those of George W. Bush.

There’s no pleasing the British, or winning their favor. They simply hate politicians. All politicians. Hatred goes with politicians like mint sauce with lamb. It’s as old as Parliaments. . . .

The difference between British politics and American is that you maintain a collective respect for the office, if not the holder. So the presidency is a venerable thing, even if the president is a cretin. And every president leaves office with his title. He will always be Mr. President, a peculiarly regal touch for a republic. But when he wakes up on Thursday, the prime minister will be plain Mr. Blair, M.P.
On the one hand, I feel vindicated to find something I posted only a few hours earlier confirmed in the Sunday edition of the nation’s paper of record. On the other, there’s also the lurking sense that the point is so obvious you can find it in the next US press article about British politics that you read.

Could We Use a Queen?

June 23, 2007
Posted by Jay Livingston

Dan Myers, in a recent installment of Blue Monster in Europe, hears the band at Buckingham Palace play “Stayin’ Alive” and speculates, “The flag was up on the top of the palace, indicating that the Queen was home. I would like to think, therefore, that this performance was a personal request and that she was upstairs working on her own electric slide.”

I watched “The Queen” on DVD recently, which is how I know that the flag Dan refers to is not the Union Jack but the Royal Standard.

Here’s a clearer image.

It’s not the British flag, the Union Jack.

It’s difficult for us Americans to grasp the idea of monarchy. “Stupid” was the comment of the teenager-in-residence who was sitting a space or two down the couch from me as we watched the film.

But there’s something to be said for having a ceremonial head of state, someone who symbolizes the nation as a whole and who stands above partisan politics. The Queen is so far above politics that she’s not allowed to vote. We learn this early in the film, which opens with the election of Tony Blair as prime minister.

“The sheer joy of being partial,” says the Queen. As a person, she no doubt has her political preferences. But as the Queen, she must remain impartial. She is someone the entire country can look to as its leader.

Most European countries, with their long histories of monarchy, have retained a nonpolitical figure as symbolic ruler of the country. In some countries (England, the Netherlands, Norway, Spain, etc.) it’s an actual monarch; in others, it’s a president, who has only ritual duties, while the actual business of running the country falls to the elected prime minister.

But in the US, we have this strange system where a partisan politician is also our ceremonial head of state. It is he who represents the country, attending state ceremonies, recognizing ambassadors, conferring honors, and carrying out other symbolic duties. In the minds of some citizens, to disrespect the president, therefore, is to disrespect the country, even if, as happened in 2000, that president got fewer votes than his opponent. How often have we heard that we must stand behind our president merely because he is our president?

To erode the good will that comes with this symbolic position, a president has to do a really bad job, and do it over a fairly long time. It can be done (Mr. Bush’s latest ratings show only 26% of the country favorable, 65% unfavorable), but it takes sustained effort.

Giving the mantle of symbolic head of state to a partisan politician also can lead to the kind of arrogance we’ve come regretfully to expect of our presidents. They can come to think of themselves in near-kinglike terms — think of Lyndon Johnson’s famous remark, “I’m the only president you’ve got” — rather than as elected politicians. The Bush administration has taken this arrogation of power further than any of its predecessors, with its belief that it can ignore laws it doesn’t like, withhold information from the Congress and the people, and use the justice system as a political tool.

There may be something about constitutional monarchies that curbs such arrogance. An early scene in “The Queen” shows Tony Blair coming to Buckingham Palace. He has just won the election in a landslide, but he will not be prime minister until he kneels before the Queen and is officially requested by her to form a government. As historian Robert Lacey says in his commentary track on the DVD, “People feel it’s good that these politicians have to kneel to somebody to be reminded that they are our servants.”

In the US, the president is sworn in by the Chief Justice, the Supreme Court being the closest thing we have to an impartial power. But the justices are appointed by politically elected presidents, and as recent history has shown, the Court is quite capable of pure political partiality. Does anyone really believe that the vote in Bush v. Gore was about the law and not about politics? All five of the justices whose votes in effect gave Bush the election were Republican appointees. The two Democratic appointees sided with Gore.

Nobody, not presidents or prime ministers, appoints the Queen. Moreover, as historian Lacey notes, the prime minister has to meet with the Queen every week and report to her. The US president does not have to report to anyone. Cabinet members and other administration officials may testify before Congress, and the president himself may hold press conferences. But as the current incumbent has demonstrated, it’s possible to greatly limit the amount of such questioning.

The only thing the US has that takes on some of the magisterial symbolism of the Queen is the flag, which, as an inanimate piece of cloth, cannot do all the things the Queen does. Less officially — only somewhat less officially — there’s God. But over the last half century or so, the Republicans have successfully claimed both God and the flag as belonging exclusively to their party.

As “The Queen” unfolded, the more I watched this very human figure sorting out her roles as grandmother, mother, ex-mother-in-law, and Queen of England, the more I thought that perhaps monarchy isn’t such a bad idea.

(Hat tip and deep bow to Philip Slater, who blogged along similar lines to this post for his Fourth of July essay at Huffington Post.)

The New York Post

June 20, 2007
Posted by Jay Livingston

I love the New York Post. I don’t actually read it, but I see the front page every day. Regardless of the actual content, its subtext could often be summarized in the following headline:

THIS IS NOT A REAL NEWSPAPER.
We’re in it for kicks.

Here’s yesterday’s front page.

The story was that Israel was trying to boost its sagging tourism. Apparently the Israelis thought that potential tourists — younger male tourists — were staying away not because of questions of politics and safety but because the Israel “brand” lacked sex. So they made a deal with Maxim, the lad mag, to sponsor a party honoring Israeli women soldiers. They used this photo of Miss Israel 2004 on the invitation.

The Post usually supports Israel, but when it comes to politics on one side and a bikini and a smutty pun on the other, it’s no contest. (“Piece” seems somewhat dated to my ear, but I don’t spend much time listening to informal talk among the target demographic. And how many Post readers will get the pun on brouhaha?)

There’s a culture and class connection here that I can’t quite put my finger on. It’s a combination of patriotism, prurience, and puritanism, and the Post, like much of the rest of Murdoch’s media empire, thrives on it. Most of Murdoch’s other outlets take themselves more seriously, though when the boys at Fox News slapped on that “Fair and Balanced” slogan, they surely must have had their tongue deep in their cheek and all fingers crossed.

The Murdoch media love sex, especially when it allows the pointing of fingers at other people’s sexuality. I’m sure they’re kicking themselves for failing to come up with “To Catch a Predator,” NBC’s champion in the “moral righteousness cum voyeurism” category, while Fox and the others must content themselves with Britney, Paris, and the rest. (For an interesting take on “Predator,” see Corey Colyer’s recent blog entry.) Typically with these publications, you get babes in bikinis on one page and on the next a sex scandal news item that clearly shows the border between the good guys’ cheerful sexuality and the bad guys’ evil sexuality.

The simplistic moral approach also applies to politics. It’s us against them, and we’re the good guys. Again, the Post, with its lack of pretension to being a real newspaper, provides the best example. Here it is in March 2003, when, at the U.N., France and Germany were opposing US resolutions authorizing the invasion of Iraq.


(In case you can’t see it in the photo, the weasel heads have been superimposed on the representatives from France and Germany.)

The Post covers politics the way it covers sports, with the Bush administration as the home team. The main difference is that the Post sports reporters try to get information besides what’s put out by the Yankee front office, and they will actually criticize the Yankee players and management when the team is not doing well. (Currently, the Yankees in MLB are sort of like the US in Iraq — spending a ton of money and not having a lot of victories to show for it, in fact just barely above having a losing record.)

For the New York Post, every day is a moral struggle. Every day, on various fronts, good confronts evil. The forces of selfless virtue, loyalty, common sense, and pride confront the forces of self-interest, danger, sneakiness, corruption, elitism, and immorality. The good guys don’t win every battle, but they can pat themselves on the back for trying. And whatever happens, they can usually find comfort in a picture of a beautiful woman who’s not wearing very much at all.

Rich Girls II

June 18, 2007
Posted by Jay Livingston

More on that Gallup poll in the previous post about Paris, Lindsay, Nicole, and Britney. Around the time I was looking at these poll results, a friend wrote about her daughter’s thirteenth birthday party. “After a year of getting to know these [13-year-old] girls, I've really started to care about them and like them. I think they are good at heart. Some of them come from extremely wealthy families and are spoiled rotten, so it took me a while to find the love for some of them.”

I was reminded of the movie “Thirteen,” in which a girl of modest means is corrupted by a wealthier classmate. Poor girl looks up to “cool,” rich girl; rich girl seduces poor girl into drugs, shoplifting, sex with boys. It’s a cliche, but at least it’s a low-budget, indie version of the cliche.

Like other cliches, it confirms a widely held view, in this case that having a lot of money is dangerous, especially because it can lead girls away from conventional middle-class ways.

(Are there similar tales about boys? I can’t think of any. Rich boys in American stories can be cruel — they can also be helpful — but they seldom corrupt the ordinary boy’s morals as is so common in the fallen-woman stories.)

Ms. Hilton and the others are considerably older than thirteen, but the money-vs.-morals theory retains its attractiveness even when we think about these twentysomethings. Gallup offered four choices for people to explain what caused the problems of these celebs.

“Having too much money at a young age”
“The pressures of fame at a young age”
“Negative influences of the Hollywood culture”
“Parents doing a poor job raising them”

Here are the results:
Too Much Money is the clear winner.

I don’t know the systematic evidence on child-rearing and what might cause girls to have problems. But Gallup respondents didn’t know either. Besides, even if we know that something is true in a general sense, it is impossible to know whether it applies in any individual case. So what we’re looking at here is not solid reality; it’s people’s beliefs about reality, specifically about the effects of money.

Those beliefs seem to be rooted in a relentless conviction that only middle-class morality will work. It’s a Goldilocks view of socio-economic status. We believe that poverty is not good for kids, but we also see dangers in great wealth. The middle-income range is just right.

I wonder if people in other societies take a similar view, especially in societies with less of an egalitarian ethos and with some trace of aristocratic tradition. The British may not be pleased with the behavior of the younger generations of the royal family, but I don’t think they attribute the shortcomings to an overabundance of money. It’s also possible that within an upper class, drunkenness and adventurous sexuality are not seen as inherently bad. Fidelity and sobriety are middle-class virtues, not nearly so exalted at the outer reaches of the social distribution.

There also seems to be some ambivalence here about middle-class aspirations. We would all like to have more money, though not too much more. (Ask people what the “right” income would be, the income that would allow them to live comfortably, and you’ll usually get a number that’s about 25% higher than what they’re currently making.) Historically, the American pattern of upward mobility is that parents want their own children to have it better than they did. Parents want to be able to buy stuff for their kids. They don’t want their kids to be at all deprived. Yet, there seems to be a nagging fear that giving kids these advantages might also spoil them.

We project that fear upward. I’m not going to give my kid enough to spoil him, not on my income and not even if I were making 25% more than I do now. But the Hiltons, and even those people who make twice what I make — they’re the ones who risk spoiling their kids.

The irony, of course, is that this analysis is relative to one’s own income, and at all levels of the broad spectrum of people who think of themselves as middle class, the same moral-economic formula may be applied. Someone who makes half as much as you do may see you as one of those rich people who spoil their kids.

Girls (But Not Boys) Behaving Badly

June 16, 2007
Posted by Jay Livingston

George Bush’s favorability ratings keep sinking. In the latest polls (Quinnipiac, NBC/WSJ), only 28-29% of the people come down on the plus side. But wait. Gallup has found other public figures whose numbers are even lower.


Paris Hilton, of course, was the big winner in the girls-gone-bad sweepstakes. Lindsay Lohan, Nicole Richie, even Britney Spears were more highly thought of, with sympathy ratings soaring upwards of 15%.

Note that Gallup asked only about young women. Gallup was only following the media, and the media would claim that they are only following the public’s interest. True, when it comes to stories about troubles people bring on themselves, the public seems to take much more delight in stories about women than about men. When Paris Hilton was jailed, and especially when she broke into tears on being re-jailed, the tabloid media reported her suffering with self-righteous glee.

Surely there must be male celebrities who have sinned — Russell Crowe throwing a telephone into the face of a hotel desk clerk, Mel Gibson spewing drunken anti-Semitic remarks at the cop who pulled him over, Robert Downey, Jr.’s recurrent drug problems, etc. Nicole Richie and Paris Hilton in “The Simple Life” set their own glam sensibilities against those of the heartland, but the show was self-mocking and generally not critical of middle America. Borat, by contrast, took some pretty serious swipes at middle America and used some underhanded (though legal) tricks in doing so.

But these men have not been surrounded with the sustained fascination we have given Paris, Lindsay, and the others, nor do the public and media seem to have the same sadistic longing to see them suffer legal or other consequences.

The media framed the Paris Hilton story as the rich girl trying to avoid justice and finally getting what she deserved. The media didn’t report much on what the usual sentence would be for a first-offense driver’s license violation. Probably not jail time. But for the media and public, this case wasn’t really about driving with a suspended license, just as the Clinton impeachment wasn’t really about lying to a grand jury. It was about sex. Paris Hilton’s real crime in the court of tabloid opinion was flaunting her sexuality. And the same may be true of the others.

It doesn’t matter that in many cases, it’s the tabloid media and public that are mostly responsible for making public what had been private — the voyeuristic photos, the theft and distribution of a private video. The media and public strip the clothes off these female celebs and then punish them for indecent exposure.

These stories aren’t news. They’re morality tales, and they show that moral standards are still different for women and men.

Attitudes and Familiarity

June 13, 2007
Posted by Jay Livingston


In the previous post, I said that professors who have the most negative attitudes towards Evangelicals probably are those who have the least contact with them. I don't have any data on that specific issue, but studies on other topics generally confirm the idea that familiarity breeds content. Or if not content, at least a decrease in antipathy.

The graph shows some data from a survey that asked, among other things, if immigrants were a burden on the US.

The people who had little contact with immigrants were more than twice as likely to say that immigrants were a burden.

I live in New York. Buy food in a grocery store, take a cab, buy a newspaper, eat in a restaurant, check into a hotel or hospital — just about anything you do will bring you into contact with immigrants. I doubt that many New Yorkers see immigrants as a threat to the Republic.

The same principle holds for attitudes on issues surrounding homosexuality: should gays be allowed to teach in schools, should they be able to marry?

People with no gay friends or relatives are more than twice as likely to favor allowing schools to fire gay teachers. People who do have gay friends or relatives are more than twice as likely to favor allowing gays to marry.

Evangelicals in the Classroom

June 11, 2007
Posted by Jay Livingston

Q: What religious group do college faculty feel least favorable about?

A: Evangelicals.

That’s one result from a recent study, and in yesterday’s post, I offered my guess that faculty were really reacting to what they perceived as the politics of Evangelicals, not their religion. In that sense, the attitude is different from other kinds of prejudice, especially prejudice based on ascribed characteristics like race.

But I would also guess that the attitude shares something with other kinds of prejudice: those who have the most unfavorable attitudes towards Evangelicals are probably those who have the least contact with them. It’s true of homosexuality, and it’s true in the current debate over immigration.

So I wonder about those professors who say they view Evangelicals unfavorably. I think about the late Donna Darden, who taught sociology in Tennessee, where Evangelicals and fundamentalists were the rule rather than the exception. She had wonderful stories about her struggles to get students to think sociologically. I’m not sure whether Donna was an atheist or Wiccan, but whichever it was, when students learned of her views, some would stand up and “witness” right in class. Here’s an excerpt from something she posted to a sociology Internet group.
Their next assignment calls for them to post a completion to the stem, “If I had been born a member of a different race...” They have read good stuff about the social construction of gender and race. Many will still tell me that they cannot answer that question because God made them the persons they are, and they cannot second-guess God.

But for all her disagreement and frustration with her students, she certainly would never have answered the survey by saying she felt “cold” towards them.

My own contact with Evangelicals and other born-again students has been limited. Northern New Jersey ain’t Tennessee, and up here in Sopranoland, most people are content to be born only once. I started teaching before the Moral Majority became a strong political force, but even so, I may have shared some of the same pre-judgments as the professors in the recent survey. At the very least, I expected that Evangelicals would be closed-minded and dogmatic. But what I found was something else.

First, I never had Donna’s experience of students injecting their theology into sociological discussions. The only way I could even guess that a student was an Evangelical was this: on the first day, I ask students to fill out index cards with their name and phone numbers. But so I’ll have a better chance of remembering them and learning their names, I also ask them to put down the title of the best film they’ve ever seen. Not just the most recent, I say; take a minute to think. OK. Then do the same thing for a book and a record. I also ask for a TV show they watch regularly — one they’d record if they weren’t going to be home when it was on. (This exercise also gives me a small window onto all those areas of pop culture that I’m growing farther and farther away from.)

Every once in a while, there’ll be a student who for best book lists The Bible. So I figure here’s someone who, if not Evangelical, fundamentalist, or born again, at least takes their religion pretty seriously. My sample is small, but my impression is that compared with the average student, they are more conscientious — less likely to miss class and more likely to do the reading and to turn in assignments on time.

But it’s not just that they are well-behaved. They regard the sociological ideas from class as something important, not just a bunch of stuff that you leave behind each day once you walk out of the classroom, except maybe to remember for a test. Where many students are content to “learn” the material in a sort of Durkheim-said-this-Weber-said-that way, these students will follow the line of thought further and look for its corollaries, implications, and applications.

They engage the material more than most students because they walk into the course already having a more or less coherent world view. Most students at age twenty or so have an inchoate set of ideas for understanding the world. They take it as it comes and haven’t thought systematically about the way they are interpreting it. They don’t even see themselves as making interpretations. They have trouble seeing the differences between theories, between Weber and Marx for instance.

But the born-again students have a systematic scheme for encountering the world. They have a “theory,” a set of related ideas, and they are constantly alert to interpret the events of the real world with respect to that theory. Give them some new data or some new ideas, and they want to know how these fit with their own view.

They may reject sociological ideas. They may even, like some of Donna Darden’s students, shun these ideas as the work of Satan. But in order to make that judgment, they first have to think through those sociological ideas and see how they match up against their religious ideas. They have to take the material seriously.

Prejudiced Professors?

June 10, 2007
Posted by Jay Livingston


What are your overall feelings toward Catholics? Use a scale of 0 to 100, where 0 is very cold or unfavorable, and 100 very warm or favorable.

That’s a question from a survey of college faculty. Sociologist Brad Wright has been blogging recently about one finding from this survey. On this warm-cold question, 13% of faculty have a view of Catholics that's below 50 out of 100. The religious group viewed most unfavorably is Evangelical Christians.
Evangelicals were the only group that a majority of faculty rated as less than 50. Brad Wright, himself an Evangelical, sees this as “prejudice,” similar to race prejudice. And he thinks that this unfavorable attitude probably takes the active form of unfavorable treatment of Evangelical students.

The report has no evidence on discriminatory treatment, so we’re all just speculating on that. But in any case, prejudice is probably the wrong term for these unfavorable feelings. They are not based on some irrational stereotype. Unfortunately, the survey didn’t ask the faculty why they hold their favorable or unfavorable views. That’s one of the big problems with survey research — it doesn’t go very deeply into what people actually think. But from other evidence in the report, it’s pretty clear that the attitude towards Evangelicals is not primarily about religion; it’s about politics. Faculty are more liberal than mainstream America; Evangelicals are conservative. And what faculty were primarily concerned about was not someone else's personal relationship with God and Jesus but their political actions.

In fact, while a bare majority of faculty viewed Evangelicals unfavorably, 71% felt that “this country would be better off if Christian fundamentalists kept their religious beliefs out of politics.”

Strictly speaking, the terms fundamentalist and Evangelical are not interchangeable. It is the fundamentalists who are more politically active. But I suspect that many of the faculty surveyed ignored these differences (if they were aware of them at all) and lumped fundamentalists and Evangelicals together into the single undifferentiated category of conservative Christians. The people who designed the questionnaire seem not to have maintained the distinction either and may actually have encouraged the oversimplification. The question about politics referred to “Christian fundamentalists,” but the questions on warm or cold feelings asked only about Evangelicals and omitted fundamentalists entirely.


Conservative Christians have been loudest in condemning, if not criminalizing, abortion and homosexuality. (Does Jesus ever mention either of these?) Most faculty (and most Americans) take a more tolerant view on these issues. But what if these Christians had instead been putting their political muscle into raising the minimum wage, creating more equitable health care and tax policies, restricting access to deadly weapons, protecting workers and the environment against powerful corporations, etc.? (I’m not a theologian, but I suspect that you could make a “what would Jesus do” argument for the liberal side of all these issues. In fact, some Evangelicals work for goals liberals would certainly support — adult literacy, food banks, day care, etc.) And then there’s the most important political issue of the day — Iraq.

If the politics of conservative Christians were different, with no change in their theology, faculty would surely view them more favorably.

(I'll continue tomorrow with a more personal take on this issue.)

Character and Conviction — Scooter Libby

June 6, 2007
Posted by Jay Livingston

Stanley Milgram’s experiments on “obedience to authority” and Philip Zimbardo’s prison simulation are two of the most famous studies in all social science. If there’s anything that we should have learned from them, it’s this:
In new and unusual situations,
1. We are not very good at predicting what we ourselves or anyone else will do.
2. Individual factors like character and personality are far less important than situational pressures.
Today, the New York Times reprinted excerpts of letters asking for leniency in the Scooter Libby case. The article should have carried the title “Fooled by Character.”

The judge had sentenced Libby to 2½ years in prison for lying to the FBI and to a federal grand jury. Libby frequently claimed not to be able to remember a lot of important things about the disclosure of the identity of a CIA agent. In the criminal trial, the jury concluded, beyond a reasonable doubt, that he was lying about these lapses of memory.

Libby was trying to protect the Bush administration generally and perhaps specific members of it (Rove? Cheney?) in the outing of CIA agent Valerie Plame. The Bush team leaked Plame’s identity in order to discredit her husband, who had undercut some of the administration’s claims about Saddam’s WMDs.

None of Libby’s supporters (mostly Bush administration biggies) argued that Libby had been wrongfully convicted or that he hadn’t committed the crimes he was convicted of. But clearly, they found it difficult to believe. A man of such good character could never commit such crimes, could he? Here’s James Woolsey, former head of the CIA:
His conviction for perjury and obstruction of justice is completely inconsistent with my knowledge, and highest evaluation, of this man's character and integrity.
And Richard Perle (something of a slimeball himself):
Having known Scooter Libby for many years, I am unable to reconcile the man I know with the crime for which he has been convicted.
Obviously, Woolsey and Perle have not learned the lessons of Milgram-Zimbardo.

More sophisticated is Henry Kissinger:
I would never have associated the actions for which he was convicted with his character. . . . Having served in the White House under pressure, I have seen how difficult it sometimes is to recall precisely a particular sequence of events. This does not justify the action, but it may help you consider mitigating circumstances.
Kissinger at least recognizes situational pressure. But he sees that pressure as affecting only the ability to remember, not the willingness to lie to authorities in order to protect a bankrupt policy and the claims that it was based on. Still, Kissinger, like the others, is blinded by character.

It all reminds me of official reaction when the abuses of Abu Ghraib were exposed and could not be covered up. A military spokesman said, “It's just not something we would do.”

The Daily Show* caught this paradox perfectly when it had Rob Corddry explain, “We shouldn’t be judged on our actions. It’s our principles that matter, our inspiring, abstract notions. Remember: just because torturing prisoners is something we did, doesn't mean it’s something we would do.”

Committing the crimes of perjury and obstruction of justice was not something that Scooter Libby would do. It was just something he did do.

* Personal note: I will be in the audience at today’s taping of The Daily Show.

Who Supports Terrorism?

June 4, 2007
Posted by Jay Livingston
Not long ago, I blogged about the reaction to a Pew Research Center poll of American Muslims. Generally, the media ignored the story entirely. What coverage there was mostly emphasized that American Muslims, in their attitudes and values, were very similar to the rest of Americans.

The far right, however, was apoplectic, accusing the mainstream media of deliberately downplaying one frightening finding: Only 69% of younger Muslims (age 18-29) were willing to say that violence was “never justified.” That means, as Rush Limbaugh put it, “Almost a third of young American Muslims who support in one way or another homicide bombings.”

Here’s the question Pew actually asked:
Do you personally feel that this kind of violence [suicide bombing and other violence against civilians] is often justified to defend Islam, sometimes justified, rarely justified, or never justified?

Here are the results:
Often 2%
Sometimes 13%
Rarely 11%
Never 69%
Don't Know or N/A 5%
(Note that in order to boost the percent to almost a third, Limbaugh classified the “Don’t Know or No Answer” responses as supporting terrorism.)
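
Just to make the arithmetic explicit, here’s a quick tally in Python of the percentages quoted above (nothing more than addition):

    # Pew results quoted above, in percent
    often, sometimes, rarely, never, dk = 2, 13, 11, 69, 5

    print(often + sometimes + rarely)        # 26: the share offering any justification at all
    print(often + sometimes + rarely + dk)   # 31: Limbaugh's "almost a third," reached only by counting Don't Know/NA
    print(100 - never)                       # 31: the same figure, everyone who did not say "never"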

In my posting, I wondered what percentage of US Christians might feel that such violence in defense of Christianity is justified. I still don’t have any data on Christians, but I did find a survey that found a group even more supportive of terrorism than were the US Muslims. The survey question was nearly identical, except that it left out the part about defending Islam:
Some people think that bombing and other types of attacks intentionally aimed at civilians are sometimes justified while others think that this kind of violence is never justified. Do you think that such attacks are . . . justified?
Here are the results:
Often 5%
Sometimes 19%
Rarely 27%
Never 46%
Don't Know or N/A 3%

Who were these bloodthirsty terrorism-justifying extremists? Americans. A representative cross-section of the US population. By comparison with the hotheaded Muslim youth, the average American was more likely to say that bombing civilians was “often” or “sometimes” justified, and far less likely to say that it was never justified.

Here's a chart showing the comparison (my apologies for the tiny scale. Size matters, but I can't figure out how to make charts larger):
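
If the chart is too small to read, the same comparison can be laid out as a plain table. Here’s a rough sketch in Python, using only the percentages quoted above:

    # Percentages quoted above: young US Muslims (Pew) vs. a cross-section of the US public
    muslim_youth = {"Often": 2, "Sometimes": 13, "Rarely": 11, "Never": 69, "DK/NA": 5}
    us_public    = {"Often": 5, "Sometimes": 19, "Rarely": 27, "Never": 46, "DK/NA": 3}

    print(f"{'Response':<12}{'Muslim youth':>14}{'US public':>12}")
    for response in muslim_youth:
        print(f"{response:<12}{muslim_youth[response]:>13}%{us_public[response]:>11}%")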

Purity and Danger on the Campaign Trail

June 3, 2007
Posted by Jay Livingston

Mary Douglas, in some of her later writing after Purity and Danger, noted that some cultures are more rigid about the categories they use to think about things in their world, while other cultures accept a degree of fuzziness and ambiguity.

American politicians have often found success in appealing to the more rigid world view. They call for a hardening of boundaries — geographic, moral, and cognitive. It is the view that divides the world into good and evil. The most famous example in recent history is George W. Bush’s “axis of evil” State of the Union speech in 2002. Referring to Iraq, Iran, and North Korea, Bush declared, “States like these, and their terrorist allies, constitute an axis of evil, arming to threaten the peace of the world.”

The “axis powers” of World War II, the basis of Bush’s phrase, were in fact linked by alliance. The countries in Bush’s axis of evil had no such alliance, and two of them, Iraq and Iran, had just fought each other in a devastating eight-year war. Bush was trying to build support for the invasion of Iraq as a response to the terrorist attack of 9/11. So even though the 9/11 terrorists had nothing to do with Saddam’s Iraq, Bush could lump them all together as “evil.” This reduction of the world into two simple categories, good and evil, worked. Nearly half of all Americans believed that Saddam had been behind 9/11, and of course nobody in the Bush administration did anything to disabuse them of that mistaken idea. Bush was able to sell the invasion of Iraq as part of the war on terror (he might just as well have said “war on evil”) — sell it to America, that is. But while Bush was successful in the US, the rest of the world rejected his logic, his “facts,” his policy, and quite possibly his good-vs.-evil world view.

You would have thought that the experience of Iraq had taught us something. In the movies, when you get rid of the evil ruler, all the Munchkins hail you as a savior, send you back to Kansas, and live forever in happiness and peace. But in the real world, Iraq turned out to be a much more complicated array of political, religious, and ethnic alliances than merely good and evil. Even if Saddam was evil, getting rid of him did not exactly unleash the forces of good, as the daily press reports from Iraq remind us. The world of international politics is more complicated than good and evil, and the country that has benefited most from our wars against evil in Afghanistan and Iraq has been that axis-of-evil linchpin Iran.

Nevertheless, here is Fred Thompson, former senator and now probably a candidate for president: “This is a battle between the forces of civilization and the forces of evil and we've got to choose sides.”

Thompson has not even officially declared his candidacy for President, but in the polls, he’s already ranked third among Republicans. If Mary Douglas is right, we should also see this Purity-and-Danger view underlying the position of Thompson and his supporters on the subject of immigration: Harden the boundaries, wall off the borders, keep out the dangers of impurity.

Stay tuned.

Movin’ on up?

June 2, 2007
Posted by Jay Livingston

When I was in my thirties, it occurred to me that I was not earning nearly as much as my father had when he was my age. He had been, for much of his life, a successful businessman; I was an academic. Even in absolute dollars, his income thirty years earlier (he was about thirty years older than me) may have been higher than mine. Certainly in inflation-adjusted dollars, he had been much better off.
I was an example of “downward mobility,” something I’d almost never heard about, not even in my sociology classes. The phrase “social mobility” almost always carries an assumption that we move in only one direction – up. There’s a silent “upward” in front of the “social mobility.” We expect that children will do better or go farther than their parents did.

This expectation runs into a logical problem. If the pie remains the same size, and some people get a larger slice, others have to get a smaller slice. The only way for everyone’s slice to increase is if the pie is constantly getting larger. Or as President Reagan famously said, a rising tide lifts all boats. The pie metaphor works better, for what has happened in the last few decades is that the pie has gotten larger, but the slices for most of us have grown by a few bites while the slices of the wealthy, already large, have been supersized.
The middle class is slipping farther and farther behind the wealthy (and much farther behind the very wealthy). But beyond that, the last few years have also brought more downward mobility. (At least in this one area, I was way ahead of the curve.) On average, men in their 30s have not been doing as well as their parents at a similar age. The report by Isabel Sawhill and John Morton, mentioned in the previous post in this blog, compared the incomes of thirtysomethings a generation apart. Here are the results.
The men in their thirties in 1994 were earning just barely more than were men of their parents’ generation. But men in their thirties in 2004 were doing worse, and by a considerable amount – $35,000 compared with $40,200 for their fathers.
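
To put the drop in percentage terms, here’s a quick back-of-the-envelope calculation in Python from the two figures just quoted:

    # Incomes of men in their thirties, from the Sawhill/Morton comparison quoted above
    sons, fathers = 35000, 40200
    decline = (fathers - sons) / fathers
    print(f"{decline:.1%}")   # about 12.9% below their fathers' generation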

The good news is that despite this trend, family incomes in both periods were up.

The increase is not as great for today’s thirtysomethings as it was for those even ten years ago, but the trend is still up. What does it all mean – men’s incomes down, family incomes up? The obvious answer is that more women are working. Some women in their thirties have chosen to be in the paid labor force for reasons of career and self-fulfillment. But my guess is that most of these women are working because they have to – because the additional income they bring in is the only thing that allows their family to maintain middle-class status.

In earlier generations, American families had the luxury of being able to live on a single income. Now, a second or even third income in the family has become a necessity.