The Immigration Debate Disrupts My Vacation

July 27, 2007
Posted by Jay Livingston

I didn’t think I’d run into the immigration debate out here on the eastern reaches of Long Island. (I didn’t think I’d pick up an Internet signal here either. It’s weak, but enough to blog by.)

The Hamptons are the summer place of wealthy New Yorkers. (This season there’s a rock concert series – Dave Matthews, Prince, et al. Tickets will run you $15,000.) But the presence of Hispanics is obvious. The people who landscape the property of the wealthy, clean their swimming pools, wait on them in restaurants – many of them speak Spanish. So do the people who are building the new 7-figure houses.

That’s fine for the vacationers, the people who own these houses and come out here for weekends in the summer. But the immigrants are not seasonal; they have settled here, and that demographic change has created some tension with the other full-time residents. There have been a few nasty incidents. The public schools of East Hampton are now at least one-third Hispanic. The big issue, of course, is jobs.

“They really have to do something about this immigration thing,” says Joe, who now delivers heating oil. “I had a job in construction. I was making $35 an hour, which out here is pretty good. Plus benefits and a 401k. And one day my boss calls me in and tells me straight out: ‘Look, I can hire three illegals for $10 an hour. I get three workers, don’t have to pay benefits, and I still save $5.’ I mean these immigrants are really a problem.”

Joe called his US senators to complain. He called Sen. Clinton’s office so many times that eventually Hillary herself called him back. Joe’s animosity seemed to be directed mostly at the immigrants, especially illegals – some estimates put their number in the county at 180,000. But I asked him if the moral of his story wasn’t so much about the immigrants as about the employers. Wouldn’t it be easier to enforce the law against a relatively small number of employers than against a huge number of immigrants?

We Shall Fight Them on the Beaches . . .

July 24, 2007
Posted by Jay Livingston


The blog is heading out to the beach and taking me with it. A week or so of sun and sand . . . unless the weather forecast is actually correct, in which case it'll be a week of clouds and rain. Either way, the blog and I may be oceanswept out beyond the reach of the Internet.

Thinking and Working

July 23, 2007
Posted by Jay Livingston

Early in my teaching career, I was talking casually after class one day with a student. “What are you, some kind of intellectual?” he asked, more challenging than curious.

Well yes, I thought. Isn’t that a legitimate thing to be at an institution of higher learning? I had not yet gotten used to the very practical orientation most of my students had towards their education. They weren’t interested in ideas as such. They wanted to learn stuff that would allow them to get better jobs and make more money.

I was reminded of this again by a front page story in Sunday’s New York Times. “Enough thinking, already. Roll up your sleeves.” So said France’s new finance minister recently.

France’s new president, Nicolas Sarkozy, was elected on a platform of more work for the French people, who by law have 30 paid vacation days and one paid holiday each year. (The US, by contrast, has no law requiring employers to give workers even one paid vacation day. See my earlier blog post.) Apparently, M. Sarkozy’s government sees thinking as antithetical to working, and they are trying to change a long-standing French view about abstract thought.

It may be hard for those of us in the US to appreciate the status that thinking and ideas have in France. Intellectuals and philosophers become famous there – a line that goes from Descartes through Sartre to today’s Bernard-Henri Lévy, a name virtually unknown here but so familiar in France that he’s known by his initials, BHL. Sort of like ARod and JLo.

Intellectuals appear regularly on French TV and are allowed to speak at length, not the three-and-a-half minute interview or crossfire shouting match that passes for discussion on the US airwaves. We Americans want our answers short and, if not sweet, at least easy to grasp and to use. We are generally suspicious of intellectuals and of abstract ideas. Our orientation has always been more pragmatic.

Things haven’t changed much since de Tocqueville, 170 years ago, opened Book II of Democracy in America with this:

    Chapter I
    PHILOSOPHICAL METHOD OF THE AMERICANS

    I think that in no country in the civilized world is less attention paid to philosophy than in the United States. The Americans have no philosophical school of their own, and they care but little for all the schools into which Europe is divided, the very names of which are scarcely known to them.


A few chapters later: “The Americans show a less decided taste for general ideas than the French. This is especially true in politics.”

De Tocqueville attributes this disdain for abstract ideas to democracy, equality, and individualism. In an egalitarian society, where nobody is better than anyone else, each person relies on himself and winds up being able to manage very well, thank you. So if a person’s ideas are sufficient for his own life, what need does he have of other ideas?

    As they perceive that they succeed in resolving without assistance all the little difficulties which their practical life presents, they readily conclude that everything in the world may be explained, and that nothing in it transcends the limits of their understanding.

This orientation also leads to a focus on the concrete and a vague suspicion of abstractions, especially those that have no practical application:

    They like to discern the object which engages their attention with extreme clearness . . . . This disposition of mind soon leads them to condemn forms, which they regard as useless and inconvenient veils placed between them and the truth.

But the French are more concerned with ideas and the logical connections among those ideas. Americans might reject a line of thought because it leads to nothing useful. The French might reject it if it is pas logique. Americans, on the other hand, are much more concerned with concrete facts.

Adam Gopnik, a journalist who lived in Paris for a while, describes his difficulties in France when he had to “fact check” an article. Fact-checking is standard procedure in American magazines: you call people mentioned in the article to make sure that the facts – dates, quotations, etc. – are correct. The French had never heard of such a thing (“What do you mean, une fact checker?”) and were suspicious when Gopnik explained.


Dubious look; there is More Here Than Meets the Eye. . . . There is a certainty in France that what assumes the guise of transparent positivism, “fact checking,” is in fact a complicated plot of one kind or another, a way of enforcing ideological coherence. That there might really be facts worth checking is an obvious and annoying absurdity; it would be naive to think otherwise.

I was baffled and exasperated by this until it occurred to me that you would get exactly the same incomprehension and suspicion if you told American intellectuals and politicians, post-interview, . . . .

“In a couple of weeks a theory checker will be in touch with you.”

Alarmed, suspicious: “A what?”

“You know, a theory checker. Just someone to make sure that all your premises agreed with your conclusions, that there aren’t any obvious errors of logic in your argument, that all your allusions flow together in a coherent stream—that kind of thing.”

. . . A theory checker? What an absurd waste of time, since it’s apparent (to us Americans) that people don’t speak in theories, that the theories they employ change, flexibly, and of necessity, from moment to moment in conversation, that the notion of limiting conversation to a rigid rule of theoretical constancy is an absurd denial of what conversation is. (pp. 95-96)

Well, substitute fact (and factual) for theory in that last sentence, and you have the common French view of fact checking.

Apparently President Sarkozy has his work cut out for him.


Music? At a concert?

July 21, 2007
Posted by Jay Livingston

We went to the Philharmonic concert one evening this week on the Great Lawn in Central Park. People come early, meet friends, spread blankets, put out all sorts of food. Most find the others in their group by cell phone (“I see you now. Turn about a quarter-turn. I’m waving”). Some have more striking visual techniques, like waving the Turkish flag.


And when the concert starts, they listen to the music. Or not.

As a sociologist, I should have remembered that music is only one of the reasons that people go to a concert, especially a free, outdoor concert. I also should have realized that there’s a strong negative correlation between how interested people are in the music and how far they are from the stage.


In the diagram, the X marks where we were, nearly 200 yards from the stage, and the crowd stretched back to the farthest reaches of the Lawn. (You can get some idea of distances by using the baseball diamonds. The blue lines indicate emergency lanes the Park establishes – nobody’s allowed to sit there – that divide the area into quadrants.)

I know that the demographic for classical music tilts heavily towards the geezer end of the scale, but around us were thirtysomethings, people who in an earlier decade would have been called “yuppies.” And they never stopped talking, to one another or on their cell phones. Even when the music began, and all through the concert, they didn’t even lower their voices. I heard about baby sitters and hedge funds and the Yankees’ pitching rotation. I heard conversations I didn’t understand because they were in Italian or Turkish.

It was clear that almost nobody at this remove from the stage was listening to the music at all, even though the orchestra had selected “pops” pieces that might be familiar (“Till Eulenspiegel,” “Pictures at an Exhibition”). But I would bet that if you asked them, “What did you do last night?” a lot of them would say, “We went to the concert in Central Park.”

Why would people come to a concert if they were going to completely ignore the music? They could picnic in the park any time. I speculated that there must be some attraction to doing something that many other people are doing. The presence of lots of other people makes the same activity more pleasant, more social.

One other reason became obvious at the end of the concert.



Still, I doubt that most of the people there knew about the fireworks in advance, and I’d guess that most of them would have come even without the fireworks.

Full disclosure: I could not get my own shot of the fireworks to print, and I grabbed this image off the Web. It looks much better than mine.

Draw-a-line Contest

July 17, 2007
Posted by Jay Livingston

My previous post was about methodological dishonesty. Here we go again.

A line is a convenient way of illustrating the relation between variables, a quick way to make sense of the trend in an array of points. Here’s a graph showing two variables – the percent of income that corporations supposedly pay in taxes (X-axis) and tax revenue as a percent of GDP (Y-axis).

Your assignment is to draw a line that shows the relation between these two variables. (Try doing it before you scroll down to see my solution.)



Here’s what I got.



The line is not quite straight – I used Paint, and I haven’t figured out how to draw a straight line – but you get the idea. The higher the tax rate, the more money for the government. That’s only common sense. And the graph seems to show that it holds true even when that money is figured as a proportion of GDP.

If these data points were hours of study time and GPA, we’d conclude that studying generally raises your GPA.

Norway is an “outlier” and we’d need to take a closer look at it to figure out why its tax revenues are so much higher, relative to GDP, than are those in countries with similar corporate tax rates. (If this were studying and GPA, we’d probably conclude that Norway must be unusually smart.)

But that’s not how conservative economists want things to work. They believe in something called the Laffer curve. It’s based on the idea that if you raise taxes too much, people and corporations will be discouraged and not bother working. For example, if the government taxed 100% of your income, would you work? Of course not, and the government would get no taxes from you.

Conversely, if you lower taxes, revenue will actually go up because people will work more, make more money, and even though the percentage paid is lower, the total amount paid will be higher. Twenty percent of $100,000 is more than 30% of $50,000.
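The Laffer logic is easy to sketch numerically. In this toy model – the numbers and the behavioral response are invented for illustration, not taken from any real economy – the taxable base shrinks as the rate rises, so revenue (rate times base) is zero at both extremes and peaks somewhere in between:

```python
# Toy Laffer curve: assume (hypothetically) that the taxable base
# shrinks linearly as the tax rate rises, hitting zero at a 100% rate.
def revenue(rate, base_at_zero=100_000):
    base = base_at_zero * (1 - rate)  # behavioral response (invented)
    return rate * base

# The arithmetic from the text: 20% of $100,000 beats 30% of $50,000.
assert 0.20 * 100_000 > 0.30 * 50_000

rates = [i / 100 for i in range(0, 101, 5)]
revenues = [revenue(r) for r in rates]

# Revenue is zero at 0% and at 100% and peaks at an intermediate rate.
peak_rate = rates[revenues.index(max(revenues))]
print(peak_rate)  # 0.5 in this symmetric toy model
```

Whether actual tax rates sit on the rising or the falling side of that hump is, of course, exactly the empirical question the scatterplot of real countries is supposed to answer.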

When the Cheney-Bush administration proposed huge tax cuts back in 2001, some people thought the loss of tax revenue would erase the surpluses built up in the Clinton years and create a deficit. But the conservatives hauled out the Laffer-curve theory to counter these arguments. Of course they were wrong. The tax cuts quickly wiped out the surplus and ran up huge deficits.

So what’s a Laffer believer like the editorial page of the Wall Street Journal to do with this array of points? Look at the line they draw to illustrate the relation between tax rates and revenue.


Now Norway, instead of being an outlier, is the point that best allows the Journal to draw the Laffer curve. It’s a bit of a stretch, much like travelling from the United Arab Emirates to France by going through Norway.

Usually, we try to draw a line that minimizes the distance of the points from the line. The Wall Street Journal’s line maximizes those distances. Seems like a good idea, doesn’t it? If you studied a fair amount and still wanted to improve your GPA, I guess the Journal would suggest cutting down on book time. Let’s party.
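For anyone who wants to check the intuition, here is what “minimizing the distance of points from the line” means in practice: ordinary least-squares regression picks the slope and intercept that minimize the sum of squared vertical distances. The data points below are made up for illustration (they are not the Journal’s numbers), with one high-revenue “Norway” thrown in; the fitted slope still comes out positive:

```python
# Ordinary least squares by hand: fit y = a + b*x by minimizing the sum
# of squared vertical distances from the points to the line.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Invented (corporate tax rate %, tax revenue as % of GDP) points,
# with one high-revenue "Norway" at the end.
rates = [10, 15, 20, 25, 28, 30, 33, 35]
revs  = [ 1,  2,  2,  3,  3,  4,  4, 10]

a, b = ols(rates, revs)
print(round(b, 2))  # 0.24 – the slope is positive, outlier and all
```

The outlier pulls the line around a little, but it can’t turn an upward trend into a downward curve; only drawing the line by hand can do that.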

Hat tip: Several other bloggers have picked up on this bit of nonsense. Kieran Healy, in his own commentary, links to several of them.

Conclusions First, Then the Research

July 15, 2007
Posted by Jay Livingston

Sometimes in class, I’ll ask students how they might get information on some sociological question. “Do a survey” is the frequent answer. Students seem to think that a survey, any survey, has magical powers to reveal the truth.

“A survey is just asking a bunch of people a bunch of questions,” I usually say. “Who are you going to ask, and what are you going to ask them?” In other words, the validity of a survey depends on the quality of the sample and the quality of the questions.

Michael Schwartz, when he was a grad student TA for the methods course, would give students an assignment intended to tarnish the objectivity-mystique of surveys: “Design a questionnaire to show that . . .” Conclusion first, then the research. Mike’s point was that the questions – the order in which they are asked, the phrasing, etc. – can bias the results. I think part two of the assignment was to design a second questionnaire that would yield a result directly opposite to the first.

It was all in good, educational fun, I thought. Real researchers would never deliberately do something like that. But today, Mark Kleiman at The Reality-Based Community gives some examples of result-oriented questions from Zogby polls. Zogby is a pretty good source on political questions, especially their surveys of international populations. So it was sobering to see these polls custom-tailored to produce the results wanted by the people who were paying for the survey.

Here’s an example from a poll Zogby did for Judicial Watch, a conservative watchdog group that gets a lot of its money from the Clinton-hating Scaife foundation.

    Some people believe that the Bill Clinton administration was corrupt. Whether or not you believe the Clinton administration was corrupt, how concerned are you that there will be high levels of corruption in the White House if Hillary Clinton is elected President in 2008?

I think that the Sociology department at Montclair will rewrite some of the items in our teacher evaluation forms.

    Some students in this professor’s other courses have said that this instructor is one of the best in the university. Regardless of those other courses, how would you rate the professor’s performance in this course?

Maneaters

July 13, 2007
Posted by Jay Livingston

David Brooks, in his Tuesday op-ed column in the Times, wrote about today’s young women:

    These iPhone Lone Rangers are completely inner-directed; they don’t care what you think. They know exactly what they want; they don’t need anybody else.
A lot of people on the left wish the Times would dump Brooks. He holds down the neoconservative seat on the Times op-ed page, and he usually writes about politics. He was a staunch supporter of the Iraq invasion and many other policies of the Cheney-Bush administration. But sometimes he looks at social and cultural matters, so he’s providing something for us sociologists, even though, as with his politics, he usually gets it wrong.

For the text of his sermon on Tuesday, he took three hit songs: Avril Lavigne’s “Girlfriend,” “U + Ur Hand,” by Pink, and “Before He Cheats,” by Carrie Underwood. (He could have added Nelly Furtado’s huge hit “Maneater.”)

These songs, according to Brooks, herald the appearance of a new kind of young woman – “hard-boiled, foul-mouthed, fed up, emotionally self-sufficient and unforgiving.” She’s the female counterpart of the hard-bitten hero of Western movies or the hard-boiled detective of crime fiction. Clint Eastwood and Bogie in drag.

But Brooks’s radio must be tuned in to unusual versions of these songs. These women are not emotionally self-sufficient; they’re angry, and they want revenge. The tough guys in US culture are essentially devoid of feeling. They don’t get mad, they get even. Suppressing their emotions, including anger, allows them to mete out justice, even against those they might once have been romantically involved with. In the well-known ending of I, the Jury (see the film “Marty” next time it comes around on TCM), private eye Mike Hammer shoots his former love Charlotte after figuring out that she’d killed his partner.
“How c-could you?” she gasped.
I only had a moment before talking to a corpse, but I got it in.
“It was easy,” I said.
The justice is not purely abstract or ideological; the hero has been personally touched by the crime. But he also acts on the basis of personalized principle, not a simple emotional reaction. Sam Spade puts it nicely at the end of “The Maltese Falcon,” in circumstances similar to those of I, the Jury. He discovers that Brigid O’Shaughnessy has killed his partner. She appeals to their past relationship: “You know in your heart that in spite of anything I've done, I love you.”

But Spade is adamant: “You killed Miles and you’re going over for it.” He explains, “When a man's partner is killed, he’s supposed to do something. It makes no difference what you thought of him. He was your partner, and you’re supposed to do something about it.”

A man’s gotta do what a man’s gotta do.

The women in these songs are not acting on any general principle. They are responding, violently and personally, to personal insults. They don’t want justice; they want revenge.

That I dug my key into the side of his pretty little souped up 4 wheel drive,
carved my name into his leather seats,
I took a Louisville slugger to both headlights,
slashed a hole in all 4 tires...
Maybe next time he'll think before he cheats.
(In America, if you really want to take revenge on a guy, go after his car.)

As Brad DeLong points out, the rage of a woman scorned goes back a few years – Medea, Clytemnestra, Frankie and Johnny. But in our culture, it’s usually been the men who are allowed to express their anger by seeking revenge. So in a way Brooks is right; the tone in these songs may not be completely unprecedented, but it is atypical. The women in these songs also don’t bother with the typically feminine strategies of seduction, pleading, or guilt-tripping to get what they want. They make direct demands, and if the guy can’t meet those demands, to hell with him.

Brooks attributes the ethos of these songs to the Zeitgeist. They are “a product of the cold-eyed age of divorce and hookups. It’s also a product of the free-floating anger that’s part of the climate this decade.”

Not exactly. The emotions and actions in these songs have been around for centuries. It’s just that for most of that history, they had been restricted to men. If the popularity of these songs illustrates anything, it’s the democratization of emotions and actions, much like the “foulmouthed” language that upsets Brooks. Those words, reactions, and actions which have long been a male preserve are now becoming legitimate for women as well. At least for rock stars.

These songs are all hits. I just wish we had some data on who’s downloading them – men or women.

Ratatouille

July 8, 2007
Posted by Jay Livingston

“Ratatouille” opened to universally great reviews, and it’s a delightful film. The more important question for Hollywood, though, is not whether a film is good but whether it will make money. No, not just whether it will make money but whether it will make a lot of money. “Ratatouille” had an opening weekend gross of “only” $47.2 million, and people at Disney already felt they had to spin the numbers to rebut claims that the movie was a disappointment.

The “trades” (I love using show-biz lingo) were comparing it unfavorably with “Cars,” Pixar’s 2006 summer movie and a big money maker. But besides the financial comparison, the two films also provide an interesting cultural comparison. They exemplify the “culture wars,” the red-state blue-state divide.

“Cars” embodied the Nascar red-state mythology, not just because of its obvious theme (stock car racing) and setting (the American Southwest) but because of its moral: the triumph of American small-town virtues (friendship, community) over egotistical self-fulfillment and achievement.


The Michael J. Fox film, “Doc Hollywood,” was nearly identical in plot (career-minded doctor headed for Beverly Hills crashes his Porsche and winds up in a small Southern town; you can guess the rest), but this theme is a staple in many American fictions. Community is to be prized over individual achievement; plain small-town folk are better than city fast-trackers.

“Ratatouille,” by comparison, is downright unAmerican. I imagine Disney-Pixar was taking a chance even with the title, a foreign word unknown to many Americans, and most of those who do know it probably can’t spell it. (On the other hand, what could be more American a title than “Cars”?) The film itself is set in France, a country US patriots were boycotting not so long ago (remember “freedom fries”?). As for the virtues of bucolic settings, the rural life shown at the start of the film has little to recommend it, and our hero, the rat Remy, quickly winds up in Paris. And this movie loves Paris, a city which has long been, in the American imagination, the antithesis of down-home American virtues and values. Paris is tempting because of its sensuality (“How ya gonna keep ’em down on the farm?”) but ultimately evil.



Even the basic concept of the film must seem foreign to the red-state mentality. It’s not about a manly pursuit like driving fast; it’s about cooking. While other films may extol just plain folks who eat plain simple food that nobody made too much of a fuss over in preparing, “Ratatouille” dismisses such an attitude as unworthy. Food is something that requires attention, both in the cooking and the eating. And the film takes frequent jabs at the American way of eating. It makes Remy’s rival (the evil chef Skinner) all the more repugnant by having him promote his line of microwavable frozen foods – burritos, pizzas, and other things you’d find in many American freezers. Even worse, he has his people working to produce a frozen corn dog.

The attack on American bread is a bit more subtle – a didactic speech by a female chef giving the audience a lesson in what makes for good bread: a crunchy crust. The slap at our preference for squishy bread (Wonder) is so obvious she doesn’t need to say it out loud.

Despite this unAmerican aura, the film seems to be “doing well,” and the grosses from the weekend will probably look encouraging. I take these numbers as a sign that things are changing in America, that good food, even good European food, is not something that happens only on the coasts. Remember the Republican attacks on Democrats in recent elections as “brie-eating, chablis-drinking” pretentious snobs? But stores in the heartland are selling brie and chablis. David Kamp is probably exaggerating in calling America The United States of Arugula, but apparently a lot of Americans now at least know what arugula is.

In fact, the red-state blue-state division may be less an accurate representation of reality than a convenient stereotype dreamed up by politicians and the press. Like any stereotype, it may be a useful shorthand with some truth to it, but like other stereotypes, it can also make real-life contradictions harder to see. Not so long ago, a caffe latte was an exotic drink reported on by adventurous tourists returning from Italy. Now, every kid in Iowa and Wyoming has grown up with Starbucks. The drinks have been Americanized (a spoonful of high fructose corn syrup makes the espresso go down), but now latte and cappuccino are as American as pizza.

Maybe the next time you stop in at Flo’s Café in Radiator Springs, the menu will feature ratatouille.

The Colorblind Doctor

July 5, 2007
Posted by Jay Livingston

(Big hat tip to Mark Liberman at The Language Log for this post.)


Deepak Chopra, of all people, is writing about language and politics. The good doctor was blogging at Huffington Post about the recent Supreme Court decision that barred school districts from giving any consideration to race in assigning students to schools.

The cities involved in the case (Louisville, Seattle) had been struggling to achieve some degree of integration in their schools. Where other factors were equal, the school district would avoid assigning a white student to a predominantly white school or a black student to a predominantly black school. The Court ruled 5-4 that this policy was unconstitutional.

Chopra’s point is that the majority opinion makes clever use of language in defending an indefensible position. He is particularly ticked off about the Court’s use of the word colorblind “as a disguise for racial neglect.”

He’s right, though I would put it more in terms of individual and group effects. Being colorblind at the individual level will probably lead to more segregation at the school level. The schools will become indistinguishable from pre-Brown schools, where students were deliberately segregated by very color-conscious policies and laws. So the ruling mandates a policy that is both colorblind and segregationist.

Chief Justice Roberts, in his majority opinion, made an equally facile statement, one quoted in many news stories: “The way to stop discrimination on the basis of race is to stop discriminating on the basis of race.” Clever, but it sees the problem of race and schools as a purely individual matter (discrimination) rather than a social one (segregation). What about putting it the other way: “The way to end racially segregated schools is to end racially segregated schools.”

But Chopra makes a wonderfully ironic mistake. He writes:

    Despite the overwhelming public support for school integration in both Seattle and Louisville, five powerful white males were enough to squash a society's better nature.

Those five powerful males are Scalia, Roberts, Kennedy, Alito, and . . . Clarence Thomas, the only African American on the Court.


Chopra was obviously being colorblind. His classification of a man as white had nothing to do with the color of his skin but only with the content of his characteristically white opinion on integration.

Why I Am Not a Psychologist

July 2, 2007
Posted by Jay Livingston

A link on some political blog – I wish I could recall whose – took me to this posting on a site called the Straight Dope:

    I recently read Phantoms by Dean Koontz and was curious about his description of the “flatworms in a maze” phenomenon – namely that a flatworm can be taught to negotiate a maze and then ground up and fed to a flatworm that has never seen the maze. This new flatworm will absorb the knowledge of the maze from the first flatworm.
I’ve never read Dean Koontz, but I’m pretty sure the cannibalized-learning theory is bunk, at least where it concerns flatworms. But the mention of these studies took me back to a summer long ago when I worked in a university psych lab run by two psychology professors. It was the beginning of my disillusionment with psychology. It was as though I’d taken a lowly summer job at a law firm and discovered that most lawyers never saw the inside of a courtroom and that their work did not in the least resemble that of Perry Mason.

I had just finished my first year of college, and at the time I still thought that psychologists studied ways to understand and heal minds that were troubled and confused. Minds like my own. I also thought that a professor’s work involved the teaching of students – courses, lectures, exams, that sort of thing. But these two men did neither. They were researchers, and most of their research was about communication in rhesus monkeys. The way they treated the monkeys would today probably land these guys in jail, but as I said, this was a long time ago.

Their domain was the top two floors of one of those tall buildings that get very narrow at the top. There were a half-dozen linoleum-floored rooms, most of them occupied by monkeys. Most of the monkeys were in cages. The few in the experiment were kept in uncomfortable “primate chairs” that allowed very little movement. Our own little Gitmo.

That summer the professors had read a journal article showing that planaria (flatworms) could be conditioned to swim or crawl a maze. For some reason, the article inspired them to branch out from monkeys and to try to replicate these experiments. Step one was to buy some flatworms – I guess there must have been a planaria supply house. Step two was to assign the groundwork to me.

Planaria (flatworms) are very simple organisms. They are worms, and they are flat. They measure less than a half-inch, with a triangular head featuring two eyes set so close together that they look cross-eyed.

My job included their care, feeding, and education (or “conditioning,” as psychologists call it). Feeding meant dropping a piece of raw beef liver into each worm’s Petri dish. The hungry worm would crawl up on the liver and chow down. For their conditioning, I was to put the flatworm in a narrow, water-filled trough with electrodes at each end. I would then turn on a light — planaria are sensitive to light — give the worm a second to realize that the light was on, and then zap him. The worm’s body would contract.

It was “classical conditioning.” The idea was that the worms would learn the light-shock connection. Then, even without the jolt of electricity, the worm would react to the light the same way it reacted to the shock, just as Pavlov’s dogs started salivating at the mere sound of a bell because it had been rung so often at Alpo time.

It sounds easy, but there was one catch. How do you move a worm from its Petri dish to the experimental trough? Our technique was to dip an eyedropper into the dish, suck the worm into the eyedropper, then squirt him out into the trough. After the worm’s experimental session (I forget how many light-shock trials I hit them with each time, maybe twenty), I would put the worm back using the same suck-and-squirt method. Unfortunately, the eyedropper aperture was a bit narrow, and the worm got squeezed each time in its rough passage in or out of the dropper.

The research plan was that after we succeeded with the classical conditioning, we would move on to the “operant conditioning” phase, teaching the little guys to swim a maze – i.e., to bear right at a Y-intersection.

We never got that far. The hardest part turned out to be keeping the planaria alive. Each morning I would check the tank and find a few more of my charges getting paler and paler, becoming translucent and finally giving up the ghost. We brought in new subjects, but they too withered. I figured that the reason for their failure to thrive was the combination of being squeezed through the narrow aperture of the eyedropper and then being electrocuted. That and maybe having to live in water that quickly became murky from the liver decomposing in it.

When I left at the end of the summer, I had not taught a single worm anything – preparation for my eventual career – and as far as I know, my bosses gave up without a publication. Ever since, I have been extremely skeptical about reported findings on flatworm learning. Yes, I know that these studies have been done and replicated. I just choose not to believe them until I see them first hand.

My bosses did publish several articles about the monkeys, but my first-hand experience with those experiments makes me very skeptical of those results too. But that’s another story.

The road sign, which I found using Google, is from the MySpace page of Teresasaurus Rex.

Just Wait Till You’re Older

June 29, 2007
Posted by Jay Livingston

Sometimes when I was an adolescent, I would voice some opinion about teachers or sex or drugs, and an adult would say, “Just wait till you get older and have kids.” And indeed, the opinions of the people in the room often divided along age lines. My peers and I were more liberal than the grown-ups.

In the previous post on age differences in political views, I mentioned the difficulty of knowing whether differences were a matter of age or of generation. If it’s age, then people’s opinions will change as they grow older. But if it’s generational, then the members of that generation will retain their views forever. Imagine today’s twentysomethings in fifty years, still thinking their tattoos are cool, while the youth near them on the beach shake their heads in disbelief.

A survey like the one done by the Times/CBS/MTV shows a cross-section of the population at one point in time. But what we’d really like is “longitudinal” or “time-series” data that can show us what happens over the course of time.

Statistician Howard Wainer has an example that illustrates the dangers of drawing longitudinal conclusions from cross-sectional data. If you did a cross-sectional survey on language development in Miami, you might be tempted to conclude that when Miamians are young, they speak Spanish. As they mature into their middle years, they change to speaking English. And when they get even older, they switch to Yiddish.

Hat tip to my brother Skip for relaying this example.

The Kids Are: 1. All Right, 2. All Left, 3. About the same

June 27, 2007
Posted by Jay Livingston


The New York Times published the results of a poll of young Americans — ages 17 to 29. The headline is “Young Americans Are Leaning Left.” In fact, on most issues, there’s not much difference between the young and the rest of the country, even on issues where you might expect the young to be more concerned, notably global warming.

On the topic of Iraq, the young were more sanguine than their elders, who were more likely (47% vs. 38%) to see the war as going very badly.

It’s not that the young are more pro-war. They’re just more optimistic. The only way that this attitude makes them more left-leaning is that while traditional conservatives want less government and think that the private sector can do everything better, young people are more optimistic about the ability of government to do good. Health care is a good thing, and the young — 62% to 47% — are more likely to favor a government-run health care program. Similarly, winning wars is a good thing, so the young are more likely to think we can win the war.

This raises the question of whether these are true generational differences or merely differences of age. If they are age differences, then the Gen Y’ers, as they grow older, will shift their opinions more towards those of older people today. If the differences are generational, they will keep their current beliefs even as they get older and have children.

Tastes in music, for example, are strongly generational — as I’m regularly reminded when the Allman Brothers play at the nearby theater, and I see who’s lined up on the sidewalk. Not exactly the same crowd that turned out for Pink. I suspect that the young will retain some of their political views — their more libertarian view of homosexuality and marijuana (on abortion their views are almost identical to those of the country as a whole)— in the same way that they’ll save and listen to their Outkast MP3s. But other political positions like party preference may change as they grow older.

Methodological note. The poll was sponsored by the Times, CBS News, and MTV. The sample size was 659, which works out to a margin of sampling error of about 4 percentage points, i.e., a confidence interval 8 points wide. You’d think that with three heavy hitters like these bankrolling the survey, they’d have had a sample at least twice that size.
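For anyone who wants to check that arithmetic, here is a minimal sketch of the standard formula for the margin of sampling error on a proportion. It assumes the most conservative case, p = 0.5, and a 95% confidence level (z = 1.96); both assumptions are mine, since the poll’s write-up doesn’t state them.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of sampling error for a proportion p with sample size n,
    at the confidence level implied by z (1.96 for 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(659)
print(f"margin of error: +/- {moe * 100:.1f} points")
print(f"interval width:  {2 * moe * 100:.1f} points")
```

For n = 659 this gives a margin of error of roughly 4 points either way, which is where the 8-point-wide confidence interval comes from. Quadrupling the sample only halves the margin of error, which may be why even well-funded polls settle for samples this size.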

Presidents and Prime Ministers

June 24, 2007
Posted by Jay Livingston

The New York Times today publishes an op-ed piece on Tony Blair by British journalist A.A. Gill, who makes a similar point to something I posted yesterday. Interestingly, my point of departure was the scene in “The Queen” in 1997 when Blair is installed as prime minister after a landslide election victory. In a few months, his positive ratings had soared to 75%, unprecedented in British politics. Gill is writing about Blair’s departure from office yesterday with approval ratings of 28%, roughly equivalent to those of George W. Bush.

There’s no pleasing the British, or winning their favor. They simply hate politicians. All politicians. Hatred goes with politicians like mint sauce with lamb. It’s as old as Parliaments. . . .

The difference between British politics and American is that you maintain a collective respect for the office, if not the holder. So the presidency is a venerable thing, even if the president is a cretin. And every president leaves office with his title. He will always be Mr. President, a peculiarly regal touch for a republic. But when he wakes up on Thursday, the prime minister will be plain Mr. Blair, M.P.

On the one hand, I feel vindicated to find something I posted only a few hours earlier confirmed in the Sunday edition of the nation’s paper of record. On the other, there’s also the lurking sense that the point is so obvious you can find it in the next US press article about British politics that you read.

Could We Use a Queen?

June 23, 2007
Posted by Jay Livingston

Dan Myers, in a recent installment of Blue Monster in Europe, hears the band at Buckingham Palace play “Stayin’ Alive” and speculates, “The flag was up on the top of the palace, indicating that the Queen was home. I would like to think, therefore, that this performance was a personal request and that she was upstairs working on her own electric slide.”

I watched “The Queen” on DVD recently, which is how I know that the flag Dan refers to is not the Union Jack but the Royal Standard.

Here’s a clearer image.

It’s not the British flag, the Union Jack.
It’s difficult for us Americans to grasp the idea of monarchy. “Stupid” was the comment of the teenager-in-residence who was sitting a space or two down the couch from me as we watched the film.

But there’s something to be said for having a ceremonial head of state, someone who symbolizes the nation as a whole and who stands above partisan politics. The Queen is so far above politics that she’s not allowed to vote. We learn this early in the film, which opens with the election of Tony Blair as prime minister.

“The sheer joy of being partial,” says the Queen. As a person, she no doubt has her political preferences. But as the Queen, she must remain impartial. She is someone the entire country can look to as its leader.

Most European countries, with their long histories of monarchy, have retained a nonpolitical figure as symbolic ruler of the country. In some countries (England, the Netherlands, Norway, Spain, etc.) it’s an actual monarch; in others, it’s a president, who has only ritual duties, while the actual business of running the country falls to the elected prime minister.

But in the US, we have this strange system where a partisan politician is also our ceremonial head of state. It is he who represents the country, attending state ceremonies, recognizing ambassadors, conferring honors, and carrying out other symbolic duties. In the minds of some citizens, to disrespect the president, therefore, is to disrespect the country, even if, as happened in 2000, that president got fewer votes than his opponent. How often have we heard that we must stand behind our president merely because he is our president?

To erode the good will that comes with this symbolic position, a president has to do a really bad job and over a fairly long time. It can be done (Mr. Bush’s latest ratings show only 26% of the country favorable, 65% unfavorable), but it takes sustained effort.

Giving the mantle of symbolic head of state to a partisan politician also can lead to the kind of arrogance we’ve come regretfully to expect of our presidents. They can come to think of themselves in near-kinglike terms — think of Lyndon Johnson’s famous remark, “I’m the only president you’ve got” — rather than as elected politicians. The Bush administration has taken this arrogation of power further than any of its predecessors, believing that it can ignore laws it doesn’t like, withhold information from the Congress and the people, and use the justice system as a political tool.

There may be something about constitutional monarchies that curbs such arrogance. An early scene in “The Queen” shows Tony Blair coming to Buckingham Palace. He has just won the election in a landslide, but he will not be prime minister until he kneels before the Queen and is officially requested by her to form a government. As historian Robert Lacey says in his commentary track on the DVD, “People feel it’s good that these politicians have to kneel to somebody to be reminded that they are our servants.”

In the US, the president is sworn in by the Chief Justice, the Supreme Court being the closest thing we have to an impartial power. But the justices are appointed by politically elected presidents, and as recent history has shown, the Court is quite capable of pure political partiality. Does anyone really believe that the vote in Bush v. Gore was about the law and not about politics? All those five votes that in effect gave Bush the election were Republican appointees. The two Democratic appointees sided with Gore.

Nobody, not presidents or prime ministers, appoints the Queen. Moreover, as historian Lacey notes, the prime minister has to meet with the Queen every week and report to her. The US president does not have to report to anyone. Cabinet members and other administration officials may testify before Congress, and the president himself may hold press conferences. But as the current incumbent has demonstrated, it’s possible to greatly limit the amount of such questioning.

The only thing the US has that takes on some of the magisterial symbolism of the Queen is the flag, which, as an inanimate piece of cloth, cannot do all the things the Queen does. Less officially — only somewhat less officially— there’s God. But over the last half century or so, the Republicans have successfully claimed both God and the flag as belonging exclusively to their party.

As “The Queen” unfolded, the more I watched this very human figure sorting out her roles as grandmother, mother, ex-mother-in-law, and Queen of England, the more I thought that perhaps monarchy isn’t such a bad idea.

(Hat tip and deep bow to Philip Slater, who blogged along similar lines to this post for his Fourth of July essay at Huffington Post.)

The New York Post

June 20, 2007
Posted by Jay Livingston

I love the New York Post. I don’t actually read it; but I see the front page every day. Regardless of the actual content, the message of the subtext could often be summarized in the following headline:

THIS IS NOT A REAL NEWSPAPER.
We’re in it for kicks.

Here’s yesterday’s front page.



The story was that Israel was trying to boost its sagging tourism. Apparently the Israelis thought that potential tourists —younger male tourists — were staying away not because of questions of politics and safety but because the Israel “brand” lacked sex. So they made a deal with Maxim, the lad mag, to sponsor a party honoring Israeli women soldiers. They used this photo of Miss Israel 2004 on the invitation.

The Post usually supports Israel, but when it comes to politics on one side and a bikini and a smutty pun on the other, it’s no contest. (“Piece” seems somewhat dated to my ear, but I don’t spend much time listening to informal talk among the target demographic. And how many Post readers will get the pun on brouhaha?)

There’s a culture and class connection here that I can’t quite put my finger on. It’s a combination of patriotism, prurience, and puritanism, and the Post, like much of the rest of Murdoch’s media empire, thrives on it. Most of Murdoch’s other outlets take themselves more seriously, though when the boys at Fox News slapped on that “Fair and Balanced” slogan, they surely must have had their tongue deep in their cheek and all fingers crossed.

The Murdoch media love sex, especially when it allows the pointing of fingers at other people’s sexuality. I’m sure they’re kicking themselves for failing to come up with “To Catch a Predator,” NBC’s champion in the “moral righteousness cum voyeurism” category, while Fox and the others must content themselves with Britney, Paris, and the rest. (For an interesting take on “Predator,” see Corey Colyer’s recent blog entry.) Typically with these publications, you get babes in bikinis on one page and on the next a sex scandal news item that clearly shows the border between the good guys’ cheerful sexuality and the bad guys’ evil sexuality.

The simplistic moral approach also applies to politics. It’s us against them, and we’re the good guys. Again, the Post, with its lack of pretension to being a real newspaper, provides the best example. Here it is in March 2003, when in the U.N., France and Germany were opposing US resolutions authorizing the invasion of Iraq.


(In case you can’t see it in the photo, the weasel heads have been superimposed on the representatives from France and Germany.)

The Post covers politics the way it covers sports, with the Bush administration as the home team. The main difference is that the Post sports reporters try to get information besides what’s put out by the Yankee front office, and they will actually criticize the Yankee players and management when the team is not doing well. (Currently, the Yankees in MLB are sort of like the US in Iraq — spending a ton of money and not having a lot of victories to show for it, in fact just barely above having a losing record.)

For the New York Post, every day is a moral struggle. Every day, on various fronts, good confronts evil. The forces of selfless virtue, loyalty, common sense, and pride confront the forces of self-interest, danger, sneakiness, corruption, elitism, and immorality. The good guys don’t win every battle, but they can pat themselves on the back for trying. And whatever happens, they can usually find comfort in a picture of a beautiful woman who’s not wearing very much at all.

Rich Girls II

June 18, 2007
Posted by Jay Livingston

More on that Gallup poll in the previous post about Paris, Lindsay, Nicole, and Britney. Around the time I was looking at these poll results, a friend wrote about her daughter’s thirteenth birthday party. “After a year of getting to know these [13-year-old] girls, I've really started to care about them and like them. I think they are good at heart. Some of them come from extremely wealthy families and are spoiled rotten, so it took me a while to find the love for some of them.”

I was reminded of the movie “Thirteen,” in which a girl of modest means is corrupted by a wealthier classmate. Poor girl looks up to “cool,” rich girl; rich girl seduces poor girl into drugs, shoplifting, sex with boys. It’s a cliche, but at least it’s a low-budget, indie version of the cliche.

Like other cliches, it confirms a widely held view, in this case that having a lot of money is dangerous, especially because it can lead girls away from conventional middle-class ways.

(Are there similar tales about boys? I can’t think of any. Rich boys in American stories can be cruel — they can also be helpful — but they seldom corrupt the ordinary boy’s morals as is so common in the fallen-woman stories.)

Ms. Hilton and the others are considerably older than thirteen, but the money-vs.-morals theory retains its attractiveness even when we think about these twentysomethings. Gallup offered four choices for people to explain what caused the problems of these celebs.

“Having too much money at a young age”
“The pressures of fame at a young age”
“Negative influences of the Hollywood culture”
“Parents doing a poor job raising them”

Here are the results:
Too Much Money is the clear winner.

I don’t know the systematic evidence on child-rearing and what might cause girls to have problems. But Gallup respondents didn’t know either. Besides, even if we know that something is true in a general sense, it is impossible to know whether it applies in any individual case. So what we’re looking at here is not solid reality; it’s people’s beliefs about reality, specifically about the effects of money.

Those beliefs seem to be rooted in a relentless belief that only middle-class morality will work. It’s a Goldilocks view of socio-economic status. We believe that poverty is not good for kids, but we also see dangers in great wealth. The middle-income range is just right.

I wonder if people in other societies take a similar view, especially in societies with less of an egalitarian ethos and with some trace of aristocratic tradition. The British may not be pleased with the behavior of the younger generations of the royal family, but I don’t think they attribute the shortcomings to an overabundance of money. It’s also possible that within an upper class, drunkenness and adventurous sexuality are not seen as inherently bad. Fidelity and sobriety are middle-class virtues, not nearly so exalted at the outer reaches of the social distribution.

There also seems to be some ambivalence here about middle-class aspirations. We would all like to have more money, though not too much more. (Ask people what the “right” income would be, the income that would allow them to live comfortably, and you’ll usually get a number that’s about 25% higher than what they’re currently making.) Historically, the American pattern of upward mobility is that parents want their own children to have it better than they did. Parents want to be able to buy stuff for their kids. They don’t want their kids to be at all deprived. Yet, there seems to be a nagging fear that giving kids these advantages might also spoil them.

We project that fear upward. I’m not going to give my kid enough to spoil him, not on my income and not even if I were making 25% more than I do now. But the Hiltons, and even those people who make twice what I make — they’re the ones who risk spoiling their kids.

The irony, of course, is that this analysis is relative to one’s own income, and at all levels throughout the broad spectrum that think of themselves as middle class, people may be applying the same moral-economic formula. Someone who makes half as much as you do may see you as one of those rich people who spoil their kids.

Girls (But Not Boys) Behaving Badly

June 16, 2007
Posted by Jay Livingston

George Bush’s favorability ratings keep sinking. In the latest polls (Quinnipiac, NBC/WSJ), only 28-29% of the people come down on the plus side. But wait. Gallup has found other public figures whose numbers are even lower.


Paris Hilton, of course, was the big winner in the girls-gone-bad sweepstakes. Lindsay Lohan, Nicole Richie, even Britney Spears were more highly thought of, with sympathy ratings soaring upwards of 15%.

Note that Gallup asked only about young women. Gallup was only following the media, and the media would claim that they are only following the public’s interest. True, when it comes to stories about troubles people bring on themselves, the public seems to take much more delight in stories about women than about men. When Paris Hilton was jailed, and especially when she broke into tears on being re-jailed, the tabloid media reported on her suffering with self-righteous glee.

Surely there must be male celebrities who have sinned— Russell Crowe throwing a telephone into the face of a hotel desk clerk, Mel Gibson spewing drunken anti-Semitic remarks at the cop who pulled him over, Robert Downey, Jr.’s recurrent drug problems, etc. Nicole Richie and Paris Hilton in “The Simple Life” set their own glam sensibilities against that of the heartland, but the show was self-mocking and generally not critical of middle America. Borat, by contrast, took some pretty serious swipes at middle America and used some underhanded (though legal) tricks in doing so.

But these men have not been surrounded with the sustained fascination we have given Paris, Lindsay, and the others, nor do the public and media seem to have the same sadistic longing to see them suffer legal or other consequences.

The media framed the Paris Hilton story as the rich girl trying to avoid justice and finally getting what she deserved. The media didn’t report much on what the usual sentence would be for a first-offense driver’s license violation. Probably not jail time. But for the media and public, this case wasn’t really about driving with a suspended license, just as the Clinton impeachment wasn’t really about lying to a grand jury. It was about sex. Paris Hilton’s real crime in the court of tabloid opinion was flaunting her sexuality. And the same may be true of the others.

It doesn’t matter that in many cases, it’s the tabloid media and public that are mostly responsible for making public what had been private — the voyeuristic photos, the theft and distribution of a private video. The media and public strip the clothes off these female celebs and then punish them for indecent exposure.

These stories aren’t news. They’re morality tales, and they show that moral standards are still different for women and men.

Attitudes and Familiarity

June 13, 2007
Posted by Jay Livingston


In the previous post, I said that professors who have the most negative attitudes towards Evangelicals probably are those who have the least contact with them. I don't have any data on that specific issue, but studies on other topics generally confirm the idea that familiarity breeds content. Or if not content, at least a decrease in antipathy.

The graph shows some data from a survey that asked, among other things, if immigrants were a burden on the US.

The people who had little contact with immigrants were more than twice as likely to say that immigrants were a burden.

I live in New York. Buy food in a grocery store, take a cab, buy a newspaper, eat in a restaurant, check into a hotel or hospital — just about anything you do will bring you into contact with immigrants. I doubt that many New Yorkers see immigrants as a threat to the Republic.

The same principle holds for attitudes on issues surrounding homosexuality: should gays be allowed to teach in schools, should they be able to marry?

People with no gay friends or relatives are more than twice as likely to favor allowing schools to fire gay teachers. People who do have gay friends or relatives are more than twice as likely to favor allowing gays to marry.

Evangelicals in the Classroom

June 11, 2007
Posted by Jay Livingston

Q: What religious group do college faculty feel least favorable about?

A: Evangelicals.

That’s one result from a recent study, and in yesterday’s post, I offered my guess that faculty were really reacting to what they perceived as the politics of Evangelicals, not their religion. In that sense, the attitude is different from other kinds of prejudice, especially prejudice based on ascribed characteristics like race.

But I would also guess that the attitude shares something with other kinds of prejudice: those who have the most unfavorable attitudes towards Evangelicals are probably those who have the least contact with them. It’s true of homosexuality, and it’s true in the current debate over immigration.

So I wonder about those professors who say they view Evangelicals unfavorably. I think about the late Donna Darden, who taught sociology in Tennessee, where Evangelicals and fundamentalists were the rule rather than the exception. She had wonderful stories about her struggles to get students to think sociologically. I’m not sure whether Donna was an atheist or Wiccan, but whichever it was, when students learned of her views, some would stand up and “witness” right in class. Here’s an excerpt from something she posted to a sociology Internet group.
Their next assignment calls for them to post a completion to the stem, “If I had been born a member of a different race...” They have read good stuff about the social construction of gender and race. Many will still tell me that they cannot answer that question because God made them the persons they are, and they cannot second-guess God.

But for all her disagreement and frustration with her students, she certainly would never have answered the survey by saying she felt “cold” towards them.

My own contact with Evangelicals and other born-again students has been limited. Northern New Jersey ain’t Tennessee, and up here in Sopranoland, most people are content to be born only once. I started teaching before the Moral Majority became a strong political force, but even so, I may have shared some of the same pre-judgments as the professors in the recent survey. At the very least, I expected that Evangelicals would be closed-minded and dogmatic. But what I found was something else.

First, I never had Donna’s experience of students injecting their theology into sociological discussions. The only way I could even guess that a student was an Evangelical was this: on the first day, I ask students to fill out index cards with their name and phone numbers. But so I’ll have a better chance of remembering them and learning their names, I also ask them to put down the title of the best film they’ve ever seen. Not just the most recent, I say; take a minute to think. OK. Then do the same thing for a book and a record. I also ask for a TV show they watch regularly — one they’d record if they weren’t going to be home when it was on. (This exercise also gives me a small window onto all those areas of pop culture that I’m growing farther and farther away from.)

Every once in a while, there’ll be a student who for best book lists The Bible. So I figure here’s someone who, if not Evangelical, fundamentalist, or born again, at least takes their religion pretty seriously. My sample is small, but my impression is that compared with the average student, they are more conscientious— less likely to miss class and more likely to do the reading and to turn in assignments on time.

But it’s not just that they are well-behaved. They regard the sociological ideas from class as something important, not just a bunch of stuff that you leave behind each day once you walk out of the classroom, except maybe to remember for a test. Where many students are content to “learn” the material in a sort of Durkheim-said-this-Weber-said-that way, these students will follow the line of thought further and look for its corollaries, implications, and applications.

They engage the material more than most students because they walk into the course already having a more or less coherent world view. Most students at age twenty or so have an inchoate set of ideas for understanding the world. They take it as it comes and haven’t thought systematically about the way they are interpreting it. They don’t even see themselves as making interpretations. They have trouble seeing the differences between theories, between Weber and Marx for instance.

But the born-again students have a systematic scheme for encountering the world. They have a “theory,” a set of related ideas, and they are constantly alert to interpret the events of the real world with respect to that theory. Give them some new data or some new ideas, and they want to know how these fit with their own view.

They may reject sociological ideas. They may even, like some of Donna Darden’s students, shun these ideas as the work of Satan. But in order to make that judgment, they first have to think through those sociological ideas and see how they match up against their religious ideas. They have to take the material seriously.

Prejudiced Professors?

June 10, 2007
Posted by Jay Livingston


What are your overall feelings toward Catholics? Use a scale of 0 to 100, where 0 is very cold or unfavorable, and 100 very warm or favorable.

That’s a question from a survey of college faculty. Sociologist Brad Wright has been blogging recently about one finding from this survey. On this warm-cold question, 13% of faculty have a view of Catholics that's below 50 out of 100. The religious group viewed most unfavorably is Evangelical Christians.

Evangelicals were the only group that a majority of faculty rated below 50. Brad Wright, himself an Evangelical, sees this as “prejudice,” similar to race prejudice. And he thinks that this unfavorable attitude probably takes the active form of unfavorable treatment of Evangelical students.

The report has no evidence on discriminatory treatment, so we’re all just speculating on that. But in any case, prejudice is probably the wrong term for these unfavorable feelings. They are not based on some irrational stereotype. Unfortunately, the report doesn’t ask the faculty why they hold their favorable or unfavorable views. That’s one of the big problems with survey research— it doesn’t go very deeply into what people actually think. But from other evidence in the report, it’s pretty clear that the attitude towards Evangelicals is not primarily about religion; it’s about politics. Faculty are more liberal than mainstream America; Evangelicals are conservative. And what faculty were primarily concerned about was not someone else's personal relationship with God and Jesus but their political actions.

In fact, while a bare majority of faculty viewed Evangelicals unfavorably, 71% felt that “this country would be better off if Christian fundamentalists kept their religious beliefs out of politics.”

Strictly speaking, the terms fundamentalist and Evangelical are not interchangeable. It is the fundamentalists who are more politically active. But I suspect that many of the faculty surveyed ignored these differences (if they were aware of them at all) and lumped fundamentalists and Evangelicals together into the single undifferentiated category of conservative Christians. The people who designed the questionnaire seem to have blurred this distinction themselves and may actually have encouraged the oversimplification. The question about politics referred to “Christian fundamentalists,” but the questions on warm or cold feelings asked only about Evangelicals and omitted fundamentalists entirely.


Conservative Christians have been loudest in their views condemning if not criminalizing abortion and homosexuality. (Does Jesus ever mention either of these?) Most faculty (and most Americans) take a more tolerant view on these issues. But what if these Christians had instead been putting their political muscle into raising the minimum wage, creating more equitable health care and tax policies, restricting access to deadly weapons, protecting workers and the environment against powerful corporations, etc.? (I’m not a theologian, but I suspect that you could make a “what would Jesus do” argument for the liberal side of all these issues. In fact, some Evangelicals work for goals liberals would certainly support — adult literacy, food banks, day care, etc.) And then there’s the most important political issue of the day — Iraq.

If the politics of conservative Christians were different, with no change in their theology, faculty would surely view them more favorably.

(I'll continue tomorrow with a more personal take on this issue.)

Character and Conviction — Scooter Libby

June 6, 2007
Posted by Jay Livingston

Stanley Milgram’s experiments on “obedience to authority” and Philip Zimbardo’s prison simulation are two of the most famous studies in all social science. If there’s anything we should have learned from them, it’s this:
In new and unusual situations,
1. We are not very good at predicting what we ourselves or anyone else will do.
2. Individual factors like character and personality are far less important than situational pressures.
Today, the New York Times reprinted excerpts of letters asking for leniency in the Scooter Libby case. The article should have carried the title “Fooled by Character.”

The judge had sentenced Libby to 2½ years in prison for lying to the FBI and to a federal grand jury. Libby frequently claimed to be unable to remember a lot of important things about the disclosure of the identity of a CIA agent. In the criminal trial, the jury concluded, beyond a reasonable doubt, that he was lying about these lapses of memory.

Libby was trying to protect the Bush administration generally and perhaps specific members of it (Rove? Cheney?) in outing CIA agent Valerie Plame. The Bush team leaked Plame’s identity in order to discredit her husband, who had undercut some of the administration’s claims about Saddam’s WMDs.

None of Libby’s supporters (mostly Bush administration biggies) argued that Libby had been wrongfully convicted or that he hadn’t committed the crimes he was convicted of. But clearly, they found it difficult to believe. A man of such good character could never commit such crimes, could he? Here’s James Woolsey, former head of the CIA:
His conviction for perjury and obstruction of justice is completely inconsistent with my knowledge, and highest evaluation, of this man's character and integrity.
And Richard Perle (something of a slimeball himself):
Having known Scooter Libby for many years, I am unable to reconcile the man I know with the crime for which he has been convicted.
Obviously, Woolsey and Perle have not learned the lessons of Milgram-Zimbardo.

More sophisticated is Henry Kissinger:
I would never have associated the actions for which he was convicted with his character. . . . Having served in the White House under pressure, I have seen how difficult it sometimes is to recall precisely a particular sequence of events. This does not justify the action, but it may help you consider mitigating circumstances.
Kissinger at least recognizes situational pressure. But he sees that pressure as affecting only the ability to remember, not the willingness to lie to authorities in order to protect a bankrupt policy and the claims that it was based on. Still, Kissinger, like the others, is blinded by character.

It all reminds me of official reaction when the abuses of Abu Ghraib were exposed and could not be covered up. A military spokesman said, “It's just not something we would do.”

The Daily Show* caught this paradox perfectly when it had Rob Corddry explain, “We shouldn’t be judged on our actions. It’s our principles that matter, our inspiring, abstract notions. Remember: just because torturing prisoners is something we did, doesn’t mean it’s something we would do.”

Committing the crimes of perjury and obstruction of justice was not something that Scooter Libby would do. It was just something he did do.

* Personal note: I will be in the audience at today’s taping of The Daily Show.