Packaging Air

February 26, 2010
Posted by Jay Livingston

(A Dan Myers* kind of post)

I had to buy low-dose aspirin (81 mg). One a day, supposedly good for the heart. I went for the generic brand, of course. It came in two sizes – 120 pills and 300 pills. The larger size was the better bargain. And it certainly looked much larger on the shelf.

Then I got home and opened the package. The bottle was mostly empty. I had bought a lot of air. The 300 little aspirin tablets were all there I guess, though I didn’t bother to count them. But they would have fit into a bottle one-third the size. The pill packers hadn’t even put in the usual wads of cotton to fill out the empty space. (What are you supposed to do with that cotton anyway? Do you swallow it with the aspirin, or before? Or do you use it to sop up spilled water?)


In the picture, I’ve dumped the 300 aspirin into a plastic cup that’s about the same diameter as the bottle, and I’ve drawn a line on the bottle to show the level the aspirin reached.


*Dan would have made a video showing the 2/3 empty bottle and him pouring the aspirin into the glass, and maybe popping one into his mouth. Or probably something more amusing. Like taking a huge bag of potato chips, pounding on it till the contents were small particles, and then pouring that into a thimble.

Pimp My Write

February 23, 2010
Posted by Jay Livingston

It doesn’t have much to do with sociology, but this correction in the Times was too good to pass up.


“Pimp” as a verb has now become such a mainstream term that a Times reporter mistakes pumped up for pimped out. (The full Times article is here.)

Ben Yagoda says that the title he really preferred for his book on the parts of speech was Pimp My Ride. The name of the MTV show provides such a good example of the fluidity of parts of speech – a noun turned into a verb, a verb into a noun. Instead, he went with When You Catch an Adjective, Kill It.

Hat Tip: Brendan Nyhan reporting a tweet from Ben Smith at Politico

Cooking the Books - A Second Look

February 19, 2010
Posted by Jay Livingston

Do the police undercount crime?

The graph I cribbed from Rick Rosenfeld in yesterday’s post showed a remarkable similarity between victimization surveys and official crime statistics. In 2000, for example, the rate of reported burglaries according to the NCVS was nearly identical to the UCR rate. Both were about 4.4 per 1,000.

Yet in the recent Eterno-Silverman study, police commanders, responding anonymously, said that crime statistics were suppressed. And Josh in his comment yesterday refers to Peter Moskos’s “let me count the ways” description of how the police keep crimes off the books. (See Moskos’s own take on the study at his website.)

The problem is that the graph I presented was somewhat misleading. The NCVS and UCR rates of burglary do not measure exactly the same thing. It’s not exactly oranges and apples; more like oranges and tangerines.

1. The NCVS data are for the New York metro area, so we have to use similar UCR data even though the rap about fudging the stats is only about the NYPD. No way to get around that problem.

2. More crucially, the NCVS counts only residential burglaries; the UCR number includes both commercial and residential burglaries. Nationwide, about 2/3 of all UCR burglaries are residential. Using that figure for the New York area, we get a UCR rate for residential burglaries of only 3.0 per 1,000 population, about one-third less than we would expect from the estimate of the number of residential burglaries that victims say they reported. Here’s an amended graph. I’ve added a line for residential burglaries that uses the simple 2/3 formula.


The rate of residential burglaries that victims say they report is usually one-and-a-half to two times greater than the rate of residential burglaries officially “known to the police.” For the year 2000, the NCVS rate of 4.4 per 1,000 population works out to 40,000 reported residential burglaries. If 2/3 of UCR burglaries are residential, only 27,500 of those 40,000 made it onto the police books.
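For anyone who wants to retrace the arithmetic, here is a quick back-of-the-envelope sketch (in Python). The post doesn’t give the population base; roughly 9.1 million is what the 40,000 figure implies (40,000 ÷ 4.4 per 1,000), so treat that number as an assumption rather than an official count.

```python
# Back-of-the-envelope check of the burglary numbers above (a sketch, not official data).

population = 9_100_000        # assumed: implied by the 40,000 figure (40,000 / 4.4 per 1,000)

ncvs_reported_rate = 4.4      # per 1,000: residential burglaries victims say they reported (NCVS, 2000)
ucr_residential_rate = 3.0    # per 1,000: UCR burglaries scaled to residential-only with the 2/3 share

ncvs_reported = ncvs_reported_rate / 1000 * population        # about 40,000 reported to police
ucr_residential = ucr_residential_rate / 1000 * population    # about 27,300, close to the 27,500 above
gap = ncvs_reported - ucr_residential                         # roughly 12,700 burglaries not on the books

print(f"reported: {ncvs_reported:,.0f}   on the books: {ucr_residential:,.0f}   gap: {gap:,.0f}")
```

Whatever population you plug in, the counts shift, but the basic ratio of about one-and-a-half reported residential burglaries for every one on the books does not depend on it.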

Does that mean that the police canned 12,500 reported burglaries? Probably not. There may be other explanations for some of the discrepancy. But the data do provide some support for those who are skeptical of the precision of the police numbers.

Cooking the Crime Books?

February 18, 2010
Posted by Jay Livingston

“Crimes known to the police” is the official count of Crime in the United States – the annual report published by the FBI, which compiles data from local police departments. It’s also known as the Uniform Crime Reports (UCR).

Many years ago, a friend of mine found that his car had been broken into and wanted to report the crime to the police. He went to the local precinct, and when the desk sergeant finally acknowledged him, he said, “Someone broke into my car and stole my stuff.”

“So what do you want me to do?” said the sergeant.

That was one larceny that never became “known to the police,” at least not on the books of the 20th precinct.

The problem of uncounted crime has been around a long time. In the late 1940s, New York’s burglary rate grew by 1300% in a single year, a huge increase but entirely attributable to changes in bookkeeping. Word had gone out that burglaries should no longer be routinely assigned to “Detective Can.”

In the 1980s, Chicago’s robbery rate rose after the FBI threatened the city that it wouldn’t include their data because the numbers were so suspect. Atlanta kept its numbers artificially low prior to the Olympics. This week, the Dallas police chief is under attack for the way his department reports crimes.

Now two criminologists, John Eterno and Eli Silverman, are claiming that New York’s crime data have been fudged consistently for the last 15 years, and they point to CompStat as the culprit (NY Times article here.) CompStat is the system that William Bratton brought to New York when he became police commissioner in 1994. It required commanders to report every week on statistics and patterns of crime in their areas.

Eterno and Silverman gave anonymous surveys to retired precinct commanders. Under pressure to appear effective in the war on crime, precinct commanders might stretch the facts. The value of a theft might be creatively investigated to keep the total under the $1,000 threshold between a misdemeanor and the felony known as “grand larceny.” Felonies look worse on your statistical report.

A purse snatch might get recorded as a theft instead of a robbery because robberies fall into the broader category of “violent” crimes. Or victims, like my friend in the old days, might be persuaded not to bother reporting the crime.

In an op-ed in the Times yesterday, Bratton vigorously defended the NYPD numbers. He provided no data, but he could have.

Since 1973, the US has had an alternate count of crime, the National Crime Victimization Survey. Most of the data are for the US, but Rick Rosenfeld and Janet Lauritsen were able to get three-year averages for New York City, and they have looked at the data for burglary.



The graph shows the rates (in percents) of
  • people who told the NCVS they had been victims of a burglary
  • people who say they reported the burglary to the police
  • the official rate of burglaries “known to the police”
The numbers are not precisely comparable (the NCVS rate may be based on households rather than population, and the UCR rate includes commercial burglaries as well as residential). But the data in the graph do not support the idea that CompStat increased the fudging of burglary statistics. If it had, then starting in 1994, we should see a widening gap between the NCVS line and the UCR line, with the UCR line moving downward much more. But if anything, it’s the Victimization line that descends more steeply.

In the decade following CompStat, both sources of data show a 68% decrease in burglary. So if commanders were cooking the books, they weren't including burglary in the recipe.