For most readers, I suspect, the most shocking aspect of The Post’s series of articles last week on police use of deadly force was the discovery that, compared with other police forces, Prince George’s County officers have killed people so often.
And shocking it was. But what caught my attention, as well as that of other scholars of criminal justice, was the straightforward compilation of data on killings by officers in America's 51 largest law enforcement agencies. As hard as it is to believe, the list printed by The Post last Sunday was the most comprehensive and accurate information on police killings in major U.S. cities to have appeared since 1981 and 1985, when the International Association of Chiefs of Police published federally funded reports on the subject.
In a nation that often seems obsessed with statistics on far less significant topics, this absence of information is bizarre — and dangerous. Deadly force is an inevitable component of police work. Excessive force, on the other hand, is a scourge that undermines a department’s ability to do its job. But citizens who deserve responsive, non-aggressive police protection can’t demand reforms unless they know what the facts are.
There is no federal agency that reports on killings or non-fatal use of deadly force by police. The FBI — which annually collects and publishes data on crime and on the assaults and murders of law enforcement officers — includes no information on violence committed by those officers in its Uniform Crime Reports.
Congress tried to correct this situation in 1994, passing a law ordering the Justice Department to report annually on the use of force, including excessive force, by police officers in America. The problem with this mandate is that Justice was given no real power to demand cooperation from state and local police departments — and, consequently, it doesn’t get usable data. Instead it develops estimates by including in the National Crime Survey — a survey of citizens, rather than police — questions about how often the respondents have been subjected to police use of force. Since dead people can’t participate in such a survey, this work tells us nothing about how often police kill.
In 1979, when I moved to Washington to teach at American University, a federal official told me that the FBI collected data on justifiable homicides by police. I questioned this because I had just left my job as a New York City cop, where I had spent the previous several years writing a doctoral dissertation on police use of deadly force there. During all that time, not one of the NYPD officials responsible for collecting information on shootings or for reporting crime figures to the FBI ever mentioned that these included a “justifiable homicide by police” category. I then checked with the FBI and was assured that it does collect these data, but it does not publish them because it cannot vouch for their accuracy. The FBI has good reason for its doubts: Whenever I have checked the FBI numbers for particular police departments against mortality figures obtained directly from the same agencies, they have differed.
Scholars once assumed that the Department of Health and Human Services' annual report on deaths in America — the report that gives information on deaths by cancer, automobile accident and so on — also provided accurate information on killings by police. During the 1970s, these reports typically indicated that about 250 Americans suffered "death by legal intervention of the police." Then the 1981 police chiefs' study showed that, just in the 54 cities with populations greater than 250,000, the police were killing about 325 people every year. Since these cities accounted for only a quarter of all U.S. police officers, it was clear that the HHS data also were inaccurate — largely because local coroners were including most police killings in a general "death by gunshot wound" category, without specifying that the gunshot was from a police weapon.
What this means is that instead of judging police use of deadly force on the basis of systematic data, citizens must rely on anecdotal evidence — such as the 11 o’clock news. If the police shooting of an unarmed man in Cincinnati causes riots, people watching TV across America may conclude that the Cincinnati police are the worst of the worst. Are they? Nobody can tell, because the absence of data has meant that there has been no way to compare Cincinnati’s experience with that of other cities. (And Cincinnati’s police department wasn’t among the 51 listed in The Post series.)
Similarly, when four New York City cops fired 41 shots, killing Amadou Diallo in his Bronx doorway, many Americans (who had already heard of Abner Louima, sodomized with a stick by a New York officer) regarded the NYPD with horror. But in this case there are some facts to challenge the popular image: the numbers collected by The Post. These show that New York ranked 43rd out of the 51 police departments in terms of fatal police shootings — 0.71 such incidents per 1,000 officers.
Meanwhile, consider the Phoenix and San Diego city police departments — both excellent agencies, both regarded as models of successful, non-aggressive, community-oriented policing. The Post’s data show that their officers are nearly five times as likely as New York cops to shoot and kill people (the San Diego rate per 1,000 officers is 3.27; Phoenix’s is 3.14). Shouldn’t citizens of those cities have simple, official access to this kind of information?
Evaluating police restraint on the basis of anecdotes rather than systematic data has had consequences. Because so many media organizations are based in New York, anything that goes wrong in that city's police department — Louima, Diallo — becomes a national, even international, story. Cops there are on alert that they will be held very publicly accountable for their mistakes.
Here's a contrast: I testified as a defense expert on police practices in the Diallo case. Just a day earlier, I had completed an affidavit in a suit stemming from a police killing in Camden, N.J. The Camden case involved a mentally disturbed man who, during the course of a long confrontation with the police, pulled from his pocket a talcum powder bottle wrapped in a sock. Eleven officers, some as far as 280 feet away, responded by firing at least 106 shots at him. The media coverage of this killing consisted of one small newspaper story. There was no discipline, no criminal trial, no outrage. In such places, where people are not paying attention, the need for change is not recognized and the police come to believe that such behavior is appropriate.
In order for the public to give police use of deadly force the attention it deserves, there must be data. We already know that when communities suffer disproportionately from crime, from cancer, from poverty, knowledge of the relevant statistics has empowered citizens and encouraged them to demand change. It is long past time that consistent, reliable data on deadly force — like The Post’s, which show that fatal shooting rates vary by more than tenfold across these 51 departments — were made available.
Only the federal government can make this happen. One way would be for Congress to put some muscle into the existing legislation by denying federal funds to jurisdictions that fail to collect and provide use-of-force data for inclusion in the FBI's Uniform Crime Reports. However it is done, it is important that our government find a way to assure us that the police we pay to protect us are required to tell us how often they kill.
James Fyfe, who served 16 years on the New York City police force, is a professor of criminal justice at Temple University in Philadelphia.
Author: James J. Fyfe
News Service: Washington Post