Yesterday, a discussion I had with Justin Feldman of Harvard University’s public health school was published on the Voices blog of Open Society Foundations. I’ve re-posted it here…
A new study supported by the Open Society Foundations finds official public health data on law enforcement–related deaths to be severely lacking. Open Society Public Health Program Officer Marc Krupanski speaks with Justin Feldman, the study’s lead researcher, about the implications of these findings.
Why is this study needed?
The U.S. government does not accurately track killings by police. As a result, the public cannot know how many of these deaths occur each year, where they are happening, and under what circumstances.
My coauthors and I looked at national mortality data, which is based on state death certificates, to get a better sense of why the system is undercounting police killings. Our ultimate motivation is improving public health data collection on police violence, which can help guide reforms and hold law enforcement accountable.
What role can health researchers and practitioners play in the public debate about policing and safety?
Policing and public health overlap in a few different ways. For one, police departments are often tasked with addressing key public health concerns, responding to everything from substance use to sex work to mental health crises. In this context, it is incumbent upon public health professionals to advocate for better policing practices that do not undermine harm reduction goals, and also to develop alternative strategies that do not involve police.
A second area—the one our research addresses—involves using injury and mortality data to monitor police violence. This sort of research can inform the public about whether police violence is increasing or decreasing over time, which police departments have the highest rates of civilian deaths, and how large inequalities in rates of police violence are by race, gender identity, and income.
Were the results of the study surprising to you?
Our main findings were not surprising. It has long been suspected that death certificate–based data undercount police killings by wide margins. It was clear that newer databases like the Guardian’s The Counted, which track deaths by searching through local news reports, would be far superior to the official data.
With regard to the death certificate data, it was surprising how much the quality varied by state. Oklahoma experienced more than 30 deaths at the hands of police in 2015, for example, yet not a single one was counted in official mortality data. Oregon, on the other hand, did a much better job, recording more than 80 percent of its police killings.
How might the study inform policymakers outside the United States?
It appears that police killings are extremely rare in other wealthy democracies, so our study may be somewhat less relevant in those countries. But I do see a role for improved data collection in middle-income countries like Brazil or Mexico, where the health sector can play a bigger role in documenting state violence.
How should policymakers and advocates use these findings, going forward?
We would like governments at the city, state, and federal levels to provide data to the public—in real time—on killings by police. We have started to connect with activists around the United States, particularly in the fields of public health and medicine, who share this goal. We hope momentum builds now that this study has come out, highlighting the shortcomings of the current system in greater detail.