A Predictive Tool For Armed Police
Police departments may soon be able to use behavioral data to spot warning signs that an officer is breaking the force’s rules, or is even at risk of shooting an unarmed civilian.
To build the algorithms that may one day be able to create a sort of “risk score” for police, researchers are using familiar tools: data from police body cameras and squad cars, and the internal reports usually kept locked away inside police departments, including records of officer suspensions and civilian complaints.
Of all this information, body cameras, which were purpose-built to create an objective and unaltered record of an officer’s every move on the job, may be the most valuable. At least in theory: Since the Justice Department began offering millions of dollars in grants for body cameras in 2015 and advocates began clamoring for the technology, police have claimed that cameras fell off, came unplugged, or malfunctioned, and that footage was deleted.
But the push to use body cameras to monitor police now has a surprising source: the camera industry itself. In late April, Axon, the No. 1 manufacturer of body cameras in the United States, announced its Performance tool, which is seemingly aimed at the long line of high-profile body-camera failures.
The tool, a paid upgrade for current customers, is a dashboard that quantifies how often officers turn their cameras on during calls and whether they categorize videos correctly.
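Axon hasn’t published how Performance computes its metrics. As a rough illustration only, a per-officer activation rate could be derived from dispatch and recording logs along these lines; the field names below are hypothetical:

```python
# Hypothetical sketch: per-officer camera-activation rate from call logs.
# Field names (officer_id, call_id, has_video) are invented for illustration;
# this is not Axon's actual data model or calculation.
from collections import defaultdict

calls = [  # one record per dispatched call
    {"officer_id": "A12", "call_id": 1, "has_video": True},
    {"officer_id": "A12", "call_id": 2, "has_video": False},
    {"officer_id": "B07", "call_id": 3, "has_video": True},
]

totals = defaultdict(lambda: {"calls": 0, "recorded": 0})
for c in calls:
    t = totals[c["officer_id"]]
    t["calls"] += 1
    t["recorded"] += c["has_video"]  # True counts as 1

for officer, t in totals.items():
    rate = t["recorded"] / t["calls"]
    print(f"{officer}: {t['recorded']}/{t['calls']} calls recorded ({rate:.0%})")
```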
Axon’s announcement came just after a jury convicted the Minneapolis police officer Mohamed Noor of shooting and killing an unarmed civilian, the Australian-born yoga instructor Justine Damond. The case is among the most high-profile incidents of police violence that involved body-camera failure.
While both Noor and his partner wore body cameras the night Damond was shot, neither was turned on to record the shooting. Shannon Barnette, the sergeant on duty and Noor’s supervisor, was the first to arrive on the scene after Damond’s death. Footage from her body camera is split into three parts.
In the first, she drives to the scene. In the second, she approaches another officer; says “I’m on,” presumably referring to her body camera; and then turns the camera off. The footage resumes two minutes later. Prosecutors asked Barnette why her camera was turned off, then on. “No idea,” Barnette responded. She testified that the department’s policy on when the cameras are supposed to be on was “not clear” at the time. Since the shooting, the Minneapolis Police Department has revised its policy: The cameras stay on.
Lauren Haynes, the former associate director of the Center for Data Science and Public Policy at the University of Chicago, helped design a statistical model to predict when officers may become involved in an “adverse event,” anything from a routine complaint up to an officer-involved shooting. The project didn’t use the kind of body-camera data that Axon’s new tool works with, but she says they’re “absolutely something that could be put into the model.”
The team found that a number of stressors were related to those adverse events, including whether officers worked a second job, whether they took too many sick days, and whether they’d recently responded to a domestic-violence incident or a suicide. Place matters, too: Officers were more likely to be involved in an adverse event if they were sent into neighborhoods far from their usual beat.
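The article doesn’t detail the Chicago team’s actual features, algorithm, or training data, but a risk score of this kind can be framed as a classifier over stressor features. A minimal sketch, assuming a logistic regression trained on past adverse events, might look like this; all feature names and numbers below are fabricated for illustration:

```python
# Illustrative sketch only: a logistic-regression "risk score" over stressor
# features like those the Chicago team describes. The real model is not
# public here; the data and labels below are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: works_second_job, sick_days_last_quarter,
# recent_domestic_violence_or_suicide_calls, dispatched_outside_usual_beat
X = np.array([
    [1, 9, 3, 1],
    [0, 2, 0, 0],
    [1, 5, 1, 1],
    [0, 1, 0, 0],
    [1, 7, 2, 1],
    [0, 3, 1, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = past adverse event (synthetic labels)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba([[1, 8, 2, 1]])[0, 1]  # probability of class 1
print(f"estimated adverse-event probability: {risk:.2f}")
```

The output is a probability, not a verdict, which is what makes the intervention framing Haynes describes possible: a department could route a high score toward counseling or a schedule change rather than discipline.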
Haynes thinks it’s possible that officers won’t be completely opposed to the idea of risk scoring. “If it comes off as a punitive thing, people are going to be against it,” she says. On the other hand, if the scores are presented as a tool to keep departments from pushing officers too hard, the plan might gain some support.
“You want to put people in the right interventions for them,” Haynes says. “There are all kinds of different solutions depending on what the specific risk is.”
Predictive tools carry an inherent risk: they offer only the probability that an event will happen. They can be wrong, or even dangerous, creating feedback loops that penalize officers who seek counseling, for example. Cameras and algorithms offer potential tools for police accountability, but they don’t ensure it.