The last few years have seen the rapid rise of body-worn cameras used by police departments around the United States, fueled by tens of millions of dollars in Justice Department funding. Originally justified as a way to increase police accountability and transparency, the cameras’ usefulness for that purpose is increasingly in question as they begin to incorporate advanced facial recognition technology.
“Body-worn cameras hold tremendous promise for enhancing transparency, promoting accountability, and advancing public safety for law enforcement officers and the communities they serve,” then-Attorney General Loretta Lynch said in 2015 in announcing federal funding for body cameras through a program The Atlantic describes as “an effort to restore the public’s trust in law enforcement after high-profile instances of black men being shot by police.”
Yet it was also The Atlantic that, just a few days after publishing those words about the ongoing body camera funding program last fall, ran another article noting that body cameras were “betraying their promise”: officers frequently have not had their cameras turned on at crucial moments, such as during officer-involved shootings, and in many cases footage is made available to police but not to the public.
“Instead of providing an independent documentation of an event, body cameras seem to be one more way that police officers can shore up their version of events on the ground,” noted associate editor Robinson Meyer. Less than a month later, The Atlantic also noted that about half of all Americans are included in one or another searchable police face recognition database, and that such databases can frequently turn up false positives based on racially biased algorithms.
Not long after that, a government report on police body cameras was published. “At least nine different device manufacturers either currently allow facial recognition with their equipment or have built in the option for such technology to be used later, the report found,” notes Kevin Collier of Vocativ, though the lengthy report does not appear to clearly address whether most such body-camera face-recognition technology functions in real time or is simply usable for later analysis of video footage.
“There is no indication that these [body worn camera] systems will stop proliferating,” according to the report. “In fact, vendors are developing and fine-tuning next-generation BWC features such as facial recognition and weapons detection.”
Indeed they are. On Monday, prominent device manufacturer Motorola, along with a lesser-known artificial intelligence startup called Neurala, announced a new partnership “to develop intelligent cameras for public safety users,” according to a press release. “The goal is to enable police officers to more efficiently search for objects or persons of interest, such as missing children and suspects.”
As Patrick Tucker, technology editor for Defense One, notes, Neurala’s work represents a significant advance in body camera facial recognition.
“Italian-born neuroscientist and Neurala founder Massimiliano Versace has created patent-pending image recognition and machine learning technology,” Tucker writes. “It’s similar to other machine learning methods but far more scalable, so a device carried by that cop on his shoulder can learn to recognize shapes and — potentially faces — as quickly and reliably as a much larger and more powerful computer. It works by mimicking the mammalian brain, rather than the way computers have worked traditionally.”
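Neurala has not published the code behind its approach, but to give a concrete sense of what running vision processing directly on a small device looks like, here is a minimal sketch using OpenCV’s stock, pretrained face detector. It is not Neurala’s technology, just an illustration of frame-by-frame detection happening locally on the device rather than on a remote server.

```python
# Illustrative sketch only: a stock OpenCV Haar-cascade face detector,
# NOT Neurala's patent-pending technology, run frame by frame on a
# device's own camera feed to show what "on-device" vision processing means.
import cv2

# Load a pretrained frontal-face detector that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

capture = cv2.VideoCapture(0)  # 0 = the device's default camera

while True:
    ok, frame = capture.read()
    if not ok:
        break

    # Detection runs locally; no footage needs to leave the device.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("on-device detection (illustrative)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

capture.release()
cv2.destroyAllWindows()
```

Even this off-the-shelf detector runs comfortably on modest hardware; the claim behind Neurala’s approach is that far more capable recognition, and even learning, can be squeezed into the same small footprint.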
He also notes that this project to create surveillance devices for use primarily by local law enforcement ultimately has its origins at the Pentagon, as is the case with so much of the high-tech weaponry and equipment that ends up in the hands of local cops.
“Versace’s research was funded, in part, by the Defense Advanced Research Projects Agency or DARPA under a program called SyNAPSE,” Tucker writes. “In a 2010 paper for IEEE Spectrum, he describes the breakthrough. Basically, a tiny constellation of processors do the work of different parts of the brain — which is sometimes called neuromorphic computation — or ‘computation that can be divided up between hardware that processes like the body of a neuron and hardware that processes the way dendrites and axons do.’ Versace’s research shows that AIs can learn in that environment using a lot less code.”
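The SyNAPSE hardware itself is far more exotic, but the basic unit Tucker is describing, a “neuron” that accumulates weighted input arriving on its “dendrites” and fires a spike down its “axon” once a threshold is crossed, can be sketched in a few lines of ordinary code. The toy simulation below uses made-up parameters and is only meant to illustrate the standard leaky integrate-and-fire model that neuromorphic chips implement natively; it is not code from DARPA, Neurala, or Versace’s paper.

```python
# Toy leaky integrate-and-fire neuron: a textbook model of the kind of unit
# neuromorphic hardware implements natively. All parameters are illustrative,
# not taken from the SyNAPSE program or Neurala's work.
import random

LEAK = 0.9                 # fraction of potential retained each step (the "cell body" leaks)
THRESHOLD = 1.0            # membrane potential at which the neuron fires
WEIGHTS = [0.3, 0.5, 0.2]  # synaptic weights on three inputs (the "dendrites")

def simulate(steps: int = 20) -> None:
    potential = 0.0
    for t in range(steps):
        # Incoming spikes from three upstream neurons (random for the demo).
        inputs = [random.choice([0, 1]) for _ in WEIGHTS]

        # Dendrites: weight and sum the incoming spikes into the leaky potential.
        potential = LEAK * potential + sum(w * s for w, s in zip(WEIGHTS, inputs))

        # Axon: emit a spike and reset once the threshold is crossed.
        if potential >= THRESHOLD:
            print(f"t={t:2d}  spike!            inputs={inputs}")
            potential = 0.0
        else:
            print(f"t={t:2d}  potential={potential:.2f}  inputs={inputs}")

if __name__ == "__main__":
    simulate()
```

The appeal for a body-worn camera is that dedicated hardware running large networks of such simple units can perform recognition tasks on very little power and, as Versace’s research suggests, with far less code than conventional approaches.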
We may be no nearer to achieving police accountability, but leave it to the military-industrial complex to take a seemingly well-intentioned idea and turn it into a counterproductive yet self-perpetuating money pit, in all likelihood making the problem of bias in policing worse, while expanding already pervasive government surveillance.
