Last summer, a coalition of American civil rights groups came together to denounce the proliferation of “predictive policing” technologies that they described as “profoundly flawed.” Nevertheless, the development of these technologies hurtles forward both in the United States and around the world.
In the U.S., President Donald Trump appears enthusiastic about the prospects for rooting out pre-crime, having handpicked a Homeland Security team with close ties to facial recognition and biometric surveillance firms. Yet America is not alone: Israel reportedly moved last month to adopt a mandatory biometric database for its citizens, and a U.K. group this week reportedly received a £3m government grant for predictive policing research.
Ongoing research and development with potential applications for predictive policing covers a wide range of disciplines, from social science to neuroscience, up to and including the use of brain scans to physically read the minds of would-be criminals. Researchers “have discovered that brain imaging can determine whether someone is acting in a state of knowledge about a crime,” it was reported this week, and while discussion so far has focused on future courtroom use of such methods, “predictive” applications, though frightening, are not difficult to imagine.
With an eye towards selling the idea of predictive policing to the public, meanwhile, one startup, CivicScape, has reportedly released its algorithm and data online for public scrutiny. “By making our code and data open-source, we are inviting feedback and conversation about CivicScape in the belief that many eyes make our tools better for all,” the company said in a statement.
Technologies that purport to predict criminal tendencies, however, have been widely criticized for built-in racial bias, among other major problems, as have the biometric surveillance tools, such as facial recognition, that enable the vast data collection behind algorithm-driven crime prediction.
“CivicScape is built with a primary focus on using only the most reliable data and includes multiple safeguards to enforce that standard,” the company claims. “This doesn’t mean that law enforcement and communities can’t use crime data to anticipate crime, but it does mean we must understand and measure bias in crime data that can result in disparate public safety outcomes within a community.”
Clearly, given the tone of some of the media coverage of predictive policing initiatives, companies like CivicScape have a strong incentive to make these technologies more palatable. Many Americans, after all, are more familiar with notions such as “PreCrime” as a feature of the dystopian science fiction future depicted in the movie Minority Report than as a fact of real life. And it doesn’t necessarily help the image of the startups trying to sell this stuff that some of them, such as the widely used PredPol, appear to have made the conscious decision to incorporate sinister-sounding Orwellian newspeak into their company branding.
If the increased use of so-called predictive policing is indeed an inevitability, then CivicScape’s move towards increased transparency is certainly a step in the right direction. Yet it is far from clear that the seemingly inherent problems with this kind of technology can be solved at all, or that the pros of using it will ever outweigh the cons.
“While this startup might be the first to publicly reveal the inner machinations of its algorithm and data practices, it’s not an assurance that predictive policing can be made fair and transparent across the board,” writes reporter Dave Gershgorn. In other words, the fact that CivicScape sees an opportunity to attempt a more PR-savvy foray into the emerging predictive policing market doesn’t in itself make this budding industry any less problematic from a regulatory or civil rights perspective.
It may very well turn out that there is no safe, responsible, or socially desirable way to predict crimes before they happen — but this is not likely to stop the government from trying, or to stop the private sector from profiting from politicians’ attempts to fulfill this questionable and quite likely impossible goal.