Face recognition technology, something that seemed like science fiction just a few years ago, is becoming an increasingly key component of America’s mass surveillance network.
Even as plans for high-tech mass-scale facial recognition systems in places like China raise privacy concerns, the U.S. government was reportedly looking to develop face recognition tech capable of targeting “non-cooperative” subjects “in the wild” as of last year. U.S. police body cameras increasingly incorporate the technology, while American contractors appear more than willing to oblige their government in creating such frightening products as a networked “smart” city-wide automatic face recognition system.
Beyond the inherent privacy issues of a government tracking its citizens everywhere they go, at least in public, through automatic analysis of video of their faces, the technology’s questionable accuracy has also been widely reported. Face recognition’s issues with racial bias have been discussed at length, for example, while efforts to help a privacy-concerned public thwart the algorithms, such as Computer Vision Dazzle (or CV Dazzle), an “anti-surveillance makeup” art project by artist Adam Harvey, have received positive news coverage as recently as last month.
Yet the algorithms are apparently becoming more accurate — making it more important than ever for those who oppose mass surveillance schemes such as automatic face recognition to voice their concerns. Last September it was reported that CV Dazzle would likely soon become obsolete as face recognition developers focus on creating software that can penetrate the kind of simple masks and disguises that activists often wear at protests to conceal their identities. And this week, Defense One reported that U.S. Army researchers have developed face recognition that can “see in the dark” using thermal imaging.
“The method that the researchers use breaks a thermal picture of a face into specific regions and then compares them to an optical image of the same face. The network estimates where key features are in the thermal image in relation to the conventional image. The network’s final product is something like a police sketch — not a perfect match, but with enough overlap in key points to make a high-certainty match,” writes Defense One technology editor Patrick Tucker.
“When using thermal cameras to capture facial imagery, the main challenge is that the captured thermal image must be matched against a watch list or gallery that only contains conventional visible imagery from known persons of interest,” Army researcher Benjamin S. Riggan reportedly said in a statement. “Therefore, the problem becomes what is referred to as cross-spectrum, or heterogeneous, face recognition. In this case, facial probe imagery acquired in one modality is matched against a gallery database acquired using a different imaging modality.”
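The cross-spectrum pipeline Tucker and Riggan describe — estimate key facial features in the thermal probe, then compare their layout against features from conventional visible-light gallery images — can be illustrated with a deliberately simplified sketch. This is not the Army’s actual system: the function names, key points, and distance threshold below are all invented for illustration, and the synthetic data stands in for real feature estimation.

```python
# Toy illustration of cross-spectrum matching: compare key points
# estimated from a thermal probe image against key points stored for
# visible-light gallery faces. All names and data here are hypothetical.
import math

def keypoint_distance(probe, gallery_face):
    # Mean Euclidean distance between corresponding (x, y) key points.
    return sum(math.dist(p, g) for p, g in zip(probe, gallery_face)) / len(probe)

def match_probe(probe, gallery, threshold=5.0):
    # Return the gallery identity whose key-point layout is closest to
    # the probe, or None if no face falls within the match threshold.
    best_id, best_score = None, float("inf")
    for identity, face in gallery.items():
        score = keypoint_distance(probe, face)
        if score < best_score:
            best_id, best_score = identity, score
    return best_id if best_score <= threshold else None

# Hypothetical gallery: identity -> key points (eyes, nose, mouth corners)
# taken from conventional visible-light photos.
gallery = {
    "person_a": [(30, 40), (70, 40), (50, 60), (35, 80), (65, 80)],
    "person_b": [(28, 45), (72, 45), (50, 65), (30, 85), (70, 85)],
}
# Thermal probe whose estimated key points roughly line up with person_a.
probe = [(31, 41), (69, 39), (51, 61), (36, 79), (64, 81)]
print(match_probe(probe, gallery))  # prints: person_a
```

The “police sketch” analogy in the article maps to the threshold: the thermal-derived points never match the visible image exactly, so the system accepts the closest gallery face only when enough key points overlap within tolerance.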
As face recognition becomes a more powerful tool of state surveillance, and appears as though it may become fool-proof in its accuracy sooner than some had predicted, the need to regulate its use and to implement privacy protections for consumers and the public can only grow.