As the U.S. military pushes forward with ambitious artificial intelligence initiatives, those responsible can’t seem to resist describing them in unsettling terms.
Col. Drew Cukor, a project manager for the new “Algorithmic Warfare Cross-Functional Team” (AWCFT) established in April, this week enthusiastically compared recent AI developments to the advent of the atomic bomb during a panel discussion in Washington, D.C., according to C4ISRnet.
“I’m happy to tell you that this is as big as the introduction of nuclear weapons into the Department of Defense,” Cukor reportedly said.
Cukor’s comments come in the wake of similarly frightening words from his boss, Lt. Gen. John N.T. “Jack” Shanahan, director of the AWCFT, who said last month that if the new “algorithmic warfare” team can “show early wins” it can “then start to open Pandora’s box” of artificial intelligence. It’s not clear from the context whether Shanahan’s comment was meant as a joke. Pandora’s box refers to the Greek myth in which Pandora opens a container that releases all the evils of the world, leaving only hope behind.
Shanahan has also previously discussed the example of the apparently ubiquitous white pickup truck, which came up in Cukor’s more recent comments as well.
Some of the AWCFT’s goals, according to a March report from C4ISRnet’s Mark Pomerleau, “under a working title of the ‘Go Big Project,’ are to immediately inject industry’s best technologies in artificial intelligence and deep learning. DoD can take care of the ‘processing’ and ‘dissemination’ portion of [processing, exploitation and dissemination], Shanahan said. It’s the ‘E’ in ‘exploitation’ — it’s the analyst’s time of looking at a video for 12 hours a day to see if the white pickup truck left or entered the compound — that the department needs help with. A machine can watch for the pickup truck, he said.”
In his report on Cukor’s comments this week, meanwhile, Pomerleau notes that, according to Cukor, “the No. 1 thing being chased around Mosul these days” is white pickup trucks. He also describes how the AI system that the AWCFT is working on will function, as Cukor envisions it.
“First, it will look at a detection, classification, alerting and then moving into map display. Detection will put boxes around objects on the screen that algorithms identify. Classification will give those items names based on a confidence scale. Alerting will enable the computer to spot a certain object, unburdening the overburdened analyst poring over a screen trying to find the needle in a haystack a with the camera of poor resolution [sic]. This could be a white pickup truck, for example,” Pomerleau writes.
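The pipeline Cukor describes — detect objects, classify them with a confidence score, then alert the analyst only when a watched class appears — can be sketched in a few lines. This is a purely illustrative mock-up, not Project Maven’s actual software: the `Detection` class, the `alerts` function, and the 0.8 threshold are all assumptions invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # class name assigned by a (hypothetical) classifier
    confidence: float   # confidence score on a 0.0–1.0 scale
    box: tuple          # (x, y, width, height) bounding box drawn on screen

def alerts(detections, watch_label, threshold=0.8):
    """Keep only detections that should interrupt the analyst:
    the watched class, at or above the confidence threshold."""
    return [d for d in detections
            if d.label == watch_label and d.confidence >= threshold]

# One simulated video frame's worth of detections.
frame = [
    Detection("white pickup truck", 0.92, (40, 60, 120, 80)),
    Detection("white pickup truck", 0.55, (300, 50, 110, 75)),  # too uncertain
    Detection("person", 0.97, (10, 10, 30, 60)),                # not watched
]

for d in alerts(frame, "white pickup truck"):
    print(f"ALERT: {d.label} at {d.box} (confidence {d.confidence:.2f})")
```

The point of the alerting step is exactly what the quote says: instead of an analyst watching twelve hours of video, the machine filters every frame and surfaces only high-confidence sightings of the watched object.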
A campaign for an international ban on killer robots has been gaining momentum in recent months, while top Pentagon officials, including former Defense Secretary Ashton Carter, have expressed their views that the military should stop short of creating “fully autonomous” killer robots. Yet many key decisions about the U.S. military’s use and development of robots in the immediate future will be up to President Donald Trump and his administration.
The DoD has recently been funding research aimed at harnessing the brainwaves of soldiers to teach artificial intelligence to identify targets. Still, while machines may soon be doing more of the mundane work of watching for white pickup trucks, we can rest assured it will be a human drone operator who makes the call on whether to pull the trigger — for now, at least.
“Hopefully,” Cukor said, “we’ll begin to do things with the super-high confident level we’ve always wanted to.”