It is no big secret that the U.S. military and its friends in private business want to wire their latest technology directly into the human brain. In late 2016, I wrote about the Defense Department using “electrical brain stimulators” to give its drone operators “minor brain zaps” to keep them awake, for instance. More recently, I noted that prominent tech executives ranging from Facebook’s Mark Zuckerberg to Elon Musk, through his Neuralink venture, have been enthusiastic about experimenting with “brain computer interface” technology.
So it doesn’t come as a huge surprise, though it’s also far from good news in my opinion, that the Pentagon’s high-tech research department, the Defense Advanced Research Projects Agency (DARPA), has announced work on a “neural interface,” which hypothetically “would both allow troops to connect to military systems using their brainwaves and let those systems transmit back information directly to users’ brains,” NextGov reports.
“From the first time a human carved a rock into a blade or formed a spear, humans have been creating tools to help them interact with the world around them,” Al Emondi, program manager at DARPA’s Biological Technologies Office, reportedly said. “The tools we use have grown more sophisticated over time … but these still require some form of physical control interface—touch, motion or voice. What neural interfaces promise is a richer, more powerful and more natural experience in which our brains effectively become the tool.”
Emondi’s description is perhaps overly dramatic, and it obscures just how huge a technological breakthrough a practically functional “neural interface” would represent, along with all of the ethical and logistical questions that such breakthroughs often entail. Yet NextGov‘s Jack Corrigan, as a member of the independent press, does not seem to take an appropriately critical view of Emondi’s comments in his role as an ad hoc spokesman for the military.
“DARPA began studying interactions between humans and machines in the 1960s, and while technology that merges the two may sound far-fetched, the organization already proved it’s possible,” Corrigan notes. It’s worth pointing out that the new program he is reporting on is called Next-Generation Non-Surgical Neurotechnology, or N3.
“The N3 program is divided into two tracks: non-invasive interfaces that sit completely outside the body, and minutely invasive interfaces that could require users to ingest different chemical compounds to help external sensors read their brain activity. In both tracks, technologies must be ‘bidirectional,’ meaning they can read brain activity and also write new information back to the user,” Corrigan writes.
“While those capabilities might fuel conspiracy theories about government mind-reading and mind-control, Emondi told Nextgov that won’t be the case—scientists are only beginning to figure out how the brain’s 100 billion neurons interact, so controlling those interactions is next to impossible. Instead, he said it’s better to think of N3 technology as a means to use a computer or smartphone without a mouse, keyboard or touch screen.”
It may be the case that functional neural remote control is still a long way off, yet it is also worth taking a closer look at the history of efforts to develop such technology that Corrigan so quickly glosses over. As I detailed in a lengthy feature story last year for Massachusetts alternative newspaper Dig Boston and the Boston Institute for Nonprofit Journalism, such experiments in the 1960s were not always “Non-Surgical” and often far from “minutely invasive.” In 1966, at one of the most prestigious research hospitals in the country, Leonard Kille, an engineer who had worked for defense contractors and had a key connection or two to the shadowy world of clandestine intelligence, was subjected to an experimental brain surgery procedure that ultimately ruined his life.
Though a true “smoking gun” can be hard to find when it comes to pinning down cases of research covertly funded decades ago through the Central Intelligence Agency’s notorious “mind control” programs such as MKULTRA, in which many of the relevant files were ultimately destroyed, circumstantial evidence points to Kille as being as good a case as any for having received such questionable support.
Given Emondi’s role here as a mouthpiece for DARPA, an agency that obviously keeps much of its work classified, his reported claim that “scientists are only beginning to figure out how the brain’s 100 billion neurons interact, so controlling those interactions is next to impossible,” as Corrigan paraphrases it, should be taken with a grain of salt. This is especially true given that Corrigan writes, just two paragraphs later, that Emondi “theorized the interface could be used to help a pilot coordinate a fleet of drones with their thoughts or troops to control a remotely deployed robot by using their brain’s motor signals.” If such a control system were indeed “bidirectional,” as stated, it seems as though it, or something developing directly out of it, could indeed qualify as “mind-control,” if we are to use a fairly straightforward and self-explanatory definition of that term.
Exactly what the U.S. government is capable of, or soon will be, with its classified technology in the realm of “remote mind control” must remain for now largely a matter of continued speculation. As I wrote about earlier this year, in April an intelligence “Fusion Center” in Washington State “accidentally” released some files on this topic, in a move that had some of the hallmarks of a disinformation operation.
Yet the very fact that the Pentagon wants to spend taxpayer money on this research at all should be enough to warrant press scrutiny. Emondi’s seemingly conflicting statements about what exactly his agency aims to develop only intensify the need for journalists at influential publications, who have access to officials in positions like his, to ask tough questions.
To his credit, Corrigan apparently asked DARPA how much taxpayer money will be spent on its “N3” efforts, which are projected to play out over the next four years, although the agency declined to comment.
“Given the intensely personal nature of the technology, DARPA is requiring designs to comply with a number of health and safety requirements, and also address any potential cybersecurity concerns,” Corrigan writes. “While today the project’s biggest ethical questions relate to safety and risk of testing, ‘if N3 is successful,’ Emondi said, ‘I anticipate we could face questions related to agency, autonomy and the experience of information being communicated to a user.'”