A new social media simulation program gives the U.S. military the capability to train its information operators for a wide range of cyber deception and propaganda operations without going on the real internet.
The tool, called the Social Media Replication Toolkit System (SMRTS) and developed by the San Diego-based Cubic Corporation, “harvests and remixes real tweets to generate a convincing online environment for wargames large and small,” as Marcus Weinberger of Defense One puts it.
“The program gathers up real tweets and news articles from a specific region of the world, measures their influence on real-world events, then strips out any information that might identify the actual person who posted them, according to Michal Simek, a senior program director for Cubic. These anonymized tweets are then used to build a simulation of social-media traffic during a crisis. For one three-month exercise, the simulator generated six million tweets. Other kinds of information can also be injected into the simulation.”
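Cubic has not published implementation details, so the following is only an illustrative sketch of the anonymization step described above: the regex patterns, the placeholder scheme, and the `anonymize` function are all assumptions, not SMRTS internals. The idea is that each real handle maps to a stable synthetic one, so conversation threads in the remixed feed still line up after identifying details are stripped.

```python
import re

# Hypothetical patterns for identifying details in a harvested tweet.
HANDLE = re.compile(r"@\w{1,15}")                  # Twitter-style @mentions
URL = re.compile(r"https?://\S+")                  # links can identify the poster
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b") # embedded email addresses

def anonymize(tweet: str, handle_map: dict) -> str:
    """Strip identifying details from one tweet before replaying it.

    Each real handle is assigned a stable synthetic handle (@user1,
    @user2, ...) via handle_map, so repeated mentions of the same
    account stay consistent across the simulated feed.
    """
    def swap(match: re.Match) -> str:
        real = match.group(0)
        if real not in handle_map:
            handle_map[real] = f"@user{len(handle_map) + 1}"
        return handle_map[real]

    tweet = EMAIL.sub("[email]", tweet)  # emails first: they also contain '@'
    tweet = URL.sub("[link]", tweet)
    return HANDLE.sub(swap, tweet)

handles = {}
print(anonymize("Ask @alice about https://example.com/x", handles))
# -> Ask @user1 about [link]
```

A real system would also have to scrub names, locations, and other quasi-identifiers in the tweet text itself, which is a much harder problem than pattern matching; this sketch only covers the mechanical part.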
The SMRTS program appears to be the end result of an effort I wrote about late last year to develop what was then being called Dynamic Social Media for Training and Exercises; if not, it is something very similar to what that plan envisioned.
It makes sense, to some extent, that the Pentagon would want the capability the SMRT System enables. As I’ve previously noted, the military’s so-called WebOps against the Islamic State extremist group were exposed earlier this year by the Associated Press as rife with problems, and based on that report they were probably doing more harm than good in their efforts to convince targets to abandon jihadist ideology or to dissuade them from acting on it. One major problem was a lack of qualified translators: in one case the word “authority” was translated as “salad,” prompting people on social media to joke about the “Palestinian salad.”
Yet some of the proposed solutions to these sorts of problems, such as the notion that we should hand over responsibility for online trolling ops against ISIS to social media bots, seem almost guaranteed to make the problems worse. The idea that we should perhaps not be engaging in these kinds of activities to begin with does not seem to have crossed anyone’s mind in intelligence or national security policy-making circles.
In the past year America has seen the rise of what at times seems like a whole new sub-industry of media dedicated to (largely Democratic) hand-wringing over Russian “election hacking,” disinformation trolling, and influence operations. The confusing and fairly unhelpful term “fake news” has cropped up and quickly become yet more fuel for the partisan flames that have consumed much of the American political scene in recent years.
Less apparent to Americans — for reasons that may be obvious to those who watch these things closely but not to the average voter — has been the exposure in recent years of the West’s own online propaganda operations. The U.S. military’s use of “sock puppet” social media accounts as part of a program called Operation Earnest Voice was revealed in 2011, for example. Then in 2014, details emerged about the activities of the Joint Threat Research Intelligence Group (JTRIG), a division of the United Kingdom’s Government Communications Headquarters (GCHQ), including that it engaged in online “ruse,” “infiltration,” “disruption,” and “false flag” operations.
In 2015, The Intercept published additional documentation showing that JTRIG targeted not only foreign extremist groups, but those it deemed a threat domestically. In the U.S., meanwhile, the Federal Bureau of Investigation — an agency that has been prominently accused of “manufacturing terrorism cases” — has been exposed as using similar tactics, and was even granted official authority last year to impersonate journalists in some cases — a capability it has utilized not only to troll the internet but in at least one case to create an entire fake documentary crew for a sting operation.
Beyond remixed tweets, the new SMRT System incorporates capabilities likely to make it attractive to the Pentagon in the current “fake news” era.
“If we’re building a scenario where we’re doing a simulated event, you could put some fake articles, you can put some real articles and that trains the analyst on identifying what caused the reaction,” Cubic’s Simek reportedly said. “Was it the fake article or was it the real article?”
If the Pentagon wants a program for teaching psychological operators to effectively disseminate lies and to build increasingly complex webs of deception throughout the worldwide information environment, the SMRT System may be the genuine article. Yet whether such a system is a smart one to implement is another question entirely.