“I think the idea that fake news on Facebook … influenced the election in any way is a pretty crazy idea,” CEO Mark Zuckerberg said back in November, following Donald Trump’s poll-defying win. A new report from the company, however, titled “Information Operations and Facebook,” shows just how far the social media giant’s position has shifted in only a few months.
“Of all the content on Facebook, more than 99% of what people see is authentic,” Zuckerberg wrote in a Facebook post shortly after the election. “Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
Yet the new report from Facebook’s security team, which uses phrases like “disinformation,” “false amplifiers,” “strategy to distort public perception” and “false flag operations,” paints a somewhat different picture of the company’s present stance.
“The term ‘fake news’ has emerged as a catch-all phrase to refer to everything from news articles that are factually incorrect to opinion pieces, parodies and sarcasm, hoaxes, rumors, memes, online abuse, and factual misstatements by public figures that are reported in otherwise accurate news pieces,” the report states. “The overuse and misuse of the term ‘fake news’ can be problematic because, without common definitions, we cannot understand or fully address these issues.”
Accordingly, the report defines “information (or influence) operations” as: “Actions taken by governments or organized non-state actors to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome. These operations can use a combination of methods, such as false news, disinformation, or networks of fake accounts (false amplifiers) aimed at manipulating public opinion.”
This definition is similar to, though by no means identical with, one provided in a 2009 RAND Corporation report on “Foundations of Effective Influence Operations” (which was, obviously, produced well before the 2016 election debacle).
“Influence operations are the coordinated, integrated, and synchronized application of national diplomatic, informational, military, economic, and other capabilities in peacetime, crisis, conflict, and postconflict to foster attitudes, behaviors, or decisions by foreign target audiences that further U.S. interests and objectives,” the RAND report states.
Perhaps tellingly, the same day Facebook published its “information operations” report it also released its latest update on worldwide government requests for user data. The new numbers, for the second half of 2016, show a nine percent increase in requests over the first half of the year. Yet what may be most surprising about the update (to Americans, at least, given what we’ve been hearing so much about in the media) is the requests’ national origin.
The Facebook information operations report says that its “data does not contradict the attribution provided by the U.S. Director of National Intelligence in the report dated January 6, 2017,” which would suggest a connection to Russia. The latest government request data, though, may betray Facebook’s bias.
During the latter half of 2016 Facebook received more than 26,000 data requests from the U.S. government, involving more than 41,000 accounts, and produced some amount of data in more than 83 percent of those cases. Russia, meanwhile, issued just four requests during the same period, all of which Facebook denied. It’s probably also worth noting, though, that most Russians don’t use Facebook. Instead, they use a social media network called VKontakte, or simply VK.
In an interview with Public Radio International three years ago, Mark Milian, global tech editor for Bloomberg News, noted that VK was moving toward a closer relationship with Russia’s security establishment — mirroring Facebook’s ambiguous relationship with America’s own shadowy spy world.
“It’s entirely possible that (the Russian government will), you know, want access to this treasure trove of information that’s happening on this social network in the same way that the NSA and other (agencies of other) governments put in requests to Facebook,” Milian said. “The same sorts of information exists on VK and as the Russian government comes under pressure for economic reasons, for geopolitical reasons, they’ll want to do this same sort of monitoring that many other of the superpowers are doing.”
Facebook’s impartiality is also called into question by its alleged politically motivated censorship of certain topics in the platform’s “trending” news feed section, and by its past cooperation with surveillance firms.
The new Facebook report notes that information operations can involve “content creation, false or real, either directly by the information operator or by seeding stories to journalists and other third parties, including via fake online personas.” It also notes that “in some instances dedicated, professional groups attempt to influence political opinions on social media with large numbers of sparsely populated fake accounts that are used to share and engage with content at high volumes. In other cases, the networks may involve behavior by a smaller number of carefully curated accounts that exhibit authentic characteristics with well-developed online personas.”
The report also notes that “there is some public discussion of false amplifiers being solely driven by ‘social bots,’ which suggests automation. In the case of Facebook, we have observed that most false amplification in the context of information operations is not driven by automated processes, but by coordinated people who are dedicated to operating inauthentic accounts.”
It is perhaps ironic in this context that the U.S. government is itself engaged in social media influence operations using fake personas, specifically against would-be jihadist Islamic State sympathizers. In planning these operations — which have by no means always had the effectiveness often credited to the Russian pro-Trump campaign — counter-terrorism officials have gone so far as to suggest essentially building a fake internet on which to train information operators, or even handing the campaign to counter jihadist ideology on social media over to “Artificial Intelligence Targeting Personas,” otherwise known as chatbots.
Presumably, many of the tens of thousands of U.S. data requests to Facebook are for accounts related to these Islamist-baiting operations, while others likely relate to federal investigations into various alleged aspects of a Russian campaign to influence the U.S. election.
Despite its newfound interest in the topic, Facebook maintains that disinformation and influence operations on its platform did not ultimately sway the 2016 U.S. election. Its report states that “while we acknowledge the ongoing challenge of monitoring and guarding against information operations, the reach of known operations during the US election of 2016 was statistically very small compared to overall engagement on political issues.”
Nevertheless, the social media company is taking aggressive actions ahead of the upcoming French election, deleting tens of thousands of “fake accounts,” as the company describes them. “If legitimate voices are being drowned out by fake accounts, authentic conversation becomes very difficult,” the report states. “Facebook’s current approach to addressing account integrity focuses on the authenticity of the accounts in question and their behaviors, not the content of the material created.”
But with its acknowledgement that most “false amplifier” accounts on its platform are in fact operated by real people, Facebook appears to be attempting to walk a tightrope of sorts in its definitions of “authentic conversation” and “inauthentic accounts.” Others might view the information operations report as a captivating performance, one in which Facebook’s role in Silicon Valley’s social media circus evolves into that of star semantic gymnast.
“Facebook sits at a critical juncture,” the company’s report notes. “Our mission is to give people the power to share and make the world more open and connected. Yet it is important that we acknowledge and take steps to guard against the risks that can arise in online communities like ours. The reality is that not everyone shares our vision, and some will seek to undermine it—but we are in a position to help constructively shape the emerging information ecosystem by ensuring our platform remains a safe and secure environment for authentic civic engagement.”