Political Influence Through Social Media Is Growing, But Slowly


Guest post by Jacob N. Shapiro, Diego Martin, and Julia Ilhardt

Last week, FBI Director Christopher Wray testified that Russia is using social media, state-run media outlets, and fake online journals to damage former Vice President Biden’s chances in the upcoming presidential election. The week before, the US Treasury sanctioned Russian and Ukrainian nationals for interfering in American politics. And earlier this month we learned that the Internet Research Agency was trying to recruit writers for fake news sites intended to influence American politics.

None of this should be surprising. Back in August, the Director of the National Counterintelligence and Security Center announced that “Russia is using a range of measures to primarily denigrate former Vice President Biden and what it sees as an anti-Russia ‘establishment.’” Two days before that, the State Department’s Global Engagement Center detailed Russia’s online disinformation infrastructure. On July 20, Democratic legislators warned of a potential campaign targeting Congress; on July 14, the social media analytics firm Graphika reported on Russia’s “Secondary Infektion” campaign, which had targeted multiple countries over the previous six years; and early in the COVID-19 crisis, EU officials found evidence that Russia was using disinformation to worsen the impact of the pandemic.

As disturbing as the current situation is, the United States is far from alone in being targeted by state-sponsored disinformation. During the 2019 pro-democracy protests in Hong Kong, fake accounts linked to the Chinese government tried to muddle online discourse. And during the Libyan National Army’s effort to capture the capital city of Tripoli in 2019, Twitter was flooded with pro-insurgency hashtags originating in Gulf countries and Egypt.

So how widespread is this problem? Which states are employing these techniques, and who are they targeting?

To find out, we spent the last two years collecting data on 76 state-backed foreign influence efforts conducted between 2011 and 2019, as well as 20 domestic operations—cases in which governments use social media manipulation against their own populations. Unlike traditional propaganda, these campaigns create content designed to appear as though it is produced by ordinary users in the target states. We released the first report on our data in July 2019, documenting 53 foreign influence efforts, and have just released an updated report. Over the past year, we’ve identified a number of significant trends in the use and organization of influence efforts. Here are the key takeaways.

Influence Efforts for Hire

There are dozens of marketing groups specializing in online content, and in recent years some have begun executing political influence efforts.

Take Archimedes Group, a marketing firm based in Israel. The firm’s specialty is “winning campaigns worldwide,” and when Facebook removed a network of accounts linked to the company in May 2019, the political targets ranged across Africa, Latin America, and Southeast Asia. In contrast to the St. Petersburg-based Internet Research Agency whose only documented customer is the government of Russia, Archimedes Group produces content to support a wide range of political goals while obscuring the involvement of state actors.

A similar political marketing firm is Smaat, a Saudi Arabian company operating out of downtown Riyadh. In addition to marketing for clients like Coca-Cola and Toyota, Smaat works for the Saudi Arabian government. In a now-removed Twitter network, Smaat’s fake social media accounts interspersed commercial content with pro-Saudi political messaging.

The use of these firms not only makes it harder to identify the actors behind social media manipulation, it also allows states to engage in political interference without having to build their own digital infrastructure.

Further obfuscating government involvement in disinformation campaigns is the trend towards hiring local content creators. Networks linked to Russian oligarch Yevgeny Prigozhin—who was indicted during the Mueller investigation—paid people in Madagascar and Mozambique to manage election-related Facebook pages. Such tactics make it challenging to distinguish foreign interference from genuine local discourse.

Common Content for Local Audiences

In 2019, residents of 13 countries across Central Asia, the Caucasus, the Baltics, and Eastern Europe who followed Facebook pages for Latvian travel or the President of Tajikistan may have unknowingly consumed content from government-linked Russian media outlets. By distributing stories from sources like Sputnik and TOK on pages that omitted or obscured the link to Russia, this campaign spread narratives sympathetic to the Kremlin’s foreign policy initiatives.

This Russian network was part of a wider move toward efforts that push common content to specific locations. In the first version of our report, only two out of 53 foreign influence efforts targeted multiple countries at once; nine of the 23 campaigns added in our 2020 report did so. Like Russia, Saudi Arabia and the United Arab Emirates have mounted multi-year efforts to promote sweeping, nationalistic content adapted to resemble domestic discourse in multiple countries. And growing evidence suggests that since 2017 China has been running a broad social media campaign targeting the Chinese diaspora in multiple countries.

Cases involving widespread distribution of common content are, in some sense, an updated form of propaganda. Disinformation or biased stories take on an air of local authenticity, while the attacking country avoids the effort of producing distinct content for each target, as a country-specific campaign would require.

State-Backed Disinformation on the Domestic Front

Even in democracies, governments sometimes employ media consultancies to justify their policies and damage opposition politicians. In Mexico, for example, branches of the government have paid fake media outlets to amplify stories in favor of the Institutional Revolutionary Party (PRI). During Enrique Peña Nieto’s presidency from 2012 to 2018, pro-PRI Twitter accounts were so common that they came to be dubbed “Peñabots.”

While foreign influence efforts have been dominated by six countries, particularly Russia and Iran, we found 20 domestic influence efforts spread across 18 countries going back to 2011. In almost all cases, domestic interference sought to suppress political dissent. Some countries, such as Vietnam, were overt about this goal: a cyber military unit called Task Force 47, operating under the Vietnam People’s Army, worked to discredit opposition narratives. Others operated covertly, as when government officials in Malta directed social media trolling and hate speech through secret Facebook groups.

Our inclusion criteria required influence campaigns to be directly connected to a government or ruling party. Although political parties in some democracies engage in social media manipulation, those parties do not necessarily represent the state. In India, for instance, both Prime Minister Narendra Modi’s Bharatiya Janata Party and the Indian National Congress have long made use of influence operations. Similarly, disinformation originating with firms like Cambridge Analytica does not constitute an influence effort in our study unless it is explicitly linked to a government.

What’s Next?

Online influence efforts are becoming an increasingly widespread tool for both domestic politics and foreign interference. The commercialization of these campaigns could make them easier to access and, in some cases, harder to identify. But the problem of state-backed influence efforts is not yet pervasive.

In fact, we found two positive trends in our report. First, only Russia initiated new influence efforts in 2019; second, it launched just three new efforts that year, down from eight in 2018. Given how many actors now have the capacity to execute influence efforts, and how many countries would like to shape US politics, this is promising: it suggests something is holding countries back from using fake online activity to interfere in their rivals’ politics.

But a global norm against this behavior needs to be reinforced. At the moment, there is little international collaboration on monitoring social media platforms and no multilateral push to create strong prohibitions on cross-border influence campaigns. And with the US presidential election less than two months away, the threat of foreign interference is being brought squarely to the fore.

Jacob N. Shapiro is a professor of politics and international affairs at Princeton University, Diego Martin is a PhD student in economics at Purdue University, and Julia Ilhardt is a senior in the School of Public and International Affairs at Princeton University.
