Facebook parent company Meta said it took down a network of fake accounts, originating in China, that was attempting to interfere in the U.S. midterms. The network posted about controversial issues like abortion and gun rights, with content aimed at appealing to both Democrats and Republicans. Experts say the network’s existence speaks to larger concerns about political disinformation on social media, in the U.S. as well as worldwide.
In a report published Tuesday, Meta detailed how it disrupted the network, which was the first known China-based operation targeting users in the United States with political content ahead of the Nov. 8 midterm elections. This operation was unique, it said in the report, because “Chinese influence operations that we’ve disrupted before typically focused on criticizing the United States to international audiences, rather than primarily targeting domestic audiences in the US.”
But, overall, countries using social media to interfere in each other’s elections is now familiar terrain, policy experts say. In the U.S. these worries were first triggered by reports of Russian meddling in the 2016 presidential election. While domestically produced and disseminated political disinformation has taken center stage in the U.S. in recent years, it’s important to remember that this is an issue of international scope, says Philip Napoli, a professor in Duke University’s Sanford School of Public Policy.
“It’s a strategy that nations deploy against each other, and we’re not really seeing any variation from election to election in terms of how concerned we need to be about this kind of activity,” he says. “It’s a constant.”
The network targeted people in the U.S. on both sides of the political spectrum by setting up fake accounts posing as Americans and attacking politicians from both parties. The report did not say whether the network was tied to the Chinese government or was merely based in China.
The disrupted influence campaign was not particularly effective, Meta’s report noted, in part because the accounts posted during working hours in China, when U.S. users are likely to be less active. Still, the network’s existence, no matter its impact, raises the alarm about social media’s readiness for policing interference. “It’s an indicator that none of the defense mechanisms these platforms have put in place seem to discourage this kind of activity,” Napoli says.
Meta did not immediately respond to TIME’s requests for comment.
The influence of fake accounts
Meta said the China-based network operated from March to August 2022, setting up fake accounts across multiple social media platforms, including Facebook, Instagram, and Twitter. But the network was small (only 81 Facebook accounts, eight Pages, one Group, and two Instagram accounts) and did not attract much of an audience. The company said its automated systems took down a number of related accounts and Facebook Pages for violating community standards during this time.
Reports of this nature could heighten fears over Chinese influence operations impacting elections in the U.S., says Ho-Fung Hung, a professor of political economy in Johns Hopkins University’s department of sociology.
“China is still at a fairly low level [of activity] compared to Russia in terms of interfering in discussion about U.S. issues,” he says. “At this time, I wouldn’t be too worried about direct Chinese intervention in U.S. domestic politics.”
Hung also says it’s significant that Meta’s report doesn’t indicate that Chinese intelligence agencies were behind this network. “It’s not clear whether it’s related to the Chinese government or non-governmental people with their own initiative,” he says.
However, the discovery of this operation does call into question whether others of its kind are out there, says Joshua Tucker, co-director of New York University’s Center for Social Media and Politics.
“The question we don’t know the answer to is when Meta announces that it’s taken down a network, is that because Meta is really good at this and somebody tried one thing and got caught, or is it because there are 100 of these networks and this is just the one they caught?” he says. “Absent our own access to the data, it’s hard really to infer how successful Meta is at taking down these types of attacks.”
Tucker says that as long as these social platforms exist, they’re likely going to be vulnerable to these kinds of campaigns. “One of the things we know about these kinds of operations, especially these little ones, is they’re easy for people to pull off,” he says.
What we know about future elections
Meta said in Tuesday’s report that it also took down a much larger Russian network that primarily targeted Germany, France, Italy, Ukraine, and the United Kingdom. The operation was centered on a network of over 60 websites impersonating legitimate news sites to push pro-Russia content and criticize Ukraine and Western sanctions. Meta said it was “the largest and most complex Russian-origin operation” that it had disrupted since the beginning of the war in Ukraine.
This is not the first time social media platforms have been used by foreign powers to exploit political divisions. The Cambridge Analytica scandal, in which a political consulting firm used the data of tens of millions of Facebook users in its effort to elect Donald Trump president in 2016, became part of the larger investigation into possible collusion between the Trump campaign and Russia. Moreover, earlier this month, the New York Times reported on how organizations linked to the Russian government carried out a social media propaganda campaign aimed at inflaming tensions surrounding the 2017 Women’s March.
Meta, TikTok, and Twitter have all revised and updated their plans for addressing possible election misinformation, regardless of the source, ahead of the U.S. midterms. However, many technology experts say these plans fall short and need to be further strengthened.
Napoli says the issue is widespread and will continue to be so. “Nations are still expanding the scope of their operations rather than scaling them back,” he says. “They seem to be continually emboldened.”
Write to Megan McCluskey at [email protected].