Facebook News

Meta Removes Coordinated Propaganda Efforts by China and Russia

Meta, the parent company of Facebook, said it took down two covert influence operations, one originating in China and one in Russia, that violated the company’s policy against coordinated inauthentic behavior (CIB). Meta shared this information with industry peers, security researchers, governments and law enforcement so they too can take appropriate action. At the end of the full report, the company is also including threat indicators to help the security community detect and counter malicious activity elsewhere on the internet. See the full CIB Report for more information.

Here’s what the company found:


Meta took down a small network that originated in China and targeted the United States, the Czech Republic and, to a lesser extent, Chinese- and French-speaking audiences around the world. It included four largely separate and short-lived efforts, each focused on a particular audience at different times between the fall of 2021 and mid-September 2022. In the United States, it targeted people on both sides of the political spectrum. In Czechia, the activity was primarily anti-government: it criticized the state’s support of Ukraine in the war with Russia and that support’s impact on the Czech economy, and used this criticism to caution against antagonizing China. Each cluster, around half a dozen accounts, posted content at low volumes during working hours in China rather than when its target audiences would typically be awake. Few people engaged with the content, and some of those who did called it out as fake. The company’s automated systems took down a number of accounts and Facebook Pages for various Community Standards violations, including impersonation and inauthenticity.

This operation ran across multiple social media platforms, including Facebook, Instagram and Twitter, as well as two Czech petition platforms. It was the first Chinese network Meta has disrupted that focused on US domestic politics ahead of the midterm elections, as well as on Czechia’s foreign policy toward China and Ukraine. Previously disrupted Chinese influence operations typically criticized the United States to international audiences rather than primarily targeting domestic audiences in the US. A network taken down in 2020 included a very limited effort to post about US politics but focused primarily on the Philippines and Southeast Asia.


Meta took down a large network that originated in Russia and primarily targeted Germany, along with France, Italy, Ukraine and the United Kingdom, with narratives focused on the war in Ukraine. The operation began in May of this year and centered on a sprawling network of over 60 websites carefully impersonating the legitimate websites of news organizations in Europe, including Spiegel, The Guardian and Bild. On those sites, the operation posted original articles that criticized Ukraine and Ukrainian refugees, supported Russia and argued that Western sanctions on Russia would backfire. It then promoted these articles, along with original memes and YouTube videos, across many internet services, including Facebook, Instagram, Telegram, Twitter, the petition websites Change.org and Avaaz, and even LiveJournal. Throughout the investigation, as Meta blocked the operation’s domains, its operators attempted to set up new websites, suggesting persistence and continuous investment in this activity across the internet. They operated primarily in English, French, German, Italian, Spanish, Russian and Ukrainian. On a few occasions, the operation’s content was amplified by the Facebook Pages of Russian embassies in Europe and Asia.

Meta began its investigation after reviewing public reporting on a portion of this activity by investigative journalists in Germany. Researchers at the Digital Forensic Research Lab also provided insights into part of this network, and the company shared its findings with them to enable further research into the broader operation.

This is the largest and most complex Russian-origin operation Meta has disrupted since the beginning of the war in Ukraine. It presented an unusual combination of sophistication and brute force. The spoofed websites and the use of many languages demanded both technical and linguistic investment; the amplification on social media, by contrast, relied primarily on crude ads and fake accounts. In fact, the majority of the accounts, Pages and ads on Meta’s platforms were detected and removed by the company’s automated systems before the investigation even began. Together, these two approaches worked as an attempted smash-and-grab against the information environment rather than a serious effort to occupy it long-term.