Meta Exposes Massive Covert Operation Targeting Global Social Media

Meta recently removed nearly 8,000 Facebook accounts in a sweeping effort to dismantle a major Chinese disinformation campaign. According to the company's report, the accounts were part of what it called the “largest known cross-platform covert influence operation” in the world.

The takedown covered 7,704 Facebook accounts and 954 Facebook pages, along with 15 Instagram accounts, all removed for violating Meta’s policy against coordinated inauthentic behavior. The operation spanned more than 50 social media platforms and forums, including YouTube, Pinterest, Reddit, and X (formerly Twitter).

The campaign primarily circulated positive content about China and its Xinjiang region while criticizing the United States, Western foreign policy, and critics of the Chinese government, including journalists and researchers. It also promoted false narratives, such as the claim that COVID-19 originated in the United States.

The operation relied on spam, links, memes, and text posts, and spent at least $3,500 on Facebook ads. Its reach was global, targeting audiences in the UK, Taiwan, the United States, Australia, and Japan.

Meta said: “Although the people behind this activity tried to conceal their identities and coordination, our investigation found links to individuals associated with Chinese law enforcement.”

The company began its investigation after reports that the campaign had targeted a human rights NGO in late 2022.

Interestingly, Meta found connections between this operation and a previous one called Spamouflage, which had been active since August 2019. Additionally, it identified similarities between Spamouflage’s tactics and those of a pro-Russian disinformation campaign known as Secondary Infektion.

Meta added: “While the reasons behind these parallels are unclear, it is possible that CIB [coordinated inauthentic behaviour] operators learn from one another, including as a result of public reporting about covert influence operations by our industry and security researchers.”

Meta’s proactive cleanup underscores the ongoing battle against disinformation campaigns across social media platforms.

