Google has banned thousands of Chinese YouTube channels that it says were involved in “coordinated influence operation campaigns.” Between April and June this year, the company’s division responsible for combating government-backed attacks, the Threat Analysis Group (TAG), took down about 2,600 YouTube channels, a significant jump from the 277 channels it blocked in the first three months of 2020.
Most of these channels posted “spammy, non-political content,” Google said in a blog post, but some were actively participating in a spam network and uploading political content, primarily in Chinese.
“A subset posted political content primarily in Chinese similar to the findings in a recent Graphika report, including content related to racial justice protests in the U.S. This campaign was consistent with similar findings reported by Twitter,” Google added.
The Graphika report, titled “Return of the (Spamouflage) Dragon: Pro-Chinese Spam Network Tries Again,” describes a wide-ranging pro-Chinese propaganda effort on Twitter, Facebook, YouTube, and other social media platforms that was set in motion earlier this year.
“The network made heavy use of video footage taken from pro-Chinese government channels, together with memes and lengthy texts in both Chinese and English,” says the report. “It interspersed its political content with spam posts, typically of scenery, basketball, models, and TikTok videos.”
In addition to these Chinese spam accounts, Google has banned several channels it says are linked to influence campaigns originating in Russia and Iran.
We’ve reached out to Google for comment on whether it’s developing any preemptive protections ahead of the election, and we’ll update the story when we hear back.
Earlier this year, TAG also identified over a dozen government-backed attacker groups using COVID-19 themes as bait for phishing and malware attempts. On top of that, Google said it was seeing 240 million COVID-19-related spam messages as well as 18 million malware and phishing Gmail messages per day, and that its systems are trained to block 99.9% of them.
Over the years, social media platforms such as YouTube and Facebook have struggled to deal with foreign political interference. While they have now begun to actively crack down on these coordinated spam efforts, the problem is far from solved, especially with the US presidential election looming.