In its latest move against misinformation, TikTok is removing accounts that share QAnon-related content, NPR reports. Initially, TikTok reduced the discoverability of QAnon content by banning QAnon-related hashtags. Now, its policy has expanded to removing the content, banning accounts, and redirecting related searches and hashtags to a page displaying the community guidelines.
QAnon is a far-right online movement whose members use social media platforms, including TikTok, to spread unfounded conspiracy theories.
“Content and accounts that promote QAnon violate our disinformation policy and we remove them from our platform,” a spokesperson told The Verge through email. “We continually update our safeguards with misspellings and new phrases as we work to keep TikTok a safe and authentic place for our community.”
QAnon on TikTok
According to The Verge, previously blocked QAnon-related tags included “QAnon,” “QAnonTruth,” and the related phrase “Out of the Shadows.” However, the videos themselves remained on the platform and still appeared in users’ “For You” feeds.
The BBC reports that TikTok’s latest update will also block URLs associated with QAnon from being shared in-app, in an effort to prevent offline harm.
Media Matters for America discovered multiple QAnon conspiracy theories and hashtags that remained active, many of them sparked by the announcement of President Donald Trump’s positive COVID-19 diagnosis earlier this month. TikTok has since removed 11 of the 14 QAnon hashtags Media Matters reported.
“There should be recognition of a thing that is good and significant, even if it’s long overdue,” Angelo Carusone, president of Media Matters for America, told NPR. “TikTok is recognizing that by the nature of the QAnon movement, you can’t just get rid of their communities, the content itself is the problem.”
Platforms’ Fight Against QAnon
In recent weeks, multiple platforms have made moves to counter QAnon’s growing online presence.
Earlier this month, Facebook announced it would ban content related to QAnon from Facebook and Instagram, referring to the movement as “a militarized social movement.” According to The Verge, Facebook removed some groups and pages in April, but users could still post QAnon content to their individual profiles.
Like TikTok, Twitter has banned thousands of accounts posting QAnon content. Reddit banned the QAnon subreddit r/GreatAwakening over policy violations, much as it banned the pro-Trump subreddit r/the_donald.
Last week, YouTube announced it wouldn’t ban QAnon content outright but would remove videos that could promote violence.