TikTok removed almost 350,000 videos for spreading election misinformation
TikTok is offering a new look at the amount of misinformation on its platform.
The app removed hundreds of thousands of videos between July and December of last year for violating its guidelines on misinformation about the 2020 presidential election and the COVID-19 pandemic.
Details of the takedowns were disclosed in the company's latest transparency report.
Unsurprisingly, election misinformation was the most prevalent.
According to the report, the company removed 347,225 videos for spreading election misinformation or manipulated media.
An additional 441,000 clips were removed for containing "unsubstantiated" claims (like Facebook, TikTok partners with third-party fact-checking organizations; the company also flags "unsubstantiated" videos for users).
Over the same period, TikTok took down 51,505 videos for spreading COVID-19 misinformation.
In its report, TikTok says 87% of these clips were removed within 24 hours, and 71% had zero views at the time they were taken down.
The latest figures come after TikTok tightened its misinformation policies ahead of the election.
Ahead of the 2020 election, the company adopted new rules banning deepfakes and expanded its work with fact-checking organizations to vet false claims.
It also introduced in-app notices to point users toward reliable information.
TikTok says those PSAs have been viewed over 73 billion times.
In its report, TikTok says it was well prepared for the election, and that much of the misinformation came from domestic sources in the US.
"We have prepared for 65 different scenarios such as premature victory declarations, or controversial results, which have enabled us to respond appropriately and promptly to emerging content," writes TikTok.
"We've also been planning for more domestic activity based on patterns in the production and dissemination of false material online.
Indeed, during the 2020 election, we found that a substantial portion of misinformation was being pushed by domestic users – real people."
The company also notes that misinformation and disinformation make up only a fraction of TikTok's overall content.
According to the report, the app took down more than 89 million videos in total for violating its rules.
As in its previous report, "minor safety" (36% of removals) and adult nudity (20.5%) were among the largest categories of removals.
"Integrity and authenticity" violations, which include misinformation as well as bots and fake accounts, accounted for 2.4% of TikTok's takedowns.
Even with a comparatively small volume of misinformation, however, TikTok has had trouble suppressing viral falsehoods.
According to Media Matters, viral videos spreading debunked conspiracy theories about election fraud racked up hundreds of thousands of views before they were removed.