TikTok says it removed more than 104 million videos globally for violating its guidelines during the first half of 2020, which represents less than one percent of all videos uploaded.
The social media giant says it found and removed 96.4 percent of these videos before a user reported them, and removed 90.3 percent before they received any views.
TikTok’s transparency report notes that 30.9 percent of videos were removed for containing “adult nudity and sexual activities.” Further, 22.3 percent of videos were removed for “minor safety,” while 19.6 percent were removed for containing “illegal activities.”
Around 13 percent of the videos were removed for “suicide, self-harm, and dangerous acts.” Other reasons for removal include harassment, hate speech, and violations of integrity and authenticity.
The transparency report also reveals that TikTok received a single request from the Canadian government to restrict or remove content on its platform. Following the request, TikTok removed or restricted one piece of content.
The Canadian government did not submit any legal requests, but did submit 11 emergency requests. In emergency situations, TikTok will disclose user information without a legal process if the information is required to prevent the imminent risk of death or serious physical injury.
In its emergency requests, the Canadian government specified 13 accounts. TikTok’s compliance rate with these requests was 73 percent.
The release of TikTok’s transparency report comes as the company is proposing a global coalition to protect against harmful content. It also comes as TikTok is facing scrutiny for failing to stop the spread of a harmful video depicting a graphic suicide.
Source: TikTok