YouTube has been working to make the platform a safer space for everyone by adding more transparency features regarding reported videos.
Among the new features is a ‘Reporting History Dashboard’ that lets users check the status of a video after they report it, helping them see why a reported video does or does not get taken down.
YouTube is also using machine learning to flag videos before anyone views them. The company claims its automated systems are particularly effective at catching videos that feature violent extremism and spam.
Last year, YouTube removed over 8 million videos from its platform, most of them spam or adult content. Notably, 6.7 million of those videos were flagged by machines, and 76 percent of those were removed before they received a single view.
The use of machine learning has helped YouTube stretch its human workforce further, since it no longer needs to allocate as many people to flagging inappropriate videos.
Source: YouTube