Google-owned YouTube will remove conspiracy videos from the recommended section of its website, following a policy change announced on January 25th.
Videos containing “borderline content” or that “misinform users in a harmful way” will no longer be suggested, as YouTube aims to stop the spread of misinformation. Notably, the policy applies even to videos that do not violate YouTube’s community guidelines.
YouTube provided some examples of the types of videos that will be affected, such as those promoting a miraculous cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events. However, there are no published guidelines for what makes a video borderline.
How YouTube plans to remove these videos from the recommended section hasn’t been fully explained either, beyond the fact that the decisions will be made by its machine-learning algorithms. To help train those algorithms, the company will have human raters assess a variety of YouTube videos.
It’s important to note that YouTube will not remove the conspiracy videos themselves, and the platform will still recommend them to subscribers who already watch this sort of content. These videos will also continue to appear in search results.
YouTube says these actions “strike a balance between maintaining a platform for free speech and living up to our responsibility to users.”
YouTube is not the only platform taking action against the spread of misinformation. Earlier this week, WhatsApp limited message forwarding to stop fake news from spreading en masse to users.
These changes will first affect videos in the United States, then roll out globally once the system has been refined.