Facebook will now inform users who have interacted with misinformation or harmful content related to COVID-19 on its platform.
The messages will appear in users’ News Feed if they have liked, reacted to or commented on harmful information that Facebook has since removed. The warning message will also point users to a list of COVID-19 myths that the WHO has debunked.
“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” the social media giant wrote in a blog post.
The platform says that ever since COVID-19 was declared a global public health emergency, it has directed over two billion people to resources from the WHO and other health authorities through its ‘COVID-19 Information Center’ and pop-ups.
Further, in March alone, Facebook displayed warnings on around 40 million posts on the platform, based on around 4,000 articles from its independent fact-checking partners. Facebook says that when people saw those warning labels, they chose not to click through to the content 95 percent of the time.
Facebook states that it has removed hundreds of thousands of pieces of misinformation that could cause harm. For instance, it has removed several posts claiming that drinking bleach kills the virus.
Source: Facebook