The Verge reports that Meta’s president of global affairs, Nick Clegg, said in a press call that the company has been over-moderating content since the COVID-19 pandemic.
Clegg says Meta plans to improve the precision and accuracy of its moderation tools because the company worries it has been over-enforcing its rules and restricting free expression on Facebook, Instagram, and Threads. This isn’t the first time Facebook has promised to improve its moderation, but that’s a task that’s easier said than done.
Meta’s social platforms are massive, and moderating them is a huge undertaking. The Verge even reported in 2019 how difficult and draining Facebook moderation was, and I expect it hasn’t gotten much better since then.
Even the comments on The Verge’s article are littered with people claiming either to have been unfairly moderated or to have been victims of content that should have been moderated but wasn’t.
Clegg went on to say that Meta had “very stringent rules” that removed a lot of content throughout the pandemic, and that in hindsight, he worries Meta overdid it. Even in the lead-up to the last U.S. election, Meta removed photos of Donald Trump after he survived an assassination attempt.
The Verge also reports that Meta hasn’t made any major changes to its content rules lately, but the Meta press call suggested that some changes might be on the way.
