Facebook is conducting an internal investigation after a U.K. TV news report revealed the social network’s moderators didn’t delete abusive and racist content.
The TV report, which will air July 17th on Channel 4 in the U.K., shows an undercover investigation into CPL Resources. CPL is based in Dublin, Ireland and is one of many companies Facebook contracts to aid in content moderation.
An undercover journalist was introduced to Facebook’s community standards as part of the CPL training procedure. Community standards are Facebook’s rule book for acceptable content, designed to encourage expression and create a safe environment. Additionally, the journalist worked on reviewing content including graphic violence, child abuse and hate speech.
Failing to meet standards
Moderators at CPL have three options when reviewing content — ignore, delete or mark as disturbing. Marking something as disturbing places an age restriction on the content, but doesn’t delete it.
However, the reporter found that CPL allowed shocking content to remain on the platform. Furthermore, the journalist discovered that there were inconsistencies in the training moderators received and Facebook’s community standards.
One example the reporter found was a video of a grown man beating a small boy. The training indicated that moderators should mark the video as disturbing. However, online anti-child abuse campaigner Nicci Astin had reported the video to Facebook in 2012. At the time, the company reportedly said the video didn't violate its terms.
The video reportedly garnered 44,000 shares in its first two days on the platform and was still up when Channel 4 investigated.
Facebook responded to the report by opening an investigation into what happened at CPL. Furthermore, the company made all CPL trainers retake the training and is preparing to do the same globally. Facebook’s vice president of global policy solutions Richard Allan sat down with the Channel 4 news team for an interview.
Importantly, Allan said that the training material was wrong. “[The material] should not have been in those training decks. They were old training decks using wrong examples, wrong material, so we have gone through that.”
Additionally, trainers used inappropriate language when discussing content. Those trainers were retrained accordingly, Allan said.
Along with that, Allan said the company would be doubling the number of people working on safety and security this year to 20,000. Despite that, Allan admitted that the system isn’t perfect.
The CPL team is on the front lines. They’re the first to review content. According to Allan, CPL should escalate certain content issues to Facebook staff, who have country-specific training as well as law enforcement contacts.
The interview also touched on issues of balance. CPL and Facebook staffers have to deal with complex issues and debates on a regular basis. Often they must make judgment calls on sensitive questions, such as whether a comment in a heated debate crosses the line into hate speech.
One issue moderators contend with is intent. Did someone post content to highlight an issue and garner support to stop it? For example, some users may post a video of kids bullying a girl to draw attention to bullying and rally support against it. Someone else may post the same video to celebrate the act. Moderators must make judgment calls with content like this too.
Does Facebook profit from shocking content?
One of the more important issues discussed in the interview was whether Facebook stands to profit from content like this. Former Facebook investor and recent critic of the platform Roger McNamee believes the company does.
“It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site if you’re going to have an advertising-based business,” he told Channel 4.
Allan disagrees, saying that’s a misunderstanding of the system.
“The way in which we make money is that we place advertisements in somebody’s news feed. Just like if you watch commercial television, your experience is interrupted by an ad break. Well on Facebook your news feed is interrupted by an ad break. And that then isn’t associated with any particular kind of content.”
You can read the full transcript of the interview between Allan and Channel 4 here.