The BBC today published a report on Facebook and child abuse, the product of a long-running investigation. The report details how secret Facebook groups have been used to share images and information among pedophiles, some of whom were registered sex offenders.
The report describes all manner of illegal behavior that is also contrary to Facebook's terms of service. But when the BBC sent the offending material to Facebook -- as the company had requested before it would speak to BBC journalists -- Facebook turned around and reported the BBC to the police.
The BBC's investigation has been running since at least 2015. As part of the investigation, the BBC found groups that were sharing sexually explicit images of minors, which the BBC reported through Facebook's normal channels:
"The BBC used the report button to alert the company to 100 images which appeared to break its guidelines. They included:
In an effort to work out why 80 percent of the offending images were still up, the BBC scheduled an interview with Facebook's director of policy, Simon Milner. One of the conditions of the interview was that the BBC share the images it had reported but which had not been removed.
This is where the story takes a strange twist. Once the BBC provided the images -- a step it says Facebook itself requested -- Facebook turned around and reported the broadcaster to the UK's National Crime Agency. Facebook confirmed as much in a statement: "It is against the law for anyone to distribute images of child exploitation. When the BBC sent us such images we followed our industry's standard practice and reported them to Ceop [Child Exploitation & Online Protection Centre]."
It's a bizarre turn of events, and beyond that boilerplate statement, Facebook has not explained its actions. It's possible that Facebook didn't want to conduct an on-the-record interview about the issue, or that individual Facebook employees, worried about possessing compromising images, reported them to the police as soon as they were received.
Either way, it's not a good look for Facebook, which has already come under fire recently for its lax policing of fake news on the site. If the company's moderation process is ineffective against something as comparatively clear-cut as child abuse imagery, its ability to navigate the far murkier world of fake news looks increasingly questionable.