Facebook announced changes to its policies regarding groups and pages as a way to stem the tide of misinformation and fake news that some critics have called a public health crisis.

Beginning this week, all page managers will see a new tab that explains why Facebook removes certain content and shows when posts have seen their distribution reduced as a result of being flagged by third-party fact-checkers.

The tech giant is also changing its recidivism policy to make it harder for people who have had pages removed for violating community standards to use duplicate pages to post the same bad content.

"We're taking new steps in how we handle Page content that goes against our policies," the company said in a blog post. The announcement comes as COO Sheryl Sandberg has acknowledged that Facebook still has a long way to go in fixing its problems.

MARK ZUCKERBERG'S MENTOR: FACEBOOK IS 'PUBLIC HEALTH' HAZARD AND SHOULD BE BROKEN UP

The new page quality tab will feature two main sections — one area showing content that was recently removed for violating community standards and another for content rated "False," "Mixture" or "False Headline" by third-party fact-checkers.

The social network plans to expand the feature, but to start it will show any content that has been removed for reasons such as hate speech, graphic violence, nudity or sexual activity, harassment and bullying, and regulated goods.

The aim, according to Facebook, is to empower users with the information they need to police pages, understand the platform's community standards and be able to let Facebook know if something has been taken down by mistake.

For the second change, involving its recidivism policy, Facebook is adapting to the increasingly clever ways that purveyors of fake news harness the platform.

POP STAR SLAMS FACEBOOK AND GOOGLE, SAYS THEIR BUSINESS MODELS ARE 'SCARY'

"We've long prohibited people from creating new Pages, groups, events, or accounts that look similar to those we've previously removed for violating our Community Standards," the company said in its blog post. "However, we've seen people working to get around our enforcement by using existing pages that they already manage for the same purpose as the page we removed."

In order to address this, the company may now remove other pages or groups that look similar or have the same administrators, even if the page or group has not met the threshold to be unpublished on its own.