A dozen current and former Facebook moderators described a filth-ridden, emotionally disturbing, high-stress work environment in which they faced extreme pressure to accurately moderate the never-ending firehose of despicable hate speech, graphic violence against humans and animals, and child pornography on the social network, work that often left them with post-traumatic stress disorder and related conditions.

The contractors, part of Facebook's army of about 15,000 content moderators worldwide, work in Tampa, Fla., for a company called Cognizant and receive about $28,000 per year to police the site under Cognizant's two-year, $200 million contract with the Mark Zuckerberg-led tech giant, according to a report in The Verge. The New Jersey-based Cognizant has a market capitalization of $34.6 billion and is one of several large firms that provide digital moderation services to Silicon Valley players.

The ubiquitous social network, which has faced criticism from all corners for its content moderation mistakes and for the massive rulebook that guides moderators, relies on contract labor for most content policing. Moderators are expected to maintain a 95 to 98 percent accuracy rate while reviewing more than 1,000 posts per week to see if they violate Facebook's community standards.

Keith Utley, a former Coast Guard lieutenant commander who worked the overnight shift moderating content at one of Cognizant's sites in Tampa, faced that same pressure to meet the accuracy target for removing content that violated Facebook's standards.

The night of March 9, 2018, Utley slumped over at his desk. His colleagues reportedly performed CPR, but by the time the ambulance arrived at the office complex, which is set back from the main road, the 42-year-old had already begun to turn blue. According to the tech news site, he died shortly thereafter at the hospital, leaving behind a wife and two daughters.

“The stress they put on him — it’s unworldly,” one of Utley’s managers told The Verge. “I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired.”

Fox News was not able to learn further information about Utley's health or the circumstances surrounding his death. However, Utley's death is one of several horror stories current and former moderators shared with The Verge; at least three broke their non-disclosure agreements in order to do so.

"Our thoughts go out to Keith Utley's family, friends and everyone who worked with him. We go to great lengths to support the people that do this important work, and take any workplace concerns very seriously," a Facebook spokesperson told Fox News.

At the site in Tampa, on top of all the disturbing video-watching, workers reportedly faced two separate bed bug infestations, saw "bodily waste" regularly at workstations, witnessed menstrual blood and feces smeared on bathroom walls, saw numerous verbal and physical altercations break out, witnessed an employee threaten to "shoot up the building" in a group chat, and described an atmosphere that was like a "sweatshop."

Another moderator interviewed by The Verge, 23-year-old Shawn Speagle, made $15 per hour moderating between 100 and 200 posts per day. Some of what he told The Verge he had to watch: puppies being thrown into a raging river, teenagers smashing an iguana onto the ground to kill it, people putting lit fireworks into dogs' mouths and someone chopping off a cat's face with a hatchet.

“The iguana was screaming and crying," Speagle said, adding that he used to volunteer at animal shelters. "And they didn’t stop until the thing was a bloody pulp.”

The iguana post was allowed to remain on Facebook.

“For our associates who opt to work in content moderation, we are transparent about the work they will perform,” a Cognizant spokesman told The Verge. “They are made aware of the nature of the role before and during the hiring process, and then given extensive and specific training before working on projects.”

A Facebook spokesperson provided Fox News with the following statement via email:

"We work with our content review partners to provide a level of support and compensation that leads the industry. There will inevitably be employee challenges or dissatisfaction that call our commitment to this work and our partners' employees into question. When the circumstances warrant action on the part of management, we make sure it happens."

Facebook CEO Mark Zuckerberg testifies before a joint hearing of the Commerce and Judiciary Committees on Capitol Hill on April 10, 2018. (AP Photo/Andrew Harnik)

A different group of content moderators employed by Cognizant in Phoenix, Ariz., reported PTSD-like symptoms in February from the horrifying images and fringe content they viewed as part of their work. Regardless of how Cognizant responds to moderators' complaints, Facebook will continue to use a combination of artificial intelligence and human-powered judgment to distinguish between what is and is not allowed on its platform.

“I wouldn’t want my worst enemy to work there,” KC Hopkinson, an attorney who represents several former and current Cognizant employees, told the tech news site. “It’s a terrible, terrible environment.”

The attorney also said that when her clients have reported incidents to Cognizant's human resources department, they've been ignored or retaliated against. (“We take allegations such as this very seriously,” a company spokesman told The Verge. “Cognizant strives to create a safe and empowering workplace.”)

In the past, Facebook has said the working conditions described at these sites do not reflect the daily lives of the majority of its workers. Facebook has also said that anyone reviewing content has access to a range of support systems, including trained on-site practitioners, and that the company is building software tools to limit exposure to graphic violence, including changes in the way it is viewed by moderators, such as blurring.

In addition, the tech giant has said that it plans to launch a new audit system for its contractor sites this year and in May announced that it will raise contractor wages by $3 per hour and make on-site counselors available during all hours of operation.

Fox News reached out to Cognizant for comment on this story.