
Google-owned YouTube is reworking the recommendation algorithm that suggests which videos users should view next, in a new bid to stem the flow of conspiracy theories and false information on the massive video platform.

The company, which has been criticized by lawmakers and called out in studies for pushing viewers toward fraudulent content and conspiracy theories, said in a Friday blog post that it is taking a "closer look" at how to reduce the spread of content that does not quite violate YouTube's Community Guidelines but comes close to doing so.

"To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11," the company said in its blog post.


YouTube’s recommendation algorithm, a secretive formula that determines which clips are promoted in the “Up Next” column beside the video player, drives a large percentage of traffic on the video platform, where over a billion hours of footage are watched each day.

However, the company said this latest tweak will apply to less than one percent of the content on YouTube and will not impact whether a video is allowed on the site.

"This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations," YouTube said. "These evaluators are trained using public guidelines and provide critical input on the quality of a video."


A Washington Post investigation in December found that, one year after YouTube promised to curb "problematic" videos, the tech giant continued to harbor and recommend hateful, conspiratorial videos and permitted racists and anti-Semites to use the platform to spread their views. After the Parkland school shooting, one of the top trending videos on YouTube claimed that a survivor of the shooting was a "crisis actor." The company later took down the video and apologized.