
YouTube’s powerful recommendation algorithm may be “optimizing for outrageous, salacious and often fraudulent content” or easily manipulated by “bad actors, including foreign intelligence entities,” a top-ranking Democrat on the Senate’s intelligence committee said.

Virginia Sen. Mark Warner made the statement after a Guardian investigation reported that the Google-owned video platform was consistently promoting divisive and conspiratorial videos damaging to Hillary Clinton’s campaign before the 2016 election.

“Companies like YouTube have immense power and influence in shaping the media and content that users see,” Warner told the Guardian. “I’ve been increasingly concerned that the recommendation engine algorithms behind platforms like YouTube are, at best, intrinsically flawed in optimizing for outrageous, salacious, and often fraudulent content.”

He added: “At worst, they can be highly susceptible to gaming and manipulation by bad actors, including foreign intelligence entities.”

YouTube’s recommendation algorithm, a secretive formula that determines which clips are promoted in the “Up Next” column beside the video player, drives a large share of traffic on the platform, where more than a billion hours of footage are watched each day.

However, critics have warned that the recommendation algorithm has developed alarming biases, pushing viewers toward content that depicts violence against children, hateful rhetoric and conspiracy theories.

YouTube disagrees with the findings of a recent investigation into its secretive algorithm. (Reuters)

Until now, though, the algorithm’s role in the U.S. presidential election has gone mostly unexplored.

The Guardian’s research, based on a previously unseen database of 8,000 videos recommended by the algorithm in the months leading up to the election, suggested the algorithm was six times more likely to recommend videos damaging to Clinton than to Trump, while also amplifying wild conspiracy theories about the former secretary of state.

The videos in the database shared with the Guardian were collectively watched more than 3 billion times before the election. Many have since disappeared from the platform, prompting some experts to question whether the algorithm was manipulated or gamed by Russia.

One of the most recommended channels in the database of videos was that of Alex Jones, the far-right conspiracy theorist.

Guillaume Chaslot, a French computer programmer and former Google employee who worked on the YouTube recommendation algorithm, has used a program he designed to explore bias in the YouTube content promoted during the French, British and German elections, as well as in videos about global warming and mass shootings.

His findings are available on his website, Algotransparency.com.

YouTube, however, has challenged the British publication’s research, saying that it “strongly disagreed” with the findings.

“It appears as if the Guardian is attempting to shoehorn research, data and their conclusions into a common narrative about the role of technology in last year’s election,” a YouTube spokesperson told the outlet. “The reality of how our systems work, however, simply doesn’t support this premise.”

Former U.S. Secretary of State Hillary Clinton speaks during an interview with Mariella Frostrup at the Cheltenham Literature Festival in Cheltenham, Britain, Oct. 15, 2017. (REUTERS/Rebecca Naden)

In his statement, Warner added: “The [tech] platform companies have enormous influence over the news we see and the shape of our political discourse, and they have an obligation to act responsibly in wielding that power.”

Warner’s latest warning is noteworthy given that Google largely played down the extent of Russian involvement on its video platform during Senate testimony late last year. The committee’s investigation into Russian interference in the U.S. presidential election is ongoing but has focused mostly on Facebook and Twitter.

The 8,000 YouTube-recommended videos were also analyzed by Graphika, a commercial analytics firm that has been tracking political disinformation campaigns. It concluded that many of the YouTube videos appeared to have been pushed by networks of Twitter sock puppets and bots controlled by pro-Trump digital consultants, with “a presumably unsolicited assist” from Russia.

YouTube told Fox News that it has sophisticated systems to detect and prevent manipulation of its platform, including its recommendation algorithms. “Less than 5 percent of all YouTube views come from external sources like social media platforms, search engines, and embeds,” a YouTube spokesperson explained via email. “Put simply, views from social networks do not have a significant impact on overall viewership. As we’ve reported to investigators, we’ve seen no evidence of manipulation by foreign actors.”