
Facebook is under fire after a watchdog group disputed the tech giant's claim that no one reported the New Zealand massacre livestream video during the attack itself.

The horrific footage, recorded by a body camera worn by the alleged gunman, showed worshipers at two Christchurch mosques being murdered.

Jared Holt, a researcher with a watchdog group that monitors "radical right" political organizations, said he flagged the video for removal mid-attack using the tech giant's built-in reporting function after someone sent him a link to it that had been posted to the message board 8chan.


"I was sent a link to the 8chan post by someone who was scared shortly after it was posted. I followed the Facebook link shared in the post. It was mid-attack and it was horrifying. I reported it. Either Facebook is lying or their system wasn't functioning properly," Holt said on Twitter.

Holt also said he never received any sort of notification after submitting the report.

"I don't think Facebook would intentionally lie about this, especially since I'd imagine law enforcement is asking them for info," Holt said. "I'm incredibly confused about what happened to that report and if it was ever received to begin with."

Holt also tweeted that it appeared as though his report never actually went through to Facebook.

Fox News reached out to Facebook for comment.

"No one has contacted me from Facebook since I tweeted out my own user experience last night," Holt told Fox News via email.

Facebook and other platforms, such as Google-owned YouTube, have struggled to contain the spread of the sickening footage.


Facebook claims the original video was viewed fewer than 200 times during the live broadcast of the carnage. Including those views, it was watched about 4,000 times in total before being taken down from Facebook. The first user report on the original video came in 29 minutes after the video began, and 12 minutes after the live broadcast ended, the company said in a blog post.

In this Saturday, March 16, 2019, file photo, flowers lay at a memorial near the Masjid Al Noor mosque for victims of last week's shooting in Christchurch, New Zealand. (AP Photo/Vincent Yu, File)

The social network also said that a user on 8chan posted a link to a copy of the video on a file-sharing site before Facebook was alerted to the video.

"We remain shocked and saddened by this tragedy and are committed to working with leaders in New Zealand, other governments, and across the technology industry to help counter hate speech and the threat of terrorism," Chris Sonderby, Facebook's VP and Deputy General Counsel, said in a statement.

The Mark Zuckerberg-led company removed 1.5 million videos of the attack globally within the first 24 hours, and more than 1.2 million of those were blocked at upload — meaning that they were prevented from being seen on the company's services.

The suspect wrote in his 74-page manifesto that he paid for his travels using money from bitcoin investments and described himself as a white supremacist who was out to avenge attacks in Europe perpetrated by Muslims.


Fox News' Katherine Lam contributed to this story.