Facebook's vice president of content policy, Monika Bickert, pushed back Wednesday against whistleblower Frances Haugen, saying she is "not an expert" and did not work on child safety issues, after Haugen testified on Capitol Hill that the platform is more concerned with making money than protecting its users.

Bickert said on "America's Newsroom" that Haugen "mischaracterized" the information she revealed from the platform.

"I can say she didn't work on these issues and her testifying about them and mischaracterizing some of the documents she stole is like if a journalist were to read another journalist's story, a colleague's story, and say oh, I'm an expert on this," Bickert said.

 "She's not an expert in these areas."

Haugen, who worked in project management on civic misinformation at Facebook, testified before lawmakers on Tuesday, alleging the company prioritizes making a profit over implementing safeguards to protect its users, something she said could even be considered a "national security issue."

Haugen leaked documents to the Wall Street Journal last month indicating Facebook knew its internal research showed Instagram could be harmful to some young users, particularly teen girls.

Bickert pushed back on the report, arguing the platform does prioritize safety and works with individuals who have expertise in the matter, including teachers, counselors, and legal experts.  

"These are the sorts of experts that we bring to Facebook because they care so deeply about these issues and on the research that has been mischaracterized," said Bickert. 

Bickert went on to claim that Instagram could even benefit the mental health of some young teens who use it.

"One is the research largely showed that teens have a positive experience when they are dealing with mental health issues, as many teens are," Bickert said. "Instagram is a place that helps them find support."

During her testimony, Haugen also suggested Facebook's algorithm is flawed and plays a primary role in exposing teens and other users to harmful content, since it rewards posts that garner polarizing responses. Bickert pushed back on those claims as well, emphasizing the company's push for transparency.

"We published in our Transparency Center a set of content distribution guidelines that explained, for instance, the types of content that we demote. We demote engagement bait, clickbait, sensationalist content," Bickert said. 

"And much like Mark said in his memo, that is designed to give people a better experience. It is in our business interest to make this a place that people will want to return to for years."

Haugen urged lawmakers to consider reforming Section 230 of the Communications Decency Act, an effort that has historically drawn bipartisan support, in order to hold Facebook accountable for third-party content.