
On World Suicide Prevention Day, Facebook has taken steps to tighten its policies and expand access to resources for those in need.

The tech giant announced several changes on Tuesday as a result of consultations with experts that began earlier this year.

Facebook tightened its self-harm policy to no longer allow graphic images of cutting, in order to avoid promoting or triggering such acts. On Instagram, the company has made this type of content much harder to search for and has kept it from being recommended in Explore.

The social network has also tightened its policy to prohibit additional content that may promote eating disorders. It now displays a "sensitivity screen" over images of healed self-harm cuts to help avoid unintentionally promoting self-harm.

Facebook improved its policies around suicide prevention. (AP)

The company is also hiring a new health and well-being manager for its safety policy team; that person will focus exclusively on the health and well-being impacts of all Facebook apps. Details of the consultation meetings with experts are available on Facebook's new Suicide Prevention page.

As with any type of harmful content, the company is constantly improving its technology and machine learning systems to find and remove content or add sensitivity screens. It provided an update on how much content was removed during the second quarter of this year:

From April to June of 2019, we took action on more than 1.5 million pieces of suicide and self-injury content on Facebook and found more than 95% of it before it was reported by a user. During that same time period, we took action on more than 800 thousand pieces of this content on Instagram and found more than 77% of it before it was reported by a user.

The tech giant is also trying to facilitate connections to family and friends for those who might be at risk of suicide. Facebook is adding Orygen's #chatsafe guidelines to its Safety Center and to resources on Instagram that appear when someone searches for suicide or self-injury content.

According to the World Health Organization, someone dies by suicide every 40 seconds.

If you or someone you know may be at risk, call 1-800-273-8255 to reach the National Suicide Prevention Lifeline. It offers free and confidential support 24 hours a day, seven days a week for people in suicidal crisis or distress. You can also text HOME to 741741 to have a confidential text conversation with a trained crisis counselor from Crisis Text Line.