Facebook has announced new artificial intelligence (AI) initiatives that aim to offer support to users who may be expressing suicidal thoughts.
Facebook says it’s starting to roll out AI technology outside the U.S. to help identify any concerning behaviour from its users. Pattern recognition technology will aim to recognize signals such as the text used in the post and comments, with Facebook specifically citing comments like “Are you ok?” and “Can I help?”
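The kind of text signal the article describes could be sketched as a simple phrase matcher. This is purely illustrative: Facebook's production system uses trained pattern-recognition models, not a fixed keyword list, and the phrases below are just the two examples the company cited.

```python
import re

# Hypothetical sketch: comment phrases the article cites as concern signals.
# A real system would use a trained classifier, not hand-written patterns.
CONCERN_PHRASES = [
    r"\bare you ok\b",
    r"\bcan i help\b",
]

def comment_signals(comments):
    """Return the number of comments matching a concern phrase."""
    hits = 0
    for comment in comments:
        text = comment.lower()
        if any(re.search(p, text) for p in CONCERN_PHRASES):
            hits += 1
    return hits

comments = ["Are you OK??", "Nice photo!", "Can I help with anything?"]
print(comment_signals(comments))  # → 2
```

In practice a count like this would be one feature among many (post text, posting history, and so on) feeding a model that decides whether a post warrants human review.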
Facebook says that these reports — which are flagged as requiring immediate attention — are escalated to local authorities twice as quickly.
AI will also be used to help prioritize which reported posts, videos and live streams should be reviewed first.
Facebook says it will also be more considerate of context when looking over reported content. Specifically, Facebook pointed to situations where its reviewers can quickly identify points within a video that receive increased levels of comments, reactions and reports from other users and examine those parts more thoroughly.
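The idea of surfacing the busiest moments of a video for reviewers can be sketched as a simple time-bucketing exercise. The window size and the heavier weight on reports are illustrative assumptions, not details of Facebook's actual review tooling.

```python
# Hypothetical sketch: flag the time windows of a video that draw the most
# combined comments, reactions and reports, so a reviewer can jump to them.
# Window size and weights are assumptions for illustration only.

def busiest_windows(events, window_secs=30, top_n=2):
    """events: list of (timestamp_secs, kind) where kind is
    'comment', 'reaction' or 'report'. Returns the top_n window
    start times ranked by activity, with reports weighted heaviest."""
    weights = {"comment": 1, "reaction": 1, "report": 3}
    buckets = {}
    for ts, kind in events:
        start = (ts // window_secs) * window_secs
        buckets[start] = buckets.get(start, 0) + weights[kind]
    ranked = sorted(buckets, key=lambda s: buckets[s], reverse=True)
    return ranked[:top_n]

events = [(5, "comment"), (12, "reaction"), (65, "report"),
          (70, "comment"), (72, "report"), (200, "reaction")]
print(busiest_windows(events))  # → [60, 0]
```

A reviewer could then seek directly to the 60-second mark rather than watching the entire stream, which is the efficiency gain the article describes.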
According to Facebook, these new efforts are part of its ongoing suicide prevention initiative, which has seen teams working 24/7 around the world to review and prioritize the most urgent reports.
In addition, the company already gives users the option to reach out to a friend (with accompanying suggested text templates), along with recommendations of specific help lines to contact and other tips and resources.
To help with all of this, Facebook says it has worked with various mental health organizations for over 10 years, including Save.org, National Suicide Prevention Lifeline and Forefront Suicide Prevention. Suicide prevention resources are available to all Facebook users globally.