Facebook Updates AI to Recognize Suicidal Posts Before They Are Flagged
Facebook announced today that it is upgrading its algorithm to recognize suicidal posts without their being reported or flagged. The change is meant to identify users who may be depressed or suicidal and offer them the help they need before a tragedy occurs.
This update to Facebook’s algorithm is being rolled out worldwide, except in some European countries due to some restrictions. When the algorithm catches a concerning post, whether through keywords in the post itself or worried comments underneath it, the post is sent to a Facebook employee for review. If the reviewer finds that the post is indeed a cause for concern, they will reach out to the user directly or alert a close friend or relative.
This is probably a big reason Facebook recently hired 3,000 employees for review purposes alone. Facebook CEO Mark Zuckerberg said in a post, “If we’re going to build a safe community, we need to respond quickly.”
In the post announcing this new update, Zuckerberg said:
“Starting today we’re upgrading our AI tools to identify when someone is expressing thoughts about suicide on Facebook so we can help get them the support they need quickly. In the last month alone, these AI tools have helped us connect with first responders quickly more than 100 times.”
For now the system is fairly basic, recognizing suicidal behavior partly through comments in which people inquire about the user’s well-being, such as “Are you okay?” or “What happened?” The post did not mention the other deciding factors, but comments are just one of the ways a post can be flagged for review.
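To make the idea concrete, here is a minimal sketch of how comment-based screening like this could work in principle. This is purely illustrative: Facebook’s actual model is proprietary, and the phrases, threshold, and function names below are assumptions, not the real signals.

```python
# Hypothetical illustration of keyword-based comment screening.
# The phrase list and threshold are invented for this example;
# Facebook's real system is far more sophisticated and not public.

CONCERN_PHRASES = ["are you okay", "what happened", "is everything alright"]

def needs_review(comments, threshold=2):
    """Flag a post for human review if enough of its comments
    contain well-being check phrases."""
    hits = sum(
        1 for c in comments
        if any(phrase in c.lower() for phrase in CONCERN_PHRASES)
    )
    return hits >= threshold

comments = ["Are you okay??", "What happened last night?", "nice pic"]
print(needs_review(comments))  # prints True: two concerned comments
```

In a real pipeline, a flag like this would only route the post to a human reviewer, mirroring the review step the article describes, rather than triggering any automatic action.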
Zuckerberg also mentioned that he is aware this new Facebook AI update could have harmful implications in the future, but people should keep in mind that it is already helping prevent many users from taking their own lives. Suicide remains one of the leading causes of death worldwide.