Instagram has launched new detection technology to recognise suicide and self-harm content on its app in the UK and Europe.
The new tools can identify posts that break Instagram’s guidelines on harmful content, and the app will either make such posts less visible or take them down automatically. Adam Mosseri, head of Instagram, stated that the new system uses artificial intelligence.
This system is already in place outside Europe. Posts flagged by the system are automatically referred to human moderators, who decide whether or not a post should be taken down. The service can also direct users affected by harmful posts to emergency services.
Instagram told the UK Press Association that it will not be implementing human referral in the UK and Europe for now due to data privacy concerns, but that this will be its next step.
Image by Slen Feyissa on Unsplash
News credit: BBC