Instagram says that nothing is more important than the safety of the people in its community, and acknowledges that it has not done enough to protect vulnerable users from self-harm and suicide content.
In response, the company has announced new policies developed in partnership with global experts and academics on youth, mental health, and suicide prevention.
Here are the changes:
1. The platform will no longer allow graphic images of self-harm, such as cutting. Instagram will also not allow any material deemed to promote or encourage suicide.
2. In hashtags, the Explore tab, and search results, Instagram will not show or recommend any self-harm-related content, whether graphic or non-graphic (such as images of healed scars). However, Instagram says it is not removing non-graphic content entirely, to avoid isolating people who may be in distress when posting.
3. To support users in need, anyone posting self-harm-related content will be offered additional resources, so the platform can help them directly or point them to relevant organizations.
4. Instagram also says it is continuing to consult experts on additional ways to handle graphic and non-graphic content and protect its users. One suggestion is to blur non-graphic self-harm content behind a sensitivity screen so the images are not immediately visible.
More on finding the right balance
Instagram says that after comprehensive reviews with experts, including the Centre for Mental Health and Save.org, the consensus was that giving young people a safe space to share their experiences online, including experiences of self-harm, was the right approach. The experts also advised that sharing this type of content can be vital in offering support and even saving a life.
However, the experts also cautioned that graphic images of self-harm can unintentionally promote the act, even when the poster intended only to express their own struggles. That is why graphic images will no longer be allowed on the platform.