Instagram has introduced new measures to keep teen users safer, including tighter content restrictions around self-harm, prompts to switch to more private settings, and other protections.
To begin with, Instagram says it is taking further steps to limit teens' exposure to self-harm-related content, citing the negative impact such material can have on young users.
Instagram says:
Take the example of someone posting about their ongoing struggles with thoughts of self-harm. That is an important story, and one that can help reduce the stigma around the issue, but it is not an easy topic for some young people. We'll start to remove this sort of content from teens' experiences on Instagram and Facebook, along with other sorts of age-inappropriate content.
"We're also limiting teens from seeing some related recommendations of self-harm content on Reels and Explore," the statement reads. Meta is now expanding that restriction to Feed and Stories as well, even when such content has been published by an account the teen follows.
The expanded restrictions are based on advice from child and adolescent health experts, who have raised concerns about the rate of exposure in the app and its impact on vulnerable users.
This new process will provide more protection while still allowing Meta to surface links to relevant support services where appropriate.
On that front, the company will also route search terms related to suicide and eating disorders to expert help resources, rather than displaying matching results, and will hide certain related search terms entirely.
In other updates, Meta will begin auto-enrolling all teenagers on both Facebook and Instagram into the most restrictive content settings by default.
New teen users are already placed into the most restrictive settings when they sign up, but Meta will now extend the same settings to all teens active across its platforms.
"Our content recommendation controls - known as 'Sensitive Content Control' on Instagram and 'Reduce' on Facebook - make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore."
Finally, the company is sending all teen users new notifications prompting them to update their settings to a more private experience. Meta continues to build out safety resources aimed at protecting teens ahead of its planned metaverse future, where young users will have access to even more immersive experiences.
Meta is therefore keen to work with experienced partners to develop these systems in advance. Doing so would help ensure that adequate protections are in place, and that Meta can demonstrate its credibility on this responsibility to regulators.