Instagram is adding new security features to combat sextortion scams on the platform, including additional informational notes for teenagers about where intimate sharing online can lead.

First off, Instagram is rolling out a new process that will automatically blur DM images that its systems detect as likely containing nudity. For users under 18, detected nudes will be blurred by default, which not only shields them from exposure, but also prompts them to think twice before replying and sharing nude images of their own.

Which would seem a no-brainer: if you don't want your nudes to be seen by others, don't share them on IG. Or better yet, don't take them at all. But for younger generations, nudes are, for better or worse, a part of how they communicate. Yeah, I'm old, and it makes no sense to me either. But given that this kind of sharing is now accepted, even expected in some circles, it makes sense for IG to add more warnings of its own to help protect younger users, especially, from such exposure.

It'll also help in sextortion cases, as the blurring stops people from being confronted with unwanted nudity in their DMs, and protects them from scammers who may send nude images in order to trick recipients into sending their own in return.

In addition, Instagram says that it's building new technology to identify accounts that may be engaging in sextortion scams, "based on a range of signals that could indicate sextortion behavior." Where circumstances call for it, Instagram will take action against these accounts, including reporting them to the National Center for Missing and Exploited Children (NCMEC). Instagram will also flag users who try to share nude images in the app.

It's also testing pop-up messages for any account that has interacted with an account removed for sextortion, while the company is expanding its partnership with Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies.

These upgrades will supplement a robust set of child-protection tools already in place on Instagram, including its recently rolled-out processes to limit exposure to content relating to self-harm. Of course, teens can take themselves out of some of these measures, but Instagram can't be held responsible for every element of protection and safety in this respect. More recently, Instagram has also introduced its "Family Center" oversight options, and in combination, these provide a range of tools to help keep younger users safe in the app.