Instagram has announced 'new tools to help protect against sextortion and intimate image abuse'.
The social media platform is set to begin testing new features in a bid to better protect teenagers from nude image sharing, potential scams and unwanted contact from criminals.
The issues
While many of us simply use our Instagram DMs to share cute puppy videos or photos of restaurants we want to try, some people sadly have other intentions - and no, we're not just talking about f**kboys sliding in.
Scammers also utilise Instagram DMs to get in touch with people and share or ask for intimate images.
In a bid to combat this, Instagram has revealed it will 'soon' start testing a 'new nudity protection feature'.
The new features
Its announcement, posted on its website on 11 April, details how one of the new features will 'blur' any images 'detected as containing nudity' under a warning screen, so recipients can choose whether or not to view them - the image is blurred by on-device machine learning, meaning Meta itself never has access to the photo.
"We’ll also show them a message encouraging them not to feel pressure to respond, with an option to block the sender and report the chat," it adds.
Another feature will 'encourage people to think twice before' sending such photographs by showing them a message, which also reminds them the images can be unsent if they change their minds.
It adds: "Anyone who tries to forward a nude image they’ve received will see a message encouraging them to reconsider."
Instagram hopes this will not only stop people from seeing 'unwanted nudity in their DMs', but also protect users from scammers who send nude images to trick people into sending their own images in return.
Instagram continues: "[...] When sending or receiving these images, people will be directed to safety tips, developed with guidance from experts, about the potential risks involved.
"These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship to the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are."
Instagram will also link users to several support sites, such as StopNCII.org (Stop Non-Consensual Intimate Image Abuse) for adults and Take It Down for teens.
The feature is set to be 'turned on by default for teens under 18 globally', and Instagram will show a notification to those over the age of 18 so they can switch it on if they so desire.
Instagram is also 'developing technology' to help identify potential sextortion scammer accounts, to prevent such messaging from occurring in the first place. This follows on from the stricter messaging defaults it announced in January, which block anyone under the age of 16 - or 18 in some countries - from being messaged by anyone they aren't already connected to.
It adds: "Now, we won’t show the 'Message' button on a teen’s profile to potential sextortion accounts, even if they’re already connected. We’re also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in Search results."
Instagram is also sending 'more sextortion-specific signals' to Lantern - a 'program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies'.
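Lantern's interface isn't public, so the following is an entirely hypothetical sketch of what sharing a 'sextortion-specific signal' with a coalition service could look like. The endpoint URL, payload shape and field names are all invented, and hashing the identifier is just one plausible way to share a matchable signal without exposing the raw account ID:

```python
# Entirely hypothetical - the Tech Coalition has not published a public
# Lantern API; the endpoint, payload and field names are invented here.
import hashlib
import json
from urllib import request

LANTERN_ENDPOINT = "https://lantern.example/v1/signals"  # placeholder URL


def share_sextortion_signal(account_id: str, behaviour: str) -> None:
    """Send a signal about a policy-violating account to a shared
    coalition service so other platforms can check it against their
    own data."""
    payload = {
        "signal_type": "sextortion",
        # Hash the identifier so partners can match it without
        # learning the raw account ID.
        "account_hash": hashlib.sha256(account_id.encode()).hexdigest(),
        "behaviour": behaviour,  # e.g. "financial_sextortion"
    }
    req = request.Request(
        LANTERN_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    request.urlopen(req)
```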
Testing of the new features is set to begin in the coming weeks and months, a spokesperson for Meta told UNILAD.