You won’t need to go far to catch a glimpse of the infamous AI selfie app that has been making the rounds on social media in recent weeks.
The app - which either has a monthly charge of $7.99 (£6.50) or an annual fee of $29.99 (£24.50) - allows people to recreate their photos using artificial intelligence and transform them into ethereal, anime-style digital pieces of art.
Dubbed Lensa AI, the app has even caught the attention of celebrities like Megan Fox and Britney Spears’ husband Sam Asghari with the impressive likenesses the artificial intelligence is able to create.
But it turns out that the selfie app reportedly has the ability to do something very worrying without users’ consent.
The ‘Magic Avatars’ feature, which has landed the app in the spotlight, may seem like harmless fun that lets users envision themselves as an elf, nymph or astronaut - but beneath the surface, the app has a worrying flaw in the kinds of pictures people can create.
Lensa has implemented a ‘no nudes’ policy, but nonetheless, those on the app are able to easily generate nude images using the AI - which is fine if someone consents to those images of themselves, but it opens up a whole new can of worms when you realise that it can also work on anyone else they have photos of.
After feeding photoshopped nude images of celebrities into the app, TechCrunch’s Haje Jan Kamps wrote that the glitch has a ‘terrifying’ outcome.
Terrifyingly, the photoshopped pictures were able to bypass any of the app’s supposed rules on nudity.
He wrote: "The ease with which you can create images of anyone you can imagine (or, at least, anyone you have a handful of photos of), is terrifying.
"Adding NSFW content into the mix, and we are careening into some pretty murky territory very quickly: your friends or some random person you met in a bar and exchanged Facebook friend status with may not have given consent to someone generating softcore porn of them."
It may be a terrible thing to note - but it’s far from the worst thing that has been reported about the app.
Journalist Olivia Snow explained that after she attempted to upload photos of herself as a child to portray herself as a ‘fairy princess’, the app turned those images into what was essentially child pornography.
"I managed to piece together the minimum 10 photos required to run the app and waited to see how it transformed me from awkward six-year-old to fairy princess," she wrote in an article on Wired. "The results were horrifying."
Lensa denied the claims, stating that any pornographic images produced are 'the result of intentional misconduct on the app'.
UNILAD has contacted a representative of Lensa for a comment.
Topics: Viral, Technology, Instagram