Woman left horrified after AI steals her identity for x-rated pill advert

Published 15:34 17 Apr 2024 GMT+1

A woman has shown just how disturbingly accurate a deep fake can be after it used her likeness and voice for an x-rated pill advert

Gerrard Kaonga

The world just got one step closer to the bleak worlds depicted in Black Mirror as one woman detailed how an AI video stole her likeness.

Oh great, new horrors to be wary of as artificial intelligence videos get better and better.

TikToker Michel Janse explained to her followers in March that her likeness and voice had been used to promote a brand's erectile dysfunction pill.

Michel suggested AI technology had been used to create the deepfake video in order to make it convincing.

Michel found out about this after coming back from her honeymoon (Michel.c.janse/TikTok)

To make the already eerie revelation worse, Michel only found out about it after coming back from her honeymoon.

Speaking to her followers on her page, @michel.c.janse, she explained just how disgusted she was at the actions of the company.

“The thing that feels most violating about this is that they pulled footage from by far the most vulnerable video I had ever posted on my channel,” she said in her video that has been viewed over 1.2 million times.

“I was sitting in my bedroom explaining very traumatic and difficult things that I had gone through in the years prior.”

She explained the video showed her in the bedroom of her old apartment, wearing her clothes, talking about the company's pill and discussing the sexual difficulties her partner was having.

Michel reaffirmed it wasn't her voice and said that while she wasn't happy showing the actual ad, it was important to do so that people could see how much AI is improving.

And to be fair, it looks very much like her and sounds just as convincing.

After picking their jaws up off the floor, social media users advised the TikToker to consider legal action to prevent this becoming the norm in the future.

“Get a lawyer, set a precedent that establishes protections for everyone else in the future. I’m terribly sorry this happened to you,” one user wrote.

Michel reaffirmed it wasn't her voice and said while she wasn't happy showing the actual ad, it was important to do so. (Michel.c.janse/TikTok)

“This is a huge lawsuit - make some calls, find a good lawyer, you will win and it will bring attn to this issue. I am so sorry this happened to you!” another wrote.

“For your sake and for OUR sake, please please sue. People need to be holding companies accountable for this before they get even more comfortable,” a third wrote.

Others said they would be more wary about putting their likeness on the internet as deepfakes become more believable. I don’t blame them to be honest.

Featured Image Credit: TikTok/@michel.c.janse

Topics: Artificial Intelligence, Social Media, TikTok, Technology

Gerrard Kaonga

Gerrard is a Journalist at UNILAD and has dived headfirst into covering everything from breaking global stories to trending entertainment news. He has a bachelor's in English Literature from Brunel University and has written for a number of national and international publications, most notably the Financial Times, Daily Express, Evening Standard and Newsweek.
