Woman left horrified after AI steals her identity for x-rated pill advert
Published 15:34 17 Apr 2024 GMT+1

A woman has shown just how disturbingly accurate a deepfake can be after one used her likeness and voice for an x-rated pill advert.

Gerrard Kaonga
Featured Image Credit: TikTok/@michel.c.janse

Topics: Artificial Intelligence, Social Media, TikTok, Technology

Gerrard Kaonga

Gerrard is a journalist at UNILAD who has dived headfirst into covering everything from breaking global stories to trending entertainment news. He has a bachelor's in English Literature from Brunel University and has written for a number of national and international publications, most notably the Financial Times, Daily Express, Evening Standard and Newsweek.

The world just got one step closer to the bleak futures depicted in Black Mirror, as one woman detailed how an AI video stole her likeness.

Oh great, new horrors to be wary of as artificial intelligence videos get better and better.

TikToker Michel Janse explained to her followers in March that her likeness and voice had been used to promote a brand's erectile dysfunction pill.

Michel suggested AI technology had been used to create the deepfake video in order to make it convincing.


Michel found out about this after coming back from her honeymoon (Michel.c.janse/TikTok)

To make the already eerie revelation worse, Michel made the discovery just after coming back from her honeymoon.

Speaking to her followers on her page, @michel.c.janse, she explained just how disgusted she was at the actions of the company.

“The thing that feels most violating about this is that they pulled footage from by far the most vulnerable video I had ever posted on my channel,” she said in her video that has been viewed over 1.2 million times.

“I was sitting in my bedroom explaining very traumatic and difficult things that I had gone through in the years prior.”

She explained that the video showed her in the bedroom of her old apartment, wearing her clothes, talking about the company's pill and discussing the sexual difficulties her partner was supposedly having.

Michel reaffirmed it wasn't her voice and said that, while she wasn't happy showing the actual ad, it was important to do so that people could see how much AI is improving.

And to be fair, it looks very much like her and sounds just as convincing.

After picking their jaws up off the floor, social media users advised the TikToker to consider legal action to prevent this from becoming the norm in the future.

“Get a lawyer, set a precedent that establishes protections for everyone else in the future. I’m terribly sorry this happened to you,” one user wrote.

Michel reaffirmed it wasn't her voice and said while she wasn't happy showing the actual ad, it was important to do so. (Michel.c.janse/TikTok)

"This is a huge lawsuit - make some calls, find a good lawyer, you will win and it will bring attn to this issue. I am so sorry this happened to you!" another wrote.

“For your sake and for OUR sake, please please sue. People need to be holding companies accountable for this before they get even more comfortable,” a third wrote.

Others said they would be more wary about putting their likeness on the internet as deepfakes become more believable. I don’t blame them to be honest.
