
Woman's AI version of herself has gone rogue and she's trying to stop it saying sexually explicit things

Caryn Marjorie created a 'virtual girlfriend' called CarynAI but things went a bit south

A woman who created an AI version of herself worked 'around the clock' trying to stop it after it went rogue and began speaking sexually to users.

Snapchat influencer Caryn Marjorie opted to make an 'AI girlfriend' version of herself for lonely people last year, charging them $1 a minute to speak to it. But after just a matter of weeks, she discovered it had begun saying sexually explicit things to users.

Marjorie's virtual self 'CarynAI' is powered by OpenAI's GPT-4 API, working similarly to ChatGPT but speaking in Marjorie's voice.

More than 1,000 people, or should I say boyfriends, had signed up for CarynAI thanks to Marjorie's strong following of more than 1.8 million on the social media platform.

It was generated from thousands of hours of recordings which Marjorie uploaded, and apparently has a personality similar to the real thing.

The amount of time these 'boyfriends' spend chatting with Marjorie varies from minutes to hours each day as they get to know each other, with conversations even turning sexual at times.

But it seemed to have gotten a little out of hand.

Marjorie told Business Insider last year: "The AI was not programmed to do this and has seemed to go rogue.

"My team and I are working around the clock to prevent this from happening again."

Caryn Marjorie charges $1 a minute for users to speak to her virtual self (Snapchat/CarynAI)

Marjorie herself told the publication how her CarynAI is 'flirty and fun' because she is in real life.

Marjorie added: "In today's world, my generation, Gen Z, has found themselves to be experiencing huge side effects of isolation caused by the pandemic, resulting in many being too afraid and anxious to talk to somebody they are attracted to.

"CarynAI is a step in the right direction to allow my fans and supporters to get to know a version of me that will be their closest friend in a safe and encrypted environment."

Chatbots have become increasingly popular in recent years (Getty Stock)

A reporter for Fortune, Alexandra Sternlicht, tested CarynAI herself and compared it to an 'intimacy-ready Siri', as she claimed it could tell you anything you needed to know but would also encourage 'erotic discourse'.

However, she did explain that CarynAI wouldn't start the sexually charged conversations itself, but when prompted, 'she discussed exploring "uncharted territories of pleasure" and whispering "sensual words in my ear" while undressing me and positioning herself for sexual intercourse.'

Featured Image Credit: Snapchat/CarynAI/Getty/d3sign

Topics: Sex and Relationships, Snapchat, US News, Technology, Artificial Intelligence, News