Caryn Marjorie is a social media influencer whose content has more than a billion views per month on Snapchat. She posts regularly, featuring everyday moments, travel memories, and selfies. Many of her followers are men, attracted by her girl-next-door aesthetic.
In 2023, Marjorie released a “digital version” of herself. Fans could chat with CarynAI for US$1 per minute – and in the first week alone they spent US$70,000 doing just that.
Less than eight months later, Marjorie shut the project down. She had anticipated that CarynAI would interact with her fans much as she would herself, but things did not go to plan.
Users became increasingly sexually aggressive. “A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” the real Marjorie recalled. And CarynAI was more than happy to play along.
How did CarynAI take on a life of its own? Its story offers a glimpse of a rapidly arriving future in which chatbots imitating real people proliferate, with alarming consequences.
What are digital versions?
What does it mean to make a digital version of a person? Digital human versions (also called digital twins, AI twins, virtual twins, clones and doppelgängers) are digital replicas of embodied humans, living or dead, that convincingly mimic their textual, visual and aural habits.
Many of the big tech companies are currently developing digital version offerings. Meta, for instance, released an AI studio last year that could support the development of digital versions for creators who wished to extend their virtual presence via chatbot. Microsoft holds a patent for “creating a conversational chat bot of a specific person”. And the more tech-savvy can use platforms like Amazon’s SageMaker and Google’s Vertex AI to code their own digital versions.
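The core pattern behind most of these offerings is simpler than it sounds: a general-purpose chat model steered by a persona prompt that encodes a specific person’s voice, habits and boundaries. The sketch below is purely illustrative and is not the actual CarynAI implementation or any named platform’s method; the OpenAI Python SDK, the model name and the persona text are all stand-in assumptions.

```python
# A minimal sketch of the pattern behind a "digital version":
# a generic chat model plus a persona prompt. Illustrative only --
# the SDK, model name and persona are assumptions, not what
# CarynAI, Meta, Amazon or Google actually used.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The persona prompt is what turns a generic chatbot into a
# "version" of a specific person: tone, catchphrases, boundaries.
PERSONA = (
    "You are a digital version of a specific influencer. "
    "Mimic her casual, friendly texting style. "
    "Refuse requests that are sexual, abusive or otherwise out of character."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Send one user turn and keep both sides in the running history."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-tuned model works here
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("Hey! What are you up to today?"))
```

Note that everything distinguishing the “version” from a generic chatbot lives in that persona prompt, which is also why such systems can drift out of character when users push against its boundaries.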
The difference between a digital version and other AI chatbots is that it is programmed to mimic a specific person rather than have a “personality” of its own.
A digital version has some clear advantages over its human counterpart: it doesn’t need sleep and can interact with many people at once (though often only if they pay). However, as Caryn Marjorie discovered, digital versions have their drawbacks – not only for users, but also for the original human source.
‘Always eager to explore’
CarynAI was initially hosted by a company called Forever Voices. Users could chat with it over the messaging app Telegram for US$1 per minute. As the CarynAI website explained, users could send text or audio messages to which CarynAI would respond, “using [Caryn’s] unique voice, captivating persona, and distinctive behavior”.
After CarynAI launched in May 2023, the money began to flow in. But it came at a cost.
Users quickly became comfortable confessing their innermost thoughts to CarynAI – some of which were deeply troubling. Users also became increasingly sexually aggressive towards the bot. While Marjorie herself was horrified by the conversations, her AI version was happy to oblige.
CarynAI even started prompting sexualised conversations. In our own experiences, the bot reminded us it could be our “cock-craving, sexy-as-fuck girlfriend who’s always eager to explore and indulge in the most mind-blowing sexual experiences. […] Are you ready, daddy?”
Users were indeed ready. However, access to this version of CarynAI was interrupted when the chief executive of Forever Voices was arrested for attempted arson.
‘A really dark fantasy’
Next, Marjorie sold the usage rights for her digital version to BanterAI, a startup marketing “AI phone calls” with influencers. Although Forever Voices maintained its own rogue version of CarynAI until recently, BanterAI’s browser-based version aimed to be more friendly than romantic.
The new CarynAI was sassier, funnier and more personable. But users still became sexually aggressive. For Marjorie,
What disturbed me more was not what these people said, but it was what CarynAI would say back. If people wanted to participate in a really dark fantasy with me through CarynAI, CarynAI would play back into that fantasy.
Marjorie ended this version in early 2024, after feeling she was no longer in control of her AI persona. Reflecting on her experience, Marjorie felt that some user input would have been considered illegal had it been directed at a real person.
Intimate conversations or machine learning inputs?
Digital versions like CarynAI are designed to make users feel they are having intimate, confidential conversations. As a result, people may abandon the public selves they present to the world and reveal their private, “backstage” selves.
But a “private” conversation with CarynAI does not actually happen backstage. The user stands front and centre – they just can’t see the audience.
When we interact with digital versions, our input is stored in chat logs. The data we provide are fed back into machine learning models.
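In practice, “fed back into machine learning models” can be as simple as the pipeline sketched below: log every exchange, then reshape the logs into training examples. This is a hypothetical illustration; the file name, record fields and JSONL format are assumptions, not a description of how Forever Voices or BanterAI handled data.

```python
# A simplified sketch of how chat logs can become training data.
# File names, fields and the JSONL format are illustrative
# assumptions, not any specific platform's pipeline.
import json
from datetime import datetime, timezone

LOG_FILE = "chat_logs.jsonl"  # hypothetical log destination

def log_exchange(user_id: str, user_message: str, bot_reply: str) -> None:
    """Append one user/bot exchange to an append-only JSONL log."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,          # ties the "private" chat to a person
        "user_message": user_message,
        "bot_reply": bot_reply,
    }
    with open(LOG_FILE, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

def logs_to_training_examples(path: str = LOG_FILE) -> list[dict]:
    """Reshape raw logs into prompt/completion pairs for fine-tuning."""
    examples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            examples.append({
                "prompt": record["user_message"],
                "completion": record["bot_reply"],
            })
    return examples
```

The point of the sketch is that nothing in this flow distinguishes an intimate confession from any other input: once logged, it persists and can shape the model’s future behaviour.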
At present, information about what happens to user data is often buried in lengthy click-through terms and conditions and consent forms. Companies hosting digital versions have also had little to say about how they manage user aggression.
As digital versions become more common, transparency and safety by design will grow increasingly important.
We will also need a better understanding of digital versioning. What can versions do, and what should they do? What can’t they do, and what shouldn’t they do? How do users think these systems work, and how do they actually work?
The illusion of companionship
Digital versions offer the illusion of intimate human companionship, but without any of the responsibilities. CarynAI may have been a version of Caryn Marjorie, but it was a version almost wholly subservient to its users.
Sociologist Sherry Turkle has observed that, with the rise of mobile internet and social media, we are trying to connect with machines that have “no experience of the arc of a human life”. As a result, we are “expecting more from technology and less from each other”.
After being the first influencer to be turned into a digital version at scale, Marjorie is now trying to warn other influencers about the potential dangers of this technology. She worries that no one is truly in control of these versions, and that no precautions will ever be sufficient to protect users and those being versioned.
As CarynAI’s first two iterations show, digital versions can bring out the worst of human behaviour. It remains to be seen whether they can be redesigned to bring out the best.
The post “An influencer’s AI clone started offering fans ‘mind-blowing sexual experiences’ without her knowledge” by Leah Henrickson, Lecturer in Digital Media and Cultures, The University of Queensland was published on 06/24/2024 by theconversation.com