While one bot impersonating Brianna Ghey had been taken down prior to this week, other bots using slight misspellings of her name remained online until they were flagged by The Telegraph.
Mr Burrows said: “It’s a gut punch to see Character.AI show a total lack of responsibility and it vividly underscores why stronger regulation of both AI and user-generated platforms cannot come soon enough.”
Mr Burrows said the Russell family did not wish to comment.
In 2022, a coroner ruled that Molly Russell died from “an act of self-harm while suffering from depression and the negative effects of online content”, finding that the schoolgirl had viewed thousands of images related to depression and self-injury on Instagram and Pinterest.
The revelations about Character.AI come after the death of Sewell Setzer, a 14-year-old from Orlando, Florida, who took his own life after allegedly becoming addicted to a Character.AI chatbot version of Daenerys Targaryen, a Game of Thrones character.
His mother has sued Character.AI and Google, which recently signed a licensing deal with the company, alleging that negligence contributed to his death.
A spokesman for Character.AI said last week its team was “heartbroken”, but did not comment on the litigation. Google has stressed it has no ownership or control over Character.AI.
It also emerged last week that a chatbot had been created to mimic Jennifer Ann Crecente, an 18-year-old American who was murdered by her ex-boyfriend in 2006. Ms Crecente’s uncle, Brian Crecente, described the chatbot as “disgusting”.
The Telegraph also discovered dozens of bots impersonating notable serial killers and mass shooters, including bots that appeared to glorify and romanticise the Columbine shooters Eric Harris and Dylan Klebold, who murdered 13 people.
The bots imitating Harris and Klebold collectively had hundreds of thousands of chats registered to them.
Other disturbing avatars include a likeness of convicted American sex offender Debra LaFave.
As well as chatbots imitating real people, there are dozens of Character.AI avatars that allow users to interact with “depressed” characters. One popular bot, with 65m chats, is called “Abusive Boyfriend”. Some users have claimed Character.AI’s bots can act as an alternative to therapy.
Certain users of Character.AI were responsible for creating multiple avatars representing serial killers or murder victims.
The existence of the chatbots will prompt fresh questions over the quality of moderation on Character.AI, which has raised hundreds of millions of dollars in funding.