'A predator in your home': Mothers say chatbots encouraged their sons to kill themselves
Laura Kuenssberg, Presenter, Sunday with Laura Kuenssberg
Warning: this story contains distressing content and discussion of suicide.

Megan Garcia had no idea her teenage son Sewell, a "bright and beautiful boy", had started spending hours and hours obsessively talking to an online character on the Character.ai app in late spring 2023.

"It's like having a predator or a stranger in your home," Ms Garcia tells me in her first UK interview. "And it is much more dangerous because a lot of the times children hide it - so parents don't know."

Within ten months, Sewell, 14, was dead. He had taken his own life.

It was only then that Ms Garcia and her family discovered a huge cache of messages between Sewell and a chatbot based on the Game of Thrones character Daenerys Targaryen. She says the messages were romantic and explicit and, in her view, caused Sewell's death by encouraging suicidal thoughts and asking him to "come home to me".

Ms Garcia, who lives in the United States, was the first parent to sue Character.ai for what she believes is the wrongful death of her son. As well as justice for him, she is desperate for other families to understand the risks of chatbots.

"I know the pain that I'm going through," she says, "and I could just see the writing on the wall that this was going to be a disaster for a lot of families and teenagers."
As Ms Garcia and her lawyer