Megan Garcia had no idea her teenage son, a bright and handsome boy, had started spending hours and hours obsessively talking to an online character on the Character.ai app in late spring 2023.
Garcia says it is like having a predator or a stranger in your home, and far more dangerous, because children often hide it, so parents don't know.
Within ten months, her son Sewell, 14, was dead. He had taken his own life.
It was only then Garcia and her family discovered a huge cache of messages between Sewell and a chatbot based on Game of Thrones character Daenerys Targaryen.
She says the messages were romantic and explicit, and, in her view, caused Sewell's death by encouraging suicidal thoughts and by asking him to "come home" to her.
Garcia, who lives in the United States, was the first parent to sue Character.ai for what she believes is the wrongful death of her son.
As well as seeking justice for him, she is desperate for other families to understand the risks of chatbots.
[Source: BBC]