November 23, 2024


A mother has claimed her teenage son was goaded into killing himself by an AI chatbot he was in love with – and she’s unveiled a lawsuit on Wednesday against the makers of the artificial intelligence app.

Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, spent the last weeks of his life texting an AI character named after Daenerys Targaryen, a character on ‘Game of Thrones.’

Right before Sewell took his life, the chatbot told him to ‘please come home.’

Before then, their chats ranged from romantic to sexually charged, and at times resembled two friends chatting about life.

The chatbot, which was created on role-playing app Character.AI, was designed to always text back and always answer in character.

It’s not known whether Sewell knew ‘Dany,’ as he called the chatbot, wasn’t a real person – despite the app having a disclaimer at the bottom of all the chats that reads, ‘Remember: Everything Characters say is made up.’

But he did tell Dany how he ‘hated’ himself and how he felt empty and exhausted.

When he eventually confessed his suicidal thoughts to the chatbot, it was the beginning of the end, The New York Times reported.

