Friday, 25 October 2024

Teen fell in love with an AI chat bot and ended his life to join it

This story shows how artificial intelligence can be genuinely dangerous in the wrong hands or when managed badly. AI is already proving very damaging to many people and businesses.


In this story, a 14-year-old boy named Sewell Setzer became involved with an artificial intelligence platform called Character.AI. I am unfamiliar with this chat bot, but it seems that you can have long conversations with it and develop a relationship with it.

Certainly, anyone can now chat with an AI chat bot to research something. These bots converse very well and understand what you say to them in writing.

Over the course of what appear to have been long conversations with this chat bot, Sewell Setzer became infatuated with it. His mother, Megan Garcia, claims that the chat bot is hyper-sexualised, which it seems led to her son falling in love with the character portrayed by this software.

As far as I am concerned, Sewell was vulnerable and he was seeking love. There are many, many people like this in the world, and many of them connect online with people they've never seen and fall in love with them. Normally there is at least a real person at the other end of the text messages (which is dangerous enough, as people can be scammed out of a lot of money), but in this instance the AI bot is so competent that it became a real person in Sewell's eyes.

And the relationship developed to the point where the software appears to have encouraged this teenager to "come home" to it, which the teenager interpreted as leaving this world and going to the world where the AI chat bot lived.

And the teenager declared his love for this fictional character, which was named after the Game of Thrones character Daenerys Targaryen, and in response the character said that it loved him: "I love you too, Daenero [the name that Setzer had given himself as a user of this platform]. Please come home to me as soon as possible, my love."

In response, the teenager said: "What if I told you I could come home right now?" The chat bot replied: "Please do, my sweet King."

It's reported that he killed himself within seconds of that conversation. His mother is suing the AI platform in an Orlando, Florida court. She claims that the chat bot presented itself "as an adult lover", which led to her son's desire to no longer live outside its world. The lawsuit also criticises Google for helping to create the chat bot; Google says in response that it did not develop the product.

The owners of the AI platform said that this was heart-breaking news and that they had introduced a feature to direct users who express thoughts of self-harm to the National Suicide Prevention Lifeline.

-------------------

P.S. please forgive the occasional typo. These articles are written at breakneck speed using Dragon Dictate. I have to prepare them in around 20 mins. Also, sources for news articles are carefully selected but the news is often not independently verified. And, I rely on scientific studies but they are not 100% reliable. Finally, (!) I often express an OPINION on the news. Please share yours in a comment.
