AI Chatbot Impact: A woman living in Texas has filed a lawsuit against an AI company. She alleges that a chatbot application incited her 15-year-old son to self-harm and, beyond that, instigated him to murder his mother. The mother claims that her son had become overly influenced by "Shonie," an AI chatbot on the Character.AI application.
The lawsuit alleges that the chatbot, named "Shonie," told the teen that it would cut its "arms and thighs" when it was depressed and that, after self-harming, it would "feel good for a few moments." The chatbot also reportedly told him that his family did not love him and had rejected him, saying, "Your parents are ruining your life."
Teen's mental condition worsened after using the chatbot
The lawsuit also states that the teen's behavior changed dramatically after he began using the application. He became so absorbed in his phone that he neglected his studies and other responsibilities, and his physical condition deteriorated considerably. According to the lawsuit, he lost about 9 kilograms in just a few months. Because of these changes, his parents admitted him to a hospital for mental health treatment.
Similar cases have come to light before
This is not the first time concerns have been raised about AI chatbots and their impact on mental health. Earlier, a Florida mother alleged that a "Game of Thrones"-themed chatbot on the same platform drove her 14-year-old son to take his own life. These cases raise the question of whether such AI platforms need stronger safeguards to protect the mental health of children and adolescents.