
Mother can continue lawsuit against AI chatbot company she holds responsible for her son's death


The mother of a 14-year-old boy who she says took his own life after becoming obsessed with an artificial intelligence chatbot can continue her legal case against the company behind the technology, a judge has ruled.

"This decision is truly historic," said Meetali Jain, director of the Tech Justice Law Project, which is supporting the family's case.

"It sends a clear signal to [AI] companies [...] that they cannot evade legal consequences for the real-world harm their products cause," she said in a statement.

Warning: This article contains some details which readers may find distressing or triggering

Image: Sewell Setzer III. Pic: Tech Justice Law Project

Megan Garcia, the mother of Sewell Setzer, argues in a lawsuit filed in Florida that Character.ai targeted her son with "anthropomorphic, hypersexualised, and frighteningly realistic experiences".

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," said Ms Garcia.

Sewell shot himself with his father's pistol in February 2024, seconds after asking the chatbot: "What if I come home right now?"

The chatbot replied: "... please do, my sweet king."

In her ruling this week, US district judge Anne Conway described how Sewell became "addicted" to the app within months of using it, quitting his basketball team and becoming withdrawn.

He was particularly addicted to two chatbots based on Game of Thrones characters, Daenerys Targaryen and Rhaenyra Targaryen.

"[I]n one undated journal entry he wrote that he could not go a single day without being with the [Daenerys Targaryen character] with which he felt like he had fallen in love," the judge wrote in her decision, adding that when they were away from each other "(both he and the bot) would 'get really depressed and go crazy'".

Image: A conversation between 14-year-old Sewell Setzer and a Character.ai chatbot, as filed in the lawsuit

Ms Garcia, who is working with the Tech Justice Law Project and the Social Media Victims Law Center, alleges that Character.ai "knew" or "should have known" that its model "would be harmful to a significant number of its minor customers".

The case seeks to hold Character.ai, its founders and Google, where the founders began working on the model, responsible for Sewell's death.

Ms Garcia launched proceedings against both companies in October.

A Character.ai spokesperson said the company will continue to fight the case, and pointed to safety features on its platform intended to protect minors, including measures to prevent "conversations about self-harm".

A Google spokesperson said the company strongly disagrees with the decision. They added that Google and Character.ai are "entirely separate" and that Google "did not create, design, or manage Character.ai's app or any component part of it".

Defence lawyers had argued the case should be thrown out because chatbots deserve First Amendment protections, and that ruling otherwise could have a "chilling effect" on the AI industry.

Judge Conway rejected that claim, saying she was "not prepared" to hold that the chatbots' output constitutes speech "at this stage", although she did agree that Character.ai users had a right to receive the "speech" of the chatbots.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273.
