The judge is not ready to rule on whether AI outputs are speech
Google and Character Technologies also moved to dismiss the lawsuit on First Amendment grounds, arguing that C.AI users have a right to receive chatbot outputs as protected "speech."
Conway agreed that Character Technologies can assert the First Amendment rights of its users in this case, but she wrote that the court is not prepared to hold that the chatbot's outputs are speech.
C.AI argued that chatbot outputs should be protected like dialogue from video game characters, but Conway found that argument unpersuasive. Garcia's team had pushed back, noting that video game characters' dialogue is written by humans, while a chatbot's outputs are simply the result of an LLM predicting which word should come next.

"Defendants fail to articulate why words strung together by an LLM are speech," Conway wrote.
As the case proceeds, Character Technologies will have an opportunity to strengthen its First Amendment claims, perhaps by better explaining how chatbot outputs resemble other cases involving non-human speakers.
A C.AI spokesperson provided a statement to Ars suggesting that Conway's order appears confused.
"It's long been true that the law takes time to adapt to new technology, and AI is no different," C.AI's spokesperson said. "In today's order, the court made clear that it was not ready to rule on all of our arguments at this stage, and we look forward to continuing to defend the merits of the case."
C.AI also noted that it now provides a "separate version" of its LLM "for under-18 users," along with "parental insights, filtered characters, time spent notifications, updated prominent disclaimers, and more."
"Additionally, we have a number of technical protections aimed at detecting and preventing conversations about self-harm on the platform; in certain cases, that includes surfacing a specific pop-up directing users to a suicide and crisis lifeline," C.AI's spokesperson said.
If you or someone you know is feeling suicidal or in distress, please call the Suicide Prevention Lifeline at 1-800-273-TALK (8255), which will put you in touch with a local crisis center.