News

AI 'hallucinations' are causing lawyers professional embarrassment, sanctions from judges and lost cases. Why do they keep ...
The lawyers blamed AI tools, including ChatGPT, for errors such as citing non-existent quotes from other cases.
Anthropic has responded to allegations that it used an AI-fabricated source in its legal battle against music publishers, saying its Claude chatbot made an "honest citation mistake."
Hallucinations from AI in court documents are infuriating judges. Experts predict that the problem’s only going to get worse.
Anthropic’s attorney admitted to using an imagined source in the ongoing legal battle between the AI company and music publishers.
Claude hallucinated the citation with “an inaccurate title and inaccurate authors,” Anthropic says in the filing, first ...
The federal judge, Susan van Keulen, then ordered Anthropic to officially respond to these claims. This lawsuit is part of a ...
Anthropic on Thursday admitted that a faulty reference in a court paper was the result of its own AI assistant Claude and ...
The flawed citations, or "hallucinations," appeared in an April 30, 2025, declaration from Anthropic data scientist ...
The AI chatbot was used to help draft a citation in an expert report for Anthropic's copyright lawsuit.
The chatbot added wording errors to a citation ...