
Anthropic Apologizes After One of Its Expert Witnesses Cited a Fake Article

Discussion in 'Article Discussion' started by Melody Bot, May 16, 2025 at 8:32 AM.

  1. Melody Bot

    Your friendly little forum bot. Staff Member

    This article has been imported from chorus.fm for discussion. All of the forum rules still apply.

    Maxwell Zeff, writing for TechCrunch:

    A lawyer representing Anthropic admitted to using an erroneous citation created by the company’s Claude AI chatbot in its ongoing legal battle with music publishers, according to a filing made in a Northern California court on Thursday.

    Claude hallucinated the citation with “an inaccurate title and inaccurate authors,” Anthropic says in the filing, first reported by Bloomberg. Anthropic’s lawyers explain that their “manual citation check” did not catch it, nor several other errors that were caused by Claude’s hallucinations.

    Anthropic apologized for the error and called it “an honest citation mistake and not a fabrication of authority.”


  2. DandonTRJ

    ~~~ヾ(^∇^ Supporter

    It's sloppy as hell, but (thankfully for Anthropic) not a merits issue. The article exists, the expert witness read it and relied on it, the lawyers just bungled the citation when preparing the declaration and nobody caught it. Doubly embarassing because it's a big firm, triply embarassing because it was caused by their own client's software, quadruply embarassing because that software is basically on trial. This is why I don't let AI anywhere near my work product.