• 12212012@z.org
    2 months ago

    AI doesn’t hallucinate. It’s a fancy marketing term for when AI confidently does something in error.

    The tech billionaires would have a harder time getting the masses of people who don’t understand the technology interested if they didn’t use words like “hallucinate.”

    It’s a data center, not a psychiatric patient.

    • GamingChairModel@lemmy.world
      2 months ago

      It’s a fancy marketing term for when AI confidently does something in error.

      How can the AI be confident?

      We anthropomorphize the behaviors of these technologies to analogize their outputs to phenomena observed in humans. In many cases, the analogy helps people decide how to respond to the technology itself, and to that class of error.

      Describing things in terms of “hallucinations” tells users that the output shouldn’t always be trusted, regardless of how “confident” the technology seems.