• superminerJG@lemmy.world
    1 month ago

    Hallucination refers to a specific failure mode (AI confidently BSing) rather than to all bugs as a whole.

    • Blackmist@feddit.uk
      1 month ago

      Honestly, it’s the most human you’ll ever see it act.

      It’s got upper management written all over it.

    • ALostInquirer@lemm.ee
      1 month ago

      > (AI confidently BSing)

      Isn’t it more accurate to say it’s outputting incorrect information from a poorly processed prompt/query?

      • vithigar@lemmy.ca
        1 month ago

        No, because it’s not poorly processing anything. It’s not even really a bug. It’s doing exactly what it’s supposed to do: spit out words in the “shape” of an appropriate response to whatever was just said.
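
        That “shape of an appropriate response” point can be illustrated with a toy next-word sampler. Everything below (the bigram table, the `generate` helper) is invented for illustration; real LLMs are transformers over subword tokens, but the mechanism is analogous: the sampler emits a statistically plausible continuation with no notion of whether it is true.

```python
import random

# Toy bigram "language model": it only records which word tends to follow
# which. The transition table is entirely made up for illustration.
BIGRAMS = {
    "the": ["capital"],
    "capital": ["of"],
    "of": ["France"],
    "France": ["is"],
    "is": ["Lyon", "Paris"],  # a false continuation has the same "shape" as a true one
}

def generate(start, max_tokens=6, rng=None):
    """Extend the prompt with statistically plausible next words."""
    rng = rng or random.Random()
    out = [start]
    for _ in range(max_tokens):
        choices = BIGRAMS.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)
```

        Whether the sampler lands on “Paris” or “Lyon”, it executes the exact same code path; the wrong answer is not a malfunction of the mechanism, which is the point being made above.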

        • ALostInquirer@lemm.ee
          1 month ago

          When I wrote “processing”, I meant it in the sense of getting to that “shape” of an appropriate response you describe. If I’d meant this in a conscious sense I would have written, “poorly understood prompt/query”, for what it’s worth, but I see where you were coming from.