• ℕ𝕖𝕞𝕠 · 11 months ago

    No, because not all my decisions are language-based. As gotchas go, this one’s particularly lazy.

    • queermunist she/her@lemmy.ml · 11 months ago

      I’m having a hard time imagining a decision that can’t be language-based.

      You come to a fork in the road and choose to go right. Obviously there was no language involved in that decision, but the decision can certainly be expressed with language, and so a large language model can make a decision.

      • ℕ𝕖𝕞𝕠 · 11 months ago

        But I don’t make all my decisions linguistically. A model that did would never act as I do.

        • queermunist she/her@lemmy.ml · 11 months ago

          It doesn’t matter how it comes to make a decision as long as the outcome is the same.

          Sorry, this is beside the point. Forget ChatGPT.

          What I meant was a set of algorithms that produces the same outputs as your own choices, even though it involves no thoughts, feelings, or experiences. Not a true intelligence, just an NPC that acts exactly like you act. Imagine this thing exists. Are you saying that this is indistinguishable from you?

          • ℕ𝕖𝕞𝕠 · 11 months ago

            “Is something that acts exactly like you act indistinguishable from you?”

            Well, yes.