• Rhaedas@fedia.io · 42 points · 9 hours ago

    We accept how people anthropomorphize basic machines, but when they fall for more sophisticated things like LLMs we call them dumb. Blame the human brain, it’s wired to be that way.

    Also blame the corporate greed that ruined what could have been a good tool if developed responsibly.

    • rustydrd@sh.itjust.works · 9 points · edited · 3 hours ago

      I don’t have a rational reason for this, but I think anthropomorphizing machines makes more sense on an emotional level for dumb machines than for smart ones. Kind of like the brave little toaster, or Wall-E going against the space ship’s autopilot. I guess part of it might be that we see the limitations dumb machines have and it reminds us about our own flaws and limitations, which makes us empathize with a Roomba more than with Alexa or ChatGPT.

    • Tyrq@lemmy.dbzer0.com · 23 points · 8 hours ago

      The difference is that they don’t believe the vacuum is sentient, but LLMs use natural language in a form where, until now, we had always assumed the text came from a human.

        • zod000@lemmy.dbzer0.com · 6 points · 3 hours ago

          I don’t think they are sentient, but there is definitely some sort of evil involved. I’m not sure if they themselves are the source of evil or they are acting as some sort of portal that lets evil into our world.

    • usrtrv@sh.itjust.works · 6 points · 6 hours ago

      I think embodiment plays a larger role. We anthropomorphize physical things.

      The reverse also holds: people often dehumanize other people when interacting with them virtually.