I know there are other plausible reasons, but I thought I’d use this juicy title.

What does everyone think? As someone who works outside of tech, I’m curious to hear the collective thoughts of the tech minds on Lemmy.

  • ∟⊔⊤∦∣≶@lemmy.nz
    1 year ago

    No, no need at all to worry about that kind of thing.

    AI (LLMs) is still just a box that spits out things when you put things in. It is a digital Galton board. That’s it.

    This is not going to take over the world.
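
    For anyone who hasn’t seen a Galton board: it’s just a fixed grid of pegs, each ball bounces left or right at random, and the bell curve in the bins at the bottom falls out of pure statistics. A minimal Python sketch of that idea (the function name and parameters here are made up purely for illustration):

        import random
        from collections import Counter

        def galton_board(balls=10_000, rows=12):
            # Each ball bounces off `rows` pegs; count how many bounces go right.
            # The bin a ball lands in is just that count, so the bins follow a
            # binomial distribution -- a bell curve with no "intent" behind it.
            return Counter(sum(random.random() < 0.5 for _ in range(rows))
                           for _ in range(balls))

        bins = galton_board()
        for slot in range(13):
            print(f"{slot:2d} | {'#' * (bins.get(slot, 0) // 100)}")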

        • Cogency@lemmy.world
          1 year ago

          And also not that different from how most people would describe their fellow earthers.

          I.e., we aren’t that much more complicated than that when it gets right down to the philosophical breakdown of what an “I” is.

    • Sekrayray@lemmy.worldOP
      1 year ago

      I mean, I don’t think AGI necessarily implies the singularity, and I doubt the singularity will ever come from LLMs. But when you look at human intelligence, one could make the argument that it is a glorified input-output system like LLMs.

      I’m not sure. There are a lot of things going on in the background of even human intelligence that we don’t understand.

      • agent_flounder@lemmy.world
        1 year ago

        Yes, except human brains can learn things without the typical manual training and tweaking you see in ML. In other words, LLMs can’t just start from an initial “blank” state and train themselves autonomously. A baby starts from an initial state and learns about objects, calibrates their eyes, proprioception, and movement, then learns to roll over, crawl, stand, walk, and grasp, learns to understand language and then speak it, etc. Of course there’s parental involvement and all that, but it’s nothing like someone training an LLM on a massive dataset.

      • xmunk@sh.itjust.works
        1 year ago

        Spin up AI Dungeon with ChatGPT and see how compelling it is once you run out of script.

        • Sekrayray@lemmy.worldOP
          1 year ago

          Really good point. I’ve actually messed around a lot with GPT as a 5e DM, and you’re right: as soon as it needs to generate unique content, it just leads you in an infinite loop that goes nowhere.

          • ∟⊔⊤∦∣≶@lemmy.nz
            1 year ago

            I’ve had some amazing fantasy conversations with LLMs running on my own GPU: family and world histories, tribal traditions, flora and fauna, etc. It’s great fun.