Dell is now shifting its focus this year away from being ‘all about the AI PC.’

  • ch00f@lemmy.world · 2 days ago

    WTF even is an “AI PC”? I saw an ad for some AI laptop. To my knowledge, nobody is running LLMs on their personal hardware, so do these computers have like…a web browser?

    • Ledivin@lemmy.world · 2 days ago

      “To my knowledge, nobody is running LLMs on their personal hardware”

      They absolutely are.

          • msage@programming.dev · 2 days ago

            Do 5% of people you know use local LLMs?

            If so, you don’t know a lot of people, or you are heavy into the LLM scene.

            • Ledivin@lemmy.world · 2 days ago

              Do 5% of people you know watch hockey regularly? If not, I guess it must not be a real sport, and that definitely has absolutely nothing to do with your own bubble.

              • msage@programming.dev · 2 days ago

                Surprisingly a lot of people around me watch hockey.

                And I also hear about a lot of people watching it.

                But even though I’m very much into IT, I know very few people who self-host. And even though that’s already a small community, local genAI seems even smaller than that.

    • gravitas_deficiency@sh.itjust.works · 2 days ago

      It’s two things:

      • a machine that has NN-optimized segments on the CPU, or a discrete NPU
      • microslop’s idiotic marketing and branding around trying to get everyone to use Copilot
    • Gsus4@mander.xyz · edited 2 days ago

      You can run one on your laptop. I’ve tried it before (e.g. https://www.nomic.ai/gpt4all; it’s fun to dig through your documents with it, but it wasn’t as useful as I thought it would be for collecting ideas). What’s truly hard is training one. But yeah, what is an AI PC? Is it like a gaming rig with lotsa RAM and GPU(s)?
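
      For reference, a minimal sketch of the kind of “run it on your laptop” setup I mean, assuming the gpt4all Python bindings are installed; the model filename is just an example of a small quantized model:

      ```python
      # Local-LLM sketch using the gpt4all Python bindings (pip install gpt4all).
      # The model name below is an example; gpt4all downloads it on first use if it
      # isn't already present. Everything runs locally, no cloud calls involved.
      from gpt4all import GPT4All

      model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # example quantized model (assumption)
      with model.chat_session():
          reply = model.generate("Summarize the main idea of this note: ...", max_tokens=200)
          print(reply)
      ```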

      • virku@lemmy.world · 2 days ago

        It seems my laptop at work has a neural chip, I guess a special AI-only GPU. I don’t think I could care less about a laptop feature.

      • ch00f@lemmy.world · 2 days ago

        I’m talking about an ad I saw on broadcast television during a football game. I don’t think the broader market is downloading models from huggingface or whatever.

          • ch00f@lemmy.world · 2 days ago

            The ad was people doing generic AI stuff. I think it was even showing Copilot.

            Either way, the marketing for AI is far too nebulous for it to matter. Just looking for the ad, I found plenty (like this one) that explicitly mention “on-device AI,” but show people just searching for shit or doing nebulous office work. This ad even shows generating images in MS Paint, which offloads the AI shit to the cloud.

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 2 days ago

          I know video editing software uses it for things like motion tracking.

          It’s all stuff your GPU can do, but the NPU can do it for like 1/10th to 1/100th the power.
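
          As a rough illustration of how software ends up targeting whichever accelerator is present: a minimal sketch, assuming ONNX Runtime is installed and you have some exported model.onnx; the provider names depend on your hardware and the onnxruntime build you have.

          ```python
          # Sketch: let ONNX Runtime use an NPU/GPU execution provider if one is
          # available, falling back to the CPU. "model.onnx" is a placeholder for
          # whatever exported model (e.g. a motion-tracking network) you want to run.
          import onnxruntime as ort

          preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
          available = ort.get_available_providers()
          providers = [p for p in preferred if p in available]

          session = ort.InferenceSession("model.onnx", providers=providers)
          print("Running on:", session.get_providers())
          # Actual inputs/outputs depend on the model; this only shows device selection.
          ```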

        • atomicbocks@sh.itjust.works · 2 days ago

          For what it’s worth, an NPU is why your phone could tell you that photo is of a cat years before LLMs were the hot new thing. They were originally marketed as accelerators for machine learning applications, before everybody started calling that AI.
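
          For a sense of what that workload looks like in code, here’s a minimal sketch using a small pretrained classifier from torchvision; “photo.jpg” is a placeholder, and on a phone this is the kind of model the NPU would be accelerating:

          ```python
          # Sketch: the "is this photo a cat?" style of on-device inference.
          # Uses a small pretrained classifier; "photo.jpg" is a placeholder path.
          import torch
          from PIL import Image
          from torchvision import models

          weights = models.MobileNet_V3_Small_Weights.DEFAULT
          model = models.mobilenet_v3_small(weights=weights).eval()
          preprocess = weights.transforms()

          img = preprocess(Image.open("photo.jpg")).unsqueeze(0)
          with torch.no_grad():
              probs = model(img).softmax(dim=1)

          top = probs.argmax(dim=1).item()
          print(weights.meta["categories"][top])  # e.g. "tabby" for a cat photo
          ```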

    • halcyoncmdr@lemmy.world · 2 days ago

      Running an LLM locally is entirely possible with fairly decent modern hardware. You just won’t be running the largest versions of the models. You’re going to run ones intended for local use, almost certainly quantized versions, which are usually intended to cover 90% of use cases. Most people aren’t really doing super complicated shit with these advanced models. They’re asking it the same questions they typed into Google before, just using phrasing they used 20+ years ago with Ask Jeeves.
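
      To make that concrete, a minimal sketch of what running a quantized model locally looks like, assuming llama-cpp-python is installed and you’ve already downloaded some quantized GGUF file; the path and settings are placeholders:

      ```python
      # Sketch: running a locally downloaded, quantized GGUF model with
      # llama-cpp-python (pip install llama-cpp-python). The model path and
      # parameters are placeholders; heavier quantization (e.g. Q4) trades a
      # little quality for much lower RAM use.
      from llama_cpp import Llama

      llm = Llama(model_path="./models/some-7b-model.Q4_K_M.gguf", n_ctx=2048)
      out = llm("Q: What's a quick way to reverse a list in Python?\nA:", max_tokens=128)
      print(out["choices"][0]["text"])
      ```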