• The_Decryptor@aussie.zone
    20 hours ago

    Bad management, bad luck, and the usual market stuff. They'll do anything to cut costs.

    Their R&D for new fab work is falling behind competitors (technically better doesn’t matter if nobody is buying it), they’ve had a bunch of bad CPU releases with hardware failures, and they’ve got next to no market presence in GPUs, which are currently making money hand over fist (mostly for dumb AI reasons, which is going to bite Nvidia hard when the bubble pops, because their new datacenter hardware is hyper-tuned for LLMs at the expense of general compute, unlike AMD).

    • brucethemoose@lemmy.world
      13 hours ago

      their new datacenter hardware is hyper tuned for LLMs at the expense of general compute, unlike AMD

      This is not true. The AMD MI300X/MI325X are, if anything, even more tuned for AI. They’re missing ROPs, whereas Nvidia’s datacenter GPUs (last I checked) still have them.

      …And honestly the demand for datacenter GPUs outside of AI is pretty small, anyway.

      Also, CUDA has always been, and will remain, the dominant compute API.

      I’m not trying to shill Nvidia here. Screw them. The MI cards are better hardware anyway, just with a worse and (ironically) more AI-specialized software stack that has utterly sabotaged them.

    • A_norny_mousse@feddit.org
      14 hours ago

      Thanks.

      they’ve got next to no market presence with GPUs which are currently making money hand over fist

      Oh, actual graphics cards? Yeah, Intel was never good there. Fuck both the AI and cryptocurrency hype.

      • The_Decryptor@aussie.zone
        14 hours ago

        The funny thing is that for the longest time Intel actually had the majority share of GPUs, just by counting the ones embedded in the motherboards of laptops and the like. No idea if that’s still the case, or if Nvidia or AMD has been eating into it with their newer models (e.g. what powers the Steam Deck).

        They’ve tried to break into the discrete market a few times, most recently with their Arc cards, but the way they approach things is just so odd. It’s like they assume the first attempt will be a smash hit and dominate, and when it doesn’t, they just flounder? The Arc cards launched to a lot of fanfare, and then there was just silence and delays from Intel.