• mindbleach@sh.itjust.works · 19 hours ago

    PC gaming itself will hardly change, because AMD cards work just fucking fine. They’ve only ever been a little behind at the high end, they’ve routinely been the better value for money, and they’ve offered a much cheaper low end. If they don’t have to keep chasing the incomparable advantages Nvidia pulls out of their ass, maybe they can finally get serious about heterogeneous compute.

    Or hey, maybe Nvidia ditching us would mean AMD finds the testicular fortitude to clone CUDA already, so we can end this farce of proprietary computation for your own god-damn code. Making any PC component single-vendor should’ve seen Nvidia chopped in half, long before this stupid bubble.
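    For the record, the lock-in is at the toolchain, not the silicon. A trivial kernel like the sketch below contains nothing Nvidia-specific, yet only builds with Nvidia’s compiler. (This is a hedged, textbook CUDA vector-add, not anyone’s production code.)

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    // Nothing vendor-specific here: add two arrays, one thread per element.
    __global__ void add(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        float *a, *b, *c;
        // Unified memory keeps the example short.
        cudaMallocManaged(&a, n * sizeof(float));
        cudaMallocManaged(&b, n * sizeof(float));
        cudaMallocManaged(&c, n * sizeof(float));
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        add<<<(n + 255) / 256, 256>>>(a, b, c, n);
        cudaDeviceSynchronize();

        printf("c[0] = %f\n", c[0]);  // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
    }
    ```

    AMD’s HIP already gets most of the way to a clone: run hipify-perl over that file (or just rename the cuda* calls to hip*) and it builds with hipcc for AMD hardware. The moat is the decade of libraries written against the cuda* names, not the programming model.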

    Meanwhile:

    Cloud gaming isn’t real.

    Any time after 1977, the idea that consumers would buy half a computer and phone in to a mainframe was a joke. The up-front savings were negligible, and the difference in capabilities didn’t matter. All you missed out on was your dungeon-crawlers being multiplayer, and mainframe operators kept trying to delete those programs anyway. Once home internet became commonplace, even that difference vanished.

    As desktop prices rose and video encoding sped up, people kept selling the idea that you’d buy a dumb screen and pay to play games somewhere else. You could even use your phone! Well… nowadays your phone can run Unreal 5. And a PS5 costs as much as my dirt-cheap eMachines from the AOL era, before adjusting for inflation. That console will do raytracing, except games don’t use it much, because it doesn’t actually look better than how hard we’ve cheated with rasterization. So what the fuck is a datacenter going to offer, with 50ms of lag and compression artifacts? Who expects it to be cheaper, as we all juggle five subscriptions for streaming video?
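    Numbers, since “50ms” deserves shown work. Every figure in this sketch is an assumed ballpark for encode, network, and decode, not a measurement from any particular service:

    ```cpp
    #include <cstdio>

    // Back-of-envelope cloud-gaming latency budget.
    // Every value is an assumption in milliseconds, not a benchmark.
    int main() {
        double capture_encode = 8.0;   // grab the frame + hardware H.264/AV1 encode
        double network_rtt    = 30.0;  // good home connection to a nearby datacenter
        double decode_display = 8.0;   // client-side decode + present
        double added = capture_encode + network_rtt + decode_display;

        double frame = 1000.0 / 60.0;  // ~16.7 ms per frame at 60 Hz
        printf("added latency: ~%.0f ms, about %.1f frames behind local rendering\n",
               added, added / frame);  // ~46 ms, ~2.8 frames
    }
    ```

    That budget exists before the game simulates anything, and it never goes away. A local GPU pays none of it.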