No shit. When 1080s from 6 years ago still work fine, there’s clearly some stagnation. They need to cut prices if they want people to actually buy their shit.
Intel needs to come through with Battlemage and fuck up team red and team green
I think it helps that AAA graphics got so realistic that improvements feel incremental compared to older games. Indie games proved that much simpler, cheaper graphics are viable and often even preferred, and devs started going for stylized art over realism more often. It probably also helps that the Steam Deck is a thing now, and the Switch allows 3rd party games, so that hardware can be a target to consider too.
Anyway yeah. I’m still running a 1070, and at absolute worst I might have to reduce some graphics settings in the latest or most poorly optimized games, and we’re long past the days where moderate or even minimal graphics settings looked awful. Games are still beautiful on lower settings.
A better GPU at this point would net me better FPS in some titles, but those games make up a relatively tiny proportion of what I play, and even then I still get a perfectly playable framerate as is.
So, yeah, I'm not paying those prices for a tiny upgrade, not when I remember what prices looked like before COVID and the crypto-mining boom. I can afford to wait out their greed.
I keep explaining to people how the world actually kind of benefits from the Graphical Plateau, but so many insist to me, “You will want more pixels. Have you seen raytracing?”
The Steam Deck mostly gives an upper bound for how much hardware a game should demand for the next few years, and it’s probably lower than some developers wanted it to be.
The silliest thing about raytracing in particular is that it was originally planned as a developer convenience. So in an RTX-only future, we were all going to upgrade to much more powerful GPUs, only to run games that look about as good as what we already have.
I absolutely love raytracing… and on my 3080 it just doesn’t look good enough yet to justify turning it on for most games. Maybe devs just haven’t implemented it well yet, but the reduced framerate in most games isn’t worth it, and I’ve hated effects like screen-space reflections more or less since they came out.
I think by the time we have a 50X0 or a 60X0, raytracing will finally be fast enough to look good AND perform well. But for now it’s mostly a gimmick I turn on to appreciate, then turn back off so I can actually play the game smoothly.
They might also put more time and effort into getting it looking right once more people can run it at all. I’m not sure what percentage of PC gamers have GPUs new and powerful enough to run it, but I’d suspect it’s still small, and there’s only so much time and effort devs will want to put into something most players won’t see at all, when they could spend those resources on other aspects of the game (including other aspects of graphics) instead.
The one thing I would really like now is better audio: both better 3D positional audio (e.g. Deathloop if you turn that setting on, although the setting kept turning itself off for me, which was maddening) and more varied and complex sound effects and music. It can make a huge difference, even when people don’t consciously notice.
I didn’t know that about raytracing as a developer convenience - that’s really funny.
I do think raytracing is really cool, and when it’s available I’d rather have it than not, all else being equal. But… it seems like the kind of thing I notice and appreciate when it’s there, yet don’t miss when it’s absent; I can enjoy my games overall just the same without it.
Whatever happened with Intel’s discrete GPUs? I got whiplash trying to follow the news. At one point I thought the news was that they were discontinuing them altogether. But are they proceeding now?
Agreed, my 1080Ti is still all I need.
Honestly, pretty damn well. If they keep with it, I see good things for them.
Imo, the A770 is a lower-mid-range hero. They’ve really improved their driver support, and I think Battlemage is going to be great.