I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of its lifespan, as anyone who has looked at an older game can confirm - I'm someone who has fun making fun of weird-looking 3D people.

But I feel games’ graphics have reached the point of diminishing returns. Today’s AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn’t need anything more powerful than a 1080 Ti for years. I think game studios should slow down their graphical improvements, as they are unnecessary - in my opinion - and only prevent people with lower-end systems from enjoying games. And who knows, maybe we would start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TLDR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is no big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right picture (5 years old)?

Note 2: this is not much more than a discussion starter and is unlikely to evolve into something larger.

  • DaSaw@midwest.social · 1 year ago

    I agree with everything he said. But I’ve also been saying things like that for thirty years. I remember complaining, when Morrowind came out, about companies using extra processing for shitty 3D graphics instead of sticking with high-quality 2D that works perfectly fine and putting that extra processing power to work on better AI or something.

    I think the problem is that better graphics are the one thing they can do that will please a mass audience. Sure, there are plenty of other things they could be doing, but I would bet that each of them has a niche appeal with fewer fans to spread the cost among. Thus producers of “AAA” titles pretty much by definition have to pursue that mass audience. The question is when they reach that point of diminishing returns and it becomes more profitable to produce lower-cost niche titles for a smaller audience. And we also have to factor in that part of that “profit” is status: our society assumes that anything with niche appeal is necessarily “lower” than mass-appeal stuff.

    I think we are approaching that point, if we haven’t already reached it. Indie stuff is becoming more and more popular, and more prevalent. It’s just hard to tell, because indie stuff tends to target a smaller but more passionate audience. For example, while I am looking forward to trying Starfield, I may be too busy playing yet more Stardew Valley to buy it right away, and end up grabbing it in a sale. (I haven’t even really checked if it’ll run on my current gaming laptop.)