• MentalEdge · 9 months ago

    Lower frame rates can be perfectly fine, I find I’m far more bothered by inconsistent frametimes.

    The main reason 40fps feels fine on the deck is that the display can come down to that same Hz and operate in lockstep.

    I’ll take consistent 60 over hitchy 165 most of the time, though VRR means you can occupy kind of a middle ground. But even there frametime inconsistencies can make for a shit experience.
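The point about averages hiding hitches can be made concrete with a toy sketch (my own illustration, plain Python, with made-up frametime numbers): two captures with nearly the same average framerate, where the hitchy one nonetheless contains a frame that lands at an effective ~14 fps.

```python
# Two hypothetical frametime captures, in milliseconds.
steady = [10.0] * 100                 # perfectly even ~100 fps pacing
hitchy = [9.0] * 98 + [30.0, 70.0]    # mostly fast, but two big spikes

def avg_fps(frametimes_ms):
    # Overall average framerate across the whole capture.
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def worst_frame_fps(frametimes_ms):
    # Instantaneous rate of the single slowest frame --
    # much closer to what a hitch actually feels like.
    return 1000.0 / max(frametimes_ms)

print(avg_fps(steady), worst_frame_fps(steady))   # 100.0, 100.0
print(avg_fps(hitchy), worst_frame_fps(hitchy))   # ~102 average, but ~14 at the spike
```

Both captures would report a similar "fps" number in an overlay, yet only one of them interferes with your aim.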

    My point is that game developers should aim to deliver games that render at similar framerates throughout.

    So many of these recent games do hit decent framerates, but then there’s that one in-game location, enemy type, player ability, or particle effect, that just makes the framerate completely shit itself.

    It’s like these studios are designing each element with a given GPU budget, pushing things right up to the limit, and then doing a surprised Pikachu face when things run like shit once they put more than one of these elements together to make an actual game.

    • Rai@lemmy.dbzer0.com · 9 months ago

      165 that dips to 100 is unquestionably better than 60 with no dips, especially with GSync.

      165 that dips below 60 is very bad.

      • flames5123@lemmy.world · 9 months ago

        Yea I was about to say. I play games that stay around 120 on my hardware and dip to maybe 80 sometimes. It’s not that noticeable, especially during action and if the dips aren’t super sudden drops. But 45-60 is noticeable.

      • MentalEdge · 9 months ago

        That depends. VRR works beautifully when you walk through in-game locations and the framerate smoothly shifts up and down.

        What it’s absolutely shit at dealing with is VFX that cause frametimes to spike by an order of magnitude for just a frame or two. That’s common in games with a lot of player-ability and enemy-attack effects going off.

        In these cases I will actually just turn VRR off and play at a lower framerate that is consistently achievable.

        VRR is nice, and I absolutely do use it most of the time, but by its very nature it makes the latency in the hand-eye-coordination loop inconsistent. When it’s working smoothly, with the framerate shifting gradually, that’s fine.

        But the kind of hitching I’m talking about isn’t the kind where the overall framerate shifts. It’s the kind where just a couple of frames take orders of magnitude longer to render, and that interferes with my hand-eye coordination. I would have been in better shape to pull off a turn, shot, movement, or whatever, had the game been running at that framerate the whole time.

    • narc0tic_bird@lemm.ee · 9 months ago

      My point is that game developers should aim to deliver games that render at similar framerates throughout.

      Scenes in most games vary a lot in complexity, so the way you’d achieve that is by getting your baseline framerate quite a bit higher than your target, and then capping FPS at that target. This way the game won’t utilize near 100% of the GPU most of the time, but peaks in scene complexity won’t cause FPS to drop below the cap.
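That capping strategy is essentially a fixed frame deadline with sleep in the slack time. A minimal sketch (my own, not from any particular engine; `run_frame` is a hypothetical stand-in for the real update-and-render work):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_frame():
    # Stand-in for simulating and rendering one frame.
    pass

def game_loop(num_frames):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        run_frame()
        next_deadline += FRAME_BUDGET
        # Sleep away whatever budget is left. On hardware whose baseline
        # is well above the cap, this idle headroom is what absorbs
        # complexity peaks without the framerate dipping below the cap.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
```

The trade-off is that you deliberately leave GPU performance on the table most of the time in exchange for consistency, which is exactly the console approach described above.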

      This is how it works, or at least used to work, for a lot of games on console. On PC, you almost always have to make the choice yourself (which is a good thing if you ask me).

      For many games with a lot of changing scenery I have to target around 45 FPS with graphics settings to even have a chance of achieving somewhat consistent 30 FPS/33.33ms on the Deck.

      On the one hand the Deck is heavily underpowered compared to even lower-end PCs. On the other hand tests show that the Z1 Extreme/7840U isn’t much faster at these lower wattages (10-15 watts TDP), so there hasn’t been a lot of progress yet.

      But it’s also that many games don’t scale so well anymore. I feel like half the settings in many modern games don’t affect performance to any noticeable degree, and even fewer affect CPU usage. And if there are low settings, the game often looks unrecognizable, because the lower-setting models, textures, and lighting/shadows are simply generated by the engine SDK and rarely given a second thought.

      • MentalEdge · 9 months ago

        Tech like Nanite does have the potential to solve some of that variability. But even before that, LODs, detail render distance limits, etc. already allow framerates to be leveled out, if utilized.
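The classic LOD approach mentioned here boils down to picking a cheaper model the further away an object is, which puts a ceiling on per-scene cost. A toy sketch (the distance thresholds and level names are made up for illustration):

```python
# Hypothetical LOD table: (minimum distance in meters, detail level).
LOD_LEVELS = [(0, "high"), (25, "medium"), (75, "low"), (150, "billboard")]

def pick_lod(distance):
    # Walk the table and keep the last level whose threshold we passed.
    choice = LOD_LEVELS[0][1]
    for min_dist, level in LOD_LEVELS:
        if distance >= min_dist:
            choice = level
    return choice

print(pick_lod(10))   # high
print(pick_lod(100))  # low
print(pick_lod(300))  # billboard
```

The point is that the worst-case cost of a distant crowd or settlement is bounded by construction, rather than hoping the hardware keeps up with full-detail assets.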

        And I would consider 30 and 45 within that “similar” range. I’m not asking the framerate to stay within even 10% of an average at all times. But games are getting a lot worse than that.

        A recent game even my desktop has been struggling with is Forbidden West, which I tuned the settings on to achieve 80-100 fps, yet in some locations (larger settlements) it will chug down to 20-30.

        Some newer games aren’t losing just 33% of their framerate between best case and worst case, but more like 70%. At that point you end up having to target 200fps just to never drop below 60, and that’s tricky even on high-end desktops.
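The arithmetic behind that 200fps figure is simple to make explicit (a quick sketch of my own):

```python
def required_target_fps(floor_fps, worst_case_drop):
    # If the worst case loses `worst_case_drop` (a fraction, e.g. 0.70)
    # of the framerate, the sustained target must be floor / (1 - drop)
    # to keep the worst case at or above the floor.
    return floor_fps / (1.0 - worst_case_drop)

print(required_target_fps(60, 0.33))  # ~90 fps target needed
print(required_target_fps(60, 0.70))  # ~200 fps target needed
```

A 33% worst-case drop only asks for a ~90fps target to hold 60, which is reasonable; a 70% drop pushes the target to 200, which is not.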