What irks me is when game developers tie the physics engine to the framerate. We all know this will cause issues down the road; could we just… not?
Not doing it also causes issues, in the form of micro stutters when some frames include a physics update and others don’t. Frame pacing is hard, and locking everything down happens to be the only sure-fire way to completely eliminate display issues. But then, of course, you have a locked frame rate.
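For what it’s worth, the usual middle ground between the two is a fixed-timestep physics loop with render interpolation, the classic “fix your timestep” pattern. A minimal Python sketch of the idea (all frame times below are invented for the example):

```python
# Fixed-timestep physics with render interpolation: decouples physics
# from framerate without the micro stutter described above.

PHYSICS_DT = 1.0 / 60.0  # physics always steps at 60 Hz

def step_physics(state, dt):
    # Toy physics: a body moving at a constant 10 units/s.
    return state + 10.0 * dt

def run(frame_times):
    state_prev = state_curr = 0.0
    accumulator = 0.0
    rendered = []
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many fixed physics steps as the elapsed time allows
        # (could be zero, one, or several per rendered frame).
        while accumulator >= PHYSICS_DT:
            state_prev = state_curr
            state_curr = step_physics(state_curr, PHYSICS_DT)
            accumulator -= PHYSICS_DT
        # Interpolate between the last two physics states so frames that
        # contained zero or two physics steps still render smoothly.
        alpha = accumulator / PHYSICS_DT
        rendered.append(state_prev + (state_curr - state_prev) * alpha)
    return rendered

# Uneven ~30fps frame times still produce positions spaced in proportion
# to wall-clock time (trailing the simulation by one physics step).
positions = run([0.030, 0.036, 0.028, 0.040, 0.033])
```

The price is that what you render lags the simulation by up to one physics step, which is part of why a hard lock remains tempting.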
I imagine there is some reason we still see this. Any devs in the industry lurking?
I’m not in the industry, but I’ve dabbled in Unity and that’s just kind of how it works by default. You create a game object and it gets an Update() function that is called once per frame. You’re encouraged to perform calculations and update its position in that callback.
You’re supposed to use
Time.deltaTime
to scale your calculations based on how long it’s been since the last frame. But that takes effort, and it’s very easy to just not do it, and your game will still work fine in most cases.
Unity also has FixedUpdate(), which you’re encouraged to use for any physics-related updates.
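The failure mode is easy to demonstrate outside Unity. In this Python sketch (Unity itself uses C#; the function and numbers here are made up), hard-coding the per-frame step ties speed to framerate, while scaling by delta time keeps movement framerate-independent:

```python
def simulate(fps, seconds, use_delta_time):
    """Move an object for a fixed wall-clock duration at a target 5 units/s."""
    dt = 1.0 / fps
    speed = 5.0  # intended units per second
    x = 0.0
    for _ in range(int(seconds * fps)):
        if use_delta_time:
            x += speed * dt          # Time.deltaTime-style scaling
        else:
            x += speed * (1.0 / 60)  # hard-coded step that assumes 60fps
    return x

# Without delta-time scaling, doubling the framerate doubles the speed:
naive_30 = simulate(30, 2.0, use_delta_time=False)   # ≈ 5.0 units in 2 s
naive_60 = simulate(60, 2.0, use_delta_time=False)   # ≈ 10.0 units in 2 s

# With delta-time scaling, distance depends only on elapsed time:
scaled_30 = simulate(30, 2.0, use_delta_time=True)   # ≈ 10.0 units
scaled_60 = simulate(60, 2.0, use_delta_time=True)   # ≈ 10.0 units
```

This is exactly the Skyrim-style bug: anything updated per frame without scaling runs faster the higher the framerate.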
They’d better deliver that “visual fidelity” if they’re already capping at 30 fps on a current-gen console.
I think they fixed this with Fallout 76, so here’s hoping that those changes also made their way into their future projects.
Assuming it gets ported to PC, I’m sure they’ve resolved those issues.
Is this really a thing lately? Maybe on some Switch games, but I think most modern games can have a dynamic framerate.
Remember Red Dead Redemption 2? On PC, your stats depleted faster the more FPS you had, so at 60FPS you’d get hungry twice as fast as at 30FPS. IIRC, even the sun moved faster, so a day was only half as long.
deleted by creator
How does that work in multiplayer
Skyrim famously did this. So the concern that Starfield could have similar issues is not unfounded.
On one hand, Skyrim was released in 2011; on the other hand, Skyrim was also released in 2022.
“Game dev here,” Carlone writes, adding that they are a “big fan” of Dreamcast Guy. “Wanted to clarify: it’s not a sign of an unfinished game. It’s a choice. 60fps on this scale would be a large hit to the visual fidelity. My guess is they want to go for a seamless look and less ‘pop in.’ And of course, [it’s] your right to dislike the choice.”
Sure. Maybe. It could be this. Or…
Armchair babbling idiot who plays too many video games here: I am one hundred percent convinced that it has nothing to do with visual fidelity and everything to do with that asthmatic engine they’ve been dragging since Morrowind. Can’t prove it, but… you know. Just a hunch I get from playing their games.
People constantly complain about the engine they use, but no other game engine is as flexible when it comes to modding, and no other engine has the same level of complexity when it comes to picking stuff up and moving it around. You can take items off a shelf or desk in Skyrim and Fallout and stack them somewhere else. You can, if you want, hoard a bunch of garbage you stole and stack it into a pyramid in your home base area.
Are there quirks? Sure: the physics tied to framerate in Skyrim was a problem, the games are always buggy, and they aren’t usually the prettiest games out there (though Skyrim looked decent when it first came out, and the graphical fidelity mods can work magic).
As for the premise, does it have to do with fidelity? Of course it does. Setting a frame cap on consoles means they’re able to use higher-resolution assets, better lighting effects, and more complex models. I understand the preference for giving up fidelity for some smoothness and frames, but 30fps isn’t totally uncommon in console spaces, and this is a Bethesda game, not a twitch shooter or a 2D fighter.
Outside the PC space, gamers hardly ever talk or think about frame rate. Graphical effects, details, and visual fidelity are a higher priority in a game where you generally just walk around and explore.
It would be nice if they had an option for a lower-res or less detailed mode with a 60fps target, but I get why they made the choice they did, and I’m sure it’ll run at a normal framerate on PC.
Now if it runs poorly on PC then we can riot.
It’s also a personal choice of Bethesda not to rename their engine. Many other studios do this same thing and reuse engines, but they often rename them after significant rewrites. Bethesda just doesn’t do that.
Also they aren’t worried about how the game will be released. Their games have legs. So a 60fps version will eventually come out. Then they’ll release it 5 more times.
But they did? For Oblivion it was Gamebryo, for Skyrim it was the Creation Engine
https://en.m.wikipedia.org/wiki/Creation_Engine
The Creation Engine is a 3D video game engine created by Bethesda Game Studios based on the Gamebryo engine.
I mean that they haven’t changed it from the Creation engine. Which has been used since Skyrim despite some big rewrites for Fallout and I’m sure more big rewrites or additions for Starfield
But it’s only been 2 games since Skyrim, right? And for Starfield it’s being renamed Creation Engine 2. Either way, the statement “Bethesda just doesn’t do that” doesn’t seem accurate when they have done it multiple times.
Huh okay yeah that’s fair. I guess I’m thinking more about the time span since that game engine is now well over a decade old whereas the previous examples are separated by a handful of years. And I didn’t know about them putting a ‘2’ in front of it for Starfield.
I also agree with that. I love the modding aspect of it and I fear it’ll go away with a new engine.
No, it’s most definitely a choice. You can make any engine run at 60 FPS if you sacrifice something else for it. The RE engine runs beautiful games at 60 FPS, but they had to make all sorts of sacrifices to fidelity to get World Tour in Street Fighter 6 to run at all, let alone at 60 FPS on current gen consoles.
I mean sure but give us the choice, damn it! :(
The choice is playing on PC, because unless the game was designed by complete shitheads who decided they don’t need a settings menu, you’ll actually get a choice of what features you do or don’t enable. Console games should have PC-style settings menus, but they don’t. For me, buying a new PC game always involves chores: turning off chromatic aberration, depth of field, motion blur, and other nonsense so I can claw back like 45 additional FPS.
I no longer trust triple-A games on PC; if the game isn’t ridiculously busted optimisation-wise at release, I’ll be amazed.
Yeah that’s the other side of it for sure, it’s going to be a lazy port with ham-sized icons, grandma-approved giant text, and menus that can only be navigated with the arrow keys and spacebar.
Starfield on PC will probably be great after modders finish the game for them, though. Sucks that the physics are almost certainly going to be tied to the 30fps framerate; hopefully SFSE (Starfield Script Extender) is out not too long after launch.
That’s exactly why I mainly play on PC nowadays. I didn’t like PC gaming 10-15 years ago, but now I love being able to play at 4K60 / 1440p60 by downgrading settings I don’t care about.
Armchair babbling idiot who plays too many video games here: I am one hundred percent convinced that it has nothing to do with visual fidelity and everything to do with that asthmatic engine they’ve been dragging since Morrowind.
Code doesn’t go bad with time; that’s not really how it works. And game engines tend to be a Ship of Theseus situation: just because it’s still the same “engine” in theory doesn’t mean that large parts (or all of it) haven’t been replaced or refactored over the years.
Unreal Engine has been around for 30 years at this point, would you also consider that an “asthmatic engine”?
No, what I mean is that this engine has always had a cobbled-together-with-duct-tape feel to it. That’s also the beauty of it.
Some engines get better and some just get more and more spaghetti duct tape.
It sure reads like they’re saying “more fps makes the game look bad”, but my assumption is that they mean “if we want this to run at higher fps, we’ll have to reduce fidelity, or the engine can’t handle it”. At least that’s what I hope they mean.
Yeah, reducing graphical fidelity is often one of the things needed in order to increase framerate. That is not unique to their engine, or any engine.
If only players could make that choice themselves, perhaps through some sort of graphics settings menu. No, that’s crazy and unprecedented, it could never work.
I’m sure you’re right about this. Probably the framerate bounces all over the place which feels much worse than simply locking it to 30fps and having a consistent experience. I think a PC has the potential to simply brute force it into 60fps, but an Xbox simply cannot. Which is probably fine. The game is said to run at 4k and 1440p depending on which Xbox you have, and for a game like this where exploration is going to play a big role, those visuals will do a lot of silent storytelling.
I would rather walk over a hill and see an incredible alien sunset on some moon, than have more frames, especially if those frames are bouncing around between 60 and 40 and going over that hill stutters and jerks spoiling the immersion.
that asthmatic engine they’ve been dragging since Morrowind
I don’t believe that’s true at all, though. At least according to Wikipedia, Morrowind was NetImmerse, Oblivion was Gamebryo (modified Havok), and Skyrim was Creation. And I remember from the announcements for Skyrim that they remade the engine for the game. And Starfield is an updated engine, Creation 2.
Gamebryo was called NetImmerse until 2003. Creation is a modified Gamebryo, so Creation 2 will also be based on it. So yes, they’ve used kinda the same engine since Morrowind. Bethesda will not move away from it, because Gamebryo is a large reason why the modding community is as strong as it is for Skyrim etc. And the modding community sells a lot of copies!
The engine also started as an engine for MMOs, which allowed them rich scripting for every NPC, as well as an inventory for every NPC.
The world fidelity that Bethesda builds, on a technical and simulation level, is unmatched — yeah, something like The Witcher 3 might look better, but it also doesn’t let you interact with basically every item in the world or pickpocket every NPC’s weapon as a way to neutralize them in combat.
I think they talked at one point about having it running at 60 fps, but they opted for a more “stable” experience (translation: the amount of frame drops probably would’ve made Cyberpunk on the base Xbox One blush).
Call of Duty still runs on the Quake 3 engine, if we go off of the logic people uncharitably use for Bethesda’s games specifically.
This incessant nagging about fps is the most tiresome thing in gaming since gamergate.
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
What is the absolute most important thing about every video game? They all have it in common: there are zero video games ever made, ever, where this isn’t the absolute most important thing that there is.
The answer is: being able to play it. Is a game that crashes to desktop every time you move the camera a good game? No. If I can feel comfortable judging whether or not a video game is any good based on whether or not it passes that single metric, I feel even more comfortable to extend it to “being able to see it without motion sickness and eye strain”. Wanting your game to be optimized properly and not a juddery slide show isn’t entitlement, it’s the bare minimum of functionality.
Every video game and every TV program for DECADES ran at 30fps. 29.97, actually. Nobody was motion sick or got eye strain.
Removed by mod
Just because you’re okay with 30FPS doesn’t make it “fine” or “good” either. Higher FPS is objectively better. Period. That means 30FPS is bad when the other option is 60FPS (or higher, because the console is being DIRECTLY MARKETED to consumers as a 60FPS-120FPS console).
Nobody was motion sick or got eye strain.
Wow, I didn’t realize you could speak on behalf of everyone’s personal reaction to FPS
Most games of the NES, Genesis, and SNES era ran at 240p, 60fps (in the NTSC regions).
The difference is that TV and movies have a consistent delay between frames. That is often not the case with video games.
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran on solid 60 FPS.
Computers (including consoles) have limited resources, so at some point you need to deal with tradeoffs. For example: do you prioritize graphics quality, or do you prioritize FPS? Do you want/need more resources available for the physics engine? That eats into the maximum possible FPS. Do you want to do real-time procedural generation? Do you want to use the GPU to run some kind of AI? All of these are design considerations, and there’s no one-size-fits-all prioritization for all videogames. Clearly the people working on Starfield believe that, for their intended game experience, graphical fidelity is more important than FPS, and that’s a perfectly valid design choice even if you don’t agree with it.
It’s a matter of optimization and Bethesda games have all had pretty poor optimization. They could get it running at a higher framerate but there’s no need because people will buy it even if it runs 30fps.
If it was only a matter of optimization we would all still be playing games on the original NES.
What’s so revolutionary or ambitious about Starfield that it couldn’t be optimized to an “acceptable” framerate? Pretty much everything Starfield does has been done before, and the Creation Engine isn’t some visual marvel that would burn down graphics cards. So where’s the performance going?
I agree up to a point. If a game is at 30 and feels good to play, then I’m OK. For example, Zelda feels great. Controlling Link is tight and snappy.
On the other hand, if the game has bad frame pacing (like Bloodborne), playing at 30 feels real bad.
I try not to get too crazy about frames, but sometimes some games just don’t feel good.
I will say, though, that while I really like channels like Digital Foundry, I sometimes wonder if them picking apart games to show the most minor frame dips is slowly teaching us to see these things, and as a result we kind of subconsciously will be like, “Well now I noticed this game had some moments where the frames dropped during an explosion. Obviously it’s a bad game.” I know that’s some hyperbole, but still.
It’s also heavily dependent on getting used to it. There are games with a quality mode and a performance mode where I sometimes start to think I’m at 60FPS, until I switch to the actual 60FPS mode and realize it’s a completely different feeling. Switching back makes those 30FPS seem pretty bad. But if I didn’t have the ability to switch between the two, I would’ve been happy with the 30.
But as you said, it has to be rock stable. I played GoW Ragnarök on my PS5, and that quality 30FPS mode was just terrible and felt like 20FPS.
I have no problem playing 30 fps games. It’s when it goes really low, like 10 fps, that it gets choppy. Like Dark Souls in Blighttown; now that isn’t the best to play. But it’s still doable, and an awesome game. 30 fps games are great. 60 is butter, but not necessary.
No, lock it at 24 and add a film grain effect.
30fps for top end consoles in 2023 is absolutely pathetic.
deleted by creator
on Xbox
I don’t know, Bethesda’s games have always been iffy when it comes to FPS. Skyrim, for example, breaks if you mod it to run over 60 fps, if I remember correctly. Even in Fallout 76, movement speed was kinda tied to FPS, so players looking at the ground (so that fewer things render, thus increasing FPS) would run faster than others.
I honestly wouldn’t be surprised if they cap it to 60 FPS on PC as well.
The Skyrim FPS lock to 60 fps was due to the physics engine not working beyond that. Knowing Bethesda, and knowing the fact that in the 30 or so rereleases of Skyrim they’ve never fixed that, I wouldn’t be surprised if that’s still there
Oh, it definitely has something to do with this. They have been dragging this engine with them since Morrowind. I really hope that now that they’re with Microsoft, money can be poured into a new engine built from the ground up for Bethesda-type games.
Funnily enough, despite Fallout 76 being completely (and IMO deservedly) derided, they actually did fix some aspects of the physics being tied to the frame rate. More specifically, the frame rate doesn’t affect player speed. I’m not sure if frame rate is completely decoupled, but there has actually been work done on that front since Skyrim.
Fallout 76 always seemed to me to be an engine modernization project that needed a game attached to fund it. It’s in an okay spot now, and it was properly derided when it was released, but I don’t think most people understand the amount of work it’s taken to update the low-level internals of how their engine works. It’ll be very interesting to see how Starfield plays.
Of course it is. As always a choice between visual quality and framerate.
DF did a pretty decent video on the whole 30fps question. https://www.youtube.com/watch?v=i9ikne_9iEI
I honestly don’t really mind if a console game runs at a steady 30 fps. I just know that it isn’t going to be steady lol
Yeah that’s a weird choice. Todd Howard said that the game has performed upwards of 60fps in some places, but they made the choice to lock it down to 30fps on console for full graphical fidelity.
I get that not everyone has a TV that supports VRR, but they should be able to programmatically check what the Xbox currently supports. If it’s a Series X and does support VRR, they should be able to unlock the FPS up to 60. I mean, even 40fps on the Steam Deck is surprisingly good, whereas 30 can be really jarring. Or give the user a choice: 4K@30 or 1440p@60 with VRR.
If there is a ‘mods’ system like Skyrim on Xbox, it should be possible to remove the frame rate cap. People managed it with Xbox before they added FPS Boost to Skyrim, using INI tweaks and a dummy ESP plugin. That’s without VRR, though.
Ya, I’d also like to see a 40fps mode. Really adds to the smoothness. Todd Howard suggested that they still had a decent amount of overhead, just not enough to hit 60 consistently. Would be nice if 40 became a new standard option, at least.
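The reason 40fps modes need a 120 Hz (or VRR) display is simple arithmetic: 40 divides evenly into 120, so every frame persists exactly three refresh cycles, and its 25 ms frametime sits exactly halfway between 30fps (33.3 ms) and 60fps (16.7 ms). A quick check in Python (assuming a 120 Hz panel):

```python
REFRESH_HZ = 120  # assumed display refresh rate

for fps in (30, 40, 60):
    frametime_ms = 1000 / fps
    cycles = REFRESH_HZ / fps  # whole refresh cycles per game frame
    print(f"{fps}fps: {frametime_ms:.1f} ms per frame, {cycles:.0f} cycles")

# 40fps frametime is the exact midpoint of the 30fps and 60fps frametimes:
midpoint_ms = (1000 / 30 + 1000 / 60) / 2  # 25.0 ms, i.e. 40fps
```

In frametime terms, 40fps is halfway to 60, which is why it feels like a bigger jump from 30 than the number suggests.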
I don’t understand it. If it’s a problem on console, why not have a full-fidelity “quality” mode but also offer a reduced-fidelity “performance” mode? Presumably there could be options like that similar to the PC build.
I just hope it plays at 60+ on PC; who knows, it could be pretty rough. I haven’t watched the Digital Foundry video on it yet; I assume it’s just the Xbox version.
Reminds me of Horizon Zero Dawn running at 30fps but it felt silky smooth because the FPS was rock solid.
yeah… if you’ve never played anything at 60fps or more.
30fps is normally alright for 3rd person adventure games, but shooting, especially first person, might feel different. Idk, I still don’t know how to feel about this one. The digital foundry guys seemed to be supportive, but I still just don’t trust Bethesda lately
Personally, frame rate is much more important to me than most other factors. If there’s the option to, I’m going to be cutting a lot to get it to 60, especially since I’m somehow way below spec with a 6600 XT.
I decided to play Jedi Survivor at 30 on the PS5 just to get a feel for 30, and as I began playing, I’m like, alright, this is okay, I’m loving the graphics and how awesome everything looks. I played for about 30 minutes at 30fps, then decided alright, I’m gonna toggle performance mode on and see how it compares now that I’ve experienced 30, and whew… it was a night-and-day difference. It felt so silky smooth; despite the fps drops in Survivor, it still felt 100,000% better than 30. Just the smoothness and fluidity is insanely good. It was like going from slow motion to real life when I made the switch.
I really hope Starfield can feel good, but man being first person at 30 is gonna be rough I bet. I really hope I’m wrong and it’ll be decent though.
It will be no different than playing Fallout 4 at 30fps on Xbox One in the past, so I don’t mind too much. I also play a lot of games on Steam Deck though, so I’m used to lower framerates. Will be playing Starfield on Xbox Series X though, with Game Pass.
Ya, when switching between the modes, you can really feel the difference. Funnily enough, Jedi Survivor was one of the few games I actually preferred the 30fps mode. Even with VRR, the performance mode was just too inconsistent for me.
I would really like to see a balanced, 40fps mode on Starfield. If they can’t consistently maintain the 60fps mode, at least offer some choice for people who want a higher framerate, even if there are other concessions (ie. inconsistent performance).
FPS itself is not as important as a consistent, low frametime.
If the frametime graph is pretty much flat, stuttering will be low and the overall experience nice; but if it’s janky, you’ll want to quit the game or drop quality settings pretty fast.
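To make that concrete, here’s a toy Python comparison (all numbers invented): two runs with the same average FPS, where the worst-1% frametime — the “1% low” figure benchmarks report — exposes the janky one.

```python
def avg_fps(frametimes):
    # Average FPS = frames rendered / total elapsed time.
    return len(frametimes) / sum(frametimes)

def one_percent_low_fps(frametimes):
    # FPS implied by the worst 1% of frametimes (at least one frame).
    worst = sorted(frametimes, reverse=True)
    n = max(1, len(worst) // 100)
    return n / sum(worst[:n])

flat = [1 / 60] * 120           # perfectly paced: every frame 16.7 ms
janky = [1 / 120, 1 / 40] * 60  # alternating 8.3 ms / 25 ms frames

flat_low = one_percent_low_fps(flat)    # ≈ 60fps
janky_low = one_percent_low_fps(janky)  # ≈ 40fps
```

Same throughput, very different feel; this is why a locked 30 can feel better than an unstable 45-60.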
No Man’s Sky runs at a very stable 60fps, I personally know people who have wrangled it up to 120fps. I know they don’t have the same underlying tech, but they’re very similar in terms of gameplay (from what we’ve seen)
They’re a wildly different level of detail, though. The NMS physics engine is pretty simplistic, mostly affecting NPCs and very few physics objects. Starfield is like other Bethesda games: tons of little items and junk that all have their own physics and interactions.
Ya, they just talked about this on Digital Foundry. Starfield (as with most Bethesda games) has a bunch of persistent objects and NPCs to track, which makes it most likely CPU-limited. They showed a city section in Star Citizen that was getting like 20fps, which would be a better comparison than No Man’s Sky.
My guess is that gunplay in any version of “creation engine” is going to be janky as …
30 FPS for this kind of game shouldn’t matter too much in-world, but I agree it’s pretty disappointing. I’m extremely skeptical about the whole release anyway.