There are a few Windows games running on Linux via Steam Play (Proton) with which [the previous vm.max_map_count] limit can actually be exceeded – DayZ, Hogwarts Legacy, Counter-Strike 2, and other games.
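For anyone wondering what "exceeding the limit" looks like from a program's point of view, here is a rough, self-contained C sketch (mine, not from the article) that keeps creating tiny anonymous mappings until the kernel refuses: once a process holds vm.max_map_count mappings, the next mmap() fails with ENOMEM.

```c
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    long page = sysconf(_SC_PAGESIZE);
    unsigned long count = 0;

    for (;;) {
        /* Reserve two pages, then give the second one back so the kernel
         * cannot merge this mapping with the next one into a single VMA. */
        char *p = mmap(NULL, 2 * page, PROT_READ | PROT_WRITE,
                       MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (p == MAP_FAILED) {
            printf("mmap failed after %lu extra mappings: %s\n",
                   count, strerror(errno));
            return 0;
        }
        munmap(p + page, page);
        count++;
    }
}
```

On a default kernel (limit 65530) it gives up after roughly 65 thousand iterations; with a raised limit it simply runs much longer before failing.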
Sounds good, especially if Ubuntu, Fedora, and other distros have already made this change. Are there any downsides to raising this value?
The only potential con I can find of raising it is from a Red Hat solution page (https://access.redhat.com/solutions/99913): raising the limit may increase memory consumption on the server. There is no immediate consumption of memory, since it is only used when the software requests it, but it can allow a larger application footprint on the server.
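To illustrate the "no immediate consumption" part in general terms (again my own sketch, not from the Red Hat page): creating a mapping is cheap bookkeeping, and physical pages are only allocated once the program actually touches them.

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

/* Print this process's resident set size (VmRSS) from /proc/self/status. */
static void print_rss(const char *label)
{
    char line[256];
    FILE *f = fopen("/proc/self/status", "r");
    if (!f)
        return;
    while (fgets(line, sizeof(line), f))
        if (strncmp(line, "VmRSS:", 6) == 0)
            printf("%-18s %s", label, line);
    fclose(f);
}

int main(void)
{
    size_t len = 512UL * 1024 * 1024;        /* 512 MiB of address space */

    print_rss("before mmap:");
    char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    print_rss("after mmap:");    /* barely moves: nothing has been touched */

    memset(p, 1, len);           /* now physical pages actually get used */
    print_rss("after touching:");

    munmap(p, len);
    return 0;
}
```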
So I guess it just means that apps are less constrained, so when an app or game is a particularly bad RAM hog it can eat up more RAM.
Not really, it seems. Steam has raised it to MAX_INT - 5. Fedora originally planned to do the same but held off after concerns from engineers that having too much mapped could lead to the kernel killing other processes to resolve OOM situations, so they settled on the number that has now also been adopted by Arch. At least, according to another Phoronix article.
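If you want to see where your own system sits, here is a small sketch (mine, with no distro-specific numbers assumed) that reads the current ceiling and counts how many mappings the calling process already uses. For reference, the MAX_INT - 5 figure mentioned above works out to 2147483647 - 5 = 2147483642, while the kernel's long-standing default is 65530.

```c
#include <stdio.h>

/* Each line of /proc/self/maps is one VMA, which is what vm.max_map_count
 * actually limits. */
static long count_lines(const char *path)
{
    FILE *f = fopen(path, "r");
    long n = 0;
    int c;

    if (!f)
        return -1;
    while ((c = fgetc(f)) != EOF)
        if (c == '\n')
            n++;
    fclose(f);
    return n;
}

int main(void)
{
    long limit = -1;
    FILE *f = fopen("/proc/sys/vm/max_map_count", "r");

    if (f) {
        if (fscanf(f, "%ld", &limit) != 1)
            limit = -1;
        fclose(f);
    }
    printf("vm.max_map_count : %ld\n", limit);
    printf("mappings in use  : %ld\n", count_lines("/proc/self/maps"));
    return 0;
}
```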
Now I’m interested where the 5 comes from.
I wonder too!
I’m betting on trial and error. Probably has a mysterious comment in the code, as a warning to others.
Source: I develop software and find these things are either deeply researched and analyzed scientific theory, or the result of guesswork and trial and error. At a ratio of about 95% and 5% respectively.
So "deeply researched" was set at MAX_PERCENT - 5? 95% being trial and error. At least where I work.