The worst kind of Internet-herpaderp.

  • 8 Posts
  • 453 Comments
Joined 2 years ago
Cake day: July 24th, 2023





  • Malix to PC Gaming@lemmy.ca · NEW RELEASE - PGA TOUR 2K25 · 7 days ago

    I’ve played 2k21 and 2k23, both pretty fine for casual/beer golfing with friends while absolutely trash-talking them in voice chat. It’s pretty great… but:

    The only reason I got 2k23 was that 2k21 had stayed at full price even after 2k23 released, so those of my friend group who didn’t already have the game (understandably) didn’t want to cough up 60-70€ for it. Once 2k23 got a reasonable sale, we all got that as it was more affordable - but functionally identical to the older game.

    I’m going to assume the same pattern will repeat, and it’s only because of the in-game mtx, which they want you to buy AGAIN for basically the same game with a different number in the title. To my beer-gaming group the mtx stuff is entirely irrelevant; we just play with whatever gear the game gives us from the get-go, this isn’t an equipment race.



  • Well, it’s a game type/genre I tend to enjoy greatly - so I can probably overlook quite a bit of jank/issues/whatevs and still get some enjoyment out of it. The first round was the “blind go”; on the second round I wanted to see what could be done differently and what kinds of different outcomes there are. IIRC not much changes when doing stuff differently. Admittedly the second round was a bit of a slog - I think I played it through, but not 100% of it.

    To me, “okay” means more of a “more fun than not”. The game isn’t great by any means, but it’s also not off-putting to play; I just don’t feel like I need to reinstall it ever again. Also, the game isn’t terribly long either.

    But, opinions, everybody has them. :)





  • I’m not trying to bust your chops or anything.

    didn’t take it as such, no worries. We cool? :)

    And I bet the whole “'murica” stuff is quiiite a bit more prevalent.

    Aaaaanyhoo, feels like these kinda drunklol games are pretty much just youtuber/streamer-bait for cheap giggles - and I do watch quite a bit of gaming content so these things are kinda inescapable. Oh well. Old man yells at cloud. :P





  • Malix to Mechanical Keyboards@lemmy.ml · I regret getting a Keychron Q1 · 14 days ago

    about the spacebar:

    • you might want to swap the switch on that; it could be a wonky one.
    • unless I’m horribly wrong, the spacebar is the standard 6.25u width, and finding a keyboard with a shorter one could turn out to be nearly impossible unless there’s some unicorn one-of-a-kind layout out there, or you’re looking at ergo/split/40%/other-weirdo ones.

    about via:

    if you can flash it with a firmware which supports vial, it might provide a better customization experience (rough sketch below). edit: linky: https://get.vial.today/
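
    I haven’t flashed a Q1 myself, so take this as a loose sketch of how a vial build usually goes with the vial-qmk fork - the keyboard path and keymap name here are guesses, check vial’s docs for the real ones:

    % git clone https://github.com/vial-kb/vial-qmk
    % cd vial-qmk
    # keyboard path and “vial” keymap name are assumptions - adjust to whatever vial-qmk actually ships for the Q1:
    % make keychron/q1/ansi:vial
    # and with the board in bootloader mode, the same target with :flash appended should flash it:
    % make keychron/q1/ansi:vial:flash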




  • zstd is generally stupidly fast and quite efficient.

    probably not exactly how steam does it, or even close, but as a quick & dirty comparison: I compressed and decompressed a random CD .iso (~375 MB) I had lying about, using zstd and lzma with a 1MB dictionary:

    test system: Arch linux (btw, as is customary) laptop with an AMD Ryzen 7 PRO 7840U cpu.

    used commands & results:

    Zstd:

    # compress (--maxdict 1048576 - sets the used compression dictionary to 1MB):
    % time zstd --maxdict 1048576 < DISC.ISO > DISC.zstd
    zstd --maxdict 1048576 < DISC.ISO > DISC.zstd  1,83s user 0,42s system 120% cpu 1,873 total
    
    # decompress:
    % time zstd -d < DISC.zstd > /dev/null
    zstd -d < DISC.zstd > /dev/null  0,36s user 0,08s system 121% cpu 0,362 total
    
    • resulting archive was 229 MB, ~61% of original.
    • ~1.9s to compress
    • ~0.4s to decompress

    So, pretty quick all around.

    Lzma:

    # compress (the -1e argument selects a preset which uses a 1MB dictionary size):
    % time lzma -1e < DISC.ISO > DISC.lzma
    lzma -1e < DISC.ISO > DISC.lzma  172,65s user 0,91s system 98% cpu 2:56,16 total
    
    # decompress:
    % time lzma -d < DISC.lzma > /dev/null
    lzma -d < DISC.lzma > /dev/null  4,37s user 0,08s system 98% cpu 4,493 total
    
    • ~179 MB archive, ~48% of original.
    • ~3min to compress
    • ~4.5s to decompress

    This one felt like it took forever to compress.

    So, my takeaway here is that lzma’s compression-time cost makes it worth wasting a bit of disk space for the sake of speed.

    and lastly, just because I was curious, I ran zstd at a higher compression level (-9) too:

    % time zstd --maxdict 1048576 -9 < DISC.ISO > DISC.2.zstd
    zstd --maxdict 1048576 -9 < DISC.ISO > DISC.2.zstd  10,98s user 0,40s system 102% cpu 11,129 total
    
    % time zstd -d < DISC.2.zstd > /dev/null 
    zstd -d < DISC.2.zstd > /dev/null  0,47s user 0,07s system 111% cpu 0,488 total
    

    ~11s compression time, ~0.5s decompression, archive size was ~211 MB.
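
    (note that -9 isn’t actually zstd’s maximum - that’d be -19, or -22 with --ultra. I didn’t benchmark those, but the command would look something like this, untested:)

    % time zstd --maxdict 1048576 --ultra -22 < DISC.ISO > DISC.max.zstd

    expect that to be a fair bit slower than -9, though I’d still guess it beats lzma on speed.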

    deemed it wasn’t necessary to spend the time compressing the archive with lzma’s max settings.
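
    for reference, that skipped run would be something like this (the -9e preset being the slowest/strongest one) - untested sketch:

    % time lzma -9e < DISC.ISO > DISC.max.lzma

    given how long -1e already took, I’d expect that to run well past the three-minute mark.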

    Now I’ll be taking notes when people start correcting me & explaining why these “benchmarks” are wrong :P

    edit:

    goofed a bit with the higher-compression run; added the same dictionary size.

    edit 2: one of the reasons for the change might be syncing files between their servers. IIRC zstd can compress in an “rsyncable” mode, allowing partial file syncs instead of re-syncing the entire file, saving bandwidth. Not sure if lzma does the same.
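
    if you want to poke at that mode, recent zstd builds have an --rsyncable flag (IIRC it needs the multithreaded mode, hence -T0) - quick sketch, not benchmarked here:

    % zstd -T0 --rsyncable --maxdict 1048576 < DISC.ISO > DISC.rsync.zstd

    roughly, the flag makes zstd periodically reset its compression state, so unchanged stretches of input produce identical compressed bytes - which is what lets rsync-style tools transfer only the changed chunks.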