• misk
    link
    fedilink
    English
    arrow-up
    179
    ·
    2 months ago

    To be fair, assembly lines of code are fairly short.

    /ducks

      • addie@feddit.uk
        link
        fedilink
        English
        arrow-up
        52
        ·
        2 months ago

        Writing in ASM is not too bad provided that there’s no operating system getting in the way. If you’re on some old 8-bit microcomputer where you’re free to read directly from the input buffers and write directly to the screen framebuffer, or if you’re doing embedded where it’s all memory-mapped IO anyway, then great. Very easy, makes a lot of sense. For games, that era basically ended with DOS, and VGA-compatible cards that you could just write bits to and have them appear on screen.
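
        A minimal sketch of that older world (C for readability; assuming VGA colour text mode on bare metal or DOS, using the standard 0xB8000 text buffer - it won't run under a modern OS, which is rather the point): the screen is just memory, so "printing" is a couple of stores.

          #include <stdint.h>

          /* The VGA colour text buffer sits at physical address 0xB8000:
             80x25 cells, two bytes each (character + attribute).
             Attribute 0x0F = white on black. */
          volatile uint16_t *const vga_text = (uint16_t *)0xB8000;

          void put_char(int row, int col, char c)
          {
              vga_text[row * 80 + col] = (uint16_t)(0x0F00 | (uint8_t)c);
          }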

        Now, you have to display things on the screen by telling the graphics driver to do it, and so a lot of your assembly is just going to be arranging all of your data according to your platform’s C calling convention and then making syscalls, plus other tedious-but-essential requirements like making sure the stack is aligned whenever you make a jump. You might as well write macros to do that since you’ll be doing it a lot, and if you’ve written macros to do it then you might as well be using C instead, since most of C’s keywords and syntax map very closely to the ASM that would be generated by macros.
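
        And a minimal sketch of the newer dance (assuming x86-64 Linux and GCC inline assembly; raw_write is a made-up name): even getting a string out now means loading the registers the kernel's ABI expects and trapping into it.

          /* raw_write is a made-up wrapper; syscall number 1 is write(2) on x86-64 Linux. */
          static long raw_write(int fd, const void *buf, unsigned long len)
          {
              long ret;
              __asm__ volatile (
                  "syscall"                 /* trap into the kernel          */
                  : "=a"(ret)               /* rax: return value             */
                  : "0"(1),                 /* rax: syscall number 1 = write */
                    "D"(fd),                /* rdi: first argument           */
                    "S"(buf),               /* rsi: second argument          */
                    "d"(len)                /* rdx: third argument           */
                  : "rcx", "r11", "memory"  /* clobbered by the instruction  */
              );
              return ret;
          }

          int main(void)
          {
              raw_write(1, "hello\n", 6);
              return 0;
          }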

        A shame - you do learn a lot by having to tell the computer exactly what you want it to do - but I couldn’t recommend it for any non-trivial task any more. Maybe a wee bit of assembly here-and-there when you’ve some very specific data alignment or timing-sensitive requirement.

        • henfredemars@infosec.pub
          link
          fedilink
          English
          arrow-up
          20
          ·
          2 months ago

          I like ASM because it can be delightfully simple, but it’s just not very productive especially in light of today’s tooling. In practice, I use it only when nothing else will do, such as for operating system task schedulers or hardware control. It’s nice to have the opportunity every once in a while to work on an embedded system with no OS but not something I get the chance to do very often.

          On one large ASM project I worked on (an RTOS), it was exactly as you described. You end up developing your own version of everything a C compiler could have done for you for free.

      • Honytawk@lemmy.zip
        link
        fedilink
        English
        arrow-up
        11
        arrow-down
        1
        ·
        2 months ago

        Pssh, if you haven’t coded on punch cards, you aren’t a real coder

  • mtchristo@lemm.ee
    link
    fedilink
    English
    arrow-up
    178
    arrow-down
    9
    ·
    2 months ago

    RollerCoaster Tycoon is a once-in-a-lifetime game.

    Now everything is Electron or React shit. Gone are the times of downloading fully featured software under 10 MB.

    • sushibowl@feddit.nl
      link
      fedilink
      English
      arrow-up
      64
      ·
      2 months ago

      Fun quote from an interview with Chris Sawyer:

      Latterly the machine code came back to haunt us when the decision was made to re-launch the original game on mobile platforms as RollerCoaster Tycoon Classic a few years ago, and it took several years and a small team of programmers to re-write the entire game in C++. It actually took a lot longer to re-write the game in C++ than it took me to write the original machine code version 20 years earlier.

        • Cethin@lemmy.zip
          link
          fedilink
          English
          arrow-up
          16
          ·
          2 months ago

          It’s probably not because it sucks. It’s because they’re trying to perfectly replicate an existing target. They have to read the assembly, digest it, then create the identical solution in C++. If they were just creating a new game, it likely would be much faster.

        • BigDanishGuy@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          3
          ·
          edit-2
          2 months ago

          #include <iostream> // because writing to the console is not included by default.
          int main()
          {
          std::cout << "C++ is simple and fun ... you cretin\n";
          return 0;
          }

          I had a machine language course in uni, parallel with a C++ course. Not a fun semester to be my wife, or a relative of any of my classmates. Best case our brains were in C++ mode, worst case you needed an assembler to understand us.

          And yes I know my code format will piss people off, I don’t care, it’s the way I write when other less informed people don’t force me to conform to their BS “Teh oPeNiNg bracket shouwd bwee on teh sam line ass teh declawation”

          Edit: added a \n for the sake of pedantry :)

      • Klear@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        ·
        2 months ago

        Well worth it. The mobile version is amazing, that is to say, almost exactly the same as the original.

      • CrazyLikeGollum@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        edit-2
        2 months ago

        Is there not a way to take assembly and automatically translate it to some higher level language?

        Edit: Post-post thought: I guess that would basically be one step removed from decompilation which, as I understand it, is a tedious and still fairly manual process.

        • sushibowl@feddit.nl
          link
          fedilink
          English
          arrow-up
          7
          ·
          2 months ago

          Your thought is correct. The basic problem is that higher level languages contain a lot of additional information that is lost in the compilation process.
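
          A tiny illustration (hypothetical code, not the output of any particular decompiler): the compiled form only keeps loads, offsets, and comparisons, so the names and even the struct itself are gone when you translate it back.

            /* What the programmer wrote: */
            struct guest { int hunger; int cash; };

            int wants_food(const struct guest *g)
            {
                return g->hunger > 50 && g->cash >= 3;
            }

            /* Roughly what comes back out of a decompiler (illustrative): */
            int sub_401a20(const void *a1)
            {
                return *(const int *)a1 > 50 && *((const int *)a1 + 1) >= 3;
            }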

          • Saleh@feddit.org
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 months ago

            But do we need this information then? Shouldn’t it be possible to just write what the assembly is doing as C++ code?

            High-level languages also support stuff like bitwise operators and so on, after all.

            • sushibowl@feddit.nl
              link
              fedilink
              English
              arrow-up
              5
              ·
              2 months ago

              You could, but there isn’t much benefit. The purpose of all that extra information is generally to make the program easier to understand for a human. The computer doesn’t need any of it, that’s why it’s not preserved in compilation. So it is possible to automatically translate assembly to C++, but the resulting program would not be much (if any) easier for a human to understand and work with.

              To give a bad analogy, imagine some driving directions: turn left at 9th street, enter the highway at ramp 36, go right when you’re past the burger king, etc. These are translated into physical control inputs by the driver to actually take the car to its destination. Now we could look at the driver’s physical inputs and turn that back into a written list of instructions: turn the wheel left 70 degrees, turn it right 70 degrees, push the gas for 10 seconds, and so on.

              All the street name references are now gone. There are no abstracted instructions like “enter the highway” or even “take the second left.” It would be quite difficult for a person to look at these instructions and figure out the trip’s destination. Let alone make some alterations to it because there is roadwork along the way and a detour is needed.

              • Saleh@feddit.org
                link
                fedilink
                English
                arrow-up
                2
                ·
                2 months ago

                I get that. But the game is “finished”; there is no need for alterations. Translating the assembly code into C++ in this way could serve to quickly get it into a format that is then compilable for other platforms.

                • sushibowl@feddit.nl
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  ·
                  2 months ago

                  But the game is “finished”. there is no need for alterations.

                  If only that were the case. But there is no chance a game built for Windows 95 could run unaltered on an Android phone. Things like the rendering system, input handling, and sound output will need to be adapted to work on a new platform.

            • themoonisacheese@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              Take Haskell, for example. It’s a functional, typed language. In Haskell, at compile time, the compiler analyzes the types of all your functions, and if they all match, it drops them completely. There is no type information at all left in a compiled Haskell program, because the compiler can know ahead of runtime that it is correct.

    • flashgnash@lemm.ee
      link
      fedilink
      English
      arrow-up
      35
      arrow-down
      6
      ·
      edit-2
      2 months ago

      I don’t think old = good is a good mentality though; a lot of people seem to have it.

      All the old software I know and use is exceptionally good, but I’ve heard about it and chosen to use it because it’s survived the test of time (also because it’s still actively maintained and has had thousands of bug fixes over the years).

      VS Code and Obsidian are pretty good and they’re Electron, Discord’s alright, and I’m pretty sure Steam uses some kind of web wrapper as well.

      The real issue is that Electron is very accessible to inexperienced developers and easy to do badly, but I imagine people back in the old Unix days got an equal amount of shitty, bloated software.

      • lorty@lemmy.ml
        link
        fedilink
        English
        arrow-up
        20
        ·
        2 months ago

        Survivor bias is a thing and part of the reason people are nostalgic for old media.

        • Lennny@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          ·
          edit-2
          2 months ago

          For every There Will Be Blood, there exists an Alien vs Predator: Requiem

          • Madison420@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            1
            ·
            2 months ago

            I mean, AvP: Requiem is a cult classic because it’s so bad, so maybe not a good example. You need a movie that is immediately forgettable, which is difficult because I forgot their names…

      • PraiseTheSoup@lemm.ee
        link
        fedilink
        English
        arrow-up
        16
        arrow-down
        4
        ·
        2 months ago

        Discord is garbage software lmao. Has been from the beginning. I can’t stand using it.

        • Buddahriffic@lemmy.world
          link
          fedilink
          English
          arrow-up
          6
          ·
          2 months ago

          And I had to stop using vscode because of its ridiculous resource usage. I got tired of it filling up my home dir and just went back to vim.

          An intern was using it, but I saw that he had set it up to run locally and connect to the ETX we were using and figured he had found a way to avoid that. Nope, turns out it runs a server on the ETX that also likes to fill up the home dir and he also just uses vim now.

        • SynopsisTantilize@lemm.ee
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          1
          ·
          2 months ago

          Seconded. The only reason I have it installed is because my buddy refuses to answer his cell while we play games.

            • SynopsisTantilize@lemm.ee
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              2 months ago

              I’d rather do a phone call on speakerphone while playing games…yes. I don’t wear headphones unless they’re wireless and I only put in one ear.

              • vonbaronhans@midwest.social
                link
                fedilink
                English
                arrow-up
                5
                ·
                edit-2
                2 months ago

                So like… do you play the game with no sound? Does your gaming partner hear everything coming through your speakers into your phone’s microphone?

                I’m just struggling to understand how that could be a good experience for anyone, including you. Am I just missing something?

                Edit: oh, I missed the wireless earphone on one side thing. Is that for your phone or for the game?

                • SynopsisTantilize@lemm.ee
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  2 months ago

                  The headphone would be for the phone. And background noise cancellation is perfect nowadays, so it’s not an issue with speakers playing at a reasonable level.

        • flashgnash@lemm.ee
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          1
          ·
          2 months ago

          I’m not saying it’s phenomenal, but it’s generally pretty well featured; running in a browser, it’s not that heavy resource-wise, and the API/developer features are very good.

      • bufalo1973@lemmy.ml
        link
        fedilink
        English
        arrow-up
        6
        ·
        2 months ago

        A fucking calculator needs megabytes to run? And I’m not talking about a full fledged graphic scientific calculator. I’m talking about a basic one.

        • flashgnash@lemm.ee
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          edit-2
          2 months ago

          GNOME Calculator uses 103 MB; it’s loading style sheets for themes, UI libraries that make it look nice and modern, scientific calculator features, keyboard shortcuts, a nice graphical settings menu, touch screen and screen reader support, etc.

          I don’t think, in this day and age, for all the niceties people are used to, that’s unreasonable.

          Also, other calculators are available. Some are bloated, but I’m sure there’s a Rust or C one out there somewhere that uses a fraction of that with the bare minimum feature set.

          • UpperBroccoli@lemmy.blahaj.zone
            link
            fedilink
            English
            arrow-up
            8
            ·
            2 months ago

            bc is 91 kilobytes and can work with seriously big numbers.

            You want to know what 2^99812 is? bc will tell you. Hint: the result is so big I could not paste it in here. bc does not care, bc just delivers.

            Not saying there is anything wrong with a GUI calculator using 103 MB of RAM and looking fancy while only working with tiny numbers, just saying.

            • flashgnash@lemm.ee
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              I mean personally if I need a heavy duty calculator I’ll just use python or something

      • Peruvian_Skies@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        1
        ·
        edit-2
        2 months ago

        Old = good is a great mentality specifically when standing the test of time is an important factor. For the most part, the old code that’s still used today is only still used because it’s proven good, whereas it’s a grab bag with newer code. And that’s the cause of the unwarranted nostalgia that you’re rightfully criticising.

        It’s like with music. “Oh, the X’s were the best decade for music, today’s music is garbage.” No, 90% of everything is crud, but unless you’re an enthusiast, once enough time has passed, you’ll only ever be exposed to the 10% that isn’t. 50 years from now nobody is going to be listening to Cardi B.

        • psud@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          2 months ago

          I listen to music on a new music radio station - the good new music really stands out

          Most people just like the (better bits of) stuff they listened to when they were young

          • Peruvian_Skies@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            1
            ·
            2 months ago

            That isn’t the whole picture. I was born in 1988. The sampling of music from the 70’s that I’ve been exposed to is completely different to the sampling of music from the same period that someone born in '58 was exposed to in their lifetime. They got to listen to a bunch of bad stuff (and probably some great stuff) that I don’t even know exists.

      • I Cast Fist@programming.dev
        link
        fedilink
        English
        arrow-up
        1
        ·
        2 months ago

        If you want to get a glimpse of the hate against Unix in the early ’90s, give “The UNIX-HATERS Handbook” a read. It’s a funny piece.

      • otp@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        47
        arrow-down
        4
        ·
        2 months ago

        Probably not as optimized though.

        RCT could run on a toaster from the 90’s (ok, maybe early 2000’s) and looked amazing for the time.

        OpenRCT can run on a toaster from the 2010’s and looks great because of the timeless art style of the original.

        It’s still an incredible feat, though!

        • patatahooligan@lemmy.world
          link
          fedilink
          English
          arrow-up
          22
          ·
          2 months ago

          You are very unlikely to write assembly that is more optimized than what a modern compiler could produce for anything longer than a trivial program. I don’t know if it made sense at the time of the original RCT, but OpenRCT would definitely not benefit from being written in assembly.

          • jas0n@lemmy.world
            link
            fedilink
            English
            arrow-up
            8
            ·
            edit-2
            2 months ago

            I feel like that’s only true if I were asked to “write the assembly for this C++ program.” If I’m actually implementing something big in assembly, I’m not going to do 90% of the craziness someone might be tempted to do in C++. Just because something is super easy to write in C++ doesn’t mean it’s easy for the CPU. Writing assembly, I’m going to do what’s easy for the CPU (and efficient), because now I’m in the same domain.

            The bottom line is that cranking up the optimization level can get you a 2-5x win. Using memory efficiently can give you a 10-100x win.
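
            A rough sketch of the kind of memory-layout decision in question (illustrative C; the structs are made up): the same update over an array of fat structs versus over flat arrays, where the second version actually uses every cache line it pulls in.

              #include <stddef.h>

              /* Array-of-structs: updating x drags all 32 bytes of every
                 particle through the cache even though only 8 are needed. */
              struct particle { float x, y, z, w, vx, vy, vz, vw; };

              void step_aos(struct particle *p, size_t n, float dt)
              {
                  for (size_t i = 0; i < n; i++)
                      p[i].x += p[i].vx * dt;
              }

              /* Struct-of-arrays: the same update walks two dense arrays,
                 so every cache line fetched is fully used. */
              struct particles { float *x; float *vx; /* ... */ };

              void step_soa(struct particles *p, size_t n, float dt)
              {
                  for (size_t i = 0; i < n; i++)
                      p->x[i] += p->vx[i] * dt;
              }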

            • patatahooligan@lemmy.world
              link
              fedilink
              English
              arrow-up
              7
              ·
              2 months ago

              Using memory efficiently can give you a 10-100x win.

              Yes, it can. But why is this exclusive to assembly? What are you planning to do with your memory use in assembly that is not achievable in C++ or other languages? Memory optimizations are largely about data structures and access patterns. This is available to you in C++.

              Also, if you don’t want 90% of the craziness of C++, then why not just code in C++ without 90% of the craziness? As far as I know, that’s what a lot of performance-critical projects do. They operate with a feature whitelist/blacklist. Don’t tell me you have the discipline to work entirely in assembly and the knowledge to beat the compiler at the low-level stuff that is not available to you in C++, but you can’t manage to avoid the costly abstractions.

              I think it speaks volumes how rarely you hear about programs being written in assembly. It’s always this one game, and never any meaningful way to prove that it would gain performance over being written in C++ with a modern compiler.

              • jas0n@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                2 months ago

                I shouldn’t have used C++ as the example. Even C would work. I agree with everything you’re saying except the original premise. I think if you put ASM vs C, C++, Rust, etc., performance would fall near 50/50.

                I’m not the best assembly guy, and I’m not advocating we all write it. But I always felt that the compiler optimization assumption was wrong or weak. Everything would be aligned nicely for my sanity, not performance =]

  • lugal@lemmy.world
    link
    fedilink
    English
    arrow-up
    156
    arrow-down
    1
    ·
    2 months ago

    I don’t know if everyone gets the reference: RollerCoaster Tycoon was in fact written mostly in assembly to use the hardware more efficiently.

      • Faresh@lemmy.ml
        link
        fedilink
        English
        arrow-up
        54
        arrow-down
        1
        ·
        edit-2
        2 months ago

        Writing it in assembly would make it pretty much the opposite of portable (not accounting for emulation), since you are directly giving instructions to specific hardware and a specific OS.

          • __dev@lemmy.world
            link
            fedilink
            English
            arrow-up
            12
            ·
            2 months ago

            That’s no less true than games written in C, or otherwise with few dependencies. Doom is way more portable than RCT precisely because it’s written in C instead of assembly.

          • Faresh@lemmy.ml
            link
            fedilink
            English
            arrow-up
            5
            ·
            edit-2
            2 months ago

            you’re not usually directly accessing/working on the hardware

            I mean, you are. Sure, there’s a layer of abstraction when doing tasks that require the intervention of the kernel, but you are still dealing with CPU registers and stuff like that. Merely by writing in assembly you are making your software less portable, because you are writing for a specific ISA that only a certain family of processors can read, and talking with the kernel through an API or ABI that is specific to the kernel (standards like POSIX mitigate the latter part somewhat, but some systems (Windows) aren’t POSIX compliant).

          • Fubber Nuckin'@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 months ago

            We created the world of monorail 1. Everything exists to bring more people to monorail 1. What is monorail 1? It is a 4 car monorail that takes the shortest possible path back to the start of the station. We have several other attractions at the park such as: The Pit; Memento Mori; Install CSS, but none of them are the main attraction.

  • einlander@lemmy.world
    link
    fedilink
    English
    arrow-up
    152
    arrow-down
    7
    ·
    2 months ago
    • Programming was never meant to be abstracted so far from the hardware.
    • 640k is enough RAM for everybody.
    • They come with names like Rust, TypeScript, Go, and Python. Names thought up by imbeciles.
    • Dev environments, environment variables, build and make scripts, and macros, from the minds of the utterly deranged.

    They have played us for fools

    • mynameisigglepiggle@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      2 months ago

      I dabbled with making a fairly complex program for a microcontroller the other day and quickly hit the stack limit for a simple object.

      It wasn’t so much that it was a large object, but to provide flexibility I was amazed how fast I filled the memory.

      I’ve done heaps with memory managed languages in the past but shit as soon as I had to think about what I was doing under the hood everything got hard af.

      So serious question - does anyone have any good resources for a competent programmer, but with no clue whatsoever how to manage memory in a microcontroller space and avoid fragmentation etc?

      I got it to work, but I’m sure I did a shit job and want to be better at it.

      • BigDanishGuy@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        8
        ·
        2 months ago

        The best book I’ve ever bought on programming, and the second best book I bought for a class in uni, was https://dl.acm.org/doi/book/10.5555/1824214 it may be worth checking out on libgen and buy if it suits your needs.

        Whenever I do low-level programming on the AVR architecture, I’ll make a memory map. As in I’ll map out where I’ll put what. It may not be suitable for more complex programs, but it does the job for me. And it has enabled teamwork in assembly in the past.

        If you want to work in a language that doesn’t offer memory management, but manually mapping memory isn’t feasible either, how about building your own memory management? Or perhaps use an RTOS? I’ve used FreeRTOS before on various ARM-based micros, and it does take a bit to get started, but after that it’s easy sailing.
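
        One common way to roll your own on a small micro is a pool of fixed-size blocks carved out of a static array, so there is nothing to fragment. A minimal sketch (sizes and names are made up, and it ignores interrupts/thread safety):

          #include <stdint.h>
          #include <stddef.h>

          #define BLOCK_SIZE  32            /* must be >= sizeof(void *) */
          #define BLOCK_COUNT 16

          static _Alignas(void *) uint8_t pool[BLOCK_COUNT][BLOCK_SIZE];
          static void *free_list;

          void pool_init(void)
          {
              for (int i = 0; i < BLOCK_COUNT; i++) {
                  *(void **)pool[i] = free_list;   /* each free block stores the next */
                  free_list = pool[i];
              }
          }

          void *pool_alloc(void)                   /* returns NULL when exhausted */
          {
              void *block = free_list;
              if (block) free_list = *(void **)block;
              return block;
          }

          void pool_free(void *block)
          {
              *(void **)block = free_list;
              free_list = block;
          }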

        Sorry for the following tangent, all semi intelligent content in this comment is found above this line.
        BTW I tried CoOS once, I wouldn’t recommend it… OK it was 12 years ago, I can’t remember exactly what was wrong other than the documentation was crap, but I don’t need to remember why to hold a grudge.

    • CascadianGiraffe@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      2 months ago

      Adobe promised that Lingo was the future of ‘PC and internet gaming’

      Luckily, by the time I had to learn to write that garbage, I already coded in several other languages. That made it easier, but somehow more painful. I’m pretty sure that shit was designed so that executives could look at the code and pretend they understood what was going on. At least with ‘common terms’ it eliminated the need for comments most of the time. One line of code would take a paragraph of text lol.

  • Wilzax@lemmy.world
    link
    fedilink
    English
    arrow-up
    142
    arrow-down
    5
    ·
    2 months ago

    Your game will actually likely be more efficient if written in C. The GCC compiler has become ridiculously good at optimizing and probably knows more tricks than you do.
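
    One small example of the kind of trick (a sketch, nothing specific to any game): divide by a constant, and a straightforward hand translation reaches for the slow div instruction, while an optimizing compiler typically emits a multiply by a precomputed “magic” reciprocal plus shifts instead.

      unsigned thirds(unsigned x)
      {
          /* Compiled with optimizations on, this typically becomes a
             multiply-high by 0xAAAAAAAB and a shift rather than a div. */
          return x / 3;
      }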

    • dejected_warp_core@lemmy.world
      link
      fedilink
      English
      arrow-up
      46
      ·
      2 months ago

      Especially these days. Current-gen x86 architecture has all kinds of insane optimizations and special instruction sets that the Pentium I never had (e.g. SSE). You really do need a higher-level compiler at your back to make the most of it these days. And even then, there are cases where you have to resort to inline ASM or processor-specific intrinsics to optimize to the level that Roller Coaster Tycoon is/was. (original system specs)

      • KubeRoot@discuss.tchncs.de
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 months ago

        I might be wrong, but doesn’t SSE require you to explicitly use it in C/C++? Laying out your data as arrays and specifically calling the SIMD operations on them?

        • acockworkorange@mander.xyz
          link
          fedilink
          English
          arrow-up
          5
          ·
          2 months ago

          There’s absolutely nothing you can do in C that you can’t also do in assembly. Because assembly is just the bunch of bits that the compiler generates.

          That said, you’d have to be insane to write a game featuring SIMD instructions these days in assembly.

          • Buddahriffic@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            ·
            2 months ago

            I think they meant the other way around, that if you wanted to use it in C/C++, you’d have to either use assembly or some specific SSE construct otherwise the compiler wouldn’t bother.

            That probably was the case at one point, but I’d be surprised if it’s still the case. Though maybe that’s part of the reason why the Intel compiler can generate faster code. But I suspect it’s more of a case of better optimization by people who have a better understanding of how it works under the hood, and maybe better utilization of newer instruction set extensions.

            SSE has been around for a long time and is present in most (all?) x86 chips these days and I’d be very surprised if gcc and other popular compilers don’t use it effectively today. Some of the other extensions might be different though.

            • calcopiritus@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              ·
              2 months ago

              If you want to use instructions from an extension (for example SIMD), you either provide 2 versions of the function, or your program just won’t run on some CPUs. It would be weird for someone who doesn’t know about that to compile it for x86 and then have it not run on another x86 machine. I don’t think compilers use those instructions if you don’t tell them to.

              Anyway, the SIMD the compilers will do is nowhere near the amount that’s possible. If you manually use SIMD intrinsics/inline SIMD assembly, chances are it will be faster than what the compiler would do. Especially because you are reducing the % of CPUs your program can run on.
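
              For the curious, this is roughly what “manually use SIMD intrinsics” looks like in C (a minimal sketch using SSE; n is assumed to be a multiple of 4 to keep it short):

                #include <immintrin.h>

                /* Plain loop: the compiler may or may not auto-vectorize it. */
                void add_scalar(const float *a, const float *b, float *out, int n)
                {
                    for (int i = 0; i < n; i++)
                        out[i] = a[i] + b[i];
                }

                /* Explicit SSE: four float lanes per iteration. */
                void add_sse(const float *a, const float *b, float *out, int n)
                {
                    for (int i = 0; i < n; i += 4) {
                        __m128 va = _mm_loadu_ps(a + i);
                        __m128 vb = _mm_loadu_ps(b + i);
                        _mm_storeu_ps(out + i, _mm_add_ps(va, vb));
                    }
                }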

            • acockworkorange@mander.xyz
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              Oh I see your point. Yeah, I think they meant that. And yes, there was a time you’d have to do trickery in C to force the use of SSE or whatever extensions you wanted to use.

          • Wilzax@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            ·
            2 months ago

            Technically assembly is a human-readable, paper-thin abstraction of the machine code. It really only implements one additional feature over raw machine code and that’s labels, which prevents you from having to rewrite jump and goto instructions EVERY TIME you refactor upstream code to have a different number of instructions.

            So not strictly the bunch of bits. But very close to it.

        • dejected_warp_core@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          2 months ago

          Honestly, I’m not 100% sure. I would bet that a modern compiler would just “do the right thing” but I’ve never written code in such a high performance fashion before.

      • Wilzax@lemmy.world
        link
        fedilink
        English
        arrow-up
        30
        ·
        2 months ago

        If you’re writing sloppy C code your assembly code probably won’t work either

          • calcopiritus@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 months ago

            I recently came across a Rust book on how pointers aren’t just ints, because of UB.

            #include <stdio.h>

            int main(void) {
                int x = 1, y = 2;
                int *a = &x;
                int *b = &y;
                a = a + 1;            // one past the end of x; may happen to equal &y
                if (a == b) {
                    *a = 3;           // UB: a was derived from x, not from y
                    printf("%d\n", *b);
                }
            }
            

            This may either print nothing, print 3, or print 2.

            Depending on the compiler: since y is never written to through b, it might optimize the printf into printing a constant 2 instead of actually reading *b. Even though everyone can agree that it should either print nothing or print 3, never 2.

          • Buddahriffic@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            2 months ago

            A compiler making assumptions like that about undefined behaviour sounds just like a bug. Maybe the bug is in the spec rather than the compiler, but I can’t think of any time it would be better to optimize that code out entirely because UB is detected rather than just throwing an error or warning and otherwise ignoring the edge cases where the behaviour might break. It sounds like the worst possible option exactly for the reasons listed in that blog.

            • calcopiritus@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              The thing about UB is that many optimizations are possible precisely because the spec specified it as UB. And the spec did so in order to make these optimizations possible.

              Codebases are not 6 lines long, they are hundreds of thousands. Without optimizations like those, many CPU cycles would be lost to unnecessary code being executed.

              If you write C/C++, it is because you either hate yourself or the application’s performance is important, and these optimizations are needed.

              The reason rust is so impressive nowadays is that you can write high performing code without risking accidentally doing UB. And if you are going to write code that might result in UB, you have to explicitly state so with unsafe. But for C/C++, there’s no saving. If you want your compiler to optimize code in those languages, you are going to have loaded guns pointing at your feet all the time.

  • MonkeMischief@lemmy.today
    link
    fedilink
    English
    arrow-up
    129
    ·
    2 months ago

    I love Roller Coaster Tycoon. It’s absolutely crazy how he managed to write a game in a way many wouldn’t even attempt even in those days, but it’s not just a technical feat, it’s a creative masterpiece that’s still an absolute blast to play.

    It still blows my mind how smoothly it gives the illusion of 3D and physics, yet it can run on almost anything.

    OpenRCT brings a lot of quality of life and is often the recommended way to play today, but the original RCT will always deserve a spot on any “Best Games of All Time” list.

    • dai@lemmy.world
      link
      fedilink
      English
      arrow-up
      16
      ·
      2 months ago

      It was even ported to the original Xbox. I remember the total game file size being incredibly small compared to most other titles on that system.

      • ClassifiedPancake@discuss.tchncs.de
        link
        fedilink
        English
        arrow-up
        30
        arrow-down
        1
        ·
        edit-2
        2 months ago

        I’m a developer, I don’t just continue doing things for years if it doesn’t make sense.

        (If I’m the one making the decisions)

          • unemployedclaquer
            link
            fedilink
            English
            arrow-up
            10
            arrow-down
            3
            ·
            2 months ago

            programmers just not a uniform bunch. not all of them blockchain grifters. fancy that.

        • ziggurat@lemmy.world
          link
          fedilink
          English
          arrow-up
          15
          ·
          2 months ago

          Like the classic: inheriting a broken code base and not being allowed by the owner to rewrite it from scratch. So you have to spend more time making each part work without the others working. And before you’re finished, the customer says they have something else for you to do.

          • derpgon@programming.dev
            link
            fedilink
            English
            arrow-up
            7
            ·
            2 months ago

            That’s when you start introducing modules that have the least impact on the legacy code base. Messaging is a good place to start, but building new code next to the existing code and slowly refactoring whenever you’ve got time to spare is at least a bearable way to go about it.

            • drphungky@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              Shhhh you just described iterative development. Careful not to be pro agile, or the developers with no social skills will start attacking you for being a scrum master in disguise!

              • derpgon@programming.dev
                link
                fedilink
                English
                arrow-up
                2
                ·
                2 months ago

                Fuck agile, or scrum, or whatever it is called. I just look at the issues and pick whatever I feel like doing. Kanban for life.

          • jabjoe@feddit.uk
            link
            fedilink
            English
            arrow-up
            5
            ·
            2 months ago

            Programmers love to rewrite things, but it’s often not a good idea, let alone good for a business. Old code can be ugly because it is covered with horrible lessons and compromises. A rewrite can be the right thing, but it’s not to be taken lightly. It needs to be budgeted for, signed off on, and carefully planned. The old system needs to be stable enough to continue until the new system can replace it.

            • ziggurat@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              Okay, I’ll tell you: in this situation, the code never really worked outside of the demo stage. It was written in bash + Ansible + Terraform + Puppet, designed to use SSH from a Docker container and run stages of the code on different servers. Some of it supposedly worked on his computer, but after it failed to run when he wasn’t the one clicking the buttons, and I had read through each part, I can promise you that it never worked.

              I didn’t say “broken code base” because I didn’t like the code; I meant that it didn’t work.

              • jabjoe@feddit.uk
                link
                fedilink
                English
                arrow-up
                1
                ·
                2 months ago

                The whole point of Docker is to solve the “works on my computer” problem by providing the developer’s hacked-up OS along with the app. (Rather than fixing it and dealing with dependencies like a grown-up.)

                Bit special for it to still be broken. If it flat out doesn’t work, at all, then it may well be “sunk cost fallacy” to keep working on it. There is no universal answer, but there is a developer tendency to rewrite.

                • ziggurat@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  2 months ago

                  I’ll concede that his point in using Docker was to avoid the “it works on my computer” problem. It was literally one of his talking points in his handover meeting. But that is not the problem Docker is trying to solve, and not its strength.

                  Docker and similar container software makes many things very convenient, and has uses far outside its originally intended usage.

                  And in this situation, we wanted stable package versions and a simpler, uniform setup. You don’t get stable package versions, because Docker doesn’t provide reproducible builds (and he didn’t do the work to get around that), and it is not a simpler setup when you want to use the host’s SSH agent with SSH inside Docker, which requires different steps for different distros and Mac, and I don’t know if Windows would have worked. And sharing your SSH agent into the Docker image is not stable either; even if you set it up, it isn’t sure to work after the next reboot. And it can be very difficult in some Linux distros due to permissions, etc.

                  Then I ended up putting it on a VM that is already used for utilities. If I were to do it today, I would probably use Nix to run these programs, which are very sensitive to version changes, in a stable, reproducible environment that can run on any Linux distro, including in Docker.

                  But the program had many more issues, like editing YAML files by catting them, piping them into tac, piping into sed, and then into tac again… And before you say you could do that with just one sed command: sure, but the sane solution is to use yq. Let’s just say that was the tip of the iceberg.

                  Oh, and I just have to note: it claimed working features, but there was no way for that code to be executed, and when I actually tried to hook it up, I can’t believe it ever fully worked.

  • Valmond@lemmy.world
    link
    fedilink
    English
    arrow-up
    71
    arrow-down
    4
    ·
    2 months ago

    try writing it in Assembly

    Small error, game crashes and takes the whole PC with it, burning a hole in the ground.

    • Flying Squid@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 months ago

      It was really easy to crash an Apple II game and get into the assembler. And my goodness am I glad I didn’t destroy my computer as a kid randomly typing things in to see what would happen.

      • Valmond@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 months ago

        I remember old Apples; we had to use them when learning to program. There were 2 types, one with the OS on a diskette, one with a small hard drive, and they randomly showed a large bomb in the middle of the screen and you had to reload the OS. Probably the compiler that broke everything.

  • UnderpantsWeevil@lemmy.world
    link
    fedilink
    English
    arrow-up
    52
    arrow-down
    2
    ·
    2 months ago

    Step 1: Begin writing in Assembly

    Step 2: Write C

    Step 3: Use C to write C#

    Step 4: Implement Unity

    Step 5: Write your game

    Step 6: ???

    Step 7: Profit

    • Capt. Wolf@lemmy.world
      link
      fedilink
      English
      arrow-up
      30
      ·
      2 months ago

      I tried decades ago. I grew up learning BASIC and then C, how hard could it be? For a 12 year old with no formal teacher and only books to go off of, it turns out: very. I’ve learned a lot of coding languages on my own since, but I still can’t make heads or tails of assembly.

      • Dubiousx99@lemmy.world
        link
        fedilink
        English
        arrow-up
        23
        ·
        2 months ago

        Assembly requires knowledge of the CPU architecture pipeline and memory storage addressing. Those concepts are generally abstracted away in modern languages.

        • WolfLink@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          12
          ·
          edit-2
          2 months ago

          You don’t need to know the details of the CPU architecture and pipeline, just the instruction set.

          Memory addressing is barely abstracted in C, and indexing in some form of list is common in most programming languages, so I don’t think that’s too hard to learn.

          You might need to learn the details of the OS. That would get more complicated.

          • Dubiousx99@lemmy.world
            link
            fedilink
            English
            arrow-up
            7
            ·
            2 months ago

            I said modern programming languages. I do not consider C a modern language. The point still stands about abstraction in modern languages. You don’t need to understand memory allocation to code in modern languages, but the understanding will greatly benefit you.

            I still contend that knowledge of the CPU pipeline is important, or else you will wind up with a bunch of code that constantly results in CPU interrupts. I guess you could say you can code in assembly without knowledge of the CPU architecture, but you won’t be making any code that runs better than the output code from other languages.

      • zod000@lemmy.ml
        link
        fedilink
        English
        arrow-up
        5
        ·
        2 months ago

        Sounds very similar to my own experience though there was a large amount of Pascal in between BASIC and C.

        • Capt. Wolf@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          ·
          2 months ago

          Yeah, I skipped Pascal, but it at least makes sense when you look at it. By the time my family finally jumped over to PC, C was more viable. Then in college, when I finally had the opportunity to formally learn, it was just C++ and HTML… We didn’t even get Java!

          • zod000@lemmy.ml
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 months ago

            I had used like four different flavors of BASIC by the time I got an IBM-compatible PC, but I ended up getting on the Borland train and ended up with Turbo Pascal, Turbo C, and Turbo ASM (and Turbo C++, which I totally bounced off of). I was in the first class at my school that learned Java in college. It was the brand new version 1.0.6! It was so rough and new, but honestly I liked it. It’s wildly different now.

  • Gork@lemm.ee
    link
    fedilink
    English
    arrow-up
    27
    ·
    2 months ago

    Shifts bit to the left

    Um what am I doing

    Shifts bit to the right

    program crashes

    • ayyy@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      68
      ·
      2 months ago

      The game Roller Coaster Tycoon was famously hand written in raw CPU instructions (called assembly language). It’s only one step removed from writing literal ones and zeros. Normally computers are programmed using a human-friendly language which is then “compiled” into CPU instructions so that the humans don’t have to deal with the tedium and complication of writing CPU instructions.

      • OsrsNeedsF2P@lemmy.ml
        link
        fedilink
        English
        arrow-up
        22
        ·
        edit-2
        2 months ago

        To further emphasize this, I had an assembly course in university. During my first lab, the instructor told us to add a comment explaining what every line of assembly code did, because if we didn’t, we would forget what we wrote.

        I listened to his advice, but one day I was in a rush, so I didn’t leave comments. I swear, I looked away from the computer for like 2 minutes, looked back, and had no idea what I wrote. I basically had to redo my work.

        It is not that much better than reading 1s and 0s. In fact, in that course we spent a lot of time converting 1s and 0s (by hand) to assembly and back. I got pretty good at it, but would never even think of writing a game. I would literally rather create my own compiler and programming language than write a game in assembly.
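
        A flavour of that exercise (x86-64 used purely as an illustration): the same tiny function as raw machine-code bytes and as the assembly they decode to.

          /* A function that just returns 7, as machine code bytes: */
          static const unsigned char return_seven[] = {
              0xB8, 0x07, 0x00, 0x00, 0x00,   /* mov eax, 7 */
              0xC3                            /* ret        */
          };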

        • pivot_root@lemmy.world
          link
          fedilink
          English
          arrow-up
          14
          ·
          2 months ago

          I’m probably completely insane and deranged, but I actually like assembly. With decent reverse engineering software like Ghidra, it’s not terribly difficult to understand the intent and operation of isolated functions.

            Mnemonics for the amd64 AVX extensions can go the fuck right off a bridge, though. VCVTTPS2UQQ might as well be my hands rolling across a keyboard, not a truncating conversion from packed single-precision floats into packed unsigned quadword integers.

          • emergencybird@lemmy.world
            link
            fedilink
            English
            arrow-up
            9
            ·
            2 months ago

            I had a course in uni that taught us assembler on z/os. My advisor told me most students fail the course on the first try because it was so tough and my Prof for that course said if any of us managed to get at least a B in the course, he’d write us a rec letter for graduate school. That course was the most difficult and most fun I’ve ever had. I learned how to properly use registers to store my values for calculations, I learned how to use subroutines. Earned myself that B and went on to take the follow up course which was COBOL. You’re not crazy, I yearn to go back to doing low level programming, I’m mostly doing ruby for my job but I think my heart never left assembler hahaha

          • MonkderVierte@lemmy.ml
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            2 months ago

            Ah yes, there was this guy in our tech school class who used to code golf in assembly. He was an ace at math and analytics too, which might explain it somewhat. Well, everyone is different I guess.

        • ericbomb@lemmy.world
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          1
          ·
          2 months ago

          To send the point home even more, this is how in python you make a line of text display:

          print("Hello World")

          This is the same thing, in assembly (According to a blog I found. I can’t read this. I am not built better.)

            org  0x100        ; .com files always start 256 bytes into the segment
          
              ; int 21h is going to want...
          
              mov  dx, msg      ; the address of or message in dx
              mov  ah, 9        ; ah=9 - "print string" sub-function
              int  0x21         ; call dos services
          
              mov  ah, 0x4c     ; "terminate program" sub-function
              int  0x21         ; call dos services
          
              msg  db 'Hello, World!', 0x0d, 0x0a, '$'   ; $-terminated message
          

          But python turns that cute little line up top, into that mess at the bottom.

          I like python. Python is cute. Anyone can read python.

          • pivot_root@lemmy.world
            link
            fedilink
            English
            arrow-up
            6
            ·
            2 months ago

            That assembly is for a DOS application. It would be more verbose for a modern Linux or Win32 application and probably require a linker script.

            But python turns that cute little line up top, into that mess at the bottom.

            Technically, not quite. Python is interpreted, so it’s more like “call the print function with this string parameter” gets fed into another program, which calls its own functions to make it happen.

            • ericbomb@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              Yeah, I was oversimplifying it a bit, and it’s funny that the stupid thing I found wasn’t even stupid enough.

              But I was mostly trying to impart that we should be happy for modern languages, because for every line you write in a modern language, it’ll do a dozen things on the back end for you that in assembly you’d need to do by hand.