• Flying Squid@lemmy.world
      link
      fedilink
      English
      arrow-up
      95
      arrow-down
      2
      ·
      1 year ago

      You mean OpenAI didn’t just create a superintelligent artificial brain that will surpass all human ability and knowledge and make our species obsolete?

      • kescusay@lemmy.world

        The funny thing is, last year when ChatGPT was released, people freaked out about the same thing.

        Some of it was downright gleeful. Buncha people told me my job (I’m a software developer) was on the chopping block, because ChatGPT could do it all.

        Turns out, not so much.

        I swear, I think some people really want to see software developers lose their jobs, because they hate what they don’t understand, and they don’t understand what we do.

        • enkers@sh.itjust.works

          As a software developer, I do want to see software developers lose their jobs to AI. This shouldn’t be surprising, as the purpose of a lot of software development is to put other people out of a job via automation, and that’s fundamentally a good thing. The alternative is like wanting a return to preindustrial society. Automation generally raises quality of life.

          The real problem is that we still haven’t figured out how to distribute the benefits of society’s automation efforts equitably so that they raise quality of life for everyone.

          • nicetriangle@kbin.social

            Yeah that would be all fine and well if it meant we’re on track for some post-work egalitarian utopia but you and I know that’s not at all where this is heading.

            • FaceDeer@kbin.social

              Unfortunately based on what I know of history it seems likely that humanity won’t ever be on track to build a post-work egalitarian utopia until we’ve got no other option left. So I support going ahead with this tech because that seems like a good way to force the issue. The transition period will be rough, but better than stagnation IMO.

            • enkers@sh.itjust.works

              Oh, for sure, it’ll definitely further wealth disparity, as automation always seems to in a capitalist system. But that’s a societal problem that we continually have to address, and it spans nearly all fields of human work to varying degrees.

              Fortunately, for the most part tech advancements are very hard to control. Progress can be impeded from spreading, but not stopped, and it means the average individual has access to more and more powerful tools.

          • Eldritch@lemmy.world

            We’ve figured it out. A start was made on it in the 19th and 20th centuries. However, those with the means have spent the last 100 years screaming bloody murder, dismantling government and any progress that had been made to address it, and invading and overthrowing any foreign group that thought about opposing them.

              • Eldritch@lemmy.world

                It’s all well planned out. Step one is getting money out of politics. So no, it’s not viable unfortunately.

        • Flying Squid@lemmy.world

          Even if ChatGPT advances far beyond where it is now in terms of writing code, at the very least you’re still going to need people to go over the code as a redundancy. Who is going to trust an AI so much that they will be willing to risk it making coding errors? I think that the job of at least understanding how code works will be safe for a very long time, and I don’t think ChatGPT will get that advanced for a very long time either, if ever.

          • kescusay@lemmy.world

            There’s more to it than that, even. It takes a developer’s level of knowledge to even begin to tell ChatGPT to make something sensible.

            Sit an MBA down in front of a ChatGPT window and tell them to make an application. The application has to save state, it has to use the company’s OAuth login system, it has to store data in a PostgreSQL database, and it has to have granular, roles-based access control.

            Then watch the MBA struggle because they don’t understand that…

            • Saving state is going to vary depending on the front-end. Are we writing a browser application, a desktop application, or a mobile application? The MBA doesn’t know and doesn’t understand what to ask ChatGPT to do.
            • OAuth is a service running separately to the application, and requires integration steps that the MBA doesn’t know how to do, or ask ChatGPT to do. Even if they figure out what OAuth is, ChatGPT isn’t trained on their particular corporate flavor for integration.
            • They’re actually writing two different applications, a front-end and a back-end. The back-end is going to handle communication with PostgreSQL services. The MBA has no idea what any of that means, let alone know how to ask ChatGPT to produce the right code for separate front-end and back-end features.
            • RBAC is also probably a separate service, requiring separate integration steps. Neither the MBA nor ChatGPT will have any idea what those integration steps are.
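            To make that concrete: even the very first step of the OAuth item alone, just sending the user to the corporate login page, already demands details an MBA wouldn’t know to ask for. A minimal sketch, where every endpoint, client ID, and scope is invented for illustration:

            ```python
            from urllib.parse import urlencode

            # All of these values are hypothetical -- in real life they come from
            # the company's identity team, and ChatGPT has no way to know them.
            AUTH_BASE = "https://auth.example-corp.internal/oauth2/authorize"
            CLIENT_ID = "my-app-client-id"
            REDIRECT_URI = "https://app.example-corp.internal/callback"

            def build_authorization_url(state: str) -> str:
                """Build the OAuth 2.0 authorization-code redirect URL."""
                params = {
                    "response_type": "code",
                    "client_id": CLIENT_ID,
                    "redirect_uri": REDIRECT_URI,
                    "scope": "openid profile",
                    "state": state,  # anti-CSRF token; the back-end must verify it on return
                }
                return f"{AUTH_BASE}?{urlencode(params)}"
            ```

            And that’s before token exchange, refresh, session storage, or any of the corporate-specific integration quirks.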

            The level of knowledge and detail required to make ChatGPT produce something useful on a large scale is beyond an MBA’s skillset. They literally don’t know what they don’t know.

            I use an LLM in my job now, and it’s helpful. I can tell it to produce snippets of code for a specific purpose that I know how to describe accurately, and it’ll do it. Saves me time having to do it manually.

            But if my company ever decided it didn’t need developers anymore because ChatGPT can do it all, it would collapse inside six months, and everything would be broken due to bad pull requests from non-developers who don’t know how badly they’re fucking up. They’d have to rehire me… And I’d be asking for a lot more money to clean up after the poor MBA who’d been stuck trying to do my job.

              • kescusay@lemmy.world

                You’re welcome! And it occurs to me that the fact that it took a developer to explain all of that is an object lesson in why ChatGPT won’t end software development as a career option - and believe me, I simplified it for a non-developer audience.

          • thisfro@slrpnk.net

            “Who is going to trust an AI so much that they won’t risk it making coding errors?”

            Sadly, too many

              • Jeena@jemmy.jeena.net

                I don’t believe it. If it’s good enough, then they will ship and make money, and those who put people on reviewing it will be so slow that they’ll simply be outperformed by those who don’t.

                • Flying Squid@lemmy.world

                  If your code doesn’t work because you rely entirely on an AI to do it, you don’t have a business you can run unless you want to go back to paper and pencil.

                  • Jeena@jemmy.jeena.net

                    If your code doesn’t work because you rely on humans understanding it, you don’t have a business you can run either. We’re already at the point where humans have no idea why the computer makes this or that decision, because it’s so complex, especially with all the machine learning, training data, etc. Let’s not pretend it will get less complex with time.

          • nicetriangle@kbin.social

            That’s a fuckin bleak outcome for a lot of people if the job transition goes from \ to \

            That’s like being an artist and being told your job now is simply to fix the shitty hands Midjourney draws. And your job will only last as long as that remains a problem.

            • Flying Squid@lemmy.world

              Hey, I didn’t say the future would be bright, just that it will still need people familiar with code for the foreseeable future. At least until the Earth heats up so much that the lack of potable water and the unsurvivable high temperatures destroy civilization.

          • archomrade [he/him]@midwest.social

            It isn’t surprising that this is the way we conceptualize the potential impact of AI, but it’s frustrating to see it tossed around as if AI disruption is a foregone conclusion.

            AI will start re-defining the problems that code is written to solve long before we get anywhere close to GPT models replacing human workers, and that’s a big enough problem by itself.

            It used to be that before code could even be employed to solve a problem, it had to be understood procedurally. That’s increasingly not the case, given that ML is routinely employed to decode things that were previously thought to be too chaotic to be understood, like brain waves and image pixel data. I don’t know why we’re so sure of ourselves that machine learning is just a gimmick and poses no real threat, just because anthropomorphizing it seems silly.

        • dustyData@lemmy.world

          Your comment reminds me of the cesspit of Xitter, with the generative AI bros trying to conflate AI with assistive tech. They seriously argued that “artistically impaired” was a genuine disability, and that they were entitled to generative AI training sets because it allowed them to draw. It was the most disingenuous argument: that they had a right to steal artists’ work, and leave them without income, to train their AI, because they couldn’t be bothered to rub a pen against some paper.

          • Peanut

            Hey! Artist here. I love drawing. My hands go numb within minutes and they shake more every year. I appreciate having a tool and medium that allows great artistic control despite these facts.

            Now, if you’re really butthurt about the training data you can use adobe’s proprietary model. I for one think it’s good that peasants have an open available tool that isn’t owned by adobe, even if it was trained less proprietarily.

            This anger about it reminds me of deviant art artists getting mad at each other for “copying my style”

            And the fact that copyright used to be about the general good and the promotion of creative works.

            This world needs new artistic priorities. Pen and paper aren’t losing their place, but new tech will lead to independent artists creating entire movies, games, and holodeck style experiences without looming overhead of whatever art oligarch holds the funding.

            • dustyData@lemmy.world

              Au contraire. Art oligarchs will own every single thing you make with AI. You’re not being liberated, you’re being further imprisoned, and they got you to cheer for the jailers.

              ADD: remember, the final goal of the technocrats is not to make more artists. But to remove the artist from the art altogether.

              • FaceDeer@kbin.social

                “Art oligarchs will own every single thing you make with AI.”

                No, where are you getting that from? I’m not even sure how to refute that, it’s nonsensical.

                There might be some AI services out there that try to use some sort of ToS to “claim” anything you generate using them, but any such service would be radioactive to a serious artist. Just use a different one, or run the AI locally yourself.

          • RobotToaster@mander.xyz

            I mean, I was literally diagnosed with “clumsy child syndrome” (we call it dyspraxia now) as a kid, in part because I’m artistically impaired.

            • Jesus_666@feddit.de

              Fair point, although there is a difference between “can’t make a reasonable drawing with instruction, at the level of one’s classmates” and “never progressed beyond very basic drawing skills because you never practiced.”

        • nicetriangle@kbin.social

          Lotta people have already lost jobs because of it. I know a few personally. People with college educations. We’re just getting started with this, it will get worse.

          • kescusay@lemmy.world

            In software development? Not many - and certainly not at smart companies.

            ChatGPT is a tool. It goes in the developer toolbox because it’s useful. But it doesn’t replace the developer, any more than a really good screwdriver replaces the construction worker.

            More and more, understanding how to use LLMs for software development will be a job requirement, and developers who can’t adapt to that may find themselves unemployed. But most of us will adapt to it fine.

            I have. I’m using Copilot these days. It’s great. And the chances of it replacing me are roughly 0%, because it doesn’t actually know anything about our applications, and if told to make code by someone else who doesn’t know anything about them either, it’ll make useless garbage.

            • nicetriangle@kbin.social

              Yeah, so your job is harder to fully replace with AI at this moment than jobs like copywriting, narration, or illustration. Enjoy it while it lasts, because the days are numbered.

              And before all those jobs are gone, people using AI tools like your Copilot will be more productive requiring less headcount. At the same time there will still be a lot of people seeking work, but now with fewer jobs there will be downward pressure on wages.

              • kescusay@lemmy.world

                In the early 2000s, there was all this panic about how these newfangled languages and tools were going to obliterate the developer job market. They were too easy to use! Too simple for non-developers to pick up! Why, you could almost code in plain English now! Developers are doooooooomed!

                Instead, demand for developers shot through the roof, because the barrier to entry for developing applications had been lowered enough that adding staff developers to your employee roster became a no-brainer.

                Part of the problem is one of precision instructions. Instructions that are comprehensive, specific, detailed, and precise enough to be turned into programs are what we call code, and ChatGPT doesn’t change that. It can only do what you tell it to do.

                Maybe someday, a large language model will be so sophisticated, you can say something like, “Write a program to do X, Y, and Z. It uses (address) for authentication, and (other address) for storing data. Here are the credentials to use for each. (Credentials). Your repo is at (address). You deploy the front-end at (address) and the back-end at (address). Your pipelines should be written using (file) as a template.” And maybe what it does with that will truly be able to replace me.

                But I genuinely doubt it. I glossed over an enormous amount of detail in that example. If I add it in, what it’ll start looking like is, well, more code.
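                As a sketch of the point: write that hypothetical prompt out with nothing glossed over, and it already reads like a config file, which is to say, code. Every address and name below is invented:

                ```python
                # Hypothetical, fully specified "prompt" for the imagined future LLM.
                # By the time every detail is pinned down, it is effectively code.
                app_spec = {
                    "auth": {
                        "provider_url": "https://auth.example.com",
                        "credentials_env_var": "AUTH_CLIENT_SECRET",  # never inline secrets
                    },
                    "storage": {
                        "backend": "postgresql",
                        "dsn_env_var": "DATABASE_URL",
                    },
                    "repo": "https://git.example.com/org/app.git",
                    "deploy": {
                        "frontend": "https://app.example.com",
                        "backend": "https://api.example.com",
                    },
                    "pipeline_template": "ci/pipeline-template.yml",
                }

                def required_keys_present(spec: dict) -> bool:
                    """Sanity-check that the 'prompt' actually specifies every subsystem."""
                    return all(k in spec for k in ("auth", "storage", "repo", "deploy"))
                ```

                Someone still has to know what every one of those keys means and where the values come from.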

                  • kescusay@lemmy.world

                    I didn’t find it surprising, because I do stuff like it every day (except I’m using Copilot integrated into my IDE, not ChatGPT).

                    What he did was very cool. But he’s a game developer who already knows all the parts he needs and what to ask for, and he still has to do a lot of work by hand. He glossed over it quickly, but there are parts where he had to add code to specific, already-existing blocks of code in his program, and in order to do that, he had to know and understand what the current code was doing.

                    And throughout the video, he had to know not only what to ask for, but how to ask for it. That takes experience and understanding.

                    It’s possible that eventually, programming for a lot of people will mean expertise in interacting with large language models + lesser expertise in the actual programming language, but I don’t see that as likely to end programming as we know it. In fact, it might cause a surge in developer demand as the bar to entry lowers again, much like it did in the 2000s. And there will always be demand for people with a deep understanding of the actual code, because they’ll be necessary for things like performance improvement, bug fixing… and writing the next generation of large language models.

        • R0cket_M00se@lemmy.world

          Those people were always misinformed.

          At some point in the future AI will replace most programmers, because AI will allow senior devs to automate large portions of their codebase. Human devs will act more like QA, fixing the small errors during the automation process.

          Either way, it’s a tool to be used by you to multiply your efforts, not one to replace you.

      • Hyperreality@kbin.social

        You laugh now, but just you wait. If it turns out they’ve created a hyperintelligent waifu/husbando, this will inevitably lead to plummeting birth rates and the end of human civilisation.

    • kromem@lemmy.world

      Because that’s the version that gets posted and gets clicked on.

      A dry technical writeup looking at the name of the project and how it indicates this is a different approach more in line with DeepMind’s work and what that means in the context of doing high school level math is going to be interesting to only a handful of people.

      But an article that’s contentious and gets hundreds of comments about how “AI is BS” to “AI is dangerous” all arguing with each other drives engagement.