• finitebanjo@lemmy.world
    link
    fedilink
    English
    arrow-up
    20
    ·
    12 hours ago

    Good shit. A carefully thought out handcrafted experience will always be better than interactive slop.

    • finitebanjo@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      ·
      edit-2
      12 hours ago

      I remember an old song “I’ll go green when they go green and they’ll go green but not really green more like aquamarine” and it appears to no longer exist on the internet.

      Another song I can’t find is about a guy who tells the story of all his past lives and in each he was a whore and someday he’ll be a whore again.

      Really wish songs would stop disappearing.

  • RizzoTheSmall@lemm.ee
    link
    fedilink
    English
    arrow-up
    38
    arrow-down
    2
    ·
    21 hours ago

    They cannot possibly assure customers that remote devs aren’t using copilots to help them code.

    • jsomae@lemmy.ml
      link
      fedilink
      English
      arrow-up
      7
      ·
      11 hours ago

      Generative AI is a technology that can create pictures, movies, audio (music or voice acting), and writing using artificial intelligence.

      By their definition of Gen AI, it’s unclear to me if the label says anything about code. I’m not sure I would consider it “writing.”

      • mke@programming.dev
        link
        fedilink
        English
        arrow-up
        4
        ·
        7 hours ago

        This might be a little off-topic, but I’ve noticed what seems to be a trend of anti-AI discourse ignoring programmers. Protect artists, writers, animators, actors, voice-actors… programmers, who? No idea if it’s because they’re partly to blame, or people are simply unaware code is also stolen by AI companies—still waiting on that GitHub Copilot lawsuit—but the end result appears to be a general lack of care about GenAI in coding.

        • jsomae@lemmy.ml
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          7 hours ago

          I think it’s because most programmers use and appreciate the tool. This might change once programmers start to blame gen AI for not having a job anymore.

          • takeda@lemm.ee
            link
            fedilink
            English
            arrow-up
            3
            ·
            2 hours ago

            I noticed a bad trend with my colleagues who use Copilot, ChatGPT, etc. They not only use them to write code, but also trust them to make design decisions, which are generally poor.

            Another thing is that those people also hate working on existing code, claiming it is too complicated and offering to write their own version of it (which also ends up complicated). I suspect it’s because Copilot doesn’t help as much when the code is more mature.

          • mke@programming.dev
            link
            fedilink
            English
            arrow-up
            4
            ·
            edit-2
            6 hours ago

            There remains a significant enclave that rejects it, but yeah, it’s definitely smaller than equivalent groups in other mentioned professions. Hopefully things won’t get that far. I think the tech is amazing, but it’s an immense shame that so many of my/our peers don’t give a flying fuck about ethics.

            • nickwitha_k (he/him)@lemmy.sdf.org
              link
              fedilink
              English
              arrow-up
              4
              ·
              6 hours ago

              There remains a significant enclave that rejects it, but yeah, it’s definitely smaller than equivalent groups in other mentioned professions.

              Reporting in.

              I think the tech is amazing, but it’s an immense shame that so many of my/our peers don’t give a flying fuck about ethics.

              Yup. Very much agreed here. There are some uses that are acceptable, but it’s a bit hard to say that any are ethical due to the ethically bankrupt foundations of its training data.

    • finitebanjo@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      12 hours ago

      Indie studio teams are pretty small, so it’s possible. I personally hate that the word Copilot ever even appears, and would never use autogenerated code, but more to the point, I’m sure the stamp refers to art, textures, and sound.

  • bia@lemmy.world
    link
    fedilink
    English
    arrow-up
    41
    arrow-down
    6
    ·
    21 hours ago

    Not sure how to interpret this. The use of any tool can be for good or bad.

    If the quality of the game is increased by the use of AI, I’m all for it. If it’s used to generate a generic mess, it’s probably not going to be interesting enough for me to notice its existence.

    If they mean that they don’t use AI to generate art and voice over, I guess it can be good for a medium to large game. But if using AI means it gets made at all, that’s better no?

    • 10001110101@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      8 hours ago

      I’d argue that even if gen-AI art is indistinguishable from human art, human art is better. E.g. when examining a painting you might be wondering what the artist was thinking of, what was going on in their life at the time, what they were trying to convey, what techniques they used and why. For AI art, the answer is simply it’s statistically similar to art the model has been trained on.

      But, yeah, stuff like game textures usually aren’t that deep (and I don’t think they’re typically crafted by hand by artists passionate about the texture).

    • jsomae@lemmy.ml
      link
      fedilink
      English
      arrow-up
      3
      ·
      11 hours ago

      I am for the most part angry that people are being put out of work by AI; I actually find AI-generated content interesting sometimes, for example AI Frank Sinatra singing W.A.P. is pretty funny. This label is helpful to me so that I know I’m supporting humans monetarily.

    • deur@feddit.nl
      link
      fedilink
      English
      arrow-up
      29
      arrow-down
      10
      ·
      21 hours ago

      People want pieces of art made by actual humans. Not garbage from the confident statistics black box.

      • Lumiluz@slrpnk.net
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 hours ago

        What if they use it as part of the art tho?

        Like a horror game that uses an AI to continuously tweak an image of the paintings in a haunted building, every time you look past them, so they look just 1% creepier?

        • mke@programming.dev
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          edit-2
          7 hours ago

          That’s an interesting enough idea in theory, so here’s my take on it, in case you want one.

          Yes, it sounds magical, but:

          • AI sucks at “make it more X.” It doesn’t understand “scary,” so you’ll get worse crops of the training data, not meaningful changes.
          • It’s prohibitively expensive and unfeasible for the majority of consumer hardware.
          • Even if it gets a thousand times cheaper and better at its job, is GenAI really the best way to do this?
          • Is it the only one? Are alternatives also built on exploitation? If they aren’t, I think you should reconsider.
          • Lumiluz@slrpnk.net
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            12 minutes ago

            • Ok, I know the researching ability of people has decreased greatly over the years, but using “knowyourmeme” as a source? Really?

            • You can now run optimized open-source diffusion models on an iPhone, and it’s been possible for years. I use that as an example because yes, there are models that can easily run on an Nvidia 1060 these days. Those models are more than enough to handle incremental changes to an image in-game.

            • It already has for a while, as demonstrated by it being able to run on an iPhone. But yes, it’s probably the best way to get an uncanny-valley effect in certain paintings in a horror game, as the alternatives would be:

            • spending many hours manually making hundreds of incremental changes to all the paintings yourself (and there will be a limit to how much they warp, and this assumes you have even better art skills)
            • hiring someone to do what I just mentioned (assumes you have a decent amount of money), which is still limited, of course.

            • I’ll call an open-source model exploitative the day someone can accurately generate an exact work it was trained on, not within 1 but at least within 10 generations. I have looked into this myself, unlike seemingly most people on the internet. Last I checked, the closest was a 90-something-percent-similarity image, after using an algorithm that modified the prompt over time, across thousands of generations. I can find this research paper myself if you want, but there may be newer research out there.

        • Dizzy Devil Ducky@lemm.ee
          link
          fedilink
          English
          arrow-up
          5
          ·
          11 hours ago

          Would the feature in that horror game Zort, where you sometimes use the player respawn item and it respawns an NPC that uses clips of what a specific dead player said while playing, count as AI use? If so, that’s a pretty good use of AI in horror games, in my opinion.

      • RampantParanoia2365@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        1
        ·
        13 hours ago

        Honest question: are things like trees, rocks, logs in a huge world like a modern RPG all placed by hand, or does it use AI to fill it out?

        • finitebanjo@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          12 hours ago

          Not AI, but certainly a semi-random function. Then they go through and clean it up by hand.

        • skibidi@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          13 hours ago

          Most games (pre-AI, at least) would use a brush for this and manually tweak the result if it ended up weird.

          E.g. if you were building a desert landscape you might use a rock brush to randomly sprinkle the boulder assets around the area. Then the bush brush to sprinkle some dry bushes.

          Very rare for someone to spend the time to individually place something like a rock or a tree, unless it’s designed to be used in gameplay or a cutscene (e.g. a climbable tree to get into a building through a window).
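
          For the curious, a scatter brush like that boils down to randomness plus hand cleanup afterwards. Here’s a minimal sketch in TypeScript, with made-up names rather than any real engine’s API:

          ```typescript
          // Toy "scatter brush": sprinkle asset instances inside a circular area.
          // All names are made up; this is not any real engine's API.
          interface Placement {
            asset: string;
            x: number;
            y: number;
            rotation: number; // degrees of yaw
            scale: number;
          }

          function scatterBrush(
            centerX: number,
            centerY: number,
            radius: number,
            density: number,   // rough instance count per unit of area
            assets: string[],  // e.g. ["boulder_a", "dry_bush"]
          ): Placement[] {
            const count = Math.floor(density * Math.PI * radius * radius);
            const placements: Placement[] = [];
            for (let i = 0; i < count; i++) {
              // Uniform random point inside the brush circle.
              const angle = Math.random() * 2 * Math.PI;
              const dist = radius * Math.sqrt(Math.random());
              placements.push({
                asset: assets[Math.floor(Math.random() * assets.length)],
                x: centerX + dist * Math.cos(angle),
                y: centerY + dist * Math.sin(angle),
                rotation: Math.random() * 360,    // random yaw so repeats are less obvious
                scale: 0.8 + Math.random() * 0.4, // slight size variation
              });
            }
            return placements;
          }

          // Paint a patch of desert, then hand-delete anything that looks weird.
          const rocks = scatterBrush(120, 45, 10, 0.05, ["boulder_a", "boulder_b", "dry_bush"]);
          console.log(rocks.length, rocks[0]);
          ```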

          • TwanHE@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            11 hours ago

            That’s only for open-world maps; in many games the placement of rocks and trees is subject to minuscule changes for balance reasons.

      • Pennomi@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        arrow-down
        6
        ·
        19 hours ago

        It’s all virtue signaling. If it’s good, nobody will be able to notice anyway and they’ll want it regardless. The only reason people shit on AI currently is because expert humans are still far better than it.

        We’re just at that awkward point in time where AI is better than the random joe but worse than experts.

        • mke@programming.dev
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          7 hours ago

          The only reason people shit on AI currently is because expert humans are still far better than it.

          No it’s not! There are a whole bunch of reasons why people dislike the current AI wave, from artist exploitation, to energy consumption, to making horrible shitty people and companies richer while trying to obviate people’s jobs!

          You’re so far off, it’s insane. That’s like saying people only hate slavery because the slaves can’t match craftsmen yet. Just wait a bit until they finish training the slaves, just a few more whippings, then everyone will surely shut up.

          • Pennomi@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            6 hours ago

            I agree those are the reasons people give, but if history has shown anything, it’s that people change their minds when it becomes convenient to use a technology.

            Human ethics is highly dependent on convenience, unfortunately.

      • otp@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        4
        ·
        14 hours ago

        One of my favourite games used procedural generation to create game “art”, “assets”, and “maps”.

        That could conceivably be called (or enhanced by) ML today, which could conceivably be called AI today.

        But even in modern games, I’m not opposed to mindful usage of AI in games. I don’t understand why you’re trying to speak for everyone (by saying “people”) when you’re talking to someone who doesn’t share your view.

        This is like those stupid “non-GMO” stickers. Yes, GMOs are being abused by Monsanto (and probably other corporations like them). No, that doesn’t mean that GMOs are bad in all cases.

        • finitebanjo@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          12 hours ago

          I think the sort of generative AI referred to is something that trains on data to approximate results, which consumes vast amounts more power.

      • piccolo@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        9
        ·
        18 hours ago

        Humans are confident statistical black boxes. Art doesn’t have to be made by a human to be inspiring.

          • SchmidtGenetics@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            2
            ·
            edit-2
            12 hours ago

            What do you think Grammarly is, dude? Glorified spell check and autocorrect, which people already use every day. But of course new tools are looked down upon; the hypocrisy of people is amazing to see. It comes in cycles: people hated spell check, got used to it, and now it’s everywhere in daily life. Autocorrect, same thing is happening.

            And now the same is happening again. If they want to claim no AI, then no spell check, no autocorrect, and no Grammarly for emails either. Everyone already uses “AI” every day. But theirs is acceptable… okay…

            • finitebanjo@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              arrow-down
              1
              ·
              12 hours ago

              Right, but detecting close-enough spellings and word orders using a curated index or catalogue of accepted examples is one thing.

              Training layers of algorithms, on layers of machines, on massive datasets to come up with close-enoughs would accomplish the same thing at many times the cost.

              You would be a moron to use LLMs for spellchecking.

              To clarify: not all programs are equal. It’s not all different methods of doing the same thing at the same cost.
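
              For reference, a classic spellchecker is little more than edit distance against a curated word list. A toy sketch in TypeScript, with a tiny made-up dictionary, not any real product’s code:

              ```typescript
              // Toy spellchecker: edit distance against a fixed, curated word list.
              // Tiny made-up dictionary; not any real product's code.
              function editDistance(a: string, b: string): number {
                // Levenshtein distance: minimum number of single-character edits.
                let prev = Array.from({ length: b.length + 1 }, (_, j) => j);
                for (let i = 1; i <= a.length; i++) {
                  const curr = [i];
                  for (let j = 1; j <= b.length; j++) {
                    curr[j] = Math.min(
                      prev[j] + 1,     // deletion
                      curr[j - 1] + 1, // insertion
                      prev[j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1), // substitution
                    );
                  }
                  prev = curr;
                }
                return prev[b.length];
              }

              const DICTIONARY = ["spell", "check", "correct", "catalogue", "example"];

              function suggest(word: string, maxEdits = 2): string[] {
                return DICTIONARY
                  .map((w) => ({ w, d: editDistance(word.toLowerCase(), w) }))
                  .filter((c) => c.d <= maxEdits)
                  .sort((x, y) => x.d - y.d)
                  .map((c) => c.w);
              }

              console.log(suggest("exsample")); // ["example"]
              ```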

        • Fonzie!@ttrpg.network
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          4
          ·
          15 hours ago

          That’s not art, that’s a tool. Tools can be made better through a confident statistics box.

  • Lem Jukes@lemm.ee
    link
    fedilink
    English
    arrow-up
    10
    arrow-down
    11
    ·
    edit-2
    9 hours ago

    This feels discouraging as someone who struggled with learning programming for a very long time; only with the aid of Copilot have I finally crossed the hurdles I was facing and felt like I was actually learning and progressing again.

    Yes, I’m still interacting with and manually adjusting and even writing sections of code. But a lot of what Copilot does for me is interpret my natural-language understanding of how I want to manipulate the data and translate it into actual code, which I then work with and combine with the rest of the project.

    But I’ve stopped looking to join any game jams, because it seems that even when they don’t have an explicit ban against all AI, the sentiment I get is that people feel like it’s cheating and look down on someone in my situation. I get that submitting AI slop wholesale is just garbage. But it feels like putting these blanket “no AI content” stamps and badges on things excludes a lot of people.

    Edit:

    Is this slop? https://lemjukes.itch.io/ascii-farmer-alpha https://github.com/LemJukes/ASCII-Farmer

    Like I know it isn’t good code, but I’m entirely self-taught and it seems to work (and more importantly, I mostly understand how it works), so what’s the fucking difference? How am I supposed to learn without iterating? If anyone human wants to look at my code and tell me why it’s shit, that’d actually be really helpful and I’d genuinely be thankful.

    *except whoever actually said that in the comment replies. I blocked you, so I won’t see any more from you anyway. Also, piss off.

    • Probius
      link
      fedilink
      English
      arrow-up
      1
      ·
      8 hours ago

      I like to use AI autocomplete when programming not because it solves problems for me (it fucking sucks at that if you’re not a beginner), but because it’s good at literally just guessing what I want to do next so I don’t have to type it out. If I do something to the X coordinate, I probably want to do the same/similar thing to the Y and Z coordinates and AI’s really good at picking up that sort of thing.
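
      For example, with something as repetitive as per-axis math, typing the first line is usually enough for the suggestion to fill in the rest. A toy TypeScript snippet (names are made up) of the kind of pattern it picks up on:

      ```typescript
      // The kind of repetitive, patterned code where autocomplete shines:
      // after typing the X line, the suggested Y and Z lines are usually exact.
      interface Vec3 { x: number; y: number; z: number; }

      function integrate(position: Vec3, velocity: Vec3, dt: number): void {
        position.x += velocity.x * dt;
        position.y += velocity.y * dt; // <- typically offered as-is
        position.z += velocity.z * dt; // <- same
      }

      const p: Vec3 = { x: 0, y: 0, z: 0 };
      integrate(p, { x: 1, y: 2, z: 3 }, 0.5);
      console.log(p); // { x: 0.5, y: 1, z: 1.5 }
      ```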

        • finitebanjo@lemmy.world
          link
          fedilink
          English
          arrow-up
          6
          arrow-down
          1
          ·
          edit-2
          9 hours ago

          Firstly, a calculator doesn’t have a double-digit percent chance of bullshitting you with made-up information.

          If you’ve ever taken a calculus course, you likely were not allowed to use a calculator that could solve your problems for you, and you likely had to show all of your math on paper. So yes, that statement is correct.

      • Lumiluz@slrpnk.net
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        3
        ·
        11 hours ago

        Same vibes as “if you learned to draw with an iPad then you didn’t actually learn to draw”.

        Or in my case, I’m old enough to remember “computer art isn’t real animation/art” and also the criticism against Photoshop.

        And there were plenty of people who criticized Andy Warhol before then, too.

        Go back in history and you can read about criticisms of using typewriters over hand writing as well.

        • finitebanjo@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          9 hours ago

          None of your examples are even close to a comparison with AI which steals from people to generate approximate nonsense while costing massive amounts of electricity.

          • Lumiluz@slrpnk.net
            link
            fedilink
            English
            arrow-up
            1
            ·
            41 minutes ago

            Have you ever looked at the file size of something like Stable Diffusion?

            Considering the data it’s trained on, do you think it’s;

            A) 3 Petabytes
            B) 500 Terabytes
            C) 900 Gigabytes
            D) 100 Gigabytes

            Second, what’s the electrical cost of generating a single image using Flux vs. 3 minutes of Baldur’s Gate, or similar, on max settings?

            Surely you must have some idea on these numbers and aren’t just parroting things you don’t understand.

    • Demigodrick@lemmy.zipM
      link
      fedilink
      English
      arrow-up
      9
      arrow-down
      3
      ·
      edit-2
      17 hours ago

      FWIW I agree with you. The people who say they don’t support these tools come across as purists or virtue signallers.

      I would agree with not having AI art* or music and sounds. In games I’ve played that use it, it sounds so out of place.

      However support to make coding more accessible with the use of a tool shouldn’t be frowned upon. I wonder if people felt the same way when C was released, and they thought everyone should be an assembly programmer.

      The irony is that most programmers were just googling and getting answers from stackoverflow, now they don’t even need to Google.

      *unless the aim is procedurally generated games, I guess, but if they’re using assets I get not using AI-generated ones.

      • mke@programming.dev
        link
        fedilink
        English
        arrow-up
        2
        ·
        7 hours ago

        The people who say they don’t support these tools come across as purists or virtue signallers.

        It is now “purist” to protest against the usage of tools that by and large steal from the work of countless unpaid, uncredited, unconsenting artists, writers, and programmers. It is virtue signaling to say I don’t support OpenAI or their shitty capital chasing pig-brethren. It’s fucking “organic labelling” to want to support like-minded people instead of big tech.

        Y’all are ridiculous. The more of this I see, the more radicalized I get. Cool tech, yes, I admit! But wow, you just want to sweep all those pesky little ethical issues aside because… it makes you more productive? Shit, it’s like you’re competing with Altman on the unlikeability ranking.

      • chaos@beehaw.org
        link
        fedilink
        English
        arrow-up
        4
        ·
        9 hours ago

        The irony is that most programmers were just googling and getting answers from stackoverflow, now they don’t even need to Google.

        That’s the thing, though, doing that still requires you to read the answer, understand it, and apply it to the thing you’re doing, because the answer probably isn’t tailored to your exact task. Doing this work is how you develop an understanding of what’s going on in your language, your libraries, and your own code. An experienced developer has built up those mental muscles, and can probably get away with letting an AI do the tedious stuff, but more novice developers will be depriving themselves of learning what they’re actually doing if they let the AI handle the easy things, and they’ll be helpless to figure out the things that the AI can’t do.

        Going from assembly to C does put the programmer at some distance from the reality of the computer, and I’d argue that if you haven’t at least dipped into some assembly and at least understand the basics of what’s actually going on down there, your computer science education is incomplete. But once you have that understanding, it’s okay to let the computer handle the tedium for you and only dip down to that level if necessary. Or learning sorting algorithms, versus just using your standard library’s sort() function, same thing. AI falls into that category too, I’d argue, but it’s so attractive that I worry it’s treating important learning as tedium and helping people skip it.

        I’m all for making programming simpler, for lowering barriers and increasing accessibility, but there’s a risk there too. Obviously wheelchairs are good things, but using one simply “because it’s easier” and not because you need to will cause your legs to atrophy, or never develop strength in the first place, and I’m worried there’s a similar thing going on with AI in programming. “I don’t want to have to think about this” isn’t a healthy attitude to have, a program is basically a collection of crystallized thoughts and ideas, thinking it through is a critical part of the process.

        • Lem Jukes@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          ·
          9 hours ago

          I know you’re replying to a reply here, but do people think I mean just putting in a prompt and then running the output and calling that something I made?

          I’ve spent years trying to teach myself how to code but always inevitably would lose track of some part or get stuck on some bug or issue I alone couldn’t get past. I went to theatre school for chrissakes and I just wanna make games and silly little projects. I don’t have any friends in this field and pestering random people in discords or on stack overflow can be really annoying for those people.

          So why is using an AI assistant that I can berate with as many terse questions as I want, to iterate on code that I’d normally spend hours struggling just to remember and string together, such a big stick people are putting up their butts?

          • chaos@beehaw.org
            link
            fedilink
            English
            arrow-up
            2
            ·
            9 hours ago

            I’ll acknowledge that there’s definitely an element of “well I had to do it the hard way, you should too” at work with some people, and I don’t want to make that argument. Code is also not nearly as bad as something like image generation, where it’s literally just typing a thing and getting a not-very-good image back that’s ready to go; I’m sure if you’re making playable games, you’re putting in more work than that because it’s just not possible to type some words and get a game out of it. You’ll have to use your brain to get it right. And if you’re happy with the results you get and the work you’re doing, I’m definitely not going to tell you you’re doing it wrong.

            (If you’re trying to make a career of software engineering or have a desire to understand it at a deeper level, I’d argue that relying heavily on AI might be more of a hindrance to those goals than you know, but if those aren’t your goals, who cares? Have fun with it.)

            What I’m talking about is a bigger picture thing than you and your games; it’s the industry as a whole. Much like algorithmic timelines have had the effect of turning the internet from something you actively explored into something you passively let wash over you, I’m worried that AI is creating a “do the thinking for me” button that’s going to be too tempting for people to use responsibly, and will result in too much code becoming a bunch of half-baked AI slop cobbled together by people who don’t understand what they’re really doing. There’s already enough cargo culting around software, and AI will just make it more opaque and mysterious if overused and over-relied on. But that’s a bigger picture thing; just like I’m not above laying back and letting TikTok wash over me sometimes, I’m glad you’re doing things you like with the assistance you get. I just don’t want that to become the only way things happen either.

            • Lem Jukes@lemm.ee
              link
              fedilink
              English
              arrow-up
              1
              ·
              7 hours ago

              Thanks for the thoughtful reply. I’m in the first boat of just wanting to make games and other small, self-driven projects. I think it’s mostly the feeling of being excluded from participating in things like game jams and the larger game development community because I use a specific tool.

              In an effort to clarify what I think is an example of a middle ground between no AI code gen, period, and, as you put it, “do the thinking for me”, let me see if I can put it in similar terms. Instead of “do it for me”, it’s very much a back and forth of “I want this behavior when these conditions are met for this function and expect these types of outcomes.” Copilot then generates code, referencing the rest of the codebase, and I usually manually copy and paste chunks over to the working files and then compile & run from there for testing.

              I definitely agree that over-reliance on tools as a means of masking a real understanding of a subject is a genuine problem. And I too hope it doesn’t end up having the same kind of effect algorithmic social media has had on society as a whole. But I do have hope that it will enable a subset of people like me who struggle with the rote memorization aspects of computer programming but still desire the thrill of putting some pieces together and watching it work.

    • otp@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      4
      ·
      14 hours ago

      Back in the day, people hated Intellisense/auto-complete.

      And back in the older day, people hated IDEs for coding.

      And back in the even older day, people hated computers for games.

      There’ll always be people who hate new technology, especially if it makes something easier that they used to have to do “the hard way”.

  • Skullgrid@lemmy.world
    link
    fedilink
    English
    arrow-up
    17
    arrow-down
    17
    ·
    20 hours ago

    This is stupid; there are SO many indie games using procedural generation, which is fucking generative AI. It’s in a shitload of them, from Spelunky to Darkest Dungeon 2.

    • Paradachshund@lemmy.today
      link
      fedilink
      English
      arrow-up
      11
      ·
      13 hours ago

      To be fair to the people protesting this isn’t what they’re objecting to. They don’t like tools which were built on theft, which all the major LLMs were. That’s the core issue, along with the fear that artists will be devalued and replaced because of them.

      • jsomae@lemmy.ml
        link
        fedilink
        English
        arrow-up
        4
        ·
        11 hours ago

        There are many reasons that people dislike gen AI; you can’t be sure that it’s because they dislike how it’s built on theft. Here are three different unrelated reasons to dislike gen AI:

        • it puts people out of work;
        • it’s built on theft;
        • it produces “slop” in large quantities
    • parlaptie@feddit.org
      link
      fedilink
      English
      arrow-up
      22
      arrow-down
      2
      ·
      16 hours ago

      Procedural generation is generative, but it ain’t AI. It especially has nothing in common with the exploitative practices of genAI training.

      • Lumiluz@slrpnk.net
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        4
        ·
        11 hours ago

        “AI” is just very advanced procedural generation. There have been games that used image diffusion in the past too, just on a far smaller and more limited scale (such as a single creature, like the Pokémon with the spinning eyes).

        • Probius
          link
          fedilink
          English
          arrow-up
          6
          ·
          8 hours ago

          To me, what makes the difference is whether or not it’s trained on other people’s shit. The distinction between AI and an algorithm is pretty arbitrary, but I wouldn’t consider, for example, procedural generation via the wave function collapse algorithm to have the same moral implications as selling something using what most people would call AI-generated content.
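
          For what it’s worth, wave function collapse style generation really is just hand-authored adjacency rules plus randomness, with no training data anywhere. A heavily simplified one-dimensional sketch in TypeScript, with made-up tile names:

          ```typescript
          // Toy 1-D "wave function collapse": hand-authored adjacency rules plus
          // randomness. No training data anywhere, just the designer's constraints.
          const RULES: Record<string, string[]> = {
            sea: ["sea", "beach"],
            beach: ["sea", "beach", "grass"],
            grass: ["beach", "grass", "forest"],
            forest: ["grass", "forest"],
          };

          function collapseRow(length: number): string[] {
            // Each cell starts with every tile still possible.
            const cells: Set<string>[] = Array.from({ length }, () => new Set(Object.keys(RULES)));

            // Narrow neighbours of cell i to compatible tiles, rippling outward while anything changes.
            function propagate(i: number): void {
              const stack = [i];
              while (stack.length > 0) {
                const j = stack.pop()!;
                const allowed = new Set<string>();
                for (const t of cells[j]) RULES[t].forEach((n) => allowed.add(n));
                for (const k of [j - 1, j + 1]) {
                  if (k < 0 || k >= length) continue;
                  const narrowed = [...cells[k]].filter((t) => allowed.has(t));
                  if (narrowed.length < cells[k].size) {
                    if (narrowed.length === 0) throw new Error("contradiction, retry");
                    cells[k] = new Set(narrowed);
                    stack.push(k);
                  }
                }
              }
            }

            // Repeatedly collapse the most-constrained undecided cell, then propagate.
            while (cells.some((c) => c.size > 1)) {
              let best = -1;
              for (let i = 0; i < length; i++) {
                if (cells[i].size > 1 && (best === -1 || cells[i].size < cells[best].size)) best = i;
              }
              const options = [...cells[best]];
              cells[best] = new Set([options[Math.floor(Math.random() * options.length)]]);
              propagate(best);
            }
            return cells.map((c) => [...c][0]);
          }

          console.log(collapseRow(12)); // e.g. ["sea","sea","beach","grass","grass","forest", ...]
          ```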

          • Lumiluz@slrpnk.net
            link
            fedilink
            English
            arrow-up
            1
            ·
            38 minutes ago

            And if you train an open source model yourself so it can generate content specifically on work you’ve created? Or are you against certain Linux devices too?

        • jsomae@lemmy.ml
          link
          fedilink
          English
          arrow-up
          4
          ·
          edit-2
          11 hours ago

          By this logic, literally any code is genAI.

          Has a branch statement? It makes decisions. Displays something on the screen, even by stdout? Generated content.

        • Railcar8095@lemm.ee
          link
          fedilink
          English
          arrow-up
          11
          ·
          15 hours ago

          It doesn’t make decisions, but neither does Gen AI. Not sure if you’re doubly wrong or half right.

          But it’s not Gen AI.

        • parlaptie@feddit.org
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          3
          ·
          16 hours ago

          As I touched on previously, those aren’t the qualities that make people opposed to AI. But have fun arguing dictionary definitions.

    • Pennomi@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      ·
      19 hours ago

      Ah, but remember that AI no longer means what it has meant since the dawn of computing; it now means “I don’t understand the algorithm, therefore it’s AI”.

      Hell, AI used to mean mundane things like A* pathfinding, which is in like, every game ever.

      I’m really tired of the shift in what AI means.
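
      For anyone who came in after the shift, that older sense of “game AI” was plain search. A toy A* over a grid in TypeScript, purely to illustrate how mundane it is:

      ```typescript
      // Toy A* on a 2-D grid (0 = walkable, 1 = wall). Illustrative only;
      // a real game would use a proper priority queue instead of re-sorting.
      type Cell = [row: number, col: number];

      function astar(grid: number[][], start: Cell, goal: Cell): Cell[] | null {
        const key = (c: Cell) => `${c[0]},${c[1]}`;
        const h = (c: Cell) => Math.abs(c[0] - goal[0]) + Math.abs(c[1] - goal[1]); // Manhattan distance
        const open: { cell: Cell; g: number; f: number }[] = [{ cell: start, g: 0, f: h(start) }];
        const cameFrom = new Map<string, Cell | null>([[key(start), null]]);
        const gCost = new Map<string, number>([[key(start), 0]]);

        while (open.length > 0) {
          // Pop the node with the lowest f = g + h (linear sort keeps the demo short).
          open.sort((a, b) => a.f - b.f);
          const { cell, g } = open.shift()!;
          if (cell[0] === goal[0] && cell[1] === goal[1]) {
            const path: Cell[] = [];
            for (let c: Cell | null = cell; c !== null; c = cameFrom.get(key(c)) ?? null) {
              path.push(c);
            }
            return path.reverse();
          }
          for (const [dr, dc] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
            const next: Cell = [cell[0] + dr, cell[1] + dc];
            const inBounds =
              next[0] >= 0 && next[0] < grid.length &&
              next[1] >= 0 && next[1] < grid[0].length;
            if (!inBounds || grid[next[0]][next[1]] === 1) continue;
            if (g + 1 < (gCost.get(key(next)) ?? Infinity)) {
              gCost.set(key(next), g + 1);
              cameFrom.set(key(next), cell);
              open.push({ cell: next, g: g + 1, f: g + 1 + h(next) });
            }
          }
        }
        return null; // unreachable
      }

      const level = [
        [0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
      ];
      console.log(astar(level, [0, 0], [2, 0])); // routes around the wall row via [1, 2]
      ```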

      • otp@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        4
        ·
        14 hours ago

        I remember we used to refer to enemy logic as AI. The 4 Pac-Man ghosts each had different “AI”. The AI of the enemies in this FPS sucks. This kind of stuff, lol