Here’s some context for the question. When image-generating AIs became available, I tried them out and found that the results were often quite uncanny or even straight-up horrible. I ended up seeing my fair share of twisted fingers, scary faces and mutated abominations of all kinds.

Some of those pictures made me think: since the AI clearly loves creating horror-movie material, why not take advantage of that? I started asking it to make all sorts of nightmare monsters that could have escaped from movies such as The Thing. Oh boy, did it work! I think I’ve found the ideal way to use an image-generating AI. Obviously it can do other stuff too, but in this particular category the results are excellent nearly every time. Making other types of images usually requires some creative promptcrafting, editing, time and effort, whereas asking for a “mutated abomination from Hell” is pretty much guaranteed to work on the first try.

What about LLMs though? Have you noticed that LLMs like ChatGPT tend to gravitate towards a specific style or genre? Is it long-winded business books with loads of unnecessary repetition, or is it pointless self-help books that struggle to squeeze even a single good idea into a hundred pages? Or is it something even worse? What would be the ideal use for LLMs? What’s the sort of thing where they perform exceptionally well?

  • PerogiBoi@lemmy.ca · 6 points · 1 year ago

    Tech troubleshooting, specifically for Linux, but I’ve used it successfully for Docker as well.

    • Toribor@corndog.social · 3 points · 1 year ago

      I used it to learn Ansible and Terraform. It probably does 80% of the work, with me occasionally having to point out that it made up a module, is using a deprecated format, or something like that. Still a huge time saver, though. In ten seconds it can output something at least as good as what I’d produce with 15 minutes of reading documentation and searching for examples.
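
      To give a sense of the kind of correction involved, here is a hypothetical, minimal Ansible sketch (the task itself is made up for illustration): the model might emit the terse legacy module name, and the fix is simply swapping in the fully qualified collection name that current Ansible documentation recommends.

      ```yaml
      # Hypothetical LLM output: works, but uses the short legacy module name.
      - name: Install nginx
        apt:
          name: nginx
          state: present

      # Hand-corrected version: fully qualified collection name, as current docs recommend.
      - name: Install nginx
        ansible.builtin.apt:
          name: nginx
          state: present
      ```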

      • Hamartiogonic (OP) · 3 points · 1 year ago

        That is a valid use for an LLM, especially in easy cases. With more complex cases I usually get completely incorrect tech advice at first, but I’ve always managed to make things work eventually. It may take a few messages back and forth, but in the end I can narrow the problem down enough to ask the right question and finally get the right answer.