• Railcar8095@lemm.ee · 2 years ago

      This. I always use that example: ChatGPT is like Stack Overflow or a very eager intern. Review its output and write test cases.
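      The "review and write test cases" advice can be as lightweight as wrapping any AI-suggested snippet in a few assertions before trusting it. A minimal sketch, where the `slugify` helper and its tests are hypothetical examples rather than anything from this thread:

```python
import re

def slugify(text: str) -> str:
    # Hypothetical AI-suggested helper: turn a title into a URL slug.
    text = text.strip().lower()
    text = re.sub(r"[^a-z0-9]+", "-", text)
    return text.strip("-")

# Reviewer-written test cases: pin down the behavior you actually
# need before trusting the generated snippet.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("C++ tips & tricks") == "c-tips-tricks"
    assert slugify("  --  ") == ""  # edge case: nothing usable left

test_slugify()
```

      The point is less this particular function than the habit: the tests encode what you need, so a confidently wrong suggestion fails loudly instead of slipping into your codebase.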

    • Lucidlethargy@sh.itjust.works · 2 years ago

      Man, it’s great until it confidently feeds you incorrect information. I’ve been burned far too many times at this point…

      • Gabu@lemmy.ml · 2 years ago

        TBH, if you can’t almost instantly figure out why and how ChatGPT suggested bad code, you shouldn’t be using it at all - you’re out of your depth.

        It’s why I’ll gladly use it to suggest markdown or C code, but never for a complex Python library.

      • R0cket_M00se@lemmy.world · 2 years ago

        Blaming the AI for misinformation is like blaming Google for giving you bad search results.

        Learn how to parse the data and fact check it. Usually you can get a hyperlink to the source to see if it’s even reasonably trustworthy.

    • tweeks@feddit.nl · 2 years ago

      Plain copy-pasting without a critical eye is not recommended, but it certainly produces good pieces of code from time to time, especially in obscure frameworks and languages where Google turns up little.

      GPT-4 is a really big improvement over 3.5, though. What took me hours with 3.5 was fixed in a few minutes with 4.