• orca@orcas.enjoying.yachts
    8 months ago

    Make sure you ask the AI not to hallucinate because it will sometimes straight up lie. It’s also incapable of counting.

    • lugal
      8 months ago

      But where’s the fun in it if I can’t make it hallucinate?

        • lugal
          8 months ago

          But does telling it not to hallucinate actually work? And does it work the other way around too?

          • orca@orcas.enjoying.yachts
            8 months ago

            It’s honestly a gamble in my experience. Instructions I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no apparent reason. From the research I’ve done, telling AI not to hallucinate is apparently common practice.
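
            For anyone wanting to try this outside the ChatGPT interface, here’s a minimal sketch of the kind of anti-hallucination instruction people pass as a system prompt. It uses the OpenAI Python SDK; the model name and the exact wording are placeholders I picked for illustration, not a recipe I’ve verified.

            ```python
            from openai import OpenAI

            client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

            # System prompt with an explicit "don't hallucinate" style instruction.
            # The wording is illustrative; there's no guarantee the model follows it.
            SYSTEM_PROMPT = (
                "Answer only with information you are confident is correct. "
                "If you are unsure or the answer is unknown, say so plainly. "
                "Never invent sources, numbers, or citations."
            )

            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[
                    {"role": "system", "content": SYSTEM_PROMPT},
                    {"role": "user", "content": "How many moons does Mars have?"},
                ],
            )

            print(response.choices[0].message.content)
            ```

            In my experience the instruction sticks for a while and then quietly gets ignored, so treat it as a nudge rather than a guarantee.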