Kill me now.

    • Riven@lemmy.dbzer0.com · 8 months ago

      I tried the same AI and asked it to provide a list of 20 things; it only gave me 5. I asked for the rest, and it apologized and then provided them. It’s weird that it stumbles at first but is able to see its error and fix it. I wonder if it’s something it ‘learned’ from the data set: people not correctly answering prompts the first time.

      • webghost0101 · 8 months ago

        Something else I encounter a lot with GPT-4 is asking “why did you do x or y” out of general curiosity about how it handles the task.

        Almost every time, it apologizes and fully redoes the task, avoiding x or y.

      • Gabu@lemmy.world · 8 months ago

        Might be an intentional limitation to avoid issues like the “buffalo” incident with GPT-3 (it would start leaking information it shouldn’t have after repeating a word too many times).

      • Echo Dot@feddit.uk · 8 months ago

        I personally don’t think a large section of the population meets the requirements for general intelligence, so I think it’s a bit rich to expect the AI to meet them as well.

    • CuttingBoard · 8 months ago

      We all know the first black man in space was George Santos.