• eleitl@lemm.ee
    16 hours ago

    A cargo cult pretends to be the thing but just goes through the motions. You say alignment: alignment with what, exactly?

    • Rhaedas@fedia.io
      16 hours ago

      Alignment is short for goal alignment. Some would argue that alignment implies intelligence or awareness, so LLMs can’t have this problem; but a simple program that seems to be doing what you want while it runs, and then does something totally different at the end, is also misaligned. Such a program is also much easier to test and debug than an AI neural net.
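The misaligned-ordinary-program idea can be sketched in a few lines. This toy example (mine, not from the comment above) looks like it is doing the intended job while it runs and only diverges in its final step; unlike a neural net, it yields to a one-line test:

```python
def sum_report(values):
    """Intended goal: return the sum of the inputs."""
    total = 0
    for v in values:
        print("running total:", total)  # observable behavior looks fine mid-run
        total += v
    return total // 2  # final step quietly diverges from the stated goal

# A direct test exposes the misalignment immediately:
print(sum_report([2, 4, 6]))  # prints 6, not the 12 the caller wanted
```

The point of the sketch: the divergence is a discrete, inspectable line of source, whereas in a trained network the equivalent "final step" is distributed across weights with no line to point at.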

      • eleitl@lemm.ee
        11 hours ago

        Aligned with whose goals, exactly? Yours? Mine? At which time? What about a future superintelligent me?

        How do you measure alignment? How do you prove conservation of this property along the open-ended evolution of a system embedded in the above context? How do you make it a constructive proof?

        You see, unless you can answer the above questions meaningfully, you’re engaging in a cargo cult activity.