AI Industry Struggles to Curb Misuse as Users Exploit Generative AI for Chaos
Artificial intelligence just can’t keep up with the human desire to see boobs and 9/11 memes, no matter how strong the guardrails are.

  • kromem@lemmy.world · 2 points · 1 year ago (edited)

    This is trivially fixable; it just costs roughly 2–3x more per query, so it isn’t deemed worth it for high-volume chatbots given the low impact of jailbreaking.

    For anything where jailbreaking would somehow be a safety concern, that cost just needs to be factored in.
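    As a rough sketch of what that extra per-query spend buys (not any vendor’s actual pipeline; the model name, judge prompt, and refusal message below are placeholders, and it assumes the OpenAI Python SDK), the idea is simply a second, independent model call that judges the draft reply before anything reaches the user:

    ```python
    # Second-pass guardrail sketch: one call to draft the reply, one call to judge it.
    # The extra call is most of the 2-3x per-query cost mentioned above.
    from openai import OpenAI

    client = OpenAI()

    JUDGE_PROMPT = (
        "You review chatbot output. Answer with exactly ALLOW or BLOCK. "
        "Answer BLOCK if the draft reply assists with anything unsafe or policy-violating."
    )

    def answer_with_second_pass(user_prompt: str) -> str:
        # First call: generate the draft reply as usual.
        draft = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model
            messages=[{"role": "user", "content": user_prompt}],
        ).choices[0].message.content

        # Second call: the judge only *reads* the prompt and draft, so a jailbreak
        # aimed at the assistant isn't interpreted as instructions here.
        verdict = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model
            messages=[
                {"role": "system", "content": JUDGE_PROMPT},
                {"role": "user", "content": f"User asked:\n{user_prompt}\n\nDraft reply:\n{draft}"},
            ],
        ).choices[0].message.content

        return draft if verdict.strip().upper().startswith("ALLOW") else "Sorry, I can't help with that."
    ```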

    • Hamartiogonic · 1 point · 1 year ago

      That’s true for anything that can carry a query cost. But what about AI applications that have no financial cost to the user? For instance, The Spiffing Brit keeps finding interesting ways to exploit the YouTube algorithm. I’m sure you can apply that same “hacker mentality” to anything with AI in it.

      At the moment, many of those applications live on the web, and that’s exactly where a query cost can be a feasible way to limit the number of experiments you can reasonably run to find your favorite exploit. If probing is too expensive, you probably won’t find anything worth exploiting, and that should keep the system relatively safe. However, more and more AI is finding its way into the real world, which means those exploits are going to have some very spicy rewards.

      Just imagine if the traffic lights were controlled by an AI, and you found an exploit that allowed you to get a green light on demand. Applications like this don’t have any API query costs; you just need to be patient and try all sorts of weird stuff to see how the lights react. Sure, you can’t run a gazillion experiments in an hour, so you personally might not find anything worth exploiting. But with millions of people poking at the system simultaneously, surely someone would find an exploit eventually.