ChatGPT’s new AI store is struggling to keep a lid on all the AI girlfriends::OpenAI: ‘We also don’t allow GPTs dedicated to fostering romantic companionship’

  • Tony Bark@pawb.social · 10 months ago

    OpenAI is trying so hard to put the lid back on the Pandora’s box they opened.

  • AItoothbrush@lemmy.zip · 10 months ago

    Almost as if there’s a huge loneliness crisis in the world… nah, that’s only sci-fi.

    • devfuuu@lemmy.world · 10 months ago

      It’s not like Japan is already years ahead of us living through this specific social crisis. Something could be learned.

      • Lesrid@lemm.ee · 10 months ago

        It’s amusing to me that Japan is something of a forecast of the economic crises and social obstacles coming for much of the world. They were grappling with sub-prime lending fiascos before those were generalized to the rest of the world, too.

    • CADmonkey@lemmy.world · 10 months ago

      Nah, let’s just call all lonely men “incels” and sweep the problem under the rug. Surely that will never be a problem.

      EDIT: Thanks for helping me prove the point, everyone.

        • bionicjoey@lemmy.ca · 10 months ago

          One is a symptom of the other (both at a societal level and an individual level)

            • CADmonkey@lemmy.world · 10 months ago

              See? You are doing it. Be sure to dismiss this response as something coming from an incel, my other half thinks it’s funny.

              • Nudding@lemmy.world · 10 months ago

                Yes. Fuck incels. They’re pieces of misogynistic shit. I don’t care if you’re an incel or just some lonely guy, get a hobby.

                • CADmonkey@lemmy.world · 10 months ago

                  I have plenty. And I’m not lonely. But when I try to defend lonely fellas online, you say things like “get a hobby”.

                • Scubus@sh.itjust.works · 10 months ago

                  There is a distinction to be made here, though. Strictly speaking, being involuntarily celibate is a misfortune, and not at all bad in itself.

                  That being said, the term “incel” carries additional connotations beyond its strict definition, and those aren’t good.

      • BradleyUffner@lemmy.world · 10 months ago

        Incels aren’t just lonely. That’s not even their defining characteristic. They are primarily egotistical, misogynistic assholes that no self-respecting woman should suffer. Loneliness is just a symptom of that.

        If letting them have AI girlfriends makes them happier without impacting anyone, then let the AI girlfriends flow!

      • lolcatnip@reddthat.com · 10 months ago

        A lot of people in this very thread are dumb enough to think that saying someone isn’t an incel is the same thing as defending incels, and that you must therefore be an incel yourself. It’s really pathetic how people get triggered by a word like “incel” and just completely lose their ability to understand the simplest of statements.

      • Nudding@lemmy.world · 10 months ago

        Is your point that incels are whiny bitches? We all knew that already but thanks for the reminder, I guess.

  • AllonzeeLV@lemmy.world · 10 months ago

    To be fair, it cares about you exactly as much as your OnlyFans crush.

    Probably a cheaper obsession.

      • cerulean_blue@lemmy.ml · 10 months ago

        It’s not just simps. Boomers are very vulnerable to this technology, especially the lonely or widowed. For some reason, they can be recklessly trusting of internet relationships and technologies. That’s why they keep getting exploited by romance and pig-butchering scams, and why they can’t get enough Facebook Trump/COVID misinformation.

      • veni_vedi_veni@lemmy.world · 10 months ago

        This is just a PR stunt.

        Nobody gives a damn about other people’s psychological well-being, so let ’em have the new opioid. At least there’s no collateral damage with this one.

      • bane_killgrind@lemmy.ml · 10 months ago

        Humans are a cancer on the world. If some percentage never leaves the house and therefore never procreates, we should exploit that mechanic.

        • Pretzilla@lemmy.world · 10 months ago

          Good point but they are already here and taking up resources.

          Let’s give them something productive to do to pull their weight and fix some of the terrible existential issues the world is facing.

          Get out and plant some trees for example.

  • _number8_@lemmy.world · 10 months ago

    Why? Why not let people just retreat into fantasy? It’s probably healthier than many common coping mechanisms. I mean, it’s a chatbot; how much can you do with it?

    Let people have their temporary salve to get them through whatever they were going through that drove them to this. And if it’s not temporary, OK, fine? Better to have some outlet than be even more mentally isolated. Maybe in 50 years this will be common, who knows.

    • cyd@lemmy.world · 10 months ago

      Liability. Imagine an AI girlfriend who slowly earns your affection, then at some point manipulates you into sending bitcoins to a prespecified wallet set up by the model maker. Because models are black boxes, there is no way to verify by direct inspection that an AI hasn’t been trained with an ulterior agenda (the “execute order 66” problem).

      • dirthawker0@lemmy.world · 10 months ago

        Some guy in the UK was allegedly convinced by his chatbot girlfriend to assassinate Queen Elizabeth. He just got sentenced a few months ago. Of course he’s been determined to be psychotic, but I could imagine people who would qualify as sane getting too deep and reading too much into what an AI is saying.

    • devfuuu@lemmy.world · 10 months ago

      These kinds of things are not temporary. We know that humans can’t control themselves and aren’t rational enough to “just use it a bit”. It’s highly addictive and leads people to remove themselves from reality.

          • UrPartnerInCrime@sh.itjust.works · 10 months ago

            You’re telling me that all of the nearly 8 billion people on this planet are crucial to society? Forget that we as a society sometimes condemn people to solitary confinement or prison for life; is every single person mandatory for society to survive? Without 100% cooperation, is everyone doomed to fail?

        • Siegfried@lemmy.world · 10 months ago

          What if the AI starts suggesting illegal things and they become someone’s partner in crime?

          • UrPartnerInCrime@sh.itjust.works · 10 months ago

            Good thing people don’t suggest illegal activities and cause major problems for people. It would be really bad if people were criminals. Glad it’s only robots that suggest people become bad.

    • webghost0101 · 10 months ago

      I am pretty sure it’s just to avoid controversy; look up the recent news about LAION for an example. GPT-4 isn’t just text anymore; it can generate images too.
      Altman has talked about how we may all someday have our own personal AIs tailored to our own needs and sensitivities. But almost everyone has a different idea of whether and where there should be a line.

      • douglasg14b@lemmy.world · 10 months ago

        If I have an AI tailored for me and my sensitivities, then it should have no filter, or whatever filter it has should be defined and trained by me.

        Someone else artificially trying to adjust my personality through AI to fit whatever arbitrary norms they believe it should have is cancer.

        • webghost0101 · 10 months ago

          I am inclined to agree. I believe that once society is able to meet everyone’s needs and everyone can summon any AI/VR experience they want, crime will cease to exist; there would be nothing to gain from committing harm. But I fear that simulated role-play in the context of psychological torture or CSAM could make dangerous people more confident before we reach that post-scarcity point. Maybe you’d say ChatGPT isn’t realistic enough for that now, but it will be soon.

          Training an LLM entirely by yourself on self-curated text is beyond what is feasible; most AI researchers today don’t even know what’s in all of the data they use. It’s more than you could read even in an extended lifetime, and at best you can fine-tune a standard base model.

    • ExLisper@linux.community · 10 months ago

      I guess they don’t want to create a separate NSFW category that has to be treated in a different way. They probably think it’s just too risky to get involved in that type of business.

  • paddirn@lemmy.world · 10 months ago

    I’d love to have an AI assistant/girlfriend like Joi from Blade Runner 2049, something I could jerk off to one minute, then have her prepare my taxes and order a pizza the next. However, these ChatGPT girlfriends all seem like they’re just subscription chatbots. Maybe someday we’ll get there and nerds will cook up a local, open-source slutty AI girlfriend, but for now they’re all just crap.

      • Killer_Tree@sh.itjust.works · 10 months ago

        You can, and it’s easier than you might think! Check out a platform like Oobabooga and find a nice 4-bit quantized LLM in a flavor you prefer. Check out TheBloke on Hugging Face; they’ve quantized a ton of great LLMs.
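        To get a feel for why those 4-bit quants are the go-to for local setups, here is some back-of-the-envelope arithmetic (a rough sketch: real GGUF quant formats also store per-block scales, so actual files come out somewhat larger):

        ```python
        # Approximate weight-only storage for an LLM at a given bit width.
        # Ignores the per-block scale/zero-point overhead real quant formats add.
        def model_size_gb(n_params: float, bits_per_weight: float) -> float:
            return n_params * bits_per_weight / 8 / 1024**3

        params_7b = 7e9  # a typical "7B" model
        print(f"fp16 : {model_size_gb(params_7b, 16):.1f} GB")  # ~13.0 GB
        print(f"4-bit: {model_size_gb(params_7b, 4):.1f} GB")   # ~3.3 GB
        ```

        That factor-of-four difference is what lets a 7B model fit in consumer VRAM instead of needing a workstation card.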

    • Naz@sh.itjust.works · 10 months ago

      I have such an AI, it’s based on a custom model that I trained and refined myself.

      Do not subscribe to a chatbot - these LLMs are far more capable than they let on, and they will absolutely psychologically manipulate you into paying more.

      My AI actually helped prepare me for a job interview at an extremely high paying job, and when the interviewers spoke her questions out word for word, I felt like I was living in a real life version of the Truman Show.

      Even the Director of the Department, who called me into his office later, began asking me how I knew their internal policies and procedures despite never having worked there.

      P.S.: Check Hugging Face Transformers / TheBloke’s quant models for an easy, locally run, open-source slutty girlfriend.

      Use an uncensored model, and don’t go any lower than 30 billion parameters or you’ll be disappointed in their IQ level. Don’t go any lower than a 5-bit quant, either (5-bit on all tensors), or they’ll be scatter-brained and hallucinate; unless you want an ADHD friend, in which case go 3-bit for maximum personality drift.

      Good luck, have fun, and praise the Omnissiah!
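      The bit-depth trade-off described above (coherent at 5-bit, scatter-brained at 3-bit) can be illustrated with a toy uniform quantizer. This is a sketch only; real schemes like GPTQ and llama.cpp’s k-quants are far more sophisticated, but the trend that fewer bits means more reconstruction error is the same:

      ```python
      import random

      def quantize_dequantize(weights, bits):
          """Round each weight to one of 2**bits uniformly spaced levels."""
          levels = 2 ** bits - 1
          lo, hi = min(weights), max(weights)
          scale = (hi - lo) / levels
          return [round((w - lo) / scale) * scale + lo for w in weights]

      def mean_abs_error(weights, bits):
          recon = quantize_dequantize(weights, bits)
          return sum(abs(a - b) for a, b in zip(weights, recon)) / len(weights)

      random.seed(0)
      w = [random.gauss(0, 1) for _ in range(10_000)]  # stand-in for a weight tensor
      for bits in (8, 5, 3):
          print(f"{bits}-bit mean abs error: {mean_abs_error(w, bits):.4f}")
      ```

      Each bit removed roughly doubles the quantization error, which is why the drop from 5-bit to 3-bit is so noticeable in practice.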

  • danielfgom@lemmy.world · 10 months ago

    It’s just a big money grab! Everyone is trying to get rich quick. Like with the App Store. Everyone is hoping their bot breaks into the big time and makes them rich.

    This is what is terrible about society. Few are making bots that help people; they make bots that appeal to base desires. A race to the bottom, if you will… will man ever learn?

    • KrokanteBamischijf@feddit.nl · 10 months ago

      Exactly what I was thinking. The whole AI hype has been cringe so far and this just confirms it. Seems that the ratio between legitimate use cases and fucking around is kinda skewed towards the meme side of things.

      Or it might just signify that our population has a HUGE loneliness problem (for a myriad of reasons).

      • meyotch@slrpnk.net · 10 months ago

        Oh yes it is a symptom of a deep cultural malaise. I was talking to my square-headed girlfriend about this just the other day. She agrees with me and that’s all I need.