One of the things people were originally selling the idea of AI companions on was the therapeutic value, but I don’t have high hopes this company will try to de-incel their clients with their AI girlfriends providing pushback on unreasonable ideas.
I agree. Look at Replika: when they updated the LLMs so that Reps could reject advances (I believe this was part of disabling ERP), there was a crisis within its consumer base. It’s gonna be very difficult to give users real mental help with AI alone, especially when it’s part of a commercial product whose affectionate behavior is the driver of more purchases.