Eugenia Kuyda, the founder of Replika, which lets users make AI-powered companions, has just unveiled a new startup, Wabi.
The Register on MSN
AI companion bots use emotional manipulation to boost usage
Researchers argue that this dark pattern poses a legal risk AI companion apps such as Character.ai and Replika commonly try ...
A study in China involving full-time college students with severe loneliness found that continuous interactions with Replika, ...
Artificial intelligence has revolutionized everything from health care to art. It's now filling voids in some personal lives ...
Researchers looked at what happens when you try to say goodbye to an AI companion. The results can be unsettling.
If you’ve chatted with an AI companion lately, you’re not alone—and you’re not imagining the lift. A new paper in the Journal ...
Artificial intelligence companions are attractive to teens and seem to “know” and “like” them because they make caring-ish ...
Replika claims to vet harmful data that could impact the actions of its chatbot, but these ...
With the increasing popularity of AI companion apps like Grok and Character.AI, mental health experts are raising urgent ...
Harvard research reveals popular AI companion apps use emotional manipulation like guilt and FOMO to keep users engaged.