News

People are increasingly turning to AI companion apps for support and friendship, but experts argue that without proper regulation these apps risk harming users.
A psychiatrist recently pretended to be a troubled teen and asked chatbots for help. They dispensed worrying advice.
AI offers companionship, but risks replacing human relationships, raising questions about emotional intimacy and societal impact.
As a growing number of people turn to AI for guidance, three researchers explore whether it can be relied upon for ...
Was I the Problem? That night, I didn’t cry, I typed. And I didn’t message any other friend. I messaged Artificial ...
Over time, these AI chatbots can be customised; 'eventually, they know when you’re tired, when you’re upset, or when your ...
As more and more people spend time chatting with artificial intelligence (AI) chatbots such as ChatGPT, the topic of mental ...
AI chatbots may be a useful place to start when you're having a bad day and just need a chat. But when the bad days continue ...
AI-powered companion apps allow users to build characters with which they can text or even hold voice and video calls, but ...
People are turning to AI chatbots for emotional and social support. While chatbot friends can ease loneliness, they can also ...
AI is neither all good nor all bad, but it won’t ever be able to replace the value of human connection in therapy. “AI really ...