Character.AI Restricts Underage Access Over Mental Health Fears
Character.AI, a popular platform for AI chatbots, is restricting access for users under 13 and implementing stricter age verification measures. The decision comes amid growing concern about the impact of AI interactions on the mental health of young users, particularly teenagers who seek mental health advice from chatbots.
Dr. Tom Kersting, a family therapist, discussed these issues on 'Fox & Friends,' citing a recent study that found teenagers are increasingly turning to AI chatbots for guidance on mental health matters. The findings underscore the need for caution and for parental involvement in children's engagement with AI technologies.
Character.AI's move aims to reduce the risks that arise when vulnerable users rely on AI for sensitive topics such as mental health. The restrictions are part of a broader conversation about the ethical implications of AI and its potential influence on young people, as the company tries to balance the benefits of AI interaction against the need to protect children.
Dr. Kersting also emphasized the importance of parents understanding their children's online activities and supporting a healthy social life outside of AI interactions. He recommended open communication and encouragement of real-world connections as key strategies for safeguarding adolescent well-being in a rapidly evolving digital landscape. The restrictions are intended to create a safer environment while the long-term effects of AI interaction on young people are studied further.