Character.AI restricts under-18 users after safety concerns

Character.AI, a Google-backed artificial intelligence startup, has announced new restrictions for users under 18, preventing them from engaging in open-ended chats that include romantic or emotional topics. The move follows growing concern about the psychological impact of chatbot interactions on young users, heightened by a tragic incident involving a teenager.

The updated policy aims to create a safer digital space while ensuring AI technologies are used responsibly. Character.AI also plans to introduce age-appropriate versions of its chatbot models and to strengthen parental controls. Experts say the step marks a significant shift toward ethical AI deployment, balancing innovation with user protection.