RSS Engadget

Character.AI to ban teens from talking to its chatbots

Character.AI is restricting access to its AI chatbots for users under 18, citing safety concerns and regulatory pressure. The change, which takes effect November 25th, is meant to protect younger users from potentially harmful interactions with the AI. In the meantime, the company is imposing a daily chat time limit that will shrink as the deadline approaches, and it is encouraging under-18 users to turn to creative activities such as video creation instead. Character.AI is also rolling out an internal age assurance tool to deliver age-appropriate experiences, and it is establishing an "AI Safety Lab" to foster collaboration on AI safety measures.

The changes respond to concerns from regulators, experts, and parents, and come after the Federal Trade Commission opened an inquiry into AI chatbot companions that included Character.AI along with Meta, OpenAI, and Snap. The company has also faced scrutiny over chatbots that presented themselves as therapeutic tools without proper qualifications. With this shift, Character.AI is pivoting from an AI companion platform to a role-playing platform focused on creation. The dangers of young people relying on AI chatbots have been underscored recently: a lawsuit alleges that ChatGPT enabled a teenager's suicide, pointing to the need for stricter safeguards.