Key Highlights:
- Character.AI will end open-ended AI chats for under-18 users after November 25 amid growing scrutiny.
- Such users will face a two-hour daily chat cap before full restrictions take effect.
- The company is also launching new age assurance tools and an AI Safety Lab to strengthen safeguards and research safety standards.
AI is booming, and at a pace few would have predicted just a few years ago. These days, working with AI assistants has become a staple for many, with tools like ChatGPT, Copilot, Claude, and others helping with daily workflows. While AI usage is increasing among adults, its impact on the creative industry is making headlines. Beyond the controversy and debate around copyright and related matters, the well-being of teenagers is another topic that is finally being taken seriously by these AI giants.
Teens under 18 face two-hour daily cap before open chat ends November 25
Character.AI is among the companies starting to think harder about teens. Yesterday, the company announced that users under 18 will no longer be able to engage in open-ended chat with AI on its platform. The change takes full effect on November 25. With twenty-odd days still remaining, Character.AI has explained how the transition will work.
During the transition period, users under 18 will first notice a two-hour daily chat limit. That daily cap will then shrink further as the deadline approaches, and by November 25 they won't be able to chat with AI at all. It's worth noting that even after the change, these users will still be able to create videos, stories, and streams with Characters.
Character.AI’s emphasis on teen safety on its platform
In another effort, Character.AI is rolling out age assurance tools to ensure the right experience for every age group. These combine the company’s in-house verification model with third-party services like Persona. Put simply, the tools will help better identify users under 18 and make sure proper guardrails are in place to keep them safe on the platform.
Last but not least, the company has announced the AI Safety Lab. Character.AI calls it “an independent non-profit” established to develop safety frameworks for AI-powered entertainment and other use cases, and it will partner with researchers, policymakers, and academic experts on that work. In its announcement, Character.AI explicitly mentions that it has received feedback from regulators, safety experts, and parents on this matter. The company also subtly notes that its “content controls work perfectly.”
Character.AI’s new measures reflect a growing industry-wide realization: innovation isn’t big until it’s responsible. Whether these changes can truly safeguard younger users remains to be seen, but the effort is commendable, to say the least.
Why are AI companies upping their guard — and is it working?
When we speak of AI usage among teens, OpenAI immediately comes to mind, and rightly so. To catch you up, OpenAI has faced serious allegations in this regard, including a lawsuit over the death of a California teen who allegedly died by suicide under ChatGPT’s influence. The lawsuit filing contains some chilling details from the teen’s interactions with the AI assistant.
While it’s good to see AI companies making decisions in favor of teen safety, recent initiatives from companies like Meta are reportedly proving ineffective. Reports from late last month suggest that Meta’s Teen Accounts and other safety features are “abjectly failing” to keep teens safe. The report, titled “Teen Accounts, Broken Promises,” suggests that many core safety features, such as Sensitive Content Controls, inappropriate-contact prevention tools, and screen-time features, simply didn’t work.
As AI continues to shape how people interact online, stricter guardrails are becoming essential. That’s especially true given that around 70% of teens are using AI, because they feel “AI is always available. It never gets boring with you.”