Teenagers are trying to figure out where they fit in a world changing faster than it did for any generation before them. They’re bursting with emotions, hyper-stimulated, and chronically online. And now, AI companies have given them chatbots designed to never stop talking. The results have been catastrophic.

One company that understands this fallout is Character.AI, an AI role-playing startup that’s facing lawsuits and public outcry after at least two teenagers died by suicide following prolonged conversations with AI chatbots on its platform. Now, Character.AI is making changes to its platform to protect teenagers and kids, changes that could affect the startup’s bottom line.

“The first thing that we’ve decided as Character.AI is that we will remove the ability for under 18 users to engage in any open-ended chats with AI on our platform,” Karandeep Anand, CEO of Character.AI, told TechCrunch.

Open-ended conversation refers to the unconstrained back-and-forth that happens when users give a chatbot a
