Roblox rolls out new AI-powered safety measures to protect teens
NBC News · 2025-07-18 04:30

Child Safety Measures

- Roblox is implementing new AI-driven age estimation technology, including selfie video analysis and ID verification, to enhance child safety on its platform [3]
- The company has zero tolerance for age-inappropriate content and aims to exceed governmental and legislative standards in protecting vulnerable teenagers aged 13-17 [2]
- Roblox is introducing "trusted connections" for verified teens to enable more open communication, while monitoring for critical harms such as grooming [3][5]
- Parents will be able to monitor their children's trusted connections and time spent on the platform [5]
- Child internet safety experts emphasize the importance of these measures in keeping communication on monitored platforms [5]

Platform Usage & Demographics

- Roblox is a massively popular online gaming platform with approximately 100 million daily users who play games created by themselves and others [1]
- More than half of Roblox's players are children under the age of 16 [1]

Competitive Landscape

- Minecraft, another popular online video game, has similar safety measures, requiring users under 16 to link their accounts to an adult's Microsoft account [7]
- Minecraft's parental controls allow guardians to monitor their children's activities, force single-player mode, and limit exposure to violence [7]

Past Incidents

- Roblox has faced criticism over child safety concerns, including an incident involving a 27-year-old man accused of kidnapping and unlawful sexual conduct with a 10-year-old girl he met on the platform [4]