Roblox imposes mandatory age verification to restrict adult-child interactions
Roblox will enforce facial age verification for all users accessing chat features, starting in December for Australia, New Zealand, and the Netherlands before expanding globally in January, the company announced Wednesday. The move aims to prevent adults from communicating with underage players on the platform, which hosts over 80 million daily users, nearly half of them under 13.
Pressure from lawsuits and regulatory scrutiny
The policy shift follows mounting criticism over child safety failures and lawsuits in multiple U.S. states, including Texas, Kentucky, and Louisiana. Earlier this year, a BBC investigation revealed vulnerabilities that allowed a 27-year-old and a 15-year-old to exchange messages via unlinked devices. Roblox has previously stated that such breaches often involve users migrating conversations to external platforms.
Australia's impending social media ban for users under 16 has also intensified scrutiny on gaming platforms like Roblox, though the company's new measures preempt potential regulatory action. The UK's Online Safety Act, enforced by Ofcom, already mandates stricter child protections for tech firms. Anna Lucas, Ofcom's online safety supervision director, praised the age-check rollout but emphasized that "more needs to be done."
How the age verification system works
Users must complete a facial age estimation via their device's camera within the Roblox app. The system, provided by an external vendor, deletes images immediately after processing and claims accuracy within "one to two years" for ages 5-25, according to Matt Kaufman, Roblox's chief safety officer. Verified users are then sorted into age brackets (e.g., under 9, 9-12, 13-15) and can only chat with peers in similar ranges, unless they designate someone as a "trusted connection" (e.g., a known friend or family member).
Parents retain control over their child's account, including the ability to adjust age settings post-verification. Under-13s remain blocked from private messages and certain chats unless granted parental permission. Roblox asserts the changes will create more "age-appropriate" experiences and expects other platforms to adopt similar safeguards.
Ongoing concerns and advocacy demands
Despite existing restrictions, such as bans on image and video sharing in chats and limits on external links, campaigners argue Roblox still exposes children to risks. Rani Govender of the NSPCC called the platform's current safety measures insufficient, warning that "young people face unacceptable risks, leaving many vulnerable to harm and online abuse." The charity urged Roblox to ensure its updates "deliver change in practice" and block predators from targeting minors.
This week, advocacy groups ParentsTogether Action and UltraViolet staged a virtual protest within Roblox, delivering a 12,000-signature petition demanding stronger protections. The petition declares, "Roblox must stop being a playground for predators," echoing longstanding criticisms from parents and child safety organizations.
Broader industry impact
Roblox's facial age verification system positions it as the first major gaming platform to mandate such checks for chat access. The company frames the move as a model for the industry, though skeptics note that determined users may still find workarounds. With global enforcement beginning in January, the policy's effectiveness, along with its potential regulatory ripple effects, will likely face close scrutiny in 2026.