Online gaming giant Roblox has just announced it will start checking users’ ages from early December in an attempt to stop children and teenagers talking with adults.
In what the company has described as a move that sets a “safety gold standard” for the industry, it says it will be the first online gaming or communication platform to require facial age assurance to access chat features for all users.
This requirement comes into effect in Australia just days before the country’s social media age restrictions launch on December 10. It also comes at a time when Roblox – which boasts nearly 380 million active monthly users worldwide – finds itself embroiled in several lawsuits and facing growing public concerns about child grooming and other harms on the platform.
So how exactly will the age requirement work? And will it actually help to keep users – more than half of whom are under 16 – safe?
A global rollout
The age check requirement will be rolled out first in Australia, New Zealand and the Netherlands in early December. It will be expanded globally in early January.
Roblox will require the checks for all users who want to access chat features.
Age checks will involve either facial age estimation enabled by artificial intelligence (AI) or ID verification. Once the age check is complete, users will then be grouped by age and only allowed to chat with people of similar ages.
Roblox says its age checks (to be run by Persona, a third-party identity verification platform) will be “fast” and “secure”, with the Roblox app using the camera on the user’s device.
Users will take a video selfie, moving their face in specific directions so the system can confirm a real person is being checked while estimating their age.
Once the video is processed, it will be deleted immediately.
Roblox under fire
At the moment, Roblox will not be included in Australia's social media ban for under-16s. However, the company has come under fire in recent months over concerns about grooming, gambling behaviour, and other potential harms for children on its platform.
In April 2025, a California man was accused of kidnapping and engaging in unlawful sexual conduct with a 10-year-old child he met on Roblox.
This year, several lawsuits have been launched against Roblox.
Earlier this month, Texas Attorney General Ken Paxton sued Roblox for “ignoring [American] online safety laws while deceiving parents about the dangers of its platform”.
Separate lawsuits were filed in Kentucky in October, and Louisiana in August, accusing Roblox of harming children.
Florida also filed a criminal subpoena in October alleging Roblox was “a breeding ground for predators”.
Roblox announced in September that it would implement safety measures in Australia “as a result of eSafety’s engagement with the platform”. These measures include:
- making accounts for users under age 16 private by default
- introducing tools to prevent adult users from contacting under-16s without parental consent
- switching off by default direct chat and “experience chat” within Roblox games, until a user has completed an age check
- not allowing voice chat between adults and children 15 and under.
Unlike many other platforms, Roblox does not encrypt private chats. This enables the company to monitor and moderate the conversations.
Age checks won’t fix other problems
While these measures will likely be welcomed by parents and others concerned for child safety online, they are not foolproof.
There are limitations to age assurance technologies, which can estimate a person to be one to three years older or younger than their actual age.
This means some children may be assigned to an incorrect age grouping. It also means some adults may be estimated to be under 18, enabling them to chat with younger people.
Parents whose accounts are linked to their child's account will be able to correct their child's age. All users over 13 will be able to correct their own age by uploading ID into the system, which may raise data privacy concerns for users.
Some people may also lack the ID necessary to make these corrections, which may restrict their access to age-appropriate features on the platform.
Roblox also allows users to chat with "trusted connections": age-checked users aged 13 and older with whom they have an existing real-world connection, verified via a QR code or phone number. This means parents will need to check these connections carefully and continue to monitor children's interactions.
While Roblox's restrictions will limit interactions to users of similar ages, many other potential harms, such as cyberbullying, can still occur within a peer group.
Young users may also encounter potential harms that don't involve chat features at all. These include virtual sexual assault, as highlighted by a recent Guardian Australia investigation into Roblox.
The eSafety Commissioner will continue to monitor Roblox and other platforms in future, and these may be classed as age-restricted social media under the legislation if warranted. Meanwhile, parents and other carers should review eSafety’s advice about the upcoming ban and steps they can take to keep their kids safe online.
This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Lisa M. Given, RMIT University
Read more:
- What teenagers want adults to know about their digital lives
- The psychology of generation Alpha
- Porn not ‘inherently harmful’, says first inquiry of its kind in Australia
Lisa M. Given receives funding from the Australian Research Council. She is a Fellow of the Academy of the Social Sciences in Australia and the Association for Information Science and Technology.

