Roblox is making a fresh attempt to reassure parents and regulators that its platform can be made safer for younger users. In the Middle East, where in-game chat remains disabled across much of the region after last year's clamp-down, the company is still trying to answer the concerns that prompted it.
On Monday, the company announced two new age-based account types for under-16s – Roblox Kids for children aged 5 to 8, and Roblox Select for those aged 9 to 15. The system ties content access and parental controls more closely to verified age, while narrowing what younger users can see and do on the platform.
Founded in 2004 and launched publicly in 2006, Roblox has grown from a niche gaming platform into a vast ecosystem of user-made worlds, accessed by about 144 million people each day, up from 97.8 million daily active users in the first quarter of 2025.
The platform has evolved into its own social metaverse, particularly popular with Gen Z and Gen Alpha. Users move through millions of “experiences” created by other players, from obstacle courses and role-playing worlds to social spaces and branded events. That scale is central both to its popularity and to the questions that have followed it.
In March, UK regulator Ofcom said major platforms including Roblox still needed stronger age assurance and “fail-safe grooming protections” to stop strangers contacting children online.
Researchers at Revealing Reality, which previously examined Roblox’s safety systems, have also argued that while the company has made improvements, adults and children could still interact without what it called “meaningful separation”.
For Matt Kaufman, Roblox's chief safety officer, the latest changes tackle a problem the company has struggled with for years: users could simply enter a false age when signing up.
With the new system, the platform will verify age through facial age estimation or ID checks, using that process to place younger users into the new account types.
Facial age checks work by asking users to take a live face scan, which is then analysed by verification provider Persona to estimate their age. The company says the images or video used for that process are deleted immediately after secure processing.
“Before age checks, if you told users they had to be 13 or 16 or 18 to access something, they would just enter whatever age they needed,” Kaufman tells The National. “Now that verification technology is working at scale, we can use it to govern both communication and content.”

Roblox began introducing age checks last year for certain features, then expanded them in January to cover communication more broadly.
Under the new system globally, users who do not complete age verification will be limited to a narrower version of the platform, with communication disabled and access restricted to lower-rated experiences. Those who do verify their age will be placed into account types that determine which features and games are available to them.
The updates address several criticisms experts have made previously. “The challenge with online age verification is that it’s easy to bypass by entering a false birth date,” Mudresh Shah, a manager at UAE-based cyber security company Help AG, said last year. “There are also personal data risks, due to privacy concerns and poor security, and inconsistent methods across platforms, which can lead to gaps in protection.”
Kaufman says the aim is both to protect younger users by default and to make the platform easier for parents to understand. The new system also integrates stronger parental controls, allowing parents to block specific content they consider unsuitable for their children on a case-by-case basis.
“We’ve designed these accounts so that the content and features a child can access are more clearly aligned with their age,” Kaufman says.
The company's latest safety push comes after a period of wider regional pressure. Kuwait, Qatar and Turkey blocked Roblox in August 2025, after earlier bans in Oman and Jordan, while the platform remained available in Saudi Arabia and the UAE. Jordan and Kuwait have since restored access, a representative says, and no further bans have been imposed in the region.
The UAE’s own intervention came in September, when Roblox suspended in-game chat across Arabic-speaking countries in the Middle East after discussions with regulators, including the Telecommunications and Digital Government Regulatory Authority. The move followed concerns over child safety, particularly the risk of adults interacting with children on the platform and the difficulty of moderating communication in Arabic.

“It’s no surprise Roblox gets a lot of attention, in the Middle East or anywhere else,” Kaufman says. “We have millions of daily users, and a lot of them are kids and teens.”
Roblox says younger users will still be able to access most of the games they already play, but only after those titles have passed additional layers of review. This includes content ratings, developer verification and time for the company to assess how older users interact with a game before making it available more widely.
That caution reflects the open-ended nature of the platform itself. Roblox is not working with a fixed library of titles, but with an ecosystem in constant motion, where user-generated games are built, updated and reshaped all the time.
“There are millions of experiences on Roblox being created constantly,” Kaufman says.
What may look harmless at launch, he suggests, can become more complicated once users begin spending time inside an experience. That is why, Kaufman says, Roblox is putting more weight on how games behave over time, not just how they appear at first glance.
In the Middle East, however, the bigger question is still the chat function. Roblox says it intends to restore the feature eventually, but Kaufman offers no timeline. For now, he says, the company is still focused on improving the Arabic-language safety systems needed to bring it back.
“We turned chat off last year to give ourselves time to improve our Arabic safety technology,” he says. “That work is still under way.”
Roblox currently uses what Kaufman describes as a three-layer moderation system. The first relies on pattern matching, allowing the company to quickly block harmful words and phrases.
The second uses models trained on past communication and user reports to identify harassment, discrimination and other abusive behaviour.
The third uses large language model tools to assess intent over time, including attempts to share personal information.
That final layer is where the regional challenge becomes most specific. Kaufman says Arabic remains harder to moderate at scale than English, especially when the goal is to understand how the language is being used.

“We’re especially interested in Arabic-native models that understand not just formal Arabic, but also colloquial speech – the way children actually speak day-to-day,” he says.
Abu Dhabi's own technology efforts could play a vital role. Roblox is paying close attention to Arabic AI development, including work at Mohamed bin Zayed University of Artificial Intelligence, as it looks for ways to strengthen moderation in the region, though no formal partnerships have been forged.
Until Arabic LLM technology is closer to the maturity of its English-language equivalents, chat will remain off on Roblox.
“We're trying to figure out how we can use those technologies to improve our own systems. As they improve, we certainly have the intention of turning chat back on in the Middle East,” says Kaufman.
The age controls address some of the platform's most persistent issues, part of a wider effort to persuade parents and regulators that Roblox has become more governable.
In the Middle East, it remains unclear whether regulators will view those developments as sufficient without further progress on the chat issues that drew scrutiny in the first place.