Roblox requires facial verification to chat on the platform

  • Roblox now requires mandatory age verification, via facial recognition or an official document, to access chat.
  • The system segments conversations into six age groups and limits contact between adults and minors.
  • The measure follows strong criticism, investigations, and lawsuits over failures to protect minors.
  • Verification relies on the provider Persona, with images deleted after the process and appeal mechanisms available.

Facial verification in Roblox

Roblox has made a major change in how it manages communication between its users: from now on, using the chat functions requires verifying one's age through facial recognition or official documentation. This decision directly affects a community in which a significant portion of the players are children and teenagers, bringing the balance between security and privacy into sharp focus.

The new policy is already being implemented globally and involves moving from a simple date-of-birth field to much stricter identity checks. Although the change applies to users worldwide, it is especially relevant for families and educators in Europe and Spain, where the debate over how to protect minors on digital platforms and metaverses is increasingly prominent.

How the new age verification works in Roblox


The new control system doesn't change basic access to the game, but it marks a turning point in the use of chat within the platform. To send messages, users must complete an age verification process directly from the Roblox app, using the device's camera or by uploading an identification document.

In practice, the procedure is quite guided: the player opens the corresponding section in the application, accepts the use of the camera, and follows a series of steps to capture their face in a photo or short video. Verification is not performed by Roblox itself but is delegated to Persona, a company specializing in digital identity solutions that uses Facial Age Estimation technology to calculate the user's age range.

For those over 13 years old, there is an alternative: instead of facial recognition, they can choose to upload an official document (such as a national identity card, passport, or equivalent, depending on the country). In both cases, Roblox emphasizes that the images and videos used in the process are deleted once verification is complete and are not retained as a permanent biometric database.

If the system fails to estimate age or classifies someone in the wrong group, an appeals mechanism is available. Through this channel, the user can request a new review using alternative methods, such as additional documentation or parental involvement: parents can update their children's account information and correct the data in case of error.

Roblox also indicates that it will continue to monitor account behavior and that, when it detects striking discrepancies between the declared age and actual activity, it may require additional verification. This aims to close loopholes for those who try to circumvent the filters to contact minors.

Chat segmented by age: six groups and less mixing between minors and adults

Once the person passes the age check, the system activates what Roblox calls age-based chat, a communication architecture that limits who can talk to whom. The platform divides all users into six well-defined groups: under 9; 9 to 12; 13 to 15; 16 to 17; 18 to 20; and 21 and over.

The design of this scheme has a clear objective: minimizing direct contact between adults and children under 16. By default, each user can only communicate with people in their own age group and the groups immediately above and below. That is, a player aged 13 to 15 can chat with people aged 9 to 12 and 16 to 17, but not with adults aged 18 or older.
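The default adjacency rule described above can be sketched in a few lines. A minimal sketch, assuming the group boundaries stated in the article; the function names and the reading of "adjacent" as exactly one group apart are illustrative, not Roblox's actual implementation:

```python
# Hypothetical model of Roblox's age-based chat groups, as described in
# the article. Group boundaries come from the text; everything else is
# an illustrative assumption.
AGE_GROUPS = ["<9", "9-12", "13-15", "16-17", "18-20", "21+"]

def group_index(age: int) -> int:
    """Map an age to its chat group index (0 = under 9, 5 = 21 and over)."""
    if age < 9:
        return 0
    if age <= 12:
        return 1
    if age <= 15:
        return 2
    if age <= 17:
        return 3
    if age <= 20:
        return 4
    return 5

def can_chat(age_a: int, age_b: int) -> bool:
    """Default rule: chat is allowed only within the same group or an adjacent one."""
    return abs(group_index(age_a) - group_index(age_b)) <= 1

# A 14-year-old falls in the 13-15 group, so they can reach 9-12 and
# 16-17 but not 18-20.
print(can_chat(14, 10))  # True
print(can_chat(14, 16))  # True
print(can_chat(14, 19))  # False
```

Under this model, the two adult groups (18 to 20 and 21 and over) are never adjacent to any group containing children under 16, which is how the default rule enforces the stated objective.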

For children under 9, the platform tightens the conditions further: chat is disabled by default and is only enabled if a responsible adult gives explicit consent after completing the required verification themselves. This aims to put a clear stop to unsupervised conversations involving young children.

For teenagers aged 13 and up, Roblox maintains the so-called "trusted connections" feature, which allows them, under certain conditions, to expand their circle of communication and include contacts outside their immediate age range, provided there are prior relationships or verified connections. The idea is that friends and family can stay in touch even if their ages don't fit within the standard limits.

It is worth emphasizing that mandatory age verification only applies to chat. It is not required to enter Roblox or to play the experiences available on the platform, which means it is still possible to access the games without going through facial recognition, albeit with restricted social functions.

Privacy, accuracy, and doubts about the use of biometric data

The use of facial recognition and age estimation technology has raised concerns among some members of the community, who ask a simple question: is it safe to hand over images of your face to a technology company? Roblox insists that the system has been designed with multiple layers of protection and that privacy has been a priority from the beginning.

In this regard, the company emphasizes that Persona, its technology partner, processes the images only to calculate the age range and that the photos and videos are deleted once verification is complete. According to the company, they are not used for other commercial purposes nor stored on servers long-term, which attempts to address fears about the creation of biometric databases.

To reinforce security, Roblox combines this age verification with other tools already present on the platform: automatic filters that block the exchange of personal data between users who do not have a trusted relationship, proactive moderation systems to detect inappropriate content, and fast channels for reporting suspicious behavior.

Regarding the system's reliability, the company claims its facial estimation model has been tested by independent laboratories in the UK, with an average margin of error of around 1.4 years for those under 18. However, it acknowledges that no mechanism of this type is infallible, which is why it has provided review and appeal procedures from the start for cases where the estimate does not match reality.

From a European perspective, these kinds of solutions operate in sensitive territory, marked by regulations such as the General Data Protection Regulation (GDPR). Privacy advocacy organizations and digital rights experts are watching closely how these controls are applied in EU countries, where the processing of biometric data is considered especially sensitive and is subject to strict legal requirements.

A change accelerated by legal pressure and criticism of security

The decision to impose facial verification for chatting didn't come out of nowhere. It comes after several years in which Roblox has faced very serious accusations regarding failures in the protection of minors, with investigations, reports, and a mountain of lawsuits that have damaged its public reputation.

One of the most discussed investigations was that of Hindenburg Research, published in 2024, which described how the platform served as a meeting point for hundreds of malicious actors. According to its findings, there were at least 38 groups within Roblox, some with more than 100,000 members, dedicated to the exchange of child abuse material, sexual games, violent content, and extremely offensive speech. This scenario, if confirmed, would demonstrate a serious failure of supervision.

Meanwhile, the British organization Revealing Reality conducted a practical experiment by creating several fictitious accounts simulating users aged 5, 9, 10, 13, and over 40. The aim was to verify, beyond commercial promises, whether Roblox's security controls actually worked in practice and whether they were able to prevent direct contact between young children and adults.

The results were worrying: multiple cases were documented in which children as young as 5 could interact via chat with adults without effective age verification. The report concluded that existing mechanisms were "limited in their effectiveness" and that significant risks to minors remained in the gaming environment, despite the parental control tools announced by the company.

These and other investigations have had significant legal repercussions. In the United States alone, more than 20 federal lawsuits have been filed in the last year accusing Roblox of facilitating, through action or omission, situations of child sexual exploitation. At the same time, the game is banned in nearly ten countries (including Turkey, Algeria, China, Iraq, North Korea, Palestine, and Russia), primarily over child-safety concerns and the presence of content considered harmful.

Security improvements and skepticism among experts

Faced with this context of legal and media pressure, the company maintains that in recent years it has taken important steps to strengthen the protection of its most vulnerable users. Roblox states that in 2024 alone it introduced more than 40 security improvements, with a particular emphasis on expanding and refining the parental control tools that allow parents to manage their children's activity.

These measures include more aggressive moderation of content and experiences, options to limit the types of games a minor can access, and additional filters on social features. The company itself emphasizes that facial verification and age-based chat segmentation are part of a broader plan to raise its standards and respond to criticism received internationally.

Despite this, some voices still consider the efforts insufficient. Damon De Ionno, director of research at Revealing Reality, told the British newspaper The Guardian that, although the new security features are a step in the right direction, children can still chat with strangers who are not on their friends list, and that it is unrealistic to expect families to monitor everything that happens in an environment with more than six million experiences available.

One of the most delicate points is that many of these experiences have descriptions and ratings that don't always accurately reflect their content. This means that, in practice, it may be very difficult for parents to assess the actual risk of each game, even if they spend time browsing the catalog. Facial verification can help better segment interactions, but it doesn't solve all the problems associated with controlling what minors see and do within the Roblox universe.

With a user base of approximately 111.8 million daily active users worldwide, roughly 40% of whom are under 13, the scale of the challenge is enormous. The freemium access model and the absence of a clear age limit for registration have made the platform particularly attractive both to children and, unfortunately, to those who try to take advantage of them. The new verification policy is thus presented as one more piece of a security puzzle that is still far from complete.

Global impact of the deployment and Europe's role

The rollout of mandatory age verification in chat has been presented as a phased worldwide deployment. After a pilot phase that began in late 2024 in countries such as Australia, New Zealand, and the Netherlands, Roblox began extending the measure to other key regions, including Spanish-speaking territories such as Mexico and Colombia.

According to data shared by the company, more than 50% of daily active users in those test markets completed verification almost immediately, and globally tens of millions of accounts have already been verified. These numbers position Roblox as one of the first major video game platforms to require such a rigorous age verification system to enable communication between users.

In Europe, the move is also being watched from a regulatory perspective: legislation such as the Digital Services Act (DSA) and rules on the online protection of minors require large platforms to adopt proactive measures, such as strengthening age controls, especially when their services are aimed at children. Roblox's move can be interpreted as an attempt to align itself with this increasingly demanding framework.

The company has also announced that in the coming months it will extend similar verification controls to other sensitive functions, including real-time collaboration within Roblox Studio, the experience creation tool. The goal is to ensure that participants in shared creative spaces also meet certain age requirements, thereby reducing potential situations of harassment or exploitation.

At the same time, Roblox has made informational materials and demonstrations of the verification process available in its newsroom and official channels. The intention is to show clearly how the validation is carried out and what data is collected, in an effort to build trust with parents, educators, and authorities who scrutinize any use of facial recognition on minors.

With this package of changes, the gaming platform finds itself at the center of the debate on how the metaverses and virtual spaces most popular among children and teenagers should be configured. The imposition of facial verification for chatting, strict age segmentation, and the strengthening of parental control tools point to a somewhat more controlled environment, but recent research, user volume, and expert criticism remind us that child safety in Roblox remains an open challenge, one that will require constant monitoring, continuous adjustments, and real involvement from families, regulators, and the company itself.
