Roblox, the online gaming platform that has been under fire due to child safety concerns, has introduced age-verification software that uses facial scanning to estimate the age of players.
The system is currently voluntary, but by the first week of December it will be required in markets such as Australia, the Netherlands and New Zealand for players who want to chat with others online. By early January, players in all Roblox markets, including the US, will need to complete the check to chat with other players. Roblox said it has also launched a Safety Center hub with parental-control tools and information for parents.
Roblox says the age-verification system is being put in place to limit contact between adults and children, which has been a chief concern among child-safety advocates.
How it works
Roblox's new age-verification feature takes a 3D scan of a player's face, using a webcam or a mobile device's camera, to estimate the person's age. Based on that estimate, a player can use online chat with other players in their age group.
In a video about the software, Roblox says it immediately deletes captured images or video after the age check is complete.
The age check itself is handled by Persona, a third-party vendor that works with Roblox.
Once the check is complete, players are grouped into the following age categories: under 9, 9–12, 13–15, 16–17, 18–20, or 21 and over. The company said those under 9 won't be allowed to chat without parental permission. Chats aren't strictly limited to a single age group, however; Roblox said players "can chat only with peers in their group or similar groups, as appropriate."
The company said it's also taking measures such as restricting media sharing among players and using AI to monitor chats.