
Discord will soon require age verification to access adult content


Discord is the latest company looking to bolster its child safety measures (again). Starting in March, all users will get a "teen-appropriate experience" by default, and unlocking adult content and age-gated spaces will require a (usually one-time) verification process.

The platform's big safety update encompasses communication settings, restricted access to age-gated spaces and content filtering. Users who aren't verified as adults will see blurred sensitive content. In addition, age-restricted channels, servers and app commands will be blocked. DMs and friend requests from unknown users will be routed to a separate inbox.

If you're an adult, removing these restrictions will require one of two verification methods at launch. You can take a selfie video for age estimation or submit a government ID to Discord's vendor partners. (Let's just hope the age estimations work better than Roblox's.) The company stresses that the video selfies you submit for age estimation never leave your device. And it claims ID documents sent to its vendor partners are deleted quickly, "in most cases, immediately after age confirmation."

Although Discord says the process will be one-and-done for most people, some may be required to submit multiple forms of verification. It also says that additional verification options will arrive in the future, including an age inference model that runs in the background.

This isn’t the company’s first attempt at beefing up its child safety measures. In 2023, it banned teen dating channels and AI-generated CSAM. Later that year, it added content filters and automated warnings. Those changes followed an NBC News report that 35 adults had been prosecuted on charges of "kidnapping, grooming or sexual assault" that involved Discord communication.

Alongside today’s changes, Discord is recruiting for a new Teen Council. The group will include 10 to 12 teens aged 13 to 17. The company says this "will help ensure Discord understands — not assumes — what teens need, how they build meaningful connections, and what makes them feel safe and supported online." This sounds like the corporate equivalent of the parenting advice: “Don’t just talk to your children; listen to them, too.”

The child safety changes will start rolling out globally in early March. Both new and existing users will need to verify their age to access adult content.