Discord will restrict all user accounts globally starting next month unless users verify their age through facial recognition or government-issued ID, the company announced on Monday.
The messaging platform plans a phased rollout beginning in early March that places every account into a “teen-appropriate experience” by default. Users who want full access must prove they are adults through one of two methods: submitting video selfies analyzed by artificial intelligence or uploading identification documents to third-party vendors.
The change affects more than 200 million monthly users worldwide. Discord tested similar restrictions in the United Kingdom and Australia throughout 2025 before deciding to expand globally.
Users who decline to verify will face multiple restrictions. The platform will block access to age-gated channels and servers, blur sensitive content permanently, route direct messages from unknown contacts to a separate inbox, and prevent participation in stage channels where users can speak to audiences.
Discord will require verification to access age-restricted content, even for longtime users who joined those servers years ago. The company says most people will only need to complete the process once, though some may be asked to use more than one method if additional confirmation is needed.
“Nowhere is our safety work more important than when it comes to teen users,” Savannah Badalich, Discord’s global head of product policy, said in a statement.
The platform promises privacy protections for both verification methods. Video selfies remain on users’ devices and never upload to Discord servers, the company said. Identity documents submitted to vendor partners are deleted quickly, in most cases immediately after confirmation.
Discord announced the change four months after a data breach exposed roughly 70,000 users’ government identification photos. In October, hackers compromised a third-party vendor that Discord used to handle age-related appeals. The company said it immediately severed ties with that vendor and switched to a new provider.
Discord also plans to deploy an age inference model that runs in the background to determine whether accounts likely belong to adults without requiring active verification. The system will analyze account activity patterns to make these assessments.
The company is recruiting 10 to 12 teenagers aged 13 to 17 for an inaugural Teen Council that will advise on safety features and product design. Applications close May 1.
The move follows regulatory pressure in multiple countries. The UK’s Online Safety Act and similar Australian legislation mandate age assurance measures for platforms hosting user-generated content. Other social platforms, including Roblox and YouTube, have introduced their own age verification systems in recent months.
Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.