Discord to Implement Age Verification Process for Adult Content Access
In a move to bolster its child safety measures, Discord is set to introduce a new age verification process for accessing adult content on its platform. Starting in March, all users will have a "teen-appropriate experience" by default, with adult content and age-gated spaces requiring a one-time verification.
Under the new policy, users who are not verified as adults will see blurred sensitive content, while age-restricted channels, servers, and app commands will be blocked. Direct Messages (DMs) and friend requests from unknown users will also be routed to a separate inbox. If a user wishes to access adult content or remove these restrictions, they can opt for one of two verification methods: submitting a government-issued ID or undergoing an age estimation process using a video selfie.
Discord has emphasized that the age estimation process is built with user privacy in mind: the video selfies never leave the user's device. The company also says that ID documents sent to its vendor partners are deleted immediately after age confirmation.
While the new policy aims to make Discord safer for children, it has raised questions about the accuracy of the age estimation process and the potential for false positives. The company has acknowledged that some users may be asked to submit additional forms of verification, and says further options, including an age inference model, are in development.
Discord has expanded its child safety measures several times over the past year, following a high-profile NBC News report on prosecutions involving communications on Discord. In 2023, the platform banned teen dating channels and AI-generated CSAM (child sexual abuse material), and introduced content filters and automated warnings.
The new policy will roll out globally in early March, and both new and existing users will need to verify their age to access adult content.