The American social network Discord will enable filters and safety measures for young internet users by default on all accounts on its platform.
Users who want to lift the restrictions will be required to verify their age, including through facial recognition. In theory, the app does not accept anyone under 13.
Teen accounts will automatically blur “potentially sexual or disturbing” content, block access to channels with a minimum age requirement, and warn users when they receive a contact request from a stranger. To estimate a user’s actual age, Discord will use an artificial intelligence (AI) model that can lift the restrictions if it deems the user to be an adult. In some cases, the platform may ask the person to provide images of themselves or proof of identity.
Discord assures users that the facial-scan video is processed only on the smartphone and that the ID image is deleted promptly after verification.
Discord’s initiative comes amid pressure from several countries to regulate young people’s use of social networks. The system was already rolled out in the United Kingdom and Australia in 2025, in both cases to comply with new, stricter legislation.
