The EU has unveiled a prototype app for verifying users' ages on social media, and Italy is among the first countries to test it.

The European Commission has presented guidelines for the digital protection of minors, along with a prototype application for verifying users' ages on platforms under the Digital Services Act, Commission Vice President Henna Virkkunen announced. Italy, France, Spain, Greece, and Denmark will take part in the pilot phase of the application, which is one of the building blocks of the digital identity wallet expected by the end of 2026. The app will let platforms verify that a user is over 18 without requiring any further personal information, preserving privacy.
The prototype app, which the Commission describes as a gold standard in online age verification, will allow users, for example, to easily prove they are over 18 when accessing adult content online, while retaining full control over any other personal information, such as their exact age or identity. No one will be able to track, view, or reconstruct the content accessed by individual users. The verification app will be tested and further customized in collaboration with Member States, online platforms, and end users.
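The privacy property described above, proving a single fact without disclosing identity, can be illustrated with a toy sketch. This is not the EU prototype's actual protocol (which is expected to rely on cryptographic credentials in the digital identity wallet); the issuer, token format, and shared HMAC key below are illustrative assumptions only. The point is that the token the platform sees carries nothing but the boolean claim.

```python
import hashlib
import hmac
import json
import secrets

# Toy sketch of "selective disclosure": a trusted issuer checks the
# user's full identity data privately, then hands the user a token
# containing ONLY the over-18 claim and a nonce - no name, no birth
# date, no ID number. A shared HMAC key stands in for the real
# asymmetric credential scheme, purely to keep the example short.
ISSUER_KEY = secrets.token_bytes(32)  # held by the trusted issuer

def issue_age_token(birth_year: int, current_year: int):
    """Issuer side: emit a minimal signed claim, or None if under 18."""
    if current_year - birth_year < 18:
        return None
    claim = json.dumps({"over_18": True, "nonce": secrets.token_hex(8)}).encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest().encode()
    return claim + b"." + tag

def platform_verifies(token: bytes) -> bool:
    """Platform side: learns only that a trusted issuer vouched for 'over 18'."""
    claim, _, tag = token.rpartition(b".")
    expected = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(expected, tag):
        return False  # forged or tampered token
    return json.loads(claim).get("over_18") is True
```

In a deployed system the issuer's key would be asymmetric, so platforms could verify tokens without being able to forge them, and unlinkable presentations would prevent tracking a user across sites.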
The five member states will be the first to adopt the technical solution, with the aim of launching customized national age-verification apps. The prototype could be integrated into a national app or remain standalone. "Ensuring the safety of our children and young people online is of paramount importance to this Commission," said Virkkunen. "Platforms," she emphasized, "have no excuse to continue putting children at risk."
Other issues addressed by the guidelines include addictive design, cyberbullying, harmful content, and unwanted contact from strangers. Specifically, the guidelines recommend reducing minors' exposure to practices that can foster addictive behavior and disabling features that promote excessive use of online services, such as streaks (counters that reward uninterrupted daily activity) and message read receipts.

To combat cyberbullying, minors should be given the ability to block or mute other users, with a guarantee that they cannot be added to groups without their explicit consent. It is also recommended that accounts be prevented from downloading or taking screenshots of content posted by minors, to stop the unwanted distribution of sexual or intimate material. To reduce exposure to harmful content, platforms are encouraged not to re-recommend content that young users have indicated they do not wish to see. Finally, platforms are asked to set minors' accounts to private by default, meaning they are not visible to users outside their friends list, to minimize the risk of minors being contacted by strangers online.
Source: ANSA