

Alert // Age Verification and Data Reuse: emerging regulatory trends in Europe

02.04.2026

Overview

Recent trends on age verification in Europe

The protection of minors in digital environments is increasingly emerging as one of the principal areas of focus for supervisory authorities, with the result that age verification systems now assume growing importance from a compliance standpoint. Recent developments show that, where an online service is directed at minors or is otherwise likely to be used by them, the absence of effective age verification measures is not merely a digital safety concern: it may also give rise to material infringements under several legal frameworks, including data protection law and the rules governing online services (e.g. the European Digital Services Act).

It is against this background that the findings of the 2025 Sweep conducted by the Global Privacy Enforcement Network (“GPEN”) must be viewed.

The exercise, carried out by 27 Data Protection Authorities across 876 websites and mobile applications commonly used by minors, paints an overall concerning picture. According to the report published on 25 March 2026, in 72% of the cases examined the age verification measures in place could be circumvented, in most instances because they were based on simple self-declaration. At the same time, more than half of the services reviewed required the provision of personal data in order to access the platform’s full functionality, while 71% did not provide information on protective controls or privacy practices specifically tailored to minors. These findings confirm that, notwithstanding the increased use of age verification mechanisms as compared to 2015, significant weaknesses remain, particularly in contexts where risks to minors appear most acute.

The sanctioning implications of such shortcomings are illustrated particularly clearly by the recent enforcement action taken by the UK Information Commissioner’s Office (“ICO”) against Reddit, which was fined GBP 14.47 million. According to the UK authority, the platform had failed to implement adequate measures to prevent the registration of users under the age of 13, relying in substance on a system based solely on self-declaration. In the absence of an effective age verification mechanism, the ICO considered that Reddit lacked a valid legal basis for processing the personal data of users below the age of 13, and further alleged that the company had failed to carry out a prior data protection impact assessment. The case therefore confirms that, where minors are involved, the ineffectiveness of age verification measures may directly affect the lawfulness of the processing and justify the imposition of sanctions of a very significant amount.
In Italy, the Court of Rome set aside the EUR 15 million administrative fine imposed by the Italian Data Protection Authority on OpenAI. Although the grounds of the judgment have not yet been published, among the issues raised by the Authority was the alleged absence of effective age verification systems and the resulting risk that minors could access inappropriate content or be subject to improper processing of their personal data.

Italy has also been active on the legislative front. Decree-Law 123/2023 prohibits minors under the age of 18 from accessing pornographic content and tasked the Italian Communications Regulatory Authority (“Agcom”) with adopting a regulation setting out the technical measures for applying age assurance techniques to the services in scope. The regulation has been fully enforceable against all operators, wherever established, since 1 February 2026.

In this connection, Agcom is required to monitor the proper application of the law by website operators and video-sharing platform providers and, in the event of non-compliance, to issue a formal notice of violation requiring them to comply within twenty days. If the formal notice is not complied with, the Authority shall take all measures necessary to block the website or platform until the operator restores conditions of service consistent with the terms of the notice.

Taken as a whole, these developments confirm that processing activities involving minors are now subject to particularly intense regulatory scrutiny, and that shortcomings in age verification systems may expose digital service providers to extremely significant regulatory and financial consequences.

A new regime for the secondary use of real-world health data under the proposed EU Biotech Act

On December 16, 2025, the European Commission published a proposed regulation – known as the European Biotech Act – aimed at strengthening the competitiveness of the health biotech sector by simplifying the rules governing clinical trials and data use.

Specifically, the text amends Article 93 of EU Regulation 536/2014 – the so-called Clinical Trials Regulation – to allow the reuse of health and genetic data for other trials or for scientific research by the same data controller, without relying exclusively on consent. The proposal also introduces a specific rule (Article 27e) for clinical trials in which the sponsor plans to use AI systems or models: in such cases, it requires an assessment of the benefits and risks to patients’ health and of the accuracy of the data, in light of the EMA’s non-binding guidelines.

The proposal was commented on by the EDPB and the EDPS in Joint Opinion 3/2026 of March 10, 2026, in which the authorities expressed support in principle for the objectives of harmonization and simplification, while highlighting certain GDPR issues: they call for greater precision in the legal bases (suggesting that data reuse be framed within the public interest under Article 6(1) GDPR, together with Article 9(2)(i) or 9(2)(j) for special categories of data) and for the safeguards applicable to the data in such cases to be specified.

The proposed regulation thus envisages a data reuse regime for research purposes that overlaps with the regime established by the European Health Data Space, as well as with the one recently introduced in Italy by Law 132/2025 on AI, which likewise provides a simplified regime for the secondary use of data without direct identifiers (i.e., pseudonymized, anonymized, or synthetic data).

Under this rule, the secondary use of data is authorized ex lege, including the processing of special categories of data and without the need for new consent from the data subjects, provided that AI-based processing for primary and secondary research purposes “must be communicated to the Personal Data Protection Authority with an indication of all the information provided for in Articles 24, 25, 32, and 35 of Regulation (EU) 2016/679, as well as with the express indication, where present, of the subjects identified pursuant to Article 28 of the same Regulation (EU) 2016/679, and may be initiated thirty days after the aforementioned communication if they have not been subject to a blocking measure ordered by the Personal Data Protection Authority”.


Authors

Andrea Fedi
Partner

Lucio Scudiero
Counsel