
Freshfields TQ


DSA decoded #6: The European Commission Finalises Guidelines on the Protection of Minors: What They Mean For Platforms

The European Commission (EC) published final guidelines on 14 July 2025 (the Guidelines) setting out its interpretation of Article 28(1) of the EU’s Digital Services Act (DSA). This follows a period of public consultation on draft Guidelines which started in May 2025. The Guidelines set out how online platforms should ensure a high level of privacy, safety and security for minors and offer insight into the EC’s stance on online child safety and the measures it expects from providers. 

 

Legislative Background 

We have published several blogs unpacking the DSA, which sets rules for safety, accountability and transparency. 

Article 28(1) is the DSA’s main provision on the protection of minors online. It requires providers of online platforms accessible to minors to implement “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors”. 

Given the broad drafting of this provision, the Guidelines, issued under Article 28(4) DSA, clarify what this means in practice. Whilst the Guidelines are not legally binding, the EC will use them as a benchmark for assessing compliance. 

Although the Guidelines adopt a broad interpretation of Article 28 DSA, supervision and enforcement of other Union legislation, for example the AI Act or the GDPR, are not affected by the Guidelines. Also, enforcement of individual EU member state legislation remains the sole responsibility of the competent authorities under those legal frameworks. In this context, while the DSA fully harmonises rules within its scope, national legislation may apply to providers of intermediary services, in compliance with Union law, where the provisions of national law pursue legitimate public interest objectives other than those pursued by the DSA (Recital 9). 

 

Scope of the Guidelines 

The Guidelines reflect the EC’s broad interpretation of Article 28(1) DSA and set out (1) key principles the EC says providers should consider when adopting measures to ensure a high level of privacy, safety, and security for minors (namely proportionality, children’s rights, privacy/safety/security-by-design and age-appropriate design); and (2) the main measures that the EC expects providers to adopt in relation to Article 28(1) DSA.  

These measures build on broader DSA obligations and cover a range of topics, including age assurance, registration, default settings, platform design (including persuasive design features), recommender systems, commercial practices, user reporting, content moderation, governance and transparency. 

Although the Guidelines are designed specifically to protect minors online, they also encourage providers to create and preserve safe online spaces for users of all ages. The Guidelines highlight that doing so will inherently result in greater privacy, safety and security for minors.

 

Key Themes  

Risk-orientated approach 

In determining which measures are appropriate and proportionate under Article 28(1), providers should conduct a risk review using the 5Cs typology of risks (covering risks related to content, conduct, contact, consumer and cross-cutting). Reviews should consider the likelihood of minors accessing a service and the mitigation measures already in place. Providers are also encouraged to treat the best interests of the child as a primary consideration, in line with the principles outlined in Article 3 of the United Nations Convention on the Rights of the Child. Assessments of the appropriateness of practices, features or design choices should be informed by scientific and academic sources.

For Very Large Online Platforms (VLOPs), this process can form part of systemic risk assessments under Article 34 DSA – meaning providers of VLOPs can rely on existing processes. 

The Guidelines strengthen expectations: risk reviews should occur at least annually, involve consultation with children and guardians, and (where appropriate) be published. These reviews are likely to be used by regulators, meaning platforms should be ready to explain how key safety decisions were made and documented. 
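
To make the documentation expectation concrete, the sketch below shows one way a provider might structure a risk review record around the 5Cs typology. All type and field names are our own illustrative assumptions; the Guidelines do not prescribe any format.

```typescript
// Illustrative only: a possible structure for documenting a 5Cs risk review.
// Names are assumptions, not terms taken from the Guidelines.

type RiskCategory = "content" | "conduct" | "contact" | "consumer" | "cross-cutting";

interface RiskEntry {
  category: RiskCategory;
  description: string;
  likelihoodOfMinorAccess: "low" | "medium" | "high"; // likelihood of minors accessing the service
  existingMitigations: string[];                      // mitigation measures already in place
}

interface RiskReview {
  conductedAt: Date;           // reviews should occur at least annually
  consultedChildren: boolean;  // the Guidelines expect consultation with children
  consultedGuardians: boolean; // ...and with guardians
  published: boolean;          // and publication where appropriate
  entries: RiskEntry[];
}

const exampleReview: RiskReview = {
  conductedAt: new Date("2025-07-14"),
  consultedChildren: true,
  consultedGuardians: true,
  published: true,
  entries: [
    {
      category: "contact",
      description: "Unsolicited adult-to-minor messaging",
      likelihoodOfMinorAccess: "high",
      existingMitigations: ["private-by-default accounts"],
    },
  ],
};
```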

Cross-provision impact  

While the Guidelines are framed as relating to Article 28 DSA, they also touch on a number of other DSA provisions, including Article 14 DSA (terms and conditions), Articles 15 and 24 DSA (transparency reporting), Articles 16, 17, 20 and 22 DSA (content moderation), Articles 27 and 38 DSA (recommender systems), Article 25 DSA (online interface design and organisation) and Article 26 DSA (advertising on online platforms). 

While the Guidelines are not legally binding (and therefore cannot impose new, standalone obligations on providers), the EC notes that the measures set out in the Guidelines are intended to “build on” existing DSA provisions, including, for example, what it means to have a reporting mechanism in place which is accessible to minors. The EC also states that adopting the measures in the Guidelines will not amount to compliance with broader DSA obligations; it relates specifically to compliance with Article 28(1) DSA. 

A focus on age assurance  

Given the EC’s focus on age assurance (including its tender for an age verification solution due later this year), it is unsurprising that the Guidelines focus heavily on age assurance, setting out that:   

  • It is not considered sufficient for providers to rely on a user’s self-declared age. Instead, providers should assess whether age verification and/or age estimation is appropriate. The Guidelines provide examples of where age verification is expected (e.g., platforms offering or displaying the sale of alcohol, taking into account applicable EU/national law, or where terms and conditions require a user to be over 18) and where verification or estimation may suffice (e.g., where terms and conditions require a minimum age under 18 or where a provider has identified ‘medium risks’ on an online platform); a simple decision sketch follows this list.  
  • The EC accepts that a context specific approach may be necessary and acknowledges that it will not always be considered appropriate for all providers to use age assurance methods across all functionalities or for all content.  
  • If age assurance is deemed necessary, providers must always make sure more than one method is available. 
  • The criteria for evaluating age assurance methods include accuracy, reliability, robustness, non-intrusiveness and non-discrimination (aligned with recently released Ofcom guidance on highly effective age assurance).  
  • For age verification specifically, the EC considers that its own “age verification solution” (due to be released this year) should be used as a minimum benchmark. The Guidelines state that age estimation methods can temporarily be used for use cases that require age verification if effective age verification tools are not yet readily available. In this regard, we understand that the EC clarified in a presentation given on 17 July 2025 that a “temporary exception” for age assurance would be in place for up to 12 months if a provider uses comparable tools instead.   
  • The Guidelines also emphasise the importance of robust data protection (especially data minimisation) principles when considering age estimation methods that require the processing of personal data.
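
As a rough illustration of the tiers above, the sketch below maps a platform’s context to the age assurance methods the Guidelines appear to contemplate. The types, thresholds and function names are hypothetical; any real assessment would need to apply the Guidelines’ full criteria.

```typescript
// Hypothetical sketch of the tiers described above; not a real API and not
// advice on when each method applies.

type AssuranceMethod = "verification" | "estimation";

interface PlatformContext {
  sellsAgeRestrictedGoods: boolean; // e.g. alcohol, per applicable EU/national law
  minimumAgeInTerms: number | null; // minimum age required by the T&Cs, if any
  riskLevel: "low" | "medium" | "high";
}

// Returns the methods that appear acceptable under the tiers above. Per the
// Guidelines, self-declared age alone is not sufficient, and more than one
// method should be offered whenever age assurance is deemed necessary.
function acceptableMethods(ctx: PlatformContext): AssuranceMethod[] {
  // 18+ goods or 18+ terms: age verification is expected.
  if (ctx.sellsAgeRestrictedGoods || (ctx.minimumAgeInTerms !== null && ctx.minimumAgeInTerms >= 18)) {
    return ["verification"];
  }
  // Under-18 minimum age in the terms, or identified medium risks:
  // verification or estimation may suffice.
  if (ctx.minimumAgeInTerms !== null || ctx.riskLevel === "medium") {
    return ["verification", "estimation"];
  }
  // Otherwise, age assurance may not be appropriate at all.
  return [];
}

// Example: a platform whose terms require users to be at least 16.
console.log(acceptableMethods({ sellsAgeRestrictedGoods: false, minimumAgeInTerms: 16, riskLevel: "medium" }));
// -> ["verification", "estimation"]
```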

The role of recommender systems  

The Guidelines detail restrictive expectations regarding the use of children’s data in recommender systems, reaching beyond Articles 27 and 38 DSA.  

In addition to limiting the use of behavioural data, the Guidelines state that minors should have more control over recommendations, including the ability to reset their feeds, and prompts to explore new content after extended viewing.  
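
A minimal sketch of those two controls follows, with assumed names and an assumed viewing threshold (the Guidelines do not quantify “extended viewing”):

```typescript
// Assumed names throughout; the Guidelines describe the controls, not an API.

interface FeedSession {
  personalisationSignals: string[]; // behavioural data feeding recommendations
  minutesViewed: number;
}

// An assumed threshold for illustration only.
const EXTENDED_VIEWING_MINUTES = 45;

// Resetting the feed clears accumulated signals, returning recommendations
// to a non-personalised state.
function resetFeed(session: FeedSession): FeedSession {
  return { ...session, personalisationSignals: [] };
}

// After extended viewing, prompt the minor to explore new content.
function explorationPrompt(session: FeedSession): string | null {
  return session.minutesViewed >= EXTENDED_VIEWING_MINUTES
    ? "You've been watching for a while. Want to explore something new?"
    : null;
}
```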

In comparison to the draft guidelines, the final Guidelines place an increased focus on ensuring minors are not exposed to excessive volumes, frequency and/or recommendation of commercial content through recommender systems and behavioural data. The Guidelines highlight that excessive exposure may lead to an increase in unwanted spending or addictive behaviours, which may have a detrimental effect on minors’ privacy, safety and security online. 

Default settings and persuasive design 

The EC still recommends that children’s accounts be set to private by default to reduce unwanted contact. The final Guidelines retain explicit recommendations for privacy-by-default settings, including turning off features that may contribute to excessive use and push notifications, with the recommendation that push notifications always be turned off during “core sleep hours”. Going beyond the draft guidelines, providers are now also advised to turn off recommendations of other accounts by default for child users, and to present warning signals at the point at which a minor changes their settings, clearly explaining the potential consequences of the change. Providers should treat these features as part of their Article 28(1) compliance obligations and be prepared to justify them through a safety-by-design lens. 
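
By way of illustration only, these defaults might be encoded along the following lines; the field names and the quiet-hours window are our assumptions, as the Guidelines leave “core sleep hours” unquantified.

```typescript
// Illustrative encoding of the recommended defaults for minors' accounts.
// Field names and the quiet-hours values are assumptions, not prescribed.

interface MinorAccountDefaults {
  profileVisibility: "private";                           // private by default to reduce unwanted contact
  pushNotificationsEnabled: boolean;                      // off by default
  notificationQuietHours: { start: string; end: string }; // no notifications during "core sleep hours"
  accountRecommendationsEnabled: boolean;                 // recommendations of other accounts off by default
  warnOnSettingsChange: boolean;                          // explain consequences before a change takes effect
}

const defaultsForMinors: MinorAccountDefaults = {
  profileVisibility: "private",
  pushNotificationsEnabled: false,
  notificationQuietHours: { start: "22:00", end: "07:00" }, // assumed hours
  accountRecommendationsEnabled: false,
  warnOnSettingsChange: true,
};
```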

While the core principles and many of the specific recommendations included in the draft Guidelines have remained the same, excessive and addictive use is now explicitly addressed in the Guidelines in the context of cross-cutting risks, such as health and wellbeing risks, which may significantly affect minors’ lives in multiple ways. Examples include an increased prevalence of eating disorders and mental health issues linked to the use, or excessive use, of online platforms, which may negatively affect minors’ physical and mental health and wellbeing, including addiction, depression, anxiety disorders, deregulated sleep patterns and social isolation.

Commercial practices  

The Guidelines argue that minors are uniquely exposed to risks arising from commercial practices, owing in part to a general lack of commercial literacy. With these vulnerabilities in mind, the EC recommends targeted measures, including responsible marketing and advertising policies, avoiding manipulative design features, and limiting exposure to commercial content via AI tools. The Guidelines also cite research relating to children’s inability to distinguish between commercial and non-commercial content and refer to broader consumer protection rules, including the EU’s Unfair Commercial Practices Directive.  

AI risks 

The Guidelines significantly expand on the potential risks of AI, with a specific focus on interactions between minors and generative systems, chatbots, and personalised content. Under the Guidelines, providers should assess risks to minors before deploying AI tools and include safeguards to detect potentially harmful prompts. AI systems should clearly signal that the user is interacting with a machine and this warning must persist throughout the interaction. Importantly, AI features should be opt-in (not default), and platforms should avoid nudging or influencing minors through AI to engage with commercial content or spend money. The EC stresses that warnings should always be child-friendly and visible, and that AI systems should not imitate human contact, promote harmful content, or replace human support without appropriate oversight. 

The final Guidelines further clarified the Commission’s expectation that, if AI features such as AI chatbots and filters are integrated into an online platform accessible to minors, they should not be activated automatically and minors should not be encouraged or enticed to use them. Such systems should also be in line with minors’ evolving capacities and designed in a way that is safe for them.
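
To illustrate (and only illustrate) the opt-in and persistent-disclosure expectations, a safeguard layer could look something like the sketch below. The names and the flagging mechanism are placeholders; real harmful-prompt detection is far more involved and out of scope here.

```typescript
// Placeholder names and logic; a sketch of the safeguards described above,
// not anyone's actual implementation.

interface AiFeatureState {
  optedIn: boolean; // AI features should be opt-in, never activated by default
}

const MACHINE_DISCLOSURE = "You are chatting with an automated system, not a person.";

function respondToMinor(state: AiFeatureState, reply: string, promptFlagged: boolean): string {
  if (!state.optedIn) {
    // Opt-in expectation: the feature must not run unless activated.
    throw new Error("AI feature not activated: the user must opt in first.");
  }
  if (promptFlagged) {
    // A potentially harmful prompt was detected upstream: decline safely.
    return `${MACHINE_DISCLOSURE}\nThis topic can't be discussed here. Consider talking to a trusted adult.`;
  }
  // The machine-interaction disclosure persists throughout the interaction,
  // so it prefixes every reply rather than appearing only once.
  return `${MACHINE_DISCLOSURE}\n${reply}`;
}
```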

 

Outlook 

With the Guidelines now final, the EC has confirmed it will consider them a benchmark when assessing compliance with Article 28(1) DSA, potentially with immediate effect. The Commission has stated that it will review the Guidelines as soon as it deems necessary, and at the latest after a period of 12 months. While non-binding, the Guidelines are likely to influence enforcement and stakeholder expectations in the EU and member states. Providers affected should now move from preparation to implementation, if they have not done so already. 

Tags

e-commerce, eu digital services act, eu digital strategy, eu dsa decoded series, onlinesafety, platforms