The European Commission (EC) published draft guidelines on 13 May 2025 (the Draft Guidelines) setting out its interpretation of Article 28(1) of the EU’s Digital Services Act (DSA) and detailing the steps that providers of online platforms (as defined in Article 3(i) DSA) should consider in order to ensure a high level of privacy, safety and security for minors online. Whilst still in draft form and open for public consultation, the Draft Guidelines offer crucial insight into the EC’s stance on child safety online and the measures the EC expects from providers of online platforms. This post highlights a few key topics in the Draft Guidelines – available in full here – and the challenges associated with them.
Legislative Background
We have published several blogs explaining the EU’s Digital Services Act, which imposes a range of safety, accountability and transparency requirements on online services.
Article 28(1) DSA is the DSA’s main provision on the protection of minors. It requires providers of online platforms accessible to minors to put in place “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors”. Given the broad drafting of this provision, the Draft Guidelines, which the EC is empowered to issue under Article 28(4) DSA, aim to help providers unpick the concepts of “privacy, safety, and security” and explain what the “high level” standard means in practice. Whilst the final guidelines will not be legally binding, the EC has indicated that it will use them as a benchmark when assessing whether providers comply with Article 28(1) DSA.
Scope of the Guidelines
The Draft Guidelines reflect the EC’s broad interpretation of Article 28(1) DSA and set out (1) key principles the EC says providers should consider when adopting measures to ensure a high level of privacy, safety, and security for minors (being proportionality, children’s rights, privacy/safety/security-by-design and age-appropriate design) and (2) the ‘main measures’ that the EC expects providers to adopt in relation to Article 28(1) DSA.
It is the latter which reveals the breadth of the EC’s interpretation of Article 28(1): the measures covered in the Draft Guidelines build upon a number of wider DSA obligations and span an expansive set of topics, including age assurance, registration, default settings, platform design (including persuasive design features), recommender systems, commercial practices, user reporting, content moderation, governance and transparency.
Key Themes
Risk-orientated approach
In determining which measures are appropriate and proportionate to ensure the required high level of privacy, safety, and security of minors, providers should conduct a risk review based on the 5Cs typology of risks (covering content, conduct, contact, consumer and cross-cutting risks). The risk review should also consider the likelihood of minors accessing the service and the mitigation measures already in place. The Draft Guidelines make clear that, for Very Large Online Platforms, this risk review should be carried out as part of the systemic risk assessment under Article 34 DSA – meaning providers of Very Large Online Platforms can rely on existing processes.
Providers are advised to carry out risk reviews whenever they make significant changes to an online platform, and the EC suggests publishing the results of these reviews – a step that would be in addition to existing transparency reporting obligations under the DSA (Articles 15, 24 and 42).
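By way of illustration only, a provider’s internal risk review might be recorded along the following lines. This is a minimal sketch assuming a simple 1–5 scoring scale: the Draft Guidelines do not prescribe any particular format, and all names, scales and values below are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum

# The 5Cs typology of risks referenced in the Draft Guidelines.
class RiskCategory(Enum):
    CONTENT = "content"
    CONDUCT = "conduct"
    CONTACT = "contact"
    CONSUMER = "consumer"
    CROSS_CUTTING = "cross-cutting"

@dataclass
class RiskEntry:
    category: RiskCategory
    description: str
    likelihood_of_minor_access: int   # hypothetical 1-5 scale
    severity: int                     # hypothetical 1-5 scale
    existing_mitigations: list = field(default_factory=list)

    @property
    def residual_score(self) -> int:
        # Naive scoring for illustration: each existing mitigation
        # knocks one point off the raw likelihood x severity score.
        raw = self.likelihood_of_minor_access * self.severity
        return max(raw - len(self.existing_mitigations), 0)

review = [
    RiskEntry(
        category=RiskCategory.CONTACT,
        description="Unsolicited direct messages from unknown adults",
        likelihood_of_minor_access=4,
        severity=5,
        existing_mitigations=["Direct messages off by default for minors"],
    ),
]

# Entries with the highest residual score would be addressed first.
for entry in sorted(review, key=lambda e: e.residual_score, reverse=True):
    print(entry.category.value, entry.description, entry.residual_score)
```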
Conducting and reporting risk assessments/reviews is an emerging regulatory trend in the field of online safety across jurisdictions, with regulators increasingly keen to improve transparency around online platforms.
Cross-provision impact
While the Draft Guidelines are framed as relating to Article 28 DSA, it is notable that they touch on a number of other DSA provisions including Article 14 DSA (terms and conditions), Articles 15 and 24 DSA (transparency reporting), Articles 16, 17, 20 and 22 DSA (content moderation), Articles 27 and 38 DSA (recommender systems), Article 25 DSA (online interface design and organisation) and Article 26 DSA (advertising on online platforms).
While the Draft Guidelines are not legally binding (and therefore cannot impose new, standalone obligations on providers), the EC explicitly notes that the measures set out in the Draft Guidelines are intended to “build on” existing DSA provisions, including, for example, what it means to have a reporting mechanism in place that is accessible to minors. The EC also states that simply adopting the measures set out in the Draft Guidelines will not amount to compliance with broader DSA obligations; it relates only to a provider’s compliance with Article 28(1).
A focus on age assurance
Given the EC’s focus on age assurance (including the recent tender for development of an age verification solution – due to be released later this year), it is unsurprising that the Draft Guidelines focus heavily on age assurance, setting out that:
- It is not considered sufficient for providers to rely on a user’s self-declared age; instead, providers should assess whether age verification and/or age estimation is appropriate. The Draft Guidelines provide examples of where the EC considers age verification should be used (e.g., where a minimum age is required under EU/national law – such as access to a platform offering or displaying the sale of alcohol – or where terms and conditions require a user to be over 18) and examples of where verification or estimation may suffice (e.g., where terms and conditions require a minimum age under 18 or where a provider has identified ‘medium risks’ on an online platform). A simplified sketch of this tiering appears after this list.
- The EC accepts that a tailored, context specific approach may be necessary and acknowledges that it will not always be considered appropriate for all providers to use age assurance methods across all functionalities or for all content.
- If age assurance is deemed necessary, providers must always make sure more than one method is available.
- The criteria for assessing the efficacy of age assurance methods relate to accuracy, reliability, robustness, non-intrusiveness and non-discrimination (factors which align with recently released Ofcom guidance on highly effective age assurance).
- In respect of effective age verification specifically, the EC considers that its own “age verification solution” (which is due to be released this year) should be used as a minimum benchmark.
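To make the tiering above concrete, the sketch below maps the Draft Guidelines’ examples onto a simple decision helper. It is illustrative only: the function name, parameters and risk flags are our own hypothetical shorthand, and the Draft Guidelines themselves call for a context-specific assessment rather than a mechanical rule.

```python
from enum import Enum, auto

class AssuranceMethod(Enum):
    VERIFICATION = auto()  # e.g. verified ID or the EC's forthcoming solution
    ESTIMATION = auto()    # e.g. age estimation techniques
    # Self-declaration alone is not considered sufficient.

def acceptable_methods(legal_minimum_age: bool,
                       adult_only_terms: bool,
                       under_18_minimum_or_medium_risk: bool) -> set:
    """Hypothetical mapping of the Draft Guidelines' examples to tiers.

    legal_minimum_age: EU/national law imposes a minimum age
        (e.g. a platform offering or displaying the sale of alcohol).
    adult_only_terms: terms and conditions require users to be over 18.
    under_18_minimum_or_medium_risk: terms set a minimum age under 18,
        or the provider's risk review has identified 'medium risks'.
    """
    if legal_minimum_age or adult_only_terms:
        # The EC's examples point to age verification in these cases.
        return {AssuranceMethod.VERIFICATION}
    if under_18_minimum_or_medium_risk:
        # Verification or estimation may suffice here.
        return {AssuranceMethod.VERIFICATION, AssuranceMethod.ESTIMATION}
    # Otherwise the appropriate approach is context-specific and may not
    # require age assurance across every functionality at all.
    return set()

# Where age assurance is deemed necessary, more than one method
# should be made available to users.
print(acceptable_methods(False, False, True))
```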
While the Draft Guidelines set out a fairly broad interpretation of Article 28 DSA, individual EU member states may have even stricter measures in mind. There has already been some early criticism of the stance on age assurance – for example, from Denmark’s Minister for Digital Affairs, Caroline Stage Olsen – particularly of the fact that the EC has not advocated a verification-only approach.
The role of recommender systems
The Draft Guidelines detail restrictive expectations as regards the use of children’s data in recommender systems, reaching beyond the requirements already set out in Articles 27 and 38 DSA.
In addition to setting out expectations relating to the collection of behavioural data and the way in which certain data should be prioritised by recommender systems, the Draft Guidelines comment on the need for minors to be able to control their recommendations, including by prompting minors to search for new content after they have been shown recommended content for a certain period (sketched below).
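As a rough illustration of the ‘prompt after a certain period’ idea, the sketch below tracks how long a minor has been served recommended content and flags when a nudge to search for new content is due. The threshold, class and method names are hypothetical; neither the DSA nor the Draft Guidelines specify a particular duration or mechanism.

```python
import time

# Hypothetical threshold: how long recommended content can be served
# before the minor is prompted to search for something new.
PROMPT_AFTER_SECONDS = 30 * 60

class RecommendationSession:
    """Tracks a minor's continuous exposure to recommended content."""

    def __init__(self) -> None:
        self.started_at = time.monotonic()

    def should_prompt_new_search(self) -> bool:
        return time.monotonic() - self.started_at >= PROMPT_AFTER_SECONDS

    def reset(self) -> None:
        # Called once the user acts on the prompt, e.g. runs a search.
        self.started_at = time.monotonic()

session = RecommendationSession()
if session.should_prompt_new_search():
    print("Looking for something different? Try searching for new content.")
    session.reset()
```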
Commercial practices
The Draft Guidelines argue that minors are particularly exposed to risks arising from commercial practices, given a general lack of commercial literacy. With these vulnerabilities in mind, the EC recommends various measures, for example having responsible marketing and advertising policies in place and ensuring minors are not exposed to manipulative design techniques. The Draft Guidelines cite research relating to children’s inability to distinguish between commercial and non-commercial content and the impact of AI-enhanced nudging.
AI risks
Relatedly, the Draft Guidelines also touch on AI, noting the potential impact of AI on minors’ privacy, safety and security online, particularly in the context of content moderation and user support. The Draft Guidelines suggest, inter alia, that providers should ensure generative AI systems include safeguards that detect prompts which are potentially harmful to minors. The EC also urges caution where AI features are used to support users, suggesting that AI warnings (e.g. about hallucinations) should persist throughout the entirety of the minor’s interaction with a system (see the sketch below). Notably, the use of AI is not addressed as a stand-alone topic in the draft.
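A minimal sketch of what such safeguards might look like follows, assuming a hypothetical classifier and model endpoint (`is_harmful_to_minors` and `call_model` are placeholders, not real APIs). The point is simply that the prompt screen runs before the model is called, and the hallucination warning is attached to every reply rather than shown only once.

```python
HALLUCINATION_WARNING = (
    "This assistant can make mistakes. Check important information."
)

def is_harmful_to_minors(prompt: str) -> bool:
    # Placeholder: in practice this would be a trained safety classifier,
    # not a keyword list.
    blocked_terms = {"self-harm", "explicit"}
    return any(term in prompt.lower() for term in blocked_terms)

def call_model(prompt: str) -> str:
    # Placeholder for the actual generative model call.
    return f"(model response to: {prompt!r})"

def respond_to_minor(prompt: str) -> str:
    # Screen the prompt before the model is ever invoked.
    if is_harmful_to_minors(prompt):
        return "This request can't be completed."
    # Attach the warning to every reply so that it persists for the
    # entirety of the minor's interaction, not just the first message.
    return f"{HALLUCINATION_WARNING}\n\n{call_model(prompt)}"

print(respond_to_minor("Tell me about online safety resources."))
```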
Outlook
The Draft Guidelines are now open for consultation, with stakeholders able to provide feedback until 10 June 2025. The final guidelines are anticipated later this year. Given their breadth, we expect to see a high level of stakeholder engagement. Providers within the scope of the DSA should monitor developments closely.