4 minute read

DSA decoded #10: Algorithmic transparency under the DSA

Recommender systems shape what users see online every day, from user-generated content and personalised ads to suggested communities, groups and other users. Public interest is growing in how content is recommended and moderated, how platforms personalise and tailor content, and how this interacts with free speech and may contribute to the dissemination of harmful content. The DSA explicitly mandates algorithmic transparency: it requires information about how recommender systems work, options for systems not based on profiling, and consideration of recommender systems and algorithmic transparency in mitigating systemic risks.

Recommender systems and algorithms  

The DSA defines ‘recommender system’ broadly as a fully or partially automated system used by an online platform to suggest, prioritise (including as a result of a user’s search input) or otherwise determine the relative order or prominence of information displayed to a user.

These systems use algorithms to determine what content to display to users, the order of that content and how it interrelates with other content on a platform. The algorithms weigh a variety of factors (often millions) to decide what to recommend, including user-related data such as browsing history, cookies and engagement patterns, content-related data, and user settings, in order to tailor recommendations to the user. In essence, these systems are responsible for the content appearing on social media feeds, the order of search results and personalised content recommendations (including ads).
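To make the mechanics concrete, below is a minimal, purely illustrative sketch in Python of how a recommender might combine weighted signals into a ranking. Every signal name and weight here is invented for illustration; production systems learn such weightings from data via machine learning models rather than using hand-set values.

```python
# Illustrative toy example only: real recommender systems use machine
# learning over millions of signals, not hand-written weights like these.

# Hypothetical per-item signals a platform might track (all names invented).
candidate_posts = [
    {"id": "post_a", "topic_match": 0.9, "engagement_rate": 0.40, "recency_hours": 2},
    {"id": "post_b", "topic_match": 0.3, "engagement_rate": 0.75, "recency_hours": 30},
    {"id": "post_c", "topic_match": 0.6, "engagement_rate": 0.10, "recency_hours": 5},
]

# Hypothetical "main parameters" and their relative weights.
WEIGHTS = {"topic_match": 0.5, "engagement_rate": 0.3, "freshness": 0.2}

def score(post: dict) -> float:
    """Combine weighted signals into a single ranking score."""
    freshness = 1.0 / (1.0 + post["recency_hours"])  # newer content scores higher
    return (
        WEIGHTS["topic_match"] * post["topic_match"]
        + WEIGHTS["engagement_rate"] * post["engagement_rate"]
        + WEIGHTS["freshness"] * freshness
    )

# The feed is simply the candidates ordered by descending score.
feed = sorted(candidate_posts, key=score, reverse=True)
print([p["id"] for p in feed])
```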

Recommender systems are often described as black boxes due to the complexity of the underlying machine learning models, the proprietary nature of algorithmic design, and their dynamic nature: they change continuously based on user interactions, settings and real-time data.

From personalisation to accountability

Recommender systems play an important role in enabling users of online platforms and search engines to navigate vast amounts of content by providing personalised suggestions based on their preferences, behaviour and interests. Most online platforms use recommender systems to enhance the user experience, making it easier for users to discover the products, services or information that matter most to them.

The DSA, however, also recognises that recommender systems can amplify systemic risks such as disinformation, harmful content and political polarisation, and accordingly aims to make these systems more explainable and accountable. It seeks to tackle the black box issue through obligations that apply to online platforms generally and, more extensively, to very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e. those with more than 45 million average monthly active users in the EU, by requiring:

  • Online platforms to disclose the main parameters used in their recommender systems and associated user settings, and to provide users with an option to modify such parameters (Article 27)
  • VLOPs/VLOSEs to provide at least one recommender system option that is not based on profiling (Article 38)
  • VLOPs/VLOSEs to consider algorithmic systems as part of their systemic risk assessments and risk mitigation measures (Articles 34 and 35)
  • VLOPs/VLOSEs to enable data access for regulators and vetted researchers (Article 40)

Compliance with these obligations is also subject to an annual, independent third-party audit (Article 37).

Parameter disclosure for recommender systems

Article 27 DSA requires providers of online platforms to set out, in plain and intelligible language in their terms and conditions, the main parameters used in their recommender systems, as well as any options for users to modify or influence those parameters. They must explain why certain content is suggested to a user and the relative importance of the main parameters (e.g. watch history, time spent, geographical location or user ratings). This transparency requirement (1) does not apply to providers of online search engines, including VLOSEs; and (2) does not require platforms to provide information on each and every parameter used, or on how individual parameters are weighted within such systems.
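As a rough illustration of the level of abstraction Article 27 contemplates, a plain-language disclosure might pair each main parameter with an explanation and any user-facing control. The structure and contents below are hypothetical, not a template prescribed by the DSA; note that relative weightings need not be revealed.

```python
# Hypothetical plain-language disclosure of "main parameters" under Article 27.
# Neither the parameter names nor the controls are prescribed by the DSA.
MAIN_PARAMETERS = [
    {
        "parameter": "Watch history",
        "why_it_matters": "Content similar to videos you watched is ranked higher.",
        "user_control": "Clear or pause watch history in Settings.",
    },
    {
        "parameter": "Geographical location",
        "why_it_matters": "Content popular in your region may be prioritised.",
        "user_control": "Set a different region in Settings.",
    },
    {
        "parameter": "User ratings",
        "why_it_matters": "Highly rated content is more likely to be suggested.",
        "user_control": None,  # not every parameter needs a user-facing option
    },
]
```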

VLOPs and VLOSEs must also offer users choice, including at least one recommender system option that is not based on profiling as defined under the GDPR (Article 38). In broad terms, this means providers must offer users an option to use the platform or search engine without the recommender system processing their personal data to evaluate personal aspects of the user, i.e. to infer insights that go beyond the personal data the user has actively provided.
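As a hedged sketch of what this distinction can mean in practice, the non-profiling option below ranks purely on a content-level signal (recency) and never evaluates the user's personal data, in contrast to a profiling-based feed. All names and signals are invented for illustration.

```python
from datetime import datetime, timezone

# Hypothetical post records; "created_at" is a content-level signal.
posts = [
    {"id": "post_a", "created_at": datetime(2025, 1, 3, tzinfo=timezone.utc)},
    {"id": "post_b", "created_at": datetime(2025, 1, 5, tzinfo=timezone.utc)},
]

def personalised_feed(posts, user_profile):
    # Profiling-based ranking: evaluates personal aspects of the user
    # (interests, behaviour) inferred from their personal data.
    ...

def non_profiling_feed(posts):
    # Non-profiling option: ranking depends only on content-level signals
    # (here, reverse-chronological order); no personal data is evaluated.
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

# A VLOP/VLOSE must let users switch to the non-profiling option.
print([p["id"] for p in non_profiling_feed(posts)])
```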

Risk assessment and mitigation 

VLOPs/VLOSEs are required to conduct regular risk assessments to identify and evaluate how their algorithmic systems may contribute to systemic risks, such as the dissemination of illegal content, threats to fundamental rights or the manipulation of public discourse. These assessments must be carried out at least annually, and their findings documented.

In addition to internal assessments, the DSA mandates independent audits to assess the VLOPs/VLOSEs’ DSA compliance. Auditors are tasked with evaluating the adequacy and effectiveness of the measures implemented to address identified risks, as well as the transparency of algorithmic processes.

Data access 

Article 40 DSA establishes a framework for data access, enabling regulators and vetted researchers to obtain data from VLOPs/VLOSEs for the purpose of assessing compliance with the DSA. Under this framework, an explicit provision allows the Digital Services Coordinator of establishment (DSC) and the European Commission (the Commission) to request an explanation of the design, logic, functioning and testing of algorithmic systems, including recommender systems. Under Article 72 DSA, the Commission also has the power to order VLOPs/VLOSEs to provide access to, and explanations relating to, their databases and algorithms.

Vetted researchers, once formally approved, may also request access to non-public data to conduct studies on systemic risks. 

The scope of data access under Article 40 (and, for the Commission only, under Article 72) is intentionally broad, potentially encompassing datasets used for the training, deployment, and assessment of algorithms and recommender systems, as well as technical documentation explaining their operation. Responding to such requests, however, presents several operational and legal challenges: 

  • Requests from the Commission or the DSC must be ‘necessary’ to enable monitoring and assessment of compliance with the DSA, which is an inherently subjective test.
  • Data access requests must be balanced against the need to protect intellectual property, trade secrets, the security of the service and user privacy.
  • The complexity and dynamic nature of modern algorithmic systems, particularly those based on machine learning, means technical documentation is often difficult to understand in isolation. 

Key Takeaways

The transparency and accountability requirements for algorithms and recommender systems are extensive and intended to address the challenges of algorithmic opacity and systemic risk. As these rules are implemented, ongoing attention to the balance between transparency, privacy, and commercial interests will be essential to achieving the DSA’s objectives in practice. Data access requests, especially now that the Commission has implemented its regulation on information sharing under the DSA, are likely to result in additional challenges for VLOPs/VLOSEs and regulatory authorities. 

Tags

eu digital services act, eu dsa decoded series, platforms