Is the DSA Revolutionizing Algorithmic Risk Governance?

The technology and business model of social media platforms have not only turned them into some of the most valuable companies to date, but have also given them a key societal role by shaping the way in which people access information, acquire knowledge, and socialize with one another. However, the rise of social media has also brought multiple challenges and adverse societal effects linked to the way content on these platforms is moderated, prompting debates in jurisdictions around the world, including Australia, China, the European Union, the United Kingdom, and the United States, on how to address these effects by means of regulation.

What is risk governance and how could it help in addressing emerging technologies?

One aspect of these regulatory debates concerns risk governance as a method to effectively assess and mitigate risks. Insofar as risk governance is crucial for all emerging technologies, it is also indispensable in addressing the societal challenges posed by the rise of social media. What separates legal risk governance from other forms of technology regulation is its focus on processes. This focus allows risk governance to anticipate and account for risks that are not yet known today. In the case of social media, this orientation is necessary because the effects of these technologies change over time. Instead of laying down specific requirements, legal risk governance defines the framework for a procedure that can address and mitigate risks as they arise; such processes can thus also capture new risks and call for their mitigation. Yet whether such procedural measures succeed depends largely on their design. EU law has a long tradition of legal risk governance, including environmental impact assessments and data protection impact assessments under Art. 35 of the General Data Protection Regulation (GDPR). The Digital Services Act (DSA), the EU's legislative initiative on the governance of online platforms, contains several provisions on risk governance that introduce notable developments and novelties. While many questions of interpretation and consistency remain, the DSA might be considered the next step in the evolution of European risk governance.

How does the DSA address risk governance?

The rules on risk governance apply to so-called “very large online platforms” (VLOPs), i.e. services that store and disseminate content (Art. 3 (i) DSA), reach at least 45 million average monthly active recipients in the EU, and are designated as such by a formal decision of the European Commission (Art. 33 DSA).
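
To make the cumulative character of these conditions explicit, the following minimal Python sketch models them; the `Platform` record and all identifiers are hypothetical assumptions for illustration, not an official test or an authoritative reading of the DSA.

```python
from dataclasses import dataclass

# Minimal sketch of the cumulative VLOP conditions; all names here are
# illustrative assumptions, not an official API.
VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

@dataclass
class Platform:
    name: str
    stores_and_disseminates_content: bool   # Art. 3 (i) DSA
    monthly_active_recipients_eu: int       # Art. 33 DSA threshold
    designated_by_commission: bool          # formal Commission decision

def is_vlop(p: Platform) -> bool:
    """All three conditions must hold for the risk governance rules to apply."""
    return (p.stores_and_disseminates_content
            and p.monthly_active_recipients_eu >= VLOP_THRESHOLD
            and p.designated_by_commission)
```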

The first step in the risk governance process is the risk assessment (Art. 34 DSA), which has to be performed at least once a year and, additionally, whenever new functionalities that could have a critical impact are introduced (Art. 34 sec. 1 DSA). Identified risks have to be mitigated, taking the effects of such mitigation measures into account (Art. 35 DSA). While the European Commission can intervene in cases of emergency (Art. 36 DSA), the legislation stipulates that independent organizations shall audit the risk assessment and mitigation processes (Art. 37 DSA).
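
The assessment cadence can likewise be summarized in a short, hedged sketch; the 365-day interval and all function names are simplifying assumptions of ours, not a compliance tool.

```python
from datetime import date, timedelta

# Illustrative sketch of the assessment cadence in Art. 34 DSA.
# The 365-day interval simplifies "at least once every year".
ASSESSMENT_INTERVAL = timedelta(days=365)

def assessment_due(last_assessment: date,
                   introduces_critical_functionality: bool,
                   today: date) -> bool:
    """A fresh risk assessment is due on the annual cycle, and additionally
    when a functionality with potentially critical impact is introduced."""
    return (introduces_critical_functionality
            or today - last_assessment >= ASSESSMENT_INTERVAL)

# Example: a platform assessed on 1 March 2023 that ships a potentially
# critical new feature triggers an additional assessment regardless of date.
print(assessment_due(date(2023, 3, 1), True, date(2023, 6, 1)))   # True
print(assessment_due(date(2023, 3, 1), False, date(2023, 6, 1)))  # False
```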

How does the DSA define risks?

The DSA sets out several requirements for what constitutes a systemic risk and names four exemplary areas, illustrated in a short sketch after this list:

  1. Illegal content;
  2. Negative effects on the exercise of fundamental rights;
  3. Negative effects on civic discourse, electoral processes, and public security;
  4. Negative effects relating to gender-based violence, the protection of public health and of minors, and serious negative consequences for a person’s physical and mental well-being.
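
As a minimal illustration, the four areas could be encoded as follows; this enumeration is an assumption-laden sketch of ours, and since the legal list is open-ended, any real tooling would have to admit further categories.

```python
from enum import Enum

# The four exemplary systemic risk areas of Art. 34 sec. 1 DSA, modelled
# as an illustrative enumeration; the legal list is open-ended, so this
# is a sketch rather than an exhaustive taxonomy.
class SystemicRiskArea(Enum):
    ILLEGAL_CONTENT = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on fundamental rights"
    CIVIC_DISCOURSE = "civic discourse, electoral processes, public security"
    WELL_BEING = "gender-based violence, public health, minors, well-being"

# Risks need not violate the law or platform terms to count as systemic,
# so a risk register would pair areas with free-text descriptions.
risk_register = [(SystemicRiskArea.CIVIC_DISCOURSE,
                  "coordinated inauthentic behaviour ahead of an election")]
```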

Several features of this definition of risk stand out. While there is considerable overlap between the risk categories and existing legal categories, Recital 90 clarifies that risks are not identical with the current state of the law. Open-ended formulations, such as the reference to fundamental rights, can accommodate new developments, and the definition also captures conduct that violates neither the law nor a platform’s terms and conditions. While the DSA offers examples of the risks and values that are most relevant today, its risk definitions are deliberately open. In addition, the definition goes beyond risks tied merely to the technology to cover those “stemming from the design, including algorithmic systems, functioning and use made of their services in the Union” (Art. 34 sec. 1 DSA). The risks to be taken into account are qualified by the word “systemic”. The broadest reading of this term would encompass every risk posed by the socio-technical system, while the narrowest would follow financial regulation, which limits systemic risk to the danger of market failure. Systemic risks will probably be interpreted to lie somewhere between these extremes, denoting a form of aggravated risk, but the details are yet to be determined.

How does the DSA approach knowledge governance?

Another area in which the DSA potentially covers new ground is knowledge governance, which presents its own set of challenges. Requiring enhanced transparency can create synergies and stir public debate about how to deal with the risks of technologies. However, such measures may also create vulnerabilities, opening attack vectors for the manipulation of social media and imperiling commercial secrets. The DSA provides for significant measures here. On the one hand, it requires providers to include the measures they have taken in their transparency reports. On the other hand, providers also have to report to the European Commission (Art. 35 sec. 2 DSA), which will compile the findings into a “comprehensive report” identifying risks and best practices. The European Commission could thereby provide a central knowledge base offering orientation and inspiration; how useful this exercise proves for different stakeholders such as executives, engineers, and users will depend on the design of these reports.

Participatory features of the DSA

Another important aspect of risk governance in the DSA is its participatory features. Several analyses of technology impact assessments have shown that participatory elements involving the citizens and users who are subject to the risks are crucial for contextualization. The inclusion of citizens is thus key to understanding and mitigating risks in various situations. Participatory elements are addressed in Recital 90, which is exclusively devoted to this point. There, the DSA states that risk governance has to be based on “the best available information and scientific insights” and that assumptions should be tested “with the groups most impacted.” This is the rationale for “the involvement of representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organizations.” Given that this obligation applies only “where appropriate,” participatory elements are not a necessary part of every technology assessment process, but should be employed where they can contribute significantly to its functioning.

It is therefore important to understand that participation can fulfill different functions. One is to draw on citizens’ expertise about the contexts in which they operate. Participation can also help the assessment to “take into account specific regional or linguistic aspects, including when specific to a Member State…” as foreseen in the DSA. In addition, participation can foster innovative solutions through the co-creation of measures. It could furthermore add a dimension to typical models of co-regulation, which usually consist of a mix of companies and public authorities. Given that regulation affecting public discourse is a highly sensitive issue in democracies, it makes sense to proactively engage the public as well and to think of co-regulation as a triangle between companies, public authorities, and the public. The question of how to design effective participatory processes under the DSA thus remains open.

Takeaways

The definition of risks, knowledge governance, and participation are key features in tackling questions of risk governance in relation to social media. More generally, they provide an impetus to develop the fields of algorithmic risk assessment and technology assessment further. The DSA seems to provide for next steps concerning risk assessments and other areas of technology governance, but the actual impact of this model depends on its implementation, especially in light of the questions that remain open. Efforts like the project REMODE (a participatory certification mechanism to re-innovate content moderation) at the Technical University of Munich’s Reboot Social Media Lab will be needed to harness these technologies in a way that is beneficial to society and profitable for service providers. In the end, risk mitigation means nothing less than making the online environment a better, safer, and more enjoyable space. This striving for a beneficial and sustainable digitization must remain a concerted effort, motivated not least by the many instances in which the risks posed by social media have already materialized.

Further reading:

  • Clarke, R. (2011). “An evaluation of privacy impact assessment guidance documents.” International Data Privacy Law 1, 111.
  • Djeffal, C., Magrani, E., Hitrova, C. (2022). “Recommender Systems and Autonomy: A Role for Regulation of Design, Rights, and Transparency.” Indian Journal of Law and Technology 17, 1–55.
  • Djeffal, C. (2022). “Soziale Medien und Kuratierung von Inhalten: Regulative Antworten auf eine demokratische Schlüsselfrage.” In I. Spiecker genannt Döhmann (Ed.), Demokratie und Öffentlichkeit im 21. Jahrhundert – zur Macht des Digitalen (pp. 169–189). Nomos.
  • Stilgoe, J., Guston, D. H. (2017). “Responsible Research and Innovation.” In U. Felt, R. Fouché, C. A. Miller, and L. Smith-Doerr (Eds.), The Handbook of Science and Technology Studies. Cambridge.
  • Stirling, A., Scoones, I. (2009). “From Risk Assessment to Knowledge Mapping: Science, Precaution, and Participation in Disease Ecology.” Ecology and Society 14.
  • Vaccaro, K., Xiao, Z., Hamilton, K., Karahalios, K. (2021). “Contestability for Content Moderation.” Proceedings of the ACM on Human-Computer Interaction 5.
  • van der Velden, M., Mörtberg, C. (2014). “Participatory Design and Design for Values.” In J. van den Hoven, P. E. Vermaas, and I. van de Poel (Eds.), Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains. Dordrecht.

The opinions expressed in this text are solely those of the author(s) and do not necessarily reflect the views of the Israel Public Policy Institute (IPPI) and/or its partners.
