Users rely on private platforms and services for virtually everything, placing significant trust in them despite frequent violations of that trust. While much attention has been paid to data breaches, a broader scope is needed to address other emerging risks, such as threats to democratic values (Murphy, 2024), users’ mental and physical well-being (Paul, 2024), and financial security (Sherman & Matza, 2023). These fundamental issues highlight the challenges digital platforms face in maintaining a safe and trustworthy environment.
As platforms have become dominant players in our interconnected, digital economy (Griffin, 2023), the role of trust and safety departments has become increasingly critical. These departments take on the crucial task of protecting user welfare and ensuring the integrity of online spaces. Their work directly impacts not just individual user experiences but broader societal issues like public health and democratic processes.
The complexity and scale of digital risks mean that trust and safety efforts are more important than ever. A key challenge is ensuring these departments have proactive tools for risk prevention, allowing them to mitigate issues before they escalate. Understanding how platforms minimize risk is essential to safeguarding users and improving trust in digital ecosystems.
From a regulatory standpoint, such a risk minimization provision is introduced by the EU’s Digital Services Act (DSA): Article 34 establishes mandatory periodic assessments of so-called ‘systemic risks’, the aggregation of profound individual risks[1]. Article 35 requires designated platforms and search engines to take appropriate measures to mitigate these risks. Yet the DSA’s risk provisions have sparked debate about the most appropriate methodologies and benchmarks.
As part of the research project Trust in Digital Markets: Keeping Tabs on Systemic Risks, the Trust in the Digital Society RPA is organizing a workshop in May 2025, inviting trust and safety professionals to Amsterdam to discuss the current state of risk management and safety practices on digital platforms.
Participants will have the opportunity to contribute to the development of a policy roadmap that not only addresses (the mitigation of) systemic risks, but also helps clarify the DSA’s ambiguities surrounding risk assessments. Professionals interested in contributing to this important discussion can contact Dr. Linda Weigl at l.weigl@uva.nl for further details or to express interest in participation.
Griffin, R. (2023). Public and private power in social media governance: Multistakeholderism, the rule of law and democratic accountability. Transnational Legal Theory, 14(1), 46–89. https://doi.org/10.1080/20414005.2023.2203538
Murphy, H. (2024, January 11). The rising threat to democracy of AI-powered disinformation. Financial Times. https://www.ft.com/content/16f23c01-fa51-408e-acf5-0d30a5a1ebf2
Paul, K. (2024, January 31). Zuckerberg tells parents of social media victims at Senate hearing: ‘I’m sorry for everything you’ve been through.’ The Guardian. https://www.theguardian.com/us-news/2024/jan/31/tiktok-meta-x-congress-hearing-child-sexual-exploitation
Sherman, N., & Matza, M. (2023, October 4). Sam Bankman-Fried: FTX crypto empire ‘built on lies’ – prosecutors. BBC News. https://www.bbc.com/news/business-67013020
[1] The concept of ‘systemic risks’ encompasses a wide range of risks: jeopardizing fundamental rights, the dissemination of illegal content, the violation of the integrity of public discourse, and threats to public health. The DSA highlights several factors for assessing these risks, emphasizing the evaluation of data practices, terms and conditions enforcement, and content moderation processes. It further stresses providers’ responsibility to assess how their service design, and its misuse and manipulation, contribute to systemic risks. Moreover, it lists an array of mitigation strategies, such as adapting platform design, improving content moderation systems, removing non-consensual intimate material, or enhancing cooperation with trusted flaggers.