
How do we trust each other in the digital society?

The digital society relies on an ever-expanding network of digital infrastructures. This forces us to rethink some of our basic assumptions about one of the most fundamental resources in all interpersonal, social, and economic relations: trust.

  1. Despite their widespread use, we know little about the trustworthiness of our digital infrastructures. At best, their trustworthiness is difficult to establish (as with AI); at worst, they have been proven untrustworthy (as is the case with social media).
  2. Some of these digital infrastructures play key roles in interpersonal and societal trust dynamics. They disrupt existing trust relations and offer new ways to trust each other.
  3. Our scientific methods and theories face serious limits when it comes to studying how social structures, institutions, and processes change under conditions of rapid technological transformation. Research on trust in technology and trust by technology is siloed: it focuses on narrowly defined technologies (such as AI) or problems (such as system security) and lacks a comprehensive, multidisciplinary, long-term view.

The problem in detail

Trust is a fundamental building block of our society. But trust, both interpersonal and institutional, is constantly being disrupted – willfully or inadvertently – by technological innovation. Take, for example, the current pandemic, which we enlisted various novel technologies – from mRNA vaccines, via contact tracing apps and digital vaccination passports, to remote teaching and work technologies – to combat. Each of these technologies raised fundamental trust questions: Can the novel, untested vaccine technology be trusted? Can the scientists, experts, elected government officials, and politicians who developed, tested, and mandated vaccines be trusted? How can citizens trust information sources, and how can they recognize misinformation? What happens to interpersonal trust relations when an increasing share of our social interactions at the workplace, in schools, and among family members is facilitated and mediated by digital technologies? What is the role of platforms in the shifting dynamics of trust in society? How does the increasing use of digital technologies by the government affect the trust relationship between the state and the citizen? How can we ensure the trustworthiness of our technological infrastructures? Can we prevent the breakdown of society in the absence of a shared core of knowledge, trust in institutions, trust in each other, and trust in the technological infrastructures supporting our society?

The pandemic only highlighted a more general problem around trust in the digital society. Various digital technologies have transformed societal trust relations at an unprecedented scale. Social media, for example, have helped build global communities and offer easy access to many ways of communicating, but they are also used by hostile actors to sow mistrust and discord in those very communities. Incidents such as data leaks raise the question of whether we can trust the state to be a good custodian of our test, trace, and vaccination data. The growing number of cyberattacks makes people wonder whether their personal data, or valuable and sensitive business data, can be exchanged safely and whether that data will be processed in a way that respects their interests.

While we rely more and more on digital technologies – among other things, to trust each other – multiple surveys point to growing distrust in these technologies and in the companies that provide them. Our trust relationships in the digital society seem to be in crisis, and we do not have reassuring ways to deal with the destabilization of such a crucial social, political, and economic resource.

New technologies, from nuclear energy to online marketplaces, have presented new, often unknown forms of risk to individuals and to society at large. Trust is a key resource for maintaining interpersonal, social, economic, and political relations in the face of such risks. But trust, if we want to ensure it is not misplaced, requires the independently verifiable trustworthiness of techno-social systems.

Trust, in the context of digitization, faces a double challenge. First, digital infrastructures need better, more concrete, and more comprehensive trustworthiness safeguards and guarantees than they have now. The current piecemeal approach, in which computer scientists, competition lawyers, economists, UX designers, and business evangelists each try to address the serious trustworthiness problems of digital technologies, has so far failed to prevent or avert mass data leaks, biased algorithmic decisions, social, political, and economic destabilization, or even genocide.

Second, we nevertheless use these fundamentally untrustworthy infrastructures to produce trust in social relations. Blockchains and smart contracts, automated filtering and recommender systems, and online reputation management platforms are often explicitly designed to disrupt and replace time-tested, locally embedded, trust-producing social and cultural practices and public institutions. The interpersonal and institutional frameworks that have so far helped us trust each other are being replaced by opaque, unaccountable, profit-driven, heavily automated, and lightly regulated foreign private entities with their own economic, political, social, and cultural agendas. And as the underlying trust-producing infrastructures change, so does the nature of the interpersonal and societal trust they enable.

The goal of the RPA is to address the crisis of trust in the digital society by (1) fostering new, cross-cutting, fundamental, multidisciplinary research on digitization and trust, and (2) coordinating existing research taking place across the five faculties of the University of Amsterdam. By using the RPA funding as seed money to generate substantial second- and third-stream funding, and by working closely with public and private stakeholders on current, real-world problems, it aims to position the UvA at the forefront of global digital trust research.