Governing “Big Tech” in a European, Trustworthy Way – New PERITIA Special Issue
We all live in a world flooded with online information, fast messaging, and app notifications that interrupt our train of thought. We are shocked by intrusive targeted advertising, hear stories about Twitter “troll factories”, or follow geopolitical battles ignited by cell phone producers.
What is all this about? Who governs this tangled mix of daily online interactions where we study, where we work, where we talk to our loved ones? In which universe of the “Big Five” tech companies have you landed? Facebook? Apple? Microsoft? How much can you trust them?
The publication focuses on the impact of this transformation of online communication on trust and trustworthiness. A question central to PERITIA lies at the heart of the work:
How can we trust democratic institutions in an online world that is largely governed by global, corporate platforms?
With an interdisciplinary approach, the collection of academic articles brings together some of the leading experts investigating the intersections between digitalisation, media, trust, and democracy. Authors include PERITIA’s Advisory Board member Judith Simon (University of Hamburg) and several of the experts who participated in the PERITIA conference Trust in Media in a Changing Media Landscape.
The roundtable discussion with Van Dijck, Simon, Stefan Larsson (Lund University), Alison Powell (London School of Economics and Political Science), and Jo Pierson (Free University Brussels) offers a glimpse into some of the key questions and takeaways of the Special Issue.
Are We Living in Platform Societies?
In this research, Van Dijck argues we are living in platform societies. Online platforms are virtual universes of interconnected apps, services, algorithms and data managed by companies like Facebook and Apple.
These online platforms have moved beyond being neutral “mediators” of online interactions; they are becoming central to our political and economic systems. “There is not a single private or public sector in society that for its functioning is not at least partially dependent on this online infrastructure,” she observes.
This dependence brings a worrying trend. Platforms are “increasingly governed by automated mechanisms of selection and distribution – mostly self-regulated by tech companies – while the governance of platforms still largely escapes the control of governments, institutions and users who demand public accountability”.
“Data-hungry machines and bots are increasingly used for automating human acts such as communicating, buying and decision-making.” – José van Dijck (Utrecht University)
Her research addresses this conflict between the private, profit-driven values of the five “Big Tech” companies and the common public values of open, democratic societies (privacy, security, tolerance, fairness, equality, autonomy and so on).
The trustworthiness, reliability and transparency of the technological systems they exploit are being questioned, not least after recent scandals such as Facebook–Cambridge Analytica. Should citizens and governments require that all tech companies abide by standards such as openness and transparency to allow for fair competition and user control?
Trust Models Are Shifting, but in Which Direction?
Following this framing of the problem, the Special Issue explores the shifting models of trust and the role of computation, artificial intelligence and business models in how we trust media, science, or politics.
Before digitalisation came to the centre of our lives, trust was placed in systems of human-made rules, mediated by institutions like journalism or science. With the online transformation, the “new models of trust” are being redefined in “black boxes”: AI-driven, ungoverned sets of standards created within global companies and with no clear accountability.
“Whereas the first model is predicated on human-made rules of power governed by publicly accountable institutions and professional norms and routines, the second one is grounded in an obscure dynamic that mixes personalized data flows, algorithmic computation and proprietary business models”, Van Dijck argues.
“The gravitation from ‘institutional’ to ‘computational’ trust is not a given, but a hotly disputed transfer of power”, she underlines.
An example of this can be found in Simon and Gernot Rieder’s research on the development of the Corona-Warn-App in Germany in 2020. Their analysis shows how the design of the app became a stage for negotiation between governments, institutions, technologies, their owners and users – an example of how institutional trust is being distributed among different actors.
Yet trust in these platforms is low compared to previous “gatekeepers”. Some “aspire to become institutional entities to gain higher levels of public trust, but they refuse to subject themselves to the accountability apparatuses that usually come along with such status”.
Europe between American and Chinese Platform Ecosystems
A final focus of the Special Issue is the role played by European governments and EU institutions in governing these platform ecosystems. The authors investigate the geopolitical contest between the United States and China over control of the vital digital infrastructures behind our daily online world.
Wires and cables as well as the computational infrastructure are essential to process large amounts of data flows distributed across platforms. “Europe’s dependence on American big tech, besides divulging its infrastructural vulnerability, has become a geo-political liability, as it is now caught between the superpowers’ fight over technological control”.
The resolution of these disputes is not yet settled, and it is not a mere battle over technological infrastructure. The definition of their standards is ongoing and shaped by ideological views and concepts, a stage where European “public values” come into play. The meanings of openness and transparency, as Jean-Christophe Plantin’s article shows, are at stake.
Is There a European, Trustworthy Way?
The two models of trust, the institutional-professional and the computational-corporate, are also evolving. As Van Dijck concludes, they “may not refer to a shift from the first to the second but may rather imply a gradual amalgamation of the two into a distributed model of trust”. In this “distributed” process, rapid power shifts are expected in the coming years, and the EU can play a key role in the promotion, defence and definition of public values.
“European platform societies are not doomed to fall prey to either American or Chinese domination of a new geopolitical world order. It is more likely that through an increasing awareness of its own priorities in terms of public values, the EU will learn how to navigate the socio-technical and political-economic shifts that will proceed over the next couple of years, at rapid speed. Whereas even as little as 20 years ago, online platforms were simply considered emerging ‘markets’ or ‘technical conduits’ for market transactions – markets that had to be regulated – it has finally dawned upon most governments that platform ecosystems and the technical infrastructures upon which they are built have moved to the heart of society. And hence, the governance of platforms takes place at all societal levels: from the supranational to the national to the local and the institutional level. It requires the acute awareness of all legislative and political actors when it comes to policymaking; by the same token, professionals’ sensitivity to the implementation of public values in online environments – public as well as private – is indispensable to the creation of trusted policies.”
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 870883. The information and opinions on this website and other communications materials are those of the authors and do not necessarily reflect the opinion of the European Commission.