Abstract

Digital technologies and platforms have brought a much-needed plurality to communications and the press, but have also opened the door to increased manipulation, misinformation, and distortion in the marketplace of ideas. The worrying erosion of citizens’ autonomy has progressively intensified with the evolution of certain business models. This article explores the ways in which the digital economy contributes to distortion of information, and the role competition law may play in safeguarding democracy.

1. INTRODUCTION

Democratic governance is anchored in the principle that power is vested in the people and that people can choose wisely. To enable this, citizens must benefit from an undistorted flow of relevant information, which allows them to exercise their autonomous choices as citizens and voters. The democratic ideal is dependent on these foundations, which are increasingly under threat.

Our modern era is characterized by digital technologies, network effects, and online platforms, as well as an increased reliance on social media and the Internet as sources of information. These trends have brought much-needed plurality to the once-concentrated landscape of communications and the press, but have also opened the door to increased manipulation, misinformation, and distortions in the marketplace of ideas. The worrying erosion of citizens’ autonomy and of the truth has progressively intensified with the evolution of certain business models characteristic of the online platform economy. The desire to increase online engagement and income has led to a rise in monitoring and data harvesting,1 an expansion of behavioural targeting, and the proliferation of mistruths and deepfakes.2

These trends have increasingly been recognized by policymakers and elected politicians as a threat to liberal democracies. They have led to proposals for, and the implementation of, a range of laws and regulatory tools.3 They have also intensified the discussion on the possible role competition law can play in safeguarding the democratic ideal. This topic is the central focus of the present article.

We begin our discussion with an illustration of the way in which the digital economy contributes to distortions in the marketplace of ideas. We look at the ways in which digital platforms have created power imbalances that distort competition, autonomy, and the market for ideas, and how the value chains underlying their business models easily lead to this outcome. That discussion emphasizes the significant role competition law could play in promoting democracy.

We then reflect on the positioning of the democratic ideal in relation to antitrust enforcement. We note two opposing endpoints of integration. On the one hand, the ‘competition dynamic’ approach views democracy as a valuable incidental outcome of effective competition enforcement, but does not incorporate the value of democracy into the antitrust mechanism or treat it as a goal of competition enforcement. On the other hand, we note the ‘integrated’ approach, which argues for democracy to form an internal substantive benchmark of competition assessments. In between these two endpoints, we position a third model which we refer to as the ‘external benchmark’ approach to democratic antitrust. That approach imports relevant external benchmarks, which could be used to assess harm to democracy, without directly changing the traditional intervention benchmarks. It is anchored in developments of European case law, and, in particular, the recent Court of Justice judgment in Meta Platforms v Bundeskartellamt.4 We elaborate on this model, its application, and its usefulness.

Whichever approach one favours, there should be little doubt as to the link between healthy rivalry and democracy. With the year 2024 being an important election year throughout the world, the significance of undistorted markets, debates, and electoral processes cannot be overstated. Overall, national elections took, or will take, place this year in at least 64 countries, with 49 per cent of the worldwide population called to the ballots.5 Voters in the 27 Member States recently elected the Members of the European Parliament. A US President will soon be elected. And while election observers regularly monitor the fairness and robustness of the democratic process worldwide,6 and many lawmakers strive to safeguard the integrity of democratic elections, the risks of distortion through digital interference are now greater than ever.

2. POWER IMBALANCES, MARKET POWER, AND THE POLITICAL PROCESS

The Internet, in its early days, was created as the ultimate democratic landscape. A web structured to enable anyone to have a presence and a voice. A decentralized and accessible infrastructure that can supercharge the flow of information, promote individual autonomy, fairness, and equality.7

Those days are gone. Today’s Internet is characterized by the presence of gatekeepers, large ecosystems, and private marketplaces. An environment in which the winner takes all, or most. A concentrated landscape in which a handful of powerful platforms and gatekeepers control many of the levers of competition.

The dynamics that led to the rise of this market power have been explored extensively elsewhere.8 Among their key contributors are direct and indirect network effects that support the rise of barriers to entry, the significance of big data and big analytics, which have tipped many markets in favour of the larger operators, and a range of business strategies and transactions that have entrenched the power of several dominant firms.

Central to our discussion is the rise of controlled ecosystems in which the flow of information and communications, as well as the rules which govern entry, exit, and market dynamics, are determined by a single ‘Gamemaker’.9 Many of our online interactions, ranging from shopping to search and social media, take place in such privately controlled ecosystems. And while we obtain many benefits from their operation, we are also exposed to their ability to distort the landscape in which we interact and spend our time. While not immune from competitive pressure, these ecosystems have invested heavily in means to hook users, retain us, and exploit our presence.10 They use friction points in the design of the interface to reduce switching to competing services. Personal data, advanced targeting, manipulations, and dark patterns are deployed in various intensities to influence our choices, our exposure to competing services, and our view of the world.11 And so, while many of us feel inherently autonomous in our decision-making, we often unwittingly walk down a path that was designed to lead us to a destination set by the Gamemakers.

At the market level, these strategies are used to influence competition dynamics, control the user base, and entrench power. In addition, they form part of the machinery of advertising and targeting that stands at the heart of many ecosystems. For many platforms and ecosystems, the harvesting, targeting, and advertising value chains have become a key income generator: a technological powerhouse that drives modern ecosystems’ profitability.

To deliver on their potential, these value chains attract heavy investment. To ensure retention, personal data and preferences inform the algorithmic machinery and enable behavioural targeting, which is set to maximize engagement.12 Rather unfortunately, one key externality of these profit-maximizing strategies has been the distortion of information flows and users’ perceptions. The reason for this lies in our human traits and our increased engagement when confronted with controversy, sensations, or fear. Having identified these as engagement accelerators, Gamemakers deploy their algorithms to give us more of what triggers our attention. The machinery has been set to engage us, increase retention, and subsequently, profitability. The negative externalities on society and human autonomy have often been set aside as unavoidable side effects in the quest for growth and profits.

Internal studies by Facebook (Meta) found, for example, that its own algorithmic recommendations amplified the joining rate into extremist groups, that its news feed algorithms ‘exploit the human brain’s attraction to divisiveness’ and that ‘[i]f left unchecked’, it would feed users ‘more and more divisive content in an effort to gain user attention & increase time on the platform’.13 Meta is not alone, as the same value chains power many of the ecosystems that dominate our lives. Chamath Palihapitiya, CEO of Social Capital, commented on the inherent risks posed by these value chains: ‘[W]e have created tools that are ripping apart the social fabric of how society works. That is truly where we are…The short-term dopamine-driven feedback loops that we’ve created are destroying how society works. No civil discourse, no cooperation. Misinformation, mistruth. And it’s not an American problem–this is not about Russian ads. This is a global problem….’14 Inflammatory content pushes out honest content creators while supporting extremism, narrow viewpoints, and filter bubbles.15 The problem is not restricted to social media, but also affects video content and recommendations, as well as books and news items.

Fear, rancour, anger, and antagonism are the gold dust that drives bad content, racial divisions, mistruths, conspiracy theories, and ultimately mistrust in the state and governance. The Cambridge Analytica scandal, which made headlines a few years back, was merely an early symptom of a much deeper problem. Indeed, Cambridge Analytica’s ability to use micro-targeting to identify persuadable voters and target them stands in the shadow of new analytical tools.16 What did not change is the profit motive to use technology to increase profitability, despite its ripple effects. Illustrative are comments allegedly made by Facebook’s communications official following criticism of it distorting the democratic process (and made public following a leak of sensitive documents from the company): ‘It will be a flash in the pan. Some legislators will get pissy. And then in a few weeks they will move onto something else. Meanwhile we are printing money in the basement, and we are fine’.17

The friction between the value chains that enable profitability through targeting, manipulation, and retention, on the one hand, and the wider societal interests on the other hand, can no longer be ignored. The profit motive inhibits internal corporate measures that could limit the negative externalities that undermine liberal democracies. Similarly, profitability incentivizes actions that entrench power and the value chains. With such dynamics, there is little surprise that information flows remain distorted.

Ultimately, the combination of technology, concentration, and profit motives creates a toxic mix that risks undermining the democratic foundations of our society.18 Some elements of this tech-driven downward spiral could be addressed through specific regulatory tools. These could offer a valuable remedy to known distortions or practices. As we elaborate below, the case-by-case application of competition law may also play a role in addressing these distortions.

3. THREE APPROACHES TO DEMOCRATIC ANTITRUST

In a recent Policy Brief, the European Commission stated that ‘competition law can achieve broader objectives, ensuring consumer choice is a means to ultimately guarantee plurality in a democratic society’.19 In this part, we reflect on the means through which one may achieve these objectives and the various ways to describe the relationship between democracy and competition law.

It is hardly surprising that a range of views exists with respect to the positioning of democracy in relation to antitrust enforcement. In abstract terms, one can identify two endpoints or ‘models’.

For some, the promotion of democracy forms an incidental outcome of antitrust enforcement. Dynamic markets may support democracy, but democracy as a value should not form part of the theories of harm or the benchmarks relied upon by the antitrust enterprise.20 According to this view, competition law could positively influence the democratic process by promoting dynamic contestable markets. Effective competition enforcement can play an important role in safeguarding the marketplace of ideas by supporting rivalry and choice, and targeting abusive exclusions and manipulations. Furthermore, democracy-related considerations can affect the choice and prioritization of cases. When applied effectively to areas that govern the platform economy, antitrust enforcement could strengthen competition, which in turn supports liberal democracies. Importantly, according to this approach, democracy as a value does not penetrate the analytical envelope used to examine platform conduct. We refer to this approach as the ‘competition dynamic’ approach vis-à-vis democratic antitrust.

On the other end of the spectrum, others may see the goals of competition law as encompassing democratic values and justifying its inclusion as a relevant benchmark that directly influences the substantive analysis of conduct.21 Such a widening approach would see plurality, freedom of choice, and democracy forming an integral part of the goals of the antitrust regime. By its very nature, this approach is heavily dependent on the competition regime at issue and the values promoted by it.22 In the European Union (EU), for example, economic plurality and freedom of choice are inherently linked to the quest for an effective competition structure.23 Article 2 TEU24 enshrines the Union’s fundamental values of democracy and the rule of law, and reflects a societal agenda that seeks to promote the general public interest. The European Court of Justice has increasingly come to rely directly on this provision.25 In another context, the Court of Justice has emphasized that competition rules—themselves contained in the quasi-constitutional Founding Treaties—must be interpreted in light of this very Article.26 Already in its early days, the European Commission held that going against the Treaty’s objectives could constitute an abuse of dominance.27 Accordingly, democratic values could be pursued directly, or by promoting consumer well-being in its wider sense.28 We refer to this as an ‘integrated’ approach vis-à-vis democratic antitrust, as it directly integrates democratic values into the competition fabric.

Reflecting on the discussion in Section 2, one should not underestimate the need to offer effective intervention. At the same time, one cannot ignore the risk of overstretching the goals of competition law and diluting its analytical framework in an ‘integrated’ approach. This approach also raises challenging questions as to the balancing between multiple, and at times inconsistent, goals.

Taking note of the practical limitations of the ‘integrated’ approach, as well as the desire to move beyond the ‘competition dynamic’ approach, one can see the benefits of an ‘external benchmark’ approach, which is situated in between the two endpoints.

The external benchmark approach to democratic antitrust supplements the ‘competition dynamic’ approach by making use of the Court of Justice’s ruling in Meta Platforms v Bundeskartellamt.29 As such, it forms a middle way, allowing for a limited incorporation of democratic values into antitrust enforcement. This approach, as we elaborate below, can be used to import relevant external benchmarks that allow us to assess harm to democracy, without fully integrating democracy into the antitrust machinery. It therefore does not necessitate a discussion on the goals of competition law, nor does it affect the analytical envelope used for appraisal.

4. THE META RULING AS AN INSTRUMENT FOR INTEGRATING EXTERNAL BENCHMARKS

In the Meta Platforms v Bundeskartellamt judgment, the Court of Justice confirmed that competition authorities, when examining the legality of an action under the competition laws, could take into account whether the conduct in question complies with rules other than those relating to competition law.

The case emerged from a 2019 decision of the German Bundeskartellamt prohibiting several data practices that the social network Facebook had engaged in.30 Facebook offered its social network to users in return for the collection of their personal data, which was then monetized through targeted advertising. The Bundeskartellamt took issue with the fact that Facebook collected personal data about its users from outside its social network and then combined this with personal data it gathered on its social network, thereby enhancing the comprehensive user profiles it used as a basis for its targeted advertising services. This collection, combination, and subsequent use of user data, according to the Bundeskartellamt, lacked valid user consent and therefore represented a breach of the European General Data Protection Regulation (GDPR)31 and of the German abuse of dominance rules.32 In the course of the German proceedings, the Higher Regional Court Düsseldorf referred a number of questions to the Court of Justice of the European Union (CJEU).33 Of interest in the present context is the referring court’s question whether the Bundeskartellamt, ‘for the purposes of monitoring abuses of competition law, [could find] that [Meta/Facebook]’s contractual terms relating to data processing and their implementation breach the GDPR and issue[] an order to end that breach’.34

In its ruling, the CJEU recalled Advocate General Rantos’s Opinion,35 and held that in considering a possible abuse of dominance, a competition authority had to carry out a fact-specific analysis of the dominant company’s conduct in the course of which:

the compliance or non-compliance of that conduct with the provisions of the GDPR may, depending on the circumstances, be a vital clue among the relevant circumstances of the case in order to establish whether that conduct entails resorting to methods governing normal competition and to assess the consequences of a certain practice in the market or for consumers.36

But the Court did not end its analysis there, for it also asserted that:

in the context of the examination of an abuse of a dominant position by an undertaking on a particular market, it may be necessary for the competition authority of the Member State concerned also to examine whether that undertaking’s conduct complies with rules other than those relating to competition law […].37

The Court underlined that, given the importance of personal data for the digital economy, ignoring data protection rules in the competitive assessment ‘would be liable to undermine the effectiveness of competition law within the European Union’.38 As data protection authorities were called upon to safeguard companies’ compliance with the GDPR, the Court held that—based on the principle of sincere cooperation—a competition authority needs to cooperate with the competent data protection authorities where it carries out an assessment such as the one the Bundeskartellamt had carried out.39

While the consideration of a conduct’s legality under other laws and regulations when establishing a violation of competition law is not novel,40 the holding in Meta opens the door for competition agencies to directly consider, subject to the duty of sincere cooperation, wider regulatory frameworks when assessing the meaning of competition on the merits. As such, it expands the range of benchmarks that may be considered to assess violations of competition law.41

In the context of our discussion, the Meta judgment can enable authorities to more easily move towards safeguarding democratic ideals without resorting to the ‘integrated model’, thus avoiding the complexities that it carries. This ‘external benchmark’ approach could enable authorities and claimants, when applying Article 102 TFEU, to rely on traditional theories of harm while informing their analysis by making reference to external benchmarks from other laws and regulations aimed at protecting democratic freedoms and processes.

Of course, an abuse of a dominant position does not require the dominant company’s conduct to be unlawful under another legal regime. Quite to the contrary, it may very well be that a dominant company’s conduct is lawful under another legal regime but that, based on the special responsibility that a position of market power confers on dominant undertakings,42 the conduct constitutes an infringement of Article 102 TFEU. In AstraZeneca, for instance, the Court of Justice found conduct that was lawful under patent law to be an abuse of dominance.43 But, as AG Rantos stated in Meta:

the compliance or non-compliance of [a dominant company’s] conduct with the provisions of the GDPR, not taken in isolation but considering all the circumstances of the case, may be a vital clue as to whether that conduct entails resorting to methods prevailing under merit-based competition.44

Nevertheless, ‘conduct relating to data processing may breach competition rules even if it complies with the GDPR; conversely, unlawful conduct under the GDPR does not automatically mean that it breaches competition rules’.45

5. DEMOCRACY-ORIENTED EXTERNAL BENCHMARKS

The ‘external benchmark’ approach enhances the capacity to utilize competition law to safeguard democracy, without affecting its core analytical envelope. While it does not inject democracy as a substantive competition value, it does widen the scope of enforcement through the reliance on external benchmarks. This, in itself, is not without controversy, as it offers the enforcement agency flexibility that it may not have benefitted from before. Importantly, however, such an approach does not require competition enforcers to act as appraisers of democracy, nor does it bestow on them the power to weigh non-economic against economic values. Rather, the approach relies on the boundaries of legality that the legislators have set in other laws and enables competition enforcers to use those benchmarks to inform their competition assessment.

To illustrate the possible use of the ‘external benchmark’ approach, let us consider democracy-related provisions in two new European legal instruments: the Regulation on Transparency and Targeting of Political Advertising (RTPA) of 2024 and the Digital Services Act (DSA) of 2022.46

The Regulation on Targeted Political Advertising

Concerned about the undue influence of targeted political advertising on the state of democracy,47 the European Parliament and the Council adopted the RTPA in March 2024.48 In addition to its direct application, the RTPA can be utilized as part of the ‘external benchmark’ approach. To that end, it may offer ‘vital clues’ in cases in which targeted political advertising forms part of activities suspected of infringing competition law.49

The RTPA provides EU-wide rules on transparency and due diligence related to political advertising, as well as rules for political advertising that targets users based on their personal data.50 Ad-tech platforms and influencers are important actors in this ecosystem.51 Hypernudging through ‘manipulative’52 micro-targeted political advertisements and through ‘dynamically personalized data‐driven nudging’ of users53 is seen as a particular threat.54 With its rules, the RTPA aims to facilitate open and fair political debate as well as free and fair elections.55

The RTPA establishes that in the three months preceding an election or referendum—no matter at what level—political advertising56 services may only be provided to actors with close ties to the EU,57 with the intention of minimizing foreign election interference.

Chapter III of the RTPA contains rules on targeting58 and ad delivery of online political advertising. This section appears of particular relevance to possible democracy-related harm stemming from digital platforms. It limits targeted political advertising to instances in which the data controller59 collected personal data from a data subject who explicitly and separately consented to their data being processed for political advertising.60 Even in the case of such consent, however, profiling61 may not rely on data that reveal racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership. Genetic data, biometric data, data concerning health, or data concerning a natural person’s sex life or sexual orientation may not be relied upon for targeted political advertising, either.62 This prevents targeted political ads that are based on these special categories of personal data, thereby seeking to counter the echo chamber63 effects of many digital platforms and to ensure a broader political debate.

The Digital Services Act

The DSA identifies risks related to ‘negative effects on democratic processes, civic discourse and electoral processes’64 coming from disinformation, manipulation, and abusive conduct in the online environment. It aims at ‘greater democratic control and oversight over systemic platforms’ as well as intending to ‘mitigat[e] systemic risks, such as manipulation or disinformation’.65 Several of the DSA’s substantive rules relate to democratic values, and these can broadly be categorized as belonging to content moderation, targeted advertisements, and identification of misinformation and illegal content.

Online intermediaries must publish information on how they carry out content moderation and handle complaints.66 At least once a year, they also need to make available detailed reports on their content moderation.67 This is meant to ensure digital platforms’ accountability concerning the spreading of disinformation and misinformation online.

Providers of hosting services need to allow users to flag content that they believe to be illegal.68 User accounts that ‘frequently provide manifestly illegal content’ are to be suspended.69

Concerning online advertisements, the DSA foresees certain transparency obligations.70 Very large online platforms71 must carry out a risk assessment related to:

  • the dissemination of illegal content through their services;

  • any … negative effects for the exercise of fundamental rights, in particular … to freedom of expression and information, including the freedom and pluralism of the media …;

  • any … negative effects on civic discourse and electoral processes ….72

Where risks are identified, the platform must introduce mitigation measures73 and, if it is of a certain size, a compliance function.74

Targeted advertisements may not be based on profiling75 that uses ‘special categories of personal data’76 – that is, data revealing users’ racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership. Genetic data, biometric data, data concerning health, or data concerning a natural person’s sex life or sexual orientation may also not be relied upon for targeted advertising.77

Breaching democracy-preserving rules as anti-competitive harm

Under what circumstances could it be outside of ‘normal competition’ to breach the provisions in the RTPA or the DSA aimed at preserving democracy? When would manipulation, distortion, or targeted political advertising that does not stay within the limits set by these instruments be such that ignoring these rules in the competitive assessment would, in the words of the Court, ‘be liable to undermine the effectiveness of competition law within the European Union’?78

Generalizations are difficult, and the Court has repeatedly and rightly emphasized that it is the concrete economic and legal circumstances of a case that matter. In order to find a breach of Article 102 TFEU, competition must be affected. Typically, we distinguish exclusionary and exploitative abuses. In the case of an exploitative abuse through, for example, targeted political advertisements, one needs to establish who is being exploited and how, as well as whether such exploitation comes within the sphere of application of Article 102 TFEU.

It is conceivable that two groups may be exploited in such a case: the data subjects whose personal data is being misused, and the users/voters targeted by political advertisements.79 The exploitation could occur through the targeting of political advertising based on specially protected categories of personal data that have been barred as a basis for targeted political advertisements.

How can such exploitation come within the sphere of Article 102 TFEU? Akman has argued that for there to be anti-competitive exploitation, there must be harm to competition—which in turn requires an element of exclusion.80 But what about direct harm to consumers, as also occurs in the well-established exploitative abuse of unfair prices?81 And what about reductions in quality? If a reduction in quality amounts to harm, then a reduction in privacy82 or a reduction in the quality of online debates could well fall into this category. It is arguable that just as the excessive collection of data can, under certain circumstances, amount to an abuse of dominance as per the recent Meta judgment, so could the illegal use of certain categories of personal data for targeted political advertisements satisfy this threshold. One may find support for this assertion in Google Android, where the General Court emphasized that Google’s abusive conduct was ‘detrimental to the interest of consumers in having more than one source for obtaining information on the internet’.83 The General Court linked that interest to an interest in privacy or the provision of certain linguistic features. ‘Such interests’, it reasoned, ‘were not only consistent with competition on the merits, … but were also necessary in order to ensure plurality in a democratic society’.84 This assertion could support the utilization of the ‘external benchmark’ approach as a vehicle for protecting democracy-related interests.

Elsewhere, the General Court has emphasized that not only do dominant undertakings have a special responsibility to act within competition on the merits, but their responsibility may also be enhanced where digital platforms are ‘superdominant’.85 Arguably, this logic would also apply to actions that harm the democratic process and offer further justification for the use of the ‘external benchmark’ approach.

Cooperation with competent enforcement authorities: practical considerations

To complete our discussion of the ‘external benchmark’ approach, we also briefly note the cooperation requirements stipulated in the Meta judgment. Should competition authorities consider relying on breaches of the RTPA or the DSA as indicators of anti-competitive harm, then they are required to cooperate with the authorities enforcing such ‘other regulation’. Both the RTPA and the DSA foresee that the EU Member States designate competent enforcement authorities. Cooperation with these authorities can also ensure that competition authorities do not fall foul of the ne bis in idem principle.86 In the case of the RTPA, the enforcement regime very much depends on the rules that are to be enforced. The rules related to targeted political advertisements (Articles 18 and 19) are enforced by the same supervisory authorities that are responsible for ensuring adherence to the GDPR. They may be assisted by the European Data Protection Board (EDPB), which can publish guidelines for assessing compliance.87 For the other rules contained in the RTPA, the Member States need to designate competent authorities.88 Effective cooperation among different competent authorities must be ensured.89 Enforcement of the DSA happens at both the national and European levels, with the European Commission as the main enforcer of obligations for very large online platforms.90

6. CONCLUSION

Predominant value chains that drive the digital platform economy and innovation dynamics may place a significant strain on key foundations of democratic societies. Their ripple effects may limit genuine and free debates, distort information flows, perception, and decision-making, while targeting and exploiting voters’ vulnerabilities for political gain or profit.91

Competition law cannot save democracy on its own. But it can help support and protect the main pillars of the democratic process. As we have illustrated, the two endpoints each carry certain limitations. The ‘competition dynamic’ approach expects a more democratic outcome to arise from stringent competition enforcement and is of great value, but it may fail to overcome challenges associated with specific activities that do not easily translate into theories of harm. Given current dynamics in the digital economy, today’s core competition law theories of harm may fail to capture some of its anti-democratic effects, resulting in a feeble impact. In contrast, the fully ‘integrated’ approach offers the opportunity to develop new theories of harm based directly on democratic values. It could enable the integration of democracy as a value protected alongside consumer well-being and consumer welfare, but such an ‘integrated’ approach risks undermining the analytical integrity of competition law.

In the face of these shortcomings and pitfalls, the Court of Justice may have provided us with a middle ground in the EU: the ‘external benchmark’ approach. This approach neither turns competition authorities into election sheriffs for digital platforms nor sits back and lets competition (law) work its magic. Instead, it relies on legislation that explicitly targets anti-democratic conduct and imports it into the competition analysis. We have identified the RTPA and the DSA as possible external benchmarks, with the recent European Media Freedom Act92 presenting a further option. Evidently, the ‘external benchmark’ approach is not free of obstacles, including uncertainties associated with the choice of external benchmarks and the weight attributed to them in the competition analysis. Still, this approach comes with fewer hurdles than a fully integrated approach. Importantly, it offers competition enforcers a valuable benchmark against which to assess possible distortions, providing a workable compromise between an overly reticent and an overly integrated approach to safeguarding democracy in the digital sphere. Furthermore, it retains an attribute that many regulatory regimes lack: the flexibility to evolve with changing market realities.

Footnotes

1

Ariel Ezrachi and Viktoria HSE Robertson, ‘Competition, Market Power and Third-Party Tracking’ (2019) 42 W Comp 5.

2

Ariel Ezrachi and Maurice E Stucke, How Big-Tech Barons Smash Innovation―and How to Strike Back (HarperCollins 2022).

3

Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act, DSA) [2022] OJ L277/1; Regulation (EU) 2024/900 of the European Parliament and of the Council of 13 March 2024 on the transparency and targeting of political advertising (RTPA) [2024] OJ L900. In the USA, two States took a different direction: Florida and Texas introduced laws prohibiting social media companies from removing political content or from banning candidates running for office, leading to cases currently before the Supreme Court; see Texas House Bill 20 relating to censorship of or certain other interference with digital expression, including expression on social media platforms or through electronic mail messages (9 September 2021); Florida Senate Bill 7072 relating to social media platforms (24 May 2021); David McCabe, ‘What to Know About the Supreme Court Arguments on Social Media Laws’ New York Times (New York, 26 February 2024) <https://www.nytimes.com/2024/02/25/technology/free-speech-social-media-laws.html> accessed 14 August 2024; Moody v NetChoice, Docket No 22-277; NetChoice v Paxton, Docket No 22-555.

4

Case C‑252/21 Meta Platforms v Bundeskartellamt ECLI:EU:C:2023:537.

5

See Koh Ewe, ‘The Ultimate Election Year: All the Elections Around the World in 2024’ Time Magazine (New York, 28 December 2023) <https://time.com/6550920/world-elections-2024/> accessed 14 August 2024.

6

See, eg, Organization for Security and Co-operation in Europe (OSCE), ‘Elections’ <https://www.osce.org/odihr/elections> accessed 24 February 2024.

7

John Naughton, A Brief History of the Future: Origins of the Internet (Diane Publishing 1999); Konstantinos Komaitis, ‘The Democratic Nature of the Internet’s Infrastructure’ (Discussion Paper 3/2023) <https://www.idea.int/sites/default/files/publications/democratic-nature-of-the-internets-infrastructure.pdf> accessed 14 August 2024.

8

Ariel Ezrachi and Maurice E Stucke, Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy (HUP 2016); Ezrachi and Stucke (n 2); Maurice E Stucke and Ariel Ezrachi, ‘How Digital Assistants Can Harm Our Economy, Privacy, and Democracy’ (2017) 32 Berkeley Tech L J 1239.

9

On this terminology, see Maurice E Stucke and Ariel Ezrachi, Competition Overdose: How Free Market Mythology Transformed Us from Citizen Kings to Market Servants (HarperCollins 2020) ch 8.

10

ibid.

11

Amit Zac and others, ‘Dark Patterns and Online Consumer Vulnerability’ (22 August 2023) <https://ssrn.com/abstract=4547964> accessed 14 August 2024.

12

Ezrachi and Stucke (n 2) ch 7.

13

ibid, with references to Jeff Horwitz and Deepa Seetharaman, ‘Facebook Executives Shut Down Efforts to Make the Site Less Divisive’ Wall Street Journal (New York, 26 May 2020) <https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499> accessed 14 August 2024; Keach Hagey and Jeff Horwitz, ‘Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead’ Wall Street Journal (New York, 15 September 2021) <https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215?mod=article_inline> accessed 14 August 2024; Luke Darby, ‘Facebook Knows It’s Engineered to “Exploit the Human Brain’s Attraction to Divisiveness”. Like Other Platforms, Facebook Is a Hotbed for Violent Extremism and It Doesn’t Seem to Care’ GQ (New York, 27 May 2020) <https://www.gq.com/story/facebook-spare-the-share> accessed 14 August 2024.

14

Full citation available in Ezrachi and Stucke (n 2). Video: Interview with Chamath Palihapitiya, Founder and CEO of Social Capital, ‘Money as an Instrument of Change’ (13 November 2017) <https://www.youtube.com/watch?v=PMotykw0SIk&t=1281s> at minute 21, accessed 14 August 2024.

15

Karen Hao, ‘YouTube is Experimenting with Ways to Make its Algorithm even More Addictive’ MIT Technology Review (Boston, 27 September 2019) <https://www.technologyreview.com/2019/09/27/132829/youtube-algorithm-gets-more-addictive/> accessed 14 August 2024; Jonas Kaiser and Adrian Rauchfleisch, ‘Unite the Right? How YouTube’s Recommendation Algorithm Connects the U.S. Far-Right’ Medium (11 April 2018) <https://medium.com/@MediaManipulation/unite-the-right-how-youtubes-recommendation-algorithm-connects-the-u-s-far-right-9f1387ccfabd> accessed 14 August 2024.

16

Channel 4 News, ‘Cambridge Analytica Uncovered: Secret Filming Reveals Election Tricks’ <https://www.youtube.com/watch?v=mpbeOCKZFfQ> accessed 14 August 2024.

17

Craig Timberg, ‘New Whistleblower Claims Facebook Allowed Hate, Illegal Activity to Go Unchecked’ Washington Post (Washington, DC, 22 October 2021) <https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblower-complaint/> accessed 14 August 2024.

18

Eg, see Jan AGM van Dijk and Kenneth L Hacker, Internet and Democracy in the Network Society (Routledge 2018); Martin Moore, Democracy Hacked: How Technology Is Destabilising Global Politics (Oneworld 2019); Siva Vaidhyanathan, Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (OUP 2018); Didier Bigo, Engin Isin and Evelyn Ruppert (eds), Data Politics: Worlds, Subjects, Rights (Routledge 2019); Kris Shaffer, Data Versus Democracy: How Big Data Algorithms Shape Opinions and Alter the Course of History (Apress 2019); Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (Profile Books 2019); Nathaniel Persily and Joshua A Tucker (eds), Social Media and Democracy (CUP 2020).

19

European Commission, ‘A Dynamic and Workable Effects-Based Approach to Abuse of Dominance’ Competition Policy Brief (Brussels, March 2023) <https://competition-policy.ec.europa.eu/system/files/2023-03/kdak23001enn_competition_policy_brief_1_2023_Article102_0.pdf> accessed 14 August 2024.

20

Eg, see Daniel A Crane, ‘Antitrust as an Instrument of Democracy’ (2022) 72 Duke L J Online 21, 23; Spencer W Waller, ‘Discretionary Justice in Antitrust’ (2024) J Antitrust Enforc <https://doi.org/10.1093/jaenfo/jnae035> accessed 14 August 2024.

21

Recounting this debate, see Viktoria HSE Robertson, ‘Antitrust, Big Tech, and Democracy: A Research Agenda’ (2022) 67 Antitrust Bull 259; Viktoria HSE Robertson, ‘Demokratiedefizite auf digitalen Märkten aus kartellrechtlicher Sicht’ in Matthias Wendland, Iris Eisenberger, and Rainer Niemann (eds), Smart Regulation: Theorie- und evidenzbasierte Politik (Mohr Siebeck 2023) 127.

22

Ariel Ezrachi, ‘Sponge’ (2017) 5 J Antitrust Enforc 49.

23

Ariel Ezrachi, ‘EU Competition Law Goals and the Digital Economy’ Oxford Legal Studies Research Paper No 17/2018 (6 June 2018) <https://ssrn.com/abstract=3191766> accessed 14 August 2024. Where a merger may impact media pluralism, the EU already foresees that this aspect of the merger can be separately assessed by the competent national competition authority; art 21, para 4 of Council Regulation (EC) No 139/2004 of 20 January 2004 on the control of concentrations between undertakings (EUMR) [2004] OJ L24/1.

24

Treaty on European Union (TEU) [2016] OJ C202/15.

25

Luke D Spieker, ‘Defending Union Values in Judicial Proceedings. On How to Turn Article 2 TEU into a Judicially Applicable Provision’ in Armin von Bogdandy and others (eds), Defending Checks and Balances in EU Member States: Taking Stock of Europe’s Actions (Springer 2021) 237; Luke D Spieker, EU Values Before the Court of Justice: Foundations, Potential, Risks (OUP 2023).

26

Joined Cases 6 and 7/73 Commercial Solvents v Commission ECLI:EU:C:1974:18, para 32.

27

European Commission, ‘Le problème de la concentration dans le marché commun’ Série concurrence no 3 (Brussels 1966) 25, para 25 (‘Il y a exploitation abusive lorsque le comportement de l’entreprise constitue objectivement un comportement fautif au regard des objectifs fixés par le Traité.’ [‘There is abusive exploitation where the undertaking’s conduct objectively constitutes wrongful conduct in the light of the objectives set by the Treaty.’]).

28

Note that the Treaty does not make reference to the concept of consumer welfare, but rather is concerned with the protection of consumer well-being. On the notion of well-being, see, eg, Cases T-213/01 and T-214/01 Österreichische Postsparkasse and Bank für Arbeit und Wirtschaft v Commission ECLI:EU:T:2006:151, para 115.

29

Case C-252/21 Meta Platforms v Bundeskartellamt ECLI:EU:C:2023:537.

30

Bundeskartellamt, Facebook (B6-22/16, 6 February 2019). On the theory of harm in that case, see already Viktoria HSE Robertson, ‘The Theory of Harm in the Bundeskartellamt’s Facebook Decision’ CPI EU News (March 2019) <https://www.competitionpolicyinternational.com/wp-content/uploads/2019/03/EU-News-Column-March-2019-Full-1.pdf> accessed 14 August 2024.

31

Regulation (EU) 2016/679 of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation, GDPR) [2016] OJ L119/1.

32

On this, see Viktoria HSE Robertson, ‘Excessive Data Collection: Privacy Considerations and Abuse of Dominance in the Era of Big Data’ (2020) 57 CML Rev 161.

33

Case C‑252/21 (n 4).

34

ibid, para 35. The remaining questions focused on GDPR-related questions such as consent. On this case, see Ana-Maria Hriscu, ‘C-252/21 Meta v Bundeskartellamt: The Lawfulness of Big Tech’s Processing of Personal Data and the Relationship Between Data Protection and Competition Law’ (2023) 9 Eur Data Prot Law Rev 371. Arguing that the case would have invited the CJEU to pronounce itself on the interplay of national and EU competition law, see Or Brook and Magali Eben, ‘Another Missed Opportunity? Case C-252/21 Meta Platforms v. Bundeskartellamt and the Relationship between EU Competition Law and National Laws’ (2024) 15 J Eur Comp Law Pract 25.

35

Opinion of AG Rantos in Case C‑252/21 Meta Platforms v Bundeskartellamt ECLI:EU:C:2022:704, para 23. On this opinion, see also Markus Meyer, ‘EuGH: Wettbewerbsbehörden dürfen Datenschutzrecht prüfen’ (2022) 38 Computer und Recht R124.

36

Case C‑252/21 (n 4) para 47.

37

ibid, para 48 (emphasis added).

38

ibid, para 51.

39

ibid, para 54.

40

Eg, in AstraZeneca, the Court considered misleading representations to the patent office, and in Allianz Hungária, it took note of the illegality of an agreement under national legislation concerning insurance brokers. See Case C-457/10 P AstraZeneca v Commission ECLI:EU:C:2012:770; Case C-32/11 Allianz Hungária ECLI:EU:C:2013:160.

41

Peter J van de Waerdt, ‘Meta v Bundeskartellamt: Something Old, Something New’ (2023) 8 European Papers 1077, 1077.

42

See, eg, Case 322/81 Michelin v Commission ECLI:EU:C:1983:313, para 57.

43

The Court held that ‘in the majority of cases, abuses of dominant positions consist of behaviour which is otherwise lawful under branches of law other than competition law’; Case C-457/10 P AstraZeneca v Commission ECLI:EU:C:2012:770, para 132.

44

Opinion of AG Rantos in Case C‑252/21 Meta Platforms v Bundeskartellamt ECLI:EU:C:2022:704, para 23.

45

ibid, fn 18.

46

Suggesting that, following Meta, the DSA may be incorporated into the data-related assessment of an abuse of dominance, see already Anne C Witt, ‘Meta v Bundeskartellamt—data-based conduct between antitrust law and regulation’ (2024) 12 J Antitrust Enforc 345.

47

For the European Commission’s wider concerns regarding the state of democracy in the EU, see European Commission, ‘European Democracy Action Plan’ COM (2020) 790 final.

48

Regulation (EU) 2024/900 of the European Parliament and of the Council of 13 March 2024 on the transparency and targeting of political advertising (RTPA) [2024] OJ L900.

49

Case C‑252/21 (n 4) para 47.

50

art 1 RTPA.

51

Preamble para 1 RTPA.

52

Preamble para 78 RTPA.

53

Viktorija Morozovaite, ‘Hypernudging in the Changing European Regulatory Landscape for Digital Markets’ (2023) 15 Pol Internet 78, 79.

54

Preamble para 78 RTPA.

55

Preamble para 4 RTPA.

56

Political advertising is ‘the preparation, placement, promotion, publication, delivery or dissemination, by any means, of a message […] by, for or on behalf of a political actor [or a message] which is liable and designed to influence the outcome of an election or referendum, a voting behaviour or a legislative or regulatory process, at Union, national, regional or local level’; art 3 para 2 RTPA.

57

art 5 para 2 RTPA. This includes citizens of the EU and legal persons established in the EU that are not ultimately owned or controlled by a third-country national.

58

Political advertising is targeted when it only addresses ‘a specific person or group of persons, or […] exclude[s] them on the basis of the processing of personal data’; art 3 para 11 RTPA.

59

art 4 para 7 GDPR.

60

art 18 para 1 lit b RTPA.

61

As per art 4 para 4 GDPR, ‘“profiling” means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements’.

62

art 18 para 1 RTPA.

63

For a definition, see Kathleen Hall Jamieson and Joseph N Cappella, Echo Chamber: Rush Limbaugh and the Conservative Media Establishment (OUP 2008) 76: ‘a bounded, enclosed media space that has the potential to both magnify the messages delivered within it and insulate them from rebuttal’.

64

Preamble para 82 of the DSA. See also Preamble para 104.

65

European Commission, ‘The Digital Services Act’ <https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en> accessed 14 August 2024.

66

art 14 DSA. It has been warned that as the DSA (and now also the RTPA) focus on relatively large digital platforms, this ‘leaves regulatory blind spots and creates dependencies as well as adverse incentive structures’ as regards societal harms done by digital platforms below these thresholds; see Johann Laux, Sandra Wachter, and Brent Mittelstadt, ‘Taming the Few: Platform Regulation, Independent Audits, and the Risks of Capture Created By the DMA and DSA’ (2021) 43 CLS Rev 105613, 3.

67

art 15 DSA.

68

art 16 DSA.

69

art 23 para 1 DSA.

70

art 26 DSA. These obligations are enhanced for very large platforms; art 39 DSA.

71

On this categorization, see art 33 DSA (requiring at least 45 million average monthly active service users; designation by the Commission is required before the obligations enter into force).

72

art 34 para 1 DSA.

73

art 35 DSA.

74

art 41 DSA.

75

art 4 para 4 GDPR.

76

art 9 para 1 GDPR.

77

art 26 para 3 DSA.

78

Case C‑252/21 (n 4) para 51.

79

While these groups will often overlap, it is also possible to infer certain political preferences and vulnerabilities from users/voters whose personal data has not been directly collected.

80

Pınar Akman, ‘The Role of Exploitation in Abuse under Article 82 EC’ (2009) 11 CYELS 165.

81

For the critique that ‘exploitative abuse was simply read into the law and remains undefined’, see Gregory J Werden, ‘Exploitative Abuse of a Dominant Position: A Bad Idea that Now Should be Abandoned’ (2021) 17 Euro CJ 682, 683. For an alternative view, see Marco Botta, ‘Sanctioning Unfair Pricing under Art. 102(a) TFEU: Yes, We Can!’ (2021) 17 Euro CJ 156.

82

On a reduction of privacy in the course of excessive data collection, see already Robertson, ‘Excessive Data Collection’ (n 32).

83

Case T-604/18 Google v Commission (Google Android) ECLI:EU:T:2022:541, para 1028.

84

ibid (emphasis added).

85

Case T-612/17 Google v Commission (Google Shopping) ECLI:EU:T:2021:763, paras 182–183.

86

Indeed, the Court has held that where competition law is applied in parallel to other rules, cooperation and coordination between the enforcers needs to ensure that the ne bis in idem rule is not breached; see Case C-117/20 bpost v Autorité belge de la concurrence ECLI:EU:C:2022:202, paras 51, 55.

87

art 22 para 1 RTPA.

88

art 22 paras 3, 4 RTPA. For ensuring compliance with the transparency-related provisions (arts 7–17 and 21 RTPA), the RTPA suggests that Member States designate the same authorities as are responsible for enforcing the DSA; art 22 para 3 RTPA. For the remaining rules, it suggests that they may be the authorities designated under the Audiovisual Media Services Directive; see art 22 para 4 RTPA; art 30 of Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation, or administrative action in Member States concerning the provision of audiovisual media services [2010] OJ L95/1 as amended.

89

art 22 para 7 RTPA.

90

arts 49, 56 DSA.

91

Ezrachi and Stucke (n 2).

92

Regulation (EU) 2024/1083 of the European Parliament and of the Council of 11 April 2024 establishing a common framework for media services in the internal market (European Media Freedom Act) [2024] OJ L1083.

This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs licence (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial reproduction and distribution of the work, in any medium, provided the original work is not altered or transformed in any way, and that the work is properly cited. For commercial re-use, please contact [email protected]