1. Introduction
Online platforms of significant influenceFootnote 1 have become central forums for the dissemination of information and the exercise of freedom of expression. Their capacity to amplify or suppress content and structure visibility has positioned them as powerful gatekeepers of the digital public sphere. In response, European policymakers have sought to regulate platforms of significant influence in order to address illegal and harmful content while safeguarding fundamental rights. These efforts reflect a broader shift from liability exemptions and indirect obligations to direct delegation of regulatory functions to private platforms.
Within this regulatory evolution, so-called ‘must-carry’ obligations have emerged as a distinctive tool. Borrowed from broadcasting law, the term now refers to measures that restrict platforms’ discretion in moderating lawful content. Such obligations take diverse forms, from proposals to prioritise information of public interest in recommender systems to privileges shielding media organisations or politicians from suspension or deplatforming. Proponents frame these measures as necessary to protect pluralism, editorial independence, and democratic discourse. Critics warn, however, that they risk entrenching power imbalances, undermining platform autonomy, and creating a tiered system of expression.
This paper situates these developments within the broader European debate on platform regulation and freedom of expression. It argues that contemporary must-carry provisions are better understood as special treatment rules that privilege particular speakers. While designed to recalibrate the relationship between platforms, media providers, and political actors, such measures carry significant risks for equality of expression and democratic accountability.
The paper proceeds as follows. Section 2 traces the historical development of platform regulation in the EU, looking at the increased delegation of enforcement responsibilities. Section 3 analyses the EU framework, focusing on the rejected proposals in the Digital Services Act (DSA)Footnote 2 and the adoption of Article 18 of the European Media Freedom Act (EMFA).Footnote 3 Section 4 turns to national approaches in Germany, the UK, and Poland, supplemented by relevant case law. Section 5 compares the new measures with must-carry obligations in broadcasting and evaluates these measures as special treatment rules for media and political actors, assessing their implications for freedom of expression and democratic accountability. Section 6 concludes by arguing that sustainable solutions lie not in compelled inclusion but in systemic safeguards, including financial support for journalism, innovation in local media, and user empowerment.
2. The emergence of new ‘must-carry’ obligations
Over the past decade, online platforms have become crucial fora for the dissemination of and access to information and digital content, as well as for individual expression.Footnote 4 At the same time, platforms have also become points of control, owing to their ability to affect user behaviour. Examples include eliminating or disabling access to their service or to the information they host, reducing or increasing the visibility of content, or even assisting third-party enforcement by identifying wrongdoers on their services. It is therefore unsurprising that platforms have become the centre of policy discussions on how the law can and should regulate illegal and harmful activities online, within the broad area of platform regulation. In particular, States are interested in using the technical and human resources of platforms to regulate illegal content online, as well as information that, although legal, is considered harmful – for example mis/disinformation.Footnote 5 States attempt to do so by assigning some of their own traditional policing functions to platforms and thus privatising the enforcement of public policies on speech. This legislative trend towards co-opting platforms to regulate expression online initially took place through indirect means, namely conditional liability exemptions or ‘safe-harbours’ for certain intermediary services, such as mere conduit, caching and hosting, first provided for in the e-Commerce Directive.Footnote 6 Currently, as we discuss below, there is a noticeable shift towards direct delegation mechanisms.
In Europe, this trend is visible at multiple levels. At the EU level, it is motivated by arguments on harmonising the digital single market, tackling illegal content online and enhancing the responsibility of platforms.Footnote 7 The EU legislature has been steadily churning out horizontal and sector-specific instruments that explicitly delegate monitoring, assessing, and sanctioning powers to the platforms. Examples include, in chronological order, the Code of Conduct on Hate Speech Online, the Copyright in the Digital Single Market Directive (CDSMD), the amended Audiovisual Media Services Directive (AVMSD), and the Terrorist Content Regulation.Footnote 8 At national level, sometimes predating and spurring EU intervention, relevant examples of proposed and enacted legislation include the German Network Enforcement Act (NetzDG), the Austrian Anti-Hate Speech Law (KoPlG), the French Avia Law, and the UK Online Safety Act.Footnote 9
Broadly speaking, these legal instruments and initiatives share common ground insofar as they use similar mechanisms to enhance the ‘responsibility’ of platforms.Footnote 10 First, they attempt to further detail and increase the exposure of platforms to (direct) liability and obligations to deploy enforcement measures for the third-party content they host. Second, they impose additional obligations on the platforms themselves regarding the effective moderation of the content they host and (the behaviour of) their users, as well as the design and functioning of their systems.Footnote 11 Overall, these efforts require platforms to take up additional enforcement duties due to their increasingly prominent role in online communications. Calls for enhanced responsibility of platforms point also to the political, societal, and even moral responsibility that accompanies such a role.Footnote 12
But the delegation of enforcement measures to private entities raises several concerns.Footnote 13 First, private platforms are not competent to make decisions on fundamental rights, a role traditionally assigned to the judiciary. Second, platforms do not carry out a proper balancing of the competing rights at stake. As a result, this privatisation of enforcement duties risks adversely affecting the exercise of fundamental rights of Internet users. It may, in particular, lead to undue restrictions on freedom of expression through private censorship.Footnote 14
In response to mounting criticism, European policymakers began introducing measures and safeguards to allow for more effective protection of users’ right to freedom of expression, or at least to mitigate the negative effects of the obligations imposed on platforms. These countermeasures come in different shapes and forms. In some cases, they can be viewed as natural developments of existing duties and obligations in the area of intermediary liability, updated to the specific legislation or subject matter at hand. This is particularly noticeable in the crown jewel of European platform regulation, the DSA, which amends and builds upon the 2000 e-Commerce Directive.
Illustrations of such countermeasures include the recurring prohibition on the imposition of general monitoring obligations, specifications of notice-and-action procedures, in-platform redress mechanisms, and out-of-court redress mechanisms, to name a few.Footnote 15 These new instruments also adopt a proportionality-based approach, manifested in the tailoring of obligations to the size, influence or reach of platforms, in order to mitigate overzealous enforcement.Footnote 16
Some of the countermeasures, however, are harder to conceptualise. In particular, we observe a new class of rules that are sometimes loosely referred to as ‘must-carry’ obligations.Footnote 17 This umbrella term covers a variety of rules, including: prohibitions on moderating (or outright removing) content originating from predefined sources (eg, elected officials); put-back orders by politically appointed councils; obligations to reinstate removed content that is subsequently shown to be lawful; and obligations to preserve or prioritise content for public interest reasons.Footnote 18
Some of the proposed measures have laudable aims, such as avoiding undue restrictions on content of public interest or strengthening the position of legacy media in the online environment. Others, however, present serious risks, such as facilitating State-controlled narratives (or even propaganda).Footnote 19 In any case, these measures or obligations give rise to important questions from the perspective of fundamental rights. For instance, does their introduction mean that certain types of legal content must always be ‘carried’ by online platforms, even against their will? Do some of these obligations create a right to a forum on private property? To what extent are these obligations analogous to the must-carry obligation known from traditional broadcasting regulation? More fundamentally, what are the freedom of expression implications of these obligations?
Historically and legally, the concept of ‘must-carry’ refers to the obligation imposed on transmission services to make certain channels, which serve general interest objectives, available to the public. The central aim of such an obligation is to guarantee access to public service broadcasting and ensure a diverse choice of programmes in order to effectively protect the public’s right to freedom of expression and access to information. However, for private entities subject to the obligation, this may amount to a limitation on their own right to freedom of expression, as they are forced to carry content that they would otherwise not be interested in carrying. Or, as US scholars would point out, a case of ‘compelled speech’.Footnote 20 While broadcasting regulation and the discussed measures for platforms are obviously distinct regimes, the underlying question about the impact of the obligations on the right to freedom of expression of the involved stakeholders looks and feels familiar. Given the abundance of such proposals in the area of platform regulation, as well as their potential risks for fundamental rights, the new wave of ‘must-carry’ obligations justifies closer scrutiny.
3. Mapping the wave of new ‘must-carry’ obligations in the EU
Crucial to the debate on platform regulation is the question of how to strike a balance between protecting society from illegal and harmful content, ensuring the effective exercise of the right to freedom of expression and access to information, and respecting the right to conduct business. Novel must-carry obligations are increasingly viewed by policymakers, at least in part, as an answer to that question. They are advanced as a contemporary policy response to platform power not only in Europe but also across the world, for instance in the US or Brazil.Footnote 21 This section sets out to understand the object and scope of novel must-carry obligations from a European perspective. For this purpose, it discusses different manifestations of these obligations in EU law, European legislative proposals and judicial decisions. Given the broad and loose definition of the ‘must-carry’ term in this context, it is possible to identify several proposals and existing rules at European level in the DSA and in the EMFA. These instruments are examined here on the grounds that they are both relevant (from a policy standpoint) and sufficiently illustrative of the scope and characteristics of this type of rules. We follow chronological order, as it best reflects the development of the arguments and revisions that shaped the ultimate wording of what we view as the central legal provision in our analysis: Article 18 EMFA.
A. Digital Services Act: must-carry proposals in the legislative process
The DSA covers, among other things, how online platforms may carry out moderation of content posted by their users.Footnote 22 The main aim of the DSA is to create a safer digital space and to enhance the protection of fundamental rights online.Footnote 23 The DSA does not contain rules on content per se. The definition of what is illegal, for instance what constitutes defamation or hate speech, is left to the discretion of the individual EU Member States.Footnote 24 Instead, the DSA contains rules on the liability of providers of intermediary services and separate due diligence obligations that they must abide by. In doing so, it sets out detailed procedures to regulate the content moderation practices of online platforms, including with the aim of addressing the problem of private censorship and over-removal of content.Footnote 25
This is in line with the DSA’s core aim of securing remedies for users whose expression is restricted by platforms.Footnote 26 For this purpose, the DSA provides three routes of redress:Footnote 27 an internal complaint and redress system in Article 20; a non-binding out-of-court dispute settlement mechanism for users and notice-submitters, functioning as an alternative or second instance to complaints; and (non-waivable) judicial remedies under national laws.Footnote 28 Although these mechanisms enhance the right to an effective remedy, they stop significantly short of imposing ‘must-carry’ duties. Platforms retain discretion to exclude lawful but unwanted content, subject to Article 14’s requirement of due regard for fundamental rights. The DSA thus constrains moderation practices without obliging platforms to host all legal speech, rejecting stronger must-carry proposals considered during the legislative process.
The DSA does not contain, strictly speaking, a must-carry obligation. However, the legislative process featured a number of proposals that explicitly or implicitly attempted to impose such obligations. For the most part, these proposals were advanced by different European Parliament committees.Footnote 29 Although none of the proposals made it to the final text of the DSA, they are nevertheless illustrative of the type of novel must-carry obligations under consideration in platform regulation discussions and have the potential to resurface in future legislative initiatives (as, in fact, they did in the EMFA). As such, they merit a closer look.
Must-carry of public interest information for recommender systems
One example is found in the context of regulating recommender systems,Footnote 30 where the DSA imposes a series of obligations on very large online platforms (VLOPs), such as regarding the design of these systems and the way in which users can interact with them. During the legislative process, the IMCO Committee considered that these rules should be amended to ‘further strengthen the empowerment of consumers’.Footnote 31 The centrepiece of the various amendments proposed to achieve this aim was a provision which would introduce, inter alia, ‘a ‘must-carry’ obligation to ensure that information of public interest is high-ranked in the platforms [sic] algorithms’.Footnote 32 According to its proponents, such an obligation ‘should ensure that recommender systems display information from trustworthy sources, such as public authorities or scientific sources as first result following search queries in areas of public interest’.Footnote 33 The goal was to prevent ‘platforms from nudging users into the direction of unscientific propaganda, abusive content or conspiracy theories in order to keep them active on the platform (dark patterns)’.Footnote 34
Prohibition on moderating content from recognised media service providers
In another example, the CULT Committee advanced a text for a must-carry obligation to the benefit of ‘recognised media service providers’, as defined in the AVMSD.Footnote 35 In particular, the proposal targeted VLOPs and stated that they must ensure that their content moderation practices, tools and terms and conditions are ‘applied and enforced in such a way as to prohibit any removal, suspension, disabling access to or otherwise interference with content and services from the account of a recognised media service provider’.Footnote 36 This ‘media exemption’ from content moderation was justified concisely by the need to protect editorial independence in the media sector. For this purpose, it was considered that ‘commercial online platforms should not be allowed to exercise a supervisory function over legally distributed online content originating from service providers who exercise editorial responsibility and consistently adhere to Union and national law as well as journalistic and editorial principles’.Footnote 37 Furthermore, media service providers should ‘remain solely responsible for the content and services they produce’, since ‘platforms cannot be held either responsible or liable for the content offered by media service providers on their platforms’.Footnote 38 As we shall see, a watered-down version of this proposal resurfaced in the EMFA, as a rule requiring special treatment for media service providers.Footnote 39
The ‘Trump amendment’
Perhaps the most prominent example of a must-carry proposal during the DSA drafting process was the so-called ‘Trump Amendment’. The amendment was advanced after several social media platforms suspended the account of the then-US President Donald Trump, based on violations of their terms and conditions.Footnote 40 The event, sometimes referred to as ‘the great deplatforming’,Footnote 41 has given rise to considerable discussion on the power of platforms to curtail the speech of elected politicians. For that reason, we examine it in greater detail here.
The proposal was advanced by the IMCO Committee as an exception to the measures against misuse of the platform services by users who frequently provide manifestly illegal content, or submit notices or complaints that are manifestly unfounded. These measures include temporary suspension of a user’s account or of the processing of a user’s notices and complaints.
The amendment sought to limit the margin of discretion of platforms when applying the measures in relation to users that are ‘of public interest’. The only example provided in the justificatory note was that of politicians, there being no indication that this concept would also apply to other accounts that might otherwise be considered of public interest, such as journalists, human rights activists or whistleblowers.
For the covered ‘public interest’ accounts, the harsher measure of temporary account suspension ‘must receive the approval of the relevant judicial authority’.Footnote 42 The stated justification was the need to ‘ensure that accounts of public interest, eg, of politicians, are not suspended on the basis of the platform’s decision alone’.Footnote 43
Crucially, this amendment would apply solely to one moderation measure: the temporary suspension of a user account, or ‘de-platforming’. It was not meant to restrict any other content-level measures, such as the removal or blocking of specific posts by the user. In theory, platforms could therefore still have taken such measures against specific posts without additional judicial approval. The amendment also did not address the possibility of permanent (or at least not time-limited) bans, which was the action taken by several platforms in relation to Trump at the time.
B. Special treatment of media in the European Media Freedom Act
The idea of introducing special treatment of media providers was not included in the final version of the DSA. It reappeared, however, in the recently adopted EMFA. The goal of the EMFA is to establish a common framework for media services in the internal market and, in particular, to protect media pluralism and independence in the EU. To this end, the Regulation provides safeguards against political interference in editorial decisions and against surveillance of journalists. It addresses the issues of the independence and stable funding of public service media, as well as the transparency of media ownership and of the allocation of state advertising. The EMFA, moreover, contains several rules addressing the provision of and access to media services in a digital environmentFootnote 44 and, in particular, the presence of media content on VLOPs. The provision on the special treatment of media service providers, although heavily criticised during the negotiation process, has prevailed and appears, in modified form, in the final text of the EMFA.Footnote 45
Article 18 EMFAFootnote 46 mandates that VLOPs provide a functionality for their users to declare their status as media service providers that are ‘editorially independent from Member States, political parties, third countries and entities controlled or financed by third countries’.Footnote 47 Next, those media service providers should declare that they are ‘subject to regulatory requirements for the exercise of editorial responsibility in one or more Member States and oversight by a competent national regulatory authority or body, or adhere[…] to a co-regulatory or self-regulatory mechanism governing editorial standards’. Media service providers should also declare that they ‘do not provide content generated by artificial intelligence systems without subjecting it to human review or editorial control’. In response to the filed declaration, VLOPs should indicate whether or not they accept the declaration.
Media services that fulfil the criteria and are accepted by VLOPs benefit from two main privileges. The first privilege applies when a VLOP intends to suspend the provision of its services to a media service provider or restrict the visibility of its contentFootnote 48 on the grounds that the content is incompatible with the VLOP’s terms and conditions. In that case, the VLOP must communicate a statement of reasons for the intended decision prior to the suspension or restriction taking effect.Footnote 49 With this prior notification, VLOPs should give the media service provider the possibility to reply within 24 hours, although the timeframe may be shorter in case of a crisis.Footnote 50 The content should stay available until the media organisation has been given time to respond.Footnote 51 Interestingly, the described procedure does not apply if the suspension or restriction of visibility is triggered by the VLOP’s obligations in relation to specific types of content. These include various types of systemic risks defined in the DSA, the protection of minors and of the general public under the AVMSD, or obligations relating to illegal content.Footnote 52 In other words, the scope of the privileged treatment is limited, as it would not apply to certain types of controversial content, such as (illegal) hate speech, incitement to violence, or racist speech.Footnote 53 The privilege may also not apply in relation to content considered as disinformation, in cases where systemic risks under the DSA are identified; we return to this point below.Footnote 54
The second privilege applies where a media service provider considers that a VLOP repeatedly restricts or suspends the provision of its services without sufficient grounds. In that case, the VLOP shall ‘engage in a meaningful and effective dialogue with the media service provider, upon its request, in good faith with a view to finding an amicable solution’ for terminating unjustified restrictions or suspensions and avoiding them in the future.Footnote 55 The media service provider may notify the European Board for Media Services (the ‘Board’) and the Commission about the outcome and the details of such exchanges. It may also request an opinion from the Board on the outcome of the dialogue, including recommended actions for the VLOP.Footnote 56 At the time of writing, the Commission is in the process of drafting guidelines to help protect media providers from unwarranted content removals by VLOPs, thereby further specifying the meaning and scope of Article 18 EMFA.Footnote 57
During the legislative process, the text of the provision that would ultimately become Article 18 was controversial from the start. As pointed out by Joan Barata, the definition of media services included in the Commission’s proposal was very narrow and focused on a traditional understanding of media (‘providing programmes or press publications’).Footnote 58 Because of this narrow framing, the protections introduced in the EMFA would exclude some forms of media and journalistic activity from the scope of application.Footnote 59 The proposed privilege, as a result, would protect content originating from a commercial or a public broadcaster but not necessarily from the personal accounts of journalists employed by them. It would also not apply to posts coming from a human rights organisation or a citizen journalist. This limited approach is not exactly up to date with recent trends promoting a broad understanding of media and journalism.Footnote 60 Rather, it follows the ‘old school’ of thought that only professional media actors deserve special treatment.Footnote 61 The European Parliament attempted to extend the definition to apply also to ‘standard and non-standard forms of employment’, clarifying in the preamble that it would include bloggers.Footnote 62 In the final agreement, bloggers are not mentioned. The definition of media services refers to the TFEU and focuses on any form of economic activity. Recital 9, however, explains further that ‘the definition of media service provider should cover a wide spectrum of professional media actors falling within the scope of this definition, including freelancers’.
Another point of criticism of the privilege in Article 18 EMFA refers to the condition of editorial independence. It is a crucial concept of media freedom and pluralism. But it is not defined by law and is subject to many variables.Footnote 63 For both public service media and commercial media, multiple complex indicators have to be taken into account to answer the question of independence.Footnote 64 The European Media Pluralism Monitor (MPM), conducted yearly, consistently shows that effective protection of editorial independence continues to be a source of major concern.Footnote 65 The 2024 MPM indicated that ‘the European media sphere is still significantly affected by high levels of political capture’, with seven countries considered high-risk regarding state interference and political independence.Footnote 66 Editorial independence is also extremely difficult to assess, despite indicators and tools such as the MPM. In many cases, no simple ‘yes’ or ‘no’ is possible; rather, the answer is found somewhere on a spectrum. It would be up to the VLOPs to make that assessment, based on the information provided in the self-declaration, ultimately legitimising their power over media providers. The same concern applies to the fact that VLOPs are to assess whether a media service adheres to the standards of editorial responsibility.
To address these concerns, the final text of the EMFA includes an additional safeguard, stating that, in cases of reasonable doubt, VLOPs will be able to confirm the provided information with the relevant national regulatory body or the relevant co- or self-regulatory mechanism. Such an addition should help to eliminate divergent assessments by different VLOPs. Still, some scholars argue persuasively that the overall design of Article 18 EMFA ‘risks structurally embedding platform firms as adjudicators of news legitimacy’.Footnote 67
The final version of the EMFA, moreover, clarifies the available avenues of recourse (eg, in case a VLOP rejects or invalidates a declaration or ignores a response) by directly listing the redress mechanisms available under the so-called Platform-to-Business Regulation (2019/1150) as well as the DSA.Footnote 68 The former instrument creates a procedural privilege for media services by giving them priority in complaint handling, thereby treating them as a special type of business user. The latter is meant for regular (non-business) users. Relying on the DSA may still be attractive for media providers or individual journalists, as it offers more remedies, such as the aforementioned out-of-court dispute settlement procedure. On its own, the DSA mechanism does not offer any preferential treatment, nor does it bump the complaint to the front of the appeals line, but together with the wording of Article 18(5) EMFA it could achieve this effect.Footnote 69
A final and broader critique of Article 18 EMFA is warranted. The provision ‘aims to re-establish a space for traditional media in the online world by providing specific guarantees for media content on digital platforms’.Footnote 70 However, its design appears ill-suited to the specific needs of local journalism. First, the privilege applies only to recognised media service providers. Smaller, less established, or emerging local outlets may find it difficult to qualify. This reflects what Tambini describes as the ‘paradox of media privilege’.Footnote 71 If the criteria are overly rigid or burdensome, many local journalists may fall outside the protected category. Second, Article 18 may reinforce disparities in resources. Even with procedural safeguards, local outlets often lack the legal, financial, or technical capacity to pursue litigation or to ensure compliance by VLOPs. Finally, such privileges do not aim to address the financial fragility of local journalism or its broader structural challenges. We return to this final point below when highlighting alternative approaches to regulating the relationship between media providers and platforms of significant influence.Footnote 72
4. New ‘must-carry’ developments at national level
This section examines the development of so-called must-carry rules for online platforms at national level in Europe. It looks first at the legislative developments in different European countries, namely: the German law prohibiting discrimination against content; the UK’s Online Safety Act; and a since-abandoned proposal for must-carry obligations in Poland (A). This is then followed by an exploration of national case law in this area, which has proved influential in shaping national debates, as well as the previously discussed EU rules and proposals (B).
A. National must-carry law
Germany’s prohibition on discrimination
The German Interstate Treaty on Media (Medienstaatsvertrag-MStV) updated the long-standing broadcasting regulation of the Interstate Treaty on Broadcasting and Telemedia.Footnote 73 Its goal is to boost media pluralism and safeguard diversity of information by focusing on visibility and discoverability of information.Footnote 74 In particular, the Treaty updates the rules targeting traditional broadcasting media, while also implementing the amended AVMSD, which is gradually expanding to other types of media services. Article 94 of the Treaty prohibits ‘media intermediaries’ from discriminating ‘against journalistic-editorial offers, the discernibility of which they have a particularly high influence on’. The term media intermediary refers to ‘any telemedia that also aggregates, selects, and generally presents third-party journalistic-editorial offers without combining them into an overall offer’.Footnote 75 It is a broad concept, which includes search engines, social media services, app portals, user-generated content portals, blogging portals and news aggregators.Footnote 76 According to Article 94, discrimination occurs where the criteria on visibilityFootnote 77 ‘are systematically deviated from, in favour or to the detriment of a specific offer, or if these criteria systematically and directly or indirectly impede offers unfairly’ for no objectively justified reason. If the provider of journalistic/editorial content believes that their content has been discriminated against, they can file a claim with their state broadcasting authority.Footnote 78
The provision refers to journalistic-editorial offers, which suggests content provided by traditional media; it is unclear if it would also apply to citizen journalists or bloggers. It creates, therefore, a special treatment rule for a category of content providers. The provision, however, does not prohibit moderation (eg, removal) of singular pieces of content. Rather, it refers to situations of systematic hampering of the whole offer, while disregarding the established criteria.
UK Online Safety Act
The long-debated UK Online Safety Act establishes a new regulatory framework to enhance the safety of internet use in the United Kingdom.Footnote 79 It imposes duties of care on providers of certain regulated services regarding illegal content and content harmful to children. At the same time, it includes provisions to prevent excessive moderation of media or content important to democratic society.Footnote 80
The Act exempts ‘news publisher content’ from platforms’ obligations, meaning platforms are not required to act on such content.Footnote 81 It also provides limited exemptions for unedited, full reproductions of news publisher content on user-to-user services.Footnote 82
Beyond this, the Act introduces a series of duties: to protect content of democratic importance (Section 17), to protect news publisher content (Section 18) and to protect journalistic content (Section 19). It is worth examining them in turn, as they contain features of so-called must-carry obligations.
Section 17 requires targeted servicesFootnote 83 to consider freedom of expression when moderating content ‘of democratic importance’ or acting against users who share it.Footnote 84 Measures must be proportionate to the provider’s size and capacity and applied equally to diverse political opinions.Footnote 85 Policies must be set out in the terms of service and enforced consistently for both news publishers and other users, provided the content is intended to contribute to political debate in the UK.Footnote 86
Section 18 was added in late 2022, after the Department for Digital, Culture, Media and Sport announced the introduction of a so-called ‘temporary must carry obligation’ for content coming from news publishers.Footnote 87 The intention was to additionally protect news publisher content that could be caught by a platform’s safety obligations.Footnote 88 The term ‘must-carry’ is not used explicitly in the Act. Instead, it is provided that if a platform was, in fact, going to take action against news publisher content or against a user that is a recognised news publisher, such platform would have to follow a prior-notification and a mandatory appeals procedure specified in Section 18(3).Footnote 89 In particular, that means the prior notification must specify the action that the provider is planning to take, give reasons for this action referencing relevant parts of their terms of service, explain how the importance of freedom of expression was taken into account, and indicate a time period for a response. However, a service provider may take down the content without following the described process if they reasonably consider that they could face criminal or civil liability for not acting, or if the content in question is of a type of listed offence or illegal content. If the action is taken, the recognised news publisher has to be notified and can request reversal of the action. If a recognised news publisher is banned from a service, ie, de-platformed, the provider may act in relation to other content of the same news publisher that is still present on the service, without following the indicated steps.
Finally, Section 19 states that the same attention to freedom of expression must be given when making specific moderation decisions on ‘journalistic content’.Footnote 90 The provision also requires the creation of a dedicated, expedited complaints procedure to remedy moderation decisions regarding journalistic content (with the possibility of reinstatement).Footnote 91 The complaints procedure in Section 19 should apply to any action, not only those taken on the basis of a platform’s terms and conditions (as in Section 18).
Poland’s proposal to protect the freedom of expression of social media users
An interesting example of a novel must-carry obligation appeared in a Polish proposal for a new law on ‘the protection of freedom of expression of social media users’.Footnote 92 Although the work on the proposal had been initiated a few months prior, the plans for a new law were announced in January 2021, coinciding with the deplatforming of Donald Trump. During the press conference, the Minister of Justice argued that censoring decisions by Big Tech corporations ‘threaten[s] and violate[s] the values at the centre of democracy’.Footnote 93 He also stressed that platforms engage in ‘ideologically motivated censorship’ that often targets members of religious and right-wing groups.Footnote 94
Officially, the main goal of the proposal was to curb the arbitrary power of social media platforms by prohibiting the removal of content that does not breach Polish law, thereby restricting their ability to moderate lawful content. The bill introduced several measures aimed at strengthening the protection of freedom of expression online. For example, the proposal contained a mandatory internal appeal procedure for takedowns and account suspensions, under which complaints would have to be resolved within 48 hours, as well as a ‘put back’ procedure.Footnote 95 The proposal, however, also contained several worrisome elements, such as a data retention obligation and the possibility to file an internal complaint against content defined as unlawful. The term ‘unlawful content’ included illegal content but also disinformation (misleading information created for profit or in ‘breach of public interest’), content infringing personal rights, as well as ‘public decency’. The proposal, therefore, required platforms to consider complaints against content that was neither illegal nor contrary to the platform’s terms and conditions, but was considered indecent by the complainant, eg, critical of the Catholic Church.Footnote 96
The most contentious proposal was the creation of a Free Speech Council, an external appeals body of five members appointed by Parliament. It would review unresolved user complaints about platform content decisions and could order content reinstatement or account reactivation after closed deliberations.Footnote 97 Its decisions would be binding on platforms.
The proposal drew heavy criticism from NGOs, business associations, and the Polish Ombudsman.Footnote 98 The Ombudsman noted that victims of targeted hate speech, if ignored by the Council, would have no recourse, skewing protection toward one side of the political spectrum.Footnote 99 He called the Council’s powers ‘a far-reaching interference in the freedom of speech’, warning that they would shift the limits of expression from law to the Council’s discretion.Footnote 100 The Ombudsman and others also criticised the Council’s political nature and its dependence on government.Footnote 101
Progress stalled in 2022, partly due to a potential conflict with the newly adopted EU DSA. After Poland’s 2023 elections returned the former opposition to power, the proposal was effectively dropped.
Common features
There are several common features between these German, UK, and Polish laws and proposals that help in understanding the potential scope of online ‘must-carry’ obligations. As we will see, these common features can also be traced in the DSA and the EMFA.
First, each country’s law is justified by the need to regulate platform power in connection with the moderation of content that is not illegal, especially when originating from news publishers and media. In all three examples, the provided justification is focused on the protection of freedom of expression, access to information and plurality of content and opinions. All three frameworks emphasise safeguarding this fundamental freedom, whether through promoting media pluralism (Germany), protecting content of democratic importance (UK), or restricting platforms from removing lawful content (Poland).
Second, in the German and UK examples the core solution to address this concern is through specific protections for journalistic or editorial content. In this sense, the rationale is to provide extra protection to media in the online public sphere. It therefore reflects the need to ‘rebalance the relation between media and online platforms’ expressed, eg, by the European Commission already in 2018.Footnote 102 It also aligns with the belief that quality media is a ‘pillar of democracy’ and that access to information in the public interest and free, independent journalism are essential for informed citizen participation in democratic processes.Footnote 103
Third, the protective rules in question aim to prevent platforms from carrying out content moderation that is considered discriminatory or biased against certain parties or viewpoints. Germany’s law prevents media intermediaries from discriminating against certain types of content, and the UK Online Safety Act requires platforms to follow specific rules when handling news or journalistic content. Poland’s proposal, however, was different. Nominally, it tried to limit platforms’ ability to remove legal content but, in doing so, it created further possibilities for discrimination and censorship, this time against viewpoints unwelcome to the then-State administration.
Fourth, each of the examined instruments provides for procedural safeguards in the form of an appeal or oversight mechanism to challenge platform moderation decisions. Germany allows claims to the state broadcasting authority, the UK enforces appeal procedures for news publishers, and Poland’s proposal involved an additional instance in the appeal procedure – the Free Speech Council (although politically appointed and opaque in its operation).
Variations of these common features, we would argue, are all found to some extent in the above-analysed ‘must-carry’ rules and proposals in EU law, especially in the DSA and EMFA.Footnote 104 As we also show, the same features surface in national case law in this area.
B. National ‘must-carry’ case law
Apart from various legislative attempts to introduce different forms of novel must-carry obligations across Europe, there is also a growing body of case law on the topic. Initially, claims to have platforms host content they had previously removed would be resolved outside the courts, whether through reinstatement following public pressure, through settlement, or on procedural grounds.Footnote 105 Recently, however, there have been more examples of cases in which national courts in Europe take on the question of unjustified removal and order the reinstatement of content. Not all claims for reinstatement, however, lead to such an outcome. Below we provide representative examples from the Netherlands, Italy, Germany and Poland.
In 2020, in the Netherlands, the District Court of Amsterdam decided two cases concerning the removal of content critical of the government response to the Covid-19 pandemic.Footnote 106 Taking into account the particular task of preventing harmful disinformation, the Court concluded in both cases that the platforms involved, YouTube and Facebook, did not have to reinstate the previously deleted content.Footnote 107 This line of case law was followed in 2021, when several courts issued judgments on similar cases.Footnote 108
In yet another case from the Netherlands (the FVD/YouTube case), the District Court of Amsterdam concluded that, despite YouTube being a platform of great importance for conveying a message, it does not have to reinstate a video of a politician’s speech criticising the government’s measures against the pandemic. The Court first concluded that YouTube does not have a ‘must-carry’ obligation, and that such an obligation has not been included in the DSA. Then, the Court referred to ECtHR case law – in particular, Appleby v. the UK – and concluded that ‘any effective exercise of freedom of expression’ had not been made impossible, nor had the essence of the right to freedom of expression been violated.Footnote 109 This is because the opinion in question could be expressed on several other channels or through a link thereto (eg, the party’s website, its app, the website of the Parliament, and social media). Moreover, the Court added that the party could still use YouTube, and that most of its critical videos were still on the platform. Lastly, the Court argued that the outcome of the judgement could have been different in the case of an account ban (ie, deplatforming), rather than the deletion of just one video.Footnote 110
However, other national courts have reached opposite outcomes, recognising obligations for platforms to reinstate content or accounts in decisions that approximate the imposition of limited ‘must-carry’ obligations. This occurred for instance in an Italian case concerning the removal of content and an account based on incidents that happened outside the platform. In CasaPound, an Italian far-right movement claimed that the removal had caused its exclusion from the political debate.Footnote 111 The court held that Facebook’s decision to deactivate the page and account of CasaPound, based on a violation of its terms and conditions by a series of incidents, could not be upheld. Firstly, the court dismissed Facebook’s argument that the political movement is an organisation that incites hatred and violence. This is because the promotion of the aims of the far-right movement on the Facebook page did not – in and of itself – constitute an incitement to hatred and violence.Footnote 112 Secondly, CasaPound should not be held responsible by association for the incidents, because the content was not present on the organisation’s page and account, despite the involvement of CasaPound’s members and supporters in the incidents. The court strengthened these arguments by highlighting the importance of Facebook in modern-day political discourse and communication with one’s supporters. Moreover, the court pointed to obligations stemming from the Italian Constitution, such as the obligation to protect party pluralism, freedom of association and freedom of expression.Footnote 113 In the appeal proceedings, the court emphasised that even though the political character of CasaPound does not oblige the platform to guarantee constitutional rights, a civil contract (eg, for the delivery of a service) should be interpreted in accordance with the Italian Constitution.
German courts, for their part, rely on the Basic Law, which protects the right to freedom of expression in Article 5(1)(1), in combination with a reference to the function of a platform as a ‘public marketplace’ and the doctrine of indirect third-party effect of fundamental rights (Drittwirkung).Footnote 114 It is noteworthy that multiple content reinstatement cases have been successful in Germany, especially where the providers failed to ensure adequate transparency and due process.Footnote 115
In 2018, the OLG München labelled large online platforms as ‘public marketplaces’ for information and opinions. In that role, such platforms generally would not be allowed to remove ‘admissible expressions of opinion’ that do not qualify as illegal content, even on the basis of their terms and conditions.Footnote 116 Rather, these providers would have a ‘substantial indirect meaningful duty’ to protect the right to freedom of expression of users in the context of content removal decisions.Footnote 117 The OLG Dresden added to this by observing that a private company that ‘takes over from the state to such a degree the framework of public communication’ must also have the ‘concomitant duties that the state has as a provider of essential services’.Footnote 118 Moreover, opinions that are protected under Article 5 of the Basic Law enjoy a higher level of protection, so that their removal cannot be based solely on a violation of the terms and conditions and must not be performed arbitrarily, and users may not be blocked from the service without recourse. Complying with these requirements ensures that platforms can moderate the content they host, delete uploaded content in order to avoid liability, and take down (both criminally and non-criminally punishable) hate speech.
In a similar fashion, the Federal Constitutional Court (BVerfG) issued a preliminary injunction ordering Facebook to allow a right-wing party access to its previously suspended Facebook page to resume posting.Footnote 119 Preventing a political party from using its Facebook page ‘denied an essential opportunity to disseminate its political messages and actively engage in discourse with users of the social network’, which ‘significantly impedes’ the party’s visibility, especially during the elections. This argument bears some resemblance to the argument in the Italian CasaPound case on the exclusion from the political debate. The BVerfG also clarified that, indeed, fundamental rights can be effective in disputes between private parties. This is particularly so if a private party enjoys ‘significant market power’ in Germany, as Facebook was considered to do. The indirect third-party effect, however, is not reserved solely for freedom of expression: all relevant fundamental rights must be balanced to determine if terms and conditions alone can justify the deletion of a particular statement.Footnote 120
Finally, in 2021 the German Federal Court of Justice (FCJ) further clarified Facebook’s content removal rights and duties in light of its dominant position in the market.Footnote 121 The FCJ argued that Facebook can develop its own internal rules (eg, Community Standards) and enforce them by removing posts and blocking accounts in case of a breach, provided that it takes its users’ fundamental rights into account. The internal rules developed must be clear and leave little room for interpretation.Footnote 122 Reasons for removing content or blocking an account should be objective, and the platform may not ban specific (predefined) opinions. Moreover, due to its size, Facebook must comply with due process requirements, as a State would have to do when censoring expression. This procedural protection of fundamental rights, specifically, requires Facebook to: (i) inform a user of any removal of their content (after the fact) and of any intention to block their account (in advance); (ii) inform the user of the reason for the action; (iii) give the user an opportunity to respond; and (iv) issue a new decision after a review, with the chance of reinstating the removed content.
Lastly, a Polish case concerned the removal by Facebook and Instagram of fan pages and groups run by an NGO – the ‘Civil Society Drug Policy Initiative’ (‘Społeczna Inicjatywa Narkopolityki’, or ‘SIN’) – due to an unspecifiedFootnote 123 violation of the Community Standards.Footnote 124 Arguably, the decision could have been caused by SIN’s unusual approach to drugs, which focuses on safe use rather than abstinence.Footnote 125 There was, however, no warning and no explanation of the reasons for the removal. There was also no possibility to appeal. In 2019, SIN filed a lawsuit arguing that blocking access to its content was arbitrary and unjustifiably restricted SIN’s ability to disseminate information, express opinions and communicate with its audience. Moreover, SIN argued that it prevented the continuation of its educational activities and undermined its reputation by suggesting that its activity was unlawful. In 2019, the District Court in Warsaw issued an interim measure that temporarily prohibited Meta from removing fanpages, profiles and groups run by SIN, as well as from blocking individual posts,Footnote 126 to allow SIN to continue its educational activities until the case was resolved. The court also obliged Meta to store the previously deleted profiles, fanpages and groups (including comments) to make sure that they can be easily restored in the future.Footnote 127 In early 2024, the court of first instance ruled that Meta cannot block users without any justification and without providing them with the possibility to effectively challenge the decision.Footnote 128 Interestingly, Meta’s actions amounting to blocking content without justification and without the possibility of appeal would also constitute a violation of the DSA, which imposes specific content moderation obligations.
As with the national laws examined above, there are important common features between the German, Italian, Dutch, and Polish case law. First, the key legal concern of courts in imposing restrictions on platforms is to safeguard the right to freedom of expression in the context of impactful content moderation decisions, especially in political contexts. Second and related, courts highlight the need to balance freedom of expression with other affected rights, taking into account the impact of the measures on public debate. In this context, the political speech of specific speakers receives special protection, with courts often ruling against platforms when moderation limits political participation. Third, a key consideration for courts assessing restrictions on platforms’ moderation activities is whether the platforms at issue are seen as playing the role of a public space or forum for discourse, or as having significant market power. Fourth, all courts focus on procedural safeguards of due process and transparency, emphasising the need for clear, transparent moderation processes and the availability of effective remedies, requiring platforms to inform users, provide reasons for removal, and allow appeals.
5. From ‘must-carry’ to special treatment and the right to freedom of expression
The next question concerns the extent to which the newly examined must-carry rules align with European fundamental rights law, particularly the right to freedom of expression. First, we need to highlight that the ECtHR, when addressing questions on freedom of expression and online service providers, distinguishes between the responsibilities of professional media outlets and non-media platforms regarding third-party online content. This is reflected in its contrasting rulings in Delfi AS v. Estonia (2015) and MTE and Index.hu v. Hungary (2016), both concerning liability for user comments posted under news stories, in light of Article 10 ECHR. In these cases, the court’s reasoning varied according to the content at issue and the platform’s nature.Footnote 129 The nuanced approach of the court can be linked to the 2011 CoE Recommendation on the new notion of media.Footnote 130 The Recommendation argued for a differentiated and graduated policy approach to the different actors participating in the rapidly evolving new media ecosystem.Footnote 131 This nuanced approach is very much present in policies addressing media and in the court’s jurisprudence, reflecting the idea that the rules applied to different media actors (eg, traditional media vs online platforms) may not always be the same.
In this section, we analyse whether these rules can be justified under the doctrine of positive obligations in human rights law, drawing on examples such as the right to reply in media law, which mandates the publication of certain content (A). Next, we examine must-carry obligations in the traditional broadcasting context, which protect access to information and pluralism, and assess whether the analogy between these rules and the new obligations for online platforms is justified, concluding that it is not (B). Finally, we argue that these new rules are better understood as special treatment provisions in platform regulation. We evaluate their impact on media organisations, politicians, and freedom of expression, critiquing their potential to entrench power imbalances, weaken democratic values, and inadequately address disinformation and media trust issues within the DSA and EMFA frameworks (C).
A. Obligations to effectively protect freedom of expression
According to European human rights instruments, such as the ECHR and the EU Charter, States cannot interfere with the exercise of protected rights unless specific requirements are met. But States may also have an additional obligation to effectively protect fundamental human rights from interference by others, including by private parties.Footnote 132 Such an obligation requires States to take an active stance in private conflicts and may justify the introduction of must-carry-like obligations.
In the ECHR, the concept of positive obligations is based on Article 1, which requires that the States ‘shall secure to everyone the rights and freedoms defined in the Convention’.Footnote 133 In the context of the right to freedom of expression, this involves an obligation for governments to promote the right and to provide for an environment where it can be effectively enjoyed. This means protecting the freedom of expression against interference, even by private parties.Footnote 134 Moreover, States are required to create a favourable environment for participation in public debate for everyone and to enable the expression of ideas and opinions.Footnote 135 The obligation is also understood as an obligation to act, or to implement, for example by enacting domestic legislation to protect the right. Lack of such action may trigger the responsibility of a State, even if the resulting interference has been conducted by a private party.Footnote 136 When examining EU law, however, the main human rights instrument of reference is the EU Charter.Footnote 137
Under the Charter, the obligation to respect the rights contained therein (negative obligation) is clearly articulated. The existence of the obligation to protect (positive obligation) is less obvious. The CJEU, for a long time, did not refer explicitly to the doctrine of positive obligations in its jurisprudence.Footnote 138 It focused instead on proportionality, fair balance and the lack of effective protection of the Charter rights. Arguably, this approach allowed the CJEU to reach results similar to those reached by the ECtHR when it applies the positive obligations doctrine.Footnote 139
This subtle approach has been recently changing, however, since the CJEU ruled in Commission v Hungary that the negative obligation of public authorities may be ‘supplemented by a positive obligation to adopt legal measures seeking to protect private and family life’.Footnote 140 Later, in La Quadrature du Net the CJEU ruled that ‘positive obligations of the public authorities may result from Article 7 Charter requiring them to adopt legal measures to protect private and family life’.Footnote 141 Both judgements also refer explicitly to the jurisprudence of the ECtHR, highlighting that the corresponding rights of the Charter and the Convention must be regarded as having the same meaning and scope.Footnote 142 Although both rulings referred to the right to respect for private and family life enshrined in Article 7 Charter (corresponding to Article 8 ECHR), they clearly indicate the growing willingness of the CJEU to explicitly recognise the doctrine of positive obligations and the role of States to actively protect fundamental rights.
This finding is useful when examining EU and national legislation that aims to ensure better protection of the expressive right. The question remains whether the doctrine of positive obligations and effective protection of freedom of expression can be construed as an underlying legal baseline for introducing novel must-carry obligations on online platforms.
An important step towards answering this question is to examine whether this doctrine has been employed before to justify rules that compel publication of specific content.
The ECHR and the Charter both recognise that protection of the enshrined fundamental rights requires active engagement of the legislator, which must ensure effective protection of fundamental rights. But does this mean that users must always be given a forum? Clearly, that is not the case. As ruled by the ECtHR, Article 10 does not bestow any ‘freedom of forum’.Footnote 143 That is to say, Article 10 does not guarantee any ‘freedom of reach’, meaning a right to have one’s content broadcast on any particular private forum or, in our case, private platform, even if of significant influence.Footnote 144 States, however, might be required to step in where a legal act, including a private contract, appears unreasonable, arbitrary, discriminatory or inconsistent with the principles underlying the Convention.Footnote 145 That would mean setting limits on the rules that private owners establish on their property.
Arguably, rules imposed by States on private owners to limit prerogatives that could negatively affect others are not unusual when it comes to speech either, for example prohibitions on showing certain content, considered harmful, to minors.Footnote 146 Such restrictions effectively limit the owners’ right to private property, their freedom to conduct a business and their freedom of expression, as a result of a balancing exercise with other rights at stake. Examples can also be found in rules and judgements on press freedom, in particular on the ‘right to reply’, which mandates publication of certain information.
The right to reply is a particular form of access to a forum, initially used in written media and later expanded to the online environment.Footnote 147 Essentially, it provides a possibility to publish a response to inaccurate information in the same medium where the original statements were made. If the subject of the information wishes to benefit from this possibility, the publisher must make the reply public, subject to specific conditions. The purpose is to provide a way to protect oneself against statements or opinions disseminated by the media that are likely to be injurious to one’s private life, honour, dignity or reputation. The right to reply is based on the premise that it should be possible to contest untruthful information, but also to ensure a plurality of opinions, especially in matters of general interest such as literary and political debate.Footnote 148 According to the court, such situations may create a positive obligation ‘for the State to ensure an individual’s freedom of expression in such media’, for example by requiring publication of a retraction, an apology or a judgment in a defamation case.Footnote 149
The right of reply, therefore, demands a balancing of the media’s right to freedom of expression against the right to freedom of expression (alongside reputation and other rights) of the subject of the information. It is, however, an exception to the general rule that newspapers and other privately-owned media must be free to exercise editorial independence to decide what to publish, including articles, comments and letters submitted by readers.Footnote 150 This limitation of editorial freedom, allowing for the compelled publication of content that the publisher would not necessarily want to publish, is another example of the State restricting freedom of expression to protect the expressive rights of others.
In essence, it can be said that the doctrine of positive obligations and its manifestations, like the right to reply, clarify at least two important aspects vis-à-vis ‘must-carry’ obligations. First, it is possible for States to impose limitations on private parties in order to compel them to carry some type of content on the grounds of freedom of expression. This possibility, we argue, extends in principle to online platforms and to the user-generated content they host and provide access to. Second, however, this possibility is limited. It is construed as an exception that should be carefully calibrated and balanced against competing rights and interests, and adjusted to the particular scenario it covers. In our case, and somewhat in line with the European and national case law and examples provided above, this means that new ‘must-carry’ obligations should take into account a number of factors that both condition their admissibility and restrict their scope of application, including the type (eg, media vs non-media provider) and relative power and/or reach of the platform affected (eg, whether it is of significant influence, such as VLOPs in EU law), the privileged speaker, the type of expression to be carried, the context and severity of the speech, and the proportionality of the obligation considering alternative means of expression. Considering these constraints, the question emerges whether the ‘must-carry’ label is suitable for these types of obligations.
B. What’s in a name? Old vs new ‘must-carry’
The term ‘must-carry’ has surfaced frequently in contemporary discussions on restricting the freedom of online platforms to moderate lawful content on their services.Footnote 151 It is a shorthand, or a convenient analogy, to describe the new rules we have discussed thus far. But do these rules even resemble traditional must-carry regimes? Traditional must-carry obligations limit private control in broadcasting by obliging transmission services to make certain channels that serve public interest objectives available to the public. They emerged in the context of electronic communications, in light of the growing power of cable providers tempted to suppress local broadcasters. Their aim was to guarantee access to public service broadcasting and to ensure a diverse choice of programmes, thereby effectively protecting the public’s right to freedom of expression and access to information. From the perspective of the private entities subject to the obligation, this amounts to a limitation on their right to freedom of expression. They are forced to carry content that they would otherwise not be interested in carrying. They are also restricted in their ability to use their own capacity freely.
In the EU, these obligations were originally provided for in Article 31 of the Universal Service Directive, now amended and regulated in Article 114 of the Electronic Communications Code.Footnote 152 That must-carry provision requires that the imposed obligations be reasonable and apply to specified radio and TV channels; they are not meant to cover all channels transmitted by a private broadcaster. As confirmed by the CJEU, ‘must-carry’ status should not be awarded automatically but should be strictly limited to channels whose overall content fulfils general interest objectives.Footnote 153 These objectives, moreover, must be clearly defined. A mere general statement that the imposed obligation aims to ensure plurality and cultural diversity is not sufficient.Footnote 154 Additionally, ‘must-carry’ should be proportionate and transparent. This means that the way it is applied ‘must be subject to a transparent procedure based on objective non-discriminatory criteria known in advance’.Footnote 155 The obligations can be imposed on the providers of electronic communications networks and services that are used by a significant number of users as their principal source of radio and TV channels.Footnote 156
Looking at the motivations for old vs new ‘must-carry’ rules, an important distinction emerges. Traditional must-carry obligations were introduced in the 1990s to address the problem of scarcity of space in analogue and cable broadcasting.Footnote 157 As the supply of channels grew quickly, these rules were meant to ensure that public service broadcasting, which was financed by the general public, could actually reach that public.Footnote 158 New must-carry rules are motivated by concerns over the growing private power of large-scale influential platforms over online expression. The goal is to prevent them from arbitrarily restricting access to (lawful) expression. From that perspective of reining in the significant power of private entities to determine what content is available on their channels or services, the analogy stands.
Crucially, however, there are significant differences. In the current digital environment, for instance, there is no shortage of space on the internet. The problem lies instead with the scarcity of viewers’ time and attention. Furthermore, the two sets of rules at issue are aimed at different actors: electronic communications networks and services on the one hand, and online platforms (a type of hosting service provider) on the other.Footnote 159 The design of the obligations also differs between the two regimes. For traditional must-carry obligations, the focus lies on carrying specific designated channels without restrictions. For the new rules, the scope is varied, but usually relates to certain categories of speakers (eg, media providers or politicians) and specific types of content (eg, journalistic or political). New ‘must-carry’ obligations are also more nuanced in their design, in line with the underlying rationale of safeguarding freedom of expression against certain content moderation measures. They impose procedural safeguards to prevent or mitigate the impact of such measures, such as ex ante and ex post information requirements, as well as review and redress mechanisms.
When comparing traditional and new must-carry obligations, it becomes evident that the similarities lie more with a conceptual representation of what a ‘must-carry’ rule entails rather than with a direct application of these traditional rules to a new technological and business reality. At a high level, a conceptual similarity can be found in the rationale of the approach: both types of rules are grounded in the notion that States may impose limitations on private rights, including fundamental rights, for the protection of the rights of others. This is the case even when the limitation is imposed on the right to freedom of expression. Furthermore, such a limitation takes the shape of an obligation for a provider of significant scale, reach and influence to accommodate third-party content on its service on the grounds of general or public interest, even against their will. But in the end, it is still merely an analogy.
Given the differences between regimes, it is hard to argue that insights from the traditional must-carry obligations are directly applicable to adjudicate freedom of expression conflicts associated with modern rules for online platforms. This realisation allows us to step away from the ‘must-carry’ label and identify the new rules for what they are. In our view, the rules examined above are better understood as restrictions on the rights of platforms in order to recognise a special treatment privilege for some categories of speakers and their expression. As such, for the remainder of this paper we will refer to these rules – including legislative proposals and judgements to effectuate such rules – as special treatment rules.
C. Special treatment and its discontents
Special treatment rules aim to safeguard certain speakers and their lawful content from being subject to moderation restrictions by platforms based on their internal policies and practices. They aim to bring more balance to the relationship between platforms, speakers and the audience in the digital information ecosystem. Arguably, such rules restrict the rights of platforms but do not violate the right to freedom of expression protected under the ECHR and the EU Charter.
From the perspective of freedom of expression, nevertheless, relevant questions arise. First, what would be the effect of such rules on the ‘platformised public sphere’, and in particular on efforts to curb disinformation?Footnote 160 Second, to what extent should the public interest and the unique role of media in shaping public debate be considered in this discussion?Footnote 161 And finally, does providing greater protection to the speech of certain speakers undermine the principle of equality of speech in a society based on the rule of law? The two primary examples of special treatment rules that we have examined – one in favour of politicians and the other in favour of traditional media providers – highlight these particular aspects.
Special treatment for politicians
The first type of special treatment rule we discuss is meant to protect politicians from content moderation practices by platforms of significant influence.
From our analysis of legislative proposals above in this paper, the quintessential example of a special treatment rule for politicians is the DSA proposal for a ‘Trump amendment’, which was meant to create an obstacle to the termination of a user account (deplatforming), including by requiring the approval of a judicial authority.Footnote 162 The rejected rule would not have covered other content moderation measures, such as removal, blocking, or labelling of specific posts by that user. The limitation was facially based on ‘public interest’,Footnote 163 but that protection would only work in one direction – to protect a specific category of users, those in power. Speakers from an unprivileged position would have no special protection against deplatforming, even if their content was relevant from the public interest perspective. As such, this particular understanding of ‘political speech’ protection would apply exclusively to speech by politicians but not to speech by others about politicians.
On a practical level, this type of rule is hardly revolutionary. Many platforms, especially VLOPs, already have internal rules that favour specific categories of users and provide special protective treatment for politicians, world leaders,Footnote 164 or more generally for their ‘high profile accounts’.Footnote 165 In one prominent example, such practices were revealed in the Facebook Files leak of 2021 and can be traced in several Meta Oversight Board cases.Footnote 166 Similar findings can be found in the recent audit of the VLOP X (formerly Twitter), which notes that ‘(a)ccounts with large followings or “verified” status appear to be treated differently to regular users’.Footnote 167 The ‘Trump amendment’, however, would legitimise granting special protection to a category of users who in many cases would have little problem finding alternative means to exercise their right to freedom of expression.Footnote 168 It could be, for instance, through a friendly broadcaster or a widely attended official press conference.Footnote 169 Arguably, the existing alternatives are not fully equivalent to the most popular social media platforms. But the expressive opportunities available to politicians still give them an advantage over average users facing deplatforming. Such bans continue to happen for multiple reasons, yet they rarely cause similar controversy or trigger policy discussions.Footnote 170
On this point, it is crucial to provide additional nuance regarding political speech on online platforms. As is well known, the ECtHR’s balancing of freedom of expression in the online context places different weight on different categories of expression. Arguably, political and public interest expression enjoy the highest level of protection under Article 10 ECHR, with the court repeatedly holding that there is ‘little scope’ for restrictions on political speech or debate on matters of public concern, as tolerating even offensive comments may be crucial to enable open democratic debate. When applying this reasoning to speech by politicians, the discussed special treatment rules seem to be a good fit. The ECtHR, however, provided an interesting addition to the discussion in its 2023 ruling in Sanchez v. France,Footnote 171 when it viewed a politician during an election period as a category of ‘high risk speaker’.Footnote 172 The court also clarified that political speech, even though it ‘calls for elevated level of protection’, is not granted absolute protection.Footnote 173 Applying these insights to our analysis underscores, first, that politicians already enjoy a high level of protection under the existing framework, leaving little justification for a special treatment rule that furthers their protection from platforms. Second, and relatedly, the potential high-risk nature of the politician-as-speaker cautions against special treatment rules that may prevent justified moderation practices.
The ultimately abandoned DSA amendment now seems to have been merely an overreaction to the deplatforming of the then-sitting US President. It illustrated, perhaps, the fear of political leaders of being cut off from their forum. While the amendment was presented as promoting freedom of expression and protecting it from arbitrary decisions by platforms, it had a perverse effect on the essence of the right by shielding only privileged speakers. As designed, special treatment rules like the ‘Trump amendment’ would arguably afford politicians prima facie protection against hasty moderation decisions. That should not mean, however, that their speech should be afforded near absolute protection, which would go beyond current human rights law.
Special treatment for media providers
Examples of rules providing for special treatment of content from traditional media providers include the German law’s prohibition on discriminating against journalistic editorial offers by media providers, and the UK Online Safety Act’s protection of news publisher content and journalistic content. At the EU level, we identify the DSA media exemption proposal (and, to some extent, the proposal on public interest information for recommender systems), and the EMFA rule on special treatment of media providers.Footnote 174
The prime example of this approach is Article 18 EMFA. As noted, this provision requires platforms to consult media providers before moderating content and introduces an expedited appeals process.Footnote 175 While it does not prohibit moderation, it makes the process slower and more complex. Critics highlight its narrow definition of media providers, which excludes certain media and journalistic activities.Footnote 176 Concerns also arise over platforms’ role in the self-declaration mechanism, as it may entrench large platforms’ dominance. Although platforms can verify editorial independence and standards with regulatory bodies, it is unclear whether they will actively verify declarations, adopt a passive stance, or selectively scrutinise certain media providers.Footnote 177
Yet another criticism of Article 18 EMFA is its potential impact on efforts to combat disinformation. This concern emerged prominently during the DSA negotiations and resurfaced strongly in the EMFA discussions.Footnote 178 The central issue is that the special protections could be granted to media service providers that actively spread disinformation or serve as propaganda channels for authoritarian governments. This concern becomes especially clear when we consider that public service media in Member States like Hungary and (until recently) Poland – known for disseminating disinformation and state propaganda – could qualify for these protections.Footnote 179
To address the risk of this rule being exploited to transform media providers into ‘propaganda megaphones’, and to ensure that the rule only benefits entities adhering to professional journalism standards, the EMFA introduced additional requirements. These include a requirement for media providers to maintain independence from entities associated with third states and political parties.Footnote 180
Article 18 further contains a provision that limits its application to situations where (i) the content is incompatible with the platform’s terms and conditions, and (ii) the content does not contribute to a systemic risk as defined in Article 34 DSA.Footnote 181 To be sure, Article 34 DSA does not explicitly mention disinformation as an autonomous systemic risk category.Footnote 182 However, Recital 83 DSA clarifies that, within the risk category related to the protection of public health (as well as the protection of minors, serious negative consequences to a person’s physical and mental well-being, or gender-based violence), coordinated disinformation campaigns can be a source of such risks.Footnote 183 More broadly, Recital 84 DSA states that when assessing all the categories of DSA systemic risks – including negative effects on democratic processes, civic discourse, electoral processes, and public security – VLOPs should also focus on lawful information that contributes to those systemic risks. The recital adds that particular attention must be paid to how these providers’ services are used to disseminate or amplify misleading or deceptive content, including disinformation. In this regard, the DSA incentivises VLOPs to take measures against the spread of disinformation and to fight disinformation campaigns, including in the context of codes of practice or codes of conduct.Footnote 184 To the extent that the spread of disinformation by media organisations is considered a contributing factor to systemic risks under the DSA, this risk mitigation regime may apply to such content, arguably pre-empting the application of the privilege in Article 18 EMFA.Footnote 185 If that is the case, platforms that comply with their DSA obligations will still be able to take action against disinformation in certain contexts, even if it comes from media organisations, since Article 18 EMFA will not apply in those scenarios.Footnote 186
Is this enough to prevent the negative effect of the provision on the fight against the spread of disinformation? The answer depends on how VLOPs respond to this obligation. The safeguards added to Article 18 EMFA were generally well received by scholars, who consider the provision to be ‘well-equipped to avoid becoming a vehicle for disinformation’.Footnote 187 One might question, however, whether these limitations still preserve the original intent of protecting media organisations from potential censorship by platforms, especially in situations where platforms abuse their algorithmic power to limit the visibility and reach of certain content (eg, by downranking or shadowbanning).Footnote 188
The rules on special treatment of media organisations force us to consider yet another aspect. The provision grants privileged status to media organisations for a specific reason. In the words of the Committee of Ministers of the Council of Europe, media have always been ‘the most important tool for freedom of expression in the public sphere, enabling people to exercise their right to seek and receive information’.Footnote 189 Awarding this special treatment is justified by the particular and unique role of media in informing the public, shaping public debate and upholding democratic values. Quality media, for this reason, are described as a ‘pillar of democracy’.Footnote 190 Yet, in recent years, traditional media have lost much of their opinion-forming power.Footnote 191 This process is accompanied by decreasing levels of trust in traditional media and declining engagement in the form of news avoidance and news fatigue.Footnote 192 The media landscape has transformed significantly due to rapid technological advancements and, in particular, the development and dominance of online platforms, which have redefined information-sharing practices in the public sphere.
Article 18 EMFA tries to address this changing ecosystem and to reformulate the relationship between media and platforms of significant influence by strengthening the position of legacy media. The rule on special treatment of media organisations, however, amounts to creating a privilege for speakers who occupy the top rungs of the information ecosystem hierarchy and who already have established channels to disseminate their message, even if they have lost some of their traditional prominence. This special treatment does not protect ordinary speakers or dissenters critical of the status quo. Consequently, this type of rule may undermine the fundamental principle of equality before the law. It promotes the notion that certain types of speech deserve greater protection than others,Footnote 193 effectively transforming the right to freedom of expression into a privilege reserved for a select group of already privileged speakers. From this perspective, then, it is difficult to see how this approach can contribute to reversing the trend and restoring trust in media organisations.
Reflections
At a general level, the introduction of special treatment rules may be understood as an effort to recalibrate the relationship between platforms of significant influence and their users by limiting the former’s capacity to exercise disproportionate control over particular categories of speakers and their expression. At a more concrete level, however, the heterogeneity of speakers and forms of expression entails distinct rationales and consequences, as illustrated by our analysis of privileges accorded to media organisations and politicians. Accordingly, our evaluation of these two categories of special treatment provisions, within the framework of the right to freedom of expression, leads to different outcomes.
In the case of special treatment rules for politicians, the situation is problematic. The risk of democratic backsliding increases when politicians are granted special protections that shield them from moderation, over and above the already robust safeguards afforded by human rights law to political speech, thereby threatening the impartiality and fairness of public discourse.Footnote 194 For the time being, such forms of special treatment have been abandoned, deservedly. Our view is that they should not be pursued further.
In the case of media organisations, special treatment rules are often motivated on the grounds of ensuring the availability of trustworthy, quality information by strengthening the position of legacy media. This is a commendable aim. However, it is not without risks to the democratic process, information integrity, and informed public discourse.Footnote 195 From our perspective, a poorly calibrated media privilege can unduly restrict platforms of significant influence in the moderation of disinformation and propaganda by media service providers that are either controlled by or politically aligned with governments in their jurisdiction. In doing so, a special treatment rule would have the unintended consequence of removing an important check on such media providers. The scenario is particularly concerning in contexts where media outlets have been co-opted by authoritarian governments, as seen in certain EU Member States. In the EU, the final version of Article 18 EMFA seeks to address this concern through requirements such as the self-declaration of editorial independence, subjection to regulatory oversight or a recognised self-/co-regulatory mechanism, and the disclosure of ownership structures and points of contact, among others. At this stage, however, it remains unclear whether these requirements will be sufficient to effectively mitigate the concerns outlined above. Much will depend on how the provision is implemented and enforced in practice.
Both types of special treatment, however, pose a risk to the principles of democratic governance and the rule of law. By granting preferential protection to already influential speakers, these rules undermine the foundational democratic ideal of equal treatment before the law. They enshrine a tiered system of expression that favours established voices, potentially entrenching existing power structures and stifling dissent. They may also convey a problematic message: that restrictions to speech of those not covered by any privilege are (more) acceptable.Footnote 196 As such, while special treatment rules may be well-intentioned, they must be scrutinised carefully to ensure they do not inadvertently erode democratic values and the rule of law, transforming freedom of expression into a privilege for the already powerful and treating the speech of others as a potential risk.Footnote 197
In the end, our analysis highlights a crucial trade-off in media and platform regulation in light of freedom of expression. There is an increasingly pressing need to protect trustworthy journalistic content, particularly in a polarised and fragmented political and societal landscape. To address this need, the EU legislature is experimenting with rules within a complex legislative framework, including the DSA and EMFA, eventually landing on the special treatment provisions examined here.
While their successful implementation is desirable, we remain concerned that these measures will not remedy the structural power imbalance between legacy media and dominant platforms. Regulatory pressure to shield media organisations from moderation offers only a short-term fix that risks reinforcing their dependency. Requiring a presence on platforms that suppress the reach of quality media is unlikely to shift existing dynamics. More durable solutions are needed, beyond compelled inclusion or forced engagement. Although a full assessment lies outside the scope of this paper, promising alternatives include financial support for quality journalism and local media, and renewed emphasis on user empowerment online.
On the first point – financial support – in a 2023 assessment of the EMFA proposal, Brogi and her co-authors advanced recommendations on issues they considered crucial for media freedom and pluralism but which were absent from the draft regulation. Beyond strengthening political independence, enhancing transparency in relations between media providers and digital intermediaries, and establishing an independent mechanism to monitor implementation of Articles 34 and 35 of the DSA, they highlighted the need to finance journalism as a public good.
They argued that the economic sustainability of journalism is vital for editorial independence and media pluralism, yet the EMFA proposal overlooked direct financial support. Building on limited Creative Europe funding, they recommended expanding such programs to foster innovation, investigative reporting, and local media. Further proposals included rethinking the remit of public service media, channelling revenues from a potential digital tax, and creating a European Fund for Journalism to reduce risks of political capture while supporting cross-border initiatives. They also suggested allocating part of the revenues from international corporate tax reform to media pluralism, ensuring that large platforms contribute to restoring the sustainability of a sector disrupted by digital transformation. These remain valid recommendations to this day, and illustrate the limitations of a special treatment rule as a solution to such broader challenges.
On the second point, we refer to the recent draft CoE Recommendation on online safety and empowerment of content creators and users. In this context, user empowerment refers to enabling individuals to expand their understanding, make informed choices, and exercise control over their online experience, so they can benefit from opportunities and address risks without undue burden. It encompasses online measures including effective tools for personalising platform use, mechanisms to exercise and protect rights, and avenues for collective action.Footnote 198 Crucially, the Recommendation states that for content that is lawful but prohibited by platforms’ terms and conditions, States should rely on ‘alternative, proportionate ways of mitigating risks, including user empowerment measures, in the framework of online safety, user empowerment and platform accountability framework’.Footnote 199 The draft Recommendation further states that systemic duties on intermediaries regarding lawful content or behaviour should not serve as a backdoor for content-specific restrictions lacking a clear legal basis.Footnote 200 A consequence of this approach, as noted by Daphne Keller, is that user empowerment measures offer States an alternative to must-carry or special treatment rules that they would otherwise impose to compel platforms to host or prioritise particular types of expression.Footnote 201
Ultimately, although such privileges may be justified in certain contexts and offer limited short-term benefits, we contend that they do not constitute viable long-term solutions. They are unlikely to provide the decisive means to recalibrate the power imbalance between media providers and platforms of significant influence, or to address the broader challenges of independence and trust identified above.
6. Conclusions
The analysis in this paper reveals the increasing reliance on so-called must-carry obligations as a regulatory tool for addressing the challenges posed by online platforms, particularly in safeguarding media content and protecting freedom of expression. Key findings include the identification of new ‘must-carry’ measures in the DSA (in the form of rejected proposals) and the EMFA (Article 18), which aim to enhance protections for media providers while raising concerns about potential overreach and risks to platform autonomy. At the national level, the German prohibition against content discrimination, the UK’s Online Safety Act provisions, and Poland’s abandoned legislative proposal illustrate varied approaches that emphasise media pluralism and user rights while exposing tensions with overarching EU frameworks.
Our analysis of influential case law from the courts of EU Member States (the Netherlands, Germany, Italy, and Poland) dealing with similar obligations on online platforms reaches similar findings and highlights analogous challenges. These measures and judgements reflect a shared goal of curbing undue platform power, primarily on fundamental rights grounds, yet they diverge significantly in methods and scope. Crucially, we argue that these measures are better understood as special treatment rules that restrict platform discretion in content moderation and privilege certain categories of speakers and their expression, namely media organisations and politicians.
Our analysis of special treatment rules for politicians and media providers shows that, while these provisions are designed to rebalance the relationship between large platforms and influential speakers, they risk undermining equality and freedom of expression. Special protections for politicians, such as the proposed ‘Trump amendment’ under the DSA, would have granted political actors additional safeguards against deplatforming even though their speech already enjoys strong protection under human rights law and they possess ample alternative means to communicate. Such measures risk shielding those in power from justified moderation and eroding the integrity of democratic discourse.
Rules privileging media organisations, most notably Article 18 EMFA, aim to safeguard quality journalism and ensure the continued availability of trustworthy information. While these goals are commendable, the mechanism is susceptible to criticism. Granting preferential status to legacy media entrenches the dominance of already privileged actors, risks protecting outlets that disseminate disinformation or propaganda, and may hinder platforms’ ability to address systemic risks from lawful but harmful content under the DSA. Although the EMFA includes safeguards such as editorial independence requirements and oversight mechanisms, their effectiveness remains uncertain.
Both types of special treatment risk creating a two-tier system of expression, where established voices receive disproportionate protection while ordinary speakers remain vulnerable. This dynamic undermines the democratic principle of equality before the law and conveys a troubling message: that restrictions on non-privileged speech are more acceptable. Even when well-intentioned, such rules may reinforce structural imbalances, entrench existing hierarchies, and transform freedom of expression into a privilege for the already powerful.
A more sustainable response requires moving beyond compelled inclusion or forced engagement on dominant platforms. Durable solutions should focus on strengthening the independence and sustainability of journalism through direct financial support, fostering innovation and local media, and establishing safeguards against political capture. At the same time, user empowerment measures must be prioritised as an alternative to special treatment rules, equipping individuals with tools to shape their online environment and exercise their rights without undue dependence on platforms’ often opaque policies.
Ultimately, while narrowly framed privileges may offer temporary relief, they do not provide viable long-term solutions to the structural power imbalances between platforms of significant influence, politicians, and media organisations. If the objective is to restore trust, pluralism, and resilience in the digital public sphere, the path forward lies in reinforcing systemic safeguards, empowering users, and supporting quality journalism as a public good.Footnote 202
Acknowledgements
Aleksandra Kuczerawy’s research in this paper benefited from funding from FWO grant nr. 1214321N, as well as FWO funding in the ALGEPI project G098223N, and research visits at the Cyber Policy Center at Stanford University and Católica Global School of Law in Lisbon. João Pedro Quintais’s research in this paper is part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the NWO – Dutch Research Council (grant number: VI.Veni.201R.036). The authors wish to thank Stefanie Boss for research assistance on this paper as well as Lidia Dutkiewicz, Daphne Keller, Hannah Ruschemeier, Graham Smith, Maria Luisa Stasi, Max van Drunen, Folkert Wilman and Julia Zöchling for their helpful comments.
Funding statement
Open access funding provided by University of Amsterdam.
Competing interests
The authors declare none.