
‘Must-carry’, special treatment and freedom of expression on online platforms: a European story

Published online by Cambridge University Press:  15 December 2025

Aleksandra Kuczerawy
Affiliation:
Centre for IT & IP Law (CiTiP), KU Leuven, Belgium
João Pedro Quintais*
Affiliation:
Institute for Information Law (IViR), University of Amsterdam, The Netherlands
Corresponding author: João Pedro Quintais; Email: j.p.quintais@uva.nl

Abstract

This paper examines the evolution and implications of ‘must-carry’ obligations in the regulation of online platforms, with a focus on Europe. These obligations, which restrict platforms’ discretion to remove or deprioritise certain content, represent a novel regulatory response to the growing power of platforms in shaping public discourse. The analysis traces developments at EU and national levels. At the EU level, it considers rejected must-carry proposals during the drafting of the Digital Services Act (DSA) and the adoption of Article 18 of the European Media Freedom Act (EMFA), which grants privileges to recognised media service providers. At the national level, it examines Germany’s prohibition on content discrimination, the UK’s Online Safety Act, and Poland’s abandoned legislative proposal on freedom of expression online. Case law from courts in the Netherlands, Germany, Italy, and Poland further illustrates the emergence of judicially crafted duties resembling must-carry obligations. The paper argues that these measures are best understood as special treatment rules that privilege particular speakers, notably media organisations and politicians, by limiting platform autonomy in content moderation. While intended to safeguard pluralism and access to trustworthy information, such rules risk creating a two-tier system of expression in which established voices receive disproportionate protection while ordinary users remain vulnerable. Protections for politicians raise concerns about shielding powerful actors from justified moderation, whereas media privileges, though more defensible, remain limited in scope and potentially counterproductive, especially when exploited by outlets disseminating disinformation. The conclusion is that compelled inclusion and preferential treatment are unlikely to offer sustainable solutions to the structural imbalances between platforms, media providers, and politicians. More durable approaches should focus on strengthening journalism through financial and structural support, fostering innovation and local media, and prioritising user empowerment measures. Only systemic safeguards of this kind can effectively promote pluralism, accountability, and resilience in the digital public sphere.

Information

Type
Core analysis
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

1. Introduction

Online platforms of significant influenceFootnote 1 have become central forums for the dissemination of information and the exercise of freedom of expression. Their capacity to amplify or suppress content and structure visibility has positioned them as powerful gatekeepers of the digital public sphere. In response, European policymakers have sought to regulate platforms of significant influence in order to address illegal and harmful content while safeguarding fundamental rights. These efforts reflect a broader shift from liability exemptions and indirect obligations to direct delegation of regulatory functions to private platforms.

Within this regulatory evolution, so-called ‘must-carry’ obligations have emerged as a distinctive tool. Borrowed from broadcasting law, the term now refers to measures that restrict platforms’ discretion in moderating lawful content. Such obligations take diverse forms, from proposals to prioritise information of public interest in recommender systems to privileges shielding media organisations or politicians from suspension or deplatforming. Proponents frame these measures as necessary to protect pluralism, editorial independence, and democratic discourse. Critics warn, however, that they risk entrenching power imbalances, undermining platform autonomy, and creating a tiered system of expression.

This paper situates these developments within the broader European debate on platform regulation and freedom of expression. It argues that contemporary must-carry provisions are better understood as special treatment rules that privilege particular speakers. While designed to recalibrate the relationship between platforms, media providers, and political actors, such measures carry significant risks for equality of expression and democratic accountability.

The paper proceeds as follows. Section 2 traces the historical development of platform regulation in the EU, looking at the increased delegation of enforcement responsibilities. Section 3 analyses the EU framework, focusing on the rejected proposals in the Digital Services Act (DSA)Footnote 2 and the adoption of Article 18 of the European Media Freedom Act (EMFA).Footnote 3 Section 4 turns to national approaches in Germany, the UK, and Poland, supplemented by relevant case law. Section 5 compares the new measures with must-carry obligations in broadcasting and evaluates these measures as special treatment rules for media and political actors, assessing their implications for freedom of expression and democratic accountability. Section 6 concludes by arguing that sustainable solutions lie not in compelled inclusion but in systemic safeguards, including financial support for journalism, innovation in local media, and user empowerment.

2. The emergence of new ‘must-carry’ obligations

Over the past decade, online platforms have become crucial fora for the dissemination of and access to information and digital content, as well as for individual expression.Footnote 4 At the same time, platforms have also become points of control, due to their ability to affect user behaviour. Examples include eliminating or disabling access to their service or to the information they host, reducing or increasing the visibility of content, or even assisting third-party enforcement by identifying wrongdoers on their services. It is therefore unsurprising that platforms have become the centre of policy discussions on how the law can and should regulate illegal and harmful activities online, fitting within the broad area of platform regulation. In particular, States are interested in using the technical and human resources of platforms to regulate illegal content online, as well as information that, although legal, is considered harmful – for example mis/disinformation.Footnote 5 States attempt to do so by assigning some of their own traditional policing functions to platforms, thus privatising the enforcement of public policies on speech. This legislative trend towards co-opting platforms to regulate expression online initially took place through indirect means, namely conditional liability exemptions or ‘safe-harbours’ for certain intermediary services, such as mere conduit, caching and hosting, provided for initially in the e-Commerce Directive.Footnote 6 Currently, as we discuss below, there is a noticeable shift towards direct delegation mechanisms.

In Europe, this trend is visible at multiple levels. At the EU level, it is motivated by arguments on harmonising the digital single market, tackling illegal content online and enhancing the responsibility of platforms.Footnote 7 The EU legislature has been steadily churning out horizontal and sector-specific instruments that explicitly delegate monitoring, assessing, and sanctioning powers to the platforms. Examples include, in chronological order, the Code of Conduct on Hate Speech Online, the Copyright in the Digital Single Market Directive (CDSMD), the amended Audiovisual Media Services Directive (AVMSD), and the Terrorist Content Regulation.Footnote 8 At national level, sometimes predating and spurring EU intervention, relevant examples of proposed and enacted legislation include the German Network Enforcement Act (NetzDG), the Austrian Anti-Hate Speech Law (KoPlG), the French Avia Law, and the UK Online Safety Act.Footnote 9

Broadly speaking, these legal instruments and initiatives share common ground insofar as they use similar mechanisms to enhance the ‘responsibility’ of platforms.Footnote 10 First, they attempt to further detail and increase the exposure of platforms to (direct) liability and obligations to deploy enforcement measures for the third-party content they host. Second, they impose additional obligations on the platforms themselves regarding the effective moderation of the content they host and (the behaviour of) their users, as well as the design and functioning of their systems.Footnote 11 Overall, these efforts require platforms to take up additional enforcement duties due to their increasingly prominent role in online communications. Calls for enhanced responsibility of platforms point also to the political, societal, and even moral responsibility that accompanies such a role.Footnote 12

But delegation of enforcement measures to private entities raises several concerns.Footnote 13 First, private platforms are not competent to make decisions on fundamental rights, a role traditionally assigned to the judiciary. Second, platforms do not carry out a proper balancing of the competing rights at stake. As a result, this privatisation of enforcement duties risks adversely affecting the exercise of fundamental rights of Internet users. It may, in particular, lead to undue restrictions to freedom of expression through private censorship.Footnote 14

In response to mounting criticism, European policymakers began introducing measures and safeguards to allow for more effective protection of users’ right to freedom of expression, or at least to mitigate the negative effects of the obligations imposed on platforms. These countermeasures come in different shapes and forms. In some cases, they can be viewed as natural developments of existing duties and obligations in the area of intermediary liability, as updated to the specific legislation or subject matter at hand. This is particularly noticeable in the crown jewel of European platform regulation, the DSA, which amends and extends upon the 2000 e-Commerce Directive.

Illustrations of such countermeasures include the reappearing prohibition on the imposition of general monitoring obligations, specifications of notice-and-action procedures, in-platform redress mechanisms, and out-of-court redress mechanisms, to name a few.Footnote 15 These new instruments also include a proportionality-based approach, manifested in tailoring of obligations to the size, influence or reach of platforms, to mitigate overzealous enforcement.Footnote 16

Some of the countermeasures, however, are harder to conceptualise. In particular, we observe a new class of rules that are sometimes loosely referred to as ‘must-carry’ obligations.Footnote 17 This umbrella term covers a variety of rules, including: prohibitions on moderating (or outright prohibitions on removing) content originating from predefined sources (eg, elected officials); put-back orders by politically appointed councils; obligations to reinstate removed content that is subsequently shown to be lawful; or obligations to preserve or prioritise content for public interest reasons.Footnote 18

Some of the proposed measures have laudable aims, such as avoiding undue restrictions on content of public interest or strengthening the position of legacy media in the online environment. Others, however, present serious risks, such as facilitating State-controlled narratives (or even propaganda).Footnote 19 In any case, these measures or obligations give rise to important questions from the perspective of fundamental rights. For instance, does their introduction mean that certain types of legal content must always be ‘carried’ by online platforms, even against their will? Do some of these obligations create a right to a forum on private property? To what extent are these obligations analogous to the must-carry obligation known in traditional broadcasting regulation? More fundamentally, what are the freedom of expression implications of these obligations?

Historically and legally, the concept of ‘must-carry’ refers to the obligation imposed on transmission services to make certain channels, which serve general interest objectives, available to the public. The central aim of such an obligation is to guarantee access to public service broadcasting and ensure a diverse choice of programmes in order to effectively protect the right to freedom of expression and access to information for the public. However, for private entities subject to the obligation, this may amount to a limitation on their own right to freedom of expression, as they are forced to serve content that they would otherwise not be interested in carrying. Or, as US scholars would point out, a case of ‘compelled speech’.Footnote 20 While broadcasting regulation and the discussed measures for platforms are obviously distinct regimes, the underlying question about the impact of the obligations on the right to freedom of expression of the involved stakeholders looks and feels familiar. Given the abundance of such proposals in the area of platform regulation, as well as their potential risks for fundamental rights, the new wave of ‘must-carry’ obligations justifies closer scrutiny.

3. Mapping the wave of new ‘must-carry’ obligations in the EU

Crucial to the debate on platform regulation is the question of how to find a balance between protecting society from illegal and harmful content, ensuring the effective exercise of the right to freedom of expression and access to information, and respecting the right to conduct a business. Novel must-carry obligations are increasingly viewed by policymakers, at least in part, as an answer to that question. They are advanced as a contemporary policy response to platform power not only in Europe but also across the world, for instance in the US or Brazil.Footnote 21 This section sets out to understand the object and scope of novel must-carry obligations from a European perspective. For this purpose, it discusses different manifestations of these obligations in EU law, European legislative proposals and judicial decisions. Given the broad and loose definition of the ‘must-carry’ term in this context, it is possible to identify several proposals and existing rules at European level in the DSA and in the EMFA. These instruments are examined here on the grounds that they are both relevant (from a policy standpoint) and sufficiently illustrative of the scope and characteristics of this type of rule. We follow the chronological order as it best reflects the development of the arguments and revisions that shaped the ultimate wording of what we view as the central legal provision in our analysis: Article 18 EMFA.

A. Digital Services Act: must-carry proposals in the legislative process

The DSA covers, among others, how online platforms can carry out moderation of content posted by their users.Footnote 22 The main aim of the DSA is to create a safer digital space and to enhance protection of fundamental rights online.Footnote 23 The DSA does not contain rules on content per se. The definition of what is illegal, for instance what constitutes defamation or hate speech, is left to the discretion of the particular EU Member States.Footnote 24 Instead, the DSA contains rules on the liability of providers of intermediary services and separate due diligence obligations that they must abide by. In doing so, it sets out detailed procedures to regulate content moderation practices of online platforms, including with the aim to address the problem of private censorship and over-removal of content.Footnote 25

This is in line with the DSA’s core aim of securing remedies for users whose expression is restricted by platforms.Footnote 26 For this purpose, the DSA provides three routes of redress:Footnote 27 an internal complaint and redress system in Article 20; a non-binding out-of-court dispute settlement mechanism for users and notice-submitters, functioning as an alternative or second instance to complaints; and (non-waivable) judicial remedies under national laws.Footnote 28 Although these mechanisms enhance the right to an effective remedy, they stop significantly short of imposing ‘must-carry’ duties. Platforms retain discretion to exclude lawful but unwanted content, subject to Article 14’s requirement of due regard for fundamental rights. The DSA thus constrains moderation practices without obliging platforms to host all legal speech, rejecting stronger must-carry proposals considered during the legislative process.

The DSA does not contain, strictly speaking, a must-carry obligation. However, the legislative process featured a number of proposals that explicitly or implicitly attempted to impose such obligations. For the most part, these proposals were advanced by different European Parliament committees.Footnote 29 Although none of the proposals made it to the final text of the DSA, they are nevertheless illustrative of the type of novel must-carry obligations under consideration in platform regulation discussions and have the potential to (and in fact did, in the EMFA) resurface in future legislative initiatives. As such, they merit a closer look.

Must-carry of public interest information for recommender systems

One example is found in the context of regulating recommender systems,Footnote 30 where the DSA imposes a series of obligations on very large online platforms (VLOPs), such as regarding the design of these systems and the way in which users can interact with them. During the legislative process, the IMCO Committee considered that these rules should be amended to ‘further strengthen the empowerment of consumers’.Footnote 31 The centrepiece of the various amendments proposed to achieve this aim was a provision that would introduce inter alia ‘a ‘must-carry’ obligation to ensure that information of public interest is high-ranked in the platforms [sic] algorithms’.Footnote 32 According to its proponents, such an obligation ‘should ensure that recommender systems display information from trustworthy sources, such as public authorities or scientific sources as first result following search queries in areas of public interest’.Footnote 33 The goal was to prevent ‘platforms from nudging users into the direction of unscientific propaganda, abusive content or conspiracy theories in order to keep them active on the platform (dark patterns)’.Footnote 34

Prohibition on moderating content from recognised media service providers

In another example, the CULT Committee advanced a text for a must-carry obligation to the benefit of ‘recognised media service providers’, as defined in the AVMSD.Footnote 35 In particular, the proposal targeted VLOPs and stated that they must ensure that their content moderation practices, tools and terms and conditions are ‘applied and enforced in such a way as to prohibit any removal, suspension, disabling access to or otherwise interference with content and services from the account of a recognised media service provider’.Footnote 36 This ‘media exemption’ from content moderation was justified concisely by the need to protect editorial independence in the media sector. For this purpose, it was considered that ‘commercial online platforms should not be allowed to exercise a supervisory function over legally distributed online content originating from service providers who exercise editorial responsibility and consistently adhere to Union and national law as well as journalistic and editorial principles’.Footnote 37 Furthermore, media service providers should ‘remain solely responsible for the content and services they produce’, since ‘platforms cannot be held either responsible or liable for the content offered by media service providers on their platforms’.Footnote 38 As we shall see, a watered-down version of this proposal resurfaced in the EMFA, as a rule requiring special treatment for media service providers.Footnote 39

The ‘Trump amendment’

Perhaps the most prominent example of a must-carry proposal during the DSA drafting process was the so-called ‘Trump Amendment’. The amendment was advanced after several social media platforms suspended the account of the then-US president Donald Trump, based on violations of their terms and conditions.Footnote 40 The event, sometimes referred to as ‘the great deplatforming’,Footnote 41 has given rise to considerable discussion on the power of platforms to curtail the speech of elected politicians. For that reason, we examine it in greater detail here.

The proposal was advanced by the IMCO Committee, as an exception to the measures against misuse of platform services by users who frequently provide manifestly illegal content, or submit notices or complaints that are manifestly unfounded. These measures include temporary suspensions of a user’s account or of the processing of a user’s notices and complaints.

The amendment sought to limit the margin of discretion of platforms when applying the measures in relation to users that are ‘of public interest’. The only example provided in the justificatory note was that of politicians, there being no indication that this concept would also apply to other accounts that might otherwise be considered of public interest, such as journalists, human rights activists or whistleblowers.

For the covered ‘public interest’ accounts, the harsher measure of temporary account suspension ‘must receive the approval of the relevant judicial authority’.Footnote 42 The stated justification was the need to ‘ensure that accounts of public interest, eg, of politicians, are not suspended on the basis of the platform’s decision alone’.Footnote 43

Crucially, this amendment would apply solely to one moderation measure: the temporary suspension of a user account, or ‘de-platforming’. It was not intended to restrict any other content-level measures, such as the removal or blocking of specific posts by the user. In theory, therefore, platforms could still have taken such measures against specific posts without additional judicial approval. The amendment also did not address the possibility of permanent (or at least not time-limited) bans, which was the action taken by several platforms in relation to Trump at the time.

B. Special treatment of media in the European Media Freedom Act

The idea of introducing special treatment of media providers was not included in the final version of the DSA. It has reappeared, however, in the recently adopted EMFA. The goal of the EMFA is to establish a common framework for media services in the internal market, in particular, to protect media pluralism and independence in the EU. To this end, the Regulation provides safeguards against political interference in editorial decisions and against surveillance of journalists. It addresses the issues of the independence and stable funding of public service media, as well as the transparency of media ownership and of the allocation of state advertising. The EMFA, moreover, contains several rules addressing the provision of and access to media services in a digital environmentFootnote 44 and, in particular, the presence of media content on VLOPs. The provision on the special treatment of media service providers, although heavily criticised during the negotiation process, prevailed and appears, in modified form, in the final text of the EMFA.Footnote 45

Article 18 EMFAFootnote 46 mandates that VLOPs provide a functionality for their users to declare their status as media service providers that are ‘editorially independent from Member States, political parties, third countries and entities controlled or financed by third countries’.Footnote 47 Next, those media service providers should declare that they are ‘subject to regulatory requirements for the exercise of editorial responsibility in one or more Member States and oversight by a competent national regulatory authority or body, or adhere[…] to a co-regulatory or self-regulatory mechanism governing editorial standards’. Media service providers should also declare that they ‘do not provide content generated by artificial intelligence systems without subjecting it to human review or editorial control’. In response to the filed declaration, VLOPs should indicate whether or not they accept the declaration.

Media services that fulfil the criteria and are accepted by VLOPs benefit from two main privileges. The first privilege applies when a VLOP intends to suspend the provision of its services to a media service provider or restrict the visibility of its contentFootnote 48 on the grounds that the content is incompatible with the VLOP’s terms and conditions. In that case, the VLOP must communicate a statement of reasons for the intended decision prior to the suspension or restriction taking effect.Footnote 49 With this prior notification, VLOPs should give the media service provider the possibility to reply within 24 hours, although the timeframe may be shorter in case of a crisis.Footnote 50 The content should stay available until the media organisation has been given time to respond.Footnote 51 Interestingly, the described procedure does not apply if the suspension or restriction of visibility is triggered by the VLOP’s obligations in relation to specific types of content. These include various types of systemic risks defined in the DSA, the protection of minors and the general public under the AVMSD, or obligations relating to illegal content.Footnote 52 In other words, the scope of the privileged treatment is limited, as it would not apply to certain types of controversial content, such as (illegal) hate speech, incitement to violence, or racist speech.Footnote 53 The privilege may also not apply in relation to content considered disinformation, in cases where systemic risks under the DSA are identified; we return to this point below.Footnote 54

The second privilege applies where a media service provider considers that a VLOP repeatedly restricts or suspends the provision of its services without sufficient grounds. In that case, the VLOP shall ‘engage in a meaningful and effective dialogue with the media service provider, upon its request, in good faith with a view to finding an amicable solution’ for terminating unjustified restrictions or suspensions and avoiding them in the future.Footnote 55 The media service provider may notify the European Board for Media Services (the ‘Board’) and the Commission about the outcome and the details of such exchanges. It may also request an opinion by the Board on the outcome of the dialogue, including recommended actions for the VLOP.Footnote 56 At the time of writing, the Commission is in the process of drafting guidelines to help protect media providers from unwarranted content removals by VLOPs, thereby further specifying the meaning and scope of Article 18 EMFA.Footnote 57

During the legislative process, the text of the provision that would ultimately become Article 18 was controversial from the start. As pointed out by Joan Barata, the definition of media services included in the Commission’s proposal was very narrow and focused on a traditional understanding of media (‘providing programmes or press publications’).Footnote 58 Because of this narrow framing, the protections introduced in the EMFA would exclude some forms of media and journalistic activity from the scope of application.Footnote 59 The proposed privilege, as a result, would protect content originating from a commercial or a public broadcaster but not necessarily from the personal accounts of journalists employed by them. It would also not apply to posts coming from a human rights organisation or a citizen journalist. This limited approach is not exactly up to date with the recent trends promoting a broad understanding of media and journalism.Footnote 60 It rather follows the ‘old school’ of thought that only professional media actors deserve special treatment.Footnote 61 The European Parliament attempted to extend the definition to apply also to ‘standard and non-standard forms of employment’, clarifying in the preamble that it would include bloggers.Footnote 62 In the final agreement, bloggers are not mentioned. The definition of media services refers to the TFEU and focuses on any form of economic activity. Recital 9, however, explains further that ‘the definition of media service provider should cover a wide spectrum of professional media actors falling within the scope of this definition, including freelancers’.

Another point of criticism of the privilege in Article 18 EMFA refers to the condition of editorial independence. It is a crucial concept of media freedom and pluralism. But it is not defined by law and is subject to many variables.Footnote 63 Both for public service media and for commercial media, multiple complex indicators have to be taken into account to answer the question of independence.Footnote 64 The European Media Pluralism Monitor (MPM), conducted yearly, consistently shows that the effective protection of editorial independence continues to be the source of major concerns.Footnote 65 The 2024 MPM indicated that ‘the European media sphere is still significantly affected by high levels of political capture’, with seven countries considered high risk regarding state interference and political independence.Footnote 66 Editorial independence is also extremely difficult to assess, despite indicators and tools such as the MPM. In many cases, no simple ‘yes’ or ‘no’ is possible; rather, the answer is found somewhere on a spectrum. It would be up to the VLOPs to make that assessment, based on information provided in the self-declaration, ultimately legitimising their power over media providers. The same concern applies to the fact that VLOPs are to assess whether a media service adheres to the standards of editorial responsibility.

To address these concerns, the final text of the EMFA includes an additional safeguard, stating that VLOPs will be able to confirm the provided information with the relevant national regulatory body or the relevant co- or self-regulatory mechanism in case of reasonable doubt. Such an addition should help eliminate varying assessments by different VLOPs. Still, some scholars argue persuasively that the overall design of Article 18 EMFA ‘risks structurally embedding platform firms as adjudicators of news legitimacy’.Footnote 67

The final version of the EMFA, moreover, clarifies the avenues of recourse available (eg, where a VLOP rejects or invalidates a declaration or ignores a response) by directly listing the redress mechanisms available under the so-called Platform-to-Business Regulation (2019/1150) as well as the DSA.Footnote 68 The former instrument creates a procedural privilege for media services by giving them priority in complaints handling, thereby treating them as a special type of business user. The latter is meant for regular (non-business) users. Relying on the DSA might still be attractive for media providers or regular journalists, as it offers more remedies, such as the aforementioned out-of-court dispute settlement procedure. On its own, the DSA mechanism does not offer any preferential treatment, nor does it bump the complaint to the front of the appeals line, but together with the wording of Article 18(5) EMFA it could achieve this effect.Footnote 69

A final and broader critique of Article 18 EMFA is warranted. The provision ‘aims to re-establish a space for traditional media in the online world by providing specific guarantees for media content on digital platforms’.Footnote 70 However, its design appears ill-suited to the specific needs of local journalism. First, the privilege applies only to recognised media service providers. Smaller, less established, or emerging local outlets may find it difficult to qualify. This reflects what Tambini describes as the ‘paradox of media privilege’.Footnote 71 If the criteria are overly rigid or burdensome, many local journalists may fall outside the protected category. Second, Article 18 may reinforce disparities in resources. Even with procedural safeguards, local outlets often lack the legal, financial, or technical capacity to pursue litigation or to ensure compliance by VLOPs. Finally, such privileges do not aim to address the financial fragility of local journalism or its broader structural challenges. We return to this final point below when highlighting alternative approaches to regulating the relationship between media providers and platforms of significant influence.Footnote 72

4. New ‘must-carry’ developments at national level

This section examines the development of so-called must-carry rules for online platforms at national level in Europe. It looks first at the legislative developments in different European countries, namely: the German law prohibiting content discrimination; the UK’s Online Safety Act; and a since-abandoned Polish proposal for must-carry obligations (A). This is then followed by an exploration of national case law in this area, which has proved influential in shaping national debates, as well as the previously discussed EU rules and proposals (B).

A. National must-carry law

Germany’s prohibition on content discrimination

The German Interstate Treaty on Media (Medienstaatsvertrag-MStV) updated the long-standing broadcasting regulation of the Interstate Treaty on Broadcasting and Telemedia.Footnote 73 Its goal is to boost media pluralism and safeguard diversity of information by focusing on visibility and discoverability of information.Footnote 74 In particular, the Treaty updates the rules targeting traditional broadcasting media, while also implementing the amended AVMSD, which is gradually expanding to other types of media services. Article 94 of the Treaty prohibits ‘media intermediaries’ from discriminating ‘against journalistic-editorial offers, the discernibility of which they have a particularly high influence on’. The term media intermediary refers to ‘any telemedia that also aggregates, selects, and generally presents third-party journalistic-editorial offers without combining them into an overall offer’.Footnote 75 It is a broad concept, which includes search engines, social media services, app portals, user-generated content portals, blogging portals and news aggregators.Footnote 76 According to Article 94, discrimination occurs where the criteria on visibilityFootnote 77 ‘are systematically deviated from, in favour or to the detriment of a specific offer, or if these criteria systematically and directly or indirectly impede offers unfairly’ for no objectively justified reason. If the provider of journalistic/editorial content believes that their content has been discriminated against, they can file a claim with their state broadcasting authority.Footnote 78

The provision refers to journalistic-editorial offers, which suggests content provided by traditional media; it is unclear if it would also apply to citizen journalists or bloggers. It creates, therefore, a special treatment rule for a category of content providers. The provision, however, does not prohibit moderation (eg, removal) of singular pieces of content. Rather, it refers to situations of systematic hampering of the whole offer, while disregarding the established criteria.

UK Online Safety Act

The long-debated UK Online Safety Act establishes a new regulatory framework to enhance the safety of internet use in the United Kingdom.Footnote 79 It imposes duties of care on providers of certain regulated services regarding illegal content and content harmful to children. At the same time, it includes provisions to prevent excessive moderation of media or content important to democratic society.Footnote 80

The Act exempts ‘news publisher content’ from platforms’ obligations, meaning platforms are not required to act on such content.Footnote 81 It also provides limited exemptions for unedited, full reproductions of news publisher content on user-to-user services.Footnote 82

Beyond this, the Act introduces a series of duties: to protect content of democratic importance (Section 17), to protect news publisher content (Section 18) and to protect journalistic content (Section 19). It is worth examining them in turn, as they contain features of so-called must-carry obligations.

Section 17 requires targeted servicesFootnote 83 to consider freedom of expression when moderating content ‘of democratic importance’ or acting against users who share it.Footnote 84 Measures must be proportionate to the provider’s size and capacity and applied equally to diverse political opinions.Footnote 85 Policies must be set out in terms of service and enforced consistently for both news publishers and other users, provided the content is intended to contribute to political debate in the UK.Footnote 86

Section 18 was added in late 2022, after the Department for Digital, Culture, Media and Sport announced the introduction of a so-called ‘temporary must carry obligation’ for content coming from news publishers.Footnote 87 The intention was to additionally protect news publisher content that could be caught by a platform’s safety obligations.Footnote 88 The term ‘must-carry’ is not used explicitly in the Act. Instead, it is provided that if a platform were going to take action against news publisher content or against a user that is a recognised news publisher, the platform would have to follow a prior-notification and a mandatory appeals procedure specified in Section 18(3).Footnote 89 In particular, this means the prior notification must specify the action that the provider is planning to take, give reasons for this action referencing relevant parts of its terms of service, explain how the importance of freedom of expression was taken into account, and indicate a time period for a response. However, a service provider may take down the content without following the described process if it reasonably considers that it could face criminal or civil liability for not acting, or if the content in question is of a type of listed offences or illegal content. If the action is taken, the recognised news publisher has to be notified and can request a reversal of the action. If a recognised news publisher is banned from a service, ie, de-platformed, the provider may act in relation to other content of the same news publisher that is still present on the service, without following the indicated steps.

Finally, Section 19 states that the same attention to freedom of expression must be given when making specific moderation decisions on ‘journalistic content’.Footnote 90 The provision also requires the creation of a dedicated and expedited complaints procedure to remedy moderation decisions (with a possibility of reinstatement) regarding journalistic content.Footnote 91 The complaints procedure in Section 19 should apply to any action, not only those taken on the basis of a platform’s terms and conditions (as in Section 18).

Poland’s proposal to protect the freedom of expression of social media users

An interesting example of a novel must-carry obligation appeared in a Polish proposal for a new law on ‘the protection of freedom of expression of social media users’.Footnote 92 Although the work on the proposal had been initiated a few months prior, the plans for a new law were announced in January 2021, coinciding with the deplatforming of Donald Trump. During the press conference, the Minister of Justice argued that censoring decisions by Big Tech corporations ‘threaten[s] and violate[s] the values at the centre of democracy’.Footnote 93 He also stressed that platforms engage in ‘ideologically motivated censorship’ that often targets members of religious and right-wing groups.Footnote 94

Officially, the main goal of the proposal was to curb the arbitrary power of social media platforms by prohibiting the removal of content that does not breach Polish law. The bill introduced several measures aimed at strengthening the protection of freedom of expression online. For example, the proposal contained a mandatory internal appeal procedure for takedowns and account suspensions, under which complaints would have to be resolved within 48 hours, as well as a ‘put back’ procedure.Footnote 95 The proposal, however, also contained several worrisome elements, such as an obligation of data retention and the possibility to file an internal complaint against content defined as unlawful. The term ‘unlawful content’ included illegal content but also disinformation (misleading information created for profit or in ‘breach of public interest’), content infringing personal rights, as well as content infringing ‘public decency’. The proposal, therefore, required platforms to consider complaints against content that was neither illegal nor contrary to the platform’s terms and conditions, but was considered indecent by the complainant, eg, critical of the Catholic church.Footnote 96

The most contentious proposal was the creation of a Free Speech Council, an external appeals body of five members appointed by Parliament. It would review unresolved user complaints about platform content decisions and could order content reinstatement or account reactivation after closed deliberations.Footnote 97 Its decisions would be binding on platforms.

The proposal drew heavy criticism from NGOs, business associations, and the Polish Ombudsman.Footnote 98 The Ombudsman noted that victims of targeted hate speech, if ignored by the Council, would have no recourse, skewing protection toward one side of the political spectrum.Footnote 99 He called the Council’s powers ‘a far-reaching interference in the freedom of speech’, warning they would shift the limits of expression from law to the Council’s discretion.Footnote 100 The Ombudsman and others also criticised the Council’s political nature and dependence on government.Footnote 101

Progress stalled in 2022, partly due to a potential conflict with the newly adopted EU DSA. After Poland’s 2023 elections returned the former opposition to power, the proposal was effectively dropped.

Common features

There are several common features among these German, UK, and Polish laws and proposals that help to understand the potential scope of online ‘must-carry’ obligations. As we will see, these common features can also be traced in the DSA and EMFA.

First, each instrument is justified by the need to regulate platform power in connection with the moderation of content that is not illegal, especially when originating from news publishers and media. In all three examples, the provided justification focuses on the protection of freedom of expression, access to information and plurality of content and opinions. All three frameworks emphasise safeguarding this fundamental freedom, whether through promoting media pluralism (Germany), protecting content of democratic importance (UK), or restricting platforms from removing lawful content (Poland).

Second, in the German and UK examples the core solution to address this concern is through specific protections for journalistic or editorial content. In this sense, the rationale is to provide extra protection to media in the online public sphere. It reflects therefore the need to ‘rebalance the relation between media and online platforms’ expressed, eg, by the European Commission already in 2018.Footnote 102 It also aligns with the belief that quality media is a ‘pillar of democracy’ and access to information in the public interest and free, independent journalism are essential for informed citizen participation in democratic processes.Footnote 103

Third, the protective rules in question aim to prevent platforms from carrying out content moderation that is considered discriminatory or biased towards certain parties or viewpoints. Germany’s law prevents media intermediaries from discriminating against certain types of content, and the UK Online Safety Act requires platforms to follow specific rules when handling news or journalistic content. Poland’s proposal, however, was different. Nominally, it tried to limit platforms’ ability to remove legal content but, in this attempt, it created further possibilities for discrimination and censorship, just directed at other sets of viewpoints, unwelcome to the State administration of the time.

Fourth, each of the examined instruments provides for procedural safeguards in the form of an appeal or oversight mechanism to challenge platform moderation decisions. Germany allows claims to the state broadcasting authority, the UK mandates appeal procedures for news publishers, and Poland’s proposal involved an additional instance in the appeal procedure – the Free Speech Council (although politically appointed and opaque in its operation).

Variations of these common features, we would argue, are all found to some extent in the above-analysed ‘must-carry’ rules and proposals in EU law, especially in the DSA and EMFA.Footnote 104 As we also show, the same features surface in national case law in this area.

B. National ‘must-carry’ case law

Apart from various legislative attempts to introduce different forms of novel must-carry obligations across Europe, there is also a growing body of case law on the topic. Initially, claims to have platforms host content they had previously removed would be resolved outside of courts, either through reinstatement following public pressure, through settlement, or on procedural grounds.Footnote 105 Recently, however, there have been more examples of cases where national courts in Europe take on the question of unjustified removal and order the reinstatement of content. Not all claims for reinstatement, however, lead to such an outcome. Below we provide representative examples from the Netherlands, Italy, Germany and Poland.

In 2020, in the Netherlands, the District Court of Amsterdam decided on two cases about the removal of content critical of the government response to the Covid-19 pandemic.Footnote 106 Taking into account the particular task of preventing harmful disinformation, the Court came to the same conclusion in both cases: the two platforms involved, YouTube and Facebook, did not have to reinstate the previously deleted content.Footnote 107 This line of case law was followed in 2021, when several courts issued judgments on similar cases.Footnote 108

In yet another case from the Netherlands (FVD/YouTube case), the District Court of Amsterdam concluded that, despite YouTube being a platform of great importance for conveying a message, it did not have to reinstate a video of a speech by a politician criticising the government’s measures against the pandemic. The Court first concluded that YouTube does not have a ‘must-carry’ obligation, and that such an obligation has not been included in the DSA. Then, the Court referred to ECtHR case law – in particular, Appleby v. the UK – and concluded that ‘any effective exercise of freedom of expression’ had not been made impossible, nor had the essence of the right to freedom of expression been violated.Footnote 109 This is because the opinion in question could be expressed on several other channels or through a link thereto (eg, the party’s website, its app, the website of the Parliament, and social media). Moreover, the Court added that the party could still use YouTube, and that most of its critical videos were still on the platform. Lastly, the Court argued that the outcome of the judgment could have been different in the case of an account ban (ie, deplatforming), rather than the deletion of just one video.Footnote 110

However, other national courts have reached opposite outcomes, recognising obligations for platforms to reinstate content or accounts in decisions that approximate the imposition of limited ‘must-carry’ obligations. This occurred for instance in an Italian case concerning the removal of content and an account based on incidents that happened outside the platform. In CasaPound, an Italian far-right movement claimed that removal had caused its exclusion from the political debate.Footnote 111 The court argued that Facebook’s decision to deactivate the page and account of CasaPound, based on a violation of its terms and conditions by a series of incidents, could not be upheld. Firstly, the court dismissed Facebook’s argument that the political movement is an organisation that incites hatred and violence. This is because the promotion of the aims of the far-right movement on the Facebook page did not – in and of itself – constitute an incitement to hatred and violence.Footnote 112 Secondly, CasaPound should not be held responsible by association for the incidents, because the content was not present on the organisation’s page and account, despite the involvement of CasaPound’s members and supporters in the incidents. The court strengthened these arguments by highlighting the importance of Facebook in modern-day political discourse and communication with one’s supporters. Moreover, the court pointed to obligations stemming from the Italian Constitution, such as the obligation to protect party pluralism, freedom of association and freedom of expression.Footnote 113 In the appeals procedure, the court emphasised that even though the political character of CasaPound does not in itself oblige the platform to guarantee constitutional rights, a civil contract (eg, for the delivery of a service) should be interpreted in accordance with the Italian Constitution.

German courts, for their part, rely on a combination of the Basic Law, which protects the right to freedom of expression in Article 5(1)(1), with a reference to the function of a platform as a ‘public marketplace’, and the doctrine of indirect third-party effect of fundamental rights (Drittwirkung).Footnote 114 It is noteworthy that multiple content reinstatement cases have been successful in Germany, especially where the providers failed to ensure adequate transparency and due process.Footnote 115

In 2018, the OLG München labelled large online platforms as ‘public marketplaces’ for information and opinions. In that role, such platforms generally would not be allowed to remove ‘admissible expressions of opinion’ that do not qualify as illegal content, even on the basis of their terms and conditions.Footnote 116 Rather, these providers would have a ‘substantial indirect meaningful duty’ to protect the right to freedom of expression of users in the context of content removal decisions.Footnote 117 The OLG Dresden added to this by observing that a private company that ‘takes over from the state to such a degree the framework of public communication’ must also have the ‘concomitant duties that the state has as a provider of essential services’.Footnote 118 Moreover, opinions that are protected under Article 5 of the Basic Law enjoy a higher level of protection, so that their removal cannot be based solely on a violation of the terms and conditions and must not be performed arbitrarily, and users may not be blocked from the service without recourse. Complying with these requirements ensures that platforms can moderate the content they host, delete uploaded content in order to avoid liability, and take down (both criminally and non-criminally punishable) hate speech.

In a similar fashion, the Federal Constitutional Court (BVerfG) issued a preliminary injunction ordering Facebook to allow a right-wing party access to its previously suspended Facebook page to resume posting.Footnote 119 Preventing a political party from using its Facebook page ‘denied an essential opportunity to disseminate its political messages and actively engage in discourse with users of the social network’, which ‘significantly impedes’ the party’s visibility, especially during the elections. This argument bears some resemblance to the argument in the Italian CasaPound case on the exclusion from the political debate. The BVerfG also clarified that, indeed, fundamental rights can be effective in disputes between private parties. This is particularly so if a private party enjoys ‘significant market power’ in Germany, as Facebook was considered to do. The indirect third-party effect, however, is not reserved solely for freedom of expression: all relevant fundamental rights must be balanced to determine if terms and conditions alone can justify the deletion of a particular statement.Footnote 120

Finally, the German Federal Court of Justice (FCJ) further clarified in 2021 Facebook’s content removal rights and duties in the light of its dominant position in the market.Footnote 121 The FCJ argued that Facebook can develop its own internal rules (eg, Community Standards) and enforce them by removing posts and blocking accounts in case of a breach, provided that it takes its users’ fundamental rights into account. The developed internal rules must be clear and leave little room for interpretation.Footnote 122 Reasons for removal of content or blocking of an account should be objective and the platform may not ban specific (predefined) opinions. Moreover, due to its size, Facebook must comply with due process requirements, as a State would have to do when censoring expression. This procedural protection of fundamental rights, specifically, requires Facebook to: (i) inform a user about any removal of their content (after) and of any intention to block a user’s account (before); (ii) inform a user of the reason for the action; (iii) give a user an opportunity to respond; and (iv) issue a new decision after a review, with the chance of reinstating the removed content.

Lastly, in a Polish case, Facebook and Instagram removed fan pages and groups run by an NGO – the ‘Civil Society Drug Policy Initiative’ (‘Społeczna Inicjatywa Narkopolityki’, or ‘SIN’) – due to an unspecifiedFootnote 123 violation of the Community Standards.Footnote 124 Arguably, the decision could have been caused by SIN’s unusual approach to drugs, which focuses on safe use rather than abstinence.Footnote 125 There was, however, no warning and no explanation of the reasons for the removal. There was also no possibility to appeal. In 2019, SIN filed a lawsuit arguing that blocking access to its content was arbitrary and unjustifiably restricted SIN’s possibility to disseminate information, express opinions and communicate with its audience. Moreover, SIN argued that it prevented the continuation of its educational activities and undermined its reputation by suggesting that its activity was unlawful. In 2019, the District Court in Warsaw issued an interim measure that temporarily prohibited Meta from removing fanpages, profiles and groups run by SIN, as well as from blocking individual posts,Footnote 126 to allow SIN to continue its educational activities until the case was resolved. The court also obliged Meta to store profiles, fanpages and groups (including comments) that were previously deleted, to make sure that they can be easily restored in the future.Footnote 127 In early 2024, the court of first instance ruled that Meta cannot block users without any justification and without providing them the possibility to effectively challenge the decision.Footnote 128 Interestingly, Meta’s actions – blocking content without justification and without the possibility of appeal – would also constitute a violation of the DSA, which imposes specific content moderation obligations.

As with the national laws examined above, there are important common features between German, Italian, Dutch, and Polish case law. First, the key legal concern of courts in imposing restrictions on platforms is to safeguard the right to freedom of expression in the context of impactful content moderation decisions, especially in political contexts. Second and related, courts highlight the need to balance freedom of expression with other affected rights, taking into account the impact of the measures on public debate. In this context, the political speech of specific speakers receives special protection, with courts often ruling against platforms when moderation limits political participation. Third, a key consideration for courts assessing restrictions on the platforms’ moderation activities is whether the platforms at issue are seen as playing the role of a public space or forum for discourse, or as having significant market power. Fourth, all courts focus on procedural safeguards of due process and transparency, emphasising the need for clear, transparent moderation processes and the availability of effective remedies, requiring platforms to inform users, provide reasons for removal, and allow appeals.

5. From ‘must-carry’ to special treatment and the right to freedom of expression

The next question concerns the extent to which the must-carry rules examined above align with European fundamental rights law, particularly the right to freedom of expression. First, we need to highlight that the ECtHR, when addressing questions of freedom of expression and online service providers, distinguishes between the responsibilities of professional media outlets and non-media platforms regarding third-party online content. This is reflected in its contrasting rulings in Delfi AS v. Estonia (2015) and MTE and Index.hu v. Hungary (2016), both concerning liability for user comments posted under news stories, in light of Article 10 ECHR. In these cases, the court’s reasoning varied according to the content at issue and the platform’s nature.Footnote 129 The nuanced approach of the court can be linked to the 2011 CoE Recommendation on the new notion of media.Footnote 130 The Recommendation argued for a differentiated and graduated policy approach to the different actors participating in the rapidly evolving new media ecosystem.Footnote 131 This nuanced approach is very much present in policies addressing media and in the court’s jurisprudence, reflecting the idea that the rules applied to different media actors (eg, traditional media vs online platforms) may not always be the same.

In this section, we analyse whether these rules can be justified under the doctrine of positive obligations in human rights law, drawing on examples such as the right to reply in media law, which mandates the publication of certain content (A). Next, we examine must-carry obligations in the traditional broadcasting context, which protect access to information and pluralism, and assess whether the analogy between these rules and the new obligations for online platforms is justified, concluding that it is not (B). Finally, we argue that these new rules are better understood as special treatment provisions in platform regulation. We evaluate their impact on media organisations, politicians, and freedom of expression, critiquing their potential to entrench power imbalances, weaken democratic values, and inadequately address disinformation and media trust issues within the DSA and EMFA frameworks (C).

A. Obligations to effectively protect freedom of expression

According to European human rights instruments, such as the ECHR and the EU Charter, States cannot interfere with the exercise of protected rights unless specific requirements are met. But States may also have an additional obligation to effectively protect fundamental human rights from interference by others, including by private parties.Footnote 132 Such an obligation requires States to take an active stance in private conflicts and may justify the introduction of must-carry-like obligations.

In the ECHR, the concept of positive obligations is based on Article 1, which requires that the States ‘shall secure to everyone the rights and freedoms defined in the Convention’.Footnote 133 In the context of the right to freedom of expression, this involves an obligation for governments to promote the right and to provide for an environment where it can be effectively enjoyed. This means protecting the freedom of expression against interference, even by private parties.Footnote 134 Moreover, States are required to create a favourable environment for participation in public debate for everyone and to enable the expression of ideas and opinions.Footnote 135 The obligation is also understood as an obligation to act, or to implement, for example by enacting domestic legislation to protect the right. Lack of such action may trigger the responsibility of a State, even if the resulting interference has been conducted by a private party.Footnote 136 When examining EU law, however, the main human rights instrument of reference is the EU Charter.Footnote 137

Under the Charter, the obligation to respect the rights contained therein (negative obligation) is clearly articulated. The existence of an obligation to protect (positive obligation) is less obvious. For a long time, the CJEU did not refer explicitly to the doctrine of positive obligations in its jurisprudence.Footnote 138 It focused instead on proportionality, fair balance and the lack of effective protection of Charter rights. Arguably, this approach allowed the CJEU to reach similar results to those of the ECtHR when the latter applies the positive obligations doctrine.Footnote 139

This subtle approach has recently been changing, however, since the CJEU ruled in Commission v Hungary that the negative obligation of public authorities may be ‘supplemented by a positive obligation to adopt legal measures seeking to protect private and family life’.Footnote 140 Later, in La Quadrature du Net, the CJEU ruled that ‘positive obligations of the public authorities may result from Article 7 Charter requiring them to adopt legal measures to protect private and family life’.Footnote 141 Both judgements also refer explicitly to the jurisprudence of the ECtHR, highlighting that the corresponding rights of the Charter and the Convention must be regarded as having the same meaning and scope.Footnote 142 Although both rulings concerned the right to respect for private and family life enshrined in Article 7 Charter (corresponding to Article 8 ECHR), they clearly indicate the growing willingness of the CJEU to explicitly recognise the doctrine of positive obligations and the role of States in actively protecting fundamental rights.

This finding is useful when examining EU and national legislation that aims to ensure better protection of the expressive right. The question remains whether the doctrine of positive obligations and effective protection of freedom of expression can be construed as an underlying legal baseline for introducing novel must-carry obligations on online platforms.

An important step towards answering this question is to examine whether this doctrine has been employed before to justify rules that compel publication of specific content.

The ECHR and the Charter both recognise that protection of the enshrined fundamental rights requires active engagement of the legislator, which must ensure their effective protection. But does this mean that users must always be given a forum? Clearly, that is not the case. As ruled by the ECtHR, Article 10 does not bestow any ‘freedom of forum’.Footnote 143 That is to say, Article 10 does not guarantee any ‘freedom of reach’, meaning a right to have one’s content broadcast on any particular private forum or, in our case, private platform, even one of significant influence.Footnote 144 States, however, might be required to step in where a legal act, including a private contract, appears unreasonable, arbitrary, discriminatory or inconsistent with the principles underlying the Convention.Footnote 145 That would mean setting limits to the rules that private owners establish on their property.

Arguably, rules imposed by States on private owners to limit prerogatives that could negatively affect others are not unusual, including when it comes to speech, for example prohibitions on showing certain content, considered harmful, to minors.Footnote 146 Such restrictions effectively limit the owners’ right to private property, their right to conduct a business and their freedom of expression, as a result of a balancing exercise with other rights at stake. Examples can also be found in rules and judgements on press freedom, in particular on the ‘right to reply’, which mandates publication of certain information.

The right to reply is a particular form of access to a forum, initially used in written media and later expanded to the online environment.Footnote 147 Essentially, it provides the possibility to publish a response to inaccurate information in the same medium where the original statements were made. If the subject of the information wishes to make use of this possibility, the publisher must make the reply public, subject to specific conditions. The purpose is to provide a way to protect oneself against statements or opinions disseminated by the media that are likely to be injurious to one’s private life, honour, dignity and reputation. The right to reply is based on the premise that it should be possible to contest untruthful information, but also to ensure a plurality of opinions, especially in matters of general interest such as literary and political debate.Footnote 148 According to the court, such situations may create a positive obligation ‘for the State to ensure an individual’s freedom of expression in such media’, for example by requiring publication of a retraction, an apology or a judgment in a defamation case.Footnote 149

The right of reply, therefore, demands a balancing of the media’s right to freedom of expression against the right to freedom of expression (alongside reputation and other rights) of the subject of the information. It is, however, an exception to the general rule that newspapers and other privately-owned media must be free to exercise editorial independence in deciding what to publish, including articles, comments and letters submitted by readers.Footnote 150 This limitation of editorial freedom, allowing for the compelled publication of content that the publisher would not necessarily want to publish, is another example of a State restricting freedom of expression to protect the expressive rights of others.

In essence, it can be said that the doctrine of positive obligations and its manifestations, like the right to reply, clarify at least two important aspects vis-à-vis ‘must-carry’ obligations. First, it is possible for States to impose limitations on private parties in order to compel them to carry some types of content on the grounds of freedom of expression. This possibility, we argue, extends in principle to online platforms and to the user-generated content they host and provide access to. Second, however, this possibility is limited. It is construed as an exception that should be carefully calibrated and balanced against competing rights and interests, and adjusted to the particular scenario it covers. In our case, and somewhat in line with the European and national case law and examples provided above, this means that new ‘must-carry’ obligations should take into account a number of factors that both condition their admissibility and restrict their scope of application, including the type (eg, media vs non-media provider) and relative power and/or reach of the platform affected (eg, whether it is of significant influence, such as VLOPs in EU law), the privileged speaker, the type of expression to be carried, the context and severity of the speech, and the proportionality of the obligation considering alternative means of expression. Considering these constraints, the question emerges whether the ‘must-carry’ label is suitable for these types of obligations.

B. What’s in a name? Old vs new ‘must-carry’

The term ‘must-carry’ has surfaced frequently in contemporary discussions on restricting the freedom of online platforms to moderate lawful content on their services.Footnote 151 It is a shorthand, or a convenient analogy, to describe the new rules we have discussed thus far. But do these rules even resemble traditional must-carry regimes? Traditional must-carry obligations limit private discretion in broadcasting by obliging transmission services to make certain channels that serve public interest objectives available to the public. They emerged in the context of electronic communications, in light of the growing power of cable providers tempted to suppress local broadcasters. Their aim was to guarantee access to public service broadcasting and ensure a diverse choice of programmes, so as to effectively protect the public’s right to freedom of expression and access to information. From the perspective of the private entities subject to the obligation, this amounts to a limitation on their right to freedom of expression. They are forced to carry content that they would otherwise not be interested in carrying. They are also restricted in their ability to use their own capacity freely.

In the EU, these obligations were originally provided for in Article 31 of the Universal Services Directive, now amended and regulated in Article 114 of the Electronic Communications Code.Footnote 152 That must-carry provision requires that the obligations introduced are reasonable and apply to specified radio and TV channels; they are not meant to cover all channels transmitted by a private broadcaster. As confirmed by the CJEU, ‘must-carry’ status should not be awarded automatically but should be strictly limited to channels whose overall content serves general interest objectives.Footnote 153 These objectives, moreover, must be clearly defined. A mere general statement that the imposed obligation aims to ensure plurality and cultural diversity is not sufficient.Footnote 154 Additionally, ‘must-carry’ should be proportionate and transparent. This means that the way it is applied ‘must be subject to a transparent procedure based on objective non-discriminatory criteria known in advance’.Footnote 155 The obligations can be imposed on the providers of electronic communication networks and services that are used by a significant number of users as their principal source of radio and TV channels.Footnote 156

Looking at the motivations for old vs new ‘must-carry’ rules, an important distinction emerges. Traditional must-carry obligations were introduced in the 1990s to address the problem of scarcity of space in analogue and cable broadcasting.Footnote 157 With the supply of channels growing quickly, these rules were meant to ensure that public service broadcasting, which was financed by the general public, could actually reach the public.Footnote 158 New must-carry rules are motivated by concerns over the growing private power of large-scale influential platforms over online expression. The goal is to prevent them from arbitrarily restricting access to (lawful) expression. From the perspective of reining in the significant power of private entities to determine what content is available on their channels or services, the analogy stands.

Crucially, however, there are significant differences. In the current digital environment, for instance, there is no shortage of space on the internet. The problem lies rather with the shortage of viewers’ time and attention. Furthermore, the two sets of rules are aimed at different actors: electronic communication networks and services on the one hand, and online platforms (a type of hosting service provider) on the other.Footnote 159 The design of the obligations also differs between the two regimes. For traditional must-carry obligations, the focus lies on carrying specific designated channels without restrictions. For the new rules, the scope varies, but it usually relates to certain categories of speakers (eg, media providers or politicians) and specific types of content (eg, journalistic or political). New ‘must-carry’ obligations are also more nuanced in their design, in line with the underlying rationale of safeguarding freedom of expression against certain content moderation measures. They impose procedural safeguards to prevent or mitigate the impact of such measures, such as ex ante and ex post information requirements, as well as review and redress mechanisms.

When comparing traditional and new must-carry obligations, it becomes evident that the similarities lie more with a conceptual representation of what a ‘must-carry’ rule entails than with a direct application of these traditional rules to a new technological and business reality. At a high level, a conceptual similarity can be found in the rationale of the approach: both types of rules are grounded in the notion that States may impose limitations on private rights, including fundamental rights, for the protection of the rights of others. This is the case even when the limitation is imposed on the right to freedom of expression. Furthermore, such a limitation takes the shape of an obligation for a provider of significant scale, reach and influence to accommodate third-party content on its service on grounds of general or public interest, even against its will. But in the end, it is still merely an analogy.

Given the differences between regimes, it is hard to argue that insights from the traditional must-carry obligations are directly applicable to adjudicate freedom of expression conflicts associated with modern rules for online platforms. This realisation allows us to step away from the ‘must-carry’ label and identify the new rules for what they are. In our view, the rules examined above are better understood as restrictions on the rights of platforms in order to recognise a special treatment privilege for some categories of speakers and their expression. As such, for the remainder of this paper we will refer to these rules – including legislative proposals and judgements to effectuate such rules – as special treatment rules.

C. Special treatment and its discontents

Special treatment rules aim to safeguard certain speakers and their lawful content from being subject to moderation restrictions by platforms based on their internal policies and practices. They aim to bring more balance to the relationship between platforms, speakers and the audience in the digital information ecosystem. Arguably, such rules restrict the rights of platforms but do not violate the right to freedom of expression protected under the ECHR and the EU Charter.

From the perspective of freedom of expression, nevertheless, relevant questions arise. First, what would be the effect of such rules on the ‘platformised public sphere’, and in particular on efforts to curb disinformation?Footnote 160 Second, whether and to what extent should the public interest and the unique role of the media in shaping public debate be considered in this discussion?Footnote 161 And finally, does providing more protection to the speech of certain speakers undermine the principle of equality of speech in a society based on the rule of law? The two primary examples of special treatment rules that we have examined – one in favour of politicians and the other for traditional media providers – highlight these particular aspects.

Special treatment for politicians

The first type of special treatment rule we discuss is meant to protect politicians from content moderation practices by platforms of significant influence.

From our analysis of legislative proposals earlier in this paper, the quintessential example of a special treatment rule for politicians is the DSA proposal for a ‘Trump amendment’, which was meant to create an obstacle to the termination of a user account (deplatforming), including by requiring the approval of a judicial authority.Footnote 162 The rejected rule would not have covered other content moderation measures, such as removal, blocking, or labelling of specific posts by that user. The limitation was facially based on ‘public interest’,Footnote 163 but that protection would only work in one direction – to protect a specific category of users, those in power. Speakers in an unprivileged position would have no special protection against deplatforming, even if their content was relevant from a public interest perspective. As such, this particular understanding of ‘political speech’ protection would apply exclusively to speech by politicians, but not to others’ speech about politicians.

On a practical level, this type of rule is hardly revolutionary. Many platforms, especially VLOPs, already have internal rules that favour specific categories of users and provide special protective treatment for politicians, world leaders,Footnote 164 or, more generally, their ‘high profile accounts’.Footnote 165 In one prominent example, such practices were revealed in the Facebook Files leak of 2021 and can be traced in several Meta Oversight Board cases.Footnote 166 Similar findings appear in the recent audit of the VLOP X (formerly Twitter), which notes that ‘(a)ccounts with large followings or “verified” status appear to be treated differently to regular users’.Footnote 167 The ‘Trump amendment’, however, would have legitimised granting special protection to a category of users who in many cases would have little problem finding alternative means to exercise their right to freedom of expression.Footnote 168 They could do so, for instance, through a friendly broadcaster or a widely attended official press conference.Footnote 169 Arguably, the existing alternatives are not fully equivalent to the most popular social media platforms. But the expressive opportunities available to politicians still give them an advantage compared to average users facing deplatforming. Such bans continue to happen for multiple reasons, yet they rarely cause similar controversy or trigger policy discussions.Footnote 170

On this point, it is crucial to provide additional nuance regarding political speech on online platforms. As is well known, the ECtHR’s balancing of freedom of expression in the online context places different weight on different categories of expression. Arguably, political and public interest expression enjoy the highest level of protection under Article 10 ECHR, with the court repeatedly holding that there is ‘little scope’ for restrictions on political speech or debate on matters of public concern, as tolerating even offensive comments may be crucial to enable open democratic debate. When applying this reasoning to speech by politicians, the discussed special treatment rules seem to be a good fit. The ECtHR, however, provided an interesting addition to the discussion in its 2023 ruling in Sanchez v. France,Footnote 171 where it viewed a politician during an election period as a category of ‘high risk speaker’.Footnote 172 The court also clarified that political speech, even though it ‘calls for [an] elevated level of protection’, is not granted absolute protection.Footnote 173 Applying these insights to our analysis underscores, first, that politicians already enjoy a high level of protection under the existing framework, leaving little justification for a special treatment rule that further shields them from platforms. Second, and relatedly, the potential high-risk nature of the politician-as-speaker cautions against special treatment rules that may prevent justified moderation practices.

The ultimately abandoned DSA amendment now seems to have been merely an overreaction to the deplatforming of the then-sitting US President. It perhaps illustrated political leaders’ fear of being cut off from their forum. While the amendment was presented as promoting freedom of expression and protecting it from arbitrary decisions by platforms, it would have had a perverse effect on the essence of the right by shielding only privileged speakers. As designed, special treatment rules like the Trump amendment would arguably afford politicians prima facie protection against hasty moderation decisions. That should not mean, however, that their speech should be afforded near-absolute protection, which would go beyond current human rights law.

Special treatment for media providers

Examples of rules providing for special treatment of content from traditional media providers include the German law’s prohibition on discriminating against journalistic editorial offers by media providers, and the UK Online Safety Act’s protection of news publisher content and journalistic content. At the EU level, we identify the DSA media exemption proposal (and, to some extent, the proposal on public interest information for recommender systems), and the EMFA rule on special treatment of media providers.Footnote 174

The prime exponent of this approach is Article 18 EMFA. As noted, this provision requires platforms to consult media providers before moderating their content and introduces an expedited appeals process.Footnote 175 While it does not prohibit moderation, it makes the process slower and more complex. Critics highlight its narrow definition of media providers, which excludes certain media and journalistic activities.Footnote 176 Concerns also arise over platforms’ role in the self-declaration mechanism, as it may entrench large platforms’ dominance. Although platforms can verify editorial independence and standards with regulatory bodies, it is unclear whether they will actively verify declarations, adopt a passive stance, or selectively scrutinise certain media providers.Footnote 177

Yet another criticism of Article 18 EMFA is its potential impact on efforts to combat disinformation. This concern emerged prominently during the DSA negotiations and resurfaced strongly in the EMFA discussions.Footnote 178 The central issue is that the special protections could be granted to media service providers that actively spread disinformation or serve as propaganda channels for authoritarian governments. This concern becomes especially clear when we consider that public service media in Member States like Hungary and (until recently) Poland – known for disseminating disinformation and state propaganda – could qualify for these protections.Footnote 179

To address the risk of this rule being exploited to transform media providers into ‘propaganda megaphones’, and to ensure that the rule only benefits entities adhering to professional journalism standards, the EMFA introduced additional requirements. These include a requirement for media providers to maintain independence from entities associated with third states and political parties.Footnote 180

Article 18 further contains a provision that limits its application to situations where (i) the content is incompatible with the platform’s terms and conditions, and (ii) the content does not contribute to a systemic risk as defined in Article 34 DSA.Footnote 181 To be sure, Article 34 DSA does not explicitly mention disinformation as an autonomous systemic risk category.Footnote 182 However, Recital 83 DSA clarifies that within the risk category related to the protection of public health (as well as the protection of minors, serious negative consequences to a person’s physical and mental well-being, or gender-based violence), a source of such risk can be coordinated disinformation campaigns.Footnote 183 More broadly, Recital 84 DSA states that when assessing all the categories of DSA systemic risks – including negative effects on democratic processes, civic discourse, electoral processes, and public security – VLOPs should also focus on lawful information that contributes to those systemic risks. The recital adds that particular attention must be paid to how these providers’ services are used to disseminate or amplify misleading or deceptive content, including disinformation. In this regard, the DSA incentivises VLOPs to take measures against the spread of disinformation and to fight disinformation campaigns, including in the context of codes of practice or codes of conduct.Footnote 184 To the extent that the spread of disinformation by media organisations is considered a contributing factor to systemic risks under the DSA, this risk mitigation regime may apply to such content, arguably pre-empting the application of the privilege in Article 18 EMFA.Footnote 185 If that is the case, platforms that comply with their DSA obligations will still be able to take action against disinformation in certain contexts, even if it comes from media organisations, since Article 18 EMFA will not apply in those scenarios.Footnote 186

Is this enough to prevent the negative effect of the provision on the fight against the spread of disinformation? The answer depends on how VLOPs respond to this obligation. The safeguards added to Article 18 EMFA were generally well received by scholars, who consider the provision to be ‘well-equipped to avoid becoming a vehicle for disinformation’.Footnote 187 One might question, however, whether these limitations still preserve the original intent of protecting media organisations from potential censorship by platforms, especially in situations where platforms abuse their algorithmic power to limit the visibility and reach of certain content (eg, by downranking or shadowbanning).Footnote 188

The rules on special treatment of media organisations force us to consider yet another aspect. The provision grants privileged status to media organisations for a specific reason. In the words of the Committee of Ministers of the Council of Europe, media have always been ‘the most important tool for freedom of expression in the public sphere, enabling people to exercise their right to seek and receive information’.Footnote 189 Awarding this special treatment is justified by the particular and unique role of the media, based on informing the public, shaping public debate and upholding democratic values. Quality media, for this reason, are described as a ‘pillar of democracy’.Footnote 190 Yet, in recent years, traditional media have lost much of their opinion-forming power.Footnote 191 This process is accompanied by decreasing levels of trust in traditional media and declining engagement in the form of news avoidance and news fatigue.Footnote 192 The media landscape has transformed significantly due to rapid technological advancements, and in particular the development and dominance of online platforms, which have redefined information-sharing practices in the public sphere.

Article 18 EMFA tries to address this changing ecosystem and to reformulate the relationship between media and platforms of significant influence by strengthening the position of legacy media. The rule on special treatment of media organisations, however, amounts to creating a privilege for speakers who occupy the top rungs of the information ecosystem hierarchy and who already have established channels to disseminate their message, even if they have lost some of their traditional prominence. This special treatment does not protect ordinary speakers or dissenters critical of the status quo. Consequently, this type of rule may undermine the fundamental principle of equality before the law. It promotes the notion that certain types of speech deserve greater protection than others,Footnote 193 effectively transforming the right to freedom of expression into a privilege reserved for a select group of already privileged speakers. From this perspective, it is difficult to see how this approach can contribute to reversing the trend and reinstating trust in media organisations.

Reflections

At a general level, the introduction of special treatment rules may be understood as an effort to recalibrate the relationship between platforms of significant influence and their users by limiting the former’s capacity to exercise disproportionate control over particular categories of speakers and their expression. At a more concrete level, however, the heterogeneity of speakers and forms of expression entails distinct rationales and consequences, as illustrated by our analysis of privileges accorded to media organisations and politicians. Accordingly, our evaluation of these two categories of special treatment provisions, within the framework of the right to freedom of expression, leads to different outcomes.

In the case of special treatment rules for politicians, the situation is problematic. The risk of democratic backsliding increases when politicians are granted special protections that shield them from moderation, over and above the already robust safeguards afforded by human rights law to political speech, thereby threatening the impartiality and fairness of public discourse.Footnote 194 For the time being, such forms of special treatment have been abandoned, deservedly. Our view is that they should not be pursued further.

In the case of media organisations, special treatment rules are often motivated on the grounds of ensuring the availability of trustworthy, quality information by strengthening the position of legacy media. This is a commendable aim. However, it is not without risks to the democratic process, information integrity, and informed public discourse.Footnote 195 From our perspective, a poorly calibrated media privilege can unduly restrict platforms of significant influence in the moderation of disinformation and propaganda by media service providers that are either controlled by or politically aligned with governments in their jurisdiction. In doing so, a special treatment rule would have the unintended consequence of removing an important check on such media providers. The scenario is particularly concerning in contexts where media outlets have been co-opted by authoritarian governments, as seen in certain EU Member States. In the EU, the final version of Article 18 EMFA seeks to address this concern through requirements such as the self-declaration of editorial independence, subjection to regulatory oversight or a recognised self-/co-regulatory mechanism, and the disclosure of ownership structures and points of contact, among others. At this stage, however, it remains unclear whether these requirements will be sufficient to effectively mitigate the concerns outlined above. Much will depend on how the provision is implemented and enforced in practice.

Both types of special treatment, however, pose a risk to the principles of democratic governance and the rule of law. By granting preferential protection to already influential speakers, these rules undermine the foundational democratic ideal of equal treatment before the law. They enshrine a tiered system of expression that favours established voices, potentially entrenching existing power structures and stifling dissent. They may also convey a problematic message: that restrictions on the speech of those not covered by any privilege are (more) acceptable.Footnote 196 As such, while special treatment rules may be well-intentioned, they must be scrutinised carefully to ensure they do not inadvertently erode democratic values and the rule of law, transforming freedom of expression into a privilege for the already powerful and treating the speech of others as a potential risk.Footnote 197

In the end, our analysis highlights a crucial trade-off in media and platform regulation in light of freedom of expression. There is an increasingly pressing need to protect trustworthy journalistic content, particularly in a polarised and fragmented political and societal landscape. To address this need, the EU legislature is experimenting with rules within a complex legislative framework, including the DSA and EMFA, eventually landing on the special treatment provisions examined here.

While their successful implementation is desirable, we remain concerned that these measures will not remedy the structural power imbalance between legacy media and dominant platforms. Regulatory pressure to shield media organisations from moderation offers only a short-term fix that risks reinforcing their dependency. Requiring a presence on platforms that suppress the reach of quality media is unlikely to shift existing dynamics. More durable solutions are needed, beyond compelled inclusion or forced engagement. Although a full assessment lies outside the scope of this paper, promising alternatives include financial support for quality journalism and local media, and renewed emphasis on user empowerment online.

On the first point – of financial support – in a 2023 assessment of the EMFA proposal, Brogi and her coauthors advanced recommendations on issues they considered crucial for media freedom and pluralism but were absent from the draft regulation. Beyond strengthening political independence, enhancing transparency in relations between media providers and digital intermediaries, and establishing an independent mechanism to monitor implementation of Articles 34 and 35 of the DSA, they highlighted the need to finance journalism as a public good.

They argued that the economic sustainability of journalism is vital for editorial independence and media pluralism, yet the EMFA proposal overlooked direct financial support. Building on limited Creative Europe funding, they recommended expanding such programs to foster innovation, investigative reporting, and local media. Further proposals included rethinking the remit of public service media, channelling revenues from a potential digital tax, and creating a European Fund for Journalism to reduce risks of political capture while supporting cross-border initiatives. They also suggested allocating part of the revenues from international corporate tax reform to media pluralism, ensuring that large platforms contribute to restoring the sustainability of a sector disrupted by digital transformation. These remain valid recommendations to this day, and illustrate the limitations of a special treatment rule as a solution to such broader challenges.

On the second point, we refer to the recent draft CoE recommendations on online safety and empowerment of content creators and users. In this context, user empowerment refers to enabling individuals to expand their understanding, make informed choices, and exercise control over their online experience, so that they can benefit from opportunities and address risks without undue burden. It encompasses online measures including effective tools for personalising platform use, mechanisms to exercise and protect rights, and avenues for collective action.Footnote 198 Crucially, the Recommendation states that for content that is lawful but prohibited by platforms’ terms and conditions, States should rely on ‘alternative, proportionate ways of mitigating risks, including user empowerment measures, in the framework of online safety, user empowerment and platform accountability framework’.Footnote 199 The draft recommendation further states that systemic duties on intermediaries regarding lawful content or behaviour should not serve as a backdoor for content-specific restrictions lacking a clear legal basis.Footnote 200 A consequence of this approach, as noted by Daphne Keller, is that user empowerment measures offer States an alternative to must-carry or special treatment rules that they would otherwise impose to compel platforms to host or prioritise particular types of expression.Footnote 201

Ultimately, although such privileges may be justified in certain contexts and offer limited short-term benefits, we contend that they do not constitute viable long-term solutions. They are unlikely to provide the decisive means to recalibrate the power imbalance between media providers and platforms of significant influence, or to address the broader challenges of independence and trust identified above.

6. Conclusions

The analysis in this paper reveals the increasing reliance on so-called must-carry obligations as a regulatory tool for addressing the challenges posed by online platforms, particularly in safeguarding media content and protecting freedom of expression. Key findings include the identification of new ‘must-carry’ measures in the DSA (in the form of rejected proposals) and the EMFA (Article 18), which aim to enhance protections for media providers while raising concerns about potential overreach and risks to platform autonomy. At the national level, the German prohibition against content discrimination, the UK’s Online Safety Act provisions, and Poland’s abandoned legislative proposal illustrate varied approaches that emphasise media pluralism and user rights while exposing tensions with overarching EU frameworks.

Our analysis of influential case law from EU Member State courts (the Netherlands, Germany, Italy, and Poland) dealing with similar obligations on online platforms reaches similar findings and highlights analogous challenges. These measures and judgements reflect a shared goal of curbing undue platform power, primarily on fundamental rights grounds, yet they diverge significantly in methods and scope. Crucially, we argue that these measures are better understood as special treatment rules that restrict platform discretion in content moderation and privilege certain categories of speakers and their expression, namely media organisations and politicians.

Our analysis of special treatment rules for politicians and media providers shows that, while these provisions are designed to rebalance the relationship between large platforms and influential speakers, they risk undermining equality and freedom of expression. Special protections for politicians, such as the proposed ‘Trump amendment’ under the DSA, would have granted political actors additional safeguards against deplatforming even though their speech already enjoys strong protection under human rights law and they possess ample alternative means to communicate. Such measures risk shielding those in power from justified moderation and eroding the integrity of democratic discourse.

Rules privileging media organisations, most notably Article 18 EMFA, aim to safeguard quality journalism and ensure the continued availability of trustworthy information. While these goals are commendable, the mechanism is susceptible to criticism. Granting preferential status to legacy media entrenches the dominance of already privileged actors, risks protecting outlets that disseminate disinformation or propaganda, and may hinder platforms’ ability to address systemic risks from lawful but harmful content under the DSA. Although the EMFA includes safeguards such as editorial independence requirements and oversight mechanisms, their effectiveness remains uncertain.

Both types of special treatment risk creating a two-tier system of expression, where established voices receive disproportionate protection while ordinary speakers remain vulnerable. This dynamic undermines the democratic principle of equality before the law and conveys a troubling message: that restrictions on non-privileged speech are more acceptable. Even when well-intentioned, such rules may reinforce structural imbalances, entrench existing hierarchies, and transform freedom of expression into a privilege for the already powerful.

A more sustainable response requires moving beyond compelled inclusion or forced engagement on dominant platforms. Durable solutions should focus on strengthening the independence and sustainability of journalism through direct financial support, fostering innovation and local media, and establishing safeguards against political capture. At the same time, user empowerment measures must be prioritised as an alternative to special treatment rules, equipping individuals with tools to shape their online environment and exercise their rights without undue dependence on platforms’ often opaque policies.

Ultimately, while narrowly framed privileges may offer temporary relief, they do not provide viable long-term solutions to the structural power imbalances between platforms of significant influence, politicians, and media organisations. If the objective is to restore trust, pluralism, and resilience in the digital public sphere, the path forward lies in reinforcing systemic safeguards, empowering users, and supporting quality journalism as a public good.Footnote 202

Acknowledgements

Aleksandra Kuczerawy’s research in this paper benefited from funding from FWO grant nr. 1214321N, as well as FWO funding in the ALGEPI project G098223N and research visits at the Cyber Policy Center at Stanford University and Católica Global School of Law in Lisbon. João Pedro Quintais’s research in this paper is part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the NWO - Dutch Research Council (grant number: VI.Veni.201R.036). The authors wish to thank Stefanie Boss for research assistance on this paper as well as Lidia Dutkiewicz, Daphne Keller, Hannah Ruschemeier, Graham Smith, Maria Luisa Stasi, Max van Drunen, Folkert Wilman and Julia Zöchling for their helpful comments.

Funding statement

Open access funding provided by University of Amsterdam.

Competing interests

The authors declare none.

References

1 See Council of Europe Committee of Experts on Online Safety and Empowerment of Content Creators and Users, Draft Recommendation CM/Rec(20XX)XX of the Committee of Ministers to Member States on Online Safety and Empowerment of Content Creators and Users (MSI-eSEC(2025)03rev.2, 4 June 2025) <https://rm.coe.int/public-msi-esec-2025-03rev2-draft-cm-recommendation-on-online-safety-a/1680b67ada> accessed 13 October 2025, defining ‘Platforms of significant influence’ as those platforms that due to their size, market share, or impact, play a substantial role in shaping the information environment globally or in a particular territory and thereby materially affect the enjoyment and exercise of freedom of expression and information and other human rights, and the functioning of democracy.

2 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) (DSA), OJ L 277.

3 In light of our analytical focus, the paper does not engage with related debates, including the intricate issue of EU competencies in media regulation and their relationship to the regime of Art 18 EMFA. On this topic, see, eg, E Brogi et al, ‘The European Media Freedom Act: Media Freedom, Freedom of Expression and Pluralism’ (European Parliament (LIBE committee) 2023). <https://www.europarl.europa.eu/thinktank/en/document/IPOL_STU(2023)747930> accessed 8 September 2025.

4 European Commission, ‘Tackling Illegal Content Online – Towards an Enhanced Responsibility of Online Platforms’ COM(2017) 555 final, p 2.

5 It should be noted that some of this information or content may be prohibited by platforms’ terms and conditions.

6 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market (e-Commerce Directive), OJ L 178/1.

7 See, eg, European Commission, ‘Tackling Illegal Content Online – Towards an Enhanced Responsibility of Online Platforms’ COM(2017) 555 final; European Commission, ‘Commission Recommendation (EU) 2018/334 of 1 March 2018 on Measures to Effectively Tackle Illegal Content Online’ C(2018) 1177 final.

8 European Commission, ‘EU Code of Conduct on Countering Illegal Hate Speech Online’ <https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en#theeucodeofconduct> accessed 13 October 2025; Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, OJ 2019, L 130/92 (CDSM Directive); Directive 2010/13/EC, as amended by Directive (EU) 2018/1808, OJ 2018, L 303/69 (AVMSD Directive); Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online, OJ 2021, L 172/79 (Terrorist Content Regulation).

9 Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz, Network Enforcement Act, NetzDG), BGBl. I S. 3352 (2017) (Germany); Kommunikationsplattformen-Gesetz (KoPl-G) BGBl. I Nr. 151/2021 (Austria); Loi n° 2020-766 du 24 juin 2020 visant à lutter contre les contenus haineux sur internet (Law No. 2020-766 of June 24, 2020, to Combat Hateful Content on the Internet, Avia Law), JO, June 25, 2020 (France); Online Safety Act 2023, c 36 (UK).

10 The notion of enhanced responsibility for platforms as a policy focus in EU platform regulation is patent, eg, European Commission, ‘Tackling Illegal Content Online – Towards an Enhanced Responsibility of Online Platforms’ COM(2017) 555 final; European Commission, ‘Commission Recommendation (EU) 2018/334 of 1 March 2018 on Measures to Effectively Tackle Illegal Content Online’ C(2018) 1177 final. The move has also been noted in scholarship, eg, in G Frosio and M Husovec, ‘Accountability and Responsibility of Online Intermediaries’ in G Frosio (ed), The Oxford Handbook of Online Intermediary Liability (Oxford University Press 2020) 613.

11 See Art 3(t) DSA for a definition of ‘content moderation’.

12 Frosio and Husovec (n 10). Increased calls for responsibility of platforms based on non-legal considerations have already been well articulated in literature, eg, M Taddeo and L Floridi, ‘New Civic Responsibilities for Online Service Providers’ in M Taddeo and L Floridi (eds), The Responsibilities of Online Service Providers (Springer 2017); Laidlaw, Regulating Speech in Cyberspace (Cambridge University Press 2017).

13 See, for example, J McNamee, ‘The slide from “self-regulation” to corporate censorship’, European Digital Rights Initiative, 2011 <https://www.edri.org/files/EDRI_selfreg_final_20110124.pdf> accessed 12 November 2025; A Kuczerawy, The Code of Conduct on Online Hate Speech: an example of state interference by proxy, 20 July 2016 <https://www.law.kuleuven.be/citip/blog/the-code-of-conduct-on-online-hate-speech-an-example-of-state-interference-by-proxy/> accessed 12 November 2025.

14 See, generally, M Husovec, ‘(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement’ (2021) <https://papers.ssrn.com/abstract=3784149> accessed 12 December 2024. See also D Kaye, The Risks of Internet Regulation How Well-Intentioned Efforts Could Jeopardize Free Speech, Foreign Affairs, 21 March 2024 <https://www.foreignaffairs.com/united-states/risks-internet-regulation> accessed 12 November 2025.

15 See, eg, regarding the prohibition on general monitoring obligations, Art 8 DSA, Art 17(8) CDSMD, and Art 5(8) Terrorist Content Regulation.

16 See, eg, the tiered structure of enforcement obligations in Chapter III of the DSA (including the exclusions for micro and small enterprises in Art 19 and 29), and the special liability rules for small and new platforms in Art 17(6) CDSMD.

17 ML Stasi, ‘Keep It up: Social Media Platforms and the Duty to Carry Content’ (2022) Federalismi.it 184–207.

18 Section 3 below displays a categorisation of ‘must-carry’ proposals and measures.

19 On the concept of propaganda in the online environment, see, eg, C Bjola, ‘Propaganda in the Digital Age’ 3 (2017) Global Affairs 189; AM Guess and BA Lyons, ‘Misinformation, Disinformation, and Online Propaganda’ in J Tucker and N Persily (eds), Social Media and Democracy: The State of the Field, Prospects for Reform (Cambridge University Press 2020) 10.

20 See, eg, E Volokh, The First Amendment and Related Statutes: Problems, Cases and Policy Arguments/by Eugene Volokh, fourth edition (Foundation Press Thomson/West 2011) 496–542.

21 This paper focuses on the European perspective. For US analysis see D Keller, Who Do You Sue? State and Platform Hybrid Power Over Online Speech (Hoover Institution 2019), Aegis Series Paper No. 1902 <https://www.hoover.org/research/who-do-you-sue> accessed 10 December 2024. For the analysis of the recent US Supreme Court judgements on the Texas and Florida laws prohibiting content moderation, see D Keller, Texas, Florida, and the Magic Speech Sorting Hat in the NetChoice Cases, 21 February 2024, Lawfare <https://www.lawfaremedia.org/article/texas-florida-and-the-magic-speech-sorting-hat-in-the-netchoice-cases> accessed 10 December 2024. For Brazil, see Civil Appeal No. 0000412-86.2016.8.24.0175 (Meleiro); Civil Appeal No. 0000447-46.2016.8.24.0175 (Meleiro).

22 For the definition of ‘online platforms’ and ‘content moderation’ see, respectively, Art 3(i) and (t) DSA.

23 Art 1(1) DSA.

24 See Art 3(h) DSA broadly defining ‘illegal content’.

25 We will not discuss systemic risks and mitigation (Art 26 and 27 DSA) as forms of novel ‘must-carry’ proposals. For further reading on those provisions, see Stasi (n 17) as well as SB Micova and D Schnurr, Systemic Risk in Digital Services: Benchmarks for Evaluating the Management of Risks to Electoral Processes (Centre for Regulation in Europe (CERRE) 2024) <https://cerre.eu/publications/systemic-risk-in-digital-services-benchmarks-for-evaluating-the-management-of-risks-to-electoral-processes/> accessed 29 March 2025; MC de Carvalho, ‘It Will Be What We Want It to Be: Sociotechnical and Contested Systemic Risk at the Core of the EU’s Regulation of Platforms’ AI Systems’ 16 (2025) JIPITEC – Journal of Intellectual Property, Information Technology and E-Commerce Law <https://www.jipitec.eu/jipitec/article/view/420> accessed 29 March 2025.

26 On access to justice in Europe, see European Union Agency for Fundamental Rights, Access to Justice in Europe <https://fra.europa.eu/sites/default/files/fra_uploads/1506-FRA-Factsheet_AccesstoJusticeEN.pdf> accessed 12 November 2025.

27 See more in A Kuczerawy, Remedying Overremoval: The Three-Tiered Approach of the DSA, VerfBlog, 2022/11/03 <https://verfassungsblog.de/remedying-overremoval/> accessed 12 December 2024.

28 Recital 59 adds that the possibilities to contest decisions of platforms should not affect the possibility to seek judicial redress. See also Art 21 DSA.

29 For the in-depth analysis of the negotiations surrounding the treatment of journalistic content in relation to content regulation in the DSA, see C Papaevangelou, ‘The non-interference principle’: Debating online platforms’ treatment of editorial content in the European Union’s Digital Services Act’, in European Journal of Communication, vol. 38, 2023, n 5.

30 See Art 3(s) DSA for the definition of ‘recommender system’.

31 See the European Parliament, Committee on the Internal Market and Consumer Protection (IMCO), ‘Explanatory Statement in the Report on the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and Amending Directive 2000/31/EC (COM(2020) 825 – C9-0418/2020 – 2020/0361(COD))’ (May 2021) 133 <https://www.europarl.europa.eu/doceo/document/IMCO-PR-693594_EN.pdf> accessed 12 November 2025.

32 European Parliament, IMCO Committee, ‘Explanatory Statement in the Report on the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (n 31) 135. This obligation would be included in Art 24a, paragraph (6), stating: ‘Online platforms shall ensure that information from trustworthy sources, such as information from public authorities or from scientific sources is displayed as first results following search queries that are related to areas of public interest’. See also the justification to new Art 24a: ‘The obligations introduced should not only target the VLOPs, but online platforms as such. Consumers should be equally protected irrespective of whether it is a VLOP or a smaller platform’.

33 European Parliament, IMCO Committee, ‘Explanatory Statement in the Report on the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (n 31) 135, Amendment 30, Proposal for a regulation, Recital 52 a (new), p 38.

34 European Parliament, IMCO Committee, ‘Explanatory Statement in the Report on the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (n 31) 135, Art 24 a (new), Justification, p 103.

35 For the opinion of the CULT Committee on Amendment 153 (Art 26(2) DSA), see 2020/0361(COD), ‘Opinion of the Committee on Culture and Education for the Committee on the Internal Market and Consumer Protection on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (COM(2020)0825 – C9-0418(2020) – 2020/0361(COD))’, at p 93 <https://www.europarl.europa.eu/doceo/document/CULT-AD-693943_EN.pdf> accessed 11 December 2024.

36 Ibid.

37 Ibid., 4.

38 Ibid., 4.

39 See Section 3.

40 A Kuczerawy, ‘Does Twitter Trump Trump? A European Perspective’, (2021) Verfassungsblog <https://verfassungsblog.de/twitter-trump-trump/> accessed 11 December 2024.

41 See E Douek et al, The Lawfare Podcast: Jonathan Zittrain on the Great Deplatforming, 14 January 2021 <https://www.lawfareblog.com/lawfare-podcast-jonathan-zittrain-great-deplatforming> accessed 12 December 2024.

42 European Parliament, IMCO Committee, ‘Explanatory Statement in the Report on the Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act)’ (n 31) 135, Amendment 110, Proposal for a regulation, Art 20 – paragraph 4 a (new) p 91.

43 Ibid.

44 Arts 18 to 20 EMFA.

45 Compare: Council of the European Union, ‘Final Compromise Text: Regulation of the European Parliament and of the Council Establishing a Common Framework for Media Services in the Internal Market (European Media Freedom Act) and Amending Directive 2010/13/EU, 2022/0277 (COD)’ (Brussels, 19 January 2024) <https://data.consilium.europa.eu/doc/document/ST-5622-2024-INIT/en/pdf> accessed 12 November 2025; with European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council Establishing a Common Framework for Media Services in the Internal Market (European Media Freedom Act) and Amending Directive 2010/13/EU’ COM(2022) 457 final. For criticism see P Collings, C Schmon, Electronic Frontier Foundation, EU Media Freedom Act: A Media Privilege in Content Moderation is a Really Bad Idea, 12 July 2023 <https://www.eff.org/deeplinks/2023/07/eu-media-freedom-act-media-privilege-content-moderation-really-bad-idea> accessed 12 December 2024; and Art 19, European Media Freedom Act: Content of media service providers on very large online platforms (Art 17) <https://www.article19.org/wp-content/uploads/2023/03/EMFA-17.pdf> accessed 12 December 2024.

46 For a critical analysis of the provision, see C Papaevangelou and M van Drunen, Institutionalising Platform Dependence: The Paradox of the European Media Freedom Act (pre-print available <https://doi.org/10.31235/osf.io/rbwpx_v1> accessed 12 November 2025). This section is based on L Dutkiewicz, A Kuczerawy, Rights and duties of (news) media services providers to ensure independence and transparency, in: IRIS Report on the news media sector, European Audiovisual Observatory, Strasbourg, forthcoming.

47 The Commission will issue guidelines to facilitate the effective implementation of the functionality (Art 18(9) EMFA). See infra in this section.

48 In the initial EC proposal, the media privilege provision referred only to suspending the provision of the services ‘in relation to’ content provided by a media service provider. The wording was inconsistent with the DSA, which distinguishes between restrictions on content accessibility and visibility (ie, removal, blocking of access, and downranking) and suspension of services or accounts. This inconsistency was corrected in the final version of the act, which now refers to decisions suspending the provision of online services or restricting the visibility of the content. Even though the terms ‘suspension’ and ‘restriction’ are not defined, they would likely cover measures such as removal and delisting (for suspension) and demotion (for restriction).

49 Art 18(4) EMFA.

50 In line with Art 36 DSA (Crisis response mechanism).

51 See more in MZ Van Drunen et al, ‘What can a media privilege look like? Unpacking three versions in the EMFA’ (2024) Journal of Media Law <https://doi.org/10.1080/17577632.2023.2299097> accessed 12 November 2025.

52 Art 18(4) EMFA specifically lists VLOPs’ obligations pursuant to Arts 28, 34, and 35 of Regulation (EU) 2022/2065 and Art 28b of Directive 2010/13/EU, or their obligations relating to illegal content pursuant to Union law.

53 There are, however, concerns that the special protection would allow misleading or hateful-but-not-illegal content from a beneficiary media service provider to remain online, amplifying it further and, as a result, threatening marginalised or vulnerable groups. See SA Allioui, EU Media Freedom Act: the convolutions of the new legislation, EU Law Analysis, 6 June 2024 <https://eulawanalysis.blogspot.com/2024/06/eu-media-freedom-act-convolutions-of.html> accessed 12 December 2024.

54 See discussion below at 5.C. Broadly on the interaction between Art 18 EMFA and the DSA, see K Klafkowska-Waśniowska, ‘Taking Extra Care of the Media?’ [2024] Verfassungsblog <https://verfassungsblog.de/taking-extra-care-of-the-media/> accessed 24 September 2025; M Monti, ‘In Defence of Art 18 – EMFA Observatory’ (Centre for Media Pluralism and Media Freedom, 1 November 2024) <https://cmpf.eui.eu/in-defence-of-article-18-of-the-emfa/> accessed 24 September 2025.

55 Art 18(6) EMFA.

56 If no amicable solution is found, the media service provider may use the mediation mechanism of Art 12 of Regulation (EU) 2019/1150 or the out-of-court dispute settlement of Art 21 DSA.

57 European Commission, ‘Commission Seeks Feedback on Protecting Media Service Providers on Online Platforms’ (Press Release, 23 June 2025) <https://digital-strategy.ec.europa.eu/en/news/commission-seeks-feedback-protecting-media-service-providers-online-platforms> accessed 12 November 2025.

58 J Barata, ‘Protecting Media Content on Social Media Platforms’, (2022) Verfassungsblog <https://verfassungsblog.de/emfa-dsa/> accessed 12 December 2024.

59 See also T Seipp, RÓ Fathaigh and M van Drunen, ‘Defining the “Media” in Europe: Pitfalls of the Proposed European Media Freedom Act’ 15 (2023) Journal of Media Law 39.

60 Eg Committee of Ministers, ‘Reply to Recommendation on the Protection of Journalists’ Sources’ Doc 12834 (23 January 2012) Recommendation 4 <https://assembly.coe.int/nw/xml/XRef/Xref-XML2HTML-en.asp?fileid=13207&lang=en> accessed 12 December 2024: ‘(…) the definition of media is changing. In certain circumstances, the protection of sources may need to be extended to new media actors’, or Committee of Ministers, ‘Recommendation CM/Rec(2011)7 of the Committee of Ministers to Member States on a New Notion of Media’ (21 September 2011, published 2013) <https://edoc.coe.int/en/media/8019-recommendation-cmrec20117-on-a-new-notion-of-media.html> accessed 13 October 2025.

61 See, eg, Parliamentary Assembly of the Council of Europe (PACE), ‘Recommendation 1950 (2011) on the Protection of Journalistic Sources’ Recommendation 15 <http://assembly.coe.int/nw/xml/XRef/Xref-XML2HTML-EN.asp?fileid=17943&lang=en> accessed 13 October 2025 (explaining that the protection of journalistic sources is a professional privilege based on commitment to confidentiality that ‘does not exist with regard to non-journalists, such as individuals with their own website or web blog’).

62 van Drunen et al (n 51) 7.

63 See Barata (n 58). Barata points out that guidance on the topic has been issued by several European institutions, eg, the Council of Europe or the European Broadcasting Union. The EU has provided indicators to assess independence in the Protocol annexed to the Treaty of Amsterdam as well as the Communication from the Commission on the application of State aid rules to public service broadcasting. Nevertheless, assessing independence is extremely difficult and controversial, especially in some countries. See more in the European Audiovisual Observatory publication: FJ Cabrera Blázquez, M Cappello, J Talavera Milla and S Valais, ‘Governance and independence of public service media’, (2022) IRIS Plus <https://rm.coe.int/iris-plus-2022en1-governance-and-independence-of-public-service-media/1680a59a76> accessed 12 December 2024.

64 Examples of these include legal and political factors, funding, business models, etc.

65 The Media Pluralism Monitor (MPM) of the Centre for Media Pluralism and Media Freedom (CMPF) covers 32 European countries (the EU 27 plus Albania, Montenegro, the Republic of North Macedonia, Serbia and Turkey). The latest implementation of the MPM also features preliminary studies on Bosnia and Herzegovina, Moldova, and Ukraine. See more on the MPM and its methodology at Centre for Media Pluralism and Media Freedom (CMPF), Media Pluralism Monitor <https://cmpf.eui.eu/media-pluralism-monitor/> accessed 12 December 2024.

66 Twelve countries score high-risk in the sub-indicator Political Independence. See Centre for Media Pluralism and Media Freedom, Monitoring Media Pluralism in the Digital Era (Publications Office of the European Union 2024) <https://cadmus.eui.eu/handle/1814/77028> accessed 12 December 2024, 7, 105.

67 Papaevangelou and van Drunen (n 46).

68 The mediation mechanism under Art 12 of Regulation (EU) 2019/1150 or the out-of-court dispute settlement under Art 21 of Regulation (EU) 2022/2065.

69 See more in MZ Van Drunen, N Helberger and R Fahy, ‘The platform-media relationship in the European Media Freedom Act’, (2023) Verfassungsblog <https://verfassungsblog.de/emfa-platforms/> accessed 12 December 2024. NB this leads to some questions about the relation between the EMFA, the DSA and the Platform-to-business Regulation, which are beyond the scope of this paper.

70 Monti (n 54).

71 D Tambini, ‘What Is Journalism? The Paradox of Media Privilege’ 5 (2021) European Human Rights Law Review 523–39.

72 See Section 5C.

73 The Interstate Treaty on Media (Medienstaatsvertrag-MStV) entered into force in November 2020 to amend the previous Interstate Treaty on Broadcasting and Telemedia (Rundfunkstaatsvertrag), which had until then been the main regulatory framework for public-service and commercial broadcasting in Germany <https://www.die-medienanstalten.de/service/rechtsgrundlagen/medienstaatsvertrag/> accessed 12 December 2024. For an English translation, see Interstate Media Treaty, Non-official translation <https://www.die-medienanstalten.de/fileadmin/user_upload/Rechtsgrundlagen/Gesetze_Staatsvertraege/Interstate_Media_Treaty_en.pdf> accessed 12 December 2024. At the time of writing, a draft of the Sixth Amendment Treaty to the Medienstaatsvertrag (MStV) awaits approval by all 16 state parliaments in Germany, with entry into force envisaged for 1 December 2025. The amendment mainly concerns changes to the Interstate Treaty on the Protection of Minors in the Media, while § 94 MStV (non-discrimination), the provision we discuss in this section, remains unaffected. See Rundfunkkommission, ‘Synopse: 6. Änderung des Medienstaatsvertrags’ (15 May 2023) <https://rundfunkkommission.rlp.de/fileadmin/rundfunkkommission/Dokumente/Beschluesse/Anlage_2023-05-15_RFK_TOP_1_6._MAEStV_Synopse.pdf> accessed 12 November 2025.

74 Examples of a positive approach (eg, amplification, prominence-boosting or prioritisation) can also be found in other parts of the world, eg, the UK, Canada and China. See E Mazzoli, ‘A comparative lens on prominence regulation and its implications for media pluralism. A working paper’, The 49th Research Conference on Communication, Information and Internet Policy (2021) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3898474> accessed 12 December 2024. See also AlgorithmWatch, ‘Germany’s New Media Treaty Demands That Platforms Explain Algorithms and Stop Discriminating. Can It Deliver?’ (2020) <https://algorithmwatch.org/en/new-media-treaty-germany/> accessed 12 December 2024.

75 Art 2(16) of the Interstate Treaty on Media (Medienstaatsvertrag-MStV).

76 N Helberger, P Leerssen and M Van Drunen, ‘Germany proposes Europe’s first diversity rules for social media platforms’ (2019) <https://blogs.lse.ac.uk/medialse/2019/05/29/germany-proposes-europes-first-diversity-rules-for-social-media-platforms/> accessed 12 December 2024.

77 These criteria should be published in accordance with Art 93(1) of the Interstate Treaty on Media (Medienstaatsvertrag-MStV).

78 Art 94(3) of the Interstate Treaty on Media (Medienstaatsvertrag-MStV).

79 UK Online Safety Act 2023, 2023 c 50 <https://www.legislation.gov.uk/ukpga/2023/50/enacted> accessed 12 December 2024.

80 The UK Online Safety Act entered into force in 2023. The Act imposes a cross-cutting duty on regulated user-to-user services to have particular regard to the importance of protecting users’ right to freedom of expression when implementing the safety measures and policies prescribed in the Act. The services must also carry out impact assessments of those measures and policies on the right to freedom of expression and the right to privacy. See Section 22 of the Act.

81 See Section 55(2)(g) UK Online Safety Act. There are two primary categories of news publisher content: (1) content generated by UK-regulated broadcasters and (2) content generated by other recognised news publishers. ‘Recognised news publishers’ are defined as publishers which (among other requirements) produce ‘news-related material’ subject to editorial control, operate a complaints procedure, have a registered business address in the UK, and are subject to a standards code (Section 56).

82 Eg, full articles, recordings or links to them, but not a screenshot, a photo or an excerpt. See Section 55(10) UK Online Safety Act.

83 Category 1 services are regulated user-to-user services included in the OFCOM register established under Section 82(2)(a). The UK Online Safety Act applies to ‘search services’ and ‘user-to-user services’; the latter is defined broadly as ‘an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service’ (Section 2(1)).

84 Eg, by giving a warning, suspending or banning a user from a service.

85 Section 17(3) UK Online Safety Act.

86 Section 17(7) UK Online Safety Act. NB the provision on protecting content of democratic importance does not foresee a specific appeal procedure.

87 Department for Digital, Culture, Media, and Sport, Fact sheet on enhanced protections for journalism within the Online Safety Bill, 23 August 2022 <https://www.gov.uk/government/publications/fact-sheet-on-enhanced-protections-for-journalism-within-the-online-safety-bill/fact-sheet-on-enhanced-protections-for-journalism-within-the-online-safety-bill> accessed 12 December 2024.

88 Even if not primarily targeted by them, as this content is already exempted through Section 55(2)(g) UK Online Safety Act.

89 See J Norton, ‘Nadine Dorries: How I Will Preserve Freedom of the Press Online under Toughened-Up Internet Laws’ (The Daily Mail 2022) <https://www.dailymail.co.uk/news/article-10621259/Nadine-Dorries-preserve-freedom-Press-online.html> accessed 13 October 2025.

90 Section 19(10) and (11) UK Online Safety Act (referring to content originating from recognised news publishers or other users, if it is generated for the purposes of journalism and is UK-linked).

91 Section 19(4) and (5) UK Online Safety Act.

92 Ustawa o ochronie wolności słowa w internetowych serwisach społecznościowych, proposal from 15 January 2021 <https://www.gov.pl/web/sprawiedliwosc/zachecamy-do-zapoznania-sie-z-projektem-ustawy-o-ochronie-wolnosci-uzytkownikow-serwisow-spolecznosciowych> accessed 12 December 2024. For a summary in English, see Panoptykon Foundation, ‘Polish law on “protecting the freedoms of social media users” will do exactly the opposite’, (2021), available at <https://edri.org/our-work/polish-law-on-protecting-the-freedoms-of-social-media-users-will-do-exactly-the-opposite/> accessed 12 December 2024.

93 See the official portal of the Ministry of Justice, ‘Ochrona wolności słowa użytkowników serwisów społecznościowych’, (2022), available at <https://www.gov.pl/web/sprawiedliwosc/ochrona-wolnosci-slowa-uzytkownikow-serwisow-spolecznosciowych2> accessed 12 December 2024 (translation by the authors).

94 NB This usually refers to the removal of aggressive anti-LGBTQ+ rhetoric that does not qualify as hate speech under Polish legislation but often qualifies as such under platforms’ T&Cs. See more on the broader legal, political and institutional context in L Dutkiewicz and J Czarnocki, ‘The DSA Proposal and Poland’ (DSA Observatory Blog 2021) <https://dsa-observatory.eu/2021/10/22/the-dsa-proposal-and-poland/> accessed 12 December 2024.

95 Arts 19 and 20 Ustawa o ochronie wolności słowa w internetowych serwisach społecznościowych (Polish Law Proposal).

96 See, for example, Amnesty International, Poland: Verdict in prosecution of women who put up posters of Virgin Mary with rainbow halo expected, 11 January 2022 <https://www.amnesty.org/en/latest/news/2022/01/poland-verdict-in-prosecution-of-women-who-put-up-posters-of-virgin-mary-with-rainbow-halo-expected/> accessed 17 December 2024.

97 Art 22 Ustawa o ochronie wolności słowa w internetowych serwisach społecznościowych (Polish Law Proposal).

98 K Sobczak, ‘Organizacje nie chcą ustawy o ‘wolności słowa w internecie’, (2022) <https://www.prawo.pl/prawo/sprzeciw-wobec-ustawy-o-wolnosci-slowa-w-internecie,513032.html> accessed 12 December 2024.

99 Ombudsman of Poland, ‘Projekt ustawy o ochronie wolności słowa w internecie. Wątpliwości Marcina Wiącka wobec niektórych propozycji MS’, (2021) <https://bip.brpo.gov.pl/pl/content/rpo-ms-uwagi-projekt-wolnosci-slowa-internet> accessed 12 December 2024.

100 Ibid.

101 There is no requirement of political independence for the Council members. See also Panoptykon Foundation (n 92), Sobczak (n 98), and Ombudsman of Poland (n 99).

102 European Commission, ‘Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions: Tackling Online Disinformation – A European Approach’ COM(2018) 236 final, 14.

103 Centre for Media Pluralism and Media Freedom, ‘Monitoring Media Pluralism in the Digital Era – Application of the Media Pluralism Monitor in the European Union Member States and in Candidate Countries in 2023 (Media Pluralism Monitor 2024)’ <https://cmpf.eui.eu/media-pluralism-monitor-2024/> accessed 12 December 2024.

104 See Section 3.

105 Eg, this happened in the case of the removal of the iconic picture of Phan Thị Kim Phúc (‘Vietnam napalm girl’) and of the prehistoric ‘Venus of Willendorf’ figurine. See more in S Levin, JC Wong and L Harding, ‘Facebook backs down from “napalm girl” censorship and reinstates photo’, (2016) The Guardian <https://www.theguardian.com/technology/2016/sep/09/facebook-reinstates-napalm-girl-photo> accessed 20 June 2023; and ‘Facebook apologises for censoring prehistoric Venus statue’, (2018) Phys.org <https://phys.org/news/2018-03-facebook-apologises-censoring-prehistoric-venus.html> accessed 20 June 2023. In 2011, Facebook deactivated the account of a teacher, Frederic Durand, after he posted a picture of a 19th-century painting of a woman’s genitalia (‘L’Origine du Monde’, an 1866 oil painting by Gustave Courbet). The case was resolved in 2018 by dismissal. See more in J Schmid, S Bouderbala, ‘Facebook denies “censoring” 19th-century vagina painting’, (2018) Phys.org <https://phys.org/news/2018-02-facebookdenies-censoring-19th-century-vagina.html> accessed 20 June 2023; and ‘French court throws out Facebook nude Art “censorship” case’, (2018) France 24 <https://www.france24.com/en/20180315-french-court-facebook-nude-art-censorship-courbet> accessed 20 June 2023.

106 District Court of Amsterdam, Café Weltschmerz v YouTube (9 September 2020) ECLI:NL:RBAMS:2020:4435; and District Court Amsterdam, Smart Exit v Facebook (13 October 2020) ECLI:NL:RBAMS:2020:4966.

107 See for a longer analysis and details: B van der Donk, ‘Should Critique on Governmental Policy Regarding Covid-19 Be Tolerated on Online Platforms? An Analysis of Recent Case-Law in the Netherlands’ 13 (2021) Journal of Human Rights Practice 426–32.

108 District Court of Amsterdam, van Haga v Youtube (18 August 2021) ECLI:NL:RBAMS:2021:4308 <http://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBAMS:2021:4308> accessed 12 November 2025. District Court of Amsterdam, Forum v YouTube (15 September 2021) ECLI:NL:RBAMS:2021:5117 <http://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBAMS:2021:5117> accessed 12 November 2025. District Court of Noord-Holland, van Haga v LinkedIn (6 October 2021) ECLI:NL:RBNHO:2021:8539 <http://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBNHO:2021:8539> accessed 12 November 2025; Rotterdam District Court, Engel v Facebook (29 October 2021) ECLI:NL:RBROT:2021:10459 <https://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBROT:2021:10459> accessed 12 November 2025. For a longer analysis and further details, see JP Quintais, N Appelman and RÓ Fathaigh, ‘Using Terms and Conditions to Apply Fundamental Rights to Content Moderation’ 24 (2023) German Law Journal 881.

109 District Court of Amsterdam, Forum voor Democratie v Google (15 September 2021) ECLI:NL:RBAMS:2021:5117, at 4.8.

110 District Court of Amsterdam, Forum voor Democratie v Google (15 September 2021) ECLI:NL:RBAMS:2021:5117, at 4.9.

111 See Court of Rome, CasaPound Italia and Davide Di Stefano v Facebook Ireland (11 December 2019) R.G. 59264/2019 <https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2020/01/sentenzacpifb.pdf> accessed 17 December 2024, and the decision dismissing the appeal: Court of Rome, Facebook Ireland v CasaPound Italia (29 April 2020) <https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2020/02/ordinanza-FaceBook-CasaPound-reclamo.pdf> accessed 17 December 2024; M Manna, ‘Facebook vs CasaPound: The Deactivation of CasaPound’s Page and Account Is Unlawful’ (Martini Manna Blog 2020) <https://www.martinimanna.com/blog/facebook-vs-casapound-the-deactivation-of-casapounds-page-and-account-is-unlawful> accessed 17 December 2024.

112 See more in Columbia University, Facebook v. CasaPound (Global Freedom of Expression) <https://globalfreedomofexpression.columbia.edu/cases/casapound-v-facebook/> accessed 13 October 2025.

113 See, respectively, Arts 49, 18 and 21 of the Constitution of the Italian Republic.

114 In German doctrinal debate, Drittwirkung means ‘third-party effect’. ‘It refers to the possible application of the German Basic Law in cases where both parties are private parties. The “third party” refers to the party outside the classic individual/State relationship who is affected by the constitutional norms’. See A Clapham, ‘The “Drittwirkung” of the Convention’, in RSJ Macdonald, F Matscher and H Petzold (eds), The European System for the Protection of Human Rights (Martinus Nijhoff Publishers 1993) 165.

115 M Kettemann and AS Tiedeke, ‘Back up: Can Users Sue Platforms to Reinstate Deleted Content?’ 9 (2020) Internet Policy Review <https://policyreview.info/articles/analysis/back-can-users-sue-platforms-reinstate-deleted-content> accessed 16 December 2024.

116 OLG München (Higher Regional Court Munich), Judgment of 28 December 2018, 18 W 1955/18, at 19 et seq.; and KG Berlin (Higher Regional Court Berlin), 22 March 2019, 10 W 172/18, at 17.

117 See OLG Stuttgart (Higher Regional Court Stuttgart), Judgment of 6 September 2018, 4 W 63/18, at 73.

118 Kettemann and Tiedeke (n 115) 13.

119 Federal Constitutional Court (Germany), Judgment (22 May 2019) 1 BvQ 42/19, ECLI:DE:BVerfG:2019:qk20190522.1bvq004219 <http://www.bverfg.de/e/qk20190522_1bvq004219.html> accessed 16 December 2024.

120 Kettemann and Tiedeke (n 115) 15.

121 Federal Court of Justice (Germany), Judgment (29 July 2021) III ZR 179/20 <https://juris.bundesgerichtshof.de/cgi-bin/rechtsprechung/document.py?Gericht=bgh&Art=en&nr=121741&pos=0&anz=1> accessed 16 December 2024; and Federal Court of Justice (Germany), Judgment (29 July 2021) III ZR 192/20 <https://juris.bundesgerichtshof.de/cgi-bin/rechtsprechung/document.py?Gericht=bgh&Art=en&sid=47ce4f13cd0917d90e2df8d776db544a&nr=121561&pos=0&anz=1> accessed 20 June 2023.

122 M Kettemann and T Klausa, ‘Regulating Online Speech: Ze German Way’ (Lawfare Blog 2021) <https://www.lawfareblog.com/regulating-online-speech-ze-german-way> accessed 16 December 2024.

123 According to SIN, they ‘don’t know what exactly set off the alarm for Facebook content moderators’, which made it difficult to correct the situation. See Panoptykon Foundation, ‘SIN v Facebook: Tech Giant Sued over Censorship in Landmark Case’ (2019) <https://edri.org/our-work/sin-v-facebook/> accessed 12 December 2024.

124 SIN is a Polish NGO conducting educational activities on the harms of drug use and providing assistance to people who abuse such substances. See SIN, <https://sin.org.pl/> accessed 16 December 2024. The lawsuit is supported by the Panoptykon Foundation, an NGO focusing on the protection of ‘fundamental rights and freedoms in the context of fast-changing technologies and growing surveillance’. See Panoptykon Foundation, What is Panoptykon <https://en.panoptykon.org/about> accessed 16 December 2024; Panoptykon Foundation, SIN vs. Facebook <https://panoptykon.org/sinvsfacebook/en> accessed 16 December 2024.

125 See more in I Lunden, ‘Facebook Is Being Sued by a Polish Drug Prevention Group over Free Speech Violation’ (TechCrunch 2019) <https://techcrunch.com/2019/05/07/facebook-is-being-sued-by-a-polish-drug-prevention-group-over-free-speech-violation/> accessed 13 October 2025.

126 Postanowienie, ‘Sąd Okręgowy w Warszawie IV Wydział Cywilny’ (2019) <https://panoptykon.org/sites/default/files/various-artykuly-w-skrocie/postanowienie_o_zabezpieczeniu_powodztwa_so_warszawa.pdf> accessed 16 December 2024.

127 Postanowienie, ‘Sąd Okręgowy w Warszawie IV Wydział Cywilny, Sygn. IV Cz 97/20 p-I’, (2021) <https://panoptykon.org/sites/default/files/publikacje/postanowienie_z_uzasadnieniem_prawomocne.pdf> accessed 16 December 2024.

128 See Panoptykon Foundation, Win against Facebook. Giant not allowed to censor content at will (EDRi Blog, 3 April 2024) <https://edri.org/our-work/win-against-facebook-giant-not-allowed-to-censor-content-at-will/> accessed 13 October 2025.

129 In Delfi AS v. Estonia, a major news portal was held liable for hateful and violent user comments. The ECtHR found no violation of Art 10, stressing that Delfi, as a professional commercial outlet, bore heightened duties for user content. In MTE and Index.hu v. Hungary, Hungarian courts held an online news portal (Index.hu) and a self-regulatory body of internet content providers (MTE) liable for vulgar user comments about a real-estate company. The ECtHR found a violation of Art 10, noting that the comments, while offensive, were not hate speech or incitement. See Delfi AS v Estonia App no 64569/09 (ECtHR, 16 June 2015); and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v Hungary App no 22947/13 (ECtHR, 2 February 2016).

130 Committee of Ministers, ‘Recommendation CM/Rec(2011)7 of the Committee of Ministers to Member States on a New Notion of Media’ (21 September 2011, published 2013) <https://edoc.coe.int/en/media/8019-recommendation-cmrec20117-on-a-new-notion-of-media.html> accessed 13 October 2025.

131 The Recommendation observed that new media policies should reflect the evolving ecosystem and, in particular, the roles and specific functions played by the actors involved. Media policies should offer a response of appropriate form (differentiated) and level (graduated), according to the part that media services play in content production and dissemination processes. This means that the role they play in the communication process, either as media or as intermediaries (according to the criteria provided in the Recommendation), should be reflected in the policy framework. See more in A Kuczerawy, Intermediary Liability and Freedom of Expression in the EU: From Concepts to Safeguards (Intersentia 2018).

132 X and Y v Netherlands App no 8978/80 (ECtHR, 26 March 1985) para 23 (in the context of Art 8 ECHR); Plattform ‘Ärzte für das Leben’ v Austria App no 10126/82 (ECtHR, 21 June 1988) para 32 (in the context of Art 11 ECHR); Appleby and others v. the United Kingdom App no 44306/98 (ECtHR, 6 May 2003) para 41 (in the context of Art 10 ECHR). See more in: A Kuczerawy, ‘The Power of Positive Thinking: Intermediary Liability and the Effective Enjoyment of the Right to Freedom of Expression’, 8 Journal of Intellectual Property, Information Technology and E-Commerce (2017) 226–37.

133 JF Akandji-Kombe, Positive Obligations under the European Convention on Human Rights: A Guide to the Implementation of the European Convention on Human Rights (Human Rights Handbooks, Council of Europe 2007) 5.

134 Fuentes Bobo v. Spain App no 39293/98 (ECtHR, 29 February 2000); Özgür Gündem v. Turkey App no 23144/93 (ECtHR, 16 March 2000) para 43. See also Van Dijk, Van Hoof, Van Rijn, Zwaak (eds), Theory and practice of the European Convention on Human Rights, o.c. (Martinus Nijhoff Publishers 1998) 784–5.

135 Dink v. Turkey App no 2668/07, 6102/08, 30079/08, 7072/09 and 7124/09 (ECtHR, 14 September 2010). See more in Angelopoulos et al, Study of fundamental rights limitations for online enforcement through self-regulation (Institute for Information Law, 2015), p 38; T McGonagle, Positive obligations concerning freedom of expression: mere potential or real power? (Council of Europe 2015) 11.

136 For instance, in the case of a refusal to broadcast an advertisement by a commercial TV company. See Verein gegen Tierfabriken Schweiz (VgT) v. Switzerland App no 24699/94 (ECtHR, 28 June 2001) paras 45, 48.

137 The ECHR is not an EU instrument. It is addressed to signatory States, members of the Council of Europe.

138 See for example Coty Germany GmbH v Stadtsparkasse Magdeburg (C-580/13, EU:C:2015:485), and UPC Telekabel Wien GmbH v. Constantin Film Verleih GmbH, Wega Filmproduktionsgesellschaft mbH (C-314/12, ECLI:EU:C:2014:192).

139 A similar approach was followed by the CJEU to uphold the validity of Art 17 CDSMD in Poland v Parliament and Council (C-401/19, ECLI:EU:C:2022:297). Still, in the same case, the AG makes explicit reference to positive obligations. See Poland v Parliament and Council (C-401/19, Opinion of AG Saugmandsgaard Øe, ECLI:EU:C:2021:613) footnote 90, referring to Appleby and others v. the United Kingdom App no 44306/98 (ECtHR, 6 May 2003) para 39, and Khurshid Mustafa v. Sweden App no 23883/06 (ECtHR, 16 December 2008) para 31.

140 Commission v Hungary (Transparency of associations) (C-78/18, EU:C:2020:476) para 123.

141 La Quadrature du Net (Joined Cases C-511/18, C-512/18 and C-520/18, ECLI:EU:C:2020:791) para 126.

142 Commission v Hungary (Transparency of associations) (C-78/18, EU:C:2020:476) para 122. See similarly Poland v Parliament and Council (C-401/19, ECLI:EU:C:2022:297) on Art 17 CDSMD.

143 Appleby and others v. the United Kingdom App no 44306/98 (ECtHR, 6 May 2003).

144 T McGonagle, ‘The State and beyond: activating (non-)media voices’, in Sousa et al (eds.), Media Policy and Regulation: Activating Voices, Illuminating Silences, Conference contribution (University of Minho – Communication and Society Research Centre 2013) 187–98, 190.

145 Khurshid Mustafa v. Sweden App no 23883/06 (ECtHR, 16 December 2008) paras 45 and 33.

146 See, eg, the AVMSD provisions on protection of minors from harmful content (Art 6a AVMSD) and on prohibition of commercial communications for cigarettes or other tobacco products, for prescription medication, and for alcoholic beverages aimed specifically at minors (Art 9 AVMSD). Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation, or administrative action in Member States concerning the provision of audiovisual media services, O.J. 2010, L 95/1, as amended by Directive (EU) 2018/1808.

147 See Council of Europe, ‘Recommendation Rec(2004)16 of the Committee of Ministers to Member States on the Right of Reply in the New Media Environment’ <https://search.coe.int/cm?i=09000016805db3b6> accessed 17 December 2024.

148 See Melnychuk v. Ukraine App no 28743/03 (ECtHR, 5 July 2005). See also Kaperzyński v. Poland App no 43206/07 (ECtHR, 3 April 2012); and Marunic v. Croatia App no 51706/11, (ECtHR, 28 March 2017).

149 In 1989, in Ediciones Tiempo SA v Spain, the European Commission of Human Rights clearly stated that the right to reply constitutes an interference with a newspaper’s freedom of expression, but may be justified under Art 10(2) to protect the reputation and rights of others. See Ediciones Tiempo SA v. Spain App no 13010/87 (European Commission of Human Rights, 12 July 1989). See more in F Hempel, ‘The Right of Reply under the European Convention on Human Rights: An Analysis of Eker v Turkey App No 24016/05 (ECtHR, 24 October 2017)’, 10 (2018) Journal of Media Law 17.

150 See Melnychuk v. Ukraine App no 28743/03 (ECtHR, 5 July 2005).

151 See Stasi (n 17). See also M Fertmann and M Kettemann (eds), Can Platforms Cancel Politicians? How States and Platforms Deal with Private Power over Public and Political Actors: an Exploratory Study of 15 Countries (Verlag Hans-Bredow-Institut 2021), Question 3. For US perspective see E Goldman and J Miers, ‘Online Account Terminations/Content Removals and the Benefits of Internet Services Enforcing Their House Rules’ 1 (2021) Journal of Free Speech Law 191–226.

152 Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code (Recast) (Text with EEA relevance), PE/52/2018/REV/1, O.J. 2018, L/321, [hereafter EECC].

153 EC v. Belgium (C-134/10, ECLI:EU:C:2011:117) para 63.

154 EC v. Belgium (C-134/10, ECLI:EU:C:2011:117) para 54.

155 United Pan-Europe Communications Belgium and Others (C-250/06, ECLI:EU:C:2007:783) para 51.

156 Art 114(1) of the EECC.

157 See N van Eijk, B van der Sloot, ‘Must-Carry Regulation: a Must or a Burden?’ (2021) IRIS plus, 7.

158 See T Roukens, ‘What are we Carrying Across the EU these Days? Comments on the Interpretation and Practical Implementation of Art 31 of the Universal Service Directive’ in Nicoltchev, To Have or Not to Have Must-Carry Rules (IRIS Special 2005).

159 The same entity could provide both types of services; based on a functional analysis, it would then be subject to both regimes.

160 See M Monti, The missing piece in the DSA puzzle? Art 18 of the EMFA and the media privilege, in E Brogi (ed), EMFA Under the Spotlight: Towards a Common Regulatory Framework to Foster Media Pluralism? 14 October 2024, Rivista italiana di informatica e diritto, n. 2/2024, DOI 10.32091/RIID0173.

161 For a related discussion from the perspective of media freedom, see L Dutkiewicz and A Kuczerawy, ‘Protecting media content on social media platforms in the EU’, in K Barker and O Jurasz (eds), Handbook of Social Media, Law and Society (Routledge 2025).

162 See above at Section 3.

163 NB also the role of ‘public interest’ as a requirement for other proposals, such as the DSA proposal for recommender systems discussed above at 3.

164 For Facebook/Meta, see N Clegg, ‘Facebook, Elections and Political Speech’ (Meta Newsroom, 24 September 2019) <https://about.fb.com/news/2019/09/elections-and-political-speech/> accessed 13 October 2025. For Twitter/X, see X, ‘World Leaders on Twitter: Principles & Approach’ (X Blog, 15 October 2019) <https://blog.x.com/en_us/topics/company/2019/worldleaders2019> accessed 13 October 2025. NB that at least Meta has explicit policies for ‘newsworthy content’. See Meta, Transparency Center, ‘Our approach to newsworthy content’ (Updated Nov 12, 2024) <https://transparency.meta.com/features/approach-to-newsworthy-content> accessed 17 December 2024.

165 See E Douek, ‘Facebook’s Responses in the Trump Case Are Better than a Kick in the Teeth, but Not Much’ (Lawfare 2021) <https://www.lawfareblog.com/facebooks-responses-trump-case-are-better-kick-teeth-not-much> accessed 17 December 2024.

166 See J Horowitz, ‘Facebook Says Its Rules Apply to All: Company Documents Reveal a Secret Elite That’s Exempt – The Facebook Files’ (The Wall Street Journal 2021) <https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353> accessed 17 December 2024. The leak revealed that: ‘A program known as XCheck has given millions of celebrities, politicians and other high-profile users special treatment, a privilege many abuse’. See also L Dutkiewicz, Media Freedom on VLOPs: DSA’s Regulatory Expectations vs Social Media Platforms’ (Mis)understanding of Public Interest Content (Auteurs & Media, 2025; Vol. 2024; iss. 3; pp. 360–79).

167 See X Independent Audit, Art 37 of the Regulation (EU) 2022/2065 (Digital Services Act), p. 13, para 10 <https://transparency.x.com/content/dam/transparency-twitter/dsa/dsa-audit/TIUC-DSA-Audit-Report-2024-08-27.pdf> accessed 12 December 2024.

168 See more in Kuczerawy (n 40).

169 In extreme cases, they could even start their own social media platform as Donald Trump did with Truth Social.

170 See more on the long history of banning different types of user accounts in J York, Silicon Values: The Future of Free Speech under Surveillance Capitalism (Verso Books 2021). See also Women Press Freedom, Spain: Women Press Freedom Denounces Instagram Suspension of Cristina Fallarás for Exposing Politician’s Gender-Based Violence (2 November 2024) <https://www.womeninjournalism.org/alerts/spain-women-press-freedom-denounces-instagram-suspension-of-journalist-cristina-fallars-for-exposing-politicians-gender-based-violence> accessed 17 December 2024.

171 In Sanchez v. France App no 45581/15 (ECtHR, 15 May 2023), Julien Sanchez, a politician, was convicted for failing to delete anti-Muslim hate speech posted by third parties on his public campaign Facebook page, where the comments remained for six weeks. The ECtHR (Fifth Section, upheld by the Grand Chamber) found no violation of Art 10. The case did not address the question of moderation by a platform, but focused on the increased responsibilities of politicians in relation to speech they amplify.

172 In the words of the Court: ‘Owing to a politician’s particular status and position in society, he or she is more likely to influence voters, or even to incite them, directly or indirectly, to adopt positions and conduct that may prove unlawful, thus explaining why he or she can be expected to be all the more vigilant’. Sanchez v. France (n 171) para 187.

173 Sanchez v. France (n 171) para 148.

174 Arguably, at EU level, an earlier manifestation of this recent wave of provisions aimed at special treatment for news publishers vis-à-vis online intermediaries can be seen in Art 15 CDSMD, which grants a new related right for press publishers applicable to online uses of press publications by information society service providers. On this topic, see, eg, U Furgał, ‘The EU Press Publishers’ Right: Where Do Member States Stand?’ 16 (2021) Journal of Intellectual Property Law & Practice 887–93.

175 A similar idea can be found in the UK Online Safety Act, as examined above at 4.A.

176 For an overview of the criticism of the then proposed Art 17 (now 18) EMFA, see Brogi et al (n 3) 61–3. For criticism of the final version, see, eg, Papaevangelou and van Drunen (n 46).

177 One example in the current scenario would be that of media providers that are critical of Donald Trump on the platform X, which is privately owned by Elon Musk, an outspoken supporter of Trump’s presidential bid in 2024.

178 See EU DisinfoLab, Fact-Checkers and Experts Call on MEPS to Reject a Media Exemption in the DSA (5 November 2021) <https://www.disinfo.eu/advocacy/fact-checkers-and-experts-call-on-meps-to-reject-a-media-exemption-in-the-dsa/> accessed 16 December 2024.

179 It could be argued that the public service media providers in those two countries do not meet the legal requirements specified in Art 5 (Safeguards for the independent functioning of public service media providers) and Art 6 (Duties of media service providers) EMFA.

180 Monti (n 160).

181 This is in addition to the limitation regarding content that is illegal or content subject to Art 28b of Directive 2010/13/EU (AVMS Directive).

182 For a general analysis of how the DSA regulates disinformation, see RÓ Fathaigh, D Buijs and J van Hoboken, ‘The Regulation of Disinformation Under the Digital Services Act’ 13 (2025) Media and Communication <https://www.cogitatiopress.com/mediaandcommunication/article/view/9615> accessed 9 September 2025.

183 Recital 83 DSA making reference to the systemic risk category in Art 34(1)(d).

184 See Recitals 2, 84, 88, 95, 104, 106, and 108 DSA; reference here is made to the systemic risk categories in Art 34(1)(a)–(c). Thus far, the most obvious systemic risk category where disinformation plays a role is Art 34(1)(c) on ‘any actual or foreseeable negative effects on civic discourse and electoral processes, and public security’. See, eg, Directorate-General for Communications Networks, Content and Technology (European Commission), Digital Services Act: Application of the Risk Management Framework to Russian Disinformation Campaigns (Publications Office of the European Union 2023) <https://data.europa.eu/doi/10.2759/764631> accessed 9 September 2025. NB in July 2025 the Code of Practice on Disinformation was officially integrated into the DSA and converted into a Code of Conduct. See European Commission, ‘The Code of Conduct on Disinformation’ (13 February 2025) <https://digital-strategy.ec.europa.eu/en/library/code-conduct-disinformation> accessed 13 October 2025.

185 NB in principle, the EMFA rules are not considered to conflict with the DSA; in case of ambiguity, the DSA should take precedence. M Cole and C Etteldorf, EMFA Background Analysis (2023) <https://www.europarl.europa.eu/RegData/etudes/STUD/2023/733129/IPOL_STU(2023)733129_EN.pdf> accessed 17 December 2024, pp 20 et seq.

186 Theoretically, the pre-notification requirement could still apply in the case of an isolated disinformation incident on a platform where no systemic risks related to disinformation have previously been identified.

187 Monti (n 160).

188 See more in T Blagojev, K Bleyer-Simon, E Brogi, R Carlini, D da Costa Leite Borges, JE Kermer, I Nenadić, M Palmer, PL Parcu, U Reviglio, M Trevisan and S Verza, ‘Monitoring Media Pluralism in the European Union: Results of the MPM2025’ (European University Institute, Robert Schuman Centre for Advanced Studies, Centre for Media Pluralism and Media Freedom 2025) <https://hdl.handle.net/1814/92916> accessed 13 October 2025, p 29. See also P Leerssen, ‘An End to Shadow Banning? Transparency Rights in the Digital Services Act between Content Moderation and Curation’ 48 (2023) Computer Law & Security Review 105790.

189 Committee of Ministers, ‘Recommendation CM/Rec(2011)7 of the Committee of Ministers to Member States on a New Notion of Media’ (21 September 2011, published 2013) <https://edoc.coe.int/en/media/8019-recommendation-cmrec20117-on-a-new-notion-of-media.html> accessed 13 October 2025, 5.

190 Centre for Media Pluralism and Media Freedom (n 66) 14.

191 N Helberger, ‘The Political Power of Platforms: How Current Attempts to Regulate Misinformation Amplify Opinion Power’ 8 (6) (2020) Digital Journalism 842–54.

192 See N Newman, R Fletcher, K Eddy, CT Robertson and RK Nielsen, Reuters Institute Digital News Report 2023, Reuters Institute for the Study of Journalism <https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2023> accessed 16 December 2024. See also M Verstraete and DE Bambauer, ‘Ecosystem of Distrust’, 16 First Amendment Law Review 129 (2017) <https://scholarship.law.unc.edu/falr/vol16/iss2/3/> accessed 16 December 2024.

193 See Allioui (n 53).

194 Potentially, we may see this scenario unfold in the aftermath of the 2024 US presidential elections where we witnessed a special relationship between the elected president and the owner of a major social media platform – even if the special treatment is a de facto, not de jure, situation.

195 See Council of Europe Committee of Experts, Draft Recommendation CM/Rec(20XX)XX (n 1), identifying this as an explicit risk to freedom of expression.

196 For a discussion of increasing censorship under the DSA, see R Griffin, EU Platform Regulation in the Age of Neo-Illiberalism (2024) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4777875> accessed 10 December 2024.

197 See also G Smith, ‘Carved Out or Carved Up? The Draft Online Safety Bill and the Press’ (Cyberleagle Blog, 16 June 2021) <https://www.cyberleagle.com/2021/06/carved-out-or-carved-up-online-safety.html> accessed 16 December 2024.

198 See Council of Europe Committee of Experts, Draft Recommendation CM/Rec(20XX)XX (n 1) 6.

199 See Council of Europe Committee of Experts, Draft Recommendation CM/Rec(20XX)XX (n 1) 9.

200 See Council of Europe Committee of Experts, Draft Recommendation CM/Rec(20XX)XX (n 1) 9.

201 D Keller, ‘Comment to Council of Europe Committee of Ministers on the Draft Recommendation on Online Safety and Empowerment of Content Creators and Users’ <https://papers.ssrn.com/abstract=5384268> accessed 23 September 2025.

202 See, eg, Brogi et al (n 3); TJ Seipp et al, ‘Between the Cracks: Blind Spots in Regulating Media Concentration and Platform Dependence in the EU’ 13 (2024) Internet Policy Review <https://policyreview.info/articles/analysis/regulating-media-concentration-and-platform-dependence> accessed 29 September 2025; J Donovan, ‘First Came the Bots, Then Came the Bosses – We’re Entering Musk and Zuck’s New Era of Disinformation’ The Guardian (11 November 2024) <https://www.theguardian.com/commentisfree/2024/nov/12/elon-musk-mark-zuckerberg-disinformation> accessed 29 September 2025.