
Criminal Hate Speech Attributable to Online Platforms: A Call for a Thorough Corporate Remedial Responsibilities Framework in Europe

Published online by Cambridge University Press:  08 September 2025

Eva Nave*
Affiliation:
Leiden University , The Netherlands

Abstract

Online platforms have adopted business models that enable the proliferation of hate speech. In some extreme cases, platforms are being investigated for employing algorithms that amplify criminal hate speech such as incitement to genocide. Legislators have developed binding legal frameworks clarifying the human rights due diligence and liability regimes of these platforms to identify and prevent hate speech. Key legal instruments at the European Union level include the Digital Services Act, the proposed Corporate Sustainability Due Diligence Directive and the Artificial Intelligence Act. However, these legal frameworks fail to clarify the remedial responsibilities of online platforms to redress people harmed by criminal hate speech that the platforms caused or contributed to. This article addresses that legal vacuum by proposing a comprehensive remedial responsibilities framework, grounded in the general corporate human rights responsibilities framework, for online platforms that have caused or contributed to criminal hate speech.

Information

Type
Scholarly Article
Creative Commons
Creative Commons License - CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

Figure 1. Corporate remedial responsibilities for adverse human rights impacts.