This is the culmination of a process that began at least a decade ago.
One of the most important (albeit least reported) developments of 2023 was the launch of the European Union’s Digital Services Act (DSA), which came into full effect in late August and which we covered in the article “The EU’s Mass Censorship Regime Is Almost Fully Operational. Will It Go Global?” The DSA aims to combat (i.e., suppress) mis- and disinformation online, not just in Europe but potentially across the world. It is part of a broader trend of Western governments actively pushing to censor information on the Internet as they gradually lose control over key narrative threads.
Here’s how it works: so-called Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), those with at least 45 million average monthly active users in the EU, are required to censor content hosted on their platforms that is deemed illegal by removing it, blocking it, or providing certain information to the authorities concerned. Platforms are also required to tackle hate speech and dis- or misinformation if it is deemed to have “actual or foreseeable negative effects on civic discourse and electoral processes, and public security” and/or “actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.”
Besides take-downs and outright suspensions, other familiar tools at the disposal of tech platforms include de-monetisation, content demotion, shadow-banning and account visibility filtering. The European Commission has primary, but not exclusive, regulatory responsibility for VLOPs and VLOSEs. The same requirements now also apply to all other online service providers, though responsibility for execution and enforcement lies not with the Commission but with national authorities.