Why is the DSA an extreme threat to fundamental freedoms?
The DSA requires platforms to remove “illegal content,” which it broadly defines as anything that is not in compliance with EU law or the law of any Member State (Article 3(h)). This risks making the most speech-restrictive Member State’s law the baseline for censorship across the entire EU.
Freedom of speech is the cornerstone of a democratic society and includes the right to voice controversial or unpopular opinions. It ensures open debate, allows for the challenging of authority, and is fundamental to exchanging ideas. However, the DSA poses serious risks to this right.
The DSA is deeply flawed. It is built on the idea that “bad speech” is best countered by censorship rather than robust discussion. Furthermore, the DSA gives the European Commission broad power over how platforms handle speech, undermining the free expression essential to democratic societies.
If a censorship law such as the DSA is held up as the “gold standard,” as the Commission has praised its own creation, authoritarian governments worldwide will readily adopt the model, claiming that Western liberal states endorse it.
Allowing “illegal content” to potentially be determined by one country’s vague and overreaching laws pits the DSA against international law standards that require any restrictions on speech to be precisely defined and necessary. This is extremely problematic given the increasing number of absurd so-called “hate speech” laws potentially criminalizing peaceful speech throughout Europe.
Example 1
Germany’s highly controversial Network Enforcement Act (NetzDG), enacted in 2017, forces digital service providers to enforce sweeping restrictions on certain kinds of online content, linking to provisions of the criminal code, including the broad offence of “insult”.
A person in Germany could see something “insulting” online that they claim is illegal under German law, file a complaint under the DSA, and trigger a takedown of the content across all EU countries, including those where “insult” is not a criminal offence.
Example 2
The DSA forces digital service providers to block specific people or messages, even those originating outside the EU, from reaching European audiences. Suppose a Latin American president says something that a German believes violates German law. Under the DSA, that speech could be blocked (“content moderated”) in every EU country.
How does the DSA censor speech?
The DSA is at the heart of Europe’s censorship industrial complex, consisting of several interwoven regulations and codes that give an unaccountable bureaucracy broad power to censor speech. Censorship occurs through vast “content moderation” networks coupled with a powerful enforcement mechanism to force platforms to comply.
“Content Moderation”
The unelected and largely unaccountable Commission has positioned itself under the DSA to enable sweeping censorship in the name of “public safety” and “democracy”. It does this through a complicated mega-structure that allows the Commission to pull the strings of censorship, making private enterprises complicit and forcing them to comply under the threat of draconian fines.
The DSA creates a censorship industrial complex consisting of an expansive web of outsourced content flaggers, national coordinators, monitoring reporters, and other authorities, with the European Commission at its head. This is a business model that depends on finding content to censor, and it is inconsistent with the standards of the rule of law.
The structure is intentionally difficult for the ordinary citizen to navigate in order to determine what speech is allowed. Because platforms bear the obligation to moderate content, the Commission can hide behind the DSA and claim that it is not itself censoring speech.
The DSA applies directly in all Member States without requiring national implementation. National regulators build on existing legal frameworks to create new structures that apply the DSA alongside domestic laws. In the event of conflict, the DSA overrides national law.
Content is policed by so-called “trusted flaggers,” which include NGOs and private entities and may even include law enforcement agencies like Europol. This deputizes organizations with their own agendas to enforce censorship at scale.
These “flaggers” report content they deem “illegal” to the platform, which must process flagged content as a priority. If the platform deems the content illegal, it must quickly remove it or disable access to it (by geo-blocking it or hiding its visibility).
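To make the mechanics concrete, here is a minimal sketch of that notice-and-action flow modeled in Python. The type names, the priority scheme, and the decision logic are illustrative assumptions for this sketch, not the DSA’s text or any platform’s actual implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    REMOVE = auto()      # take the content down EU-wide
    GEO_BLOCK = auto()   # disable access only from certain jurisdictions
    HIDE = auto()        # suppress visibility rather than delete
    NO_ACTION = auto()

@dataclass
class Notice:
    content_id: str
    invoked_law: str            # e.g. a Member State criminal provision
    from_trusted_flagger: bool  # trusted-flagger notices get priority

def queue_priority(notice: Notice) -> int:
    # Notices from "trusted flaggers" must be processed with priority;
    # 0 sorts ahead of 1 in an ascending-priority review queue.
    return 0 if notice.from_trusted_flagger else 1

def decide(notice: Notice, platform_deems_illegal: bool, act_eu_wide: bool) -> Action:
    # The platform itself judges legality. Over-removal carries no DSA
    # penalty, while under-removal risks large fines, so when in doubt
    # the cheaper choice for the platform is to remove.
    if not platform_deems_illegal:
        return Action.NO_ACTION
    return Action.REMOVE if act_eu_wide else Action.GEO_BLOCK

notice = Notice("post-123", invoked_law="insult (hypothetical)", from_trusted_flagger=True)
print(queue_priority(notice))  # 0: reviewed first
print(decide(notice, platform_deems_illegal=True, act_eu_wide=True))  # Action.REMOVE
```

Note the asymmetry the sketch encodes: nothing in the flow rewards keeping lawful content up, while every branch after a flag points toward removal or restriction.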
Very large online platforms (VLOPs) are also obligated to proactively prevent “illegal content” by conducting regular risk assessments to identify how their services may spread it. Under Article 34 of the DSA, such risks include “negative effects on civic discourse and electoral processes, and public security” and “effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental wellbeing”. Mitigation efforts include adapting their design, terms and conditions, algorithmic systems, advertising, and content moderation, including for “hate speech,” as well as awareness-raising measures.
Enforcement
A powerful enforcement mechanism ensures compliance. Under the threat of enormous financial penalties and suspension, digital service providers are forced to censor and potentially suspend individuals, and individuals may even be criminally prosecuted.
Penalties for users like you:
If, after content is flagged, the platform deems it illegal on its own review, it must remove the content or disable access to it and notify the account holder.
If users persistently post “illegal content,” platforms can suspend their accounts (after issuing a warning, and with an obligation to act proportionately and only for a reasonable period).
Your opinion on life, marriage, family, and abortion can be censored if deemed “illegal content”. To avoid massive fines or penalties, platforms are incentivized to err on the side of caution and remove your post, even if it is perfectly lawful.

Every Member State has a designated Digital Services Coordinator to enforce compliance with the DSA. The Coordinator can seek court orders to rule on the “illegal” nature of content on platforms and then fine and potentially suspend online platforms.

This could happen under one of the many over-broad “hate speech” criminal laws in Europe. If the “hate speech” was subjectively determined to threaten the life or safety of a person or persons, even peaceful speech without a real threat could be prosecuted (e.g. if, in the case of Päivi Räsänen, someone argued that her Twitter Bible post endangered those who identify as LGBT).
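As a sketch of the escalation path a user faces (flag, removal, warning, suspension), consider the toy model below. The numeric thresholds are invented; the DSA itself requires only that suspension of users who “frequently provide manifestly illegal content” follow a prior warning, be proportionate, and last a reasonable period.

```python
from enum import Enum, auto

class UserSanction(Enum):
    NONE = auto()
    WARNING = auto()
    SUSPENSION = auto()

def escalate(removed_posts: int, already_warned: bool,
             warn_after: int = 1, suspend_after: int = 3) -> UserSanction:
    # Thresholds are hypothetical: the DSA leaves "frequently" and
    # "reasonable period" undefined, so each platform sets its own bar.
    if already_warned and removed_posts >= suspend_after:
        return UserSanction.SUSPENSION
    if removed_posts >= warn_after:
        return UserSanction.WARNING
    return UserSanction.NONE

print(escalate(removed_posts=2, already_warned=False))  # UserSanction.WARNING
print(escalate(removed_posts=3, already_warned=True))   # UserSanction.SUSPENSION
```

Because the operative terms are undefined, two platforms can implement wildly different values for the same statutory language.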
What is “hate speech”?
The term “hate speech” is not defined in any international convention. The DSA relies on the EU’s Framework Decision of 28 November 2008, which defines “hate speech” as incitement to violence or hatred against a protected group of persons or a member of such a group. This circular definition of “hate speech” as incitement to hatred is problematic because it fails to specify what “hate” is.
Due to their vague and subjective nature, “hate speech” laws are inconsistently interpreted and arbitrarily enforced, relying more on the subjective perception of hearers than objective harm. The definition of “hate speech” is not harmonized at the EU level, meaning that what is deemed illegal in one country may not be in another.
Penalties for Platforms:
Platforms evaluate content under the threat of crippling fines with every incentive to censor and none to uphold free speech. They face little to no punishment for unjustly banning content and enormous penalties if they refuse to censor.
If a platform refuses to remove or restrict access to “illegal content” after it has been flagged, especially by a “trusted flagger” or regulatory authority, the platform may face serious repercussions.
The Digital Services Coordinators have broad powers to investigate platforms, issue orders, impose fines, and escalate cases to the European Commission. When dealing with VLOPs, the Commission can override the Coordinators at any time, giving it direct control over censorship enforcement. For these platforms, the Commission has the same powers as the Coordinators but lacks the requirement of “independence” to which the Coordinators are subject (Article 50(2)).
The Commission or national regulators can impose fines of up to 6% of a platform’s global annual turnover for non-compliance, which can amount to billions. If non-compliance persists, platforms may face periodic penalty payments. Finally, regulators can restrict access to the platform within the EU or suspend its operations.
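To give a sense of scale, here is a back-of-the-envelope sketch of the two penalty ceilings. The turnover figure is purely illustrative, and the 5% cap on periodic penalties (measured against average daily worldwide turnover) comes from the DSA’s enforcement provisions rather than the text above.

```python
def max_fine(global_annual_turnover_eur: float) -> float:
    # Headline fines are capped at 6% of global annual turnover.
    return 0.06 * global_annual_turnover_eur

def max_daily_penalty(global_annual_turnover_eur: float) -> float:
    # Periodic penalty payments for continued non-compliance are capped
    # at 5% of average daily worldwide turnover, accruing per day.
    return 0.05 * (global_annual_turnover_eur / 365)

turnover = 100e9  # illustrative: EUR 100 billion global annual turnover
print(f"one-off fine cap: EUR {max_fine(turnover):,.0f}")             # 6,000,000,000
print(f"per-day penalty cap: EUR {max_daily_penalty(turnover):,.0f}")  # 13,698,630
```

At that illustrative scale, each month of continued non-compliance adds roughly EUR 400 million on top of the headline fine, which explains why platforms treat a flag as a near-command.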
Enhanced Enforcement
The planned “European Democracy Shield” will strengthen the DSA and impose even stricter regulations on online speech. Its stated aim is to protect the EU from foreign information manipulation and interference, particularly in the digital realm, focusing on the integrity of elections and political processes. Together with the DSA, it can be weaponized to target peaceful expression, further empowering unelected bureaucrats to censor.
The DSA grants emergency powers that allow the European Commission to demand additional censorship measures from online platforms during times of crisis without sufficiently precise definitions or limitations.
What is a crisis in this case?
A crisis is defined as a situation “where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it” (Article 36(2)). “Such crises could result from armed conflicts or acts of terrorism, including emerging conflicts or acts of terrorism, natural disasters such as earthquakes and hurricanes, as well as from pandemics and other serious cross-border threats to public health” (para 91).
The Commission may adopt a decision requiring VLOPs to take certain actions in response to the crisis: 1) assess how their services contribute to a serious threat, 2) apply measures to prevent, eliminate, or limit the threat, and 3) report back to the Commission on those measures.
The potential extraordinary measures it identifies are: “adapting content moderation processes and increasing the resources dedicated to content moderation, adapting terms and conditions, relevant algorithmic systems and advertising systems, further intensifying cooperation with trusted flaggers, taking awareness-raising measures and promoting trusted information and adapting the design of their online interfaces” (para 91).
In a worst-case scenario, the European Commission could crack down on speech at will whenever it decrees a crisis and force platforms to “mitigate risks”. This would prevent citizens from accessing information and sharing views, handing extraordinary power to bureaucrats to control narratives in times of upheaval.
Is there recourse for a censored individual or platform forced to comply with the DSA?
The DSA severely limits the power of national courts to protect citizens’ free speech rights. National courts become the Commission's long arm for censorship. International appeal is possible but costly and onerous.
Appeal Options for Users
A censored individual can try to appeal directly to the platform, use a certified out-of-court dispute resolution mechanism, or complain to the Digital Services Coordinator. While the out-of-court dispute settlement bodies offer a relatively accessible appeal option (a 5-euro fee for the individual to submit a case), their decisions are not binding, and platforms are only required to engage with them in good faith.
If the platform does not comply, the user is left with only more expensive and lengthy judicial recourse. Faced with that reality, many are likely to submit to censorship or preemptively self-censor.
Judicial Recourse
Individuals or platforms can technically challenge censorship in national courts, but those courts are required to comply with Commission decisions. Article 82 states that a “national court shall not take any decision which runs counter to that Commission decision. National courts shall also avoid taking decisions which could conflict with a decision contemplated by the Commission in proceedings.”
Individuals or platforms can take their case to the Court of Justice of the European Union (CJEU), but this is a complex and costly process with strict requirements. A CJEU ruling typically takes one to two years, sometimes longer, and the Court rarely grants interim relief.
Is the DSA a problem only for Europe?
The DSA is a digital gag order with global consequences: it can censor you no matter where you live. Because the DSA applies to “Very Large Online Platforms” and search engines that are accessible within the EU but have a global presence, DSA censorship impacts the entire world.
Extraterritorial Applicability
The DSA explicitly states its extraterritorial applicability as it covers platforms used by people “that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services [the platforms] have their place of establishment” (Article 2(1)).
While the DSA states in Article 9(2)(b) that takedown orders should be “limited to what is strictly necessary to achieve its objective,” grave extraterritorial concerns remain.
De Facto Global Censorship Standards
Platforms may be inclined to align their international content moderation policies with EU censorship rules. If a platform deems something “illegal” under EU rules, that content may be banned everywhere, even in countries with strong free speech protections.
In its letter to European Commissioner Henna Virkkunen, the U.S. House Judiciary Committee wrote: “Though nominally applicable to only EU speech, the DSA, as written, may limit or restrict Americans’ constitutionally protected speech in the United States. Companies that censor an insufficient amount of ‘misleading or deceptive’ speech as defined by EU bureaucrats face fines up to six percent of global revenue, which would amount to billions of dollars for many American companies. Furthermore, because many social media platforms generally maintain one set of content moderation policies that they apply globally, restrictive censorship laws like the DSA may set de facto global censorship standards.”
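The dynamic the Committee describes can be sketched as a toy model comparing a single global rule set with per-jurisdiction enforcement. The rule labels and country codes below are invented for illustration and abbreviated for brevity.

```python
EU_ILLEGAL_LABELS = {"insult_de", "incitement_eu"}  # hypothetical rule labels
EU_COUNTRIES = {"DE", "FR", "FI", "IT", "ES"}       # abbreviated EU set

def visible(post_labels: set, viewer_country: str, single_global_policy: bool) -> bool:
    violates_eu = bool(post_labels & EU_ILLEGAL_LABELS)
    if single_global_policy:
        # One rule set for all users: an EU-driven removal is worldwide.
        return not violates_eu
    # Per-jurisdiction enforcement: only EU viewers lose access (geo-blocking).
    return not (violates_eu and viewer_country in EU_COUNTRIES)

post = {"insult_de"}
print(visible(post, "US", single_global_policy=True))   # False: gone everywhere
print(visible(post, "US", single_global_policy=False))  # True: blocked only in the EU
```

Maintaining one global policy is cheaper and simpler for platforms, which is precisely why the Committee warns that EU rules tend to become the worldwide default.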
Europe in the Dark
Individuals outside of Europe could find themselves censored within Europe. This could happen even to a head of state or an individual with enormous international reach. In the worst case, blocking content from reaching the EU’s roughly 450 million inhabitants has the potential to cut an entire continent out of the conversation, a draconian move with world-changing impact.