Pathways to Justice Toolkit



PATHWAYS TO JUSTICE

AUTHORS:

Alexandra Giannopoulou (Digital Freedom Fund)

Nikita Kekana (Digital Freedom Fund)

Cesar Manso Sayao (Digital Freedom Fund)

EDITOR:

Alexandra Giannopoulou (Digital Freedom Fund)

COPYEDITOR:

Emma Irving

CREATIVE DIRECTOR:

Marea Zanlungo

GRAPHIC DESIGNER:

studio kitschen www.studiokitschen.com

ILLUSTRATOR:

Tessa www.lotsofbroth.com

Developing Information, Guidance, and Interconnectedness for (Charter) Rights Integration in Strategies for Enforcement

Funded by the European Union. Views and opinions expressed are however those of the authors only and do not necessarily reflect those of the European Union or the CERV Programme. Neither the European Union nor the granting authority can be held responsible for them.

PUBLISHED NOVEMBER 2024

This work is published under a CC BY-SA 4.0 license (Creative Commons Attribution-ShareAlike). The full license is available here: https://creativecommons.org/licenses/by-sa/4.0/legalcode Human-readable summary: https://creativecommons.org/licenses/by-sa/4.0/

INTRODUCTION

PART 1

Enforcing the EU Charter of Fundamental Rights in the Court of Justice of the European Union

PART 2

Enforcing EU Charter Law in Member States’ National Courts

PART 3

Using the National Human Rights Institutions, Ombuds Institutions & Equality Bodies

PART 4

Using Collective Redress to Improve Access to Justice

PART 5

Alternative Quasi-Judicial Pathways: DPA & DSA Mechanisms

CONCLUSION

INTRODUCTION

Systemic and transformative change happens when impacted people act and collaborate with others to build power to challenge the root of their oppression. With this toolkit, we collate and present information, strategies, and best practices designed to protect the EU Charter of Fundamental Rights1 (EU Charter/ the Charter) in the digital sphere through strategic litigation.

1 Charter of Fundamental Rights of the European Union [2010] OJ C 326.

With an ever-increasing proportion of interactions between citizens, the public sector, governments, and commercial entities taking place online and via digital technologies, it is crucial that fundamental rights are upheld in the digital space in order to combat digital harms. This digital transformation brings, according to the European Commission, ‘new opportunities to make fundamental rights more effective but also brings challenges.’2 Through this toolkit, we invite the reader to witness the instrumental role of the EU Charter in countering the systemic oppression, harms, and injustices that are encoded in technology. The immense potential of the EU Charter as a means of resistance against technological bias and oppression sits at the core of this toolkit.

The EU Charter, drafted at the turn of the millennium, represents an important contribution to the canon of binding legal instruments that make up the European human rights framework. It exists in addition to and alongside other international and European human rights instruments, including the Universal Declaration of Human Rights (UDHR), the European Convention on Human Rights (ECHR), a range of subject-matter specific instruments, and national constitutions and ‘bills of rights’.

2 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Strategy to strengthen the application of the Charter of Fundamental Rights in the EU’ (2020), p. 2.

The Charter is binding on both EU institutions and Member States when they act within the scope of EU law. It can therefore play a role in filling existing gaps in national and international human rights frameworks and provide an additional layer of protection. With the entry into force of the Treaty of Lisbon, the Charter became a legally binding instrument with the same legal value as the Treaties. As a result, the Charter is becoming the primary avenue for rights-based claims, since it can offer tangible opportunities for individuals to directly enforce fundamental rights before the courts, including in relationships between private parties.

From the practical perspective of procedural rules, the Charter can be used in strategic litigation when a given right is not guaranteed by any other binding and enforceable document (such as the right to good administration). The Charter can also be used to strengthen claims that the rights contained in other human rights documents have been violated (most often those under the ECHR).3

However, it has been shown in practice that ‘references to the Charter are formal, declaratory, even decorative and combined with references to the ECHR, without distinction.’4 As a result, it is clear that the potential of many Charter provisions is in need of further exploration.

The host of rights and freedoms the EU Charter articulates creates the image of a modern (digitally aware) human rights instrument. It is the only binding international legal instrument that distinctly mentions a right to the protection of personal data as clearly distinguished from the right to privacy. As evidenced in case law and in our case studies, Articles 7 (privacy) and 8 (protection of personal data) of the EU Charter are the foundational rights around which most digital rights cases are built. However, there are other Charter rights and freedoms that are, or can be, a solid foundation for strategic digital rights litigation.

3 See also Łukasz Bojarski, Jane A. Hofbauer, and Natalia Mileszyk, ‘The Charter of Fundamental Rights as a Living Instrument: Guidelines for Civil Society’ (2014) <https://gmr.lbg.ac.at/wp-content/uploads/sites/12/2021/09/cfreu_guidelines.pdf> accessed 12 October 2024.

4 Jeremias Adams-Prassl and Michal Bobek, ‘Introduction’ in Michal Bobek and Jeremias Adams-Prassl (eds), The EU Charter of Fundamental Rights in the Member States (Hart Publishing 2022), p. 7.

The Charter’s potential for protecting digital rights goes even further and includes a host of other rights and freedoms that–in the face of the ongoing digitisation and datafication of everyday lives–are likely to grow in importance and utility in the digital sphere. These rights and freedoms, like all human rights, are designed to uphold EU values and the rule of law and ultimately to protect individuals and groups from being subjected to injustice, discriminatory treatment, and exclusion from opportunity. Importantly, such abuses can be increasingly observed in the digital domain in connection with the use of artificial intelligence (AI), access to and (selective) provision of digital goods and services, the establishment and expansion of data-extractive business and revenue models, and the growing reliance on technology-mediated decision-making processes by both public and private entities.

Many of the case studies presented in this toolkit address direct harms and highlight the impact of fundamental rights violations resulting from invasive technosocial systems. Creating a corpus of texts highlighting litigation pathways built on the link between fundamental rights and digital rights is ultimately an attempt to address the legacy of power in the context of digital technologies.

STRATEGIC LITIGATION

To be considered strategic, litigation must have the potential to have an impact beyond the parties directly involved in the case and to bring about legislative, policy, or social change.5 Though a ‘terminological forest’6 surrounds the concept, we understand successful strategic litigation as entailing widespread benefits for both society in general and those involved in a case in particular. The outcome of strategic litigation cases can lead to changes in legislation and government policy, raise public awareness, and foster support for a particular issue.7

5 See Nani Jansen Reventlow, ‘Litigation as an instrument for social change – laying the foundations for DFF’s litigation support’ (Digital Freedom Fund Blog, 29 November 2017) <https://digitalfreedomfund.org/litigation-as-an-instrument-for-social-change-laying-the-foundations-for-dffs-litigation-support/> accessed 12 October 2024.

6 Michael Ramsden and Kris Gledhill, ‘Defining Strategic Litigation’ (2019) 38(4) Civil Justice Quarterly 407.

In a blog post for the European Roma Rights Centre, strategic litigation is compared to Nassim Nicholas Taleb’s Black Swan: ‘[f]irst it is an outlier, as it lies outside the realm of regular expectations, because nothing in the past can point to its possibility. Second, it carries an extreme impact (…) Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable.’

7 Equinet, ‘Strategic Litigation Handbook’ (2017) <https://equineteurope.org/wp-content/uploads/2018/05/equinet-handbook_strategic-litigation_def_web-1.pdf> accessed 30 September 2024. See also Public Law Project, ‘Guide to Strategic Litigation’ (2014) <https://publiclawproject.org.uk/content/uploads/data/resources/153/40108-Guide-to-Strategic-Litigation-linked-final_1_8_2016.pdf> accessed 30 September 2024.

STRUCTURE OF THE TOOLKIT

The Pathways to Justice Toolkit aims to provide civil society organisations (CSOs), litigators, and activists with concrete advice and guidance on the potential of strategic litigation for protecting digital Charter rights.

When developing a litigation strategy, it is important to assess not only the rights that may be enforced in each context, but also the different procedural opportunities, causes of action, and remedies that are available to raise digital rights issues before the courts.

With this digiRISE Pathways to Justice toolkit, the DFF seeks to build capacity and raise awareness amongst litigators of different legal frameworks that could be used to protect and promote digital rights.

Across digiRISE activities, we have developed a corpus of materials and research and a network of experts in order to systematise the different avenues available for CSOs, activists, and litigators to strategically exploit the Charter’s potential. There are opportunities to explore and leverage different litigation models in order to strengthen strategic digital rights litigation. These include mass, group, collective, and representative litigation, as well as quasi-judicial pathways and out-of-court processes. The objective of digiRISE is to reveal a spectrum of diverse pathways that can be explored to make the EU Charter more effective in the digital realm.

The protection of digital rights as fundamental rights has predominantly taken place on a CJEU level, with the effective use of Article 267 of the Treaty on the Functioning of the European Union8 (TFEU). However, we are aware that there are other avenues for using the EU Charter in national jurisdictions. In this toolkit, we present five different judicial and quasi-judicial pathways at the EU or Member State level. Each case study attempts to reveal the opportunities and challenges of a given pathway. The focus of the toolkit is primarily on digital rights, but many of the principles apply to the use of the EU Charter more generally.

The objectives of the toolkit are:

1. to provide concrete procedural information on the judicial pathways that are available to CSOs, activists, and litigators to invoke the EU Charter;

2. to articulate how strategic litigation before different courts, independent authorities, or other actors can be impactful when combined with other organising strategies; and

8 Article 267 of the Consolidated version of the Treaty on the Functioning of the European Union [2016] OJ C202/47.

3. to provide case studies, takeaways, and reference resources (including a list of strategic litigation guidelines, expert tips, and examples) to assist CSOs, activists, and litigators as they embark upon strategic litigation.

This toolkit presents the following pathways to justice: the CJEU, national courts, National Human Rights Institutions or Ombuds institutions, representative actions, and quasi-judicial pathways such as complaints to Data Protection Authorities and Digital Services Coordinators, as well as other options provided by legal instruments such as the Digital Services Act.

ENFORCING THE EU CHARTER OF FUNDAMENTAL RIGHTS IN THE COURT OF JUSTICE OF THE EUROPEAN UNION

The case law of the CJEU has immense potential which can be leveraged through strategic litigation. In the last decade, this potential has become more salient as the CJEU has adjudicated more regularly on important human rights issues.

This pathway explains how the EU Charter can be used in the CJEU for digital rights strategic litigation cases. We examine the process enshrined in Article 267 of the TFEU in order to provide insights into how organisations and individuals from all Member States can raise strategic fundamental rights questions before the CJEU via a reference from a Member State national court.

USING THE EU CHARTER IN THE CJEU

We recognise that many CSOs frequently formulate their actions before national courts in such a way as to invoke the preliminary ruling mechanism and, in this way, benefit from the impact of a potential CJEU decision. This is the judicial route most CSOs strategise towards when attempting to use the EU Charter. The advantage of this judicial route is the direct, vertical, and expanded impact that a potential CJEU decision might have for the topic at hand. Considering the procedural rules laid out in EU law, combined with the recent adjudication in human rights causes, the constitutionalisation of the CJEU could become one of the most impactful tools available to CSOs and human rights coalitions.

The constitutionalisation of the CJEU is understood as the process that enhances, step by step, the substantive and procedural safeguards that allow (executive) power to be kept in check. Concretely, this means that constitutionalisation strengthens the ‘triad of constitutionalization’—that is, democratic scrutiny by parliaments, judicial review, and fundamental rights protection by courts.9

9 Carolyn Moser and Berthold Rittberger, ‘The CJEU and EU (de-)constitutionalization: Unpacking jurisprudential responses’ (2022) 20(3) International Journal of Constitutional Law 1038.

The transnational effects and impact of CJEU case law are the defining factor in the strategic decision of individuals, CSOs, and coalitions thereof to attempt to reach the European judiciary through the preliminary ruling mechanism provided by Article 267 of the TFEU. This procedure enables national courts to apply to the CJEU for a ruling on the interpretation or validity of an EU legal act.

According to Article 267 of the TFEU, national courts have the power to address a question to the CJEU on the validity (legality) of acts of institutions, bodies, offices, or agencies. The question must relate to the conformity of these acts with the Treaties (including the Charter), general principles of EU law, directly applicable international treaties binding on the EU, or superior acts of secondary EU law (e.g. conformity of a decision with the regulation on which it was based). The procedure is a non-contentious one and serves as an instrument of cooperation between the CJEU and national courts, enabling the CJEU to provide a national court with guidance on the interpretation of EU law. In practice, strategic litigation cases have systematically put forward both specific and innovative legal arguments to be addressed to the CJEU.

The contention that the Charter can act as a useful tool to protect individuals’ digital rights is supported by the CJEU’s recent attitude towards the interpretation and application of even the most long-established human rights to the digital sphere.10 While the CJEU is encouraged, and indeed required, to ensure that its jurisprudence takes into account and aligns with that of the European Court of Human Rights (ECtHR), it has nevertheless developed its own distinctive approach and priorities.11

In practice, this should make the Charter an attractive proposition for all stakeholders seeking to use strategic litigation as a means to protect digital rights. However, CSOs often must go through national courts for an indirect review. There is no guarantee of success for this pathway, as national courts have discretion in whether to refer the case for a preliminary CJEU ruling. In any case, collective actors such as CSOs can only participate in the European case if they were part of the original national proceedings.

10 According to a report by the EU Agency for Fundamental Rights in 2020, the Charter is currently invoked in about 10% of all preliminary ruling procedures, and the number of cases in which the CJEU refers to the Charter has increased from 27 in 2010 to 379 in 2019 (European Fundamental Rights Agency, ‘Report on fundamental rights 2020’, focus section, p. 4; available at https://fra.europa.eu/sites/default/files/fra_uploads/fra-2020-fundamental-rights-report-2020_en.pdf, last accessed on 14 October 2024).

11 This can be seen, for example, in the CJEU’s decisions in cases related to electronic surveillance, where it has arguably taken a more restrictive stance in cases like Digital Rights Ireland, Tele2/Watson and Schrems 1 and 2 than its Strasbourg counterpart in, for example, its decisions in Big Brother Watch and Others v United Kingdom and Centrum för Rättvisa v Sweden. Case citations: Joined Cases C-293/12 and C-594/12, Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others and Kärntner Landesregierung and Others [2014] ECLI:EU:C:2014:238; Joined Cases C-203/15 and C-698/15, Tele2 Sverige [2016] ECLI:EU:C:2016:970; Case C-362/14, Maximillian Schrems v Data Protection Commissioner [2015] ECLI:EU:C:2015:650; Case C-311/18, Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems [2020] ECLI:EU:C:2020:559; Big Brother Watch and Others v United Kingdom, App nos. 58170/13, 62322/14 and 24960/15 (ECtHR 25 May 2021); Centrum för Rättvisa v Sweden, App no. 35252/08 (ECtHR 25 May 2021).

There are numerous potential benefits to pursuing this pathway. The European supranational judicial oversight provided by a CJEU decision can shift policies and set precedents across Member States. In practice, and as will be shown in our case study, Member States can exercise levels of control over the process and thereby affect CSOs’ ability to gain access to the CJEU for fundamental rights cases. Knowledge of these country-specific procedural hurdles in various Member States can lead to better chances of CSOs obtaining a CJEU ruling.

A coalition of CSOs in France, including La Quadrature du Net, the French Data Network, the Fédération des fournisseurs d’accès à Internet associatifs, and Igwan.net, brought a case before the French courts for the annulment of a national regulation that required electronic communication providers and operators to massively and automatically store and process data in order to detect terrorist threats.

The coalition considered that these regulations did not comply with (among others) the ePrivacy Directive.12 The French Council of State stayed the proceedings and referred three questions to the CJEU for a preliminary ruling. The questions concerned the compatibility of the general and indiscriminate data retention obligation under French law with fundamental rights guaranteed by the EU Charter. This action was also joined by Privacy International and the Centre for Democracy and Technology, who intervened in February 2016 before the Council of State in support of the French organisations’ request for the annulment of the regulatory provisions. Annulment was requested of the provisions of, in particular, Decree No. 2006-358 of 24 March 2006, which allowed the indiscriminate retention of personal data, in contravention of applicable EU law. Coordinated legal actions by CSOs were also launched in Belgium and the United Kingdom.

The CJEU ruled the following: first, that EU law applies every time a national government obliges telecommunications providers to process data, including for the purposes of national security; second, it concluded that EU law sets out privacy and data protection safeguards regarding the collection and retention of data by national governments.

Specifically, the CJEU held that, read in the light of Article 4(2) of the Treaty on European Union (TEU) and Articles 7, 8, 11, and 52(1) of the Charter, the Directive must be interpreted as precluding national legislation that requires electronic communications services providers to carry out the general and indiscriminate transmission of traffic and location data to the security and intelligence agencies of a State for the purpose of safeguarding national security.

ANALYSIS

The CJEU issued a decision which spoke to the interaction between secondary law, national transpositions, and the EU fundamental rights framework. This decision–well situated within a thematic focus on the compliance of the Directive in question with the EU Charter–is not alone in articulating how Articles 7 and 8 of the Charter are to be balanced and interpreted in national law transpositions of an EU Directive. Rather, the case in question sits within a long tradition of CJEU case law ensuring respect for and the balancing of fundamental rights in EU lawmaking.

Surveillance is a field where CSOs have been active and largely successful in their strategic litigation before European courts.13 As with every other Charter right, Article 8 is binding on both EU bodies and on Member States when they implement EU law (Article 51(1) EU Charter). In Tele2, the CJEU examined the validity of a national law obliging telecommunications service providers to retain certain telecommunications data for law enforcement purposes. The Court held that EU law covered the retention of the data by service providers as well as access to them by authorities, because, inter alia, the ePrivacy Directive imposed obligations to guarantee the confidentiality of communications and provided for possible restrictions, including for law enforcement purposes.14 The same conclusion was reached in subsequent CJEU case law on the same topic, as our case study evidences.

In the field of data protection more broadly, impactful cases where CSOs have collaborated with individuals or intervened include Schrems II,15 Google,16 Planet49,17 and IAB Europe.18

However, we acknowledge that there is a risk that a CJEU decision could ultimately be unfavourable. For example, on 30 April 2024, the CJEU published its decision in the EncroChat case.19

14 For more on this discussion see Ioannis Kouvakas, ‘Article 8 of the Charter of Fundamental Rights of the EU’ in Digital Freedom Fund, Digital Rights are Charter Rights, (2023) <https://digitalfreedomfund.org/digital-rights-arecharter-rights-essay-series/> accessed 30 September 2024.

15 Case C-311/18, Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems [2020] ECLI:EU:C:2020:559.

16 Case C-507/17, Google LLC v CNIL [2019] ECLI:EU:C:2019:772.

17 Case C-673/17, Planet49 [2019] ECLI:EU:C:2019:801.

18 Case C-604/22, IAB Europe [2024] ECLI:EU:C:2024:214.

12 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) [2002] OJ L 201/37.

13 The first landmark CJEU ruling was delivered in 2014 in the joined cases against the Data Retention Directive, one of which was litigated by the Irish CSO Digital Rights Ireland.

19 For an analysis of the case, see Hugo Partouche and Chloé Barthélémy, ‘Mass hacking and fundamental rights: a missed opportunity for the CJEU?’ (EU Law Analysis, 11 July 2024) <https://eulawanalysis.blogspot.com/2024/07/ mass-hacking-and-fundamental-rights.html> accessed 14 October 2024.

The case concerned European police cooperation operations against organised crime, involving the mass interception of encrypted communications by means of spyware (‘hacking’). This resulted in the collection of millions of messages, leading to thousands of arrests across Member States. The Berlin Regional Court referred questions to the CJEU, asking whether a German European Investigation Order concerning the transmission of data collected by French investigators using hacking techniques was compatible with fundamental rights. The CJEU decision avoided linking this case to existing European case law applying Articles 7 and 8 of the EU Charter to criminal matters. Instead, it prioritised the principle of mutual trust, which guarantees the effectiveness of European judicial cooperation.

Statistically, we see that in practice some Member States make more use of the preliminary ruling mechanism than others. This is due to national processes, including the general tendency of certain courts to refer questions to the CJEU in order to support EU legal integration (since the integration process expands their judicial powers) and the established strategic litigation strategies within national case law. If collective actors want to influence the CJEU, their best bet is to get involved in national legal proceedings from the start, as this will increase their chances of eventually accessing justice through the CJEU. It is possible to challenge a national court for failing to make a reference to the CJEU, but this process is time-consuming, expensive, and unusual. It can also prove counterproductive for CSOs aiming to use these same national courts in the future.

There are many factors to consider when deciding whether to seek a preliminary ruling to enforce the EU Charter. The CJEU’s procedural hurdles prevent CSOs from using this pathway to its fullest potential. Not all organisations have the expertise and working hours to dedicate to the preparation of a case and to the strategic formulation of the questions to be referred under the Article 267 mechanism. While the reference itself does not come with directly associated fees, the national court litigation process and the legal expenses of litigating at the EU level constitute often insurmountable barriers for individuals, CSOs, and coalitions of CSOs to achieve their objectives at the CJEU. This becomes even more evident when one considers other barriers such as the expertise necessary to create, select, and research the strategic litigation case that will be more likely to be adjudicated by the European judiciary. This all points to flagrant access to justice inequalities even within the human rights strategic litigation space.

While there is a long tradition of CJEU cases dealing with privacy (Article 7) and data protection (Article 8), there are fewer opportunities for CSOs to bring non-discrimination cases, especially in the digital space. This is partly due to the fact that fundamental rights that have not been systematically associated with digital rights, such as the ones mentioned above, are less likely to be part of CJEU strategic litigation case law.

Finally, it is important to note that all these challenges particularly affect access to justice for racialised groups, migrants, LGBTQI+ communities, people with disabilities, working class people, women, sex workers, and many other minoritised and historically excluded communities and individuals. These communities and individuals are the first to suffer from the ways in which technology can amplify biases, surveil, classify, discriminate against, exclude, and more. The complicated procedures laid out by EU law, the need to define a novel legal argument to attract the attention of the CJEU judiciary, the necessity to present a case via national courts (with all the variability in trust in the judiciary among different Member States), and the need (at times) to define specific individuals (or not) within a strategic litigation case, all pose challenges that directly affect access to justice for large portions of the population living in EU territories.

TAKEAWAY AND RECOMMENDATIONS

While not guaranteeing access to justice for all, the CJEU constitutes a very efficient pathway to ensure compliance with EU law and the fundamental rights framework and can deliver strategic decisions in digital rights cases.

• The decision of which national court to bring the preliminary question before is strategic.

• Make sure to shape your legal argument in a way that highlights its legal novelty.

• Ensure that you include affected individuals or groups in your strategy. Is going to the CJEU the preferred route for the people affected?

ENFORCING EU CHARTER LAW IN MEMBER STATES’ NATIONAL COURTS

One of the most important pathways for enforcing fundamental rights using the EU Charter is through the national courts of EU Member States (National Courts). This section explains when and how the EU Charter should be utilised by National Courts. It also unpacks the main reasons in favour of utilising this pathway and briefly identifies some of this pathway’s limitations. Like the rest of the toolkit, this section primarily focuses on digital rights, but many of the principles apply to the enforcement of fundamental rights through National Courts more generally.

APPLYING THE EU CHARTER IN NATIONAL COURTS

All Member States, including their national courts,20 are required to take appropriate measures to ensure the fulfilment of obligations that arise from the EU Treaties, including the EU Charter, or the acts of EU institutions.21 National judges must always observe EU law and cannot rely on national laws to override EU Law.22

This means that National Courts cannot declare a law or practice lawful if it overrides fundamental rights contained in the EU Charter.

20 Case 14/83, Von Colson and Kaufmann v Land Nordrhein-Westfalen [1984] ECLI:EU:C:1984:153, para. 26.

21 Arts. 4(3) and 5, Consolidated version of the Treaty on European Union [2016] OJ 202/1.

22 Case 6/64, Flaminio Costa v ENEL [1964] ECLI:EU:C:1964.

The Dutch government introduced system risk indication (SyRI) software to identify ‘fraud’ in its social welfare system.23

The core issue of the case was whether this system violated the right to privacy under European laws, including the right to the protection of personal data under the EU Charter and the General Data Protection Regulation (GDPR) and Article 8 of the ECHR.

The Hague Court decided that it was necessary to strike a fair balance between the interests of the community as a whole and the right of the individuals affected by the legislation to respect for their private life in particular. In this case, the Court held that the possible benefits of using SyRI in fraud detection did not outweigh the right to privacy of the individuals whose data was being processed by the software.

Referring to the EU Charter (and the GDPR), the Court specifically held that it was important to consider the principles of transparency, purpose limitation, and data minimisation. Ultimately, the Court held that the legislation relating to the application of SyRI was not adequately transparent and verifiable.

This case study highlights a number of important things, namely:

- National Courts do use the EU Charter to protect fundamental rights, especially when relying on national laws might not offer adequate protection;

- This pathway is indeed accessible, as the Dutch legal team were able to bring arguments referring to EU and European law in their own national court (The Hague), in their local language, and in the jurisdiction where the harm occurred (i.e. where the privacy-violating software was being used).24

ANALYSIS

There are many reasons why this pathway remains an important one for enforcing the EU Charter, and why it should not be overlooked, particularly as we witness the increasing use of digital systems to conduct many aspects of daily life, such as welfare systems, banking systems, and migration systems.

Violations of fundamental rights that exist within the digital space typically affect a group of persons or a community rather than one single individual. This is because they often relate to the use of a particular digital tool or system that has been rolled out more generally, e.g. border passport scanners.

Although persons can make submissions before the CJEU in their local EU language, national courts will nevertheless still be more accessible as the proceedings will be completely in the local language, take place closer to the applicants’ place of residence, and the applicants will be more familiar with the national court’s proceedings.

23 C-09-550982-HA ZA 18-388, 5 February 2020, The Hague District Court <https://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBDHA:2020:1878> accessed 14 October 2024.

24 Douwe Linders, ‘Landmark ruling in SyRI case: Dutch court bans risk profiling’ (SOLV, 5 February 2020) <https://solv.nl/blog/landslide-victory-in-syricase-dutch-court-bans-risk-profiling/> accessed 14 October 2024.

Another advantage of using this pathway is the fact that, typically, the legal team working on a case will come from that particular Member State. As such, they will be more aware of the court’s procedural rules on admissibility and prescription and of the court etiquette.

Effective strategic litigation includes strategic communications, advocacy, and/or campaigning around the case.

This is a lot easier to do on a national level, particularly in the specific Member State where digital harm occurred, as people are a lot closer to the issue and feel more personally affected by it, even if they are not themselves the applicants before the court. Therefore, if a case takes place before National Courts, it is easier to get affected individuals and communities to attend the court hearings, organise protests outside the courts, and come up with related communications that are likely to gain the attention of local media. This will, in turn, make it more likely, even if the judgment is unfavourable, for the particular case to be impactful and to raise awareness of the fundamental rights violation.

National Courts also typically have better binding enforcement mechanisms in place to ensure that their judgments are adhered to–including the seizing of assets within the jurisdiction– compared to other pathways, especially Ombuds institutions and human rights bodies.

However, although the EU Charter technically applies in National Courts, in practice courts are more familiar with national laws and may feel that they lack the expertise to apply EU law. As a result, National Courts are more likely to invoke national laws in their judgments.25 This can be a missed opportunity, as not referring to fundamental rights can result in lower protection in situations where national provisions are not as expansive as the EU Charter.

Even when fundamental rights are referred to in National Courts, there is a risk that the rights will not be uniformly interpreted across different Member States. Where National Courts have slightly different interpretations of the scope of a fundamental right, this can result in unequal protection (although it is in these situations that National Courts should refer to the CJEU for clarification26). Further training of local judges on EU law could overcome this particular disadvantage.

Finally, these national procedures tend to be long and expensive, particularly as there are usually several instances, with the case ultimately ending up at a Supreme or Constitutional Court. This can make it hard to maintain momentum and enthusiasm about a case.

25 Tobias Nowak and Monika Glavina, ‘National courts as regulatory agencies and the application of EU law’ (2021) 43(6) Journal of European Integration 740.

26 Court of Justice of the European Union (European Union, 2024) <https://european-union.europa.eu/institutions-law-budget/institutions-and-bodies/search-all-eu-institutions-and-bodies/court-justice-european-union-cjeu_en> accessed 14 October 2024.

TAKEAWAY AND RECOMMENDATIONS

Despite some shortcomings, particularly in the universal application of the EU Charter, National Courts remain an important pathway for enforcing fundamental rights in the digital space.

• Local applicants are more familiar with national courts, which can be beneficial for the case.

• National Court cases are easier to align with a broader advocacy and communication strategy related to the cause at hand.

• Enforcement of a National Court judgment is often easier than in other pathways.

• More work needs to be done to ensure that national judges are familiar with applying EU law.

USING THE NATIONAL HUMAN RIGHTS INSTITUTIONS, OMBUDS INSTITUTIONS, & EQUALITY BODIES


National Human Rights Institutions (NHRIs) are independent bodies established by the State through either constitutional or legislative authority in order to promote and protect human rights. They commonly have multiple mandates. NHRIs are essential in bridging the ‘protection gap’ between the rights of individuals and the responsibilities of the state. The Vienna Declaration of 199327 recommended the establishment of NHRIs, following the 1991 UN Paris Principles, which outlined their functioning and status.

The national Ombuds institutions and the EU Ombudsman are responsible for investigating maladministration at the national and EU levels respectively. The national Ombuds institutions are independent authorities with investigating powers charged with receiving and examining complaints against national, regional, and local public authorities in the Member States. The European Ombudsman has this same responsibility with respect to EU institutions. Together, the national Ombuds institutions and EU Ombudsman form the European Network of Ombudsmen.

27 World Conference on Human Rights, ‘Vienna Declaration and Programme of Action’ (1993) <https://www.ohchr.org/sites/default/files/Documents/ProfessionalInterest/vienna.pdf> accessed 3 October 2024.

USING THE EU CHARTER IN NATIONAL HUMAN RIGHTS INSTITUTIONS

The role of NHRIs extends beyond the promotion and protection of human rights at the national level. It also involves the implementation of international standards, which makes NHRIs suitable pathways to address and seek enforcement of fundamental rights, including those under the Charter. The European Network of NHRIs28 supports and connects these institutions, promoting the exchange of best practices and engagement with the EU and other international mechanisms.

28 European Network of National Human Rights Institutions, available at <https://ennhri.org/> accessed 3 October 2024.

In 2022, a Dutch student named Robin Pocornie, supported by the Racism and Technology Centre,29 filed a discrimination claim with the Dutch equality body and national human rights institution, the College voor de Rechten van de Mens (the Institute for Human Rights).

Pocornie argued that the Vrije Universiteit Amsterdam’s (VU) use of proctoring software to invigilate online exams taken at home during the Covid-19 pandemic in 2020 discriminated against her on grounds of race.30

The e-proctoring software, Proctorio, repeatedly failed to recognise Pocornie’s face, which made her participation in exams difficult and imposed undue stress. Pocornie was able to discern that this issue was not experienced by her non-racialised fellow students. Her claim echoed academic research showing that commercial facial recognition systems perform significantly worse when attempting to identify the faces of individuals (and particularly women) with darker skin tones. When these systems are used to control access to resources, services, institutions, or benefits, their subpar performance for certain demographic groups generates unjust disadvantages.

In a preliminary decision, the Institute found that there was sufficient evidence to suggest the software was discriminatory. However, in its final judgment, it focused only on the student’s individual experience, ruling that the VU was not required to demonstrate that no racial discrimination had occurred in the use of the Proctorio software across the board.

29 Racism and Technology Center <https://racismandtechnology.center/> accessed 14 October 2024.

30 For an overview of Robin’s story, see Robin Aisha Pocornie, ‘Error 404: Human Face Not Found’ (28 February 2024) <https://youtu.be/ pVfvYYUkIcY?si=jbAEXbSC8rTc2vAT> accessed 14 October 2024.

The Institute determined that there was no conclusive legal proof of discrimination in the student’s specific case. Nevertheless, the ruling acknowledged the possibility that such software could lead to discrimination in other scenarios.

ANALYSIS

NHRIs and Ombuds institutions provide an accessible and cost-effective pathway, especially for individuals facing discrimination. These institutions often offer free legal assistance and serve as a critical first point of contact, especially for discrimination cases. Their procedures are typically more streamlined, informal, and approachable, particularly for individuals with limited means or those whose circumstances do not permit going to National Courts. The ability of these actors to investigate cases, seize the courts directly, provide legal support, and even engage in strategic litigation, contributes not only to individual cases but also to broader systemic changes.

Traditionally, these independent authorities are tasked with discrimination cases, among other human and fundamental rights violations. Due to the scope of the Charter, as expressed in Article 51(1), the prohibition of discrimination applies to ‘the institutions [and] bodies […] of the Union’ as well as ‘the Member States only when they are implementing Union law’. Hence, the EU has a comprehensive obligation to refrain from any form of discrimination based on all grounds listed in Article 21.31 For instance, this implies that Frontex, an EU agency, cannot utilise border control software that discriminates against individuals based on factors such as skin colour, ethnic origin, or language. The obligation for Member States, however, is more limited: the non-discrimination obligation only applies when there is a ‘direct link’ with EU law.

NHRIs and Ombuds institutions often lack the desired level of effectiveness in securing rights and achieving enforceable outcomes. This is because the remedies they propose are frequently non-binding and dependent on voluntary compliance, which can limit their ability to address complex or systemic legal issues. Furthermore, unlike court rulings, the recommendations or findings from these bodies do not establish legal precedents, thereby diminishing their broader legal impact.

NHRIs and national Ombuds institutions are only competent to deal with complaints relating to the national level. Complaints relating to EU maladministration should be addressed to the EU Ombudsman.

USING THE EU CHARTER IN OMBUDS INSTITUTIONS

The EU Ombudsman is an independent and impartial body that holds the EU administration to account. The Ombudsman investigates complaints about maladministration in EU institutions, bodies, offices, and agencies. The Ombudsman may find maladministration if an institution fails to respect fundamental rights, legal rules or principles, or the principles of good administration. This covers, for example, administrative irregularities, unfairness, discrimination, abuse of power, failure to reply, refusal of information, and unnecessary delay. Any citizen or resident of the EU, business, association, or other body with a registered office in the EU can lodge a complaint before the Ombudsman whether they are directly affected by the maladministration or not.

A coalition of CSOs comprised of Privacy International, Access Now, the Border Violence Monitoring Network, Homo Digitalis, the International Federation for Human Rights (FIDH), and Sea-Watch32 sent a complaint to the EU Ombudsman asking the independent authority to investigate the transfer to third countries of surveillance capabilities (including capacity building or training of third country authorities in surveillance techniques), surveillance equipment, and other forms of support. The EU Ombudsman is the entity responsible for handling the complaint because these activities are carried out by the European Commission, the European Border and Coast Guard Agency (Frontex), the European Union Agency for Law Enforcement Training (CEPOL), and the European External Action Service (EEAS). The complaint suggested that no human rights risk and impact assessments had been carried out prior to the engagement of these bodies with the authorities of third countries. These risk assessments are needed to ensure that any surveillance transfer will not result in serious violations of the right to privacy or facilitate other serious violations of human rights. The coalition asked the EU Ombudsman to find maladministration and to address it, recognising the implications of the failure for millions of people.

31 See Raphaële Xenidis, ‘Article 21: an exploration, or the right to algorithmic non-discrimination’ in Digital Freedom Fund (n 14).

32 ‘Complaint on EU surveillance transfers to third countries’ (Privacy International) <https://privacyinternational.org/legal-action/complaint-eu-surveillance-transfers-third-countries> accessed 14 October 2024.

The EU Ombudsman issued a decision in November 2023 finding that no further inquiries were justified, but it did suggest a series of improvements the EEAS should adopt in order for its human rights due diligence process to constitute an acceptable alternative to a human rights impact assessment. These recommendations focused on making changes to the EEAS human rights due diligence documentation, in particular, the inclusion of ‘data protection provisions’ on information sharing sessions, the assessment of the human rights impact of an activity at ‘every stage’ of said activity, and the inclusion in the due diligence documentation of a description of the risk identification process and the elements involved in a potential risk assessment. The recommendations aimed to enhance transparency, accountability, and human rights considerations. Similarly, the Ombudsman’s decision with respect to Frontex followed the same logic, highlighting that human rights impact assessments ‘should be specifically designed, taking into account the nature of surveillance capabilities being transferred, so as to allow for the potential negative effects on human rights to be identified and to provide for mitigation measures’.

TAKEAWAY AND RECOMMENDATIONS

NHRIs, national Ombuds institutions, and the EU Ombudsman are all independent institutions designed to be access to justice pathways. While it is difficult to assess their efficiency as a whole–due to divergences in powers, capacity, funding, and expertise among the different national authorities in the Member States–all these authorities are part of a wider system set up to obtain redress and to create processes for addressing rights violations. The essence of NHRIs and Ombuds institutions is their neutrality and ability to work towards systems change.

• NHRIs and Ombuds institutions provide an accessible and cost-effective alternative pathway, especially for individuals facing discrimination.

• The recommendations or findings of these bodies do not establish legal precedents. While the broader legal impact is low, these bodies contribute to opening a pathway for legal precedent setting by courts.

• You can combine the NHRIs and Ombuds institutions pathways with appropriate advocacy and communications strategies to maximise the impact of your strategic litigation case.

USING COLLECTIVE REDRESS TO IMPROVE ACCESS TO JUSTICE


The expansion of large-scale digitalisation and datafication processes exposes thousands or even millions of individuals to harmful practices that lead to the violation of their digital rights. As the Commission highlighted, ‘[i]n light of increasing cross-border trade and EU-wide commercial strategies, these infringements increasingly also affect consumers in more than one Member State.’33 Collective redress mechanisms can help to ensure the enforcement of digital rights at scale by enabling CSOs to develop efficient litigation strategies that allow them to represent groups or classes of claimants, both with and without a representative mandate. Such collective actions also create opportunities for cross-disciplinary and cross-jurisdictional collaborations between different CSOs and other institutional stakeholders.

Challenging technologically created, imposed, and amplified systemic injustices requires knowledge and understanding of data systems, social capital, and political agency to claim rights, and resilience and courage to stand up. We need to move away from the notion of individual empowerment (…) to collective agency - which should be understood as a process that brings together different competencies needed to identify and uncover the problem and jointly work towards a solution, through the practice of resistance.34

33 Proposal for a Directive on representative actions for the protection of the collective interests of consumers, and repealing Directive 2009/22/EC [2018] COM/2018/0184 final - 2018/089 (COD), p. 1.

34 Roderic Crooks, Catherine D’Ignazio, Arne Hintz, Fieke Jansen, Juliane Jarke, Anne Kaun, Stine Lomborg, Dan McQuillan, Jonathan A. Obar, Lucy Pei, and Ana Pop Stefanija, ‘People’s Practices in the Face of Data Power’, in Juliane Jarke and Jo Bates (eds), Dialogues in Data Power (Bristol University Press 2024).

Flo Health, Inc. developed and owns the smartphone application called ‘Flo Ovulation and Period Tracker’.

In April 2023, a representative action was brought against Flo Health, Inc. by Ius Omnibus, a Portuguese not-for-profit consumer association, before the Judicial Court of the District of Lisbon. The complaint alleges that Flo Health, Inc. unlawfully shared users’ sensitive personal data.

Between 30 June 2016 and 23 February 2019, Flo Health, Inc. shared sensitive personal data of (Portuguese) users of the application relating to their menstruation, health, and sex life. This data was recorded by users in the application. The company shared this information with companies such as Meta and Google, i.e. companies that regularly engage in data analytics and targeted advertising activities. Personal data was shared without obtaining users’ consent, for commercial purposes, and in violation of the company’s assurance that this data would not be shared with third parties. This data-sharing practice was detected by authorities from the United States of America, which forced Flo Health, Inc. to end the personal data sharing and notify all users.

The subsequent lawsuit, still pending before Portuguese courts, highlights the importance of the fundamental rights of Portuguese consumers, including the right to privacy and data protection, as well as the right to consumer protection. If this claim is successful, Flo Health, Inc. will be ordered to compensate the consumers represented in the action for the damage caused by these illegal practices.

ANALYSIS

While many Member States provide for collective redress mechanisms in their national laws, this is not the case in all States. Consecutive EU initiatives have sought to make it possible ‘for qualified entities designated by the Member States, such as consumer organisations or independent public bodies, to bring representative actions for the protection of the collective interests of consumers with the primary aim of stopping both domestic and cross-border infringements of EU consumer law.’35

This effort culminated in the adoption of Directive (EU) 2020/1828 of the European Parliament and of the Council of 25 November 2020 on representative actions for the protection of the collective interests of consumers and repealing Directive 2009/22/EC36 (Representative Actions Directive), which ‘sets out rules to ensure that a representative action mechanism for the protection of the collective interests of consumers is available in all Member States.’37

The Directive applies to ‘representative actions brought against infringements by traders’38 of provisions included in a long list of EU laws, including in the areas of product liability, consumer contracts, consumer protection and unfair commercial practices, e-commerce, universal service and users’ rights relating to electronic communications networks and services, data protection, distance marketing, consumer credit agreements, and several others.39 Among other things, the Directive includes provisions on the designation of qualified entities,40 the types of representative actions covered,41 the available injunctive42 and redress43 measures, and funding.44 Member States were obliged to transpose the Directive into domestic law by 25 December 2022 and to apply these measures from 25 June 2023.45

35 Commission Report of 25 January 2018 on the implementation of Commission Recommendation 2013/396/EU on common principles for injunctive and compensatory collective redress mechanisms in the Member States concerning violations of rights granted under Union Law [2018] COM(2018) 40 final, p. 2.

36 Directive (EU) 2020/1828 of the European Parliament and of the Council of 25 November 2020 on representative actions for the protection of the collective interests of consumers and repealing Directive 2009/22/EC [2020] OJ L 409.

37 Art. 1(1), Representative Actions Directive (n 36).

38 Art. 2(1), Representative Actions Directive (n 36).

39 Annex I, Representative Actions Directive (n 36).

40 Art. 4, Representative Actions Directive (n 36).

41 Art. 7, Representative Actions Directive (n 36).

42 Art. 8, Representative Actions Directive (n 36).

43 Art. 9, Representative Actions Directive (n 36).

44 Art. 10, Representative Actions Directive (n 36).

45 Art. 24(1), Representative Actions Directive (n 36).

The Directive promises to broaden the possibility of bringing representative actions in all Member States, including actions that challenge violations of digital EU Charter rights. In addition to the mechanisms introduced following the Directive’s implementation throughout the EU, some Member States have pre-existing collective redress mechanisms. The availability of information about the substantive and procedural requirements of different collective redress mechanisms can be key in strategic litigation, for example in developing cross-jurisdictional strategies.

The case study described above represents one of the many objectives that can be pursued with a representative action. The aim of the complaint in the case study is to have a national court recognise the damage to consumers, to have the court order compensation, and, finally, to have information about the violation become public. The case in question was filed in Portugal, a Member State with an opt-out system.46 This means that the consumer organisation acting as a qualified entity is considered to represent all consumers affected by the unlawful practice, without being obliged to prove that consumers have specifically signed up to be represented in the complaint. This system is particularly advantageous for cases concerning Big Tech because of the large number of people that could be affected.

In addition to the case study presented above, there are multiple other representative action cases with the potential to affect a large number of people.

We believe it to be important to highlight in this context the representative action case47 brought by The Daphne Caruana Galizia Foundation against the company C-planet for a data breach related to electoral data in Malta. C-planet created a database of illegally collected personal data containing the personally identifiable information of anyone who had the right to vote in the 2013 elections, including sensitive information such as voting intentions or party leanings. While the case is still pending before the local courts, the process adopted for bringing the case is notable.

The Daphne Caruana Galizia Foundation’s strategy considers both the large number of potential claimants and the procedural and redress opportunities afforded by data protection regulations and the GDPR. For example, the Foundation created a tool that allowed anyone to check whether their data had been leaked by C-Planet by entering their ID card number. This prompted 620 claimants to join the action. Furthermore, the Foundation partnered with noyb and filed another complaint with the national Data Protection Authority.48

Alternatively, there are representative action cases pursued in other EU jurisdictions which do not necessarily rely on the consumer harm or compensation narrative described in the case study above. Take, for instance, the collective action case that was brought in France against discriminatory identity checks by law enforcement. As part of a representative action,49 several associations and NGOs50 brought a case before the Council of State to put an end to the practice of discriminatory identity checks. According to the Council of State decision,51 while the Court recognised that these practices do not constitute isolated instances, it did not go as far as to call them ‘systemic’ or ‘widespread’. The Court decided that this practice constitutes discrimination every time people have been subject to a check on the basis of physical characteristics associated with a real or supposed origin. The Council of State notes, however, that the measures requested by the associations have, in fact, the aim of redefining public policy choices regarding the use of identity checks for the purposes of suppressing crime and preventing public order disturbances. Such measures do not fall within the powers of the administrative judge. After the Council of State dismissed the appeal, the coalition re-oriented the complaint towards the United Nations Committee on the Elimination of Racial Discrimination (CERD).52

48 For this process, please refer to the next pathway and its corresponding analysis.

49 See the representative action website: https://maruemesdroits.org/ accessed 14 October 2024.

50 These CSOs are: Maison Communautaire pour un Développement Solidaire (MCDS), Pazapas, Réseau Egalité, Antidiscrimination, Justice Interdisciplinaire (Reaji), Amnesty International France, and Human Rights Watch. See also: ‘Contrôles au Faciès en France: le combat continue’ (Amnesty International, 11 October 2024) <https://www.amnesty.fr/discriminations/actualites/controles-au-facies-le-conseil-detat-reconnait-lexistence-du-probleme-mais-refuse-de-contraindre-letat-a-y-mettre-un-terme> accessed 14 October 2024.

Finally, the complaint brought by members of the Rohingya community before the Irish courts against Meta seeks to establish Facebook's responsibility for the contribution that its content moderation policies, and the disinformation they amplified, made to the commission of genocide against the Rohingya population in Myanmar.

In 2018, UN human rights investigators said the use of Meta's social media platform Facebook had played a key role in spreading the hate speech that fuelled the violence.

46 For more information on the differences in collective redress systems, please refer to our collective redress database on the DFF digiRISE webpage: https://digitalfreedomfund.org/digirise-2/ accessed 14 October 2024.

47 'Collective action against C-Planet data breach' (The Daphne Caruana Galizia Foundation, 3 April 2020) <https://www.daphne.foundation/en/2020/04/03/collective-action-data-breach> accessed 14 October 2024.

51 N° 454836, 11 October 2023, Council of State <https://www.conseil-etat.fr/actualites/controles-d-identite-discriminatoires-la-determination-d-une-politique-publique-ne-releve-pas-du-juge-administratif> accessed 14 October 2024.

52 See the complaint here: ‘CERD submission April 2024’ <https://www.hrw.org/sites/default/files/media_2024/04/CERD%20submission%20April%202024.pdf> accessed 14 October 2024.

'Facebook's algorithms and Meta's ruthless pursuit of profit created an echo chamber that helped foment hatred of the Rohingya people and contributed to the conditions which forced the ethnic group to flee Myanmar en masse', read a statement by Amnesty International.53 The case, brought collectively by 17 Rohingya refugees, alleges that harm was suffered as a result of Facebook's failure to appropriately moderate content, which contributed to the genocide against the Rohingya group.

In all the cases presented above, there was a high degree of direct involvement in case strategy by the affected communities. Movement lawyering54 is one form of social change lawyering; it aims to invert the relationship dynamics between litigators and represented communities and to move away from more traditional forms of lawyering. One of its biggest aims is to use the law to build the power of communities and to ensure that movements do not become reliant on lawyers alone.

53 'Myanmar: Time for Meta to pay reparations to Rohingya for role in ethnic cleansing' (Amnesty International, 25 August 2023) <https://www.amnesty.org/en/latest/news/2023/08/myanmar-time-for-meta-to-pay-reparations-to-rohingya-for-role-in-ethnic-cleansing/> accessed 14 October 2024.

54 Christine Cimini and Doug Smith ‘An Innovative Approach to Movement Lawyering: An Immigration Rights Case Study’ (2021) 35(2) Georgetown Immigration Law Journal 431.

While there are many challenges in engaging with communities and resistance movements, and in aligning broader structural objectives, representative actions are ideal avenues for applying this form of lawyering and for building the collective power needed to set strategies with transformative objectives for all.

TAKEAWAY AND RECOMMENDATIONS

Representative actions are another type of procedure that can be instituted before national courts. They come in various procedural forms but are subject to a degree of harmonisation thanks to the Representative Actions Directive. There is no one-size-fits-all representative action strategy: it must be adjusted to the harm suffered, the objective pursued by the group of individuals or communities affected, and the chosen forum and jurisdiction in the EU.

• Collective actions create opportunities for cross-disciplinary and cross-jurisdictional collaborations between different CSOs and other institutional stakeholders.

• When making strategic choices about jurisdictions and partners, you should take into consideration the type of collective claim pursued (not all collective claims are necessarily consumer law claims).

• Consider implementing movement lawyering as a driving principle. Representative actions are ideal tools to use in broader collective action efforts, especially for movement strategizing.

ALTERNATIVE QUASI-JUDICIAL PATHWAYS: DPA & DSA MECHANISMS

Besides private enforcement through national courts, both the GDPR and the Digital Services Act (DSA) provide quasi-judicial pathways to enforce their provisions and the fundamental rights that underpin them. Data Protection Authorities (DPAs) are the specialised national authorities primarily responsible for handling complaints and ensuring the enforcement of the GDPR. In contrast, the DSA designates the European Commission as its primary public enforcer. However, the DSA also establishes a multi-layered private enforcement framework of quasi-judicial mechanisms. This includes the option to file complaints with Digital Services Coordinators (DSCs), the national authorities tasked with overseeing the application of the DSA. Additionally, the DSA provides for non-binding out-of-court dispute resolution through specialised bodies certified by DSCs, as well as direct engagement with digital platforms via their internal complaint-handling systems.

ENFORCING EU CHARTER RIGHTS THROUGH DATA PROTECTION AUTHORITIES (DPAs)

DPAs are independent public authorities responsible for overseeing the enforcement of the GDPR and national data protection laws in each Member State. As such, they are a primary pathway to seek the enforcement not only of the right to protection of personal data under Article 8 of the Charter, but of all other interconnected and interdependent fundamental rights, as recognised in Article 1(2) of the GDPR.55

Many provisions throughout the GDPR further reinforce the fundamental rights enshrined in the Charter. For example, the additional protections for special categories of personal data under Article 9 of the GDPR can be instrumental in protecting Charter rights such as freedom of thought, conscience, and religion under Article 10, the right to non-discrimination under Article 21, and the right of collective bargaining and action under Article 28, among others.56

55 This is mirrored in Recital 4, which explicitly recognises that the GDPR observes the freedoms and principles recognised in the Charter, mentioning the respect for private and family life, home and communications (Article 7), freedom of thought, conscience and religion (Article 10), freedom of expression and information (Article 11), freedom to conduct a business (Article 16), the right to an effective remedy and to a fair trial (Article 47), and cultural, religious and linguistic diversity (Article 22).

Scholars have also recognised this interplay between the GDPR and other fundamental rights. The right to data protection is seen as an enabler of other fundamental rights or as a precondition to exercising them,57 and the GDPR itself is described as a ‘multifunctional’ framework for the protection of other fundamental rights.58 The right to protection of personal data under Article 8 of the Charter, in conjunction with the GDPR and national laws regulating personal data, remains the main avenue for seeking enforcement of Charter rights through a DPA.

Article 8(3) of the Charter explicitly establishes that the compliance control carried out by DPAs is an integral part of the right to data protection. This has been further developed by Articles 51 to 59 of the GDPR, which ensure the independence of DPAs and set out their competence, tasks, and powers.


56 Other examples could be: the provisions on data processing for journalistic, academic, artistic or literary purposes in Article 85, which reconcile data protection with the right to freedom of expression and information under Article 11 of the Charter; or provisions such as Article 88, which establish that Member States can provide, either by law or by collective agreements, more specific rules to ensure fundamental rights protections for the processing of employees’ personal data in the context of employment, which aligns with the right of collective bargaining and action under Article 28 of the Charter.

57 'Article 1 GDPR' (GDPR Hub, 14 March 2024) <https://gdprhub.eu/Article_1_GDPR> accessed 14 October 2024.

58 This can be illustrated, for example, by examining how the right to data protection can serve as a tool to counter the effects of online harassment and its impact on human dignity, integrity and freedom of expression under Articles 1, 3, and 11 of the Charter respectively, or automated decision-making and its impact on non-discrimination under Article 21 of the Charter. See Florence D'Ath, 'The General Data Protection Regulation: A Multi-Functional Framework for the Defence of the Rights and Freedoms of Data Subjects in the Digital Sphere' (Dissertation to obtain the degree of Doctor in Law, University of Luxembourg/Maastricht University 2023) <https://orbilu.uni.lu/bitstream/10993/60043/1/Thesis%20DATH%20Florence.pdf> accessed 3 October 2024.

In general, the supervisory authority of each Member State is responsible for overseeing compliance within its own territory, as established in Article 55 of the GDPR. However, the GDPR contains an exception for cross-border data processing in the form of a 'one-stop-shop' mechanism under Article 56. In such cases, the supervisory authority of the Member State where the company's sole or main establishment59 is located becomes the 'lead supervisory authority' (LSA). The LSA is required under Article 60 to cooperate and share information with the other relevant or 'concerned supervisory authorities' in order to reach a consensus on decisions and enforcement actions.

59 The European Data Protection Board has recently issued an opinion on the notion of main establishment of a controller under Article 4(16)(a) GDPR, and on the criteria for the application of the one-stop-shop mechanism. It concluded that a controller’s ‘place of central administration’ can be considered as a main establishment only if it takes the decisions on the purposes and means of the processing of personal data and it has power to have these decisions implemented. It also reiterated that the burden of proof falls on controllers in relation to the place where the relevant processing decisions are taken and where there is the power to implement them. See European Data Protection Board, ‘Opinion 04/2024 on the notion of main establishment of a controller in the Union under Article 4(16)(a) GDPR’ (2024) <https://www.edpb.europa.eu/system/files/2024-02/edpb_opinion_202404_mainestablishment_en.pdf> accessed 14 October 2024.

Under Article 77, data subjects have the right to lodge a complaint with a DPA for infringements of the GDPR in the Member State of their habitual residence, their place of work, or the place of the alleged infringement. The DPA must in turn inform the complainant of both the progress and the outcome of the complaint, as well as the possibility to appeal its decision before a court under Article 78. The availability of recourse to the courts reaffirms the right to an effective remedy and a fair trial under Article 47 of the Charter.

Lastly, Article 79 of the GDPR gives data subjects the right to bring legal action before national courts against data controllers or processors if their rights under the GDPR are infringed, further reaffirming the aforementioned fundamental right to an effective remedy. However, while the option to seek judicial remedies exists, DPAs offer a series of advantages as a pathway to seek the enforcement of data protection and adjacent Charter rights.

After receiving complaints from various data subjects, the Dutch DPA (Autoriteit Persoonsgegevens, AP) launched an investigation into Clearview AI. Clearview AI offers facial recognition services that allow law enforcement to search a database of over 30 billion images of individuals (including EU/EEA citizens) scraped from the internet and social media platforms. The company uses algorithms to convert these images into biometric data by creating vectors for facial recognition.

This case followed a series of coordinated enforcement actions against Clearview AI across several jurisdictions in Europe, as well as campaigning efforts against facial recognition amongst the European digital rights community.

The AP found that Clearview AI had collected the sensitive personal data of Dutch citizens without a legal basis. It issued a fine of EUR 30.5 million and ordered the deletion of the data. The AP's decision centred fundamental rights considerations in its assessment of Clearview AI's reliance on legitimate interest as a legal basis for its data processing activities. In doing so, it applied the three-part test established by CJEU case law (purpose, necessity, and balancing).

The AP determined that Clearview AI's only interest lay in processing personal data in order to conduct a business. The AP stated that 'such freedom does not extend so far as to cover activities that almost fully coincide with infringing the fundamental rights of others' and that such an interest did not constitute a legitimate interest and valid legal basis. The AP also determined that Clearview AI's data processing violated the principle of data minimisation, as it was not limited to what was strictly necessary. The AP considered the vast amount of personal data Clearview AI collects from the internet and the lack of a defined retention period or geographical restrictions. Furthermore, the AP concluded that the indiscriminate mass processing of biometric data constituted a very grave infringement of data subjects' fundamental rights, significantly outweighing any potential interest Clearview AI might have in processing the data for its commercial activities. This finding confirmed the AP's initial assessment and the rejection of legitimate interest as a valid legal basis.

ANALYSIS

DPAs are experts in data protection laws and are well-equipped to adequately interpret, apply, and enforce data protection and privacy rights. They also have dedicated staff trained in data protection laws, which can speed up investigations and decisions compared to general courts, where data protection is just one of a myriad of legal issues handled.

DPAs also tend to have simple processes in place for filing a complaint, typically involving filling out a form or an online submission. Filing a complaint with a DPA is usually free or involves minimal administrative fees, and individuals do not need to be represented by a lawyer. By contrast, court proceedings often involve significant legal fees for filing, legal representation, and court costs. DPAs can provide guidance and mediation between the data subject and the processor, with the aim of resolving disputes that might not require escalation. DPAs also have significant investigatory and sanctioning powers, are able to impose hefty fines, and can order measures to force processors into compliance.

However, there are also some downsides to seeking enforcement of Charter rights by lodging complaints with DPAs. Due precisely to their specialisation in data protection, they might have limited legal purview or expertise to address other fundamental rights under the Charter. Furthermore, many DPAs are underfunded and understaffed, which can lead to delayed investigations and enforcement actions.60

DPAs have expressed concerns that new EU legislation will significantly increase their workload and responsibilities without providing additional resources. They will be required to oversee new large-scale IT systems61 and to carry out new supervisory duties related to AI technologies as the AI Act is progressively implemented.

60 For example, a recent report by the EU Agency for Fundamental Rights found that inadequate resources risk undermining DPAs' mandate and independence: DPAs find themselves overburdened and dealing with increasing numbers of complaints, which often leaves them unable to provide comprehensive oversight or to start their own ex officio investigations. See European Union Agency for Fundamental Rights, 'GDPR in practice – Experiences of data protection authorities' (2024) <https://fra.europa.eu/en/publication/2024/gdpr-experiences-data-protection-authorities> accessed 14 October 2024.

61 For example, the Entry/Exit System for border control: ‘Entry/Exit System’ (European Commission, 9 October 2024) <https://home-affairs.ec.europa.eu/policies/schengen-borders-and-visa/smart-borders/entry-exit-system_en> accessed 14 October 2024.

Furthermore, DPAs across Europe vary significantly in their approaches to enforcement. For example, the French Commission nationale de l’informatique et des libertés (CNIL) is known for taking a proactive and strict stance towards Big Tech companies. This contrasts with Ireland’s Data Protection Commission, which oversees many major Big Tech cases due to the companies’ EU headquarters being in Ireland, and which has been criticised for taking a more cautious approach, particularly in high-profile cases. Another example is Germany’s decentralised system of state-level DPAs, which leads to significant regional differences in enforcement.

In general, the lack of harmonised procedures and the legal uncertainty62 have so far led to inefficient and patchwork enforcement of the GDPR.63 However, the upcoming GDPR procedural regulation could be a step in the right direction towards overcoming these obstacles and enhancing DPAs' role as a pathway to enforce data protection laws and Charter rights more broadly.64

62 Data Protection Law Scholars Network and Access Now, 'The Right to Lodge a Data Protection Complaint: OK, but then what? An empirical study of current practices under the GDPR' (2022) <https://cadmus.eui.eu/bitstream/handle/1814/74899/The_right_to_lodge_a_data_protection_complaint_2022.pdf?sequence=1&isAllowed=y> accessed 14 October 2024.

63 ‘Data Protection Day: Are Europeans really protected?’ (noyb, 27 January 2023) <https://noyb.eu/en/data-protection-day-are-europeans-really-protected> accessed 14 October 2024. See also Irish Council for Civil Liberties, ‘Europe’s enforcement paralysis: ICCL’s 2021 report on the  enforcement capacity of data protection authorities’ (2021) <https://www.iccl.ie/digital-data/2021-gdpr-report/> accessed 14 October 2024.


TAKEAWAY AND RECOMMENDATIONS

Despite the challenges and obstacles to consistently enforcing data protection laws across EU Member States, DPAs frequently integrate fundamental rights considerations into their decisions, serving as a vital and accessible means to uphold these rights in relation to data protection. However, due to the varying approaches and capacities of DPAs in different jurisdictions, careful legal research and strategic planning are necessary to determine their suitability for a given case.

64 'Analysis: GDPR Procedural Regulation enters critical phase' (noyb, 16 July 2024) <https://noyb.eu/en/analysis-gdpr-procedural-regulation-enters-critical-phase> accessed 14 October 2024.

• You can file a complaint with a national DPA free of charge or with minimal administrative fees; there is no need for individuals to be represented by a lawyer.

• DPAs are specialised in data protection, so they might have limited legal purview or expertise to address other fundamental rights under the Charter.


• Consider the potential delays to investigations and enforcement actions in a given jurisdiction, especially since many DPAs are underfunded and understaffed.

ENFORCING THE EU CHARTER OF FUNDAMENTAL RIGHTS THROUGH THE DIGITAL SERVICES ACT (DSA)

Big Tech platforms exert substantial control over online public discourse through their content moderation systems. Inconsistent enforcement driven by arbitrary terms and conditions, vague policies and guidelines, and an overreliance on automation is among the many flaws that have been noted in these systems.65 These flaws often result in the suppression and censorship of important public interest speech,66 while at the same time allowing disinformation, hate speech, and harmful content to spread,67 all of which has a disproportionate effect on marginalised groups.

In this context, the DSA offers a novel multi-layered mix of public and private enforcement mechanisms, amongst which are a series of private enforcement pathways to address freedom of expression and other Charter rights issues.

65 João Pedro Quintais, Naomi Appelman, and Ronan Ó Fathaigh, ‘Using Terms and Conditions to apply Fundamental Rights to Content Moderation’ (2023) 24(5) German Law Journal 881.

66 Among many examples, one that stands out in particular is the suppression of pro-Palestinian voices in the context of the ongoing mass atrocities in the region. See 'Meta's Broken Promises - Systemic Censorship of Palestine Content on Instagram and Facebook' (Human Rights Watch, 21 December 2023) <https://www.hrw.org/report/2023/12/21/metas-broken-promises/systemic-censorship-palestine-content-instagram-and> accessed 14 October 2024.

67 Amongst the most notable cases is Meta’s admittedly substantial role in fuelling the ethnic cleansing suffered by the Rohingya people in Myanmar. See Amnesty International, ‘The Social Atrocity, Meta and the Right to Remedy for the Rohingya’ (2022) <https://www.amnesty.org/en/documents/ASA16/5933/2022/en/> accessed 14 October 2024.

One of the DSA's overarching aims is precisely to set out harmonised rules for a safe, predictable, and trusted online environment in which Charter rights are effectively observed. Moreover, Article 14 imposes a specific obligation on platforms to apply and enforce their terms and conditions with due regard to users' freedom of expression, freedom and pluralism of the media, and other fundamental rights as enshrined in the Charter. The 'notice and action mechanism' for illegal content under Article 16 identifies the conditions for platforms' liability for the content hosted on their sites. Once the platform has taken a decision on a notice, users have a series of options to seek redress against that decision.

One of the key mechanisms introduced by Article 20 of the DSA is the internal complaint-handling system that platforms must provide for complaints against their content moderation decisions. Article 21 of the DSA entitles users to challenge these types of decisions by platforms (including the outcomes of the aforementioned internal complaint-handling procedures) through any of the out-of-court dispute settlement bodies certified by Member States' DSCs. Additionally, this provision states that users are entitled to initiate proceedings before a national court at any stage to challenge a platform's content moderation decisions, regardless of their engagement with these out-of-court dispute settlement mechanisms.

Lastly, Article 53 of the DSA entitles users to lodge complaints with DSCs, which are the national regulatory bodies tasked with overseeing the application of the DSA and ensuring compliance by platforms.

Although the DSA is now fully in effect, many questions remain regarding its enforcement.68 So far, most actions have been led by the Commission, and it remains to be seen how private enforcement and the case law around it will unfold.69 For this reason, we have chosen to propose a hypothetical case study in this section.

68 Gaby Miller, ‘The Digital Services Act Is Fully In Effect, But Many Questions Remain’ (Tech Policy Press, 20 February 2024) <https://www.techpolicy.press/the-digital-services-actin-full-effect-questions-remain/> accessed 14 October 2024.

69 Although some private enforcement litigation decisions are beginning to emerge. See, for example, Paddy Leerssen, ‘The DSA’s first shadow banning case’ (DSA Observatory, 6 August 2024) <https://dsa-observatory.eu/2024/08/06/thedsas-first-shadow-banning-case/> accessed 14 October 2024.

CASE STUDY

ARMED CONFLICT CONTENT REMOVAL

This case study envisions a situation where a video uploaded on Facebook by a user from an EU Member State depicting a military intervention and apparent human rights violations on another continent is removed by the platform.

The removed content contained graphic scenes of military violence deliberately targeting what appear to be civilians. It also contained solidarity and resistance slogans in the local language of the place where the atrocities took place, as well as details of an upcoming protest in the EU Member State, organised to apply political pressure in support of economic sanctions against the oppressive military regime.

Under the DSA, the user could engage with Meta’s internal complaint-handling system, requesting that the platform reinstate the content on the basis that it did not violate Meta’s violent and graphic content policy or its violence and incitement policy.

Furthermore, the user could leverage Article 14(4) of the DSA and argue that the platform must apply its terms and conditions with due regard to their fundamental rights.


In particular, the user could argue that their freedom of expression and information under Article 11 of the EU Charter has been infringed, and that the restriction of their speech would not pass the three-part test established in human rights law. Furthermore, the user could claim an infringement of their freedom of assembly and of association under Article 12 of the Charter, on the basis that they are an activist and the purpose of the content was to organise and promote a political protest.

Additionally, the user could claim that their right to non-discrimination under Article 21 of the Charter has been infringed, since the content did not include any incitement to violence or hateful content. The video did include slogans in a minority language which have been used by many resistance movements in the country in question, and therefore the removal might have been discriminatory based on either algorithmic bias or a content moderation decision that lacked contextual and cultural awareness.

Based on the outcome of the internal complaint-handling system, the user could consider beginning an out-of-court dispute settlement procedure with a specialised body or lodging a complaint with a DSC. Seeking redress in the courts would not be a priority in this type of case, given the time-sensitive nature of the issue and the main focus being the reinstatement of the content.

ANALYSIS

The set of multi-layered pathways provided by the DSA offers many substantial benefits for the protection of fundamental rights in the online context. An evident advantage is how accessible, cost-effective, and expedient some of these pathways can be for users seeking to address potential infringements of their rights by social media platforms.

For example, under Article 20 of the DSA, platforms’ internal complaint-handling procedures must be easily accessible online, user-friendly, and free of charge. Depending on the issue, its urgency, and the type of remedy sought, this direct pathway to engage with the platforms themselves could in many instances be the most efficient way to address the fundamental rights infringement, in particular because it does not preclude leveraging other pathways as well.

It should be noted that research shows that the majority of people who suffer online harms do not take action. This is partly because they may have little confidence in a fair judgement, and partly because they may be pursuing outcomes beyond the content moderation decision itself, such as uncovering or punishing a perpetrator.70

Moreover, the effect on the protection of marginalised groups will largely depend on which organisations receive certification to conduct these out-of-court procedures and on the scope of their expertise.71 Experts have pointed out that the selection of these bodies and the breadth of their expertise may significantly impact access to justice through this particular pathway.72

70 Anna van Duin, Naomi Appelman, Brahim Zarouali, and Max Kosian, ‘Harmful Content and Access to Justice on Online Platforms: An Empirical Study on the Experiences and Needs of Victims’ (2023) Amsterdam Law School Research Paper No. 2023-19 <https://ssrn.com/abstract=4456769> accessed 3 October 2024.

71 Pietro Ortolani, 'If You Build It, They Will Come. The DSA's "Procedure Before Substance" Approach' (Verfassungsblog, 7 November 2022) <https://verfassungsblog.de/dsa-build-it/> accessed 3 October 2024.

Similarly, under Article 21 of the DSA, out-of-court procedures must also be clear, user-friendly, and easily accessible on the platforms' interfaces. This provision also establishes that the bodies carrying out these procedures must be impartial and independent, have the necessary expertise related to the contested issue, and be able to settle disputes swiftly, efficiently, and cost-effectively, with clear and fair procedural rules that are easily and publicly accessible. While these out-of-court settlements are not legally binding, platforms must engage with the procedures in good faith, and they therefore remain a viable option for users to seek redress for Charter rights violations.

72 Ortolani, ibid.

Another aspect of the DSA worth noting is that it does not explicitly create a private right of action or enforceable substantive rights. Complaints before DSCs under Article 53 will therefore most likely relate to either procedural rules or other due diligence obligations imposed by the DSA, which could in turn be linked to violations of individuals' fundamental rights and thus offer a viable pathway to enforce them.73 However, Article 53 of the DSA, read in conjunction with platforms' obligations under Article 14 to apply and enforce their terms and conditions with due regard to users' Charter rights, could be construed as creating a separate private right of action before DSCs. This would give fundamental rights an indirect horizontal effect in the relationship between online platforms and their users.74 Given the absence of detailed guidance on Article 53 and its interaction with other DSA provisions, and the current lack of precedent on how DSCs will interpret and act upon their competence and powers under the DSA, this remains to be seen.75

Another benefit provided by the DSA in terms of redress for fundamental rights infringements is the right that Article 54 confers on users to seek compensation from platforms and digital service providers for any harm or loss incurred as a result of a failure to comply with the DSA’s provisions. This compensation must be in accordance with EU and national laws.76

73 For instance, when the reasons behind a content removal decision are not provided, or information regarding the appeal mechanisms for this decision is not given, it might be possible to make the case that this failure constitutes an unjustified infringement of an individual's freedom of expression, and potentially serves as the basis for a separate cause of action. See Bengi Zeybek, Joris van Hoboken, and Ilaria Buri, 'Redressing Infringements of Individuals' Rights Under the Digital Services Act' (DSA Observatory Analysis, 4 May 2022) <https://dsa-observatory.eu/2022/05/04/redressing-infringements-of-individuals-rights-under-the-digital-services-act/> accessed 3 October 2024.

74 Quintais and others (n 65).

75 Zeybek and others (n 73).

The DSA envisions the option for collective redress under the Representative Actions Directive, which significantly enhances individuals' right to seek redress by reducing the burden of pursuing claims individually when harms have a collective dimension. Furthermore, whether individually or collectively, users are entitled to authorise a non-profit organisation or association to exercise their rights under the DSA on their behalf. This will gain additional importance if the organisations representing users are 'trusted flaggers' under Article 22 of the DSA. In recognition of their expertise, platforms are obliged to handle notices submitted by trusted flaggers more expeditiously, and the privileges trusted flaggers are awarded under the DSA in general should be conducive to increased fundamental rights protections.

76 As set out by Recital 121, compensation should not only align with the applicable national law’s rules and procedures, but it should also apply without precluding other avenues for redress available under consumer protection regulations.

However, the fact that the DSA allows law enforcement agencies, including Europol, as well as profit-seeking industry organisations to apply for this status raises wider concerns around the ‘trusted flagger’ figure itself and how it might facilitate enforcement overreach. Moreover, in the context of Member States with more repressive regimes, the weaponisation of ‘trusted flaggers’ could also pose a serious threat to fundamental rights, especially when it comes to marginalised groups.

TAKEAWAY AND RECOMMENDATIONS

The DSA offers a novel and promising multi-level approach to enforcing EU Charter rights, encompassing both platform-based and external mechanisms to safeguard users' rights. However, while the DSA also leaves the window open for other judicial remedies, research reveals a significant mismatch between existing pathways and effective access to justice for those affected by harmful online content. This raises considerable concerns around digital due process in this context.

• This is a new legal framework and the processes it lays out are mostly untested. Make sure that the people affected are on board with the lack of certainty in the pathways you suggest.

• Depending on the urgency of the issue at hand and the type of remedy sought, the direct pathway of engaging with the platforms themselves could be an efficient way to address a fundamental rights violation because it does not preclude leveraging other pathways as well.

• To maximise the impact of a collective claim, you can combine this pathway with the collective redress mechanisms discussed in the previous pathway, such as representative actions under the Representative Actions Directive.

CONCLUSION

This Pathways to Justice Toolkit was designed to support and guide you as you embark on your litigation strategy building. It shone a light on the complex matrix of decision-making processes in the different pathways that are available for enforcing the EU Charter.

While our choice was to guide you in the judicial and quasi-judicial pathways to justice for better fundamental rights enforcement and protection in the digital realm, we acknowledge that no strategic litigation can operate in a vacuum. The broader strategy is a constant negotiation between the applicants’ objectives, legal realities at hand, timing, costs, risks, and ultimately, luck.

We are convinced that maintaining a dialogue with the applicants affected by the fundamental rights violations–whether they are individuals, communities, organisations, or collectives–is necessary for ensuring that you will follow a pathway to justice that will lead to impactful changes for all.

Justice is not achieved by judicial and quasi-judicial pathways alone. The value of these pathways is only properly revealed when seen in conjunction with broader movement-building strategies, advocacy, communication, and storytelling efforts.

As set out in the DFF's Strategic Litigation Toolkit, we believe that strategic litigation is a process that lasts beyond the duration of the litigation procedures. The impact of a strategic litigation case is decided in a context much larger than that of a courtroom. The preparation of a good legal argument that carefully considers the procedural norms of the jurisdiction and forum at hand is naturally key to a successful decision. However, the impact of a successful decision is determined both before the court application is submitted and after the court decision is published.

With this toolkit, we have presented five different pathways, some of which can be efficiently combined for impact. We hope that these guidelines and recommendations become a helpful tool to contribute to your case-building and strategizing.

We hope that this toolkit will lead to a better understanding of EU Charter pathways, ensuring that fundamental rights are respected in the digital realm, and ultimately contribute to justice for all.


Many of the case studies presented in this toolkit address direct harms and highlight the impact of fundamental rights violations resulting from invasive techno-social systems. Creating a corpus of texts highlighting litigation pathways built on the link between fundamental rights and digital rights is ultimately an attempt to address the legacy of power in the context of digital technologies.
