70th Edition of the Synergy Magazine


SYNERGY MAGAZINE

No. 70 · II - 2021

Magazine of the European Law Students' Association

ARE WE READY FOR THE DIGITAL ERA?

Is Personal Data the New Currency?

Ethical Perspectives on Mandatory Digital Currencies

E-Voting: Opportunity or Threat?

Eirini Vyzirgiannaki

Răzvan-Ștefan Bunciu

Eugenio Ciliberti


Partners' and Externals' Perspective

2 | SYNERGY Magazine


ABOUT ELSA

ELSA International Phone: +32 2 646 26 26 Web: www.elsa.org E-mail: elsa@elsa.org

The Association

The European Law Students' Association, ELSA, is an international, independent, non-political and not-for-profit organisation comprised of and run by and for law students and young lawyers. Founded in 1981 by law students from Austria, Hungary, Poland and West Germany, ELSA is today the world’s largest independent law students’ association.

ELSA Members: 60,000

ELSA Local Groups: 400

ELSA National Groups: 44

Synergy Magazine

Synergy Magazine is ELSA's members' magazine, which is published digitally twice a year and read by law students and young lawyers across the Network. The articles are contributions from students, young and experienced lawyers, as well as academics.

VISION



"A JUST WORLD IN WHICH THERE IS RESPECT FOR HUMAN DIGNITY AND CULTURAL DIVERSITY"

ELSA’s Members

ELSA’s members are internationally minded individuals who have an interest in foreign legal systems and practices. Through our activities, such as seminars, conferences, law schools, moot court competitions, legal writing, legal research and the Student Trainee Exchange Programme, our members acquire a broader cultural understanding and legal expertise.


Our Special Status

ELSA has gained a special status with several international institutions. In 2000, ELSA was granted Participatory Status with the Council of Europe. ELSA has Consultative Status with several United Nations bodies: UN ECOSOC, UNCITRAL, UNESCO and WIPO.

ELSA is present in 44 countries Albania, Armenia, Austria, Azerbaijan, Belarus, Belgium, Bosnia and Herzegovina, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, France, Georgia, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Lithuania, Luxembourg, Malta, Montenegro, the Netherlands, North Macedonia, Norway, Poland, Portugal, Republic of Moldova, Romania, Russia, Serbia, Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, Ukraine and the United Kingdom.

SYNERGY Magazine

Contributions

Editor-in-Chief: Tony Marinescu
Assistant Editor: Fairouz Abu Dahab
Linguistic Editor: Selin Aklan
Design: Tony Marinescu

Would you like to contribute articles or pictures for the Magazine? Please contact ELSA International for further information and guidelines.

Advertising

Would you like to advertise your courses, services, company or products? Please do not hesitate to contact ELSA.
Contact: marketing@elsa.org



EDITORIAL

Tony Marinescu
Vice President in charge of Marketing of the International Board 2021/22

There is an old saying that describes the basic meaning of "synergy": "The whole is greater than the sum of its parts." With this in mind, ELSA has been working tirelessly for the past 40 years towards its vision of a just world. In 1987, the first concept of "ELSA Synergy" was sparked, in the shape of a printed newsletter. Two years later, it was upgraded to the international magazine we have all become accustomed to. I still remember reading my first Synergy Magazine, four years ago, when I was just a freshman in law school. Now I have had the chance and pleasure to be the Editor-in-Chief of this 70th edition, which asks whether we are ready for the digital era.

The past two years have urgently advanced the need for, and importance of, the digital environment; we now depend on the Internet for some of the most mundane tasks. On top of all this, new technologies are emerging which the general public can barely comprehend at first glance. From blockchain to artificial intelligence, these topics merely scratch the surface of an upcoming technology bubble that will have a major impact on our lives. To learn more about these topics, members of our community have prepared articles which are a must-read.

During my past months on the International Board of ELSA, I have realised the importance of a strong community and its history. With this in mind, our Network dedicated time to rethinking the future of Synergy. I am proud to announce that this publication is going truly digital, taking the form of a website. Through this platform we will still be able to receive and promote your articles, while also returning to the initial motive of Synergy, which was to promote the Network's initiatives. This coming spring, I invite all of you to the launch of the platform and to contribute, so we can take the publication to the next level.

Lastly, I would like to thank my team, who helped me create this last edition of the Synergy Magazine: Fairouz, Selin and Tomas. Thank you for all your work so far, and I am looking forward to working with you on the transformation of Synergy! Have a good read and a great 2022!



TABLE OF CONTENTS

HIGHLIGHTS

14  The ‘Digitally Capable’ Lawyer – Preparing the Future Generation
26  The Reality of Crypto-Assets: Tax Issues and Prospects Within the EU
36  E-Voting: Opportunity or Threat?

PARTNERS' AND EXTERNALS' PERSPECTIVE

06  Are We Ready for the Digital Era?
10  Europrivacy, a Digital by Design Certification Scheme for GDPR Compliance
14  The ‘Digitally Capable’ Lawyer – Preparing the Future Generation

INTERNATIONAL FOCUS

16  Is Personal Data the New Currency?
18  Law Students in the Digital Marketplace
20  Ethical Perspectives on Mandatory Digital Currencies
24  Blockchain and the Right to Be Forgotten
26  The Reality of Crypto-Assets: Tax Issues and Prospects Within the EU
28  Civil and Criminal Liability and Autonomous Robotic Surgery
30  The Cosmetic Industry and Augmented Reality: Biometric Data Collection and the Privacy Paradox
33  "Meet your Robot Judge"
36  E-Voting: Opportunity or Threat?


PARTNERS' & EXTERNALS' PERSPECTIVE

ARE WE READY FOR THE DIGITAL ERA?

Yannick Meneceur
Head of the Digital Development Unit, Council of Europe

The speed at which our society and our lifestyles are being transformed by digital technologies is unprecedented. Artificial intelligence ("AI") is certainly one of the main drivers of this transformation, at the heart of a growing number of services that already populate our daily lives. The term "AI", whose content has evolved substantially since its creation in 1955, has been re-enchanted since the early 2010s and now refers to the various machine learning algorithms (such as deep learning), whose processing results have appeared particularly spectacular not only for image or sound recognition, but also for natural language processing.

For several decades, public decision-makers have been promoting the development of computer science and digital technologies, convinced by the promise of improving our lives, as well as by the prospects of ever more dizzying economic profits. In the competition for the development of "AI" in particular, international and regional regulators, such as the Council of Europe, the European Commission, the OECD and UNESCO, are intervening in their respective fields of competence to try to frame the effervescence of initiatives. The idea for all the regulators is to use the law both as a technique for supervising the use of these new tools, in order to prevent the significant

risks they pose to a certain number of the most fundamental rights and values, and as an instrument for stimulating and developing the market. On reading the texts already drafted or being drafted, the future governance of this technology looks rather balanced between respect for human rights, economic objectives and ethical requirements, with intergovernmental organisations that seem to agree on the need to set up mechanisms for verifying "AI" before it is put into service or introduced onto the market. As it stands, one could be satisfied with the progress made in such a short time, remembering that it took decades in other industrial fields, such as pharmaceuticals, to reach this kind of maturity.

However, the capacity of some of these new legal instruments to effectively prevent violations of fundamental rights and create a real "trustworthy AI" remains questionable for some. And if we ask ourselves the more general question of our preparedness for the challenges of the digital era, we must admit that the current discourses respond to each other with a relatively identical critical argumentation, creating a form of consensus that questions only the risks of infringement of the rights and freedoms of individuals, without ever questioning the relevance of the real project of society that underlies



these new technologies. Thus, a certain number of major challenges seem to have been significantly underestimated in recent years, and at least three series of actions should be taken to really prepare us for this new era: deconstructing the consensus on the neutrality of digital technologies and artificial intelligence (1); objectifying the capacities of artificial intelligence systems (2); evaluating the environmental sustainability of the digital society model (3).

1. Deconstructing the consensus on the neutrality of digital technologies and artificial intelligence

Today, it seems quite intuitive to believe that technology is neither good nor bad in itself, and that we should focus only on its uses. However, this discourse ignores the extremely close link between human societies and the technical system made up of all their artefacts: each major discovery has contributed to substantially reshaping our environment, sometimes extending its effects over several centuries. Thus the scope of the invention of printing went beyond the mere mechanization of the reproduction of works: the Reformation of the Church, the Enlightenment, and access to knowledge in general were all events linked to this invention. The advent of industrial processes in the 19th century also profoundly recomposed the relationships between individuals, as well as our living spaces and modes of governance. Pursuing this continuous dynamic of progress, we would today be at our fourth industrial revolution, with the meeting between "the world of the physical, the digital, the biological and the

innovation" whose tools already exceed the simple sophistication of existing means and feed the hopes of the transhumanists. This is why the originality of the system composed by the interactions between humans and digital technologies requires us to make an effort to decipher the environment being composed in order to grasp its composition and its governmentality. The transformation we are currently undergoing with the translation of the smallest corners of our lives into data for algorithmic processing is not simply an optimization of our modes of operation. This transition is actually leading us towards a whole new model of society, which obviously brings with it technical progress, but also its share of disenchantment, control and even totalitarianism. And this is not just because of the way we use these tools, but because of the structure woven by the tangle of computer and statistical mechanisms that are supposed to have the capacity to better appreciate, in all circumstances, an ever-increasing number of situations. In a very concrete way, the functioning of a company like Amazon already allows us to see how the presence of algorithmic foremen transforms, by their very nature, the work relationships. Fortunately, as we have seen with contact tracing applications during the health crisis, fundamental rights still protect us from many abuses. But the exercise of power over individuals, this biopolitics theorized by Michel Foucault, is today complemented by discrete mechanisms of algorithmic decision-making, increasingly autonomous, whose SYNERGY Magazine | 7



creation has no democratic basis and which would even dismiss the political sphere.

2. Objectifying the capacities of algorithmic systems

But the rhetoric of our digital era, especially about "AI", also fails to address other equally important issues: the structural difficulties in the operation of algorithmic systems, which are constantly being fixed, patched and adapted. Many of us conceive of these machines in an abstract way, ordered and stored in clean rooms in datacenters, whereas we should rather retain the image of a very artisanal steam engine, cobbled together by craftsmen with many pipes and patches. The accumulation of these challenges is such that we should, again, look at the big picture rather than considering these issues in isolation. This would allow us to assess whether these devices are mature enough to move out of the laboratory and operate in the real world, especially for decision-making functions in areas as sensitive as public services or health. To take just one example of these long-identified difficulties, we might consider the approximations of many "AI" systems due to random correlations or

misinterpretations of causality. A model revealing, for example, that asthmatics have a lower risk than the rest of the population of developing serious lung diseases should not lead us to conclude that they have developed some form of protection against pulmonary complications, but rather that they consult specialists more quickly because of their fragility. If this reasoning seems simple and common-sense, how can we be sure that, in the vast intricacy of the links discovered by a machine, thousands (or millions) of parameters do not lead to such confusions?

When added together, we realize that we are in fact facing a very serious technical problem, one that should lead us to question the very project of generalizing machine learning as a viable solution for ever more diverse categories of applications, and even to reconsider the claim that this approach can progress towards general artificial intelligence simply by making learning methods more sophisticated. As a critical reflex, a very artificial balance is most often invoked between hoped-for benefits and risks said to result from the mere misuse of the technology. In other words, due to a lack of technological culture, and wrapped up in the certainty of the neutrality of algorithms, we


may be entertaining illusions forged by the marketing of "AI", and we are wrongly leaving any effort to measure technical problems to technicians alone.

3. Assessing the environmental sustainability of the digital society model

Deconstructing the consensus on the neutrality of technologies and objectifying the capacities of algorithmic systems are therefore two prerequisites that are all too often ignored in the construction of new forms of governance in the digital era, out of concern for not losing rank in the technology race. However, it is to be feared that another reality will impose itself with great brutality on all the competing blocs: the limit of our planet's resources, which have been largely overexploited for decades. Once we have reached the end of the available rare earths, how will we continue to produce the physical materials that support digital technologies? The shortage of semiconductors resulting from the disorder of the pandemic is a clear warning signal of our current dependence and the resulting fragility.

Kate Crawford, a professor at New York University and a researcher at Microsoft, has illustrated in her Atlas of AI the profound impact of the development of this technology on our planet and the related power issues. Crawford first physically visited the places where lithium is extracted, a material essential for the batteries of mobile devices and electric cars. The findings are overwhelming and remind us of the consequences of the 19th-century gold rush, when vast areas were rendered barren to enrich cities and individuals that are still prosperous today (already in the western United States). The parallel with the current logic of the

digital industry (massive extraction of minerals to build materials, massive extraction of data to run algorithms, concentration of the wealth produced with very little "trickle-down", indifference to the damage caused) lets us perceive a model of pure and simple plundering that is absolutely not sustainable over time. The author thus invites us to an obvious awareness: we are exhausting considerable quantities of the earth's materials to serve the space of a few seconds of geological time. Crawford thus demonstrates with great acuity the very short-term vision of current public policies, all the more flagrant if one links her conclusions to those of the IPCC on the evolution of our climate.

Our ever-increasing dependence on digital technology therefore has an exorbitant cost that is difficult to sustain. In response to these questions, or to those concerning the monstrous energy consumption of data centres and blockchains, new technical solutions are often put forward that are supposed to balance the carbon footprint, but whose long-term effectiveness remains to be proven. Embracing the technical approach of engineer-entrepreneurs alone will not be enough to constitute a viable project for society, and it seems urgent to add a political dimension that asks what kind of world we want to live in. Is "all-digital" absolutely the right model? Shouldn't other projects be put into competition and debated? Can't the European continent, which is so rich in values, contribute to raising awareness through ambitious public policies for the environment, instead of giving in to harmful competition with other continents?




EUROPRIVACY, A DIGITAL BY DESIGN CERTIFICATION SCHEME FOR GDPR COMPLIANCE

Dr Sébastien Ziegler

Chairman of the Europrivacy International Board of Experts, Europrivacy

Europrivacy is a certification scheme developed through the European Research Programme Horizon 2020 to assess data processing activities and certify their compliance with the obligations of the European General Data Protection Regulation (GDPR) and complementary data protection regulations. It is managed by the European Centre for Certification and Privacy (ECCP) in Luxembourg, under the supervision of an International Board of Experts. Europrivacy has been brought by the Luxembourgish National Commission for Data Protection (CNPD) to the European Data Protection Board (EDPB) for endorsement under Art. 42 GDPR. It is the first certification scheme under review for official recognition as a European Seal.

GDPR certification – a powerful mechanism not yet exploited

There are over 70 references to certification in the GDPR, including for assessing the compliance of data processors (Art. 28.5 GDPR), for cross-border data transfers (Art. 42.2, 46.2.f GDPR) and for assessing the adequacy of the technical and organizational measures put in place (Art. 32.3 GDPR). As stated in the Regulation,

the purpose of certification is for “demonstrating compliance with this regulation of processing operations by controllers and processors” (Art. 42 GDPR) and for “allowing data subjects to quickly assess the level of data protection of relevant products and services” (Recital 100 GDPR). As a consequence, certification under the GDPR is subject to very specific requirements. For instance, it needs to be aligned with the evolution of the Regulation and its related jurisprudence and soft law, including EDPB publications. That is why Europrivacy is supported by an International Board of Experts in charge of continuously monitoring the evolution of data protection obligations and updating the scheme accordingly. In other words, Europrivacy is a living scheme in osmosis with the regulatory environment.

Another requirement is the specific focus on certification of data processing activities. Consequently, certification of management systems, such as ISO/IEC 27001 and 27701, is not eligible under Art. 42 GDPR. The benefit of this approach is


twofold: firstly, it delivers a more granular and reliable indication of compliance; secondly, it enables data controllers and processors to progressively certify data processing, step by step, in decreasing order of priority. To ensure that such an approach does not become too costly, in particular for small and medium-sized enterprises (SMEs), an important part of the research has been dedicated to maximizing the efficiency of the certification process: increasing the reliability of the assessment while optimizing the process in terms of time and cost.

The benefits of a GDPR certification

Europrivacy has been designed to address precisely the abovementioned requirements. It makes it possible to assess the compliance of data processing activities with all the GDPR obligations whose non-observance could entail a risk for the rights and freedoms of data subjects or for the applicants. The certification process starts with a systematic assessment of compliance with the data protection obligations, in order to identify residual non-compliances and to reduce the related

legal, financial and reputational risks for the applicant. Once the processing activity is validated, certification enables the applicant to demonstrate compliance in order to build trust and confidence, develop competitive advantages, and improve their reputation and market access. A GDPR certification makes it possible to recognize compliance efforts and transform them into an asset that can become a potential source of revenue for the applicant. Another benefit of a Europrivacy certification is that all certified applicants are kept informed about any changes in data protection compliance requirements identified by its International Board of Experts.

Applicability to emerging technologies

As Europrivacy has been developed in the context of the European Research Programme Horizon 2020, it has been designed since its inception to encompass data processing involving innovative technologies such as artificial intelligence, distributed ledger technologies and the Internet of Things. It works




closely with the research community and is currently involved in several European research projects in the domains of e-health and medical data, artificial intelligence, smart grids and connected vehicles. This has led to a unique certification scheme model that combines core criteria with complementary domain- and technology-specific criteria. It enables the same scheme to be used for certifying all sorts of data processing, while taking into account technology- and domain-specific obligations.

Addressing national obligations and extensibility to non-EU jurisdictions

GDPR certification requires taking national obligations into account. Europrivacy has researched and developed an innovative mechanism to address these national obligations in the certification process. It also provides supporting profiles on complementary national data protection obligations for each EU jurisdiction, as well as for a series of non-EU jurisdictions. Indeed, another characteristic of Europrivacy is its ability to be easily extended to non-EU jurisdictions.

ISO compliance

While Europrivacy’s prime focus is on data protection obligations, the scheme itself has been designed to be fully compliant with both ISO/IEC 17065 and ISO/IEC 17021-1. It is easily combinable with ISO certifications such


as ISO/IEC 27001 or 27701, complementing each other to simplify the certification process.

Building a global community of qualified partners

The European Centre for Certification and Privacy focuses on its role as scheme owner, ensuring that the certification scheme is aligned with the evolution of the norms. The centre has developed an ecosystem of qualified certification bodies, law firms and consulting firms able to deliver support and certifications to data controllers and processors. It encourages the global adoption of Europrivacy to support compliance and reduce risks related to data processing in a growing data economy and digital single market.

Online academy, community website, resources and tools

Europrivacy has the ambition to develop and propose a new model of certification and user experience. In order to support the use and adoption of Europrivacy, the European Centre for Certification and Privacy has developed an online Academy: https://academy.europrivacy.com. It delivers a sequence of three training programmes: an Introductory Course, a Course for Implementers and a Course for Auditors. Each programme is provided through online videos and is completed by an online exam to validate the acquired knowledge and understanding of the scheme. The implementer and auditor courses provide formal qualifications that demonstrate the ability of the


qualified experts.

In parallel, ECCP has developed an online Community website: https://community.europrivacy.com. It provides many online resources, including over 750 reference documents, templates and guidelines to support GDPR certification. It offers three customized sets of resources, for Data Protection Officers, qualified implementers and qualified auditors. Finally, it facilitates access to online tools and technologies from the research to support data protection compliance.

Promoting dialogue, cooperation and knowledge sharing: Privacy Symposium 2022

Europrivacy has been designed to be a living scheme, supported by a living community of experts, to address a fast-evolving regulatory and normative environment. The European Centre for Certification and Privacy supports international dialogue and cooperation in data protection and compliance. That is why it is collaborating with the Council of Europe, ELSA, the European Centre for Cyber Security and other organizations to organize the Privacy Symposium conference (www.privacysymposium.org) in Venice from 5 to 7 April 2022. The conference aims at promoting international dialogue, cooperation and knowledge sharing. It will discuss the evolution of data protection regulations, at the national and international level, and their interaction

with innovative technologies such as artificial intelligence, distributed ledger technologies (e.g., blockchain), the Internet of Things and edge computing. A specific programme will be dedicated to e-health and medical data compliance with the GDPR. The conference also includes a call for papers open to researchers and practitioners. The best papers will be presented in Venice and published by Springer.

Conclusion

Europrivacy leveraged the foundations of ISO principles to research and deliver a highly innovative and efficient certification scheme with a new model of born-digital certification. It provides an agile and living certification scheme, able to address emerging technologies and a fast-evolving regulatory environment. It makes it possible to efficiently assess the compliance of all sorts of data processing activities, from regular ones to highly innovative ones. Technology is an important enabler, but it is only a means, not an end. While our ambition is to leverage technology to deliver a new user experience of compliance as a service for all stakeholders, our priority is to build a community of experts and partners with a true passion for data protection and compliance. As ELSA members, you are more than welcome to join us. If you are interested, feel free to contact us at contact@europrivacy.org.




THE ‘DIGITALLY CAPABLE’ LAWYER – PREPARING THE FUTURE GENERATION

Matthew Carl

Senior Library Manager, The University of Law

The Covid-19 pandemic has ushered in significant digital transformation across the legal and legal education sectors. Dispersed working, deeper online collaboration, further integration of AI and virtual meetings are just a few examples of how law firms and legal education providers have had to pivot to meet the demands of an increasingly digitised world. The wider UK economy has also seen a shift towards the deployment of automation and AI, with e-commerce rising in 2020 by 4.5 times the average for the years 2015–2019. Additionally, the way solicitors in the UK qualify is changing radically, with the introduction of the Solicitors Qualifying Examination (SQE) for the first time this year. This development brings with it a range of considerations for legal training providers and law firms, including how the SQE’s ‘standardised approach’ imparts the skills graduates need to succeed in the modern law firm or workplace.

All the above raises the question: how do we prepare the future generation of lawyers for the digital workplace and digital economy? What does that look like, and what skills will they need to succeed in the digital era? The University of Law’s Digital Academy is our


response to those questions: an integral part of the university’s future academic model that provides intensive one-to-one digital support to students. The Digital Academy enables law graduates to understand their current digital capabilities by using the JISC Discovery Tool, a self-reflective benchmarking platform that tracks and analyses their digital capabilities over the course of their studies. The JISC Digital Capability framework provides the basis for the ‘well-rounded, digitally capable legal professional’ to develop a wide range of digital skills to succeed and thrive in the digital economy. These range from information and data literacy to digital wellbeing practices. By using the JISC Discovery Tool, students can reflect on and engage with learning that develops the skills requiring improvement. This not only promotes the development of digital skills but also instils a commitment to CPD from an early stage.

Digital capabilities such as coding and the creation of digital media may not be directly relevant to law firms. To think like this is a missed opportunity, however, as developing these skills can increase a graduate’s


ability to interface and adapt to new forms of legal technology, and a greater commercial understanding of the tech sector, a rapid growth area for most law firms. With Canva now valued at $15bn, an understanding of online media creation can boost a student’s knowledge of, and insight into, the processes that companies like Canva are engaged in.

The other key aim of the Digital Academy is to create a professional training environment, one that mirrors best practice within learning and development departments in law firms. The transition from academic to work-based CPD learning can often be challenging for law graduates and can lead to disruption, as graduates may need more time to adjust to the internal training regimes of law firms. Our content and focus on adult learning pedagogy ensure that graduates become familiar with the type of training and content delivered by traditional learning and development departments, allowing them to ‘step into’ formal training structures with ease.

In addition to, and complementing, the Digital Academy is the ULaw Tech and Research Academy (ULTRA), an online platform that provides updates and learning on the latest legal tech developments, such as legal AI and blockchain. Developing an awareness of these types of technologies ensures that graduates are law-firm ready from day one and can use this technology to develop efficient and effective workflows.

By focusing on building digital capabilities, law graduates can build for themselves a solid foundation of digital competencies, one that enables them to excel within a rapidly changing environment. This should enable them to enter law firms with a broad range of digital skills, allowing them to adapt and evolve their practice in response to any technological developments or new ways of working. Having an awareness of legal tech and AI at an early stage can really help new solicitors leverage the most from this technology, allowing them to achieve greater productivity and ROI from day one.



INTERNATIONAL FOCUS

IS PERSONAL DATA THE NEW CURRENCY? Personal Data as a Tradeable Commodity in the Digital Era: A European Perspective

Eirini Vyzirgiannaki

Member of ELSA Athens

As we transition into the digital era, the traditional economic model shifts towards a data-driven one. Not only is there an enormous volume of digital data available, but also an array of advanced methods to process it and extract monetary value from it. The personal data of European citizens has the potential to be worth nearly €1 trillion annually by some estimates.1 In this light, could one argue that data is the currency of today's digital economy? Is it possible to conceptualise personal data as a tradeable commodity? Given that data protection amounts to a fundamental right in Europe, can personal data be treated as a mere economic asset?

Three data-centred business strategies are prevalent in contemporary digital markets.2 In the ‘zero-price’ or ‘data as payment’ model, consumers provide their personal data in exchange for digital services or products that are otherwise advertised as free of charge. In the ‘personal data economy’ users may supply their data to businesses and obtain value from this exchange. Finally, the ‘pay for privacy’ model

1  European Commission, ‘Questions and Answers - Data protection reform package’ (24 May 2017) <https://ec.europa.eu/commission/presscorner/detail/en/MEMO_17_1441> accessed 1 October 2021.
2  Stacy-Ann Elvy, ‘Paying for Privacy and the Personal Data Economy’ (2017) 117 Columbia Law Review 1369.

requires consumers to pay a higher price if they do not consent to the collection and processing of their data, while offering discounts to those who do. Data subjects are, thus, perceived as owners of wealth that can be shared on their own terms. Consumers may supply their personal data as a means of payment, a non-pecuniary counter-performance, in agreements for digital services and content. In turn, businesses can harness the collected data to generate revenue either directly or indirectly.3 Namely, they can extract monetary value by selling or licensing it. They can also increase profit by using consumers’ data for product improvement, personalised services or offers, and targeted advertising.

However, treating personal data as a non-pecuniary currency cannot accurately reflect the value exchange that occurs in a transaction. As personal data is inherently dynamic and fluid and cannot obtain a standardised value, it is challenging to quantify its commercial worth.4 Furthermore, as a non-depletable asset, it can be exploited indefinitely

3  Beate Roessler, ‘Should Personal Data Be a Tradable Good? On the Moral Limits of Markets in Privacy’ in Beate Roessler and Dorota Mokrosinska (eds), Social Dimensions of Privacy (CUP 2015).
4  Rebecca Kelly and Gerald Swaby, ‘Consumer Protection Rights and “Free” Digital Content’ (2017) 23(7) Computer and Telecommunications Law Review 165.


to generate more profit. Therefore, such data-driven transactions are often asymmetrical, and consumers are likely to provide personal data whose ultimate value exceeds that of the product or service they receive in return.5 The uneven bargaining power compromises the level of protection afforded to personal data and may have a discriminatory effect.6 Privacy may become a luxury available to few, while those unable to afford it would relinquish access to their data for a discount. Similarly, individuals may be denied goods and services unless they disclose their data.

In the same vein, the high level of protection granted to personal data within the European legal order limits the leeway for its commercialisation, lucrative as it may be.7 Being perceived as a projection of individuality, intrinsically tied to personhood, personal data cannot be reduced to a mere commodity. The right to data protection constitutes an autonomous fundamental right; the pertinent legal framework in Europe includes Articles 7 and 8 EUCFR, Article 16 TFEU, the GDPR, Article 8 ECHR and the modernised Convention 108+ of the Council of Europe. Nevertheless, as per Recital 4 GDPR, data protection is not an absolute right but must be considered in relation to its function in society and be balanced against other fundamental rights. In this regard, the tradability of personal data is supported by the freedom to conduct business and form contracts. Notably, it is underpinned by the right to informational self-determination, which encompasses relinquishing control over one’s data. Eliminating such data-centred transactional practices would also undermine the free flow of data and the proper functioning of markets. As the legal system allows the commercial exploitation of other incorporeal personality attributes with an economic value, such as one’s name, image and reputation,8 the commodification of personal data appears conceivable within the European legal order.

With Directives 2019/770 and 2019/2161, the EU legislature expanded consumer protection to online contracts for digital services and content ‘paid’ with personal data rather than with money and, thus, acknowledged that personal data functions as a de facto price or contractual counter-performance. Yet, it stipulated that personal data cannot be reduced to a mere economic asset and shall fall within a special protective regime even when treated as a tradable commodity.9 Contract law shall apply to these transactions only in tandem with data protection legislation. The principles shaping the European perspective on data protection, namely privacy, transparency, autonomy and non-discrimination, offer legitimate ground to restrict freedom of contract. On that premise, contractual arrangements concerning personal data shall be valid insofar as individuals retain their rights as data subjects, currently enshrined in the GDPR. Hence, even while bound by a contractual link, counterparties shall have their data processed only if a lawful basis exists and shall be able to exercise their rights to access data, withdraw consent or even ‘be forgotten’.

All in all, delving into the dual nature of personal data and reframing it as a tradable commodity showcases the opportunities and challenges that accompany our transition into the digital era. Although some conceptual barriers in legal thinking may need to be broken down as we navigate this novel landscape in Europe, the way forward passes through the core principles, values and norms that underpin our legal order.

9  Václav Janeček and Gianclaudio Malgieri, ‘Data Extra Commercium’ in Sebastian Lohsse, Reiner Schulze and Dirk Staudenmayer (eds), Data as Counter-Performance – Contract Law 2.0? (Nomos 2020).

The dual nature of personal data as the subject of a protected fundamental right and as an economic good has increasingly gained normative recognition.

5  Gianclaudio Malgieri and Bart Custers, ‘Pricing Privacy – the Right to Know the Value of Your Personal Data’ (2018) 34 Computer Law and Security Review 289.
6  Giuseppe Versaci, ‘Personal Data and Contract Law: Challenges and Concerns about the Economic Exploitation of the Right to Data Protection’ (2018) 14 European Review of Contract Law 374.
7  European Data Protection Supervisor, ‘Opinion 8/2018 on the legislative package “A New Deal for Consumers”’ (5 October 2018).
8  Giorgio Resta, ‘The New Frontiers of Personality Rights and the Problem of Commodification: European and Comparative Perspectives’ (2011) 26 Tulane European and Civil Law Forum 33.


International Focus

LAW STUDENTS IN THE DIGITAL MARKETPLACE How will the employment structure of law students change in the age of algorithmisation?

Szymon Skalski

Former Treasurer for ELSA Cracow

The post-pandemic world is a highly uncertain reality, especially in a matter as sensitive to change as the labour market. While the legal industry is often considered rather conservative when it comes to changing well-known solutions, it seems impossible that the long-term forced digitalisation of work will not leave a significant mark on the work of lawyers. This forces us to take seriously the discussion on the future fate of certain pillars of the whole industry. From the perspective of the employment of law students, one of the key aspects seems to be the substitutability of their work by more or less sophisticated algorithms or computer programs (AI or non-AI). However, to address these concerns as a whole, it is first necessary to determine the nature of the work that students most often perform in law firms. Three aspects best show the nature of work in a law firm before one becomes an advocate or legal adviser: administrative work, research, and drafting simple procedural documents.

Administrative work, which usually involves helping to organise work in a law firm and putting documents in order, seems to be one of the first tasks that may be completely delegated to algorithms. Basing the firm's work on a simple algorithm for

coordinating the work of employed lawyers, organising documentation, and keeping calendars seems today not so much a vision of the future as a very real possibility. What is more, such solutions do not have to be implemented using machine learning; sufficiently advanced conventional software could replace the work of students, secretaries, or office managers in this respect, though it might still require some level of human oversight, whereas AI would conduct those operations fully on its own and present workers with final results daily. Taking into account initiatives such as the e-Justice strategy1 introduced in the European Union, and the awareness of a certain backwardness of the judiciary in relation to today's world, it seems only a matter of time before the fully automated circulation of documents between parties, courts, and attorneys is introduced. The work of organising documents will then become completely redundant, or at least simplified to such an extent that it eliminates the need to employ additional staff.

Perhaps the most essential part of the work of young lawyers in law firms is the substantive task of finding the information necessary to draft pleadings and

1  https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019XG0313(01)&rid=7


conduct cases. Unfortunately, this work, at a certain level of complexity, is also quite easily substitutable. Simple expert systems used for data analysis began to emerge as early as 1965, and since then, with few interruptions, the capabilities of the algorithms have only grown. Of course, complex work, especially work that involves ethical issues, such as criminal law, still needs to be not only supervised but also, in most situations, managed by a human. It seems, however, that the legal market will soon have to resolve the dilemma of whether to stick to a conservative stance or move with the times and delegate many hours of reading legal acts and regulations to algorithms. As of today, algorithms are capable of acquiring and processing vast amounts of data at a rate unattainable by any ensemble of humans. Of course, elements such as algorithmic bias and EU regulations limit the possibility of using and developing the most invasive algorithms, but this does not change the fact that at the simplest level of research, which is just collecting information without processing it in detail, AI capabilities far exceed human ones. For many law firms, this may be a sufficient reason to stop offering internships and training to students.

The final aspect of the work identified is the drafting of simple procedural documents. While this is a long shot, it seems that simple documents, especially those based on accounting documents such as invoices, could easily be drafted on a mass scale without any human input. It should be noted that, especially when dealing with a huge number of very similar commercial cases, computer programs that compose the content of the claim themselves are already widely used today. While some human supervision is still necessary, systems that completely bypass lawyers, especially in very simple and repetitive cases, do not seem to be far in the future.
Of course, there are again legal and customary barriers, but the market and lobbying groups have a great influence on the laws that are created, which may result in faster adaptation of the law to technological realities.
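The kind of mass-scale, template-driven drafting described above can be sketched in a few lines. The following Python snippet is a minimal illustration only, not any real firm's system: the `Invoice` fields, template wording, and sample data are all invented for the example.

```python
from dataclasses import dataclass
from string import Template

# Hypothetical invoice record; the field names are invented for this sketch.
@dataclass
class Invoice:
    debtor: str
    number: str
    amount_eur: float
    due_date: str

# A fixed, lawyer-vetted template; the program only fills in the blanks.
DEMAND_TEMPLATE = Template(
    "Dear $debtor,\n\n"
    "Invoice $number for EUR $amount, due on $due_date, remains unpaid.\n"
    "We request payment within 14 days of receipt of this letter.\n"
)

def draft_demand(invoice: Invoice) -> str:
    """Compose a simple payment demand from structured data, with no human input."""
    return DEMAND_TEMPLATE.substitute(
        debtor=invoice.debtor,
        number=invoice.number,
        amount=f"{invoice.amount_eur:,.2f}",
        due_date=invoice.due_date,
    )

# Drafting "on a mass scale" is then just a loop over the case list.
letters = [draft_demand(inv) for inv in (
    Invoice("Acme GmbH", "2021/041", 1250.00, "30 September 2021"),
    Invoice("Borg sp. z o.o.", "2021/042", 980.50, "5 October 2021"),
)]
print(letters[0])
```

Real claim-drafting systems add document assembly rules, court-form layouts, and a human review step, but the core mechanism, structured data poured into vetted templates, is as simple as this.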

When diagnosing the current situation, therefore, one must bear in mind the sensitivity of the matter in question. Many people have a negative attitude towards changes that introduce algorithms into widespread use, especially in industries dealing with such important aspects of life in 21st-century society as the legal industry. However, assuming an economic paradigm of profitability while maintaining a focus on maximising the results of lawyers' work, progressive algorithmisation seems inevitable. The industry will therefore be faced with the dilemma of how to engage law students in such an important aspect of their education as learning about the market environment in which they will work, in a situation where the work they can do is becoming less and less necessary.

The answer is to be found at university. In the situation depicted in this article, it seems necessary for universities to take on the role of law firms in offering work experience opportunities to students. Initiatives to support this approach include moot courts and all sorts of local and international student exchange programmes that improve legal skills. We cannot leave students in a situation where, on the one hand, the market does not need their work while they are still at university and, on the other hand, when they finish their studies they are required to enter the highest level of legal work instantly, as simpler tasks are delegated to algorithms and computer programmes. It is for this reason that legal education in Europe and worldwide should adapt rapidly to the realities of a changing world. This presupposes actions such as including the law of new technologies as a compulsory subject, making it possible to learn programming languages, and increasing the digital awareness of future lawyers. Without these and many other skills, future law students will be vulnerable to all the radical changes in the labour market caused by technology. Whether we like it or not, the legal profession is facing far-reaching changes, and we need to prepare for them already at the stage of our university studies.


PARTNERS' & EXTERNALS' PERSPECTIVE

ETHICAL PERSPECTIVES ON MANDATORY DIGITAL CURRENCIES

Răzvan-Ștefan Bunciu

Member of ELSA Bucharest

After the financial crisis of 2008-2010, the digital currency market gained momentum to such an extent that I wonder whether ‘cash’ will survive the coming decades. There are several types of digital currencies that share some peculiarities: they can be accessed exclusively by electronic means, and they are sometimes referred to as ‘cybercash’.1 I will use this term for digital currencies as it clearly portrays the contrast between this type of currency and ‘traditional’ ones. The suggestion for the mandatory exclusive use of cybercash is based on considerations such as the speed and simplicity of virtual transfers. In this respect, China offers an example of a state-led digital currency (e-yuan) that aims to create a society without cash.2 The US and the EU also want to create digital versions of their currencies.3 These decisions seemingly have a purely economic scope, and they aim at high and sustainable growth, especially in the context of the struggle for global hegemony. The topic I discuss is whether it is ethical

1  Jake Frankenfield, ‘Digital Currency’ (Investopedia, 10.8.2021) <https://www.investopedia.com/terms/d/digital-currency.asp> accessed 10.10.2021.
2  Andrew Browne, ‘Bloomberg New Economy: China Cashless Economy and Surveillance’ (Bloomberg, 20.2.2021) <https://www.bloomberg.com/news/newsletters/2021-02-20/bloomberg-new-economy-china-cashless-economy-and-surveillance> accessed 10.10.2021.
3  Luca D’Urbino, ‘The digital currencies that matter’ (The Economist, 8.5.2021) <https://www.economist.com/leaders/2021/05/08/the-digital-currencies-that-matter> accessed 10.10.2021.

to impose technology and force people to give up traditional means of payment (such as cash) in order to achieve a competitive economy.

J.S. Mill, the father of utilitarianism, argues that a doctrine is moral if ‘actions are right in proportion as they tend to promote happiness, wrong as they tend to produce the reverse of happiness’.4 Therefore, utilitarianism has at its core the concept of happiness, which Mill sees as the sum of the happiness of all people, so ‘that each person’s happiness is a good to that person, and the general happiness, therefore, a good to the great aggregate of all persons’.5 Consequently, it is ethical to force the mandatory exclusive use of cybercash, as long as it brings more benefits. The idea that Rawls6 considers weak in utilitarianism is that a person must act according to the interests of the group, not according to his/her own interest, so the question of how happiness is distributed within the group is not considered. Subsequently, a person acts morally if he or she perceives himself or herself as a spectator and acknowledges the interests of the group, which may therefore be to his or her disadvantage. Rawls argues that utilitarianism makes no distinction between people, and that happiness is different for

4  John Stuart Mill, Utilitarianism (first published 1861, The University of Chicago Press 1906) 9.
5  ibid 53.
6  John Rawls, A Theory of Justice (first published 1971, Harvard University Press 1999) 19-22.



each person.7 I believe Rawls would consider the forced use of cybercash unethical, as it would create unhappiness among some people who have to submit to the general will.

Kant’s deontological ethics operate within the framework of the categorical imperative, contained in the following quote: ‘So act that you use humanity, whether in your own person or in the person of any other, always at the same time as an end, never merely as a means’.8 Thus, the person, as a rational being, is the one whose interest takes precedence and must be treated as an end. In this sense, the human being possesses the right to guide his/her decisions according to his/her own autonomous will, which must come from within, because he/she is a free being.9

7  ibid 25.
8  Immanuel Kant, Groundwork of the Metaphysics of Morals (first published 1785, Cambridge University Press 1997) 38.
9  ibid 38-41.

Consequently, proponents of deontological ethics believe that the spread of digitalisation transforms the deliberative society into one where the state’s own decisions prevail, i.e., people transition to a regime that restricts their freedom. The critique is that people will accept such transitions without trying to understand or criticise them.10 Given the many strengths of cybercash, is it moral to force its use upon individuals? From a utilitarian perspective, the answer would be positive: virtual transfers are faster, the money is kept safe, cybercash reduces pollution according to some studies,11 the production of banknotes generates a high economic and environmental cost, etc. Thus, states’

10  Andreas Spahn, ‘Digital Objects, Digital Subjects and Digital Societies: Deontology in the Age of Digitalisation’ (2020) 11 MDPI 1, 10-11.
11  Hass McCook, ‘Under the Microscope: The Real Costs of a Dollar’ (CoinDesk, 5.7.2014) <https://www.coindesk.com/microscope-real-costs-dollar> accessed 13.10.2021.



intervention to force the exclusive use of cybercash would be appropriate, since this ethic assumes that it is moral to act in a way that will create, in sum, greater benefits for the whole group. Rawls criticises this ethic by stressing that the distribution of happiness within the group is not uniform, so that one’s unhappiness may be as great as the total happiness of the others.12 There will be individuals who do not agree with the mandatory exclusive use of cybercash for several reasons; the negative aspects of cybercash will therefore generate unhappiness. For example, it may lead to job losses in the financial sector.13 There is also the issue of data security: one of the fears is that states will collect the data and transactions of consumers, raising the issue of violation of the right to privacy. While in democratic states there is at least hope that data will be protected, in totalitarian states forcing the exclusive use of cybercash could lead to greater control. Last but not least, those who fail to keep up with such changes will suffer, given that there are no payment alternatives. For example, in China most payments are already made only through cards or e-applications. This has had dire consequences, given that millions of people do not have access to an internet

12  Rawls (n 7) 19-24.
13  Waseem Sadiq, ‘Digital currency: The good, the bad and the ugly’ (ITProPortal, 6.6.2018) <https://www.itproportal.com/features/digital-currency-the-good-the-bad-and-the-ugly/> accessed 14.10.2021.


network that could allow them to operate in a digital economy.14 We might thus state that forcing the use of cybercash does not bring widespread happiness. From a Kantian perspective, persons, as rational beings, should not be forced to behave in a particular manner, because for an action to be considered moral, it must spring from within: ‘Morality is thus the relation of actions to the autonomy of the will, that is, to a possible giving of universal law through its maxims [...]. The dependence upon the principle of autonomy of a will that is not absolutely good (moral necessitation) is obligation. This, accordingly, cannot be attributed to a holy being.’15

Additionally, Rawls drew a connection between his principles and Kant’s, leading to the idea that persons (subject to moral laws) are rational and free to act as they see fit for their own good.16 In conclusion, the ethics of Kant and Rawls do not justify the mandatory exclusive use of digital currencies, since coercion is exercised and it will generate unhappiness. The individual must not be sacrificed for the common good, as each and every person’s pleasures must be acknowledged.

14  Simon Kemp, ‘Digital 2021: China’ (DataReportal, 9.2.2021) <https://datareportal.com/reports/digital-2021-china> accessed 14.10.2021.
15  Kant (n 9) 46.
16  Rawls (n 7) 221.





International Focus

BLOCKCHAIN AND THE RIGHT TO BE FORGOTTEN

Carlos Eduardo Pereira

Treasurer of the International Board of ELSA 2020/2021

In this article, I ponder the future of data management and how we are going to balance technological systems with legal regulation. As this is an open and extensive topic, I am going to focus on one specific personal right, the right to erasure, and examine it against one technological development, the blockchain system.

The European Union’s General Data Protection Regulation (“GDPR”) was a significant advance in fixing a new legal framework for data protection and its related rights. It is a European regulation with the goal of protecting personal rights and setting limits on the processing of personal data, achieving the status of an international human right with respect to access to information. This was implemented specifically in Article 17: the so-called ‘right to be forgotten’ was replaced by a more limited right to erasure in the version of the GDPR adopted by the European Parliament in March 2014. Article 17 establishes that: “the data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay…”. Within its scope, the article gives European citizens the chance to request the deletion of processed data when it is no longer necessary in relation to the purposes for which it was collected, or when the data subject has withdrawn consent. Conversely, it constitutes an obligation for the data controller to erase this data.


Meanwhile, alongside this new European legal context for data protection, automated systems are increasing their capacity to store data in every device and IT system. More and more, technology will play a decisive role in saving and processing data. The blockchain is an example of a technological concept for recording information, operating mainly in the financial system as a fintech instrument. It consists of a distributed digital ledger in which transactions are organised into blocks and added to each participant’s copy of the ledger. The system has its origin in the financial markets, specifically following the bitcoin structure, yet it has a certain autonomy from the cryptocurrency system, securing and processing blocks of data with the support of cryptography.

The first element to take into consideration in this relation between law and technical enforceability is transparency, and blockchain’s structure must be understood in that light. Every blockchain user is assigned a public address that in no way identifies them; this information is completely open, and data subjects can view holdings and transactions at will. Transparency also requires defining the role of each participant, establishing which entity acts as the controller and which as the processor, or whether a joint controllership will be implemented. Related to this is consent management for blockchain users, where we need to understand whether the system satisfies the informational requirements of Article 7(2) GDPR: consent “shall be presented in a manner which is clearly distinguishable


from the other matters, in an intelligible and easily accessible form, using clear and plain language”. Consent must also be capable of being withdrawn by the data subject at any time.

An important element to highlight is the immutability of the information processed and stored in the system, achieved through cryptographic hashes, which ensure that the data cannot be changed. Since blocks of information cannot be updated or deleted, this creates an uncommon system for the management of data, described by a scheme called CRAB: create, read, append, and burn. Append replaces the update operation: a new block is added to the blockchain and the ‘world state’1 changes. The terms and conditions of using such a system therefore demand that we never put in any data that may subsequently need to be modified or deleted.

Some positions hold that if data is encrypted and stored without the encryption key, this can be considered erasure, so personal data can be stored on a blockchain even though it is not properly deleted. Others argue that a better solution is to store only encrypted, hashed personal data on the blockchain and, if a data erasure request is accepted, to reliably throw away the encryption keys so that the data becomes anonymous and unrecoverable. This is the closest one can come to full deletion. According to this interpretation, these mechanisms ensure the security of the stored data, satisfying a requirement of the GDPR. By way of comparison, it is like having a safe box with a valuable object inside: we do not open it and take out what is inside, we simply keep the safe closed forever without a clue about the code. It does not destroy the information; it just destroys the access. In conclusion, my operational advice is not to insert personal data into these specific technological instruments.

1  The ‘world state’ is the sum of all operations until now.
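The immutability and key-destruction ideas discussed in this article can be made concrete in a short sketch. The following Python snippet is a toy illustration, not a real blockchain: the XOR routine stands in for proper encryption (e.g. AES), and all class and variable names are invented for the example. It shows how hash links make past blocks tamper-evident, and how discarding an off-chain key renders on-chain personal data permanently inaccessible, the locked safe whose code nobody knows.

```python
import hashlib
import json
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption (e.g. AES-GCM); illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

class ToyChain:
    """Append-only ledger following CRAB: create, read, append, burn; never update."""

    def __init__(self) -> None:
        self.blocks = []   # the on-chain, immutable part
        self.keys = {}     # per-subject encryption keys, held OFF-chain

    def append(self, subject: str, personal_data: str) -> None:
        key = self.keys.setdefault(subject, secrets.token_bytes(16))
        block = {
            "subject": subject,
            "data": xor_cipher(personal_data.encode(), key).hex(),
            "prev": self.blocks[-1]["hash"] if self.blocks else "0" * 64,
        }
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        self.blocks.append(block)

    def verify(self) -> bool:
        # Any change to an old block breaks every later hash link.
        prev = "0" * 64
        for b in self.blocks:
            core = {k: b[k] for k in ("subject", "data", "prev")}
            digest = hashlib.sha256(
                json.dumps(core, sort_keys=True).encode()).hexdigest()
            if b["prev"] != prev or b["hash"] != digest:
                return False
            prev = b["hash"]
        return True

    def read(self, subject: str) -> list:
        key = self.keys.get(subject)
        if key is None:
            raise KeyError("key burnt: ciphertext remains, data is unrecoverable")
        return [xor_cipher(bytes.fromhex(b["data"]), key).decode()
                for b in self.blocks if b["subject"] == subject]

    def burn(self, subject: str) -> None:
        # 'Erasure' request: discard the key; the chain itself is never touched.
        self.keys.pop(subject, None)

chain = ToyChain()
chain.append("alice", "alice@example.org")
chain.append("bob", "bob@example.org")
assert chain.verify() and chain.read("alice") == ["alice@example.org"]
chain.burn("alice")   # after this, chain.read("alice") raises: the safe stays locked
```

Note the design choice the article describes: the ciphertext lives on-chain forever, while erasure is effected entirely off-chain by destroying the key, so whether this satisfies Article 17 remains an open interpretive question.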



International Focus

THE REALITY OF CRYPTO-ASSETS: TAX ISSUES AND PROSPECTS WITHIN THE EU

Giacomo Benaglia

Member of ELSA Bologna

As is typically the case with changes and novelties dictated by technology, in recent years lawmakers and institutions have found themselves more and more often in the position of ‘chasing’ when pursuing an effective definition and potential regulation of such innovations and their impact on society. A peculiar phenomenon that has gathered great relevance in today’s scenario is that of crypto-assets, with a steeply increasing number of actors involved, a market capitalisation of more than $2 trillion and more than 200 million users.1 Given the magnitude the industry has reached, numerous countries around the world have pursued regulation of crypto-related activities from a fiscal standpoint.2 While in constant evolution and characterised by differences and peculiarities, crypto-assets can be broadly defined as being based, as their name suggests, on cryptography, possessing the intangible nature of, essentially, computer code. The peculiarity of the system they are based on, detached from the issuance of a central bank, allows anonymity and decentralisation, while simultaneously presenting the capacity to conduct

1  Bank of America, ‘BofA Global Research Launches Coverage of Digital Assets’ (4 October 2021) <https://newsroom.bankofamerica.com/content/newsroom/press-releases/2021/10/bofa-global-research-launches-coverage-of-digital-assets.html> accessed 17 October 2021.
2  Kateryna Solodan, ‘Legal Regulation of Cryptocurrency Taxation in European Countries’ (2019) 6(1) European Journal of Law and Public Administration 65 <https://doi.org/10.18662/eljpa/64> accessed 17 October 2021.

highly secure transactions outside of the participation of banking institutions.3 Notwithstanding their great success and growing diffusion, crypto-assets have raised concerns under several profiles due to their very nature and characteristics, especially with regard to their pseudo-anonymity and global scope, leading to a substantial risk of non-reporting of taxable income, tax evasion and revenue loss.4 Focusing on the EU scenario, at the present state the tax treatment of such assets appears very differentiated and fragmented from one state to another.5 For example, in Germany, where cryptocurrency is considered a private asset, gains are subject to income tax only within the first year of holding, rather than to capital gains tax; in Spain, on the other hand, cryptocurrencies are subject to personal income tax

3  ibid (n 2).
4  Luisa Scarcella, ‘Exchange of Information on Crypto-Assets at the Dawn of DAC8’, Kluwer International Tax Blog (29 March 2021) <http://kluwertaxblog.com/2021/03/29/exchange-of-information-on-crypto-assets-at-the-dawn-of-dac8/> accessed 17 October 2021.
5  Nana Ama Sarfo, ‘The EU’s Cryptoasset Tax Strategy Needs Coordination’ (2 August 2021) <https://www.forbes.com/sites/taxnotes/2021/08/02/the-eus-cryptoasset-tax-strategy-needs-coordination/?sh=636367cb2105> accessed 17 October 2021; Oliver R. Hoor and Marie Bentley, ‘Crypto Assets are Focus of Upcoming Exchange of Information—DAC8’ (15 March 2021) <https://news.bloombergtax.com/daily-tax-report-international/crypto-assets-are-focus-of-upcoming-exchange-of-information-dac8> accessed 17 October 2021.


while capital gains tax applies to the realised profits.6 As emerges from the PwC Annual Global Crypto Tax Report 2020, a wide variety of solutions can also be found globally in the categorisation of the assets and the kinds of taxed activities across jurisdictions, ranging from capital gains to VAT issues related to the use of payment tokens, and from direct taxation of mining income to Initial Coin Offerings (ICOs).7

Nevertheless, endeavours to provide a common regulatory framework at the European level for different aspects of the growing industry have emerged in recent years. Indeed, the Member States, as expressed in Council Directive 2011/16/EU on administrative cooperation in the field of taxation (‘DAC’), have agreed to develop cooperation directed at ensuring the correct application of taxes to their taxpayers while at the same time combating tax fraud and evasion. Such an objective, as enshrined in the Directive, requires the competent national authorities to establish all the necessary procedures and to provide a proper structure and framework for cooperation in the application of direct taxation.8 At the current state, however, despite the amendments made through the years, the Directive, while requiring financial intermediaries to report to tax administrations and encouraging the exchange and circulation of information between states, still does not contain any specific obligation for intermediaries to report crypto-assets or e-money.9 Within this scenario, in November 2020 the Commission issued an Inception Impact Assessment with regard to an eventual future proposal for an EU Council Directive aimed at further amending the ‘DAC’, with the intent of expanding its application to crypto-assets and reinforcing the whole framework of cooperation on the matter.
At the current stage, the Member States have expressed comments on the subject through questionnaires, which last June closed the

6  Capital.com Research Team, ‘Crypto taxes 2021: A guide to UK, US and European rules’ (14 October 2021) <https://capital.com/crypto-taxes-2021-a-guide-to-uk-us-and-european-rules> accessed 17 October 2021
7  PwC, ‘PwC Annual Global Crypto Tax Report 2020’ <https://www.pwchk.com/en/research-and-insights/fintech/pwc-annual-global-crypto-tax-report-2020.pdf> accessed 17 October 2021
8  Oliver R. Hoor, Marie Bentley (n 5)
9  Ibid

public consultation, and the adoption by the Commission is currently planned for the third quarter of 2021.10 In the Commission’s intention, the thus amended ‘DAC8’ would provide a crucial leap towards supplying tax authorities with information regarding taxpayers and their use of these assets, moving beyond the anti-money laundering measures already applied to providers of crypto-related services.11 The automatic exchange of such data would allow national competent authorities to effectively tax income or revenue sourcing, for example, from investments or payments operated through cryptocurrencies.12 The objective pursued by updating the Directive seems, in addition, particularly in line with the plan proposed by the OECD to include crypto-assets among the financial assets subject to the automatic exchange of information and the Common Reporting Standard (CRS).13 Such exigencies and concerns have emerged beyond the European landscape, as clearly expressed by the OECD itself in a recent report from 2020 dealing with the taxation of virtual currencies and pointing out the necessity of improving transparency on the matter.14 In the described scenario, therefore, such measures, pursuing increased protection against tax evasion as well as clarity and coherence in the regulation of a phenomenon with global ramifications, appear to be of growing importance, even in the European landscape, ensuring a potentially positive impact on economies and societies moving towards an ever more ‘digital era’.

10  Ibid; European Commission, ‘Tax fraud & evasion – strengthening rules on administrative cooperation and expanding the exchange of information’ <https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12632-Tax-fraud-&-evasion-strengthening-rules-on-administrative-cooperation-and-expanding-the-exchange-of-information_en> accessed 17 October 2021
11  Oliver R. Hoor, Marie Bentley (n 5)
12  Nana Ama Sarfo (n 5). In addition to such measures, the EU Commission’s Proposal for a Regulation on MiCA (Markets in Crypto-Assets Regulation) must also be mentioned, which appears, as stated in the document itself, “in line with the Commission priorities to make Europe fit for the digital age and to build a future-ready economy that works for the people” (Proposal for a Regulation of the European Parliament and of the Council 2020/0265(COD) of 24 September 2020 on Markets in Crypto-assets, and amending Directive (EU) 2019/1937), dealing for example with consumer protection, new licensing requirements and uniform rules for providers of crypto-related services. See Capital.com Research Team (n 6). Another potential key goal could be the formulation of an effective and comprehensive definition of crypto-assets, a matter that has to this day proved challenging. See Nana Ama Sarfo (n 5)
13  Luisa Scarcella (n 4)
14  Ibid


International Focus

Is the legal world prepared for AI medical malpractice?

CIVIL AND CRIMINAL LIABILITY AND AUTONOMOUS ROBOTIC SURGERY

Konstantinos Apostolos

National Researcher of ELSA Greece ILRG: Human Rights and Technology

The rapid development of modern medicine entails, inter alia, the ever-increasing use of AI and robots. Medical robots are a decisive factor for high-accuracy surgery and possibly better outcomes in rehabilitation, while their use contributes to the reduction of healthcare costs by enabling medical professionals to shift their focus from treatment to prevention and by making more budgetary resources available for better adjustment to the plethora of patients’ needs, life-long training of healthcare experts, and research.1 However, in contrast to the traditional definition of medical malpractice, the framework on civil and criminal liability for medical errors involving AI and robots seems vague.

On 21 April 2021, the European Commission took the initiative globally by proposing a holistic legal framework for the regulation of artificial intelligence.2 The framework follows a risk-based approach and categorises the possible uses of AI depending on the risk posed to the public interest or the interference with fundamental rights, forming a four-pillar system: minimal risk, limited risk, high risk, and de facto ban. Although the proposal addresses a variety of issues, such as the development, distribution and use of AI, a wide range of dilemmas remains unsolved, spearheaded by civil and criminal liability for the use of AI, especially in the field of robotic surgery.

A study carried out for the European Parliament can shed some light on this labyrinth. In cases where the doctor acts lege artis but the system errs, two different sets of liability rules apply: product liability rules, concerning the liability of the manufacturer,

1  Resolution 2018/C 252/25 of the European Parliament with recommendations to the Commission on Civil Law Rules on Robotics (2015/2103(INL)), 16 February 2017.

2  Proposal COM/2021/206 for a Regulation of the European Parliament and of the Council <https://eur-lex.europa.eu/legal-content/EN/ ALL/?uri=CELEX:52021PC0206> accessed 5 October 2021.



and liability based on medical law, under which proceedings are brought against the practitioner and/or the medical structure where the operation took place. Normally, the patient will sue the doctor and/or the hospital for having followed instructions that were incorrect. If held liable, the latter may then sue the producer in recourse. In the second situation, only medical liability rules apply, mainly regulated through the general principles of negligence.3 Still, numerous issues remain to be solved. Firstly, the causal link between a malfunction or error in the system and the damage is not always obvious, given that normally the system is not responsible for the final decision but merely provides an analysis on which the doctor may rely. On the one side, should the doctor be considered to have relied on the system when the final choice arguably falls to the doctor? On the other side, the doctor or hospital could sue the manufacturer under contract law, on the basis that the system does not deliver the promised performance and that there is thus a lack of conformity or a breach.4

But what happens in the futuristic possibility of a robot deciding? The actions of an autonomous robot, self-learning and adaptive to the conditions around it, could be damaging. Such harm would not be the result of its programming, and thus its acts would not be controlled and directed by any human.5 If robotic surgery is eventually proved to be more efficient than regular surgery, this could have consequences for medical malpractice standards. In this regard, the question is: should surgeons be held accountable for non-robotic surgery in case it performs worse, or should they be considered responsible for any kind of surgery they perform in general? Similarly, should the same professional standards as for human surgeons apply, or perhaps higher ones?6 Additionally, since the liability of a robot or AI program for its acts cannot be established, the liability for malpractice of autonomous robots is necessarily passed on to the people who manufacture, distribute, own and operate them. Consequently, only a human being, and not a surgical robot, can be regarded as criminally guilty for an autonomous robot’s fault. However, the circle of potentially liable people remains uncertain. If the robot is operated remotely, the surgeon should be held liable, unless the death of the patient is the result of a malfunction of the machine, in which case criminal liability is logically attached to the manufacturer. But in the scenario of signal loss during telesurgery, for example, who will be held liable?7 In conclusion, in light of only a few of the dilemmas raised by medical malpractice involving AI and robotic surgery, one can hardly argue that humanity is indeed ready for the digital era.

3  Andrea Bertolini, ‘Artificial Intelligence and Civil Liability’ (2020) Policy Department for Citizens' Rights and Constitutional Affairs, European Parliament.
4  Ibid 115.
5  Andreas Matthias, ‘The responsibility gap: ascribing responsibility for the actions of learning automata’ (2004) Vol. 6, Iss. 3, Ethics and Information Technology 181‐183.
6  Shane O'Sullivan et al., ‘Legal, regulatory, and ethical frameworks for development of standards in artificial intelligence (AI) and autonomous robotic surgery’ (2019) Vol. 15, Iss. 1, International Journal of Medical Robotics and Computer Assisted Surgery e1977.
7  Ibid e1975.


THE COSMETIC INDUSTRY AND AUGMENTED REALITY: BIOMETRIC DATA COLLECTION AND THE PRIVACY PARADOX

Amalia Kurniaputri

Secretary General of ELSA Tilburg

When it comes to cosmetics, buyers must see and test the product to decide whether it matches their skin tone or is eye-catching to wear—people are unlikely to buy cosmetics blindly and impulsively. As a result of the COVID-19 predicament, the cosmetic industry has been driven to be technologically innovative and provide such experiences with consumer-facing technology.1 Technology has now acquired a critical role in customer and shopping experiences, allowing customers to feel as if they were at a physical store while staying at home.2 Ulta Beauty's GlamLab virtual try-on tool, for instance, has increased consumer engagement fivefold with over 19 million shade try-ons.3 Ulta Beauty also recently unveiled Skin Analysis,4 which uses biometric data collection to analyse skin and provide

1  Martha Anne Coussement and Thomas J. Teague, 'The New Customer-Facing Technology: Mobile And The Constantly-Connected Consumer' (2013) 4 Journal of Hospitality and Tourism Technology.
2  Rosy Broadman, Claudia E. Henninger, Ailing Zhu, ‘Augmented Reality and Virtual Reality – New Drivers for Fashion Retail?’ in Gianpaolo Vignali, Louise F. Reid, Daniella Ryding and Claudia E. Henninger (eds), Technology-Driven Sustainability Innovation in the Fashion Supply Chain (Palgrave Macmillan, 2019)
3  Kristin Larson, ‘Beauty’s New Frontier: How Technology Is Transforming The Industry, From Virtual Reality To Livestreaming’ (Forbes, 9 January 2021) <https://www.forbes.com/sites/kristinlarson/2021/01/09/the-new-beauty-frontier-where-digital-amplifies-beauty/?sh=4c22100124f3> accessed 16 September 2021
4  Ibid

product recommendations for concerns such as hyperpigmentation and fine wrinkles. The feature of choosing and testing cosmetic products online has expanded to other brands: Maybelline New York with the ‘Maybelline Virtual Try-On Makeup Tool’ and Chanel with the ‘Chanel Lipscanner.’ On the other hand, as these technologies were designed to meet the needs and interests of specific customers, they may also expose individuals to identity-based attacks. This essay looks at how a cosmetic company, in this example Ulta Beauty, uses biometric data collection and processing through consumer-facing technologies while elegantly wrapping it in self-fulfilment activities. The privacy paradox in personalised shopping will also be addressed. Although Ulta Beauty is a US-based corporation that adheres to the California Consumer Privacy Act and the Schrems II ruling, the case may underline the need for privacy awareness in the shopping experience.

Biometric Data Collection in Virtual Try-On Makeup

Biometric data is personal information derived through technological processing of a natural person's physical, physiological or behavioural traits that allows or confirms that natural person's unique identity, such as


a face image or dactyloscopy data.5 In the present case, the data behind the personalisation facilities of the try-on make-up feature are gathered when customers provide biometric data or interact with the technology.6 The data collected and processed are used for marketing and sales strategies and for the other reasons disclosed to the customer at the time of collection. The fact that this data may be shared with third parties, expanding the circle to which the data are disclosed, poses an identity-based vulnerability considering the sensitive nature of the information. To put it another way, if an unauthorised user gains access to certain facial data, they may use that information to identify the individual and take whatever action they wish, whether lawful or unlawful. Since the GDPR has no precise rules for processing biometric data, the rights to privacy and data protection could be violated. According to Kindt,7 using facial images and biometric features for identification purposes may, in general, violate other fundamental rights such as freedom of expression and the right to assemble and associate. The biometric data gathered have also been seen as an aspect of “private life.” In S. and Marper v the United Kingdom,8 the Court expanded "private life" protection to include not just a person's name but also their physical and psychological integrity, as well as numerous aspects of their physical, social and ethnic identities. On that account, the necessity for appropriate and comprehensive protection should be regarded as a preventative measure against

5  Article 4(14) GDPR
6  Para. 9, ULTA.com Privacy Policy <https://www.ulta.com/company/privacy/> accessed 16 September 2021
7  E. Kindt, ‘Privacy and Data Protection Issues of Biometric Applications. A Comparative Legal Analysis’ (1st edn, Dordrecht, 2013) Para. 35, Chapter 4.1.1.3.1.
8  Para. 66, S. and Marper v the United Kingdom, European Court of Human Rights, Applications nos. 30562/04 and 30566/04

the emergence of the greatest risks, like covert identification and function creep. Recognising the risks associated with biometric technologies should become a primary focus for every country.

The Privacy Paradox relating to Cosmetic Customer Experiences with Virtual Try-On Makeup

The primary purpose of adopting virtual try-on features for customer experiences was to deliver a personalised experience. For that reason, many customers see only the benefits that serve their self-interest and overlook the risks, a phenomenon known as the privacy paradox.9 As per the Privacy Calculus Theory, since the apparent benefits outweigh the perceived risks, privacy issues are frequently neglected, resulting in information exposure in exchange for economic benefits, personalisation, convenience and social benefit.10 Additionally, a previous study has indicated that, despite online customers' worries about privacy, they occasionally willingly divulge personal information and accept being tracked and profiled in exchange for retail value and personalised services.11 Therefore, the privacy paradox poses the question of

9  Susanne Barth and Menno D. T. de Jong, ‘The privacy paradox – Investigating discrepancies between expressed privacy concerns and actual online behavior – A systematic literature review’ (2017) 34(7) Telematics and Informatics <https://www.sciencedirect.com/science/article/pii/S0736585317302022#bb0420> accessed 16 September 2021
10  Dave Wilson and Joseph S. Valacich, ‘Unpacking the privacy paradox: Irrational decision-making within the privacy calculus’ in Thirty Third International Conference on Information Systems, ICIS 2012 (2012) <https://aisel.aisnet.org/icis2012/proceedings/ResearchInProgress/101/> accessed 16 September 2021
11  Lemi Baruh, Ekin Secinti, Zeynep Cemalcilar, ‘Online Privacy Concerns and Privacy Management: A Meta-Analytical Review’ (2017) 67(1) Journal of Communication <https://doi.org/10.1111/jcom.12276> accessed 17 September 2021; S. Shyam Sundar, Hyunjin Kang, Mu Wu, Eun Go, Bo Zhang, ‘Unlocking the Privacy Paradox: Do Cognitive Heuristics Hold the Key?’ in M. Beaudouin-Lafon, P. Baudisch and W. E. Mackay (eds), CHI EA 2013 – Extended Abstracts on Human Factors in Computing Systems: Changing Perspectives (Association for Computing Machinery, 2013)
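The Privacy Calculus Theory described above is often formalised as a simple trade-off: a user discloses personal data when perceived benefits exceed perceived risks. A toy sketch of that calculus follows; every label and weight is invented for illustration and is not drawn from any cited study.

```python
def privacy_calculus(benefits: dict, risks: dict) -> bool:
    """Disclose personal data iff perceived benefits outweigh perceived risks."""
    return sum(benefits.values()) > sum(risks.values())

# Hypothetical weights a shopper might (implicitly) assign:
decision = privacy_calculus(
    benefits={"personalisation": 0.6, "convenience": 0.5, "discount": 0.3},
    risks={"profiling": 0.4, "data_breach": 0.7},  # risks are often under-weighted
)
# decision is True: disclosure despite stated privacy concerns, i.e. the paradox
```

The sketch makes the paradox concrete: because the perceived risk terms are systematically underestimated, the inequality tips toward disclosure even for privacy-concerned users.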


whether customers—and society in general—are fully ready for the digital era and conscious enough to set aside their biases in order to preserve their personal information. On the contrary, the privacy paradox is convenient for a business like Ulta Beauty. That is because the privacy paradox narrative defines the scope of corporate responsibility as relatively narrow: if customers are presented as relinquishing privacy when online, businesses have little to no responsibility to acknowledge or satisfy privacy protection.12 Such practices may be viewed as creating inadequate privacy safeguards, which may contribute to biometric data-related risks. Accordingly, Stefanie Pötzsch13 noted that, in order to prepare societies for the digital era and tackle the privacy paradox, businesses should develop tools and features designed to influence people's behaviour and increase privacy awareness. Lastly, Article 12(1) GDPR mandates taking appropriate steps to provide information in a comprehensible and easily accessible way. While Ulta Beauty's privacy policy is comprehensive regarding what data are gathered and processed and for what purpose, it may be impractical for certain people to read and analyse the policy. Customers' virtual try-on make-up experiences should be informed through specific application tools (for instance, a pop-up informed consent) about what data are gathered apart from cookies—because the two are not the same—rather than relying on automated individual decision-making.

Conclusion

In closing, the make-up try-on technology analyses faces and delivers individual personalisation for the cosmetics we pick through biometric data collection. Customers must carefully balance their desires and needs when utilising technological features that exploit personalisation, in order to preserve their privacy. When a person is digitally ready, they can leverage, and are aware of, the technologies they are using, including the consequences.
The act of choosing to be unaware has, ipso facto, a long-term influence on the privacy problem for both customers and businesses: the benefits do not outweigh the price of privacy. Consequently, both customers and businesses must share the burden of privacy awareness; businesses must provide tools to enhance privacy awareness, whereas customers must strengthen their sense of protecting their privacy.

12  Kirsten Martin, ‘Breaking the Privacy Paradox: The Value of Privacy and Associated Duty of Firms’ (2019) 30(1) Business Ethics Quarterly <https://www.cambridge.org/core/journals/business-ethics-quarterly/article/breaking-the-privacy-paradox-the-value-of-privacy-and-associated-duty-of-firms/F7A893CB4C537B0DB4CC3914DF9B9DFA> accessed 16 September 2021
13  Stefanie Pötzsch, ‘Privacy Awareness: A Means to Solve the Privacy Paradox?’ in V. Matyáš et al (eds), The Future of Identity (IFIP International Federation for Information Processing, 2009)



Artificial Intelligence and Legal Judgment Prediction: a blessing or a threat to the effectiveness of judicial systems?

MEET YOUR ROBOT JUDGE

Angeliki Konstantara
Member of ELSA Athens

Fair administration of justice constitutes a cornerstone of state practice and policy, ensuring the protection of fundamental human rights and freedoms against any public or private infringement. Such administration requires effective and reliable legal and judicial systems to fulfil the purpose of social stability and public order. Against this background, there have been successful attempts to integrate artificial intelligence into the judicial process, concerning not only civil law cases, such as divorces, but also criminal convictions. AI programs, namely Legal Judgment Prediction (LJP)1 and the Self-Attentive Capsule Network (SAttCaps)2, are currently very popular in the US and in Europe, prompting the adoption of a legal regulatory framework for the use of AI in judicial decision-making. The adoption of such safeguards is of vital importance so as to avoid the risks of technological growth to the detriment of human rights and fundamental freedoms while enhancing the effectiveness of legal systems. Thanks to access to legal judgment data, all three stages of a trial (pre-trial claims, in-court debate and the sentencing stage) are simulated online, and legally based factors are evaluated with case life-cycle and multi-task learning mechanisms.3

1  Luyao Ma, Yating Zhang, Tianyi Wang, Xiaozhong Liu, Wei Ye, Changlong Sun, Shikun Zhang, ‘Legal Judgment Prediction with Multi-Stage Case Representation Learning in the Real Court Setting’, arXiv:2107.05192v1
2  Yuquan Le, Congqing He, Meng Chen, Youzheng Wu, Xiaodong He and Bowen Zhou, ‘Learning to Predict Charges for Legal Judgment via Self-Attentive Capsule Network’, 24th European Conference on Artificial Intelligence (ECAI 2020), Santiago de Compostela, Spain
3  Supra note 1, 2
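The multi-task mechanism mentioned above, in which one shared case representation feeds several prediction tasks, can be caricatured in a few lines of Python. Everything here (the feature names, the weights, the two "heads") is invented for illustration only; the cited LJP systems learn such parameters from large corpora of real judgments with neural networks.

```python
# Toy sketch of multi-task Legal Judgment Prediction: one shared case
# representation feeds two task-specific "heads" (charge prediction and
# sentence estimation). All features and weights are hypothetical.

def shared_representation(case_facts: dict) -> list:
    """Encode case facts into a shared feature vector (toy encoding)."""
    return [
        1.0 if case_facts.get("injury") else 0.0,
        1.0 if case_facts.get("intent") else 0.0,
        case_facts.get("damage_eur", 0) / 10_000,
    ]

def linear_head(features, weights, bias):
    """A task-specific head: a weighted sum over the shared features."""
    return sum(f * w for f, w in zip(features, weights)) + bias

def predict(case_facts: dict) -> dict:
    z = shared_representation(case_facts)
    charge_score = linear_head(z, [0.8, 1.2, 0.5], -1.0)       # task 1: charge likely?
    sentence_months = max(0.0, linear_head(z, [6, 12, 3], 0))  # task 2: sentence estimate
    return {"charge_likely": charge_score > 0, "sentence_months": sentence_months}

result = predict({"injury": True, "intent": True, "damage_eur": 20_000})
# → {'charge_likely': True, 'sentence_months': 24.0}
```

Even this caricature shows why the approach worries lawyers: the outputs are entirely determined by which facts the encoding captures and how they are weighted, with no room for interpretation of the law or of the defendant's statement.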

This multi-role representation model seems promising, yet unreliable. However time-saving, cost-effective and unbiased it is considered to be, it does not guarantee fairness in judgments, owing to its lack of empathy, intuition and perceptiveness.4 Supporters of AI systems seem ignorant of the complexity of the judge's role in the systematic and teleological interpretation of the law in the light of abstract principles, such as good faith, and their application to the evidence submitted. It is certainly no exaggeration to state that the replacement of the human judge by algorithms undermines the substantial role of judicial decision-making in shaping morally upright individuals, and hence in reinforcing social order and solidarity. But what renders AI most problematic for the fair administration of justice is the possible violation of the right to a fair trial owing to the automatic adjudication of criminal cases. To be more specific, in accordance with the Sixth Amendment of the Constitution of the United States5 as well as with Article 6 of the European

4  Sourdin Tania, ‘Judge vs Robot: Artificial Intelligence and Judicial Decision Making’, UNSW Law Journal, Vol 41(4)
5  Sixth Amendment: "In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial, by an impartial jury of the State and district wherein the crime shall have been committed, which district shall have been previously ascertained by law, and to be informed of the nature and cause of the accusation; to be confronted with the witnesses against him; to have compulsory process for obtaining witnesses in his favor, and to have the Assistance of Counsel for his defence.”


Convention on Human Rights,6 the criminal procedure shall encompass the right of access to a court, emanating from the rule of law, as well as the equality of arms. The European Court of Human Rights has designated the principle of equality of arms as only one feature of the wider concept of a fair trial, which also includes the fundamental right that criminal proceedings should be adversarial,7 in the sense that both prosecution and defence must be given the opportunity to have knowledge of, and comment on, the observations filed and the evidence adduced by the other party.8 Even though AI methods seem to fulfil the requirement of an impartial and speedy trial, as both the processing of the data and the decision-making are carried out without delay, access to a court is compromised by the abolition of the oral procedure. The deprivation of the defendant's right to be heard in court might also undermine the presumption of innocence (in dubio pro reo), as the algorithms used by LJP systems might distort the meaning of a particular article, or even of the defendant's written statement, and thus lead to the unfair conviction of an innocent person.9 Last but not least, LJP mechanisms put the right to liberty in question, given that they are not sufficiently developed to carry out pre-trial assessments for the determination of a defendant's detention pending trial. Alarmed by these concerns, the European Commission for the Efficiency of Justice (CEPEJ) adopted in 2018, under the auspices of the Council of Europe in Strasbourg, a comprehensive European Ethical Charter on the Use of Artificial Intelligence in Judicial Systems and their Environment.10 This Charter proclaims principles intended to prevent the infringement of the right to a fair trial by automatic AI systems.
Among these five principles, the Principle of respect for fundamental rights ensures that the design and implementation of artificial intelligence tools and services are compatible with fundamental rights and the systematic

6  European Convention on Human Rights (ECHR), Article 6: “In the determination of his civil rights and obligations or of any criminal charge against him, everyone is entitled to a fair and public hearing within a reasonable time by an independent and impartial tribunal established by law.” <https://www.echr.coe.int/documents/convention_eng.pdf>
7  Brandstetter v Austria (Applications nos. 11170/84; 12876/87; 13468/87), ECHR Judgment of 28 August 1991, para 66
8  Ibid para 67
9  Franciska Zsófia Gyuranecz, Bernadett Krausz, Dorottya Papp, ‘The AI is now in session – The impact of digitalization on courts’, THEMIS Semi-Final D – Judicial Ethics and Professional Conduct, 2018
10  <https://rm.coe.int/ethical-charter-en-for-publication-4-december-2018/16808f699c>

interpretation of the law. The Principle of non-discrimination, which is also a prominent ECHR11 principle, is central to the equality of arms, as it prohibits any discrimination that might have an unjustifiable influence on the conviction of the defendant on the basis of sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status. When it comes to the processing of intangible data and certified sources, the Principle of quality and security guarantees the legitimate preservation of personal and sensitive data, with models elaborated in a multi-disciplinary manner in a secure technological environment, whereas the Principle of transparency, impartiality and fairness, as well as the Principle ‘under user control’, preclude a prescriptive approach and ensure the role of users as informed actors, capable of determining the scope of their data processing. It remains to be seen how all these principles will be applied in AI judicial mechanisms.

In conclusion, the digital era calls for the transformation of traditional methods towards flexibility and speed. Even though AI is well adapted to automobility and engineering, its application to humanitarian and social services remains dubious, especially as far as legal administration is concerned. The replacement of oral pleadings with multi-role mechanisms does not correspond to the defendant's need to be heard before the judges, while the judgment is based solely on the sterile evaluation of legal and case facts, without any interpretive ability. Considering the educative and humanitarian role of a judge in society, the development of AI mechanisms to the detriment of the support of the existing courts must be viewed with vigilance, especially in criminal procedures, where the burden of proof is beyond any reasonable doubt, rendering the judgment decisive for a person's life.
That does not entail, however, the indiscriminate rejection of AI technology in our legal mechanisms. AI technology has a lot to offer in terms of efficiency and therefore lawyers and judges could take advantage of its benefits when dealing with identical, typical civil law cases, such as contracts or consensual divorces. In any case, it should be humans who have the upper hand in justice administration, not mere algorithms.

11  ECHR article 14





HOW THE USE OF ELECTRONIC VOTING SYSTEMS COULD POSITIVELY IMPACT DEMOCRACIES AND WHAT THEIR POTENTIAL DRAWBACKS COULD BE

E-VOTING: OPPORTUNITY OR THREAT?

Eugenio Ciliberti
ELSA Alumnus

The Covid-19 emergency has raised new issues with regard to the exercise of democracy: in particular, concerns were expressed about new opportunities for citizens to cast their electoral votes in a safe and efficient manner, which led to new perspectives in the long-standing debate on e-voting. However quick and accurate this method may prove if implemented, there is still some criticism as to its credibility. Initially used in the US, e-voting then spread to other countries such as Brazil, India and the Philippines, as well as Estonia, where internet voting has been used since 2007. In other European countries, such as France and Belgium, electronic voting systems are used in a few districts, whilst in others, for example Germany and the Netherlands, they were adopted and soon afterwards ruled out after proving insecure and insufficiently transparent. Although their application may prove controversial and carry high risks of fraud, the benefits of these systems should not be underestimated. The most elementary one is that digital technology can be a worthwhile investment: expenditures would be incurred to purchase the equipment, train staff, update the software to prevent cyberattacks and secure equipment storage between elections, but the outcome would eliminate cases of multiple votes

and guarantee the fairness of voting procedures through the use of biometric identification. Further advantages concern the various phases of the voting process: voter registration, voter identity verification, vote casting, vote counting, and results transmission and tabulation. Digital voting would greatly simplify these steps and facilitate the operations in an easy and time-saving way. Also, the use of technology and internet voting would help solve the crucial problem of low turnout: voters physically unable to reach the polling station would be able to cast their ballot online. Furthermore, such systems would enable voters to verify that their vote is cast as intended, correctly recorded and counted: voters would not have a marginal role in this process but would be at the very core of it. Finally, a relevant position is occupied by the testing and verification of the technologies used: tests would be carried out by independent and competent bodies, as well as by the machines' manufacturers, with the aim of preventing possible fraud. Although the advantages of these methods are many, the drawbacks must be carefully considered. First, the vulnerability of the devices adopted to malware and other external attacks should not be minimised: this has been the case in many elections and in the


Netherlands led e-voting activists to campaign successfully for dropping the system. Connected to this is the risk of malfunction of these technologies, which could delay the process and spoil its outcome; and even when it works, e-voting is not infallible and could fail at any stage of the procedure, sometimes resulting in elections being annulled. Auditability is a fundamental matter too, because when the only record of a vote is digital, there is a chance of it being lost irretrievably, often due to hacking. Overall, electronic voting would have no value if these technologies did not comply with the principles set out in Article 25 of the International Covenant on Civil and Political Rights, consisting in every citizen's right and opportunity “to vote and to be elected at genuine periodic elections which shall be by universal and equal suffrage and shall be held by secret ballot, guaranteeing the free expression of the will of electors”. In the absence of any other international standards addressing the specific characteristics of digital voting, the Council of Europe has adopted a recommendation on standards for e-voting, including the following indications: voters should be reliably identified; voter interfaces should be easy to understand and use for all voters; voters should have the chance to confirm their vote before casting it; after casting their vote, voters should be able to check that it has been correctly cast;

voting should be anonymous; all aspects of the vote must be fully transparent; and electronic voting systems must be tested and certified by an independent body. These basic criteria, alongside the best practices of States that have succeeded in implementing such systems, show the path for an evolution in the way we conceive democracy. New technologies, in fact, could also lead to a change in citizen participation, affecting the possibility of strengthening citizens' voice in politics and governance, creating political spaces for activism, promoting increased government accountability and providing avenues of communication with public officials. Strengthening the community's voice in public debate and decision-making is a key factor in strengthening the demand for public participation, accountability and transparency, and e-voting could be an essential part of this process: in this sense, civil society actors have a major role in raising awareness and building understanding of the potential of ICTs in the implementation of democratic principles and practice. In addition, on the institutional level, a regulatory policy positively impacting both the ICT infrastructure and the social and political climate – including freedom of information and expression – would contribute to creating the ideal environment for the proper development of electronic voting systems.
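The criterion that voters should be able to check that their ballot has been correctly cast can be illustrated with a minimal hash-commitment sketch: the system publishes a commitment to the recorded ballot, and the voter recomputes it from a private receipt. This is a toy sketch only; deployed end-to-end verifiable systems rely on far more elaborate cryptography (mixnets, homomorphic tallying) precisely because a naive receipt like this one would enable vote-selling and undermine ballot secrecy.

```python
import hashlib
import secrets

def record_ballot(choice: str) -> tuple:
    """Bulletin-board side: store the ballot and hand the voter a receipt.
    The random nonce keeps the published commitment from leaking the vote."""
    nonce = secrets.token_hex(16)
    commitment = hashlib.sha256(f"{choice}|{nonce}".encode()).hexdigest()
    # In a real system only `commitment` is published; choice and nonce stay private.
    return commitment, nonce

def verify_receipt(choice: str, nonce: str, published_commitment: str) -> bool:
    """Voter side: recompute the commitment and compare with the board."""
    return hashlib.sha256(f"{choice}|{nonce}".encode()).hexdigest() == published_commitment

commitment, nonce = record_ballot("candidate_a")
ok = verify_receipt("candidate_a", nonce, commitment)        # True: cast as intended
tampered = verify_receipt("candidate_b", nonce, commitment)  # False: record altered
```

The sketch shows why verifiability and anonymity pull in opposite directions: the very receipt that lets a voter detect tampering could also prove to a third party how they voted, which is why the Council of Europe's criteria must be satisfied jointly rather than one by one.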





