

2020 Vision: Man VS. Machine


Democracy in a Digital World: Reality or Illusion?


Content Moderators — Is This the Worst Job in Technology?


The Tech Revolution / VOLUME ONE /

The Full Bench
Volume One 2020
The Tech Revolution

Editor-in-Chief: Isabella Marriott
Editors: Campbell Rice, Milica Lazarevic, Andrew Huynh, Anthea Dinh-Tram
Designer: Judith Tan
Special thanks to Samuel Guzman (President) and Nicholas Plessas (Vice President of Education)

The Full Bench is published bi-annually in Sydney by:
UTS Law Students’ Society
61 Broadway, Ultimo NSW 2007
UTS Central, Level 14, Room 104
www.utslss.com

© UTS Law Students’ Society. This publication is copyright. Except where permitted under the Copyright Act, no part of this publication may in any form or by any means (electronic or otherwise) be reproduced, stored in a retrieval system or transmitted by any process without the specific written consent of the UTS Law Students’ Society. Enquiries are to be addressed to the publishers.

Acknowledgement of People and Country
The Full Bench and UTS acknowledge the Gadigal and Guring-gai people of the Eora Nation upon whose ancestral lands our university now stands. We would also like to pay respect to the Elders both past and present, acknowledging them as the traditional custodians of knowledge for these places.

Images and illustrations
Unless provided by the designers and editors or commissioned specifically for the purpose of this publication, uncredited photographs have been sourced from Unsplash.com, licensed under the Unsplash License.

Disclaimer
All expressions of opinion published in The Full Bench are not the official opinion of the UTS Law Students’ Society unless expressly stated. The UTS Law Students’ Society accepts no responsibility for the accuracy of any opinions or information contained herein, and readers should rely on their own enquiries to make decisions in their own interest.

Want to contribute to the next edition? Email publications@utslss.com


Contents

Editor-in-Chief’s Welcome


President’s Welcome


Vice President of Education’s Welcome


Editorial Team Welcome


The 4th Revolution — Technology: Law in the Digital Age


2020 Vision: Man VS. Machine


Is a ‘Bot’ Really Going to Take Over?


The Responsibility Gap


Legal Tech — Paving the Way for a “Greener” Legal Industry?


Under Surveillance


Democracy in a Digital World: Reality or Illusion?


Content Moderators — Is This the Worst Job in Technology?


A Tricky Tightrope — Can Legal Technology Facilitate Access to Justice for Domestic Violence in Regional Areas?


Editor-in-Chief’s Welcome BY ISABELLA MARRIOTT

On behalf of the UTS Law Students’ Society (LSS) and Publications Subcommittee, we would like to wish you a warm welcome to the inaugural 2020 volume of the LSS academic journal, The Full Bench. If the first half of 2020 could be encompassed by one word, it would undoubtedly be ‘coronavirus’. The spread of this pandemic has seen businesses shut, jobs lost, and our everyday activities moved online. Technology has become more a part of our lives than ever before, with offices and universities moved to the comforts of our homes. The question we must ask, therefore, is how will this increase in technology impact the law? Prior to the pandemic, the legal industry was already beginning to utilise new technologies to make its processes more efficient. E-Discovery, practice management, and the decreased use of paper are becoming the norm in many firms around the country. Further, as artificial intelligence (AI) becomes a staple of many technology companies, our judicial system has had to question whether humans can assume liability for actions completed by computers. This increase in technology has also impacted our right to privacy. Data breaches, increased surveillance and the misuse of personal information have created scandal after scandal, with our legal system attempting to protect individuals from becoming over-surveilled. But, is it too late? These crucial ideas and many more are explored in this edition of The Full Bench — The Tech Revolution. I would personally like to thank all contributors to The Full Bench, my editorial subcommittee, and our designer Judith Tan for their hard work. I would also like to thank the LSS Vice President of Education, Nick Plessas, and our President, Sam Guzman, for their ongoing support. Enjoy the read!


President’s Welcome BY SAMUEL GUZMAN

A warm welcome to the first edition of The Full Bench for 2020, the official academic journal proudly brought to you by the University of Technology Sydney Law Students’ Society (UTS LSS). In this edition, we explore the impact of technology on the law. Technological developments have continued to disrupt various sectors, with the legal industry being no exception. As law students in this day and age, it is critical for us to remain aware of these developments, and to understand and evaluate their effect on the law and legal profession in the past, the present, and importantly, the future. How has artificial intelligence increased efficiency within the legal workplace? What effect does legal automation have on the environment? What are the different challenges brought forth by legal technology on democracy, privacy and regulation? What does all of this mean for the future of the legal profession? These questions will all be considered by the thought-provoking articles contained in the following pages. This publication would not be possible without the tireless efforts of an incredibly talented team. I would like to extend my thanks and congratulations to Isabella Marriott, Publications Director; Nicholas Plessas, Vice President (Education); the Education Publication Subcommittee; and our designer, Judith Tan. Finally, I would also like to express my gratitude to our sponsors for their continued support, and of course, our brilliant contributing members whose articles serve as the heart of this publication. I hope you enjoy reading this edition of The Full Bench.

Vice President of Education’s Welcome BY NICK PLESSAS

Welcome to our first instalment of The Full Bench for 2020 — The Tech Revolution. Modern advancements in technology are moving at a blinding pace, and many believe we are approaching a fourth industrial revolution. Industrial revolutions are characterised not only by their pace but also by the impact they have on industries, economies and our way of life. These advancements in technology are evident in the innovative ways individuals and businesses are adapting to work around the global lockdown during which this journal is published. The adaptation of many businesses to operating remotely is creating significant discussion about the changes we may see to the traditional work week. As flexible work weeks may have just had the proof of concept they needed, businesses will need to adapt to increasing demand for a greater work-life balance facilitated through technology. The many ways in which the legal industry will be affected by this technological shift are explored in the articles contained in this edition, such as the many different ways in which NewLaw is adapting to improve the provision of legal services. Additionally, this shift will bring about cultural changes which will demand action from the legal industry, such as the threats it poses to our democracy. I would like to recognise the magnificent work of our Editor-in-Chief, Isabella Marriott, her team of editors, and all our contributors who provided this journal with their fantastic insights. On behalf of the UTS Law Students’ Society, I would like to thank you for supporting our student publications, and I hope you enjoy this edition of The Full Bench.


Editorial Team Welcome CAMPBELL RICE Bachelor of Laws In my article, I discuss the rise in ‘self-learning’ technology, and the implications this has for deciding who is responsible for the actions of algorithms. In particular, the article focuses on whether algorithms can breach section 18 of the Australian Consumer Law (ACL) by engaging in misleading or deceptive conduct, and if so, looks at who is responsible if consumers are misled or deceived. As ‘self-learning’ algorithms can automatically improve their knowledge and processes through experience, it is conceivable that a computer could teach itself to mislead or deceive consumers, with no human input. As an algorithm is unlikely to come within the definition of ‘person’ that is required for a breach of s 18 of the ACL, the responsibility would likely fall to the programmer or company that uses the software. This article explores the difficulty of navigating new technology that can enhance business efficiency, but can also pose a great risk for businesses that choose to use it.

MILICA LAZAREVIC Bachelor of Economics and Bachelor of Laws The hope of the digital revolution was that it would facilitate the democratisation of society. Inventions such as the internet and social media have the capacity to fulfil the reformist ideas underlying democracy, for example, by offering an abundance of information, and with it knowledge that can empower the unempowered. However, are these benefits making us blind and prone to being exploited in both a political and commercial sense, leading to digital totalitarianism? Does the monopolisation of big tech companies make the powerful more powerful? My article delves into the enigmatic and dark side of the digital revolution by exploring the notions of surveillance capitalism, digital panopticism, media populism, and the monolithic power of giant tech companies from Silicon Valley, all of which seem to normalise the invasion of privacy, misinformation, surveillance and the manipulation of individuals. My article contends that if these issues are not adequately addressed and major tech companies are left unregulated, severe challenges will be posed to upholding the core aspects of democracy.

ANDREW HUYNH Bachelor of Laws and Bachelor of Business I am currently completing an internship at a legal tech firm called Lawpath. I encourage all law students to gain some experience outside of law school. This may include the legal industry, legal centres, or overseas practice. It has deeply enriched my experience and understanding of the law, as it allows me to use the law in a practical sense. I also undertook work experience at a small suburban law firm during high school, an experience that began my passion for the law. I have a great interest in both the areas of law and technology. I like the idea that we are able to make connections between technology and law in this ever-developing society. My article delves deeper into the legal implications of the disruption of technology, especially with the advent of artificial intelligence. The big question that clouds our pursuit of a legal career is the impact of technology. Is technology in the legal industry a friend or foe? Here we answer some of the big questions facing the legal industry today. Grab a coffee and enjoy the read.

ANTHEA DINH-TRAM Bachelor of Communications (Public Communications) and Bachelor of Laws With experience working in social media management and legal publishing, I am currently interested in the areas of media and intellectual property law. With STEM said to be the direction of the future and law grounded in the past, what happens when the unlikely pair meet? Of course, a revolution! However, will that allow lawyers to access libraries from their home, contact clients across the country, and dictate their work to the screen? Or, will it force them to compete with burn-out proof machines? This is something we can only wait to see. At least, for now, it can be accepted that technology is aiding the legal profession during the COVID-19 pandemic. The situation has seen Australian lawyers use virtual courtroom facilities and meet clients online. So, we must wonder, have the gates to technological development in law been opened? Or, will they shut again once the pandemic is over? These thoughts are only a fraction of what the contributors to this issue of The Full Bench contemplate. So, please read on and enjoy this issue of The Full Bench!


Staying true to your direction is what defines Clayton Utz. We’ve built a culture that’s unlike any other law firm, but don’t just take our word for it. A good lawyer needs compelling evidence so meet our people and judge for yourself. claytonutz.com/graduates

Academic brilliance certainly counts, but graduates who thrive here have something extra – a natural passion for connecting with people and a strong sense of self. That’s what staying true is all about. If you have these qualities, Clayton Utz is for you.


The 4th Revolution — Technology: Law in the Digital Age KURT CHENG

It is no surprise that in an industry steeped in history and tradition, new ways of working are often difficult, albeit not impossible, to adopt. With a trending focus on workplace environments, employee wellbeing, and inclusive culture and diversity, many industries are increasingly shifting towards cultures of open collaboration rather than sole independent work. The era of technological revolution has witnessed changes within legal practices gearing towards law in the digital age, with technology driving this phenomenon. This contemporary way of working has been adopted by Australian courts too, which are increasingly turning to new technologies as a means to support administrative flow and access to justice. Likewise, legal education in Australia is no exception to the changes technology presents in preparing the students of tomorrow for the industry of the future.

The Future of Legal Services

Increasingly, the legal services industry is reliant on technology, though the extent of that reliance varies widely. Whether it is a suburban law firm utilising trust accounting software or commercial firms adopting automation technology to form custom trust deeds, legal services have come a long way. As we explore ‘The 4th Revolution — Technology’, we must question: what other possibilities are we yet to unlock? The answer has never been simpler: plenty. Technology has unlocked possibilities one would never have imagined some decades ago, even as we sit here speculating about what else we will achieve. If we look at the legal services industry objectively, we can see that many of the monotonous and repetitive processes one would usually complete manually are increasingly

replaced by technology. Whether that is filing court documentation or preparing contracts for clients, through to blockchain technology and artificial intelligence (AI), the law in the digital age has certainly streamlined a large proportion of what employees once did manually, while introducing new areas of law too. But how can we possibly reach further? Tomo Hachigo, Co-Founder at Sprintlaw, suggests that technology for project management, for example, can be crucial in “predicting workloads and managing resources accordingly”,1 enabling progress monitoring to ensure that time spent on work produced is valuable to the client while ensuring the effective deployment of resources. Workplace culture has obvious impacts on mental health and employee wellbeing. Hachigo suggests a prevalent attitude of “let’s put on more bodies and we’ll get there, without real planning or resource management” presents a number of issues that can be easily remedied or avoided completely. Digital technology can be looked to for the effective use of resources, from progress tracking to maintaining rapport through client relationship management (CRM) systems, as well as services made widely available such as Lawbots2 that are able to identify key ‘legal and tax issues for any small business with employment law, contract law, tax and topic related legal questions’. These examples exist at present and are nowhere near hypothetical. Emerging legal technology-based law firms, such as Sprintlaw, LegalVision and Lawpath, adopt new ways of working with agility in the digital age of law. These firms are increasingly forming a cornerstone of the new age of law in the digital world, and leading the way for the future.


Digital Literacy

Viewing the legal services industry as a whole, we must account for the learning, training and education required for future practitioners and advocates to play their part within society. Increasingly, legal education in Australia has seen major changes in response to technology and societal values. In order to prepare graduates for law in the digital age, amid great uncertainty about the developments that will occur, law schools are evidently teaching in ways that foster collaboration, critical thinking and unconventional approaches to learning. Coinciding with the changes in the law through technology and society’s shifting values, Australian courts are also changing the way justice is administered. By providing both an expeditious process and greater administrative flow, modern courtrooms have adopted modern technologies, including video conferencing tools for remote witness testimony and electronic court filings to reduce administrative burden. Reflecting these modern changes, the environment in which students undertake their studies evidently mirrors the ‘open plan’ style of modern workplaces, fostering a culture of collaboration that accommodates the technology and tools of the courts. This is evident at the University of Technology Sydney (UTS), where new Moot Trial Courts3 unveiled in 2020 mirror the style of contemporary Australian courtrooms; Professor Lesley Hitchens, Dean of the Faculty of Law, states, “[i]n this sense, it reflects who we are … modern, transparent, future-focused and practical”.


The Challenges

While technology provides a platform for innovation, curiosity and practicality, it also presents challenges when attempting to prepare students for an industry still steeped in tradition, with uncertainty about what that future might look like. Whatever the digital age of tomorrow presents for graduates of the future, law schools across Australia must equip students with the traditional strengths of legal education, while ensuring the agility to adapt to a digital world of delivering legal services. With Australian courts embracing technological advances and institutions of legal education adopting new ways of working in preparation for the future workplace, the digital age of law will present both new discoveries and surprises that we will be prepared to face.

Conclusion

As the world evolves, industries form, develop or change; the future outlook of a world of technological innovation is unclear. What we do know is that we have come a long way through technology, and the law in the digital age has seen unimaginable changes in the way the industry operates, in the administration of justice and in the preparation of future practitioners. It is paramount that we build on what technology has presented to form better ways of working in an industry steeped in history and tradition, and in what we are empowered to create.

1 Tomoyuki Hachigo, ‘Building a law firm that doesn’t rely on a culture of overwork’, LinkedIn (30 January 2020) <https://www.linkedin.com/pulse/building-law-firm-doesnt-rely-culture-overwork-tomoyuki-hachigo/>.
2 Sprintlaw, ‘Lawbots’ (March 2020) <https://sprintlaw.com.au/freebies/>.

3 UTS Faculty of Law, ‘A view to the future: UTS Law in UTS Central’ (6 March 2020) <https://www.uts.edu.au/about/about-our-campus/news/view-future-uts-law-uts-central>.


2020 Vision: Man VS. Machine

“AI empowers lawyers to spend more time on analysis, counselling, negotiations, and court visits.” (Andrew Arruda, creator of the first artificially intelligent lawyer)

The year was 1997. The world chess champion, Garry Kasparov, had just been defeated by an IBM supercomputer named Deep Blue in a game of chess. This momentous occasion revealed that Artificial Intelligence (AI) may outperform humans at cognitive tasks. In the past, the legal profession had underestimated the potential of AI, much like Garry Kasparov did. It appeared that AI had reached a stage where it could outsmart humanity, at least in a game that had long been considered ‘too complex’ for machines. This was one of the first signs of the proliferation of technology that surrounds us today, and it foreshadowed the legal tech revolution. And this happened only 23 years ago. In the year 2020, the age-old debate remains: “will machines replace humans?”, and the inevitable answer is not ‘if’, but ‘when’. This instils a fear that ripples through the legal profession. The CIO/Head of Technology at Lander & Rogers, an Australian corporate law firm, noted, “there will be a radical transformation of traditional workplaces and business models…some roles made redundant and replaced by sophisticated algorithms…it’s predicted there will be 30%–40% less legal work in the future”.1 So, will lawyers ever be replaced?

What is AI?

AI simulates the cognitive processes of the human mind, which enables computers to complete basic job functions. When applied to algorithms, AI allows computers to interpret data, recognise patterns, and form conclusions.

Removing the Drudgery from Due Diligence: Legal Research and Due Diligence

Machine learning is a subdiscipline of AI through which systems can be taught to recognise sophisticated concepts. This advancement makes it viable for firms to filter through piles of information quickly. The ability of AI to rapidly confirm facts speeds up background research, which can accelerate arbitration and litigation proceedings. The due diligence research process broadly splits into two main areas:

1. Information Discovery
Firms determine how the information fits into the due diligence case. Then, they make connections between findings and distil information to fit into the framework of a broader story. Effectively, firms condense information into a more digestible form for the research report’s reader.

2. Information Synthesis
Although human researchers remain the primary drivers of information synthesis, AI has become effective in laying the foundations for what is established as e-Discovery. The more effectively machines can utilise e-Discovery, the more time due diligence researchers will have to drive profound synthesis.
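As a toy illustration of the kind of filtering an e-Discovery workflow performs (a minimal sketch only; commercial platforms use machine learning models, not simple term counts, and the documents and query here are invented for the example):

```python
# Toy e-Discovery-style relevance ranking: score each document by how many
# distinct query terms it contains, then surface the best matches for human
# review. Illustrative sketch only, not a real e-Discovery engine.

def score(document: str, query_terms: set) -> int:
    """Count how many distinct query terms appear in the document."""
    words = set(document.lower().split())
    return len(words & query_terms)

def rank_documents(documents: list, query: str, top_n: int = 3) -> list:
    """Return the top_n documents most relevant to the query."""
    terms = set(query.lower().split())
    return sorted(documents, key=lambda d: score(d, terms), reverse=True)[:top_n]

docs = [
    "Board minutes discussing the merger agreement and indemnity clauses",
    "Catering invoice for the staff party",
    "Email chain about the merger due diligence timeline",
]
print(rank_documents(docs, "merger due diligence", top_n=2))
```

Even this crude ranking shows why such tools free up researcher time: the irrelevant invoice drops to the bottom without anyone reading it.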

1 ‘AI replacing lawyers?’, The Shift (online, 31 December 2019) <https://theshift.media/ai-automation/ai-disrupting-law-7zh7c?fbclid=IwAR27w-NPEQubByC5HepLTRLD06u9j1BuVl5xA20EVZXQfZ50WEQ-MdFbRg>.


Should I Settle?: Predicting Outcomes

By analysing past legal data, AI can provide insights into future outcomes through predictive analytics. AI systems can store years’ worth of legal documents and filter through them all to predict the chance of winning relevant cases. AI can provide lawyers with insight on related cases and assist them in accurately answering client questions. This could involve predicting a judge’s position in litigation, or an examiner’s allowance of a patent application based on previous rulings. AI can reveal when judges reuse similar language or follow certain patterns, which can increase a lawyer’s odds of winning. An example of this is Lex Machina, a company that uses predictive analytics. It has high-profile clients such as Akin Gump, Ford and Samsung. By filtering and analysing data from previous cases, it reveals connections and makes predictions about judges, lawyers, parties, and even the subjects of the case. It also helps legal departments select and manage outside counsel.
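At its simplest, this kind of prediction is just aggregation over historical outcomes. The sketch below (illustrative only; tools like Lex Machina use far richer data and models, and the judges and case records here are invented) estimates a favourable-ruling rate per judge from past results:

```python
# Toy predictive-analytics sketch: estimate the chance of a favourable ruling
# from historical outcomes before a given judge. Illustrative only.
from collections import defaultdict

def win_rates(history):
    """history: (judge, outcome) pairs, with outcome in {'win', 'loss'}.
    Returns each judge's observed win rate."""
    wins, totals = defaultdict(int), defaultdict(int)
    for judge, outcome in history:
        totals[judge] += 1
        if outcome == "win":
            wins[judge] += 1
    return {judge: wins[judge] / totals[judge] for judge in totals}

past_cases = [
    ("Judge A", "win"), ("Judge A", "win"), ("Judge A", "loss"),
    ("Judge B", "loss"), ("Judge B", "loss"), ("Judge B", "win"),
]
rates = win_rates(past_cases)
print(rates)
```

A lawyer advising on settlement could compare such a base rate against the cost of proceeding, which is the basic value proposition of predictive analytics.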

Automation: Practice Management

Practice management is the use of AI-integrated management systems that aim to reduce the time staff spend on repeatable tasks, automating work such as billing, legal research and e-Discovery. AI excels at processing data, but it is weaker in areas requiring emotional intelligence and human judgement. AI’s superior processing skill is beneficial in complex areas such as taxation law. In contrast, humans will always be better at negotiation, mediation and making ethical judgements.


Changing Cost Structures

Technology reduces the costs associated with representing a client and running a business. Activities such as intake, research, discovery, brief writing, and managing client relations are becoming digitised.

More Cases

Market pressure drives down legal costs. More cases will be run, and more deals completed. AI reduces those barriers, creating more work for lawyers than is currently possible.

Newer Ways to Practise Law

Lawpath is an example of an Australian legal-tech business that informs clients of new technologies and information relating to technology law. Founded in 2013, Lawpath has grown to provide legal services to over 60,000 clients based on the sharing economy model. The business has evolved into an online marketplace where legal firms pitch for work submitted by prospective clients.


The legal field is a great example of how AI and humans can collaborate to improve the quality, cost and timeliness of legal delivery. The last 20 years were great for innovation in legal research. The virtuous circle of innovation drove competition and inspired new technology. The legal profession will continue to benefit from this disruption to knowledge work. Lawyers should not fear AI invading the legal industry, but instead recognise the advantages of embracing it. It is exciting to anticipate the power that new tools in the next decade could bring to the legal workplace.


Is a ‘Bot’ Really Going to Take Over? MILAN SHARMA

Robot: “I feel comfortable using legal jargon in everyday life.” Elle Woods: “I object!”

COVID-19: the global pandemic that has effectively eradicated all sense of normality for the foreseeable future. In a matter of months, the world has adapted to working-from-home conditions, practising self-isolation to prevent the spread of the virus. The pandemic has forced the global population as a whole to depend heavily on technology. For the legal sector, particularly in NSW, this has meant transitioning hearings to telephone or audio-visual links in order to limit face-to-face exposure. Such times of pressure have exposed technological gaps, highlighting the importance of technology, especially in the legal sector, to maintain law and order within society.

Never before have we seen technology move at such a rapid pace as it is now. We are currently entering the Fourth Industrial Revolution: the technological revolution. This revolution is set to change the way we think and operate, with Artificial Intelligence (AI) at the forefront. AI is a branch of computer science that aims to answer Alan Turing’s famous question, ‘Can machines think?’. The technology is already commonplace: Siri and Alexa, for instance, are used for smart searches. The hysteria surrounding AI has been met with both excitement and criticism. However, in the legal sector, it hasn’t taken off as rapidly as in other industries such as medicine and finance.

Richard Susskind OBE predicted in his 1996 book The Future of Law that email would become the primary form of communication between lawyers and clients.1 This opinion, somewhat controversial at the time in what was an archaic legal sector, now indeed describes one of the principal means of communication between legal counsel and clients. COVID-19 may serve as a catalyst to transform the traditional legal sector, with the potential launch of more legal technology start-ups both in Australia and globally. However, there remains the universal primary concern that the ‘Bot’ will take our jobs.


In recent years, legal technology has increasingly become a necessity to drive efficiencies across legal teams both internationally and domestically. Legal technology encompasses a wide range of software used to help process proceedings, analyse cases and judgments, and create databases. Its capability ranges from common databases such as Westlaw and LexisNexis Advance to AI-based legal research platforms such as ROSS Intelligence and CARA. While technology such as AI has been perceived as a threat to the law, it should be viewed more as a meticulous aid that can summarise legal documents or answer legal questions at a faster rate, thereby producing work of superior quality. AI is still in its infancy, but IBM Watson Machine Learning is at the forefront, helping businesses across multiple industries harness AI. IBM Watson technology underpins LegalMation, which has technology enthusiasts excited but has lawyers just entering the workforce concerned. LegalMation is an AI product that helps legal teams draft high-quality litigation documents within minutes, thereby cutting down costs and ‘letting attorneys spend more time on work that adds value to client relationships.’2 Similarly, Luminance, a United Kingdom-based company, also uses machine learning to analyse documents much as we humans would. The idea behind these two forms of software is not to replace a lawyer, but to provide ‘additional intelligence and analysis on what may be hundreds or thousands of pages and saving time and money.’3 More interestingly, in the United States and the United Kingdom, several law firms have bought ‘robotic attorneys to perform legal research and answer questions.’4 These robots have yet to replace lawyers, and it is unlikely that they will.

1 Bernard Marr, ‘The Future of Lawyers: Legal Tech, AI, Big Data and Online Courts’, Forbes (Online, 17 January 2020) <https://www.forbes.com/sites/bernardmarr/2020/01/17/the-future-of-lawyers-legal-tech-ai-big-data-and-online-courts/#3513f9b3f8c4>.

A 2018 study conducted by Deloitte focused on technology and litigation in order to better understand how technological solutions could aid in-house counsel. The study found that approximately ‘114,000 legal jobs are likely to be automated in the next 20 years and that technology has contributed to the loss of about 31,000 legal sector jobs.’5 However, the study was adamant in reinforcing the notion that a ‘Bot’ cannot take over the role of a lawyer. Emphasis was placed on the need for emotion and for understanding the circumstances of the other parties in play, supporting the overall outlook that AI and legal technology are to be used as aids. The quantitative loss of jobs is a daunting prospect, but as a society we have to understand that, much as in the industrial revolution, different types of jobs will be generated. The United States is taking this a step further. Legal technology start-ups are being used as a way to provide access to justice for those who do not possess the means to go through the expensive process of hiring a lawyer. In both the United States and Australia, access to justice is considered a central feature of the modern democratic system; it represents the fundamental protection of the people. It can be regarded as a focal interest in the drive to create legal start-ups that enforce justice from the ground up. In 2018, there was 713% growth in legal technology investments within the United States alone. LegalZoom is an example of what easy access to justice resembles. The online legal technology company offers services in the creation of legal documents such as leases, contracts and wills, at an affordable cost. The company currently has an estimated net worth of USD$2 billion.6 Whilst LegalZoom represents open access to justice, it is clear that they are helping individuals and small businesses protect themselves through the drafting of simple documents that may be beneficial in protecting their rights.
3 David Coldewey, ‘Luminance and Omnius are bringing AI to legacy industries’, TechCrunch (Online, 22 December 2019) <https://techcrunch.com/2019/12/21/how-to-bring-ai-to-a-legacy-industry-according-to-the-founders-of-luminance-and-omnius/>.



Thomas Suh and James M. Lee, ‘Save the Lawyer: AI technology accelerates and augments legal work’ IBM (Online, 7 August 2018) <https://www.ibm.com/blogs/client-voices/save-the-lawyer-ai-technology-accelerates-and-augments-legal-work/>.



Jodie Baker, ‘Australia is leading the legal-tech revolution, but what does this mean for lawyers, firms and clients?’ SmartCompany (Online, 17 January 2019) <https://www.smartcompany.com.au/technology/australia-legaltech-revolution/>.

Comparatively, in Australia, legal start-ups are not seeing the same investment as in the United States, but some hold the same idea of legal revolution to enable access-to-justice. LegalVision was established in 2012, initially as an online legal documents business, and has grown into an established incorporated legal practice providing accessible legal support. Its founders recently raised over AUD$4 million from Gilbert + Tobin. Furthermore, Lawpath is another online platform that seeks to remove the complexity of the traditional legal system to provide its clients with a more streamlined approach to legal services. These companies demonstrate how legal technology companies can provide access-to-justice without removing the need for a lawyer. Legal technology and AI should be viewed in the legal sector as aids to help progress and enrich Australia’s legal system. The future isn’t bleak, and by embracing change, young lawyers can be at the forefront of innovation, ensuring that the positions that could potentially be made redundant are replaced with innovative positions that will further the next generation of lawyers. We live in ever-changing times, and to ensure law and order in Australia, it is paramount that the legal technological revolution is used to enhance the work of paralegals, lawyers and judges alike, and not to undermine them.

5 Deloitte Legal, What’s your problem? Legal Technology (Online Report) <https://www2.deloitte.com/content/dam/Deloitte/global/Documents/Legal/dttl-legal-technology-operating-model.pdf>.

6 Valentin Pivovarov, ‘713% Growth: Legal Tech Set an Investment Record in 2018’ Forbes (Online, 15 January 2019) <https://www.forbes.com/sites/valentinpivovarov/2019/01/15/legaltechinvestment2018/#5be15c937c2b>.



As Rod Sims, ACCC Chairman, states, “you cannot avoid liability by saying my robot did it”.1 Nevertheless, the increase in ‘machine learning’ or ‘self-learning’ technology has raised the issue of determining who should be held responsible for the actions of algorithms.

What is a self-learning algorithm? Artificial intelligence (AI) refers to intelligence demonstrated by machines that can ‘mimic’ human functions such as learning or problem-solving. Although multiple technologies are described as AI, ‘machine learning’ or ‘self-learning’ technology makes it particularly hard to determine who is responsible for an algorithm’s behaviour.

‘Machine learning’ and ‘self-learning’ technology enables software to automatically improve its knowledge and processes through experience, without being explicitly programmed with new information or instructions.2 As this software can learn by itself, it is plausible that a computer could teach itself to mislead or deceive consumers, with no human input. So, what happens when algorithms engage in misleading or deceptive conduct, through no active input of the person who designed the algorithm? A responsibility gap arises.

How can an algorithm engage in misleading or deceptive conduct? Whether particular conduct is misleading or deceptive is a question of fact, to be determined having regard to the context in which the conduct takes place and the surrounding facts and circumstances.3 However, conduct will be misleading or deceptive if it induces or is capable of inducing error.4

A robot or algorithm could engage in misleading or deceptive conduct in a variety of circumstances. For example, if a call centre replaces its callers with computers, and one of those computers misleads or deceives a caller, the algorithm has caused the company to breach section 18 of the Australian Consumer Law (‘s 18’).5 However, some inaccuracy is allowable, and just as in human transactions, robots that mislead consumers may be engaged in trade puffery,6 which is permissible under the Australian Consumer Law (‘ACL’). AI has also been used to produce ‘automated journalism’, where media businesses use machine learning and natural language processing to generate the automatic writing and publishing of content. If any of the content causes a reader to be misled or deceived, s 19 of the ACL protects the publisher, as an information provider, against any potential breach of s 18. Although news providers are protected under this provision (except for any advertisements published), other sites that use AI to generate and disseminate information may not be protected. A hospital that uses AI to generate and distribute information online about Coronavirus may not be protected against a breach of s 18, as it would arguably not be classified as an information provider. This is in contrast to an organisation such as Healthdirect,7 which may be distributing the same information as the hospital, but would arguably be classified as an information provider, as that is the organisation’s main role.

Who is responsible? So, who is responsible if an algorithm causes a consumer to be misled or deceived? For a contravention of s 18, it must be established that there was a person, who engaged in conduct in trade or commerce, that was misleading or deceptive, or likely to mislead or deceive. Although the Acts Interpretation Act 1901 (Cth) defines a ‘person’ as an individual as well as a body politic or corporation,8 an algorithm would not be classified as a person. Therefore, under s 18, the responsibility would likely fall to the programmer or company using the software.
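The ‘self-learning’ dynamic at the heart of the responsibility gap — software changing its own behaviour from feedback alone — can be illustrated with a deliberately simplified sketch. Nothing below reflects any real advertising or call-centre system; the class, the claims and the click rates are all invented for illustration:

```python
# Hypothetical sketch: an epsilon-greedy agent that chooses which product
# claim to show consumers and updates its estimates from click feedback
# alone. No rule anywhere says "prefer the exaggerated claim" -- any such
# preference emerges purely from the data it observes.
import random

class SelfLearningAdPicker:
    def __init__(self, claims, epsilon=0.1):
        self.claims = list(claims)
        self.epsilon = epsilon                  # how often to explore at random
        self.clicks = {c: 0 for c in claims}    # observed clicks per claim
        self.shows = {c: 0 for c in claims}     # observed impressions per claim

    def pick(self):
        # Occasionally explore; otherwise exploit the best-performing claim.
        if random.random() < self.epsilon:
            return random.choice(self.claims)
        return max(self.claims,
                   key=lambda c: self.clicks[c] / self.shows[c] if self.shows[c] else 0.0)

    def feedback(self, claim, clicked):
        # The only "teaching" the agent ever receives.
        self.shows[claim] += 1
        if clicked:
            self.clicks[claim] += 1

random.seed(0)
picker = SelfLearningAdPicker(["accurate claim", "exaggerated claim"])
# Simulate consumers who click the exaggerated claim more often:
for _ in range(1000):
    claim = picker.pick()
    clicked = random.random() < (0.3 if claim == "accurate claim" else 0.6)
    picker.feedback(claim, clicked)
print(picker.pick())
```

The point for s 18 purposes is that no line of this program instructs the agent to favour the exaggerated claim; that preference is learned from consumer responses, which is precisely where the gap between programmer intent and algorithmic conduct opens.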

Some international regulators have argued that just as employers are held liable for misleading or deceptive conduct by their employees, organisations should be held responsible for representations that their algorithms make.9 As European Commissioner Margrethe Vestager argues, “what businesses need to know is that when they decide to use an automated system, they will be held responsible for what it does, so they had better know how that system works.”10 However, the issue is: if an algorithm can learn on its own, how are companies meant to know how it works? As Australian law imposes strict liability for misleading and deceptive conduct, a programmer or organisation using the algorithm would likely be held responsible for any breach, even if the algorithm has misled or deceived consumers based on the knowledge that it has self-learned.


Consequences Although artificial intelligence has the potential to produce significant improvements to business efficiency, it is clear that care needs to be taken when creating algorithms to limit the risks associated with self-learning machines. The nature of strict liability regarding misleading and deceptive conduct in Australia makes it a real possibility that programmers or companies that use these algorithms will be held responsible for any representations that their algorithms make. This applies even if the humans behind it did not initiate the behaviours that caused a breach. The risk of being held liable for an algorithm’s misleading or deceptive conduct may put a dampener on investment in consumer-facing artificial intelligence in Australia. Lawmakers and regulators have a responsibility to ensure consumer online safety, but must also be mindful not to take an “overaggressive approach to regulation and enforcement in this space”.11

1 Rod Sims, ‘The ACCC’s approach to colluding robots’ (Conference Paper, Can robots collude?, 16 November 2017).

2 Sarah Keene and Troy Pilkington, ‘Consumer Analytica: New Zealand Consumer Law Application to International Developments in Privacy and Use of Data’ (Workshop paper, 11 August 2018) (‘Consumer Analytica’) [42-53].


3 Taco Co of Australia Inc v Taco Bell Pty Ltd (1982) 42 ALR 177, 202.

4 Parkdale Custom Built Furniture Pty Ltd v Puxu Pty Ltd (1982) 149 CLR 191, 198-9.


5 Competition and Consumer Act 2010 (Cth) sch 2 s 18 (‘ACL’).

6 The ACCC website defines puffery as “a term used to describe wildly exaggerated or vague claims about a product or service that no one could possibly treat seriously. For example, a restaurant claims they have the ‘best steaks on earth’. These types of statements are not considered misleading.”

7 An online and over-the-phone government-funded service, providing approved health information and advice in Australia.


8 Acts Interpretation Act 1901 (Cth) s 2C (‘Acts Interpretation Act’).

9 Nicholas Hirst, ‘When Margrethe Vestager takes antitrust battle to robots’, Politico (online, 28 February 2018) <https://www.politico.eu/article/trust-busting-in-the-age-of-ai/>.

10 Ibid.

11 Sarah Keene and Troy Pilkington, ‘Consumer Analytica’ (n 2) [42-53].


Legal Tech — Paving the Way for a “Greener” Legal Industry? GEORGIA CHINCHILLA

Law is an inherently risk-averse industry, which has been somewhat reluctant to adopt innovative technologies in the past.1 However, the world of legal tech has garnered a growing community of supporters in recent years. The proponents of disruptive technologies, such as artificial intelligence and machine learning, cite many benefits, including increased efficiency, accuracy, and the opportunity to reduce the hefty costs of obtaining legal services.2 The shift from conventional legal practice towards disruptive technologies also offers the opportunity to significantly reduce the legal industry’s environmental footprint. This article considers what weight, if any, should be given to broader societal considerations within the legal tech debate, with a particular focus on environmental sustainability.

Our Legal Footprint Over the last few decades, Australia’s legal sector has made great strides in terms of reducing its environmental footprint, particularly in regard to excessive paper consumption. When we think of the legal industry in the traditional sense, we might picture a barrister surrounded by stacks of documents and manila folders tied up with pink legal ribbon, or a paralegal pushing a trolley full of folders towards the courtroom. Yet, this stereotype is becoming less and less accurate as the widespread use of digital technologies has rapidly transformed the delivery of legal services in the 21st century.

An annual report released last year by the Australian Legal Sector Alliance (‘AusLSA’) revealed that paper consumption in Australian law firms has decreased by 66.4% since 2010. AusLSA attributes this drastic reduction in paper use to various technological innovations within the industry, namely electronic filing, online lodgement, litigation processes, and electronic communication.3 The Federal Court of Australia, one of the first courts in Australia to implement e-lodgement of court documents, has pioneered the use of digital technologies in the litigation space. Currently, the Federal Court is continuing its transition away from paper-based litigation through its “Electronic Court File” initiative.


Taking the Screen This shift towards digitisation and sustainable paper consumption is not the only way that technology has contributed to an increasingly green legal industry.4 The accessibility of video technology has increased considerably, allowing legal personnel to hold e-conferences or witnesses to give evidence in proceedings via an audiovisual link in appropriate circumstances. This technology has the run-off advantage of reducing travel-related emissions which, interestingly, is the legal industry’s “most significant environmental material issue”, making up 55% of total emissions.5 Another example of how technology has reduced the environmental harm caused by the legal sector is the use of electronic databases and cloud software to store correspondence and documents. This kind of electronic storage has not only reduced paper consumption, but it has also reduced the need to store millions of documents in powered storage facilities and the need to travel to and from off-site storage facilities to retrieve documents.6 Technology has further opened up more flexible working arrangements within the legal industry, making it more feasible for staff to work remotely from home, thereby reducing emissions by commuters travelling to and from work.

Realigning Priorities It is clear that technological advancements have allowed the legal industry to develop along a more environmentally friendly trajectory in recent years. However, this is rarely recognised as a primary benefit of legal tech. Arguably, this is because the ultimate priority of the legal profession is to defend or enforce the rights of clients. There is a need to balance the risks and uncertainties associated with innovative technologies against the obligation to provide accurate and correct legal advice. These risks have contributed to a well-founded reluctance to embrace legal technology in the past, regardless of the economic, environmental or social benefits that technology may offer.7 Moreover, law firms are businesses engaged in the provision of legal services. Commercial interests such as efficiency, productivity, accuracy and profitability are central to the day-to-day operation and long-term viability of law firms in an increasingly competitive market. Accordingly, economic factors will take priority over broader considerations such as environmental sustainability. There is evidence to suggest, however, that this is changing due to the emphasis now placed on Corporate Social Responsibility. There is a growing expectation that industries, businesses and individuals should adopt sustainable practices that minimise environmental impacts. This could extend to an expectation that law firms will adopt legal technologies that are reasonably available and proven to be reliable, on the basis that doing so will reduce their environmental footprint.

Conclusion Ultimately, environmental factors are rarely given centre stage in the legal tech debate. Economic considerations and the ability to provide quality legal advice will always be the primary factors guiding the decision of whether or not to embrace new and innovative technologies. However, environmental considerations and the overall sustainability of the legal profession are increasingly relevant when weighing up the advantages and limitations of legal tech. This is especially the case given the growing expectation that service providers, including law firms, implement practices that are socially and environmentally sustainable.

1 Blair Janis, ‘How Technology is Changing the Practice of the Law’ (2014) 31 GP Solo 10, 12.

2 Michael Legg and Thomas Davey, ‘Predictive Coding: Machine Learning Disrupts Discovery’ (2017) 32 The Law Society of NSW Journal 82.

3 AusLSA, 2019 Legal Sector Sustainability Update (2019) Australian Legal Sector Alliance, 50 <http://www.legalsectoralliance.com.au/resources/Documents/Reports/2019%20Report/2019%20Flipbook/index.html?page=8>.

4 A. M. Hogan and Holtan-Basaldua, ‘The Greening of the Legal Industry’ (2015) 19(1) The Climate Change, Sustainable Development, and Ecosystems Committee Newsletter <https://www.lw.com/thoughtLeadership/the-greening-of-the-legal-industry>.


5 Above (n 3) 47.

6 Gina Dombosch, ‘Green is the New Black’, Issue 11.09, Australasian Legal Business Magazine 60, 65 <http://www.legalsectoralliance.com.au/Resources/Documents/AusLSA/Media/ALB_Oct2013.pdf>.


7 Janis (n 1).



“It was terribly dangerous to let your thoughts wander when you were in any public place within range of a telescreen. The smallest thing could give you away.”

Imagine a society where, when you step outside your door, your every action is recorded, and this data is collected and used by the government to give you a citizen rating score. Better still, a society where your thoughts and beliefs are empirically examinable and capable of being a neat archetype to someone else’s end. We may be inclined to discount this as the alarmist dystopia of the tin-foil hat wearers. However, the Chinese Government is harnessing advances in artificial intelligence and data mining and storage to create detailed citizen profiles, with primary information obtained through a network of surveillance

cameras.1 Admittedly, it has been stated that China today is a harbinger of what society looks like when surveillance proliferates unchecked. But, as this article examines, it is not so radically different from the rest of the world, and yes, Australia too. Orwell’s dystopian tale is less alarmist and futuristic than we may have thought. Clearview AI Clearview AI is a technology company that provides facial recognition software. It boasts that its powerful and unprecedented facial recognition tool can be used to identify a person in almost any situation. It has amassed a database of billions of photos, from Facebook,


Instagram, LinkedIn and other websites. Sound familiar? We can safely assume that you have at least heard of the Cambridge Analytica and Facebook scandals, even if only through ‘The Great Hack’ Netflix series. Just like the imaginative society that this article opened with, Cambridge Analytica empirically examined the thoughts and beliefs of Facebook users, and created categories of these individuals for political means. This certainly created a widespread cautionary sentiment, especially given the nature of social media use today. Unlike surveillance in China, the possibility that it would affect us was more tangible. Nonetheless, another sentiment was that the data was reportedly misused by Russian and US Governments.2 This behaviour may have seemed to some as distant, or at least less probable, in Australia. However, Clearview AI is of interest here because a recent report exposed four Australian police organisations as clients of Clearview AI, along with 2,200 law enforcement agencies, companies and individuals around the world. Between the Australian Federal Police and state forces in Queensland, Victoria and South Australia, more than 1,000 searches have been conducted. Could we have our own Cambridge Analytica scandal unfolding in Australia? It is difficult to know precisely how this information is being used, but the reality is that, with or without Clearview AI, these breaches occur, and the law is both inherently slow and inherently unable to respond effectively. Biometric Breaches Under current privacy laws in Australia, biometric information (such as your face, fingerprints, eyes, palms and voice) is considered sensitive. The Privacy Act 1988 (Cth) makes clear that any organisation collecting this information must first have consent to do so. However, there are exceptions to this where the information is deemed “necessary” to prevent a serious threat to the life, health or safety of any individual.3
Many believe this exception has been exploited by law enforcement agencies, with legal commentators suggesting it is not broad enough to encompass all of the conduct that we have seen.4 Two red flags both appeared in 2019: data breaches of the My Health Record were exposed as having risen from 35 to 42, and the Federal Government announced its plan to create a national facial recognition database. However, legislation for the latter is currently stalled because of privacy concerns. Indeed, more anecdotally, the use of surveillance has also resulted in insurance claimants having their payments suspended because surveillance footage merely recorded them looking happy. This was reported last year by a Border Force Officer diagnosed with PTSD who had his Comcare payments suspended for that reason.5 The point being illustrated is that the scope of how surveillance can impact any one of us is difficult to ascertain with precision, but it is safe to say that it is significant. Consent Let’s pull the scope back a little further and consider another area of law that impacts this issue. Thus far, we have discussed large-scale breaches of privacy

legislation by governments and law enforcement agencies. But do we ever consent to this information being obtained? The answer is yes, all the time. The use of technology and social media is so intertwined with our lives today — as are standard-form and click-wrap agreements. People don’t read terms of service, privacy policies or other electronic boilerplate. In all of these agreements that we mindlessly click through, we are signing away our consent to have our data collected. In the United States, this issue was explored in Berkson v Gogo LLC, 97 F Supp 3d 359 (2015), with Senior District Judge Weinstein outlining a test to answer the question of enforceability in such an agreement. However, currently there is no domestic litigation concerning the transparency of these online agreements. This is important because ordinarily, there may be some protection by way of unfair terms under Australian Consumer Law, but it is unclear how this would apply in an online context. Conclusion There is an obvious corollary that this article has not yet examined: this technology can be useful. It goes without saying that technology has had an unprecedented and positive impact on legal evidence, proceedings and examination. But the point is that the impact that this technology will have on the law (and has already had) is a point of concern. It is not only further regulation that we need. We also need a greater understanding of how we use technology and the consent that we give for our information to be collected and interpreted. When we view these recent developments in Australia in light of Cambridge Analytica, Facebook and China, we have a reasonable concern.

1984, written in 1948, is a cautionary prediction of the future. 2020, in 2020, needs no cautionary allusion and rearrangement of dates.

1 Anna Mitchell and Larry Diamond, ‘China’s Surveillance State Should Scare Everyone’, The Atlantic (online, 2 February 2018) <https://www.theatlantic.com/international/archive/2018/02/china-surveillance/552203/>.

2 Sean Illing, ‘Cambridge Analytica, the shady data firm that might be a key Trump-Russia link, explained’, Vox (online, 4 April 2018) <https://www.vox.com/policy-and-politics/2017/10/16/15657512/cambridge-analytica-facebook-alexander-nix-christopher-wylie>.

3 Australian Government, Office of the Australian Information Commissioner, Australian Privacy Principles guidelines, 22 July 2019, 6.1-6.77.

4 Sonia Hickey, ‘Police Accused of Lying About Use of ‘ineffective’ Facial Recognition Software’, Sydney Criminal Lawyers (Blog post, 4 March 2020) <https://www.sydneycriminallawyers.com.au/blog/police-accused-of-lying-about-use-of-ineffective-facial-recognition-software/>.

5 ‘Does video surveillance of psychiatric compo claimants tell us anything?’, The Law Report (ABC, 15 October 2019) <https://www.abc.net.au/radionational/programs/lawreport/video-surveillance-psych-compoclaims/11589824>.


Democracy in a Digital World: Reality or Illusion?

The hopes and expectations of the digital revolution were that it would democratise society. The rule of law, separation of powers, independent courts, responsible government and human rights are some of the foundations of our legal system, which strives to protect democracy. Although inventions such as the internet, the computer and social media can fulfil such reformist ideas, what democracy was unprepared for was digital totalitarianism. Digital totalitarianism is the absolute political or commercial power over society attained via information and communication technology.1 This article will delve into the enigmatic and dark side of the digital revolution by exploring the notions of surveillance capitalism, digital panopticism, media populism, and the monolithic power of giant tech companies from Silicon Valley. All of these issues seem to normalise the invasion of privacy, misinformation, surveillance and the manipulation of individuals. This article contends that if these issues are not adequately addressed and major tech companies are left unregulated, severe challenges will be posed to upholding the core aspects of democracy.

Surveillance Capitalism and Digital Panopticism Digital surveillance means that governments and corporations acquire unparalleled power to monitor and manipulate their citizens. In Silicon Valley, surveillance occurs for capitalist profit-making motives, augmented by the commercialisation of a user’s data, also known as surveillance capitalism. Economist Shoshana Zuboff contends that it “claims human experience as free raw material for translation into behavioural data”.2 This is achieved by the collection of extensive data on a user’s behaviour, which can then be utilised to alter that behaviour in ways that are practical for users and profitable for corporations. Humans are not completely aware that they are trading their privacy and digital freedom for convenience. Through surveillance, one can predict what the consumer wants, which yields profits and the ability to control and modify the predictable future.3 Larry Page, the co-founder of Google, has declared that: “[Google] should know what you want and tell it to you before you ask the question.”4 Why is predictability vital, and what does it have to do with digital totalitarianism? Well, the barrier to having absolute power is the unpredictability of humans.5 This is because we are inherently spontaneous.6 Thus, predictability is essential for these giant companies to attain their profits, which take priority over human rights. Moreover, this form of surveillance is associated with Jeremy Bentham’s theory of the Panopticon, a system that controls behaviour.7 It metaphorically depicts a central tower from which prisoners are monitored. Prisoners do not know when they are being monitored, thus ensuring that they are obedient at all times. This can be used by corporations and governments as a mechanism to repress society. As the philosopher Foucault asserted: “He is seen, but he does not see; he is an object of information, never a subject in communication.”8 In today’s context, it ensures constant scrutiny, wherein people are conditioned to behave a certain way without even noticing, because of the clandestine nature of the digital world combined with their lack of digital literacy. For example, China’s ‘Social Credit System’ ensures conformity and aims to ‘raise the level of trustworthiness’.9 It digitally examines the behaviour of citizens to determine what rewards or punishments they deserve. An algorithm is used to achieve this, which is impervious to manipulation and insidiously collects information about you to know who to target for commercial or political purposes.10 Overall, individual autonomy and self-determination are substituted for external behavioural control by artificial intelligence.
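The scoring mechanism the article attributes to such systems — behaviour is logged, a score is updated, and thresholds trigger rewards or punishments — can be caricatured in a few lines. Every behaviour name, weight and threshold below is invented purely for illustration and describes no real system:

```python
# Purely hypothetical sketch of a behavioural scoring loop: observed
# behaviours adjust a running score, and the score alone determines the
# citizen's treatment. All values here are made up for illustration.
BEHAVIOUR_WEIGHTS = {
    "paid_bills_on_time": +5,
    "volunteered": +10,
    "jaywalking": -5,
    "criticised_government_online": -50,
}

def update_score(score, observed_behaviours):
    # Each logged behaviour nudges the score up or down by its weight.
    for behaviour in observed_behaviours:
        score += BEHAVIOUR_WEIGHTS.get(behaviour, 0)
    return score

def outcome(score):
    # Thresholds convert the score into rewards or punishments.
    if score >= 100:
        return "reward: fast-tracked permits"
    if score <= 0:
        return "punishment: restricted travel bookings"
    return "neutral"

s = update_score(100, ["paid_bills_on_time", "criticised_government_online"])
print(s, outcome(s))
```

Even this toy version shows the point made above: the subject never sees the weights or thresholds, yet they silently condition which behaviours are “safe” — the Panopticon expressed as arithmetic.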

Monopolisation of Major Tech Companies We live in a data-driven economy that is shifting towards monopoly, since a few technological corporations such as Google, Facebook, Microsoft, Apple and Amazon are dominant in their fields. This means that technological power is possessed by a handful of corporations that capitalise on surveillance and manipulation and never pay their fair share of taxes, thus normalising the disruption of the law. If left unregulated by the law, these tech giants will be offered absolute power. Regulators are already failing to keep up with the rapid speed of technological progress, which leads people to question their importance and whether one can depend on them for justice.11 There are no laws directly related to this matter, so what these companies are doing is not strictly illegal. Nonetheless, it is undemocratic and does not best serve society’s interests. For example, Google prospers in a borderless digital society yet is bound by US law, which means justice is difficult to attain for anyone outside the United States. The Google Spain case sheds light on this idea, wherein Google challenged the validity of European data protection laws by contending that it is bound by Californian law (given that Google’s services originated in California).12 By doing this, these monopolies elude responsibility by targeting the law’s numerous loopholes, thereby in some way justifying their breach of the law.

Media Populism Artificial intelligence has boosted polarisation, which is primarily attributed to the rise in media populism. Media populism refers to a digital approach to politically appealing to and representing the needs of ordinary people, typically by challenging the elite.13 Although the definition of populism does not specify a political party, currently in Western democracies it is right-wing parties (neo-populists) that feel they are under-represented in modern democracies. They strive to induce fear in regard to immigration and social security, because it is an effective manner of captivating people’s attention and playing on their emotions.14 When Trump announced that the media is the ‘enemy of the people’, it generated feelings of dubiety towards traditional journalism, with digital media taking over.15 There has been social media abuse and the spread of disinformation, palpable during the election of Bolsonaro in Brazil, the Five Star Movement in Italy and Victor Orban’s victory in Hungary (to name a few).16 Concepts such as popular sovereignty, nationalism and xenophobia are the focal point of the populist agenda in Western democracies.17 Neo-populists use social media as a platform to induce unfounded fears, which intensely polarise society.

Specifically, the Cambridge Analytica case highlights the breach of privacy on 87 million profiles for the purposes of political advertising, favouring candidates such as Ted Cruz and Donald Trump.18 The defence used in court was that governments could not fathom the nature of artificial intelligence and, therefore, could not make appropriate decisions on the case.19 Unfortunately, this is true, signifying that major tech companies can exploit this situation to destabilise the foundations of a democratic legal system. By doing this, these companies elude responsibility and give the impression that they are above the law. Facebook was fined USD 5 billion, but this does not alter the fact that US citizens were subject to psychometric profiling and micro-targeting for the election.20 Here we see how power is equalising between governments and corporations, as legislators are not keeping up with the progress of technology, allowing companies like Facebook to abuse that gap and deprive users of the right to privacy and autonomy.

1 Vincent F. Hendricks et al, ‘Epilogue: Digital Roads to Totalitarianism’ (2019) (1) Springer Link 119, 134.

2 John Laidler, ‘Harvard Professor says Surveillance Capitalism is Undermining Democracy’, The Harvard Gazette (online, 4 March 2019) <https://news.harvard.edu/gazette/story/2019/03/harvard-professor-says-surveillance-capitalism-is-undermining-democracy/>.

3 Hendricks (n 1) 127.

4 Hendricks (n 1).

5 Ibid.

6 Ibid.

7 Hendricks (n 1).

8 Thomas McMullan, ‘What does the Panopticon Mean in the Age of Digital Surveillance’, The Guardian (online, 23 July 2015) <https://www.theguardian.com/technology/2015/jul/23/panopticon-digital-surveillance-jeremybentham>, quoting Michel Foucault, Discipline and Punish: The Birth of the Prison (New York: Pantheon Books, 1977).

9 Kevin Korner, ‘Digital Politics: AI, Big Data and the Future of Democracy’ (2019) Deutsche Bank Research 1, 4.

Conclusion

In conclusion, the ramifications of the digital revolution include surveillance capitalism, digital panopticism and misinformation, which together facilitate the rise of media populism, all of it normalised by the actions of a few monopolistic technology corporations. This has the capacity to deprive individuals of their autonomy, dignity, self-determination, freedom and right to privacy. Widespread digitalisation thereby subverts these democratic ideals. If nothing is done to prevent this and legislators keep lagging, digital totalitarianism will become a reality from which one cannot escape.

12 Maelle Gavet, ‘The Digital Revolution Is Destroying Our Democracies’, World Economic Forum (online, 7 February 2017) <https://www.weforum.org/agenda/2017/02/the-digital-revolution-is-destroying-our-democracies-it-doesn-t-have-to-be-that-way/>.

13 Paul Nemitz, ‘Constitutional Democracy and Technology in the Age of Artificial Intelligence’ (2018) The Royal Society Publishing 376.

14 Emiliana De Blasio and Michele Sorice, ‘Populism between Direct Democracy and the Technological Myth’ (2018) 4 Palgrave Communications 15.

15 Karl M Manheim et al, ‘Artificial Intelligence: Risks to Privacy and Democracy’ (Research Paper No 2018-37, Loyola Law School, Los Angeles Legal Studies, December 2019).

16 Korner (n 11) 5.

17 De Blasio (n 14) 15.

18 Nemitz (n 13) 376.

19 Ibid.

20 Korner (n 11) 5.



Content Moderators — Is This the Worst Job in Technology?
GEORGIA DIXON

Content moderators are the unsung heroes of the internet and the front-line protectors of our community, yet they often go unnoticed. Every second, millions of photos, videos and comments are uploaded to the internet, covering a variety of topics from memes to your favourite photos, COVID-19 internet challenges and your grandmother’s chain mail posts. However, this is only the side of social media that Facebook lets you see. The other is the dark, scary and potentially traumatising side that Facebook filters out. This filtering is performed by human beings, but there is rarely any acknowledgement of, or concern for, the individuals who have to sort through the content.


Role of a Facebook Moderator

Facebook employs 15,000 moderators around the globe, who typically spend at least six hours a day viewing reported content; a high-performing moderator looks at 400 or more posts per day to assess what content can remain on the platform.1 Moderators typically see graphic and violent imagery, hate speech, terrorist attacks, shootings, sexual exploitation, child pornography, animal abuse and bestiality on a daily basis.2 One employee reported having to watch live terrorist attacks unfold, people stoned to death and a dog cooked alive.3 Others have had to watch child rapes, refugees being tortured with molten metal, executions and mass killings.4 The content is horrifying and has no place on Facebook. However, constant exposure to graphic imagery en masse isn’t all that is affecting these workers. Reading posts or watching videos spreading hate speech and racism has even caused some moderators to adopt radical ideals, with one employee reporting that colleagues have become flat-earthers and Holocaust deniers.5

Facebook moderators are forced to consume a considerable amount of abhorrent content daily which, if left unchecked and without proper emotional support, can lead to severe trauma and mental health implications. As a result, many employees have suffered night terrors, nausea, cognitive dissociation, depression, anxiety, aggression, overeating and PTSD, and some have inflicted violent behaviour on others.6 Some employees have developed secondary traumatic stress disorder (STSD) from observing first-hand trauma experienced by others, which has led to anxiety, sleep loss, loneliness and dissociation.7 To cope with the trauma of their work, employees have been known to abuse drugs, have sex and make offensive jokes at work, further exacerbating the negative work environment for their colleagues.8 Employees are given little training on how to identify when their mental health is being affected by their work, and not enough support or information on how to treat it. Many last only a few months, and most leave after about a year on the job.9


Facebook and its contracted companies do not provide an adequate work environment for their employees. There is extreme pressure on employees to perform well, as there is limited job security. Moderators are judged on an ‘accuracy score’, which measures how often their decisions about content are correct and aligned with Facebook’s content policies.10 When viewing reported content, a moderator must sort the post into one of more than 250 categories. These categories are updated weekly as Facebook’s policies change and ever more niche categories are created.11 For example, a video showing a murder is not simply placed in a category called ‘murder’: moderators must scrutinise every detail to determine how the murder occurred. Was there an ISIS flag to link it to a terror attack? Was it one person or many? Was it a child? Once they have determined which of the 250-plus categories a post fits into, they must do the same over and over, hundreds of times, before the end of the workday.12 If their accuracy score drops below 96%, they can be fired. There is extreme pressure to be perfect. A moderator’s day is also managed minute by minute with oppressive oversight, and the ‘breaks’ system further exacerbates the dangerous work environment: a moderator gets two 15-minute breaks, a 30-minute lunch break and nine minutes of ‘wellness time’ a day.13 At one contractor, Cognizant, 800 employees share one urinal and two stalls in the men’s room and three stalls in the women’s room. Employees also claim there was minimal emotional support in this “toxic environment”.

What Are the Laws?

Employers have a legal responsibility to provide a safe workplace for employees by managing hazards and risks, both physical and psychological.14 In NSW, an employer’s responsibilities are set out in the Work Health and Safety Act 2011 (NSW).
Under section 17, employers must eliminate risks to health and safety so far as is reasonably practicable and, where elimination is not reasonably practicable, minimise those risks so far as is reasonably practicable. The test of reasonable practicability is a balancing act that weighs the likelihood of the hazard occurring, the degree of potential harm, the employer’s knowledge of the hazard and of ways to eliminate or minimise the risk, the availability of those measures, and whether their cost is grossly disproportionate to the risk (s 18). Employers also owe workers a primary duty of care to provide a safe work environment and adequate facilities for the welfare of workers carrying out the work (s 19(3)). Similar legislation protecting employees exists around the world, such as the California Labor Code in the USA and the Health and Safety at Work etc Act 1974 in the UK.

Response of Internet Service Providers (ISPs)

Trauma from the internet is not a new phenomenon. Studies of police officers and of employees of the US Department of Justice’s Internet Crimes Against Children task force have found that up to 76% of individuals displayed symptoms of psychological trauma (including secondary traumatic stress disorder) and other emotional distress

from exposure to child abuse material on the internet.15 Facebook also helped create the Technology Coalition in 2006 with other ISPs in an attempt to stop individuals from using the internet to exploit children or disseminate child pornography. Facebook was still a member in 2015 when the coalition published a guidebook stating that companies in the technology industry must support the wellbeing of employees ‘who are on the front line of [the] battle’ against abhorrent content.16 The coalition made multiple recommendations, including that ISPs provide mandatory group and individual counselling administered by a professional with specialised training in trauma intervention, teach moderators to assess their own reactions to images, limit the time an employee is exposed to child pornography, and prevent employees from viewing child pornography in the hour before they leave work.17 It also suggested other measures (used by some companies, just not Facebook) such as blurring photos, rendering them black and white, displaying them at smaller sizes, removing audio from videos and psychologically assessing potential employees before hiring them.
However, despite these academic studies, and Facebook’s actual knowledge of the impact of content moderation on mental health through its own coalition’s recommendations, Facebook and its contractors continue to fail to provide a safe workplace for employees.18

Legal Suits

Since 2018, individuals in Ireland and the USA have launched multiple lawsuits against various ISPs, such as Microsoft, Google and Facebook, as well as their contractors, for the damage caused to them in their work as content moderators and for these companies’ failure to mitigate the hazards of that employment.19 Former employees have filed class-action lawsuits alleging that they were exposed to “thousands of acts of extreme and graphic violence” as well as “child sexual abuse, rape, torture, bestiality, beheadings, suicide, racist violence and murder” on a daily basis.20 Constantly viewing such graphic imagery has left members of the class actions with “PTSD and other psychological disorders, physical injuries including stroke and epilepsy, lost pay, lost future earning capacity, emotional distress and loss of enjoyment of life”.21 While they acknowledge that some measures are in place, including counselling and mental health support, these services are very limited and available only to current employees.22 Once their contracts expire, individuals are no longer eligible to receive the support they need.23 It is essential that these people are provided with assistance, as the effects of PTSD can be far-reaching, may not emerge until some time after a work contract ends, and can have long-term consequences. The plaintiffs also argue that the measures in place to reduce the mental health impact of the work on moderators need to be improved. The “wellness coaches” at one of Facebook’s contractors, Accenture, are not medical doctors, nor can they diagnose or treat


Hazardous Content This photo contains hazardous content which some people are employed to stare at indefinitely.

mental disorders. They are unqualified and unsuited to providing treatment and assistance to victims of second-hand trauma.24

In response, content-moderation contractors such as Accenture are requiring employees to sign declarations acknowledging that in their role they may view disturbing content that could impact their mental health and cause PTSD.25 Australia has also adopted the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 (Cth), passed in April 2019 in response to the live-streaming on Facebook of the Christchurch terrorist attack. It introduces new offences for ISPs that allow abhorrent violent material to be uploaded or live-streamed on their platforms and not taken down within a reasonable time. It is a knee-jerk reaction by the government in an attempt to protect the community, and it fails to consider the content moderators who still have to watch the content, scrutinise it for classification and remove it.26

Conclusion

Social media could not exist safely without moderators. They are essentially the first line of defence against abhorrent content being shared and an important part of the social media ecosystem. Content moderators are employed to protect the community and stop users from seeing depraved, trauma-inducing images, so it is no wonder that these employees have suffered trauma themselves. Facebook and other ISPs must take further action to protect their employees and reduce workplace hazards, or face further lawsuits, high employee turnover, dissatisfaction and public scrutiny.

1 Alex Hern, ‘Facebook moderators tell of strict scrutiny and PTSD symptoms’, The Guardian (online, 27 February 2019) <https://www.theguardian.com/technology/2019/feb/26/facebook-moderators-tell-of-strict-scrutiny-and-ptsd-symptoms>.

2 Casey Newton, ‘The Trauma Floor: the secret lives of Facebook moderators in America’, The Verge (online, 25 February 2019) <https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona>.

3 Alex Hern, ‘Ex-Facebook worker claims disturbing content led to PTSD’, The Guardian (online, 5 December 2019) <https://www.theguardian.com/technology/2019/dec/04/ex-facebook-worker-claims-disturbing-content-led-to-ptsd>.

4 Lauren Weber and Deepa Seetharaman, ‘The Worst Job in Technology: Staring at Human Depravity to Keep It Off Facebook’, The Wall Street Journal (online, 27 December 2017) <https://www.wsj.com/articles/the-worst-job-in-technology-staring-at-human-depravity-to-keep-it-off-facebook-1514398398?ns=prod/accounts-wsj>.

5 Hern (n 1).

6 Hern (n 3).

7 Newton (n 2).

8 Ibid.

9 Ibid.

10 Ibid.

11 Ibid.

12 Nick Wiggins and Damien Carrick, ‘Chris sorted through the ‘blood and gore’ on social media. Now he’s suing Facebook over PTSD’, ABC (online, 26 February 2020) <https://www.abc.net.au/news/2020-02-26/ex-facebook-moderator-suing-company-over-ptsd/11972364>.

13 Hern (n 1).

14 SafeWork NSW, ‘Mental Health’ (Web Page) <https://www.safework.nsw.gov.au/hazards-a-z/mental-health>; Safe Work Australia, ‘Work-Related Psychological Health and Safety’ <https://www.safeworkaustralia.gov.au/system/files/documents/1911/work-related_psychological_health_and_safety_a_systematic_approach_to_meeting_your_duties.pdf>.

15 Selena Scola v Facebook, Inc and Pro Unlimited, Inc, Civil Action No 18CIV05135, Complaint for Declaratory and Injunctive Relief, Class Action (filed with the San Mateo County Superior Court Clerk, 21 September 2018) <https://regmedia.co.uk/2018/09/24/scola_v_facebook.pdf>.

16 Andrew Arsht and Daniel Etcovitch, ‘The Human Cost of Online Content Moderation’, Harvard JOLT Digest (online, 2 March 2018) <https://jolt.law.harvard.edu/digest/the-human-cost-of-online-content-moderation>; Scola (n 15).

17 Ibid.

18 Elizabeth Dwoskin, Jeanne Whalen and Regine Cabato, ‘Content moderators at YouTube, Facebook and Twitter see the worst of the web – and suffer silently’, The Washington Post (online, 25 July 2019) <https://www.washingtonpost.com/technology/2019/07/25/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price/>; Scola (n 15).

19 Wiggins and Carrick (n 12).

20 Scola (n 15); Sandra E Garcia, ‘Ex-Content Moderator Sues Facebook, Saying Violent Images Caused Her PTSD’, The New York Times (online, 25 September 2018) <https://www.nytimes.com/2018/09/25/technology/facebook-moderator-job-ptsd-lawsuit.html>.

21 Scola (n 15).

22 Wiggins and Carrick (n 12).

23 Arsht and Etcovitch (n 16).

24 Hern (n 1).

25 ‘Facebook and YouTube moderators sign PTSD disclosure’, BBC (online, 25 January 2020) <https://www.bbc.com/news/technology-51245616>; Madhumita Murgia, ‘Facebook content moderators required to sign PTSD forms’, Financial Times (online, 26 January 2020) <https://www.ft.com/content/98aad2f0-3ec9-11ea-a01a-bae547046735>.

26 Ariel Bogle, ‘Laws targeting terror videos on Facebook and YouTube ‘rushed’ and ‘knee-jerk’, lawyers and tech industry say’, ABC (online, 4 April 2019) <https://www.abc.net.au/news/science/2019-04-04/facebook-youtube-social-media-laws-rushed-and-flawed-critics-say/10965812>.


A Tricky Tightrope — Can Legal Technology Facilitate Access to Justice for Domestic Violence in Regional Areas?
QUYEN NGUYEN


Domestic violence in Australia is our national shame, not crisis.1 It is a ‘major national health and welfare issue’ which affects everyone.2 The range of people affected is diverse, and includes people with disabilities, the elderly, LGBTQI+ people, people from culturally or linguistically diverse backgrounds, and Indigenous Australians.3 However, women and children remain the most vulnerable.4 The risk has increased further under COVID-19 protocols.5 New forms of abuse have emerged, with victims subjected to threats concerning their health and safety. Some women may have no choice but to stay in violent relationships because they risk becoming homeless, or are told by their partner ‘that they have the virus therefore they can’t leave the house’.6 This article explores how legal technology may assist individuals in rural, regional and remote (‘RRR’) communities in accessing legal services, as these communities typically experience higher rates of domestic violence than urban areas. First, it provides an overview of what constitutes domestic violence. Second, it demonstrates why RRR communities require our attention. Finally, it highlights the value of legal technology and its potential to assist victims of domestic violence.


What is domestic violence?

Domestic violence is not explicitly defined under the Crimes (Domestic and Personal Violence) Act 2007 (NSW). Instead, the statute lists non-exhaustive factors that capture a range of behaviours.7 Monica Campo and Sarah Tayton, in Domestic and Family Violence in Regional, Rural and Remote Communities, define it as ‘acts of violence that occur between people who have, or have had, an intimate relationship. …there is… an ongoing pattern of behavior aimed at controlling a partner through fear…’.8 Examples include physical violence, emotional or psychological abuse, financial abuse and damage to property.9 Women also experience digital forms of violence and abuse, which may include ‘abusive text messages or emails, making continuous threatening phone calls, spying on and monitoring victims through the use of tracking systems, abusing victims on social media sites, and sharing intimate photos of the victim to others without the victim’s consent’ (‘revenge porn’).10 Abusers are also using GPS tracking via smartphone apps to send intimidating messages to their victims, threatening that ‘they know where they are, and to look out’.11

Domestic violence in RRR communities

Community attitudes and geographical isolation are core factors which often prevent women from accessing support or disclosing violence to the authorities. This is concerning, as 21% of women living outside of urban areas have experienced violence from an intimate partner since the age of 15, compared with 15% of women living in a capital city.12

Community attitudes

Victims are often deterred from disclosing violence and abuse to relevant services by the common social understanding that these ‘family problems’ should not be discussed openly.13 Many women ‘felt their community was complicit in the continuation of domestic and family violence as perpetrator behaviour was rarely challenged and there was an overall indifference to domestic and family violence’.14 This is significant, as it suggests that many more cases of violence remain unreported. The same attitude also seems to pervade police responses and court magistrates’ attitudes: for police, consistent breaches of intervention orders are ‘not a big deal’, while magistrates show indifference ‘to the safety concerns of mothers’.15

Geographical isolation

Geographical isolation does not refer only to the accessibility of services; it also encompasses the response time of services to reports of domestic violence. Police and emergency response times tend to be delayed, or help may come too late.16 This lack of availability is troubling, as it may leave women unprotected from immediate violence, especially given the higher rates of gun ownership in RRR communities.17 Additionally, because these issues are specific to each geographical area, it is more difficult to assess


the effectiveness of services aimed at addressing and preventing domestic and family violence in non-urban communities.18

Why we need legal technology in RRR areas

RRR areas are in desperate need of lawyers. In 2007, only 13.3% of legal professionals practised in these areas, in stark contrast to the 96% who worked in private practice in NSW.19 Although there is no single reason for the shortage, lawyers in these areas often report feeling overwhelmed by their daily tasks.20 Other factors include ‘challenging clients, a high volume of work, stressful and adverse work environments, being remote from supervision and support, and being distant from social and family networks’.21 Legal technology may be able to assist here: it can provide the resources and networks to help lawyers deliver legal advice remotely and to empower victims. Current legal technologies in the context of domestic violence do assist victims, but they do not yet involve practitioners providing legal services.

Current legal technology

1. VictimsVoice

VictimsVoice ‘records incidences of abuse in a way that’s safe, secure and legally admissible’.22 All uploaded data remains securely stored, even if an individual is no longer using or paying for the app.23 The app was created to assist victims in documenting their injuries. This is critical, as some individuals face difficulty corroborating their experiences because they come from disadvantaged backgrounds.24 On other occasions there may be no evidence at all, because the victim previously dismissed the assault as ‘accidental’.25 This matters because the forms of domestic violence committed by perpetrators have diversified and are often used simultaneously.26

2. Daisy

Daisy ‘provides information and support services’27 to victims of violence based on their location. It is similar to Toranj, which aims to ‘connect victims of domestic violence with the resources and support they need to be safe, both in the moment

1 Andrew Cairns, ‘Opinion: Domestic violence in Australia is a national crisis’, Third Sector (online, 22 August 2019) <https://thirdsector.com.au/domestic-violence-in-australia-is-a-national-crisis/>.

2 Australian Institute of Health and Welfare, Family, Domestic and Sexual Violence in Australia: Continuing the National Story 2019 (Data Report, June 2019) 1.

3 Ibid 4−5.

4 Ibid 1.

5 Mary Gearin and Ben Knight, ‘Family violence perpetrators using COVID-19 as ‘a form of abuse we have not experienced before’’, ABC (online, 29 March 2020) <https://www.abc.net.au/news/2020-03-29/coronavirus-family-violence-surge-in-victoria/12098546>.



7 Crimes (Domestic and Personal Violence) Act 2007 (NSW) ss 9, 11.

8 Monica Campo and Sarah Tayton, Domestic and Family Violence in Regional, Rural and Remote Communities (Practitioner Resource, December 2015) 2.

and long term’.28 It enables users to reach out to close contacts during emergencies.29 The police and other forms of assistance may also be called upon.30

Can legal technology break the wheel of violence?

Current solutions geared towards domestic violence in RRR areas have failed because of their homogeneous approach.31 There is a consensus that solutions must be nuanced to ensure community needs are addressed.32 While legal technology cannot guarantee complete insulation from homogenisation, it can certainly offer some level of personalisation and resource efficiency for lawyers assisting vulnerable clients. One core aspect that could be automated is screening clients’ risk assessments.33 A risk assessment informs several aspects of legal advice, such as the urgency of an ADVO or how proceedings could affect a client’s residency status.34 Think of it as a reverse Intraspexion.35 Instead of predicting potential litigation to minimise risk, lawyers could be informed of how ‘at risk’ a client is of becoming homeless or being subjected to violence if an ADVO is not filed promptly, or if the client is not aware of community support services. Early intervention is key, with tragic cases like that of Hannah Clarke serving as a poignant reminder of what happens when help comes too late.36 However, should such legal technology come to life, clients should first provide their consent. Technology can seem intrusive, especially to those who are experiencing stalking by way of GPS tracking.

Legal technology promises resource efficiency and could be used effectively with the right level of finesse. However, it must be stressed that it may not be an appropriate tool for addressing the underlying causes of violence experienced in RRR communities. Rather, it is one of many tools for tackling domestic violence directly, as it takes a village to create change. Remember that #itstopswithme, and if you, or anyone you know, is affected by violence, call 1800 RESPECT.


10 ‘Technology-facilitated abuse: the new breed of domestic violence’, The Conversation (online, 27 March 2017) <https://theconversation.com/technology-facilitated-abuse-the-new-breed-of-domestic-violence-74683>.





11 ReCharge: Women’s Technology Safety, Legal Resources, Research and Training 6.

12 Campo and Tayton (n 8) 2.

13 Ibid.

14 Ibid 4.

15 Ibid.

16 Ibid 5.

17 Ibid.

18 Ibid 6.

19 Trish Munday, ‘Recruiting and retaining lawyers: A problem in rural, regional and remote communities’ (2009) 34(1) Alternative Law Journal 32.


20 Campo and Tayton (n 8) 8.

21 Ibid.




22 Karen, ‘A New App Helps Domestic Violence Victims Collect the Evidence Needed to Charge Their Abusers’, A Mighty Girl (Blog, 1 September 2019) <https://www.amightygirl.com/blog?p=26289>.


24 Australian Law Reform Commission, Family Violence – A National Legal Response (Report No 114, October 2010) 833.

25 Ibid.

26 Ibid.

27 1800RESPECT, ‘Daisy app’, Help and Support (Web Page) <https://www.1800respect.org.au/daisy/>.

28 Janet Burns, ‘This Free App Is Helping Women Tackle Domestic Violence in Iran and Worldwide’, Forbes (online, 2 November 2017) <https://www.forbes.com/sites/janetwburns/2017/11/02/this-free-app-is-helping-women-tackle-domestic-violence-iran-and-worldwide/#34a294e0493c>.

29 Ibid.

30 Ibid.

31 Michael Cain and Suzie Forell, Recruitment and Retention of Lawyers in Regional, Rural and Remote New South Wales (Summary Report No 13, September 2010) 1.

32 Ibid.

33 Women’s Legal Service NSW, A Practitioner’s Guide to Domestic Violence Law in NSW (Practitioner’s Guide, August 2018) 11−12.

34 Ibid 15.

35 Jnana Settle, ‘Predictive Analytics in the Legal Industry: 10 Companies to Know in 2018’, Disruptor Daily (online, 29 January 2018) <https://www.disruptordaily.com/predictive-analytics-legal-industry-10-companies-know-2018>.

36 Annie Guest, ‘Funeral for Hannah Clarke and children held in Brisbane’, ABC (online, 9 March 2020) <https://www.abc.net.au/radio/programs/worldtoday/funeral-for-hannah-clark-and-children-held-in-brisbane/12038128>.



The Full Bench Vol. 1 2020 - The Tech Revolution  
