

ETCETERA
LAW IN THE DIGITAL AGE

INTRODUCTION
Et Cetera is the flagship publication of the Deakin Law Students' Society (DLSS). It seeks to reflect the Deakin Law School zeitgeist of the time and resonates with the interests of Deakin Law students. It aims to provoke thoughtful discussion on issues relevant to our readers as students, future legal professionals and citizens of the world. The discourse on intersectionality in the law is profoundly significant and demands continuous engagement. We encourage you to delve into these insightful perspectives and reflect on how you can enhance your own understanding and advocacy.
EDITORIAL
Priyanka Sharma (Editor in Chief)
Molly Howie (Editor)
Sophia Qureshi (Design)
Diya Matthew (Design)
CONTRIBUTORS
Anna Araneta
Bridget Barnett
Deidre Missingham
Grant Cowan
Henry Jones
Katherine Boyles
Kelly Seal
Lachlan Ahale
Maria O'Sullivan
Neerav Srivastava
Robert McIntyre
We extend our deepest gratitude to Gadens, whose generous sponsorship and continued support of the DLSS have made this publication possible.

ACKNOWLEDGMENT OF COUNTRY
We acknowledge the Traditional Owners of the land on which the DLSS is founded, the Wurundjeri People of the Kulin Nation. We recognise their continued connection to the land and waters. We pay our respects to Wurundjeri Elders past, present, and emerging, and extend this respect to Aboriginal and Torres Strait Islander Elders and people from other communities.
ABOUT THE DEAKIN LAW STUDENTS' SOCIETY
The DLSS is one of Deakin University’s oldest and largest student societies. We are a student-run organisation which aims to assist Deakin law students in making the most of their time at law school. Across our portfolios, we work to provide a range of events and services to assist you at every stage of your degree. Whether you want to improve your grades or make new friends, the DLSS is your one-stop shop for all things law at Deakin. For more information you can find the DLSS on Instagram, Facebook, TikTok, LinkedIn, or via our website.
DISCLAIMER
This publication is provided free of charge by the Deakin Law Students’ Society. Any opinions expressed in this publication are not to be held as those of the DLSS, Deakin Law School or Deakin University. The DLSS, Deakin Law School and Deakin University do not necessarily endorse these opinions; they belong solely to the authors.
COPYRIGHT
This publication is subject to copyright. Except where permitted under the Copyright Act, no part of this publication may, in any form or by any means (electronic or otherwise), be reproduced or stored in a retrieval system or transmitted by any process without prior written consent from the DLSS.
PRIYANKA SHARMA
2025 Director of Communications
In every generation, the law is called upon to meet the challenges of its time. Today, that challenge is the digital age: an era in which the pace of technological change outstrips the speed of legislative reform, and where questions of privacy, security, ethics, and equity demand urgent, thoughtful answers.
Law in the Digital Age is not merely a collection of articles; it is a conversation. It brings together the voices of legal practitioners, academics, technologists and students, each offering a unique perspective on how the digital revolution is reshaping our profession, our institutions, and our shared concept of justice. Every author and contributor whose thoughtful work has made this edition possible has helped create something truly meaningful, and for that I sincerely thank them.
For students and early-career lawyers, this issue offers insight into the skills, adaptability, and ethical grounding that will define the next generation of the profession.
For experienced practitioners, it provides a moment to reflect on how far we have come, and how far we still have to go.
As I conclude my tenure as Communications Director of the Deakin Law Students’ Society, I feel deeply grateful for the opportunity to lead this project. I extend my heartfelt thanks to my officers, Diya Matthew, Molly Howie and Sophia Qureshi, whose talent, dedication, and unwavering diligence brought these pages to life. Their creativity, attention to detail, and commitment to excellence have shaped this publication into something we can all be proud of.
I also extend my sincere gratitude to our sponsor, Gadens, whose generosity and commitment to fostering dialogue on technology and the law brought this vision to life.
It is my hope that these pages spark dialogue, inspire reflection, and encourage each reader to play their part in shaping a future where the law and technology advance together in harmony.
2025 Communications Officers
Law in the Digital Age is a topic at the heart of what law students face as they prepare to navigate the future of legal practice.
In this issue of Et Cetera, we explore different perspectives on how technology is reshaping the law, a subject that grows more relevant every day as digital tools become integral to the legal profession.
We are incredibly privileged to present you with eleven articles exploring the intricacies of law in the digital age written by experienced practitioners, passionate advocates and insightful students about their experiences and perspectives on the many facets of digital law. We seek to offer students the opportunity to understand the nuances of studying and working in the law in such a rapidly changing digital world.
This issue stands as a testament to the unwavering effort, commitment and dedication of everyone involved. Each article, feature and detail has been curated and considered to inform and engage our community.
As we conclude this year’s edition of Et Cetera, we would like to extend our heartfelt gratitude to our Director, Priyanka Sharma, for her visionary leadership and unwavering support throughout this journey. Her guidance and dedication have been instrumental in shaping this issue and bringing it to life.
We also thank our contributors, readers, and supporters for your continued engagement. Your curiosity and passion drive the conversations we aim to foster through Et Cetera. This edition reflects our ongoing effort to understand and interpret the shifting contours of the world we live in. We hope it sparks new ideas, meaningful discussions, and fresh perspectives. As always, we welcome your thoughts and look forward to sharing more with you in future editions.
Molly Howie
Diya Matthew
Sophia Qureshi
15 NAVIGATING LEGAL PRACTICE IN THE AGE OF MODERN TECHNOLOGY
Henry Jones
18 PRIVACY, CYBER AND AI: HOW TECHNOLOGY IS SHAPING LEGAL PRACTICE
Katherine Boyles
21 O BRAVE NEW WORLD? LAW IN THE DIGITAL AGE
Deidre Missingham
25 CRIMINAL LAW AT A CROSSROADS: EMBRACING AI IN 2025
Bridget Barnett
27 IN CONVERSATION WITH KELLY SEAL
29 IN CONVERSATION WITH MARIA O’SULLIVAN
LAW IN FLUX: NAVIGATING LEGAL PRACTICE IN A DIGITALLY DISRUPTED WORLD
Legal practice is undergoing a profound transformation, as emerging technologies and the continual demand ‘to upgrade’ reshape the way we communicate, work, and interact in and beyond our profession. The law can often be viewed as (and arguably is) something quite set in its ways, yet this is increasingly at odds with technological advancement and societal expectations. This is particularly evident as lawyers are being called to maintain and gain perspective beyond being solely legal advisors – to be recognised now as holistic individuals who prize ethical stewardship, strategic collaboration and innovation alongside commercial acumen.
As a graduate lawyer-in-training at Gadens entering this dynamic field, I recognise the opportunity not just to learn, but to actively contribute, realising that this progress in the digital world is naturally aligned with the next generation of legal practitioners (regardless of age) emerging. Excitingly, Gadens’ leading tech and privacy law practice within its Intellectual Property & Technology (IPT) team is consistently operating on the forefront of law in the digital age, collaboratively drawing on the expertise of its diverse team of partners, lawyers, clients and more.
Today, lawyers exist and work in a landscape where generative AI, automation, and data analytics are no longer fringe tools but integral to daily practice. Platforms like Co-Pilot, Lexis+ AI, ChatGPT and even firm-commissioned programs are revolutionising legal research, drafting, and client engagement. This is enabling legal practitioners to deliver faster and even more personalised services, augmenting their capabilities and freeing up time for deeper analysis and more meaningful client relationships. However, the unreliability of generative AI and similar tools still requires substantial oversight and review, as their efficiency can sometimes be coupled with an overly generalised or out-of-context approach.
By Robert McIntyre
A remarkable positive from this skills transition is the mindset that comes along with it. Lawyers are increasingly expected to navigate complex ethical terrain, from data privacy and algorithmic bias to the responsible and reliable use of generative AI, as mentioned. This demands a renewed focus on continuous learning, training, adaptability, and collaboration – reaffirming the equal need to embrace both generative AI and new technology, and diversity and inclusion. I have found Gadens’ focus on this, both generally as a new graduate and specifically as one rotating within the IPT Team, quite motivating, as I have gained new knowledge as fast as I have been able to utilise it.
As in the commercial world, the public interest world of Pro Bono law has also seen legal professionals embracing digital innovation to bolster the delivery of services, promote wellbeing initiatives and help drive systemic reform. Yet now even more broadly, clients across all fields are redefining expectations of their legal practitioners, seeking out transparency, empathy and inclusivity more than ever. This requires their lawyers to bring a more prevalent balance of technical expertise, emotional intelligence, cultural awareness and a commitment to justice. Gadens’ leading and strong commitment to Sustainability and Social Impact, within and beyond its IPT Practice, is a testament to the firm’s cutting-edge approach to maintaining relevance and engaging with diverse perspectives. Having already worked on various commercial and Pro Bono matters, it is compelling to see how the benefits of embracing this evolving digital world flow when working within an environment conscientious of its risks and potential.
DIGITISATION IN THE LEGAL INDUSTRY
By Anna Araneta
Principal Solicitor at Denise Dwyer Lawyers
Gone are the days when a junior lawyer would physically attend a conveyancing settlement. It was only about 10 years ago that settlements actually took place at a settlement location, usually in the CBD, where three or more parties acting for the Vendor, Purchaser, Incoming Mortgagee and Outgoing Mortgagee would all be in the same room at the same time to exchange documents and bank cheques. And worse, if it did not happen within the time slot allocated for the settlement, usually only a 15-minute increment, then your settlement would fail! That would mean the Purchaser would not be able to move into their new house, unless their solicitor was quick enough to secure an urgent licence agreement until the next settlement day. Not only was it inconvenient for the Purchaser; of course, it was also inconvenient for the Vendor, who was waiting for their sale funds. Not to mention the fact that oftentimes a simultaneous settlement would happen on the Vendor’s side as well, as they would normally be selling and purchasing at the same time.
I’m glad those days are over.
These days, settlements take place electronically using the electronic platform PEXA. This means that prior to the scheduled settlement, all parties have ample opportunity to ensure that all their documents are completed accurately and all funds are accounted for exactly, all in the virtual world in virtual timing. Hence, failed settlements hardly ever occur, unless of course there is some drastic circumstance outside the parties’ control.
When I started practice in 1999, as a junior lawyer, settlements were usually a very stressful event. I had to ensure that all documents such as the Transfer, Duties Form, and Notice of Acquisition or Disposition were done properly, and all bank cheques had to be made out down to the last cent. So if you made a mistake and had instructed your client to get a bank cheque made payable to South East Water for $235.95 when it was supposed to be for $253.59, for example, you risked settlement failing. And sometimes it was not entirely your fault, as such charges could very well change up to the very day of settlement. Another example is if you had not completed your State Revenue Office form for stamp duty properly; this would often mean a failed settlement also. Luckily, most settlements went through successfully thanks to the efficiency of practitioners in the field, be they lawyers or conveyancers.
It was always music to my ears to receive that dreaded phone call from the Firm’s city agent (if I was not able to attend settlement) to confirm that settlement was completed. It was equally satisfying to report to my client that they could move into their new house that day.
Another change due to digitisation is in lodging land titles documents such as Transfers, Caveats, Mortgages and the like. In those days, if you acted for a family law client whose name was not on the Title and you wished to secure a Caveat as soon as possible, you instructed your city agent to do so by sending them your hard copy caveat as well as a bank cheque or your firm’s cheque for the filing fee. Then, after negotiating a settlement, if your client was successful enough to secure the family home, the transfer into their name meant another trip by your city agent to the State Revenue Office to have your duties form stamped non-dutiable, and to the Land Titles Office to transfer the title into your client’s name.
In those days, a city settlement agent was the Firm’s best buddy. You could only hope that your agent would ensure that settlement took place by insisting to all parties involved that a $0.50 difference on a bank cheque should not cause a settlement to fail. With the advent of technology, unfortunately, the first and obvious casualties were those city agents. It very quickly became evident that there was no need to engage them, as settlements became electronic and most documents were able to be lodged online, all using the electronic platform PEXA. As much as it is unfortunate for those businesses to have vanished, at the same time, digitisation means efficiency is easily secured for all practitioners involved.
In the area of Probate, it was only about 5 years ago that documents were lodged manually, either by filing in person at the Probate Office or by sending all your documents by mail. And when your documents were approved, you actually received a Probate parchment! Then it became possible to lodge your advertisement (intention to apply for Probate) online using the Probate Online and Advertising System. You would still send all your other documents, such as the Affidavit, Inventory of assets and liabilities, and all your exhibits, via mail. Nowadays, your entire probate application is prepared and submitted online, except for the original will and a copy of the originating motion, which must be submitted via mail. No doubt it will only be a matter of time before original Wills are accepted online.
Finally, in the area of briefing counsel for Court, gone are the days when you would physically hand counsel your brief, meaning your huge lever arch file or two, via their clerk a few days prior to the hearing. These days, you can easily upload all your documents via electronic platforms such as e-brief.
So, as a practitioner who has been in the game for almost 30 years, what is my take on digitisation? I am all for it. As much as it was daunting at first to learn PEXA, Duties Online, RedCrest and the like, these days it is a must. And I am sure that in the foreseeable future there will be more changes along the lines of digitisation to enhance the way we practise.
ONLINE STUDY, DIGITAL FUTURE
By Grant Cowan | JD Student at Deakin University
Someone once said, 'You should study law', and that quick remark began my long legal journey.
As a Juris Doctor student, I am a mature-aged candidate at 43 years old. This master's-level qualification builds upon a prior undergraduate degree, in my case a Bachelor of Fine Arts. By the time you get to a second university stint, you usually have a different lifestyle and set of priorities than your first. I used to be invariably single or coupled, driven and committed to my career progression, eager to explore the world and my prospects; I had time. I am now a partner, parent, employee and a student again; however, I am lacking in time.
Why should I tell you this?
Well, this will be the background of most of the Juris Doctor cohort (JDs from here on) at any university. Of course, there is more to the story: all the extra personal circumstances that come along with each person's own mid-life career pivot and development. The reasons could be relocation or immigration, aspirations for promotion, following long-lost dreams, or simply challenging oneself anew, but underlying them all is empowerment.
How is this possible? Digital technology, to a large degree these days. Therefore, it deserves acknowledgement and some analysis. Online study options facilitate the JD segment of the law profession, which in the past may otherwise not have been populated with such diverse students. JD online students come from a range of backgrounds. Online study allows parents or carers, those in remote regions, those who travel, or those with physical or social issues to participate.
The list is long when it comes to describing why a student chooses online study, but regardless, they can do it nowadays. Suffice it to say, it breeds variety.
So, this is good, right? But it can also come at a cost to one's mental wellbeing. Moreover, what is thrown in by our legal discipline is that grey, knotty, competitive practice that disappointingly rates low on wellbeing metrics, which is a prominent issue.
An article in the Journal of Open, Distance, and e-Learning identified that the delivery mode, rather than the actual content, is the significant barrier and challenge to wellbeing.1 Now, let's talk about the pros and cons of this.
In 2025, the interplay of AI, online learning platforms, social media, access-to-justice tools and commodified legal research (Lexis, Westlaw, Practical Law and their newer competitors) provides enormous advantages. I can’t imagine the archaic and laborious systems of learning, described sometimes by our dear professors and lecturers, that we would have been required to attend before the digital age! Democracy of information means we can all know more if we want to. It is not locked away, or out of reach in the way it once was. Automated case-law search, citation management, and even first-draft contracts are available to students who once depended on long mornings in a physical law library. For a mature-aged student, this is precious time saved to commit to other responsibilities. Legal tech subscriptions provided by our universities allow us to learn from the same research tools that firms use, reducing the shock when we enter practice. Deakin was highly regarded years ago as a ‘lonely’ leader in providing online learning, and now, years later, I believe it was ahead of the curve.
Nevertheless, there are trade-offs. We don’t have a physical student ‘body’ and its personal benefits of camaraderie: the chance to chat informally and digest together what we’ve learned from the same lecture. Commiserate, masticate and argue! Often the same information is interpreted quite differently by different minds. When I sign off from a lecture, I sometimes sit befuddled or stunned and want to turn to my mate and say, “Whaaa?” But alas, I’m alone. The moral support or guidance you could gain from those moments, where students tease out a tricky legal principle by sharing their frustrations together over coffee, is hard to replicate at home by yourself.
An unnatural digital culture with persistent and pervasive technology can produce an ‘always on’ mentality, fuelling isolation, loneliness and imposter syndrome. Law as a discipline and profession is competitive, and this, translated into leaderboard metrics and commenting threads, will intensify anxiety.
Access-to-justice technologies are another double-edged sword. There is the temptation to rely on tech tools, like Large Language Models, which may save some time but stop students and practitioners from slowly processing very complex information, which is a real legal skill. Chatbot assistance and self-help portals expand the reach of help to underserved communities, but they risk deskilling, reducing lower-level practice and shifting client expectations. This was the space where new graduates could refine their learning and build their skills, but also engage with real people with solvable problems, which is both satisfying and edifying. Cheap law, like fast food, isn’t really very good for you or the world. It is prone to being overused, and may mean we as practitioners won’t learn the therapeutic value of practice, which is where we train our technical fluency to be balanced with ethical judgement and interpersonal skills. The positivity within humbleness and kindness neutralises the negatives of competition and the legal adversarial system.
Lastly, online learning as a mode of delivery, which differs from traditional campus learning in the ways described in this article, may also affect the way employers view graduates as potential employee candidates when comparing the two side by side. An article by Conor Lennon, Assistant Professor at the University of Louisville, suggests there is a limiting effect so far.2
So, we’ve got online-supported graduates who have been trained with superior tools, who need to know how to balance tech with real-world personal negotiations, and who should use their respective backgrounds to counteract any negative connotations about their online student status, broaden the quality of the legal profession, and not fall into traps of mental wellbeing decline.
That’s a lot! Where is it heading? The workplace.
The workplace is changing, and many professionals expect working options and flexibility that reflect all the other modern conveniences we enjoy today. I’m one of them. Employers who embrace and accommodate this will probably attract talent and skill. Versatility is a modern benefit that I believe employers can offer their employees. Yet this raises questions about supervision and professional socialisation, where juniors can be siloed and left alone. We are back to the problem of online learning: that of being alone. Wellbeing skills learned in study will be required in the job. Get good at them now. Those employers that offer enough socialisation with participatory flexibility will gain assiduous and bright legal practitioners.
So where is the legal profession heading specifically? I’m cautiously optimistic. I expect a division or branching to occur: some legal work will further commodify and automate, handled by platforms and AI-assisted practitioners, while other work, such as strategic litigation, complex transactional advising, and client relationships requiring deep empathy, will become premium services that emphasise human interaction and judgment. Law graduates who combine strong analytical foundations, ethical literacy and technological fluency will be best placed. Importantly, mental wellbeing and flexible employment arrangements will be central to sustainable careers; those employers or areas that don’t adapt will lose candidates who can’t justify outdated modes of work or extreme outputs when seeking meaningful and gainful employment.

Technology is transforming the legal profession, and Neerav Srivastava, a lecturer at Deakin Law School specialising in law and technology, is at the forefront of observing the evolving relationship between law and digital tools. His research focuses on digital technology and often crosses disciplinary boundaries. With numerous publications in respected journals, citation in cases and multiple awards, Neerav offers a thoughtful and informed perspective on the role of law in today’s digital world.
He has written extensively on law and technology topics, including the emerging challenges around liability for chatbots when a professional bot is negligent.1 Notably, Neerav has observed that digital technologies are quickly becoming central to professions. However, he adds that the legal profession has been slower to adopt these technologies than other fields.
“There is a sea change happening in the legal profession,” he says, “but compared to other sectors, the law has been relatively slow to adapt.” Despite this slower start, Neerav believes that change is inevitable and will accelerate over time.
“A tsunami wave is about to hit the legal profession. When I say tsunami, I mean an unstoppable force. It’s going to happen whether you want it or not. Pressure will come from clients to reduce costs, forcing legal services to be more competitive. Law firms will have to adapt or be left behind.”
A critical point Neerav raises is that much of the development in legal technology has been driven by technologists rather than lawyers. “I suspect, just my view, that a lot of this has been driven by [technologists] rather than lawyers, when what you really need is lawyers and [technologists] working together.” Without lawyer involvement, the metadata for legal content is not as rich as it could be, which limits how effectively it can be used by software. But he notes this is changing: “The technology officer is getting integrated more into the law firm. Lawyers are starting to realise that the richness of the metadata leads to better results.”
1 Srivastava, N, ‘Liability for Chatbots: A Psychbot Negligence Case Study and the Need for Reasonable Human Oversight’ (2023) 28 Torts Law Journal 155.
With new technology comes new professional risks. Neerav highlights that relying on AI tools like ChatGPT without proper human oversight exposes lawyers to clear liability. “If you’re not exercising oversight, consequences will follow, so this must be fully integrated into a law firm’s due diligence,” he explains. Recent cases where lawyers submitted AI-generated legal summaries in court without sufficient vetting underscore these dangers. “The use of technology should lead to higher standards and efficiency, and not the opposite. Health technology may lead to better diagnosis. Due diligence exercises will be made more efficient. But when you have psychbots failing to identify a disclosure of child abuse, there is a risk of diluting standards.”2
He emphasises, “The issue is less about using, say, ChatGPT and more about the failure to supervise it.” Professional standards require thorough review of any AI-generated work before it reaches clients or courts. This level of scrutiny, he says, should be routine and is essentially a law firm exercising traditional due diligence.
When asked about the specific risks of using chatbots in legal and educational contexts, Neerav distinguishes between tools used by a lawyer in doing their work, like research platforms, and those interacting directly with clients. The latter, he says, can erode professional control and raise liability risks if the bot is unsupervised and gives incorrect or misleading advice. “The importance of human oversight, when there’s a high level of risk involved, is central to professional responsibility.”
Regarding whether Australia’s legal framework is prepared for these challenges, Neerav believes the common law can provide a gross answer, provided lawyers can clearly identify and articulate the “wrong” in new situations. He contends that courts can adapt old doctrines to new contexts. This “robust incrementalism,” as he characterises it, allows the law to evolve. He notes that while common law is not as sophisticated or granular as regulation, it provides remedies.
2 Danielle Kutchel, ‘Friend or Foe? Psychbots and the Law’ (Feature, Law Society Journal Online, 11 May 2023).
Looking toward the courtroom, the prospect of AI-generated evidence being accepted in court raises significant questions. Neerav cites the U.S. Loomis case, where algorithmic recidivism risk assessments were used in sentencing.3 The problem, he explains, is that such algorithms often operate as “black boxes,” withholding their inner workings due to proprietary claims. This lack of transparency undermines due process because defendants cannot challenge how a machine arrived at its conclusions.
“It sounds like a problem,” Neerav admits. Although these tools might be helpful, “they need to be transparent.” Moreover, there are serious concerns about fairness. He notes that in Loomis, the individual was sentenced for risks related to physical offences despite only being convicted of a non-physical one.4 This overreach signals how AI use in legal contexts must be carefully managed to uphold justice and fairness.
Neerav holds the indispensability of human oversight in legal AI applications as a central concept moving forward: “Human oversight is important, but it should be commensurate with the risk.”
He also admits there may be legal domains where AI can operate more independently: “I think it can, but I think what will happen is we’ll stop talking about them as legal areas, or in quite the same way.” Neerav believes AI may take on more mundane tasks traditionally done by lawyers, but the profession will evolve alongside these changes. “Imagine if young lawyers could be saved from weeks of discovery work!”
He adds that “Technology will also create plenty of opportunities for lawyers in areas such as privacy, specialist contracting regulation and AI liability. For example, recent assignments I have set have been on things like smart legal contracts, Facebook liability for ads targeting insecure youngsters, and whether Amazon is legally an agent of sellers. It’s been really nice to see Deakin students respond with enthusiasm and sophistication to novel issues.”
The path forward, he suggests, lies in embracing new opportunity, collaboration between lawyers and technologists, robust human oversight commensurate with the risks, and an adaptable legal framework that balances innovation with accountability. It’s not just about using AI, but about reimagining legal practice itself for the digital age.
3 'State v Loomis: Wisconsin Supreme Court Requires Warning Before Use of Algorithmic Risk Assessments in Sentencing' (2017) 130 Harvard Law Review 1530.
4 Ibid


Before starting as a graduate, I spent years as a paralegal learning how to conduct legal research the ‘traditional’ way, armed with Boolean connectors, filters, and a long list of keywords I’d brainstormed to make sure I caught all the variations and synonyms. Fast forward to now, halfway through my graduate year, and I can’t believe how much legal research has changed for the better. These days, my starting point often isn’t a list of keywords at all. Instead, it’s a question posed to an AI-enabled research platform that can interpret plain language, identify relevant authorities, and draw surprising connections that I might have missed.
I expected to spend most of my graduate year researching, reviewing documents, and performing other classic graduate tasks. I didn’t expect to be test-driving new AI platforms alongside that work. I’ve been extremely lucky that in my role I have had the incredible opportunity to be involved in pilot programs for tools that are shaping the way in which the legal industry works.
To be very clear, legal technology is not replacing lawyers. These tools aren’t replacing the human element, but they are changing where the human element is applied and refining the skills that firms are looking for in their graduates.
With AI and other legal technology reshaping the ways in which we work, future graduates will focus more time on delivering value in inherently human ways: solving problems in innovative and creative ways, applying an empathetic lens to our work, and understanding the strategic and commercial landscape in which our clients operate.
So, the question for law students isn’t whether AI will take your job, but rather how you can prepare to be the kind of lawyer that thrives alongside AI.
1. Build your digital literacy
If you’re still studying, this is the moment to start building these skills. Digital literacy isn’t just about being able to use legal technology, but knowing how to get the best from it. A simple maxim effectively summarises this point: better inputs produce better outputs. The quality of the questions you ask, the prompts you craft, and the context you give will directly shape the quality of the results you get. That’s a skill you can develop now. Experiment with AI-powered research tools if you can get your hands on them, play around with ChatGPT, Microsoft Copilot, and other AI platforms, and keep an eye on LinkedIn and other channels to see what the leaders of the legal technology drive are doing. By doing so, you will be ready for whatever legal technology might be waiting for you on day one of your graduate year.
2. Stay curious and adaptable
Technology will keep changing, but your ability to keep learning will be the constant that keeps you relevant and valuable. I’ve seen first-hand the major role that graduates and junior lawyers play in ‘reverse mentoring’ senior lawyers and partners, helping them get the most from these tools without being patronising. In an industry where efficiency and timelines are at a premium, that contribution is noticed, and your curiosity and adaptability will drive your ability to embrace new technology and teach others.
One of the best ways to build this adaptability is to deliberately seek out challenges (whether or not they involve technology) that push you beyond your comfort zone. The more you practise picking things up quickly in unfamiliar situations, the more resilient and adaptable you’ll be when faced with rapidly developing legal technology. You can also sharpen your creativity skills by putting AI tools to the test. Experiment with different prompts, compare the outputs and keep refining your approach, always endeavouring to push the limits to generate better outputs. By experimenting with AI at an early stage, you will better know how to leverage it in new and exciting ways to solve novel problems. This kind of hands-on practice will build confidence in your own abilities and make you invaluable to senior lawyers and partners.

3. Focus on the human skills that tech can’t replace
If you take one point away from this piece, make it this: your value won’t lie solely in your ability to use the tech, but in the ways you can harness it to innovate and supplement your own capabilities. What will set you apart is how you apply technology to solve problems in creative, practical, and commercially sound ways. It’s one thing to extract the key clauses from a 100-page contract in seconds, but it’s an entirely different task to interpret the next steps for your client in light of their commercial goals and appetite for risk, and to do so ethically. While technology can help us quickly get the relevant information and materials, our clients still rely on us to turn it into meaningful and actionable insights and recommendations.
You can start building those skills now. Read recent judgments and business news and practise viewing them through a commercial lens by thinking about how you might advise different clients in different industries or with different risk appetites. Crucially, never underestimate the importance of empathy as a professional skill, and make it a habit to pause and consciously put yourself in someone else’s shoes to understand how different people might approach the same situation.
New legal technology is already changing the profession. The best thing you can do now is embrace it, not just as a set of tools, but as a way to amplify your skills, deepen your understanding, and make space for the kind of work that likely drew you to pursuing a career in law in the first place. If you can learn to pair the precision and efficiency of technology with empathy, commercial acumen, and creativity, you won’t just survive in the digital age: you’ll lead it.
NAVIGATING LEGAL PRACTICE IN THE AGE OF MODERN TECHNOLOGY
By Henry Jones | Student at Deakin University
The advent of modern technology has facilitated significant change within the legal industry. The competitive and dynamic nature of social media and the online world, and the associated stress to produce relevant marketing and promotional material, can lead some lawyers to fail to uphold their obligations. While modern technology has helped to reduce the stress of legal research, legal practitioners must ensure that they use it appropriately, meeting the standards set by the Solicitors’ Conduct Rules. Ultimately, technology has increased the pace of the legal industry, which can make it more difficult to uphold ethical obligations.
The stressful and competitive nature of social media marketing can impact the upholding of ethical obligations in the areas of practice management and confidentiality, particularly given the fast pace of the online world. With the rise of technology, many lawyers in small firms and those with their own practices run their own social media marketing campaigns. In fact, the use of social media in marketing is largely seen as a must in order to grow a client base. This presents a range of new stresses that lawyers must deal with. While lawyers being involved with marketing is not a new phenomenon, the fast-paced and dynamic nature of social media, including “trends”, can lead practices to feel pressured to stay constantly up to date with the online world. This fast pace and the associated pressure can give lawyers less time to be considered in their marketing, and thus make them more likely to breach an ethical obligation. With some, albeit conflicting, evidence that social media use can have negative effects, the online world creates a newfound source of stress for lawyers. 2 3
Practitioners must be careful, then, to uphold their ethical obligations, especially in ensuring that their promotional material is not misleading or deceptive, or could be perceived as offensive. In addition, r 36.2 states that a solicitor must not brand themselves as being an “accredited specialist” or any other derivatives of those words, unless they have been accredited by the relevant body.
1 Ibid.
2 Mentkowski, A. (2015). Law Firm Marketing in the Age of Social Media: A Toolbox for Attorneys. N. Ill. U. L. Rev., 36, 1.
3 Berryman, C., Ferguson, C. J., & Negy, C. (2018). Social media use and mental health among young adults. Psychiatric Quarterly, 89, 307–314.
Practitioners must follow this rule when creating marketing material, particularly online where a lawyer might be trying to encourage “clicks” on their website or social media account. Further, when creating online marketing and promotional material, practitioners must be careful to protect the confidentiality of clients.
Additionally, lawyers must be conscious of the type of content being posted to social media. Under r 36.1 of the Solicitors’ Conduct Rules, a solicitor or principal of a law practice has certain obligations regarding the advertising they do. Solicitors must ensure that they do not create false, misleading or deceptive, offensive or prohibited content as part of marketing and promotion ‘in connection with the solicitor or law practice’. In social media marketing, particularly given the rise of short clips in the form of “TikToks” or “Reels”, it can be difficult to capture the nuance of law and legal practice.
Modern technology has assisted in reducing some stresses for lawyers, particularly in the area of legal research. However, while it may help to streamline legal research, technology, especially Artificial Intelligence (AI), must be used appropriately to avoid breaches of a lawyer's ethical obligations. The introduction of the internet has made research easier and faster for lawyers. Where a lawyer once had to trawl through legal books to find a case, or a point of law, the internet now offers a faster alternative. Legal databases such as AustLII and Westlaw make it far easier for lawyers to find the law that they need.
Under r 9.1 of the Solicitors’ Conduct Rules, concerning confidentiality, a lawyer ‘must not disclose any information which is confidential to a client and acquired by the solicitor during the client’s engagement to any person, outside of staff at the solicitor's office, or others engaged in administering legal services to that client.’ While exceptions to this confidentiality principle are stipulated under r 9.2, lawyers must be careful when creating promotional material that they maintain the confidentiality of their clients. For example, a practitioner might create a promotional video detailing a legal issue faced by a client and how they dealt with it, to encourage others with similar problems to engage that practitioner. In making such content, a practitioner must ensure that they do not share information with their audience that was disclosed to them by a client and which is confidential.
4 Solicitors’ Conduct Rules (n 2) r 36.1.
5 Ibid rs 36.1.1-36.1.4.
6 Hamilton, J. R. (1971). Computer-assisted legal research. Or. L. Rev., 51, 665.
7 Solicitors’ Conduct Rules r 9.1.
8 Ibid r 9.2.
To do so would be to breach their ethical obligation under r 9 of the Solicitors’ Conduct Rules. Ultimately, the rise of online marketing, particularly social media marketing, has created new opportunities for practitioners to expand their client bases. However, it also creates new problems for lawyers in regard to their ethical obligations. In particular, the pace of the online world, and its associated stress and chaos, can leave a solicitor or practice manager less time to consider their marketing approaches, and thus increase their chances of making an ethical mistake.
Specific cases can be found through a “search” function, while the use of key words and key numbers can help lawyers to find primary and secondary sources within the areas that they need. With these developments, legal research has become faster and more efficient. Indeed, the ability to conduct research in a more streamlined manner allows lawyers to respond to clients faster while maintaining a high standard of work.
In doing so, lawyers are better positioned to uphold their ethical obligations of promptness, diligence and competence under r 4.1.3 of the Solicitors’ Conduct Rules. AI is being embraced by legal practitioners, contrary to the initial reluctance when the computer first became a part of legal practice. However, this technology places new pressure on lawyers to adhere to their ethical obligations. Using AI without appropriate oversight shows a lack of diligence and competency by a lawyer, leaving them at risk of breaching r 4.1.3 of the Solicitors’ Conduct Rules. Ultimately, while the rise of modern technology has helped to alleviate some of the stresses associated with research in the legal profession, such technology must be used appropriately, and with oversight by lawyers, to ensure that ethical obligations are followed.
The competitive nature of social media and the pressure to create effective and current marketing can cause some lawyers to neglect their obligations surrounding advertising and confidentiality. While technology has eased legal research, it must be used properly to meet ethical standards. Ultimately, the fast pace of technology can lead to increased stress, and make it more difficult to comply with ethical obligations under the Solicitors’ Conduct Rules.
9 Ibid r 9.
10 Jones, J. M. (2009). Not just key numbers and keywords anymore: How user interface design affects legal research. Law Libr. J., 101, 7.
11 Solicitors’ Conduct Rules r 4.1.3.
12 Computer-assisted legal research (n 6).
13 Solicitors’ Conduct Rules r 4.1.3.
14 Solicitors’ Conduct Rules r 4.1.3.

By Katherine Boyles, Legal Advisor – Privacy and Regulatory at Bupa
If you are a law student or junior lawyer with an interest in how technology and data are shaping business, how the geopolitical environment influences risks in your clients’ supply chains, how security weaknesses can lead to a large data breach, if you are good in a crisis, or if you like making the occasional Skynet joke, then legal practice in Privacy, Cyber or Artificial Intelligence (AI) could be areas that you enjoy. This article sets out my observations about these practice areas and how modern technology is shaping them.
What is privacy law and why is it important for new technologies? Privacy laws set out the requirements for how organisations (from private companies and NFPs to government agencies) collect, use and disclose personal information about people, send personal information overseas, protect personal information, and respond to data breaches. This includes consideration of Commonwealth, State and Territory privacy laws, depending on the client and circumstances. It can also include surveillance laws in some cases.
Australia’s privacy laws are being uplifted, such as through the 2024 reforms to the Commonwealth Privacy Act, with more reforms anticipated later this year or early next. Even so, they are generally considered weaker than some other privacy regimes around the world, such as the EU’s General Data Protection Regulation (GDPR). Like many countries, Australia is looking to raise its bar to align more closely with the GDPR.
Fundamental to privacy is tying the work back to the individuals who use the technology, or who are affected by the project, such as customers or employees. In this area, you can make a big impact on customer safety and brand trust by building in good privacy practices from the start, or along the way where a client is moving at pace, while enabling those individuals to benefit from the use or implementation of exciting new technologies.
Some new technologies will require you to determine whether the data involved even fits the definition of personal information, and therefore what sort of privacy protections, notices and consent your client may need in place. You will get to work with and learn from technical experts such as Data Governance and Privacy Engineers, learn about how systems and new technologies operate, and influence how privacy compliance is built into those systems and technologies.
What is cyber law and why is it an exciting, fast moving practice?
Cyber law is a rapidly growing area of practice with many applicable laws, depending on the nature of the client and the cyber issues they are looking at. For example, incident response can include, depending on the client and incident, consideration of incident reporting obligations under the Privacy Act 1988 (Cth), some State or Territory privacy acts, the Security of Critical Infrastructure Act 2018 (Cth), APRA’s Prudential Standards CPS 230 and 234, and the new Cyber Security Act 2024 (Cth). Cyber law also involves incident preparation, simulations and recovery, cyber insurance, class actions, regulator engagement, advising boards and executives on their obligations for cyber resilience, how new technology can be implemented in accordance with legal obligations, third party supply chain risk mitigation, and analysing case law to advise clients on any cyber security implications for their business. Some of these areas overlap with privacy.
One of the most rewarding aspects of this area is working with, and learning from, technical experts such as Digital Forensics professionals, Threat Hunting Analysts, Security Engineers, Third Party Security specialists, up to Chief Information Security Officers. There are also opportunities to learn about how vulnerabilities in software might affect the security of a platform, or how cyber criminals are using technology to make cyberattacks more effective.
Legal practice and AI governance
With the recent excitement about generative AI and large language models such as ChatGPT, Gemini, Copilot, and the ever-questionable AI summary on popular search engines, it is easy to forget that AI has been around for a while.
The history of AI would be a standalone article, but my understanding is that many recognise the earliest substantial theoretical work on AI as being Alan Turing’s work in the 1930s. The responsible use of AI in legal practice would also be a standalone article.
The recent rapid developments in AI technologies mean exciting opportunities for organisations and individuals to do new things, or do current things more efficiently, balanced by important conversations about how to use AI responsibly, fairly, ethically, and of course, lawfully. Critical thinking is key in this area.
Australia has had a Voluntary AI Safety Standard in place since September 2024. However, like many countries around the world, Australia is also looking at enacting a specific law, or amending current law, to legislate guardrails regulating AI. An example yet again is the EU, which enacted a specific EU AI Act in August 2024, which operates in addition to other laws.
While Australia does not yet have its own specific AI Act, AI is very much not unregulated: all other laws as they currently stand apply to organisations’ development and deployment of AI services, just as they do with the use of other technology services. These laws include Privacy laws, Competition and Consumer law, IP laws, Occupational Health and Safety laws, Anti-discrimination laws, and human rights laws, amongst others.
This means that AI governance for clients can be a very multi-disciplinary space involving many stakeholders such as Data Governance, Cyber, Risk, and Compliance, and lawyers with different specialisations depending on the specific type of AI being deployed or developed and how it will be used. Many clients will be looking to lawyers to provide guidance on what responsible AI governance looks like, and we are well placed to help, so long as we remember to talk to other experts who have a different perspective to ours and seek to understand the technology and associated issues in front of us, just as we would with any other matter.
Any opinions expressed in this article are the author’s own.

Over the last decade, lawyers have seen a massive acceleration in the impact of technology on the law, legal practice and practice management. Now, many have mixed feelings as even these familiar Information Age developments are swamped by tech innovation led by artificial intelligence (AI) dominating the Digital Age.
This brief article sets out my view on where we are and where we’re heading, from my vantage point as an administrative and commercial law practitioner and writer/presenter on privacy, data protection and cyber, with a background in law publishing.
Information Age Legal Framework
In the decades before AI began infiltrating our lives, technology generally brought lawyers, legislators and the courts an incremental increase in new issues that could be framed within existing legal fields, such as corporations law. Amendments accommodated new digital products and online versions of existing business practices and services: for example, computer programs achieved protection via an amendment to the Copyright Act 1968 (Cth). This legislation was also amended in 2001 to include moral rights, notably the right to attribution as a creator of a work, in response to social pressure for fair treatment for individuals where information and communications technology (ICT) had made possible the instant dissemination of words and images worldwide.
Similarly, criminal law has accommodated traditional crimes enabled or facilitated by technology, especially the internet, such as stalking or the distribution of child pornography. Doxxing – the malicious release of personal data online – was criminalised in 2024.
Transition Towards an AI Age Regulatory Framework
Today, Digital Age issues may be challenging the foundations of legal reasoning and the effectiveness of regulatory frameworks. Much Information Age legislation struggles to meet its legislated objectives.
For example, established western copyright law, already flouted by publishers in some jurisdictions, is now locked in battle with international tech giants that use existing digital content to train their AI systems without permission or payment.
The Privacy Act 1988 (Cth) (Privacy Act) also struggles. It was drafted to be technology-neutral and principles-based to accommodate not only developing ICT but also changing social norms and expectations. Now, despite recent partial reform, the enforceability of the Privacy Act risks being overwhelmed by a plethora of privacy-invasive tech developments and practices including facial recognition software, data scraping and poorly governed automated decision-making. Exemptions from the Privacy Act, especially for small businesses, in practice severely limit the Office of the Australian Information Commissioner’s protective powers.
Regrettably, law makers and regulators dependent on the public purse for funding face severe budgetary challenges, even as well-resourced lobbyists for tech giants argue for less regulation. This may explain why, despite consultation and unlike the European Union (see Artificial Intelligence Act (Regulation (EU) 2024/1689)) and some other jurisdictions, Australia lacks an AI-specific statute. In its absence, the Courts, individual regulators, government departments, professional associations and others have issued comprehensive practice notes, standards and guidance.
Transition Toward Legal Practice in the AI Age
During the Information Age, equipping lawyers with their own PCs meant fewer legal assistants needed to be employed to support them as the lawyers completed and recorded their billable hours, or otherwise evidenced the value of their work. Legal employers embraced new technology that promised to reduce overheads and increase their control over work outputs. Digital developments delivered greater productivity and convenience. For instance, cumbersome looseleaf binders of legal information could be swapped first for CD-ROMs, then for less expensive online subscription access. Simultaneously, law became more accessible to lawyers and the general public alike via the introduction of free online legal resources, notably AustLII.
Now, building on these Information Age foundations, AI is starting to transform all aspects of legal work. Consider just two:
1. How is the work being performed? In these early transformation stages, legal employers are trialling or adopting a range of individual AI tools to assist or supplement discrete aspects of lawyers’ work. The tools offer to reduce tedious ‘grunt’ work, save lawyers’ time and enhance their clients’ experiences. Such tools include: client intake software to reduce data entry, fast legal research and summarisation tools, smart precedents, e-discovery, and generative tools for drafting documents such as chronologies or submissions. Larger employers might go further and adopt specialised legal AI platforms that combine several functions.
But while many benefits of AI streamlining are clear, so too are associated ethical risks and the risks of professional mistakes including bias, hallucinations and privacy/confidentiality breaches, all requiring mitigation. Mitigation takes time. Law-qualified staff may be required to verify the legal citations in a firm’s client work, especially if generic AI tools have been used to generate content. In-house lawyers instructing external counsel may require detailed attestations in respect of AI used in performing the work. Accurate assessment of everyone’s net time saved may prove difficult.
Nevertheless, the trajectory of these tech developments is toward a later transformation stage where agentic AI begins to take over entire legal workflows, not just help and support individual lawyers. It takes little imagination to see the potential impact for lawyers behind the EU Parliament’s definition of an ‘AI system’ as a:
machine-based system… designed to operate with varying levels of autonomy, and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.
2. How is the legal work being paid for? The promises of AI accessibility and efficiency savings are already influencing expectations about lower costs of legal work for clients, whether those clients are government or private sector direct employers of lawyers, or organisations or individuals looking to engage external private sector counsel.
Re-evaluation of cost structures for all legal work is becoming inevitable. Law firms are already initiating more fixed-price or agreed-value rather than time-based pricing of their services. While some hope to hire ‘AI-native’ law graduates to grow internally, others will engage senior but tech-enabled lawyers who have high levels of judgement, experience, and expertise – attributes not easily replaced by AI. Such attributes, plus the cost of access to high-end legal tech, may result in disappointing service cost reductions. Potential clients may decide that new technology adequately equips them to perform their own legal work. Lawyers themselves may conclude that investment in the legal tech industry produces better financial rewards than their own investment in knowledge work yields. Certainly, the legal profession is facing interesting times.
SO HOW CAN NEW GRADS BEST POSITION THEMSELVES?
Here are my key suggestions:
1. Gain traditional legal experience now as counter-validation against AI-driven approaches.
2. Familiarise yourself with legal tech and understand AI. Value-adding digital skills such as data visualisation techniques will be an advantage.
3. Work on your soft skills and other attributes not easily replaced by AI.
4. Prepare for the inclusion of AI tasks (such as use of prompts with GenAI) in the recruitment process.
5. Recognise that some legal fields move faster than others and the Digital Age will not affect all equally. Choose carefully!

2025 marks an exciting and pivotal moment for criminal lawyers in Australia. Our profession has seen steady improvements in technology and how we conduct practice, but none as significant as the introduction and rapid growth of artificial intelligence (AI). AI offers an efficiency that is beyond the scope of human capacity, and I see this technology being an essential tool for criminal lawyers in their practice.
In my experience practising as a solicitor in criminal law for over 10 years, I have found the career both deeply rewarding and inherently challenging. In regard to the latter, I have made the following observations about criminal lawyers:
They are generally time-poor, with most practitioners appearing in court regularly with uncertain and unpredictable court hours. They spend considerable time on practice management, administrative tasks and the paperwork aspects of their clients’ matters.
They face inconsistent workflow and cashflow streams.
They carry intense responsibility and the emotional burden of working closely with individuals in high-stress, life-altering circumstances, often with clients facing significant personal, emotional and mental health challenges.
Whilst AI is not going to remove all of these challenges, it can help lawyers significantly improve their efficiency and productivity, which in turn gives them more time to spend elsewhere. Legal-specific generative AI has access to enormous knowledge bases, and criminal lawyers in Australia are already using it. Some examples are:
Creation of high-quality drafts of matter-specific criminal law documents (e.g. letters, chronologies, summaries).
Conducting criminal law research.
Connecting to more clients.
I expect that AI usage will only increase. Lawyers must remain vigilant and always remember that AI does not remove their duty to the court, their client or other practitioners. They must review any AI output to ensure its accuracy. The AI provides the initial groundwork, enabling the lawyer to build and advance the matter further.
It is a great outcome, and one that means that lawyers can charge less for certain tasks. It means the client will pay less for less skilled tasks that are indispensable but can be undertaken more cost-effectively. It means the lawyer can work on more cases, engage more clients, reduce the unbilled time spent on certain tasks, and earn more revenue for their firm. And it means more people will have access to legal representation, which is beneficial to the community as well as to the court and the legal profession.
As criminal lawyers, we are constantly updating our legal knowledge (for me, this has always meant recent caselaw, legislation and practice notes). Now, this must include continuing education on AI. The Law Society of NSW and Law Institute of Victoria websites both have pages dedicated to AI resources for legal practitioners, and many courts in Australia have released practice notes and guidelines regarding the appropriate use of AI. All of this suggests an awareness by the highest ranks of the judiciary and the legal profession’s governing bodies of the inevitability of widespread AI adoption by lawyers. It leads me to think that, in time, AI will be taken up in a more open way in court, where judges and practitioners alike will be using and referring to AI in proceedings.
To be clear, I do not think it is wise for a lawyer to upload a Police brief of evidence or any of their client’s material into ChatGPT for analysis. But legal software providers (including the company I work with) have legal-specific generative AI tools available within the practice management system, so that AI can be used in a safe and secure way.
Criminal lawyers must embrace AI in an appropriate way in order to progress with the technology of the current generation. Doing so will also help more practitioners sustain their criminal law practices, and it is in the profession’s interest to safeguard these valuable assets for as long as possible.
One of the joys of being a criminal law solicitor is appearing in court, and the reality of AI making time outside the courtroom far more productive is truly exciting. This is only the beginning, and I look forward to the improvements and benefits that lie ahead.
Bridget is a solicitor in NSW. She began her legal career as a Tipstaff in the Supreme Court of NSW. Since 2017, she has been the principal of her own law practice based in Sydney. Since April 2025, Bridget has been employed at LEAP Legal Software, leading the criminal law team.
IN CONVERSATION WITH
MANAGER OF IMMIGRATION LAW FOR LEAP & LECTURER AT MURDOCH UNIVERSITY

The introduction and inevitability of artificial intelligence (‘AI’) is a hot topic within the legal profession. We had an opportunity to speak to Kelly Seal, Manager of Immigration Law for LEAP Australia and a Law and Ethics Lecturer at Murdoch University, who is working on the cutting edge of changes being introduced to the industry. Kelly’s role as head of LEAP’s Immigration Law vertical has given him the opportunity to combine his years of experience as an Immigration lawyer with his knowledge of software development and AI, to help developers and client teams produce software that is being used right now by Australian lawyers. He lectures in Migration Law, AI in Law and Ethical AI Usage, amongst many other areas. With decades of experience in law and legal technology, Kelly offers considered insights into the use, appreciation, and future of AI within the legal profession.
At Murdoch University, Kelly is teaching his students to understand the legal AI tools at their disposal, to analyse the pros and cons of AI’s usage, and to approach relying on it ethically. ‘In the profession right now, we would estimate that on average, a third of firms are now using AI on a regular basis. Change is happening quickly. There is an expectation that when a student enters the workforce, they at least understand what’s available, how it works, what the pros and cons are, and what the risks are.’
He expressed how drastically the legal profession is shifting as law firms and some of the courts lean into AI more heavily, and how that will impact the lawyers of tomorrow. ‘We’re starting to see this big divide between the haves and the have-nots when it comes to this knowledge. In order for students to actually be job-ready, I think they need to have some exposure to legal AI and technology.’
Kelly explained that, fundamentally, he teaches how to responsibly and ethically integrate AI into legal work and study. ‘I try to show students how AI works and how to leverage what it offers, but in my teachings, I also make a point of showing them the things it does wrong, because it can do things wrong, but we try to guide students on how to manage that and make sure it doesn’t catch on fire.’
‘There’s a very important saying in the AI world right now, which is that today is the day that AI is the worst it will ever be. What they’re saying is, tomorrow AI is going to be better. The day after that, it’s going to be better. You have to understand what’s coming down that path.’
Kelly also has a background in cybersecurity, which offers him a crucial and critical lens when considering the software he works with at LEAP Immigration. ‘A couple of years ago, I went back to university and did my own postgraduate study in cybersecurity. So, I spend time now making sure that when I’m dealing with any form of product, there’s a cybersecurity lens that’s applied over the top of it. It’s really about going through it and looking at it and going, “Okay, what is it that AI should be doing? How should it be doing it in a way that is safe and secure and, importantly, correct?” and trying to make sure that the development works in that pathway.’ He recognises that despite the resistance to AI due to safety and security concerns, AI integration is inevitable, so it is essential to ensure that the legal AI used in the profession is secure and ethical.
Alongside profound technological advancements come trends of satisfaction, dissatisfaction and acceptance. Kelly touched on this by referencing the Gartner AI Hype Curve. ‘The curve is really useful in understanding tech development. You’ve got your hype phase, where everyone is excited, then you get to the peak, and then you drop into the trough of disillusionment, where it doesn’t meet what everyone dreamed it could be. Then over time, it comes back up, as people get comfortable and as the technology matures, it finds a use. We can measure most tech development using that hype curve. When I’m teaching people about AI in particular, I try to make sure that they understand that there will be all these promises, and many may not come to pass, but we will find that over time, the technology will become something that’s just part of what we do.’
Kelly further acknowledges the fear and apprehension that follow AI advancement, but seeks to reframe the question from ‘what will AI do to hinder us?’ to ‘what can AI do to help us?’ ‘If thought of properly, and if understood properly, I believe it’s a tool that can empower those that work in the legal profession to actually be much more efficient and effective at their job. There’s definitely a way of thinking that looks at it from the perspective of access to justice issues. By getting AI to a space where lawyers and the courts can use it, hopefully, they can get through a lot more matters. And maybe, just maybe, the wheels of justice might turn a bit faster.’
Notably, Kelly considers the impact that administrative delays and the inaccessibility of legal solutions have on the justice system, specifically in family, immigration and criminal law. He advocates for the benefits that AI can offer the justice system in supporting the community to access and achieve justice. Kelly looks towards the future of AI and how it can garner a better outcome for those in the community.
In conversation with
Maria O’Sullivan
Associate Professor at Deakin Law School
Automation is transforming the way public decisions are made. It can cross-check welfare payments against tax records, detect discrepancies, and even interpret medical scans. These technologies promise efficiency and precision, yet Associate Professor Maria O’Sullivan warns that they also bring serious legal and ethical challenges.
“Anything that’s automated has to be lawful under legislation,” she explains.
“Automation changes decision-making because we now have far more data collection and the ability to match data across systems.”
Government services are increasingly delivered through self-service systems such as online forms, chatbots and digital portals. While cost-effective, these tools risk alienating the very people they are meant to serve. Maria believes it is vital that “we still need a human in the loop somewhere” so that there is a point where a person can intervene.
For her, the human element is good practice, but the real safeguards are legal. She identifies three that are essential. The first is legality, meaning the decision must be authorised under law. The second is a right to an explanation, so that people understand why a decision was made. The third is a right of review, to challenge the outcome.
However, transparency is not always straightforward. Some algorithms are so complex they become a “black box” that even experts struggle to fully explain. Others are created by private companies that protect their code as trade secrets. “A company like Microsoft might spend years developing a tax fraud detection algorithm and will not want competitors to copy it,” she notes.
Australia’s transparency laws, Maria says, have not kept pace with AI.
“We have oversight systems like Freedom of Information and Senate Estimates, but nothing AI-specific. In Europe, the GDPR and the EU AI Act have robust transparency provisions.”
Her recommendation is simple:
“AS MUCH TRANSPARENCY AS POSSIBLE, SO PEOPLE KNOW WHY THE DECISION HAS BEEN MADE AND HOW THE SYSTEM WORKS.”
She emphasises that certain groups are especially vulnerable to harm from automated decision-making, including welfare recipients, people in healthcare systems, prisoners, and refugees. Vulnerability can also be temporary. In these situations, access to an affordable right of review is critical.
In healthcare, privacy takes on heightened importance. “Health information is a level above in terms of sensitivity,” she explains. While AI can excel in areas such as radiology, spotting details that humans might overlook, “you have to have that human in the loop, because if the system gets it wrong, someone must take responsibility.” In immigration, she warns about the risks around consent and data storage. “If data is held offshore and not properly secured, that is a serious risk.”
Traditional public law principles, procedural fairness and the duty to act reasonably all remain essential, but automation changes the scale of potential harm. “One error can become 10,000 errors,” she warns. “It is not enough to have an individual right of review. We need the ability to look at the system as a whole.” The EU AI Act, which prohibits real-time surveillance and requires human rights reviews for high-risk areas such as immigration, offers a risk-based model Australia could consider.
Surveillance concerns extend across many aspects of public and private life. Authorities can monitor social media activity, track individuals through CCTV and body-worn cameras, and collect data from personal devices. This constant monitoring can create a “chilling effect,” Maria says, where people may alter their behaviour due to fear of being watched or recorded, especially if it could impact their employment or personal reputation. The increasing use of data collection tools raises further questions about how this information might be stored, shared, or used in the future.
Online disinformation is another growing problem. During the Indigenous referendum, the Australian Electoral Commission adopted a transparency-first approach, publishing false claims in a public register, correcting them, and engaging directly with people spreading them Content removal was reserved for limited situations, such as false information about election dates.
For law students and early-career lawyers, Maria highlights a newer frontier: privacy in wearable technology. “Think about health trackers and period-tracking apps. In some jurisdictions, that data could be used in criminal proceedings. We must be aware of where the data is stored and who can access it.”
Ultimately, she says, automated systems must be lawful, explainable and subject to affordable review. The culture within the agency deploying them matters just as much; critics have pointed to departmental mindsets that shape how a system is implemented and whom it targets.
For Maria, technology in public decision-making must never lose sight of fairness, transparency, and the human element.
AUGUST 2025
