Artificial intelligence in legal practice: Lessons from Dayal [2024] FedCFamC2F 1166

Back in September 2024 we published an article titled ‘A ghost in the machine? Risks and pitfalls of AI use within legal practice’, highlighting the use of Artificial Intelligence (AI) by a solicitor acting in the matter of Handa & Mallik [2024] FedCFamC2F 957 (Handa & Mallik).

In that case, the Court ordered the solicitor to file submissions as to why referral should not be made to the Victorian Legal Services Board and Commissioner. The solicitor’s submission has now been considered by the Court in the matter of Dayal [2024] FedCFamC2F 1166 (Dayal).

In Dayal, the Court observed that, since the decision in Handa & Mallik, the solicitor had acknowledged:

  • Handing up to the Court a document that purported to contain summaries of relevant authorities and included what looked like medium neutral citations identifying those decisions,
  • Using legal software (in particular, an AI driven research tool module) to generate the list of authorities and summaries,
  • That neither he nor another legal practitioner had reviewed the output generated by the research tool to ensure the accuracy of the list of authorities and case summaries, and
  • That the authorities identified in the list and summary tendered to the Court did not exist.

The solicitor offered an unconditional apology to the Court for tendering the inaccurate list and summary of authorities. Further, and among other things, he provided an assurance to the Court that he would ‘take the lessons learned to heart and [would] not commit any such further breach of professional standards in the future’. The solicitor asked the Court not to make a referral to the Victorian Legal Services Board.

In determining the matter, the Court observed that ‘…use of technology is an integral part of efficient modern legal practice’. However, the Court cautioned that whilst ‘…the use of AI tools offer opportunities for legal practitioners, it also comes with significant risks’.

The Court observed the worldwide attention drawn by the US District Court case of Mata v. Avianca, Inc., 22-cv-1461 (S.D.N.Y. 2023), in which attorneys relied on generative AI to prepare legal submissions that were filed referring to non-existent cases. Those attorneys were subsequently found to have abandoned their professional responsibilities, and were sanctioned.

The Court then considered and reiterated guidelines issued by both the Supreme Court of Victoria and County Court of Victoria, which emphasise that:

  • Parties and practitioners using AI tools in the course of litigation should ensure they understand the manner in which those tools work, as well as their limitations.
  • The use of AI programs must not indirectly mislead another participant in the litigation process (including the court) with regard to the nature of any work undertaken, or content produced by that program. Ordinarily parties and their practitioners should disclose to each other the assistance provided by AI programs to the legal task undertaken.
  • The use of AI to assist in the completion of legal tasks must be subject to the obligations of legal practitioners in the conduct of litigation, including the obligation of candour to the court.

The Court then reiterated that ‘Generative AI does not relieve the responsible legal practitioner of the need to exercise judgment and professional skill in reviewing the final product to be provided to the court’.

Having considered the solicitor’s admissions and conduct, as well as the stress he experienced as a consequence of his actions, the Court determined that the Victorian Legal Services Board and Commissioner was the appropriate body to decide whether further investigation or action should be taken regarding the solicitor’s conduct.

In so doing, the Court opined that ‘…it is in the public interest for the Victorian Legal Services Board and Commissioner to be aware of the professional conduct issues arising in this matter, given the increasing use of AI tools by legal practitioners in litigation more generally’.

This decision underscores the critical importance of care and diligence when using AI in practice. While AI tools offer significant opportunities for efficiency and innovation, they can pose substantial risks. Dayal highlights the potential for AI to generate inaccurate or misleading information, which can in turn have serious professional and legal consequences.

Legal practitioners must remain vigilant and uphold the highest standards of professional conduct. The integration of AI into legal practice must be approached with caution, ensuring that human oversight and professional judgment are never compromised.

If you require assistance with a family law matter, please don’t hesitate to reach out to Will Stidston, Principal within BN’s Family Law team.

Dayal [2024] FedCFamC2F 1166
Handa & Mallik [2024] FedCFamC2F 957
Mata v. Avianca, Inc., 22-cv-1461 (S.D.N.Y. 2023)
