It is amazing to watch how quickly ChatGPT writes highly sophisticated documents on the basis of a few short prompts. Something that might take hours or days of human work can be produced in minutes by generative AI. Gen AI has applications across most areas of human endeavour, and its potential to save significantly on time, effort and money in legal practice is obvious. But using Gen AI to prepare court documents must be done carefully, as a recent case shows.
In Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, the applicant sought judicial review of a decision of the Administrative Appeals Tribunal regarding immigration matters. The applicant’s solicitor filed a written outline of submissions in support of the application. Judge Skaros states at paragraph 4:
The applicant’s submissions … refer to “Murray v Luton [2001] FCA 1245”, “Mackinlay v MIMA [2002] FCA 953”, “Bavinton v MIMA [2017] FCA 712”, “Gonzalez v MIBP [2018] FCA 211”, “Seng v MIAC [2013] FCA 1279”, “Kahawita v MIEA [1993] FCA 870”, “MIAC v Thiyagarajah [2016] FCA 19”, “Heath v MIMA [2001] FCA 700”, “Mitsubishi Motors Australia Ltd v AAT [2004] FCA 1241”, “MIMA v Ameer [2004] FCA 276”, “Woods v MIMA [2001] FCA 294”, “MIAC v Wu [2015] FCA 632”, “Drummond v MIMA [2008] FCA 1774”, “Walters v MIBP [2016] FCA 953”, “Lao v MIMA [2002] FCA 1234”, “Alfaro v MIBP [2016] FCA 1156” and “Wai v MIBP [2016] FCA 1157”, but none of these decisions exist. They also in paras 1.2, 2.2, 3.1, 4.1, 5.1, 5.2, 6.1 and 6.2 provide alleged quotes from the Tribunal’s decision which also do not exist.
The non-existence of the cases was noted by the respondent in its outline of submissions. In response, the applicant’s solicitor filed amended submissions removing the non-existent cases. The final hearing was adjourned while the Court raised its concerns about the applicant’s solicitor’s conduct with the parties, noting that the judge and his associates had spent considerable time checking the citations and attempting to find the authorities.
The solicitor explained that due to time constraints and health issues (affecting his concentration and ability to sit for long periods), he “accessed the site known as ChatGPT, inserted some words and the site prepared a summary of cases for him. He said the summary read well, so he incorporated the authorities and references into his submissions without checking the details”. The result was submissions containing 17 non-existent cases, replete with names and citations that appear authentic, and eight fictional quotes from the AAT decision appealed from.
This highlights a feature of Gen AI: the end product is usually grammatically correct, relevant to the prompts entered, coherent and logical. Uncertainty is absent; the “writer” appears confident about what is stated. The text can appear tailored, as if it were written specifically to address the questions asked. This can be seductive, especially for a busy lawyer under pressure to produce good quality work at high speed.
While its output is expressed confidently and persuasively, Gen AI is often wrong. Gen AI is trained on vast datasets of human language, learning to generate text based on patterns in that data. However, Gen AI does not verify the data it learns from, and it cannot exercise judgment about its output. For example, even if a recent judgment which changes or develops the law on a particular issue is in a chatbot’s dataset, the chatbot may not recognise that prior cases on that issue are no longer good authority, and may still use those prior cases in generating its output.
Another issue may be Gen AI’s failure to recognise that a particular case or line of authority is relevant to a question. Relevance in this sense may only be apparent to an experienced lawyer. Gaps like this may be more likely if the case is novel or complex.
As well as generating information which is incorrect or has gaps, Gen AI can also invent information. The chatbot perceives patterns within the vast collection of data it “trains” on, and may create entirely new information based on those patterns. This is known as “hallucinating”. The 17 non-existent cases (and the “quotes” from the AAT decision) referred to in Valu appear to have been Gen AI hallucinations.
What is striking in Valu is how many cases were hallucinated, and the apparent authenticity of the citations. The citations include MIMA (which stands for “Minister for Immigration and Multicultural Affairs”), MIBP (Minister for Immigration and Border Protection), MIAC (Minister for Immigration and Citizenship), and MIEA (Minister for Immigration and Ethnic Affairs), all historical ministerial titles, and names which certainly could be individual immigration litigants in the Australian context – Lao, Seng, Wai, Wu, Alfaro, Gonzalez, Kahawita, Ameer and Thiyagarajah. It is all so plausible.
In Valu, the Court found that the solicitor’s conduct fell short of the standard of competence and diligence expected of a solicitor, and that he had breached his duty to the court not to mislead or deceive. The Court referred his conduct to the Office of the NSW Legal Services Commissioner for consideration. A solicitor suffered the same fate for similar conduct in Victoria in Dayal [2024] FedCFamC2F 1166.
Both Valu and Dayal are in the federal jurisdiction. The Australian federal courts have not (yet) published any guidelines or practice notes on the use of AI. The Supreme Court of Victoria has issued “Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation” (May 2024), and in November 2024 the Supreme Court of NSW issued Practice Note SC Gen 23, “Use of Generative Artificial Intelligence (Gen AI)”, which it revised in January 2025.
The Victorian guidelines emphasise practitioners’ existing obligations, including that the practitioner remains responsible for the accuracy of documents. The NSW Practice Note is more prescriptive: it prohibits the use of Gen AI to generate the content of witness statements and the like, although it permits its use in written submissions, provided the author verifies that all citations exist and are accurate and relevant.
The short point is that Gen AI is an extremely useful and attractive tool for legal work, but lawyers must understand that Gen AI is prone to inaccuracy, and that if Gen AI is used, outputs must be carefully considered and thoroughly checked. It should go without saying that the consideration and checking needs to be performed by humans with the appropriate experience and expertise.
This Keynote is all my own work and has not been prepared using generative AI. Correct as at 1 March 2025, although the speed of technological progress may render some aspects of this article obsolete quite quickly.
This article is for general information purposes only and does not constitute legal or professional advice. It should not be used as a substitute for legal advice relating to your particular circumstances. Please also note that the law may have changed since the date of this article.