UPDATE — NOVEMBER 2025: Since the initial reporting on the flawed Deloitte welfare-compliance report, the situation has escalated significantly. Deloitte has now formally acknowledged that generative AI was used in preparing portions of the $439,000 report commissioned by Australia’s Department of Employment and Workplace Relations (DEWR). This admission followed mounting evidence that fabricated citations, invented legal quotes, and misattributed case law appeared throughout the document: errors that experts concluded could not have originated from any real legal record.
After an internal review, Deloitte submitted a corrected version of the report with the hallucinated references removed and agreed to refund part of the contract payment to the Australian government. The exact amount has not been disclosed, but the repayment reflects the government’s concern about AI-related quality failures in an official consultancy deliverable. Despite the revisions, Deloitte insists the report’s core findings and recommendations remain valid. Legal scholars, however, continue to question the accuracy of the substituted references and the firm’s broader methodological transparency.
The incident has triggered a wider debate across Australia’s public-sector consulting ecosystem. Deloitte is reportedly tightening its internal quality-assurance processes and reevaluating guidance on the permissible use of generative AI in client work. Meanwhile, DEWR’s investigation is ongoing, and lawmakers have cited the episode as evidence of the need for stronger oversight, disclosure rules, and audit requirements when AI tools are used in government-funded research or analysis.
ORIGINAL NEWS STORY:
Deloitte Report for Canberra Questioned Over Suspected AI-Generated Quote
The Australian Financial Review reports that fresh errors have been uncovered in a Deloitte report for the federal government, deepening suspicions that artificial intelligence (AI) may have been used in its preparation.
The $439,000 report, commissioned by the Department of Employment and Workplace Relations to examine welfare compliance systems, had already been found to include at least half a dozen references to non-existent academic works. Now, University of Sydney law academic Chris Rudge has identified what appears to be an invented quotation attributed to a key robo-debt case.
According to the Deloitte report, Federal Court Justice “Davis” (a misspelling of Justice Jennifer Davies) stated in Amato v Commonwealth that “the burden rests on the decision-maker to be satisfied on the evidence that the debt is owed.” However, no such language appears in the case’s consent orders, which do not contain the paragraphs 25 and 26 the report cited. Another passage was also incorrectly cited, and the case itself was misattributed to an unrelated migration matter.
Rudge told the Financial Review that the errors cannot be traced to any legal record, leading him to conclude the material may have been AI-generated. “It is another thing entirely to misstate the common law of Australia in advice to the Commonwealth government,” he warned.
Deloitte has declined to confirm whether AI was used in drafting the report, though it has promised to correct the citations. The firm said it “stands by our work and the findings in the report.”
Legal scholars Lisa Burton Crawford and Janina Boughey also raised concerns, noting that replacement references provided by Deloitte do not clearly support the report’s claims. DEWR confirmed it has sought urgent clarification from Deloitte as it continues investigating.
Need Help?
If you have questions or concerns about any global guidelines, regulations, or laws, don’t hesitate to reach out to BABL AI. Their Audit Experts can offer valuable insight and ensure you’re informed and compliant.


