Deloitte to Refund Australian Government for AI Errors
This article will count 0.25 units (15 minutes) of unverifiable CPD. Remember to log these units under your membership profile.
Deloitte is to pay back part of the fees it charged the Australian government for a $440,000 report after being caught using AI to help produce it. The report, which reviewed a system that penalises jobseekers who miss obligations, was found to contain fabricated citations, false legal references, and other serious errors.
It later came to light that Deloitte had used generative AI tools (such as ChatGPT) in drafting parts of the document. While Deloitte insists the findings remain valid, one academic said the report suffered from “AI hallucinations”, where the AI fills gaps with fake or misleading information.
A senator on the integrity inquiry called it out bluntly: “Deloitte has a human intelligence problem.” The message? Don’t let AI do your thinking for you.
Lessons for Accountants
Use AI as a tool, not a crutch. Don’t hand over your credibility to a machine. In a world of auto-generated everything, your professional judgment is your biggest value-add.
AI can definitely help, but it can also hurt.
If you’re using tools like ChatGPT or Copilot for reports, audits, or technical writing, double-check every reference and fact. You’re still on the hook for accuracy.

Transparency matters.
If AI played a role in your reporting, disclose it. Better yet, explain how it was used and what checks were put in place.

Client trust is everything.
Imagine telling a client SARS penalties were based on fake data. Whether you’re in practice or commerce, the integrity of your reports protects your reputation and your licence.

Don’t automate expertise.
AI can summarise, draft, or support. But only human judgment can interpret law, apply accounting standards, or assess business risk.
This is a reminder for us all: use AI responsibly and always check your facts!
Source Article: The Guardian