NDIA uses AI for NDIS draft plans amid automation concerns

The National Disability Insurance Agency has confirmed its staff are using artificial intelligence to help create draft plans for NDIS participants, while strictly limiting more advanced generative AI tools to administrative tasks like emails and meetings.

AI Trial Reveals Productivity Gains

Documents obtained under freedom of information laws show that 300 NDIA staff participated in a six-month trial of Microsoft's Copilot AI system beginning in January last year. The agency emphasised that this generative AI technology was used only for internal communications and never for client-facing NDIS tasks.

However, the documents reveal that before the Copilot trial even began, the agency was already employing machine learning technology to prepare draft budget plans for NDIS participants. The NDIA defines machine learning as "a subset of AI that involves the use of algorithms to learn from data and make predictions or decisions without being explicitly programmed".

Human Oversight Maintained in Critical Decisions

The agency has stressed that all final decisions on participant plans remain with human staff. An April 2024 AI policy document explicitly states that "AI tools must not access participant records" without special authorisation under the NDIS Act.

A briefing document prepared for Senate estimates clarified the process: "While machine learning is utilised within draft budgets for first plans based on key information from a participant's profiles, the algorithm is only ever used to make recommendations, with decisions made by actual delegates."

The technology aims to speed up initial analysis, providing quicker resolutions for participants while maintaining human judgment in all final decisions.

Staff Report Significant Efficiency Improvements

A June 2024 report highlighted substantial productivity gains during the Copilot trial. Key findings included:

  • 20% reduction in task completion times
  • 90% staff satisfaction rating
  • Improved document and email preparation
  • Positive feedback from hearing-impaired staff about live meeting transcriptions

The system helped staff interpret NDIA policies and generate summaries, though the trial also faced challenges, including staff concerns stemming from the robodebt royal commission findings and fears that AI could replace jobs.

Experts Warn of Automation Bias Risks

Dr Georgia Van Toorn from the University of New South Wales, who has researched algorithmic decision-making in the public sector, expressed concerns about machine learning's limitations.

"Machine-learning and data-driven approaches often fail in dealing with complexity and nuance," she warned. "We can't expect a machine-learning approach to be able to predict the types of support someone will need if that person doesn't fit neatly into a box."

Dr Van Toorn highlighted the "black box" problem with machine learning, where it becomes difficult to understand which data points the system uses and what biases it might incorporate. She also cautioned about automation bias, where human decision-makers might be unduly influenced by AI recommendations.

"The risk is that if it makes their job quicker or easier, a planner might be more likely to go along with the recommended plan, leaning on the algorithm instead of using their own judgment," she explained.

Disability Advocate Emphasises Human Element

Dr Stevie Lang Howson, an NDIS participant and disability advocate, stressed the critical importance of human oversight in decisions affecting people's lives.

"These are actually people's lives," Dr Lang Howson said. "They're how many times people are able to get to the bathroom, it's how often you're able to leave your house, it's whether the wheelchair that you're sitting in is too small and causes you pain."

The advocate emphasised that staff need proper training and time to make plans that truly reflect individual needs, rather than relying on automated systems.

An NDIA spokesperson reaffirmed that AI is not used in systems that interact directly with participants or for any decisions on NDIS funding or eligibility, with human delegates making all final determinations based on evidence provided by participants.

The revelations coincide with the federal government's release of a whole-of-government AI plan for public service use, with Finance Minister Katy Gallagher announcing that every public servant would receive access to generative AI tools, along with training on their safe and responsible use.