Staff at the National Disability Insurance Agency have been using machine learning to help create draft plans for NDIS participants, documents obtained by Guardian Australia reveal.
Documents relating to the NDIA’s use of AI, released under freedom of information laws, show 300 staff took part in a six-month trial of Microsoft’s Copilot AI beginning in January last year.
The agency said Copilot uses generative AI, and that it is only used for NDIA emails, meetings and other internal functions – not for participant-facing purposes.
But the documents reveal that before the Copilot trial began, the NDIA was already using a form of AI – machine learning – to prepare draft budget plans for participants.
Machine learning is defined as “a subset of AI that involves the use of algorithms to learn from data and make predictions or decisions without being explicitly programmed”.
The NDIA said NDIS staff make all final decisions on plans, and an AI policy document from April 2024 states that any use of AI must be approved by the chief information officer and authorised under the NDIS Act.
A briefing document, prepared for Senate estimates in 2023-24, read: “While machine learning is used within the profiles of a participant, the algorithm is only used to make recommendations, with decisions made by delegates.”
The documents go on to say that “machine learning recommendations are used to help delegates by facilitating initial analysis for participants and improved services”.
A report from June 2024 stated that staff experienced improved productivity in the preparation of documents and emails.
NDIA staff overall reported a 20% reduction in task completion times during the Copilot trial, as well as positive experiences using live transcription during meetings.
The report found challenges facing the trial included staff concerns about the robodebt royal commission’s findings on automated decision-making, and concerns about AI being used to reduce staff numbers.
The report at the end of the trial noted that one of the risks of using Copilot is accidental data exposure, but the agency said it has access controls, regular audits and staff training in place.
Dr Georgia van Toorn, from the University of New South Wales, who writes about the impact of algorithmic decision-making in the public sector, said such systems can struggle with “dealing with complexity and nuance”.
“I don’t think it’s a bad thing, especially in cases that are quite straightforward, but … it can’t tailor support for someone who doesn’t fit in a box. And that’s most people, right?”
Van Toorn also warned against assuming the technology is reliable.

“I think there is a perception that because it is driven by data, it is accurate and personalised … but in this case, I think the human in the loop needs to understand the limitations [of the technology] … and use their discretion and judgment.
“And they need to be properly trained and supported to do it at the right moment.”
Van Toorn said it was important that the NDIA documents clearly state that decisions on support plans are made by people.
However, she warned there is substantial evidence of what researchers call “automation bias” – where people are unduly influenced by AI recommendations when making decisions.
“Maybe there are time constraints or pressure on planners to get through a certain amount of plans to meet KPIs, or maybe there’s pressure on the NDIA to reduce the number or cost of plans,” she said.
“The risk is that if this makes their work quicker or easier, a planner may be more likely to stick with the recommended plan rather than listen to NDIS participants.”
Dr Stevie Lang Hows, a disability advocate and researcher, said their “biggest concern” was whether staff were adequately trained to come up with “plans that suit our needs as individuals”.
“These are people’s lives. These plans determine how many times people can go to the bathroom … so it is all the more important to get them right, in a way that reflects the needs of the people.”
An NDIA spokesperson said AI was not used in its systems “to interact directly with participants or for any NDIS or allied funding decisions”.
“Delegates make participant NDIS funding decisions using information and evidence provided by participants in accordance with the NDIS Act,” the spokesperson said.
The federal government on Wednesday released a whole-of-government plan for the use of generative AI in the public service. The finance minister, Katy Gallagher, said the plan would provide every public servant with generative AI tools, training and guidance on how to use them safely and responsibly.

