Australia’s Robodebt scandal offers critical lessons for boards, governing authorities and senior executives implementing artificial intelligence (AI) and technology in both the public and private sectors. Emerging AI governance standards and guidance, such as AS ISO/IEC 42001:2023 and NIST’s AI Risk Management Framework, are helpful tools for organisations to build explainability, traceability and accountability into AI technology and to ensure it is safe and reliable. However, as this article highlights, focusing on the customer, engaging frontline staff and understanding the importance of culture are critical in AI and technology projects: they help organisations to identify problems, to escalate them promptly, and to collaborate effectively in rectifying them. Notwithstanding the immense opportunities AI and technology present, Robodebt is a reminder that proper planning and risk assessment, combined with effective governance policies and processes and a collaborative culture, will make the difference between success and failure, with the consequent benefits or costs.
The Robodebt Scheme
The Robodebt scheme was designed to detect fraud and recover overpayments from welfare recipients going back to 2011-2012. Robodebt was piloted in 2015 before being expanded in 2016. The scheme relied on ‘income averaging’ to assess income and entitlements, which did not comply with the income calculation provisions of the Social Security Act 1991 (Cth). By the end of 2016, there was public criticism of Robodebt, with reports of people being driven to despair. However, the scheme continued until November 2019. A Royal Commission was established in 2022 and reported in mid-2023, making 57 recommendations. Commissioner Holmes AC SC found that:[i]
‘Robodebt was a crude and cruel mechanism, neither fair nor legal, and it made many people feel like criminals. In essence, people were traumatised on the off chance they might owe money. It was a costly failure of public administration, in both human and economic terms.’
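The core flaw in ‘income averaging’ can be shown with a simplified, purely illustrative calculation (this sketch is the author’s illustration, not the department’s actual system): spreading an annual ATO income figure evenly across 26 fortnights assumes steady earnings, so a person who worked for only part of the year appears to have under-reported income in the fortnights they were genuinely unemployed and entitled to support.

```python
FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_income: float) -> float:
    """The flawed 'income averaging' approach: spread annual income evenly."""
    return annual_income / FORTNIGHTS_PER_YEAR

# Hypothetical example: a person earns $26,000, all in the first half of
# the year, and truthfully reports $0 income while on benefits thereafter.
actual_income = [2000.0] * 13 + [0.0] * 13   # true fortnightly earnings
annual_total = sum(actual_income)            # $26,000 reported to the ATO

averaged = averaged_fortnightly_income(annual_total)  # $1,000 per fortnight

# Averaging wrongly infers $1,000 of 'unreported' income in each of the
# 13 fortnights the person truthfully reported $0 — a phantom discrepancy.
apparent_discrepancy = sum(
    max(averaged - actual, 0.0) for actual in actual_income[13:]
)
print(apparent_discrepancy)  # 13000.0
```

On these assumed figures, averaging manufactures a $13,000 discrepancy against a person who reported their income entirely accurately, which is why the method could not lawfully establish a debt.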
Lesson 1: Accurately estimating both the benefits and the potential risks and costs
The costly failure of Robodebt included overestimated savings, underestimated costs, and a failure to understand the risks involved, including harm to people, challenges brought in the Administrative Appeals Tribunal, and a class action against the Commonwealth government.
Robodebt was intended to generate savings of $4.7 billion; however, it was estimated to have delivered only $406 million in savings.[ii] Against this were costs of $971 million for implementation and administration of the scheme.[iii] The net cost of the Robodebt scheme was therefore $565 million.
The net cost included the settlement of a class action in the sum of $112 million, approved by the Federal Court in Prygodicz & Ors v Commonwealth of Australia (No 2) [2021] FCA 634. The Royal Commission Report notes that ‘[t]his settlement sum included the legal costs for Gordon Legal, which amounted to $8,413,795.71 at the date of settlement’.[iv]
Lesson 2: Designing AI/technology projects with policies and processes that are customer-centric
Commissioner Holmes found that Robodebt was ‘launched in circumstances where little to no regard was had to the individuals and vulnerable cohorts that it would affect. The ill effects of the scheme were varied, extensive, devastating and continuing.’[v]
The recommendations for designing policies and processes with emphasis on the people they are meant to serve include:[vi]
- Avoiding language and conduct that reinforces feelings of stigma and shame associated with the receipt of government support when it is needed;
- Facilitating easy and efficient engagement, with options of online, in person and telephone communications that are sensitive to the particular circumstances of the customer cohort, including itinerant lifestyles, lack of access to technology, lack of digital literacy, and the particular difficulties of rural and remote living;
- Explaining processes in clear terms and plain language in communication to customers;
- Acting with sensitivity to financial and other forms of stress experienced by the customer cohort and taking all practicable steps to avoid the possibility that interactions might exacerbate those stresses or introduce new ones.
Lesson 3: Designing AI/technology projects with human oversight
Robodebt provides a salutary lesson in what can happen when automation removes the human element, as it did from the outset; this removal was found to be a critical factor in the harm the scheme did. Commissioner Holmes stated: ‘The scheme serves as an example of what can go wrong when adequate care and skill are not employed in the design of a project; where frameworks for design are missing or not followed.’[vii] Further, ‘a clear path for review of decisions is important in designing a system which adheres to the OECD AI Principles: “a person affected by a decision should understand why the decision was made, and there should be pathways for review of these decisions that are accessible to them”.’[viii]
In addition to recommending legislative reform and regulations to establish a consistent legal framework within which automation in government services can operate,[ix] the recommendations relevant to all technology where automated decision-making is implemented, whether in the public or private sector, include:
- A clear path for those affected by decisions to seek review;
- Departmental websites containing information advising that automated decision-making is used and explaining in plain language how the process works;
- Business rules and algorithms that are made available to enable independent expert scrutiny.
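The recommendations above can be made concrete with a minimal, hypothetical sketch (the record structure, field names and URL below are the author’s assumptions for illustration, not any agency’s actual system): every automated decision carries a plain-language explanation, identifies the rules applied, points to an accessible review pathway, and an adverse outcome is never finalised without human oversight.

```python
from dataclasses import dataclass

@dataclass
class AutomatedDecision:
    """A hypothetical record for one automated decision, reflecting the
    recommendations: explanation, traceable rules, and a review pathway."""
    customer_id: str
    outcome: str            # e.g. "debt raised" or "no action"
    explanation: str        # plain-language reasons for the outcome
    rule_version: str       # identifies the business rules/algorithm applied
    review_url: str         # where the affected person can seek review
    human_reviewed: bool = False  # flipped once an officer checks the decision

def requires_human_review(decision: AutomatedDecision) -> bool:
    """Adverse outcomes are never finalised on automation alone."""
    return decision.outcome != "no action" and not decision.human_reviewed

decision = AutomatedDecision(
    customer_id="C-1024",
    outcome="debt raised",
    explanation="Reported income differed from employer payroll data "
                "for the fortnight ending 12 March.",
    rule_version="income-match-rules v2.1",
    review_url="https://example.gov/review",  # placeholder, not a real service
)
print(requires_human_review(decision))  # True
```

The design choice is that the explanation, rule version and review pathway are mandatory fields of the decision itself, so transparency and reviewability cannot be omitted after the fact.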
Following the Robodebt Royal Commission Report, in its Response to the Privacy Act Review Report released in September 2023 the Australian Government agreed that the Privacy Act 1988 (Cth) should be amended to introduce a right for individuals to request meaningful information about how substantially automated decisions with legal or similarly significant effect are made, and to require organisations to include information in their privacy policies about the use of personal information to make such decisions.[x]
Lesson 4: The importance of stakeholder consultation with frontline staff during project design, implementation and ongoing operation
Commissioner Holmes found there was a lack of consultation with Department of Human Services (DHS) frontline employees and stakeholder groups before implementation of the Robodebt scheme. It was clear to staff that there were ‘obvious’ flaws in Robodebt.[xi] The Royal Commission Report also refers to a survey of public servants carried out by the Community and Public Sector Union (CPSU) in which ‘nearly all members who responded to the survey raised concerns within DHS about the legality of the scheme, but were told that the legal advice was that it could proceed’.[xii]
Although Robodebt was designed to decrease the time staff spent on identifying anomalies, the scheme did not decrease workload. In fact, it led to an increase, as customers went to a Centrelink office to find out what was happening and then returned with pages of bank statements and pay slips. The report found that service officers often ran appeal scripts, as many felt debts were unfair or incorrect.[xiii]
The Royal Commission Report sets out the impacts of Robodebt on DHS employees, which included:[xiv]
- Increasing staff workload;
- Imposing a cultural shift that placed pressure on staff;
- Requiring specific training, which some staff considered was not provided adequately;
- Involving an increase in labour hire arrangements;
- Causing a deterioration in staff morale.
The report also found that staff were not consulted on the proposal prior to the inception of the scheme, and that when they did provide feedback, they felt it was ignored by DHS.[xv] This highlights the importance of culture and leadership, including training on escalating issues and internal mechanisms for making and resolving complaints.[xvi] As a result, the report recommended that frontline staff be consulted when new programs are designed and implemented, and that better feedback channels be established so staff can communicate in an open and consultative way, with constructive processes for management to review and respond.[xvii]
Lesson 5: The importance of a culture that enables problems to be openly raised, discussed and resolved, including with independent legal advice where appropriate
The Royal Commission found that the failure to confront fundamental flaws in the Robodebt scheme appeared to have been a product of the culture within DHS at the time. In evidence before the Commission, Scott Britton, national manager of the Compliance Branch, DHS, said:[xviii]
‘…my own experience was there was difficulty in giving bad news or alternate [sic] views to Deputy [Secretary] Golightly. I had had a number of personal experiences with project reports that were [in] red, didn’t like red, had to change it, update it. It may be that. I don’t know. I’m – I’m basing it purely on my personal experience. I think there was a general cultural view around no one gives bad news. So fix it – get on with it and fix it.’
The culture of not giving bad news extended to in-house DHS lawyers, whose professional duties required them to maintain their independence and not act as a mere mouthpiece for their client.[xix] The report states that:[xx]
‘DHS lawyers gave evidence of their perception that, even where they sought to act independently, they were constrained by the culture of the department which discouraged this behaviour.’
The Royal Commission heard evidence about the structure and culture of both the DHS and the Department of Social Services (DSS) in-house legal teams and found that, ‘the professional independence of both agencies’ in-house lawyers was compromised in relation to the scheme.’[xxi] Commissioner Holmes found that, ‘the position that income averaging was a long-standing, lawful practice was so entrenched within DHS that lawyers at all levels were unable to question it in accordance with their professional obligations.’[xxii] Anna Fredericks (former principal legal officer, DSS) said the DSS legal team was:[xxiii]
‘…A very siloed type of culture. You were responsible for what you were responsible for and stayed within those bounds. There was a strong view that…our role as legal, as a service provider to the department was to provide advice on specific statutory interpretation, not to comment on – or not to necessarily explicitly not comment on, but perhaps not – it wasn’t our role to turn our mind to broader risks than what was being explicitly asked.’
Conclusion
While standards and guidance for robust AI governance provide helpful tools for organisations, five critical lessons emerge from the Robodebt scheme:
- Accurately estimating both the benefits and the potential risks and costs, including adequate resourcing;
- Designing AI/technology projects with policies and processes that are customer-centric;
- Designing AI/technology projects with human oversight;
- The importance of stakeholder consultation with frontline staff during project design, implementation and ongoing operation;
- The importance of a culture that enables problems to be openly raised, discussed and resolved, including with independent legal advice where appropriate.
At InfoGovANZ, we highlight the importance of collaboration across silos and of building knowledge and expertise across the various functional areas and professions required for AI and technology projects. The Robodebt scheme underscores the significant risks and damage that can arise when technology projects are not adequately planned, governance mechanisms are inadequate, and the culture is not conducive to open and constructive feedback both up and down the organisation’s hierarchy.
Author: Dr Susan Bennett PhD, LLM(Hons), MBA, FGIA, FIP, CIPP/E, CIPT
Principal Sibenco Legal & Advisory, Governance and Privacy Lawyer, Founder of InfoGovANZ
[i] Royal Commission into the Robodebt Scheme, Report (2023), ISBN: 978-1-921241-58, xxix.
[ix] Ibid, Recommendation 17.1, 486-488.
[x] Australian Government Response to Privacy Act Review Report, 11.