A small Australian software vendor reading the Department of Finance’s draft procurement rule amendments closely would find a sentence asking it to “accept accountability for AI-generated outputs” in any future government contract. The vendor would then have to work out what that sentence means when the AI in question is a general-purpose foundation model the vendor neither owns, trains, nor fully understands. The vendor’s commercial options are constrained. It can refuse the clause and lose the bid. It can accept the clause and carry an unbounded liability that no insurer in the local market is presently willing to underwrite. It can carve the AI out of the deliverable entirely and bid a more limited service, and lose anyway to a competitor that accepts the clause without thinking it through. None of those options is a sensible commercial outcome, and the procurement rule does not yet make clear which of them it was intended to produce.
The amendments themselves are a defensible piece of policy. InnovationAus has reported that the Department of Finance is moving to require suppliers to disclose planned AI use in Commonwealth contracts and to accept responsibility for the outputs of that use (InnovationAus, 2026). The Digital Transformation Agency had already issued model AI clauses for Commonwealth contracts in March 2025 (Digital Transformation Agency, 2025). The APS AI Plan now mandates a Chief AI Officer for each agency, AI literacy training across the public service, public-facing AI use reporting, and a structure for escalating issues up the chain. Taken together, these moves are an attempt to shift the locus of risk from “the Commonwealth carries everything because it does not know what is in the system” to “the supplier discloses what is in the system and shares the risk of its operation.” That is the right direction of policy travel. The harder question is whether this policy is being applied to a public sector that is currently capable of receiving it.
The empirical answer to that question arrived at the start of May, in the form of the first systematic analysis of Australian Government AI Transparency Statements (arXiv preprint, 2026). The study found significant variation in disclosure quality across the agencies that have published statements at all, and persistent gaps between the policy intent in the APS AI Plan and the practice of disclosure on departmental websites. Some statements run to multiple pages of substantive detail. Others run to a paragraph that says little more than that the agency uses AI somewhere. The arXiv study is the first rigorous empirical assessment of whether the Australian Government is doing what it says it is doing on AI transparency, and the answer is: inconsistently.
The Mandarin’s diagnosis of why public sector AI uptake keeps stalling lines up with that empirical finding (The Mandarin, 2026). The piece argues that the missing element is what it calls the connective tissue: governance, economics, and operating models that move pilots from proof-of-concept into production. A 6clicks summary of the Ready for Sovereignty 2026 forum in Canberra put the same observation more directly: risk-first AI governance is currently crowding out opportunity in Australia (6clicks, 2026). The convergence of the academic and the practitioner accounts is not coincidental. They are looking at the same gap from different sides. The academic side sees agencies producing transparency statements that do not match their stated policies. The practitioner side sees agencies unable to move beyond pilots because the structures around procurement and governance do not yet exist at the right scale.
The procurement rule changes therefore arrive at an awkward moment for the agencies they apply to. The rules tighten the requirements on suppliers while the agencies receiving the disclosed information are not yet equipped to evaluate it well, and while those same agencies are struggling to disclose their own AI use consistently. The rule does not solve the practitioner-side problem. It adds to it by creating a new compliance load that small and medium-sized vendors will struggle to absorb without specialist legal and governance support. The agencies that already have Chief AI Officers and mature governance practices will adapt to the new procurement requirements without much trouble. Smaller vendors and smaller agencies will need help of a kind that is not yet in the system.
A useful comparison sits in Washington. The US Government Accountability Office convened an expert panel in 2025 on the question of AI in federal procurement and found that AI could indeed support market research, proposal review, and fraud prevention, but that risks including inaccurate outputs and capability gaps in the federal workforce remained significant barriers to scaled use (Government Accountability Office, 2025). The parallels with Australia’s situation are direct. The technology is genuinely useful when applied carefully. The binding constraint sits in the organisational capacity to apply and govern the technology, not in the technology itself.
The connective-tissue gap shows up most clearly at procurement. A procurement rule is a constraint on supplier behaviour. It does not, on its own, build the capability inside an agency to read what the supplier discloses, evaluate the model behind it, or notice when the disclosed practice does not match the actual practice. That capability has to be built. It is built through training, through standardised evaluation methods, through procurement guidance that goes beyond model clauses into worked examples, and through cross-agency peer learning that lets newer adopters draw on the experience of more mature ones. A central guidance website is a useful piece of furniture in that build. It is not, on its own, the build itself, and I have written separately about the launch of AI.gov.au in those terms.
Public trust in AI-enabled government services depends less on the technology itself and more on the public’s confidence that the agency delivering the service has the maturity to deliver it well (Government News, 2026). On that reading, the procurement rule changes will only build trust if they are accompanied by visible improvement in how agencies actually use AI in practice, and in how that use is disclosed to the public. Without that visible improvement, the rules will look like a defensive crouch: a way of pushing risk onto suppliers without taking responsibility for building the public-sector competence the rules implicitly assume.
For vendors selling to government, the practical implication is to read the rule changes carefully and start work on disclosure infrastructure now. The clauses are coming. The agencies asking the questions will not be uniformly sophisticated about evaluating the answers. Vendors who present clean, consistent, evidence-backed disclosures will have an immediate edge over those who try to handle the question on a contract-by-contract basis. For consultants, the opportunity is in the gap between the policy and the practice, helping agencies build the evaluation capability that the rules presume already exists.
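To make that concrete, the sketch below shows one hypothetical shape such disclosure infrastructure could take: a structured, reusable record per AI-enabled deliverable, maintained once and adapted per bid rather than drafted fresh for every tender. Neither the draft rules nor the DTA’s model clauses prescribe a format, and every field name here is an illustrative assumption, not anything taken from the amendments themselves.

```typescript
// Hypothetical sketch only: the draft rules do not prescribe a disclosure
// format, and every field name here is an assumption made for illustration.
interface AIUseDisclosure {
  deliverable: string;      // the product or service the disclosure covers
  modelProvider: string;    // who builds and hosts the underlying model
  modelIdentifier: string;  // the model family and version actually deployed
  purpose: string;          // what the AI component does within the service
  humanReview: "none" | "sampled" | "all-outputs";            // oversight level
  trainingDataControl: "vendor" | "third-party" | "unknown";  // who controls it
  evidence: string[];       // pointers to evaluations, test results, audits
  lastReviewed: string;     // ISO date the record was last checked
}

// An example record a vendor might maintain centrally and reuse across bids.
const exampleDisclosure: AIUseDisclosure = {
  deliverable: "Correspondence triage service",
  modelProvider: "Third-party foundation model provider",
  modelIdentifier: "General-purpose LLM, API version pinned per release",
  purpose: "Suggests classifications for incoming correspondence",
  humanReview: "all-outputs",
  trainingDataControl: "third-party",
  evidence: ["Internal accuracy evaluation (Q3)", "Bias review summary"],
  lastReviewed: "2026-05-01",
};

console.log(JSON.stringify(exampleDisclosure, null, 2));
```

The particular fields matter less than the discipline: one source of truth per product, kept current, with evidence attached, so that each tender response becomes an extraction from an existing record rather than an improvisation under deadline.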
Frameworks without implementation pathways are intentions with formatting. The new procurement rules are genuine intentions, and the analysis behind them is sound. Whether they become implementation depends on what happens around them in the next two budget cycles. How Chief AI Officers are resourced will matter. How the AI literacy training actually translates into practice inside agencies will matter more. And how transparency statement quality improves, once agencies have to defend their statements alongside their procurement obligations, will matter most of all. The intent is clear. The pathway, currently, is not.
References
6clicks. (2026). Insights from Ready for Sovereignty 2026, Canberra: Australia’s AI governance stalemate.
arXiv preprint. (2026). Systematic analysis of Australian Government AI Transparency Statements.
Department of Finance. (2026). Draft amendments to Commonwealth Procurement Rules: AI disclosure and supplier accountability. Commonwealth of Australia.
Digital Transformation Agency. (2025, March). Model AI clauses for Commonwealth contracts. Commonwealth of Australia.
Government Accountability Office. (2025). Expert panel findings on AI in federal procurement. United States.
Government News. (2026). Building trust in AI.
InnovationAus. (2026). Suppliers face strict new AI rules in procurement shake-up.
The Mandarin. (2026). Why public sector AI uptake keeps stalling.
