The RBA Says 1 in 3 Australian Businesses Now Use AI for Advanced Tasks. What About the Other 2?
New Reserve Bank of Australia research finds almost one in three Australian businesses are using AI for advanced tasks — predicting demand, analysing inventory trends, automating complex decisions. That's a higher adoption rate than most executives assumed. It's also leaving two thirds of Australian businesses behind, and raising urgent compliance questions for the one third already in.

Data Engineer. 6x Microsoft Azure Certified. Monash University.
New research from the Reserve Bank of Australia, published this week, found that almost one in three Australian businesses are already using AI for advanced tasks — demand prediction, inventory trend analysis, complex decision automation. Not chatbots. Not basic automation. Advanced AI.
That is a higher number than most Australian executive teams assumed when they last had a conversation about "where we are on AI." It means the early-mover advantage is shrinking faster than boards realise. And it raises two questions that matter right now:
For the two thirds not yet doing this: What is stopping you, and what is the competitive cost of waiting?
For the one third already doing it: Are you doing it compliantly — with Privacy Act protections, observability, and the audit trails that will be legally required from 10 December 2026?
The answer to both questions has a 90-day window.
What "Advanced AI Tasks" Actually Means
The RBA's framing matters. "Advanced tasks" in their research is not a vague category. It specifically covers:
- Demand prediction — using historical and market data to forecast what customers will want and when
- Inventory trend analysis — identifying patterns in stock movement, supplier performance, and consumption to optimise working capital
- Complex decision automation — AI systems that make or substantially assist in decisions previously requiring human judgement
These are not pilot projects. These are production systems running at scale in approximately one in three Australian businesses — across financial services, retail, mining, healthcare, and professional services.
The RBA finding also matters because it comes from Australia's central bank, not a technology vendor. This is not AI industry self-promotion. It is empirical research on what is actually happening in the Australian economy.
Why Two Thirds of Australian Businesses Are Not There Yet
If one in three businesses have deployed advanced AI, the obvious question is why the other two in three have not. The honest answer is that it is rarely a single cause. It is usually a combination of four things.
1. The data readiness gap
The most common blocker for Australian mid-market businesses implementing advanced AI is not the AI — it is the data. Demand prediction requires clean, accessible, historically complete sales and operations data. Inventory analysis requires integration across procurement, logistics, and POS systems. Complex decision automation requires structured data that captures the decision inputs.
Most Australian mid-market companies have this data. Almost none of it is in a state where AI can reliably work with it. It is split across a legacy ERP, multiple SaaS tools, and hundreds of spreadsheets maintained by individual staff members.
The fix is not glamorous: a data foundation build that creates a single, queryable source of truth for the AI to operate against. This is a 6–12 week engineering project, not a technology purchase. It is also the work that most AI vendors skip, which is why so many Australian AI implementations fail to reach production.
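In practice, a build like this usually means landing the ERP extracts, SaaS exports, and spreadsheets into one queryable store with consistent naming, so the AI layer queries a single table rather than hunting across systems. A minimal sketch in Python, using SQLite as a stand-in for a warehouse and entirely hypothetical table and column names:

```python
import sqlite3

# Hypothetical extracts: an ERP sales export and a staff-maintained
# stock spreadsheet, as separate sources with separate shapes.
erp_sales = [
    ("A100", "2026-01", 420),
    ("A200", "2026-01", 310),
]
spreadsheet_stock = [
    ("A100", 55),
    ("A200", 12),
]

# Land both into a single queryable store (SQLite stands in for a warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (sku TEXT, month TEXT, units_sold INTEGER)")
conn.execute("CREATE TABLE stock (sku TEXT, units_on_hand INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", erp_sales)
conn.executemany("INSERT INTO stock VALUES (?, ?)", spreadsheet_stock)

# One source of truth a demand-prediction model can query directly.
rows = conn.execute(
    "SELECT s.sku, s.month, s.units_sold, st.units_on_hand "
    "FROM sales s JOIN stock st ON s.sku = st.sku"
).fetchall()
for row in rows:
    print(row)
```

The engineering value is in the unglamorous middle step: normalising the shapes of the sources so they can be joined reliably. That is what the 6–12 weeks goes into.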
2. The compliance uncertainty paralysis
Privacy Act obligations for AI are real, significant, and poorly understood by most Australian mid-market businesses. The result is a pattern we see regularly: a leadership team that wants to implement AI but is stuck in a "we need to understand the compliance requirements first" loop that never resolves.
The December 2026 automated decision-making transparency obligations under the Privacy Act 1988 apply to any APP entity using AI to make decisions that significantly affect individuals. If you are predicting demand and using that to make purchasing or staffing decisions — you are in scope. If you are automating inventory decisions that affect supplier contracts or pricing — you are in scope.
But here is the thing: the compliance path for advanced AI is well-defined. It requires a Privacy Impact Assessment before deployment, Australian-jurisdiction data hosting, full audit trails for automated decisions, and the technical capability to produce an explanation of any significant decision on request. None of these are obstacles that should stop an implementation. They are engineering and governance requirements that should be built in from day one.
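What "audit trails plus explanation on request" can look like in engineering terms: one append-only record per automated decision, capturing the inputs, the outcome, and the drivers, with a method that renders a plain-language explanation. This is a minimal sketch with a hypothetical schema, not a statement of what the Privacy Act mandates field by field:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class DecisionAuditRecord:
    """One auditable record per automated decision (hypothetical schema)."""
    decision_id: str
    model_version: str
    inputs: dict        # the data the decision was based on
    outcome: str        # what the system decided
    top_factors: list   # human-readable drivers of the decision
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def explanation(self) -> str:
        """Produce a plain-language explanation on request."""
        factors = "; ".join(self.top_factors)
        return (f"Decision {self.decision_id} ({self.outcome}) by model "
                f"{self.model_version} was driven by: {factors}.")

record = DecisionAuditRecord(
    decision_id="ORD-2026-0412",
    model_version="demand-forecast-v3.1",
    inputs={"sku": "A100", "forecast_units": 480, "on_hand": 55},
    outcome="reorder 425 units",
    top_factors=["forecast exceeds stock on hand by 425 units",
                 "supplier lead time of 14 days"],
)

# Persist as an append-only JSON line in the audit trail.
audit_line = json.dumps(asdict(record))
print(record.explanation())
```

Built in from day one, this is a logging decision; retrofitted later, it is a re-engineering project.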
3. The wrong pilot mindset
Many Australian businesses that have tried to implement advanced AI have failed — not because the technology is immature, but because they approached it as a pilot rather than a production project.
Pilot mindset means: a defined endpoint (the proof of concept), a success criterion (the demo works), and an implicit assumption that production is a separate conversation. The problem is that production never becomes a separate conversation in organisations that are not structured for it. The pilot succeeds, sits in a sandboxed environment, and slowly becomes irrelevant.
The businesses in the RBA's one-in-three group approached this differently. They defined production success criteria before they started — specific business metrics, specific measurement timeframes, specific ownership of outcomes. The AI system was designed for production from day one, not retrofitted after a pilot.
4. The ROI case was not made clearly enough
Boards and CFOs who have watched AI investments produce slide decks rather than results are increasingly sceptical. The request for "more budget for AI" in 2026 requires a fundamentally different business case than it did in 2023.
The case needs to be specific: not "AI will help us be more competitive," but "this demand prediction system, applied to our top 200 SKUs, will reduce stock-outs by an estimated 40% and reduce overstock by 15%, saving approximately AUD $340,000 in annual carrying cost and lost sales, against an implementation cost of AUD $45,000 and annual running costs of AUD $12,000."
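Using the illustrative figures above, the payback arithmetic the CFO will run is simple enough to show in full:

```python
# Illustrative figures from the business case above (AUD).
annual_saving = 340_000        # reduced carrying cost and lost sales
implementation_cost = 45_000   # one-off build
annual_running_cost = 12_000   # ongoing

first_year_net = annual_saving - implementation_cost - annual_running_cost
payback_months = implementation_cost / ((annual_saving - annual_running_cost) / 12)

print(f"First-year net benefit: AUD ${first_year_net:,}")
print(f"Payback period: {payback_months:.1f} months")
```

A first-year net benefit of AUD $283,000 and a payback measured in weeks, not years, is the kind of case that survives board scrutiny, provided the AUD $340,000 estimate is backed by a defined baseline and a named measurement owner.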
The businesses that make this case — with a named measurement owner, a defined baseline, and a 90-day reporting commitment — are the ones getting budget approved. Those making the generic case are not.
For the One Third Already There: The Shadow AI Risk
The RBA finding that one in three businesses are using AI for advanced tasks is almost certainly an undercount of total AI usage within those organisations. The thing that CIOs in Australian businesses are losing sleep over right now is not the AI they know about — it is the AI they do not.
Shadow AI — the use of AI tools by employees without formal IT or security approval — is the 2026 version of shadow IT. And in Australian organisations, it is endemic.
A marketing team member uses Claude to analyse competitor pricing data — including information about customer segments. A finance analyst pastes cash flow projections into ChatGPT to generate a board narrative. A HR business partner uses an AI tool to screen job applications, evaluating hundreds of candidates against criteria that were never formally reviewed for bias or compliance.
None of these are in the IT asset register. None have been assessed under the Privacy Act. None have audit trails. And under the December 2026 automated decision-making transparency obligations, several of them — particularly the HR screening and financial decision support use cases — are already creating compliance exposure.
Why Shadow AI Is an Australian Compliance Problem Specifically
The Privacy Act obligations that take effect in December apply regardless of whether the AI system was formally approved. If an Australian APP entity uses any computer program to substantially assist in making a decision that significantly affects an individual, the transparency obligations apply.
The OAIC's first proactive compliance sweep, launched in January 2026, targeted approximately 60 organisations across six sectors. The regulator is not waiting for complaints. It is actively checking. And it is specifically looking at AI-related personal data handling.
For the one in three Australian businesses that have deployed advanced AI — and the many more that have significant shadow AI usage — the question is not whether compliance is required. It is whether the systems they are actually using can meet the compliance requirements if checked.
Most cannot. Yet.
The Practical Response to Shadow AI
The businesses that are managing this well have adopted a three-part approach:
First, a shadow AI inventory. A structured survey and technical audit of what AI tools employees are actually using — not what IT has approved, but what is actually running. This typically surfaces three to five times more AI usage than the formal register shows.
Second, a triage and decision framework. For each shadow AI tool identified: is the use case creating meaningful compliance exposure? Does it process personal information about individuals in ways that could significantly affect them? For tools that do, the options are: formal adoption with compliance controls, migration to an approved alternative, or restriction.
Third, policy update. An AI acceptable use policy that is actually communicated, understood, and enforced — covering which tools are approved, which categories of data can be used with AI tools, and what the process is for evaluating new tools before adoption.
None of this requires stopping AI use. It requires making AI use visible and manageable.
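The triage step can be made mechanical once the inventory exists. A sketch of the decision framework as code, with a deliberately simplified, hypothetical rule set; the actual criteria should come from your Privacy Impact Assessment and legal advice:

```python
def triage_shadow_ai_tool(processes_personal_info: bool,
                          significantly_affects_individuals: bool,
                          approved_alternative_exists: bool) -> str:
    """Apply the triage framework above to one shadow AI tool.

    Simplified, hypothetical rules for illustration only.
    """
    if not processes_personal_info:
        return "low risk: record in register, monitor"
    if significantly_affects_individuals:
        # In scope for automated decision-making transparency obligations.
        if approved_alternative_exists:
            return "migrate to approved alternative"
        return "formal adoption with compliance controls, or restrict"
    return "assess under Privacy Act before continuing use"

# Example: AI screening of job applications (the HR case above).
verdict = triage_shadow_ai_tool(
    processes_personal_info=True,
    significantly_affects_individuals=True,
    approved_alternative_exists=False,
)
print(verdict)
```

The point is not the code; it is that each shadow AI tool gets a recorded, consistent decision rather than an ad hoc one.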
The Competitive Pressure From the RBA Number
The RBA finding has a competitive implication that Australian boards should sit with for a moment.
If one in three businesses in your sector are using AI for advanced demand prediction, inventory analysis, and decision automation — and you are not — those businesses are making better decisions faster, with lower error rates, than you are. They are carrying less dead stock. They are purchasing more efficiently. They are identifying demand signals earlier.
The gap compounds. A business that has been running demand prediction AI for 18 months has 18 months of model improvement, data refinement, and process optimisation that a business starting today does not have. The early-mover advantage in AI is real, not theoretical.
For Australian mid-market businesses, the competitive window for getting advanced AI into production without having to play catch-up is narrowing. The one-in-three number is growing. The question is whether your business is in the one third leading or the two thirds following — and what you are going to do about it.
What to Do in the Next 90 Days
The 90-day horizon matters because it is both the window for demonstrating AI value under current board pressure and the remaining preparation time before mid-2026 accountability conversations.
For the two thirds not yet doing advanced AI:
The fastest path to production is a focused scope. Pick one workflow — demand forecasting for your top product lines, or inventory analysis for your highest-cost categories — and run a structured 2-week assessment of your data readiness, compliance requirements, and ROI case. If the data is accessible and the compliance path is clear, a 6–8 week build gets you to production with a measurable result.
Do not try to boil the ocean. The businesses in the one-in-three group that got there first did not transform everything simultaneously. They shipped one thing, measured it, and expanded.
For the one third already running advanced AI:
The December 2026 compliance deadline is under nine months away. The OAIC sweep is already running. If your deployed AI systems do not have full audit trails and the capability to produce a meaningful explanation of any significant decision, you have a compliance gap to close.
Run the shadow AI inventory. Update the privacy policy to disclose your automated decision-making practices. And if any of your systems lack the observability layer required for explanation on request, plan the retrofit now — before the deadline, not after.
The Australian Market Context
The RBA research lands in a specific Australian business context: the same week The Guardian reported on AI-driven workforce restructuring at Atlassian, WiseTech, and Telstra, and the same month the Australian Government confirmed a five-year, multi-billion-dollar Microsoft AI deal for the public sector.
Australia is not watching AI happen from a distance. It is in the middle of it. The one-in-three number is not a ceiling — it is a current state. The businesses that will be in the one-in-three group in 12 months are already planning.
The only question for Australian mid-market business leaders is which side of that number they want to be on.
*Akira Data helps Australian mid-market businesses implement advanced AI that ships to production — demand prediction, inventory optimisation, complex decision automation — with Privacy Act compliance, full observability, and measurable ROI built in from the start.*
*The AI Readiness Sprint (AUD $7,500, 2 weeks) delivers a data readiness assessment, compliance posture review, and prioritised implementation roadmap. The Agentic Workflow Build (from AUD $25,000, 4–8 weeks) ships a production system with full audit trails and the explainability capability required for December 2026 compliance.*
*This article references Reserve Bank of Australia research cited in The Guardian, 14 March 2026.*