The Exact Privacy Policy Template Australian Businesses Need for the December 2026 ADM Deadline
Every article explains the obligation. None gives you the actual words. Here is the complete, legally-grounded automated decision-making disclosure template — ready to insert into your privacy policy today.

AI PM at SOLIDWORKS. Founder, Akira Data.
*Published 22 March 2026.*
The OAIC launched Australia's first-ever proactive compliance sweep in January 2026 — 60 organisations, six sectors. The 10 December deadline for automated decision-making transparency is 263 days away. Every week, Australian businesses ask a version of the same question:
*"We understand what we're supposed to do. Can someone just show us what to write?"*
This article does that. It gives you the actual template language — the disclosure section your privacy policy needs, the explanation request process your team needs to handle, and the internal audit trail you need to have running before 10 December.
Use it as a starting point. Have your legal counsel review it for your specific situation. Then ship it.
The Legal Requirement in Plain English
From 10 December 2026, the Privacy and Other Legislation Amendment Act 2024 requires every Australian Privacy Principle (APP) entity that uses a computer program to make, or substantially assist in making, a decision that might significantly affect an individual to:
- Disclose in their privacy policy that automated decision-making is used, what kinds of decisions, and what categories of personal information are used
- Notify affected individuals, on request, that they have been subject to a solely automated decision
- Explain the automated decision on request, in a way that is meaningful to the individual
- Review decisions made solely by automated means, if the individual requests human review (for decisions in scope)
"Significantly affect" is defined broadly — decisions affecting access to services, employment, credit, insurance, housing, and similar material interests all qualify.
Who Is In Scope
You are in scope if your business:
- Has an annual turnover above AUD $3 million (or is otherwise subject to the Privacy Act — health service providers, government contractors, and others regardless of turnover)
- Uses AI, machine learning, algorithmic scoring, or automated workflows to make or substantially assist in making decisions about individuals
Common examples across Australian industries:
- Financial Services: Automated credit decisioning, insurance premium calculation, claims triage, loan approvals, KYC/AML screening results that affect account access
- Healthcare: Clinical decision support tools influencing treatment pathways, automated appointment prioritisation, insurance pre-authorisation systems
- Professional Services: Automated candidate screening, algorithmic scoring for performance management, automated contract or tender evaluation
- Retail and eCommerce: Personalisation engines that affect access to offers or pricing, fraud risk scores affecting order fulfilment
- Mining and Resources: Contractor pre-qualification scoring, safety incident pattern analysis influencing worker assignments
The Template
What follows is template language you can adapt. It is structured as a section to add to your existing privacy policy. Customise the bracketed fields for your business.
Template: Automated Decision-Making Disclosure Section
[SECTION HEADING: Automated Decision-Making]
How we use automated decision-making
[Business Name] uses automated systems — including artificial intelligence and algorithmic processing tools — to assist in making, or to make, certain decisions that may significantly affect you. This section explains what those systems are, how they work, and your rights in relation to automated decisions.
Types of decisions made using automated means
We use automated decision-making in the following circumstances:
[Customise this list for your business. Examples below — remove what does not apply, add what does.]
- [Credit or lending decisions]: We use automated systems to assess creditworthiness, calculate risk scores, and make preliminary lending decisions based on information you provide and data we hold about you.
- [Insurance assessment]: Automated systems assess your insurance risk profile, calculate premiums, and perform initial claims triage based on the information provided in your application or claim.
- [Employment screening]: We use automated tools to screen applications, assess candidate suitability, and prioritise applications for human review based on job requirements and application information.
- [Fraud and security screening]: Automated systems analyse transaction patterns and behavioural data to detect potential fraud or security risks that may affect your access to our services.
- [Service personalisation]: Automated systems determine what content, offers, or service options are presented to you based on your account history and stated preferences.
- [Other — describe specifically]: [Description of the specific automated decision and its significance to the individual.]
What personal information is used
The types of personal information used in automated decision-making include [customise: financial history, transaction data, behavioural data, identification information, application information, publicly available information, third-party credit bureau data, etc.]. We do not use [customise as appropriate: sensitive information such as health records, racial or ethnic origin, etc.] in automated decision-making without your explicit consent.
How our automated systems make decisions
Our automated systems use [describe at an appropriate level of generality: statistical models trained on historical data / rules-based scoring criteria / machine learning algorithms] to assess the information we hold and produce an output that is either used directly as a decision or presented to a human decision-maker for review.
[If humans review automated recommendations: "Automated assessments are reviewed by a qualified [team/person] before a final decision is made."]
[If decisions are made solely by automated means: "In the following circumstances, decisions are made solely by automated means without individual human review: [list specific circumstances]."]
Your rights
If you believe you have been subject to an automated decision that significantly affects you, you have the right to:
- Request disclosure: Ask us to confirm whether automated decision-making was used in a decision about you, and what personal information was used.
- Request an explanation: Ask us to explain the automated decision — the key factors that contributed to the outcome and how they were weighted.
- Request human review: Where a decision was made solely by automated means and you believe it was incorrect, ask us to have a qualified human review the decision.
To exercise these rights, contact us at [privacy@yourbusiness.com.au] with the subject line "Automated Decision-Making Request". We will acknowledge your request within [2] business days and respond substantively within [30] calendar days.
The Explanation Request Process
The template above is the public-facing disclosure. You also need an internal process for handling explanation requests. Here is a working template for that:
Automated Decision-Making Explanation Request: Internal Process
Step 1 — Acknowledge (within 2 business days)
Acknowledge receipt of the request. Confirm: (a) whether automated decision-making was involved, and (b) the expected response timeline (maximum 30 calendar days under our policy).
Step 2 — Retrieve the decision record (within 5 business days)
Pull the decision audit log for this individual. The log must contain:
- Decision timestamp
- Input data snapshot (the exact data state at decision time)
- Model or algorithm version used
- Intermediate outputs (scoring components, flagged criteria)
- Final output and decision rationale
- Human review, if any (reviewer ID, review timestamp, override decision if applicable)
If your current systems do not produce this record automatically, you do not yet have the audit infrastructure required for 10 December compliance.
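If you are building this record from scratch, the sketch below shows one way an audit log entry could be structured as an append-only JSON line. The `DecisionAuditRecord` fields and the file name are illustrative assumptions, not a prescribed schema; map them to whatever your decisioning systems actually emit.

```python
# Minimal sketch of a decision audit record; field names are assumptions, not a prescribed schema.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class DecisionAuditRecord:
    decision_id: str            # unique identifier for this decision event
    subject_ref: str            # pseudonymised reference to the individual
    decision_type: str          # e.g. "credit_application_triage"
    decided_at: str             # ISO 8601 decision timestamp
    model_version: str          # model or rule-set version that produced the output
    input_snapshot: dict        # exact input data state at decision time
    intermediate_outputs: dict  # scoring components, flagged criteria
    final_output: str           # the decision or recommendation produced
    rationale: str              # short recorded rationale for the outcome
    human_review: Optional[dict] = None  # reviewer ID, review timestamp, override (if any)

# Append one immutable record per decision to a write-once log.
record = DecisionAuditRecord(
    decision_id="d-000123",
    subject_ref="sub-9f2a",  # pseudonymised; re-identifiable only via a separate key store
    decision_type="credit_application_triage",
    decided_at=datetime.now(timezone.utc).isoformat(),
    model_version="risk-model-2026.03",
    input_snapshot={"missed_payments_24m": 3, "declared_income": 82000},
    intermediate_outputs={"bureau_component": 0.41, "affordability_component": 0.62},
    final_output="decline",
    rationale="Bureau component below acceptance threshold",
)
with open("decision_audit.jsonl", "a") as log:
    log.write(json.dumps(asdict(record)) + "\n")
```

Whatever form you choose, the properties that matter are that the record is written at decision time, never edited afterwards, and retrievable for a specific individual when a valid request arrives.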
Step 3 — Prepare the explanation (within 20 business days)
Translate the technical audit record into a plain-English explanation addressing:
- What decision was made
- What the key factors were that contributed to the outcome
- How those factors were weighted (at an appropriate level of generality — not model internals)
- What information the individual could change to influence a different outcome (if relevant)
The explanation must be meaningful to a non-technical reader. "The model produced a score of 0.73 from 847 features" is not a meaningful explanation. "Your application was declined primarily because the credit bureau data we assessed showed three or more missed payments in the past 24 months" is.
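To make the translation step concrete, here is a rough sketch of rendering the top-weighted factors from an audit record as plain-English sentences. The factor codes, wording, and the `explain_decision` helper are hypothetical; the real mapping should be drafted with your compliance and customer teams and reviewed by legal.

```python
# Sketch of rendering top-weighted factors as plain English; factor codes and wording are illustrative.
FACTOR_EXPLANATIONS = {
    "missed_payments_24m": "the credit bureau data we assessed showed {value} or more missed payments in the past 24 months",
    "short_credit_history": "your credit history is shorter than {value} months",
    "high_utilisation": "your reported credit utilisation was above {value} per cent",
}

def explain_decision(decision: str, weighted_factors: list) -> str:
    """Render the highest-weighted factors from the audit record as a short explanation.

    weighted_factors: (factor_code, weight, value) tuples, highest weight first.
    """
    sentences = []
    for code, _weight, value in weighted_factors[:3]:  # key factors only, not model internals
        template = FACTOR_EXPLANATIONS.get(code)
        if template:
            sentences.append(template.format(value=value))
    if not sentences:
        sentences.append("a combination of factors in the information we assessed")
    return f"Your application was {decision} primarily because " + "; and ".join(sentences) + "."

print(explain_decision("declined", [("missed_payments_24m", 0.41, "three")]))
# Your application was declined primarily because the credit bureau data we assessed
# showed three or more missed payments in the past 24 months.
```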
Step 4 — Deliver the explanation (within 30 calendar days)
Deliver in writing. Retain a copy of the explanation and the delivery timestamp in your records.
Step 5 — Human review (if requested)
If the individual requests human review of a solely automated decision, escalate to [nominated role — e.g. Customer Decisions Review Officer]. The human reviewer must: (a) review the full audit log and original inputs, (b) make an independent assessment, (c) document their reasoning separately from the automated output, and (d) communicate the outcome to the individual in writing.
The Audit Infrastructure You Need Behind the Template
The disclosure and explanation process above are worthless without the technical infrastructure to support them. This is the part most businesses do not have in place.
What you need before 10 December:
1. Decision registry — a list of every automated decision type your business makes or substantially influences (a minimal sketch of a registry entry follows this list), mapped to:
- The personal information categories used
- The system or tool making the decision
- The typical significance to the affected individual
- Whether human review is involved
2. Decision audit logs — for every automated decision, a timestamped, immutable log containing:
- The individual's identifier (pseudonymised in storage, re-identifiable on a valid request)
- The input data snapshot at decision time
- The model or algorithm version
- The output (score, recommendation, decision)
- Any flags, escalations, or human review actions
3. Audit log retention policy — minimum 7 years for decisions in financial services; discuss with your legal team for your industry. The log is useless if it is deleted before someone makes a request.
4. Explanation request intake — a named email address (or intake form), an acknowledged receipt process, and a documented escalation path.
5. Human review pathway — a nominated individual or team with authority to review and override automated decisions, and a documented process for doing so.
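A decision registry does not need specialist tooling to start: a structured, version-controlled record per decision type is enough. The entry below is a hypothetical sketch of the fields item 1 describes; the names and values are assumptions to adapt to your own systems.

```python
# Hypothetical decision registry entry; field names and values are illustrative only.
DECISION_REGISTRY = [
    {
        "decision_type": "credit_application_triage",
        "description": "Preliminary accept/decline/refer recommendation on new credit applications",
        "personal_information_categories": ["application data", "credit bureau data", "transaction history"],
        "system": "risk-model-2026.03 (vendor-hosted scoring API)",
        "significance": "affects access to credit",
        "solely_automated": False,
        "human_review_role": "Customer Decisions Review Officer",
        "audit_log_location": "decision_audit.jsonl",
        "retention_years": 7,
        "last_reviewed": "2026-03-22",
    },
    # ...one entry per automated decision type, including third-party tools
]
```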
The Common Mistakes to Avoid
Mistake 1: Disclosing at too high a level of generality
"We may use automated tools to improve our services" does not satisfy the requirement. The disclosure must identify the specific types of decisions made by automated means and the categories of personal information used for each.
Mistake 2: Writing the disclosure without the infrastructure
Publishing a disclosure section that promises explanation and review rights you cannot actually fulfil is worse than not publishing it — it creates a specific, documented commitment you cannot meet, and the OAIC enforcement team will read your privacy policy.
Mistake 3: Assuming low significance = out of scope
The threshold is decisions that "might significantly affect" an individual. The OAIC has indicated this will be interpreted with consumer protection intent. When in doubt, disclose.
Mistake 4: Treating this as a one-time fix
Your privacy policy disclosure must stay current. If you deploy new automated decision-making systems after 10 December, the disclosure must be updated before those systems go live. Build the update process now.
Mistake 5: Forgetting third-party tools
Your CRM's AI-assisted lead scoring, your HR platform's performance analytics, your accounting software's fraud flag — if these make or substantially influence decisions about individuals, they are in scope. Review your vendor stack, not just internally built tools.
The 30-Day Compliance Checklist
Week 1:
- [ ] Inventory every automated decision-making system you use (including third-party tools)
- [ ] For each system, document: what decision, what personal information, human review or not
- [ ] Assess current audit log infrastructure — what decision records can you actually retrieve?
Week 2:
- [ ] Draft the ADM disclosure section using this template, customised for your decision types
- [ ] Have your legal counsel review the draft
- [ ] Identify the gaps between your current audit log capability and what the process requires
Week 3:
- [ ] Build or configure the audit log infrastructure (this is the engineering work)
- [ ] Document the explanation request intake and internal response process
- [ ] Nominate the human review responsibility and document the pathway
Week 4:
- [ ] Publish the updated privacy policy
- [ ] Test the explanation request process end-to-end with a synthetic case (see the sketch after this checklist)
- [ ] Brief relevant staff (operations, customer service, compliance) on how to handle ADM requests
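For the Week 4 synthetic test, a check like the sketch below is usually enough to expose gaps: seed a fabricated decision into the audit log, then confirm the record can be retrieved and a plain-English explanation produced from it alone. `retrieve_audit_record` and `explain_decision` are placeholders for whatever functions your own Week 3 build exposes.

```python
# Synthetic end-to-end check; retrieve_audit_record and explain_decision are placeholders
# for the functions your own Week 3 build provides.
def run_synthetic_case(retrieve_audit_record, explain_decision) -> None:
    # 1. The synthetic decision should already be seeded in the audit log.
    record = retrieve_audit_record("d-synthetic-001")
    assert record is not None, "audit record for the synthetic case must be retrievable"

    # 2. Every field the internal process relies on in Step 2 must be present.
    for required in ("decided_at", "model_version", "input_snapshot", "final_output"):
        assert required in record, f"audit record is missing '{required}'"

    # 3. A plain-English explanation must be producible from the record alone,
    #    without going back to the production system to reconstruct inputs.
    explanation = explain_decision(record["final_output"], record.get("weighted_factors", []))
    assert explanation, "no explanation could be generated from the audit record"

    print("Synthetic explanation request handled successfully:")
    print(explanation)
```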
What Akira Data Does Here
We build the technical infrastructure — the decision audit logs, the explanation retrieval process, the monitoring that alerts you when automated decision rates shift materially. We also help you write the disclosure language and implement the internal process.
The Privacy-Safe AI Implementation engagement (AUD $20,000, 4–6 weeks) covers the complete compliance build: gap analysis, audit infrastructure, privacy policy update, explanation process design, and staff briefing. For organisations that already have systems deployed and need to retrofit compliance, we have a compliance retrofit sprint (AUD $12,000, 3 weeks) focused on documentation and audit log infrastructure for existing systems.
The OAIC compliance sweep is active right now. 10 December is 263 days away. If you are an APP entity using AI to make decisions about individuals and you do not have the disclosure or the infrastructure in place, the time to start is this week.
[Get a compliance assessment →](/contact?type=privacy-compliance)