
Running 63 columns of customer data through AI

The Task I Hadn’t Done Before

I recently had a client hand me a spreadsheet containing 63 columns of customer data. The brief was straightforward: go through each field and classify whether it could be input into an AI tool.

It was the first time I’d been asked to do this. But it won’t be the last. Businesses are starting to think seriously about using AI to process customer data, and that means someone needs to work out where the legal boundaries sit.

The First Instinct (and Why It Doesn’t Work)

My first instinct was to restrict almost everything. If in doubt, keep it out of the AI. But I quickly realised that approach doesn’t hold up in practice.

One way or another, this data is going to be processed through AI tools. Customer records, transaction histories, account details, all of it is heading in that direction. A blanket restriction isn’t a compliance strategy. It’s just a delay.

A Pragmatic De-identification Framework

Instead of locking everything down, we built a system. The approach was grounded in a core principle of the Privacy Act 1988 (Cth): information that has been appropriately de-identified is not personal information and falls outside the Australian Privacy Principles (APPs) entirely.

The OAIC and CSIRO’s Data61 De-identification Decision-Making Framework confirms this position. De-identification involves two steps: removing direct identifiers and then taking additional measures to prevent re-identification.

We applied this to the spreadsheet as follows:

  • Strip out all identifying data. Any field capable of identifying an individual (directly or in combination with other fields) was excluded from the AI input. Names, addresses, dates of birth, contact details, account numbers linked to identifiable records.
  • Retain non-identifying data. Fields containing transaction patterns, service usage, preferences, and other behavioural or operational data that cannot reasonably identify an individual were kept in.
  • Use a single unique key. One synthetic identifier was assigned to each record. This key allows the business to reconnect the AI’s output back to the full customer record inside its own systems, without exposing that data to the AI.

The result: the AI processes only de-identified data. The business reconnects the output internally. The customer’s personal information never enters the AI environment.
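The three-part approach above can be sketched in code. This is a minimal illustration only: the field names and the `uuid`-based key scheme are assumptions for the example, not the client’s actual schema or a complete de-identification methodology (which also requires assessing re-identification risk in context).

```python
import uuid

# Hypothetical field classification -- a real audit would cover every column.
IDENTIFYING_FIELDS = {"name", "address", "date_of_birth", "email", "account_number"}
NON_IDENTIFYING_FIELDS = {"transaction_count", "preferred_channel", "plan_type"}

def deidentify(record: dict) -> tuple[dict, dict]:
    """Split a customer record into an AI-safe payload and an internal key map."""
    key = str(uuid.uuid4())  # synthetic identifier with no external meaning
    ai_payload = {"record_key": key}
    ai_payload.update({f: v for f, v in record.items() if f in NON_IDENTIFYING_FIELDS})
    # The key map stays inside the business's own systems and is never sent to the AI;
    # it lets the business reconnect the AI's output to the full record later.
    key_map = {key: record}
    return ai_payload, key_map

record = {
    "name": "Jane Citizen",
    "email": "jane@example.com",
    "account_number": "123456",
    "transaction_count": 42,
    "preferred_channel": "email",
}
payload, key_map = deidentify(record)
```

Only `payload` goes to the AI tool; `key_map` never leaves the business’s environment.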

Why This Matters Now

This approach isn’t just good practice. It’s becoming essential.

In August 2025, the OAIC concluded its investigation into I-MED Radiology Network’s disclosure of de-identified patient data to an AI company without patient consent. The OAIC found the data was sufficiently de-identified and took no regulatory action. But it also flagged that organisations must use recognised de-identification standards, document their methodology, and impose contractual obligations on data recipients to prevent re-identification.

More significantly, from 10 December 2026, new automated decision-making (ADM) transparency obligations under the Privacy Act will come into effect. If your business uses personal information in an automated or semi-automated decision that could reasonably affect an individual’s rights or interests, you will need to disclose that in your privacy policy, including the kinds of personal information used and the kinds of decisions made.

Processing only de-identified data through your AI tools is one of the clearest ways to stay on the right side of these obligations. If the AI never receives personal information, the ADM provisions may not be triggered at all.

Building the System, Not Just Solving the Problem

What struck me about this task was how few businesses have actually done it. Most are either avoiding AI altogether or feeding customer data in without thinking through the privacy implications. Neither approach is sustainable.

The better path is to build a repeatable process. Classify your data fields once. Establish a de-identification methodology. Document it. Review it periodically. This way, every time a new AI use case comes up, you have a framework to assess it against rather than starting from scratch.

This is what I mean when I talk about systems over fires. The fire, in this case, is the risk of a privacy breach. The system is a documented, compliant framework for getting AI to work on your data safely.

Practical Steps to Get Started

  1. Audit your customer data fields. List every column in your customer database and classify each one as identifying, potentially identifying, or non-identifying.
  2. Apply a de-identification methodology. Follow the OAIC/Data61 framework. Remove direct identifiers and assess re-identification risk in context.
  3. Create a unique synthetic key. Assign each record an identifier that has no external meaning but allows internal reconnection.
  4. Document the process. Record your methodology, your classification decisions, and your rationale. This is your evidence of compliance.
  5. Impose contractual safeguards on AI providers. If customer data (even de-identified data) is being processed by a third-party AI tool, ensure your contracts prohibit re-identification and restrict data use.
  6. Review your privacy policy. With the December 2026 ADM obligations approaching, update your privacy policy to reflect any AI processing of personal information.
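Steps 1 and 4 lend themselves to a simple, documented script. The sketch below is illustrative only: the keyword heuristics and column names are assumptions, and any automated classification still needs human review before it counts as a defensible audit.

```python
import csv
import io

# Assumed keyword hints for a first-pass classification -- not an exhaustive rule set.
IDENTIFYING_HINTS = ("name", "email", "phone", "address", "dob", "account")
POTENTIAL_HINTS = ("postcode", "suburb", "age", "gender")

def classify_column(column: str) -> str:
    """First-pass tier for one column: identifying, potentially_identifying, or non_identifying."""
    col = column.lower()
    if any(hint in col for hint in IDENTIFYING_HINTS):
        return "identifying"
    if any(hint in col for hint in POTENTIAL_HINTS):
        return "potentially_identifying"
    return "non_identifying"

def audit(header: list[str]) -> dict[str, str]:
    """Map every column to a classification; keep the saved output as compliance evidence."""
    return {col: classify_column(col) for col in header}

# Illustrative header row -- a real file would have all 63 columns.
csv_data = "customer_name,postcode,plan_type,monthly_spend\n"
header = next(csv.reader(io.StringIO(csv_data)))
report = audit(header)
```

Saving `report` (with the date and the reviewer’s sign-off) gives you the documented classification decisions that step 4 calls for.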

The Bottom Line

The question is no longer whether your business will use AI on customer data. It’s whether you’ve built a legally compliant process for doing so. De-identification gives you a practical pathway: your AI gets the data it needs, your customers’ privacy stays intact, and your business avoids the regulatory exposure that comes with getting this wrong.

If you’re sitting on a customer database and wondering how AI fits in, start by asking what you can strip out rather than what you need to put in. The answer is usually simpler than you think.

Need help classifying your customer data for AI compliance? Get in touch with Attune Legal.