OAIC’s Decision a Warning on the Use of Facial Recognition Technology

As has been widely reported in the media, the Office of the Australian Information Commissioner’s (OAIC) decision concerning the use of facial recognition technology (FRT) by Bunnings Group is a salutary lesson in just how complex it can be to balance business need with legal compliance.

In a nutshell, Bunnings used FRT to maintain an intelligence repository of individuals who posed a risk to safety, security or property. Over a period of three years, the organisation’s CCTV system captured the facial image of every person who entered one of its 62 stores, including staff, visitors and contractors. The FRT converted each real-time facial image into a searchable facial vector set, which was then compared against a database of ‘suspicious’ individuals; a match triggered an email alert to a specialist team.
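For readers less familiar with how such systems operate, the sketch below is a purely illustrative, simplified example of the kind of vector-matching step described above: a probe vector derived from a live image is compared against enrolled vectors, and a sufficiently close match raises an alert. The embedding size, threshold and function names are assumptions made for illustration only and do not describe Bunnings’ actual system.

```python
# Illustrative toy example only: facial images reduced to numeric "vector sets"
# and compared against an enrolled watchlist. All names, dimensions and the
# similarity threshold are assumptions, not details of any real deployment.
import numpy as np

EMBEDDING_DIM = 128      # assumed size of a facial vector set
MATCH_THRESHOLD = 0.9    # assumed similarity cut-off for raising an alert


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(probe, watchlist):
    """Return the watchlist entry ID if the probe vector matches, else None."""
    for entry_id, enrolled_vector in watchlist.items():
        if cosine_similarity(probe, enrolled_vector) >= MATCH_THRESHOLD:
            return entry_id   # a match would trigger an alert to a review team
    return None               # non-matches would be discarded immediately


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    watchlist = {"entry-001": rng.normal(size=EMBEDDING_DIM)}
    probe = watchlist["entry-001"] + rng.normal(scale=0.01, size=EMBEDDING_DIM)
    print(check_against_watchlist(probe, watchlist))  # -> "entry-001"
```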

While Bunnings’ actions appear to be reasonable and well intentioned, the OAIC found them to be in breach of Australia’s privacy laws. The OAIC rejected the organisation’s position that no personal information had been processed by FRT in relation to non-matched individuals because their data was only held momentarily and automatically deleted within an average of 4.17 milliseconds of detection.

The outcome will impact most sectors

It is a decision that affects most organisations. The essential question was whether the organisation met the requirement of ‘necessity’ when relying on the so-called ‘permitted general situations’ under the Privacy Act 1988 (Privacy Act), namely those concerning serious threats and unlawful activity or misconduct, to justify the FRT deployment. The OAIC considered that any such deployment must also be proportionate, a requirement it derived from the objects of the Privacy Act.

Whilst acknowledging the accuracy, practicability and cost-effectiveness of the FRT system, the OAIC considered the collection of sensitive personal information of thousands of people by FRT to be disproportionate and unjustified, given that action was taken in respect of unlawful activity on a relatively small number of occasions and against a relatively small number of individuals, and that less intrusive alternatives existed.

It’s a matter of principle

Under Australian Privacy Principle (APP) 1.2, organisations must take such steps as are reasonable in the circumstances to implement practices, procedures and systems relating to their data handling. The OAIC held that it would have been reasonable for Bunnings to have conducted a privacy impact assessment (PIA) before the deployment of FRT – which it did not do.

The placing of individuals on the database was done by a specialist team, based on actual or threatened violence, suspicion of committing repeated organised retail crime, inappropriate behaviour resulting in a prohibition notice, serious theft, and other criminal conduct. Some images were provided by law enforcement. Crucially, despite a variety of reasons for placing a person on the internal database (each of which would warrant a separate PIA), there was no policy defining the criteria for doing so.

Whilst many may feel sympathy for the retailer, the Commissioner is clearly seeing the bigger picture and is firmly on the side of protecting individuals’ privacy. This was reflected in Commissioner Kind’s subsequent statement: “We acknowledge the potential for facial recognition technology to help protect against serious issues, such as crime and violent behaviour. However, any possible benefits need to be weighed against the impact on privacy rights, as well as our collective values as a society”.

What can we learn?

This decision is a wake-up call for all organisations that deploy, or are considering the deployment of, FRT systems, including those in transport, retail, property or private security services, as well as employers relying on FRT for access control, workflow or health and safety risk management. The OAIC has produced a succinct infographic summarising its findings, which is available here.

Biometric information is of an intrinsically private nature and more permanent than other data. It can be used to uniquely identify an individual in a range of different contexts. Perhaps rightly, the compliance burden is high, and obtaining pragmatic but realistic privacy legal advice is more crucial than ever before.

What can your business do?

Some recommended practical steps include:

  • Conduct a PIA. In doing so, define your specific objectives and assess each of them in an objective manner and without a foregone conclusion in mind. Put yourself in the shoes of the individual rather than those designing corporate strategy.
  • Carry out due diligence on your FRT provider. Do your own checks and do not rely on the provider as the sole source of compliance advice, transparency, and training.
  • Negotiate appropriate terms with your FRT provider including SLAs for compliance queries and do not take on risks of new technologies that should rightly sit with the provider promoting them.
  • Do not rely on consent unless there is no other lawful way of deploying FRT. The OAIC felt compelled to clarify that no valid, informed, voluntary, current and specific consent could be deemed to have been given by individuals simply entering the store.
  • Remember that ‘permitted general situations’ have specific, narrow parameters which must be met, or else you risk non-compliance.
  • Provide mandatory transparency information to individuals as required under APP 5, taking into account practicability, the medium used and audience demographics. Treat transparency as a mitigation measure rather than a source of risk and liability.

Do you want more information?

Alex Dittel

Alex leads our Data Privacy, Cyber and Digital practice. He brings 15 years of experience in data protection, information security and technology commercial matters acquired during his time working for big and small technology companies and law firms in the United Kingdom and Australia.

Melbourne, Australia

