The Price of Security: How Bunnings’ FRT Ignited Privacy Concerns

Between November 2018 and November 2021, Bunnings Group Limited (Bunnings) implemented Facial Recognition Technology (FRT) across 62 retail stores in Australia with the aim of mitigating risky conduct that had the potential to adversely impact the safety and security of individuals in its stores. The Privacy Commissioner determined that Bunnings interfered with the privacy of individuals whose personal and sensitive information was collected through this FRT system. Below, Corporate + Commercial lawyers Amogh Kadhe and Zahra Rashedi share why this determination serves as an important reminder for businesses to assess their operations against their obligations under Australian privacy laws.

Privacy Concerns over Facial Recognition Technology

The Office of the Australian Information Commissioner (OAIC) initiated an investigation into Bunnings' practices, finding multiple breaches of the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs) sitting under the Privacy Act. These breaches included:

  • APP 3.3: Bunnings collected sensitive information, including biometric data and criminal records, without the individuals’ informed consent and without satisfying any of the exceptions under APP 3.4.

  • APP 5.1: Bunnings did not adequately notify individuals about the collection, purpose, or intended use of their personal information, or ensure that its customers were aware of these matters.

  • APP 1.2: Bunnings failed to implement reasonable practices, procedures, and systems to ensure ongoing compliance with the Privacy Act.

  • APP 1.3: Bunnings’ privacy policies omitted critical details about the type of personal information collected, how it was collected, and how it was managed.

As a result, the Privacy Commissioner determined that Bunnings’ practices constituted an interference with individuals’ privacy and ordered the company to cease these practices, provide public disclosure of its use of FRT, and take remedial actions to ensure future compliance.

Why is this an important case study when looking at FRT?

This case is significant because it highlights both the opportunities and risks associated with emerging surveillance technologies like FRT in commercial settings, and the need for businesses to thoroughly assess their operations against their privacy obligations. Some lessons include:

  • Sensitive information and biometric privacy: Biometric data, such as facial images processed for automated identification, is considered sensitive information under the Privacy Act and is subject to stringent protections. This case emphasises that businesses cannot collect such information casually or indiscriminately.

  • Consent and transparency are paramount: The determination underscores that individuals must give informed, voluntary, current, and specific consent before their biometric data is collected. Displaying vague notices or relying on implied consent (e.g. entry into a store) is insufficient.

  • Balancing safety and privacy: While FRT offers potential security benefits, such as identifying high-risk individuals and preventing theft, its use must be weighed against privacy obligations and its impact on personal privacy. The ruling reflects the societal unease around pervasive surveillance technologies in public spaces and the high expectations on businesses to respect privacy rights.

  • Precedent for corporate accountability: The determination sets a standard for Australian businesses using surveillance technologies to proactively assess and mitigate privacy risks through measures like Privacy Impact Assessments, compliance audits, and robust privacy by design practices.

  • Limitations of technology: Whilst not explicitly addressed in the determination, this case serves as a basis to consider other concerns about the accuracy of FRT systems, including issues of bias, false positives, and their potential for misuse or overreach in commercial environments.

Recommendations for businesses considering advanced surveillance

Finally, this determination gives rise to several recommendations for businesses considering advanced surveillance technologies:

  • Compliance is non-negotiable: Companies deploying FRT or similar technologies must strictly adhere to the Privacy Act and the APPs, especially those relating to sensitive information. Non-compliance risks significant reputational and operational consequences, as demonstrated by the public nature of this determination.

  • Transparency and accountability: Businesses are required to clearly inform individuals about data collection practices, including what is being collected, why it is being collected, and how it will be used. This must be done in a way that is accessible and understandable to the general public.

  • Necessity and proportionality: Businesses should be able to justify their use of technologies like FRT by proving their necessity and showing that less intrusive alternatives were considered and found inadequate. This includes balancing the benefits of such technologies with their privacy impact.

  • Broader industry implications: The decision establishes a precedent for how the OAIC evaluates the use of surveillance technologies, prompting other organisations using FRT or similar systems to re-evaluate their practices. The Privacy Commissioner explicitly encouraged all APP entities to undertake Privacy Impact Assessments if currently using FRT and reassess their compliance frameworks in light of this ruling.

  • Future scrutiny and reforms: This ruling reflects the growing regulatory scrutiny of emerging surveillance technologies in Australia. It signals potential for stricter legal reforms and oversight, ensuring that technological advancements do not come at the expense of fundamental privacy rights.

If your business is looking to implement new surveillance technologies or conduct a Privacy Impact Assessment, or if you need assistance with any other privacy-related matters, our commercial team is here to help. Reach out here.
