Out-Law Analysis
10 Feb 2026, 12:41 am
In a significant development for Australian privacy and technology governance, the Administrative Review Tribunal has released its decision following a review of the Australian Privacy Commissioner’s determination on Bunnings’ use of facial recognition technology (FRT), a ruling that will help shape how FRT is deployed.
The decision should give businesses using these technologies in retail and public settings further clarity about the steps they should take to inform individuals when their personal information is collected, and about the conditions that must be met before FRT can be used to collect sensitive data without consent while remaining compliant with the Australian Privacy Principles (APPs).
The Tribunal partially overturned the Privacy Commissioner's 2024 determination in the case, finding that Bunnings’ collection of biometric data fell within a 'permitted general situation' exception under section 16A of the Privacy Act 1988, given its limited purpose of addressing serious retail crime. However, the Tribunal upheld the Privacy Commissioner's other findings.
Bunnings, a major warehouse retailer, maintained a database of a few hundred people it had ‘enrolled’ after assessing them as posing a risk to its operations, based on a number of criteria relating to their violent or criminal conduct, or suspected conduct. The database held the unique biometric vector points derived from images sourced from its CCTV and from state police. In a pilot between 2018 and 2019, in-store CCTV cameras captured real-time facial images of people entering around 60 of its stores, which were then converted into input vector datasets and compared with the vector sets in the database. A match would trigger an alert to a Bunnings employee, who could then act, such as by calling the police or removing the person from the store. Unmatched images were automatically deleted by the FRT system within roughly four milliseconds.
Following an investigation commenced in 2022, the Privacy Commissioner determined that Bunnings’ use of FRT had breached its obligations under three key APPs:
Bunnings applied to review the decision and, following a hearing in October last year, the Tribunal published its decision and reasons. The decision affirms the Privacy Commissioner's determination that biometric data had been collected by Bunnings, and affirms all of the APP breach findings except, critically, the finding that Bunnings should have obtained consent and could not have relied on a permitted exception, or 'lawful basis', for collecting facial images using the FRT.
With the assistance of expert evidence, the Tribunal found that, when operating the FRT system, Bunnings “collected facial images of as many people as possible who came into the Bunnings stores” through the CCTV, from which the input vector sets were created by Bunnings, and that it collected both the matched and unmatched data on its local servers. The Tribunal did not consider that the speed of the matching process meant collection had not happened and agreed with an earlier decision of the Tribunal which found that “facial images collected by an entity for the purposes of biometric identification” was biometric information and therefore sensitive information for the purpose of the Privacy Act.
APP 3.3 prohibits the collection of sensitive information about an individual, such as their biometric data, without consent, unless one of the exceptions in APP 3.4 applies. This includes the existence of one of the ‘permitted general situations’ outlined in section 16A(1) of the Privacy Act.
The Tribunal accepted that the conditions were met for the ‘permitted general situation’ in item 2 of section 16A(1) in relation to the biometric data collected by Bunnings because it:
Bunnings only had to satisfy the ‘reason to suspect’ condition for the exception to be triggered, which the Tribunal considered to be a relatively low bar. It found that, based on evidence from Bunnings’ employees, there had in fact been unlawful in-store activity. The Privacy Commissioner had also accepted that Bunnings had reason to believe that misconduct sufficient to give rise to a criminal offence, such as suspected theft, refund fraud, and actual or threatened violence and other forms of abusive or threatening behaviour, was occurring in its stores. Bunnings also had a practice of issuing prohibition notices to people no longer allowed in its stores because they posed a serious threat to the health and safety of employees or other customers, or had stolen more than a minimum amount of goods.
In relation to the second condition Bunnings had to meet, the Privacy Commissioner argued that “necessary” means “essential”, rather than “appropriate and adapted”. Although not defined in the Privacy Act, the APP guidelines define “necessary” for the purposes of the permitted general situations as “something more than merely helpful, desirable or convenient, but not essential or indispensable”. This definition, which Bunnings relied on, was accepted by the Tribunal.
Bunnings argued that, by restricting the scope of its collection practice to repeat offending involving theft, violence or abuse, it was an appropriate use of FRT to monitor and remove identified people from its stores due to their past conduct and likelihood of reoffending. The Tribunal rejected the Privacy Commissioner's argument that the use of FRT was inappropriate, including because it was not designed to, and did not in fact, address the issue of unlawful conduct more generally. Drawing on the evidence of Bunnings' employees, which detailed significant criminal activity affecting profits and the impact of abuse on employees, the Tribunal accepted that Bunnings’ belief was reasonable based on the objective circumstances.
However, the Tribunal affirmed the Privacy Commissioner's findings that Bunnings had not complied with its APP 5.1 obligation to notify individuals of its use of FRT to collect their facial images, had not complied with APP 1.3 by keeping its privacy policy up to date and disclosing this practice, and had not met its APP 1.2 obligation to take reasonable steps, given the circumstances of the collection of sensitive information and its impact on privacy, to ensure compliance with its APP obligations.
The Tribunal’s decision provides important direction for Australian businesses deploying FRT or other systems to collect and use sensitive data like biometrics. It means they can rely on the permitted general situation exception, which involves balancing the protection of privacy against the interests of entities where they can show, for example, that there are serious, ongoing threats to their operations or the safety of their staff that require effective mitigation. However, the Tribunal’s reasoning means that this exception is far from a broad permission. To have a reasonable belief that the collection, use or disclosure of the information in question was necessary, an entity must be able to demonstrate proportionality regarding the benefits of the activities and the suitability of the chosen technology to address the issue, as well as the ineffectiveness of less intrusive alternatives and the use of appropriate data minimisation features.
The Tribunal heard evidence of the FRT's effectiveness in identifying known repeat offenders, reducing theft and improving staff safety. In addition, false positives produced by the system were subject to human review, and the system was one of several controls Bunnings had in place. A further important factor was the Tribunal's view that the security environment in which Bunnings operates is significantly different to that of other retailers.
Even when an entity can lawfully collect an individual’s sensitive data, the Tribunal’s decision reinforces the importance of both taking reasonable steps to comply with the APPs and maintaining a clear and up-to-date privacy policy. Both the Privacy Commissioner and the Tribunal found that Bunnings should have undertaken a structured and documented privacy risk assessment in relation to its use of FRT, had not developed or maintained adequate privacy policies disclosing its use of FRT, and did not provide clear and timely notice to customers that facial recognition was being deployed in its stores.
The Tribunal found that early signage referring only to “video surveillance” was insufficient, and that later notices stating that surveillance “may include facial recognition” still failed to communicate the system’s actual operation, purpose and consequences. The Tribunal emphasised that entities cannot rely on technical complexity or store environments as justification for failing to provide clear notice.
The Tribunal’s finding that even the momentary capture of facial recognition data constitutes “collection” also confirms that emerging technologies that process personal information, even if only briefly, can fall within the scope of the Privacy Act.
In a statement issued following the decision, the Privacy Commissioner highlighted the Tribunal’s ruling that even momentary capture of biometric information constitutes collection and reiterated that the adoption of emerging technologies must be accompanied by strong privacy governance, which includes comprehensive privacy impact assessments, explicit customer notification and transparent policy documentation.
The regulator also pointed to broader public concern about the challenges individuals face in protecting their personal information. Privacy advocates have expressed concerns about the Tribunal’s precedent of permitting covert biometric surveillance in retail environments, and the risk of normalising mass data collection, exacerbating biases and placing customers on watchlists without transparency or redress.
Unlike other jurisdictions, Australia does not currently have any specific laws that regulate FRT. The Privacy Act Review made several relevant recommendations in relation to using FRT and biometric information. In addition to the proposal to introduce a fair and reasonable test, these included introducing a requirement to conduct privacy impact assessments on activities with a high privacy risk and considering how to adopt enhanced risk assessment requirements for FRT and other uses of biometric information, which should be done as part of a broader consideration by government of the regulation of biometric technologies.
The review also recommended that the Privacy Commissioner should continue to develop practice-specific guidance for new technologies and emerging privacy risks and, specifically, the use of biometric templates or biometric information for the purpose of verification or identification, or when collected in publicly accessible spaces. However, while some of these recommendations were accepted by the government, none of them were included in the first tranche of the Privacy Act reforms that were passed in December 2024.
The Tribunal’s decision provides valuable clarification about how to comply with the APPs when collecting sensitive information and the requirements for using technologies like FRT, which was a stated objective of the Privacy Commissioner in making her determination. The decision is also important in defining when FRT may be used to address real safety and financial risks, while balancing the clear need for transparency and trustworthiness.
Striking the appropriate balance between security and privacy in an era of rapidly advancing surveillance capabilities will continue to require weighing all relevant factors before deciding to use technology that intrudes on privacy. As retail, transport, sports venues and other high-traffic environments increase their use of biometric tools, and employers seek to protect their staff, it will become more important to set clear legal boundaries that address the risks, such as profiling and errors, and to ensure avenues are available for individual redress.
Co-written by Matt Wilson of Pinsent Masons.