Bunnings CEO now wants the Government to pass laws to make facial recognition legal after Bunnings was found to have breached the Privacy Act when using facial recognition
July 17, 2025
In November last year Bunnings was found to have breached the Privacy Act 1988 by using facial recognition technology without consent and by failing to take reasonable steps to notify individuals that their personal information was being collected. There were no meek apologies from Bunnings: it came out saying it had every right to use facial recognition and that it was the most effective way to combat rising crime. It has appealed. Now the Managing Director of Bunnings has come out in an AFR article complaining about the privacy law and calling for new laws to allow facial recognition in stores. That is very curious. To call for a change to the law while appealing a decision under the extant law is not illegal. But it is quite arrogant. Changing the law to allow for such a carve out would significantly damage the operation of the Privacy Act.
Meanwhile, as reported in CBA using facial recognition logins to verify disputed payments, the CBA is showing itself to be an enthusiastic user of facial recognition.
The AFR article provides:
Bunnings managing director Michael Schneider has called for privacy laws to be changed to allow the use of facial recognition in stores to reduce shoplifting and protect staff.
In a submission to a Productivity Commission review, Schneider pushed back on a Privacy Commissioner ruling against Bunnings’ earlier use of facial recognition, which found it had breached the privacy of thousands of customers.
“These technologies are essential to protecting team members and customers from rising incidents of violent and threatening behaviour across the retail sector, and other losses that come from retail crime,” Schneider wrote.
He added that the technology could be used safely, responsibly and ethically.
In November, Privacy Commissioner Carly Kind found Bunnings had breached the privacy of thousands of customers by using facial recognition technology without their knowledge or consent.
The CCTV system captured the faces of every person who entered 63 Bunnings stores in Victoria and NSW between 2018 and 2021 to identify repeat offenders from a database of images. The images were deleted within 4.17 milliseconds if they were not known offenders.
At the time, Kind said that while the technology may have been a “well-intentioned effort to address unlawful activity”, its possible benefits did not outweigh the impact on privacy rights and society’s values.
Bunnings later released a video compilation showing staff being attacked in its stores, to illustrate the threats faced.
“Just because a technology may be helpful or convenient, does not mean its use is justifiable,” Kind said.
Schneider said the retailer disagreed with the finding that its use of facial recognition interfered with individuals’ privacy and has lodged an appeal of the Privacy Commissioner’s decision with the Administrative Review Tribunal.
He said Bunnings has recorded a 50 per cent rise in abusive and threatening incidents in its stores in recent years and “up to 60 to 70 per cent of incidents are caused by a small group of repeat offenders”.
The Privacy Commissioner’s determination did not impose any penalties on Bunnings, but it has restricted the Wesfarmers-owned retailer’s use of facial recognition tools in its stores.
In his submission to the Productivity Commission review into how new technology such as artificial intelligence should be used, Schneider recommended the Privacy Act be reformed to keep pace with changes ushered in by AI.
“Bunnings is concerned with the evolving interpretation of privacy regulations, which appears to prioritise prescriptive compliance over balanced outcomes. Regulators should focus on providing clear guidance and direction to give businesses confidence to act,” he wrote.
A submission by technology giant Meta meanwhile argued that AI could deliver an estimated US$53 billion ($80.9 billion) to US$127 billion in economic benefits by 2034, according to research from Deloitte.
The Facebook owner also called for the Privacy Act to be reformed and to mandate the Privacy Commissioner to consider “innovation and economic interests” while also protecting individuals’ rights.
“By explicitly tasking the Commissioner with these, the amended Privacy Act will more strongly encourage a holistic approach to enforcement. This ensures that privacy efforts are aligned with broader economic and innovation objectives,” Meta wrote.
Google’s submission urged the government to avoid heavy-handed AI regulation, such as the EU’s AI Act, which the tech giant argued would create a “chilling effect” on innovation.
The federal government introduced a voluntary standard on AI safety last year, and had been developing plans to introduce mandatory safeguards for AI in high-risk industries. However, Industry and Innovation Minister Tim Ayres is yet to reveal the government’s position on specific laws governing AI.
The tech giant claimed uncertainty surrounding regulation of AI systems and applications could cause tech companies to invest elsewhere.
“Businesses, especially those operating at the cutting edge of AI development, may hesitate to commit significant resources if the precise interpretation and application of the rules in Australia remain unclear, are overly restrictive, or if they anticipate future modifications that could render current investments non-compliant,” Google wrote.
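The capture-match-delete process described in the Privacy Commissioner's findings above can be sketched in rough outline. This is purely illustrative: all names, structures and the use of a hash as a stand-in for a biometric template are my own assumptions, not a description of Bunnings' actual system.

```python
import hashlib

# Stand-in for a face "template": a real system would extract a biometric
# feature vector from the frame, not hash the raw image bytes.
def make_template(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of templates for known repeat offenders.
OFFENDER_DB = {make_template(b"known-offender-image")}

def process_entry(image_bytes: bytes) -> bool:
    """Compare a captured face against the offender database.

    The determination found that non-matching images were deleted almost
    immediately (within roughly 4.17 milliseconds); only matches were
    retained and flagged. This sketch mirrors that match-or-discard logic.
    """
    template = make_template(image_bytes)
    if template in OFFENDER_DB:
        return True   # match: retained and flagged for staff
    del template      # no match: discarded, nothing stored
    return False
```

Even in this simplified form, the privacy issue the Commissioner identified is visible: every entrant's face is templated and compared, regardless of whether anything is ultimately retained.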
The itnews article about the CBA’s use of facial recognition technology provides:
Data-matching technique used in an unfair dismissal case.
CBA is using facial recognition logins to its banking app to determine whether customers who dispute a transaction did in fact authorise the payment.
The investigative technique has come to light in an unfair dismissal case, where an employee was sacked after disputing “multiple transactions” totalling $500 from an unknown merchant on his personal bank account.
The now former employee said “the transactions in question were processed through QR code ordering” at a pub, “but were … handled by a third-party point-of-sale company.”
That meant the transactions did not show up on bank statements in the name of the pub.
“This naturally leads to confusion, as the transaction description does not clearly reflect the actual venue name,” the former employee said in written submissions to the Fair Work Commission.
“At the time of lodging the [transaction] disputes, I could not recognise the merchants involved.
“The right to dispute an unrecognised transaction is a basic consumer right, and I exercised that right in good faith – as any bank customer is entitled to do.”
CBA not only rejected the $500 claim, but also accused the man of lodging the dispute with fraudulent intent, commenced disciplinary action against him in his capacity as an employee, and ultimately dismissed him from his role.
A “serious misconduct” dismissal is likely to prevent the man from being re-employed in the finance sector.
In addition to the $500 claim, the man also disputed a different, unknown $49.97 transaction at the same time. This was refunded by the bank.
Face recognition data
One aspect of the case likely to raise questions is the bank’s admission that it used facial recognition authentication records in some capacity to try to prove who made the disputed transactions.
“[CBA]’s case … (in short summary) is that it investigated the disputed transactions and determined that the applicant must have been responsible for them,” the commission’s deputy president Gerard Boyce wrote.
“This is because the applicant was at the … venue on the day that the disputed transactions were made, and/or facial recognition software (embedded within the Commonwealth Bank app) was used to make and/or view (review) the transactions.”
The ex-employee countered that “his cousin, who he says shares access to his phone’s facial recognition capabilities, could [also] have been responsible for the transactions.”
CBA enabled iPhone users to log into its banking app using Face ID back in 2017.
CBA’s privacy policy for its app states that it doesn’t “collect or store [biometric] information in the CommBank app”.
However, it appears that any time a smartphone-based payment is either authenticated using facial recognition, or the user logs into the app with facial recognition to check on their transactions, this is logged by the app, and these logs can be used as a data point in investigations.
How detailed the logs are – and how definitively they tie a specific individual to a transaction – is not clear.
Also unclear is whether the user consent collected at the time the app is downloaded would cover this use of facial recognition-related data.
The unfair dismissal case remains unresolved.
In response to detailed questions from iTnews, a CBA spokesperson said: “We do not comment on matters currently before the courts.”
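The kind of event log the itnews article infers might look something like the following sketch. The field names and structure are my own invention for illustration; the point is that an app can honour a promise not to store biometric data while still recording, for each login or payment, that the device's operating system reported a successful biometric check.

```python
import datetime

def log_auth_event(log: list, event: str, device_id: str, success: bool) -> dict:
    """Append a hypothetical audit record for a biometric authentication.

    No biometric template is stored - only the fact that the OS reported
    a pass or fail. That is why such a log ties a transaction to a device
    and an enrolled face, not conclusively to one individual.
    """
    record = {
        "event": event,              # e.g. "app_login" or "payment_auth"
        "device_id": device_id,      # identifies the phone, not the person
        "biometric_success": success,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    log.append(record)
    return record
```

This also illuminates the ex-employee's "cousin" argument: Face ID permits an alternate appearance to be enrolled on a device, so a record of this kind establishes at most that someone enrolled on the phone passed the check, a gap the Fair Work Commission will presumably have to grapple with.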