Metropolitan Police in the UK install first permanent facial recognition cameras in London

March 25, 2025

The Times reports that the first permanent facial recognition cameras have been installed in London. It is being touted as a pilot project, but it may be a precursor to the scheme being extended across London. The Information Commissioner's Office has released guidance on the use of facial recognition, described as biometric recognition, and has also issued specific guidance on Live Facial Recognition Technology for police. There have been significant cases of misuse of facial recognition technology, with serious privacy implications. The misuse of facial recognition by police is well documented, as is misuse by the private sector: in February 2024 the ICO ordered Serco Leisure to stop using facial recognition to monitor employee attendance. The use of CCTV and facial recognition technology is more extensive in the United Kingdom than in Australia. That said, the UK regulator is quite active in reviewing its operation and the legislation is more rigorous than in Australia.

It is likely that the use of facial recognition technology will quickly become more widespread, especially with the use of AI. Doing so without properly adhering to the provisions of the Privacy Act 1988 may attract the attention of the regulator. On 19 November 2024 the Privacy Commissioner published a determination finding that Bunnings' use of facial recognition breached the Privacy Act. On the same day the Privacy Commissioner published guidance on the use of facial recognition technology. It is critical that organisations contemplating using this technology understand their obligations under the Privacy Act 1988.

The Times article provides:

Facial recognition cameras that scan for wanted criminals are being installed permanently on UK high streets for the first time.

The Metropolitan Police will permanently put up live facial recognition (LFR) cameras in Croydon, south London, as part of a pilot project that may see the scheme extended across the capital.

China publishes security measures on the use of facial recognition technology

March 23, 2025

In one of those "one for the books" events, the Cyberspace Administration of China, in collaboration with the Ministry of Public Security, has published security measures for the use of facial recognition technology. The measures will take effect on 1 June 2025. Given how intrusive Chinese authorities have been in the past with surveillance and the use of facial recognition technology, it will be interesting to see how much real change results.

The measures apply to activities that use facial recognition technology, being biometric recognition technology that identifies an individual from their facial information, to process facial information within China.

Interestingly, the measures exclude from their scope the processing of facial information for research and development or algorithm training purposes.

Under the measures, facial recognition activities must comply with applicable laws and regulations and, inter alia:

  • have a specific purpose;
  • be necessary;
  • minimise the impact on personal rights and interests; and
  • implement strict protection measures.

Personal information handlers must, inter alia:

  • before processing, inform individuals in a prominent manner and clear and understandable language of certain information, such as contact information and purposes and method of processing;
  • inform individuals of any changes to the information provided to them;
  • when the processing is based on consent, obtain voluntary and explicit consent, including providing the right to withdraw consent;
  • when processing a minor's information, obtain the consent of a parent or other guardian;
  • store facial information on facial recognition devices and not transmit it through the internet;
  • conduct a Personal Information Protection Impact Assessment (PIPIA) and include the contents outlined in the measures; and
  • if processing data of more than 100,000 individuals, notify the provincial-level or higher cybersecurity and informatization department within 30 working days, and provide the information outlined in the measures.


Office of the Information Commissioner attends Estimates

March 1, 2025


Senate Estimates is an annual event. For Governments it is a mandatory evil. For oppositions it promises to reveal a cornucopia of information with which to embarrass the government and burnish their credentials. For the agencies, in particular the public servants who front the various Estimates Committees, it is a burden to be carried as part of the job. This year the Information Commissioner's attendance before the Legal and Constitutional Affairs Legislation Committee proved to be no different. The Commissioner's opening statement was the usual anodyne, nothing-to-see-here statement, providing:

With the chair’s leave I take this opportunity to acknowledge the committee’s role and in doing so provide a brief opening statement outlining the important work of the Office of the Australian Information Commissioner (OAIC).

I appear today with the assistance of the FOI Commissioner Ms Toni Pirani and with the chair’s leave the Privacy Commissioner Ms Carly Kind appearing via link and Executive General Manager, Information Rights Ms Ashleigh McDonald.

Supported by our new organisational structure we are better positioned to operate as a contemporary and proactive regulator. Some of our recent initiatives and outcomes demonstrate our future direction. We have:

    • commenced preliminary inquiries into the privacy impacts of connected vehicles
    • commenced the development of a Children’s Online Privacy Code
    • developed a public facing dashboard to ensure that agency freedom of information (FOI) data is reported and presented more effectively
    • We will shortly deliver a report examining the use of messaging apps by Australian government agencies
    • We are building our strategic intelligence capabilities.

To deliver a proactive and contemporary regulatory approach to benefit the Australian community, agencies and industry alike, we will also focus on building staffing capabilities through an investment in new ways of working and professional development. Within our budgetary parameters, our technology and systems will also be a focus to support our new direction.

However, we are also mindful to deal with our core case management responsibilities and reduce our backlog in both FOI and privacy cases. Our resources are challenged by a 25% increase in FOI Information Commissioner review (IC review) applications compared to the same period last year. This is against a backdrop of an increase in FOI IC review applications over the last 5 years that is estimated to double the number of FOI IC review applications received in 2019–20. We also face an overall growth in privacy case work and increasing complexity in our case work arising from digital services and emerging technologies. This has a particular impact on our privacy case work.

Our enforcement capabilities have been assisted by an increase of funding in recognition of the complexities of enforcement. Similarly designated funding has been provided to the OAIC to develop the Children’s Online Privacy Code and guidance regarding the social media age limit.

Our appearance and preparatory papers are informed by data as at 15 January 2025.  However, to assist the committee, as at 23 February 2025 the OAIC 2024–25 case statistics are as follows:

    • 1,279 FOI review applications were received and 1,494 finalised.
    • 196 FOI complaints were received and 216 finalised.
    • 1,966 privacy complaints were received and 1,687 finalised.

During this period, we also finalised a number of complex privacy matters that have delivered a strong enforcement message and importantly established our expectations of the regulated community. In doing so, we are upholding the rights of privacy and information access enshrined in statute by the Australian Parliament and better serving the values and expectations of the Australian community.

I wish to acknowledge the significant work and expertise of the OAIC leadership in taking forward this major change program and recognise with gratitude OAIC staff for their dedication and commitment as we secure the fundamental human rights of privacy and information access in an increasingly complex environment.

The hearing before the Estimates Committee focused on the reduction in staffing in the Office from just over 200 to 138, a reduction of roughly 30%. Also of interest was the Privacy Commissioner's admission that the findings of the Property Lovers determination are not being complied with. In short, the behaviour complained of is continuing. The Privacy Commissioner is investigating what to do next.

An understaffed office is bad news for effective regulation, and that has been a chronic problem for this office. Fortunately a statutory tort for serious invasions of privacy will be available as of June 2025, so in many cases individuals will not need to rely on the Commissioner taking up their complaint.

The Transcript provides:

CHAIR: With 20 minutes to go in our hearing, we’re going to politely and apologetically, dismiss the Australian Human Rights Commission. We won’t get to them this evening. We thank them for their time and for travelling. We do have questions for them, but we won’t have time to put them. We thank them for their ongoing work, particularly in the current environment. I know they’re working very hard. So thank you very much.

Welcome, commissioners. Do you have an opening statement you’d like to table?

Ms Tydd : I do have a very brief opening statement and I’m happy to table that.

CHAIR: Thank you very much. That will be circulated to senators so they can read from that when they have it in front of them. In the meantime, I'll pass the call to Senator Scarr.

  Senator SCARR: Commissioner, how many staff have left the OAIC since August last year?

Ms Tydd : I don’t think I could speak with authority from the date of August, but I can give you the very high-level numbers of staffing pre and post our organisational redesign.

  Senator SCARR: Can you give me the dates for the organisational redesign, so I can calibrate that with my August date.

Ms Tydd : Yes. That was finalised in mid-November, about 17 November. The organisational redesign responded to our significant budgetary situation, in which we would be operating at a deficit. Action was taken around that. At the time, in July, we had an FTE of just over 200. Our organisational redesign that allowed us to operate within our budgetary parameters—

  Senator SCARR: Sorry; it’s late. I’ve got to get these numbers right. In July your FTE was just over 200?

Ms Tydd : Correct. And our ASL cap came down to 173. We knew that within our budgetary parameters we’d need to operate at around 165. We didn’t purely look at staffing levels in relation to meeting our budgetary parameters; we looked at a range of measures. They included external supply costs. Legal costs were something that we focused on as well. So, yes, we were required to reduce staffing in response to our revised budgetary parameters, and that process was completed around mid-November.

  Senator SCARR: Okay. What were the FTE numbers as at mid-November, when you completed that process?

Ms Tydd : There probably was still some lag. I’d say it would be about 175. I’ll see if I have any dates that will help you further. I can tell you that as at 18 December, as we were still working through that process, our staffing level was 175.

  Senator SCARR: Do you have the data as at today or the most recent data as at the end of the month? Do you have any most recent data?

Ms Tydd : As at 29 January, it was 138.4.

  Senator SCARR: So you went from 175 as at 18 December—that was the figure you gave?—

Ms Tydd : Correct.

  Senator SCARR: to 138.4 as at 29 January?

Ms Tydd : That’s correct, with a headcount of 156.

  Senator SCARR: Okay, so you’ve got part-time—

  Senator SHOEBRIDGE: So as we don’t have to traverse across this, do you mind if I ask: you’ve been talking FTE all the time through, so these have all been the same dataset of FTE, full-time equivalents?

Ms Tydd : Yes.

  Senator SCARR: So you went from—we’ll try and use the common terminology—FTE as at 18 December of 175 to FTE as at 29 January, which is only a month later, of 156. Is that correct?

Ms Tydd : The figure I have is 138.4.

  Senator SCARR: 175 to 138.4?

Ms Tydd : Yes. They're the figures I have before me.

Australian Privacy Commissioner gets a nice media makeover, er, is the subject of a deep and insightful report, as is currently done, over lunch

February 2, 2025

C'est chic to do an in-depth piece over an extravagantly priced breakfast or lunch. Not only does the reader get to know something about the subject, but we get an insight into what the movers and shakers are eating and where they congregate to consume. The Australian Financial Review has recently published a profile of Carly Kind, the recently appointed Privacy Commissioner. This is something of a first for Privacy Commissioners. The most recent Information Commissioners (who covered privacy), Timothy Pilgrim (a pleasant but through and through public servant) and Angelene Falk (a long serving deputy in the Office of the Australian Information Commissioner), were not media averse as such, but their media forays were relatively few and brief, usually confined to an interview on the ABC or quotes for other media. Their speeches at conferences were safe and predictable and certainly not designed to shake up the woeful privacy culture in the Australian marketplace. Even by the grey standards of Australian regulators they were distinctly in the background. Which was a shame. Privacy issues did not get ventilated as much as they should have. That is perhaps understandable given the generally ineffective regulation and enforcement of the Privacy Act. To be fair, the last few years have seen a marked improvement in enforcement, but it has come off a low base and has not yet had a significant impact on the market. And to be fair, Pilgrim and Falk were marked improvements on their predecessors.

Carly Kind has had a good start as Privacy Commissioner, with a distinct uptick in enforcement action and more assertive commentary. That she has a pedigree largely outside the Australian Public Service is a huge advantage. She may be less hidebound by conservative, self-restraining litigation guidelines. We can only hope, given she has been handed even more enforcement powers in the most recent amendments to the Privacy Act late last year. In this article she was candid in criticising poor public policy which has led to privacy invasive practices, something I have been writing about for years. She needs to bring high profile actions which put high profile privacy breaching companies into the media spotlight. This is a common approach of ASIC and the ACCC. That is the only way of changing the culture in the marketplace.

The article gives some restrained hope that the coming years will see more effective and high profile regulation of privacy breaches.  It is well overdue.

The article provides:

My lunch with the Australian Privacy Commissioner, Carly Kind, begins with a confession.

“I tried to stalk you on social media on my Uber on the way,” I say as she sits down at Manly’s Noon café, bike helmet in hand.

Looking up other people’s social media is something everyone does but no one should ever admit to, particularly not to the woman charged with protecting the nation’s privacy by upholding the Privacy Act of 1988.

Kind is taken aback and for a moment, I think I’ve blown it before we’ve even ordered a coffee, let alone lunch.

“Did you find anything interesting?” she responds after what feels like an age.

No. She is on Instagram and on Facebook. But both attempts to glean any information of value were foiled despite me being a Millennial journalist well versed in the art of lurking.

Privacy Commissioner Carly Kind admits she’s less idealistic about the role of regulation in protecting online privacy and worries one day big tech will decide not to obey the law.  

Her Instagram is set to private. Her Facebook isn’t locked but the only photo I can click on is of the back of her head. I did manage to deduce she has 737 Facebook friends, but there are no workplaces, relationships, or really any other information to show.

When I lament my efforts were dashed, she’s nonchalant, “I really don’t use Facebook these days, but I can’t get rid of it because of Marketplace.”

I feel seen immediately.


Meta settles civil penalty proceeding with the Office of the Information Commissioner arising out of the Cambridge Analytica scandal for $50 million and an enforceable undertaking

December 17, 2024

In the dying days of 2024, when the focus is on presents, holidays and plum pudding (for some at least), Meta has settled the civil penalty proceeding in the Federal Court. Meta will also enter into an enforceable undertaking. The $50 million will not be distributed immediately. Eligibility will depend on whether a person was in Australia between November 2013 and mid-December 2015 and installed the This is Your Digital Life app or was a Facebook friend of someone who had that app installed.

This is a very welcome development.  The civil penalty proceedings power in the Privacy Act has until recently been underutilised.

The Commissioner’s media release provides:

The Australian Information Commissioner today agreed to a $50 million payment program as part of an enforceable undertaking (EU) received from Meta Platforms, Inc. (Meta) to settle civil penalty proceedings. The payment scheme will be open to eligible Australian Facebook users impacted by the Cambridge Analytica matter.

The Commissioner alleged that the personal information of some Australian Facebook users was disclosed to the This is Your Digital Life app in breach of the Privacy Act 1988 (Cth). The information was exposed to the risk of disclosure to Cambridge Analytica and other third parties, and risked being used for political profiling purposes.

The agreement announced today follows a court-ordered mediation, which has been ongoing since February 2024, as part of the Federal Court civil penalty proceedings the Commissioner commenced in March 2020.

“Today’s settlement represents the largest ever payment dedicated to addressing concerns about the privacy of individuals in Australia,” Australian Information Commissioner Elizabeth Tydd said.

“It represents a substantive resolution of privacy concerns raised by the Cambridge Analytica matter, gives potentially affected Australians an opportunity to seek redress through Meta’s payment program, and brings to an end a lengthy court process.”

As part of the resolution, the Commissioner has withdrawn the civil penalty proceedings in the Federal Court.

The EU requires Meta to set up a payment scheme, which will be run by an independent third-party administrator. Meta will appoint the third party to administer the payment scheme, who will be announced early next year. The scheme will be open to individuals who:

    • held a Facebook Account between 2 November 2013 and 17 December 2015;
    • were present in Australia for more than 30 days during that period; and
    • either installed the This is Your Digital Life app or were Facebook friends with an individual who installed the app.

The payment scheme will be structured into two tiers of payments. The first will permit individuals to apply for a base payment if they believe they experienced generalised concern or embarrassment because of the matter. The second category will provide for specific payment, likely to be higher than the base payment, to those who can demonstrate they have suffered loss or damage. The third-party administrator will also establish a timely internal review avenue for individuals in relation to the payment scheme. The Office of the Australian Information Commissioner anticipates individuals may be able to start applying to the payment program in the second quarter of 2025.

Any residual funds not exhausted in the payment scheme will be paid into the Commonwealth’s Consolidated Revenue Fund. Meta also paid a contribution to the Commissioner’s legal costs.

“The payment scheme is a significant amount that demonstrates that all entities operating in Australia must be transparent and accountable in the way they handle personal information, in accordance with their obligations under Australian privacy law, and give users reasonable choice and control about how their personal information is used,” Commissioner Tydd said.

“This also applies to global corporations that operate here. Australians need assurance that whenever they provide their personal information to an organisation, they are protected by the Privacy Act wherever that information goes.”

“We remain committed to applying our powers under the Privacy Act to achieve proportionate outcomes to ensure that Australians’ privacy is protected, particularly with respect to technologies that have a high privacy impact. This groundbreaking outcome reflects the significant concerns of the Australian community,” Privacy Commissioner Carly Kind said.

Since then-Australian Information Commissioner Angelene Falk commenced the civil penalty proceedings against Meta in March 2020, the penalties for serious or repeated interferences with privacy (which can only be imposed following the commencement of civil penalty proceedings in the Federal Court) have increased from $1.7 million for each serious and/or repeated interference with privacy to whichever is the greater of $50 million, three times the value of any benefit obtained through the misuse of information, or 30% of a company's adjusted turnover in the relevant period.

Read the enforceable undertaking.

Details of payment scheme

    • Funds of $50 million will be available.
    • Individuals who were present in Australia for more than 30 days between 2 November 2013 and 17 December 2015, and either installed the This is Your Digital Life app, or who were Facebook friends of an individual who installed the This is Your Digital Life app, can apply for a base payment based on generalised concern or embarrassment, or an alternative amount if they can demonstrate specific loss or damage.
    • The third-party administrator will take reasonable steps to publicise the payment scheme.
    • Meta is required to make reasonable best efforts to notify those who are potentially impacted.
    • The payment scheme will be administered by a third-party administrator to be appointed by Meta. Payment is required to be made in a timely manner.
    • Details for accessing the payment scheme will be made public by the administrator in the second quarter of 2025.


The Australian Information Commissioner publishes guidance on tracking pixels

November 5, 2024

Tracking pixels are HTML code snippets that load when someone visits a website and are used to track user behaviour. Advertisers can use this data for online marketing and web analysis. In the latest of a surge of guidances, the Office of the Australian Information Commissioner ("OAIC") has published guidance on tracking pixels.
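By way of illustration, a tracking pixel is typically nothing more than a tiny, invisible image whose source URL points at the third-party provider's server, with details of the visit carried in the URL's query string. The snippet below is a minimal, hypothetical sketch of that mechanism; the provider domain and parameter names are invented for illustration and are not taken from the OAIC guidance.

```html
<!-- Hypothetical example only: the domain "tracker.provider.example" and the
     query parameters are invented for illustration. When the browser loads this
     1x1 invisible image, the request sends the page and visitor details to the
     third-party provider's server. -->
<img src="https://tracker.provider.example/pixel.gif?event=page_view&amp;page=%2Fcheckout&amp;visitor=abc123"
     width="1" height="1" style="display:none" alt="" />
```

Because the request goes directly to the third party's server, the organisation deploying the pixel has limited visibility over what happens to the data once it leaves the user's browser, which is one reason the guidance stresses due diligence before deployment.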

Given the increased powers proposed in the Privacy and Other Legislation Amendment Bill 2024, organisations covered by the Privacy Act 1988 need to consider their use of tracking pixels before the amendments come into force.

The media release provides:

The Office of the Australian Information Commissioner (OAIC) has released guidance for private sector organisations to ensure they meet their obligations under the Australian Privacy Act when using third-party tracking pixels on their website.

Publication of the guidance responds to industry demand for greater detail on the application of the Privacy Act to tracking technologies, as well as interest in the topic across government, media and the community.

Many social media companies and other digital platforms offer tracking pixels. A tracking pixel is a piece of code generated by a third-party provider that can be placed on an organisation’s website to collect information about a user’s activity. When a user visits a webpage with a tracking pixel, the pixel loads and sends certain types of data to the server of the third-party provider.

Pixels are one of many tracking tools, including cookies, that permit granular user surveillance across the internet and social media platforms. They can be important to business for analysis, advertising and measurement of return on investment.

“However, many of these tracking tools are harmful, invasive and corrosive of online privacy,” Australian Privacy Commissioner Carly Kind said.

“This is a real concern in the community with our Australian Community Attitudes to Privacy Survey 2023 finding that 69% of adults did not think it fair and reasonable that their personal information was used for online tracking, profiling and targeted advertising, with that rising to 89% when material was targeted at children.”

The guidance makes clear that it is the responsibility of the organisation seeking to deploy a third-party tracking pixel on their website to ensure it is configured and used in a way that is compliant with the Privacy Act.

Before deploying a third-party pixel, organisations should ensure they understand how the product works, identify the potential privacy risks involved and implement measures to mitigate those risks, and not adopt a ‘set and forget’ approach.

Failing to conduct appropriate due diligence can create a range of privacy compliance and other legal risks.

Consistent with the OAIC’s recent guidance on the use of generative AI products, the OAIC is seeking to expand its range of guidance for organisations so that they can continue to grow their businesses while meeting privacy obligations in a way that builds community trust.


Information Commissioner releases Annual Report

November 1, 2024

It is annual report season for Government agencies and authorities, and that includes the Office of the Australian Information Commissioner. Yesterday the Commissioner released its 194-page Annual Report for 2023–24.

Given the significant amendments to the Privacy Act 1988 it is better to look forward to how the Privacy Commissioner approaches her responsibilities with newfound powers rather than poring over the activities of the Privacy Commissioner over the past year. On that note, the work rate improved but it remained a timid regulator by any measure. Which is a pity given the Information Commissioner's remuneration was $576,174 and Deputy Commissioner Elizabeth Hampton's was $380,091. The relatively newly appointed Privacy Commissioner, Carly Kind, is on $109,239.

In relation to privacy complaints the Commissioner stated:

Privacy has been very much in the spotlight, with the continuing incidence of major data breaches. In 2023–24, we received 13% more notifications under the Notifiable Data Breaches (NDB) scheme than the year prior, when there was a 4% increase. We lifted our response rate, closing 84% of notifications within 60 days (compared to 77% last reporting year). In the 2022–23 financial year we received a 34% increase in privacy complaints. This year, complaints have remained relatively high, with a slight decrease of 5% year on year. We successfully responded to this high demand, finalising 20% more privacy complaints (3,104 in total), building on last year’s increase of 17% (2,576 finalised in total).
We continued our focus on clearing longer-standing, generally more complex and resource-intensive complaints, finalising 84% (271) of the 322 matters that were over 12 months old as at June 2023. At the same time, more recent complaints increased in age over the reporting period. The volume of complaints, combined with the focus on the longest-standing, meant that by the year’s end there was an overall increase in matters older than 12 months to 729. The OAIC will continue to focus on aging cases through process efficiencies and the strategic application of resources.


The Australian Information Commissioner issues updated guidance for charities and other not for profit organisations

October 24, 2024

The Australian Information Commissioner has issued updated guidances for charities and other not-for-profit organisations. Guidances are not regulations but they are very important. Organisations which comply with the guidances and somehow still have a data breach or other form of interference with privacy may be able to argue that they have done all that was required of them. The reality is that if more organisations focused on complying with guidances and standards there would be far fewer data breaches. Clearly all investigations are fact specific and compliance with a guideline does not provide any sort of immunity.

The statement from the Commissioner provides:

The updated guidance includes expanded advice on security of information, and steps that not-for-profits can put in place to ensure compliance with their retention and destruction obligations.

In particular, the updated guidance includes discussion on what to consider when engaging third-party providers, such as for fundraising, or software vendors. This area is particularly topical in the wake of high-profile data breaches affecting charities and NFPs.

Privacy Commissioner Carly Kind said the guidelines aim to help charities navigate their privacy responsibilities when collecting and handling personal information, and understand their obligations under the Privacy Act.

"We know how critical trust is to the work of not-for-profits and charities, and how important good privacy practices are to that trust".

The Australian Information Commissioner releases guidance on artificial intelligence

October 21, 2024

AI presents a major regulatory challenge across a range of governmental and private activities, and that is especially the case with privacy. The UK Information Commissioner's Office has issued detailed guidance and other resources on artificial intelligence. The US Federal Trade Commission raised issues on AI by way of a Big Data report in 2016 and a post in 2017, issued guidance by way of Q & A in 2020, and made a finding on the use of artificial intelligence in In the Matter of DoNotPay, Inc., Matter Number 2323042, 25 September 2024. Which brings us to the Australian Information Commissioner's release of AI guidance today. There are actually two guides: one on the use of commercially available AI products, and a second for developers using personal information to train AI models.

AI needs personal information to work properly. Lots of it. Each of the guides highlights the care that needs to be taken in considering the operation of the Privacy Act when using and developing artificial intelligence.

The media release provides:

New guides for businesses published today by the Office of the Australian Information Commissioner (OAIC) clearly articulate how Australian privacy law applies to artificial intelligence (AI) and set out the regulator’s expectations.

The first guide will make it easier for businesses to comply with their privacy obligations when using commercially available AI products and help them to select an appropriate product. The second provides privacy guidance to developers using personal information to train generative AI models.

“How businesses should be approaching AI and what good AI governance looks like is one of the top issues of interest and challenge for industry right now,” said Privacy Commissioner Carly Kind.

“Our new guides should remove any doubt about how Australia’s existing privacy law applies to AI, make compliance easier, and help businesses follow privacy best practice. AI products should not be used simply because they are available.

“Robust privacy governance and safeguards are essential for businesses to gain advantage from AI and build trust and confidence in the community,” she said.

The new guides align with OAIC focus areas of promoting privacy in the context of emerging technologies and digital initiatives, and improving compliance through articulating what good looks like.

“Addressing privacy risks arising from AI, including the effects of powerful generative AI capabilities being increasingly accessible across the economy, is high among our priorities,” Commissioner Kind said.

“Australians are increasingly concerned about the use of their personal information by AI, particularly to train generative AI products.

“The community and the OAIC expect organisations seeking to use AI to take a cautious approach, assess risks and make sure privacy is a key consideration. The OAIC reserves the right to take action where it is not.”

While the guidance addresses the current situation – concerning the law, state of technology and practices – Commissioner Kind said an important focus remains how AI privacy protections could be strengthened for the benefit of society as a whole.

“With developments in technology continuing to evolve and challenge our right to control our personal information, the time for privacy reform is now,” said Commissioner Kind.

“In particular, the introduction of a positive obligation on businesses to ensure personal information handling is fair and reasonable would help to ensure uses of AI pass the pub test.”

The OAIC has published a blog post with further information about the privacy guidance for developers using personal information to train generative AI models.


Privacy and Other Legislation Amendment Bill 2024 – Government moves the Second Reading and publishes Second Reading speech

October 8, 2024

The Government has published the Second Reading Speech and adjourned debate on the Bill. The Second Reading Speech is dated 12 September 2024; however, the Daily Program lists the Speech as being moved today. It only recently appeared on the Bill's homepage.

The Bill provides the Privacy Commissioner with more flexibility in enforcement, allowing for infringement notices and new civil penalties. The real issue is getting the Commissioner to use those powers. The existing civil penalty provisions have only been used twice, and then only very recently, and neither case has reached resolution.

The statutory tort for serious invasions of privacy is welcome; however, the exemption carve-outs for journalism, law enforcement and security limit its effectiveness. There is no consideration of whether the actions of a journalist are excessive and irresponsible in breaching a person's privacy. In the UK there is a balancing between Article 8, the right to privacy, and Article 10, freedom of expression, as it applies to the media.

There is specific provision for the development of a Children’s Privacy Code.  According to the Attorney General that is designed to align the protections with those that exist overseas. 

Doxxing will be criminalised.

There are other provisions which clarify the sharing of information when there are data breaches and during emergencies and regarding overseas data flows.

The amendments are conservative and modest but a move in the right direction. These changes will not make Australia's Privacy Act the gold standard, but if the further reforms proposed by the Attorney General's Department are implemented then the level of protection will allow for more effective regulation.

The Second Reading provides:

Introduction

The digital economy has unleashed enormous benefits for Australians. But it has also increased the privacy risks we face through the collection and storage of enormous amounts of our personal data.

The Privacy Act 1988 represented the first time that a comprehensive, integrated set of legal rules protecting interests in privacy existed in Australia. On introducing it, Attorney-General Lionel Bowen told the parliament that ‘enormous developments in technology for the processing of information are providing new and, in some respects, undesirable opportunities for the greater use of personal information.’

In that respect, little has changed. Evolutions in technology and the way people use it continue to vex those who share information online, and those charged with regulating it. It is essential that Australians are protected by a legal framework that is flexible and agile enough to adapt to changes in the world around them.

The Privacy Act has not kept pace with the adoption of digital technologies. The vast data flows that underpin digital ecosystems have also created the conditions for significant harms—like major data breaches that have revealed the sensitive information of millions of Australians, exposing us to the risk of identity fraud and scams.