ASIC investigating how directors prepare for and respond to cyber attacks

September 18, 2024

The Australian Financial Review reports, in ASIC pursues board directors over cyber breaches, that ASIC is investigating how directors deal with cyber attacks, both before and after they happen. The ASIC Chair's speech, Effective compliance: Perspectives from the regulator, highlights this increased focus.

ASIC has been quite active in taking action against companies that have suffered damage as a result of data breaches, most notably its civil penalty proceeding against RI Advice.

The speech by the ASIC chair Read the rest of this entry »

Australian Government publishes policy for responsible use of Artificial Intelligence. Comes into force on 1 September 2024

August 17, 2024

The Australian Government has published a 19-page policy for the responsible use of AI. It comes into force on 1 September 2024.

The recommended actions include:

  • training staff on AI fundamentals, taking into account roles and responsibilities such as those of employees involved in procurement, development, training and deployment of AI;
  • making publicly available a statement outlining their approach to AI adoption, including information on compliance with the policy, measures to monitor the effectiveness of deployed AI systems, and efforts to protect the public against negative impacts; and
  • designating accountable officials for implementation of the policy within their organisation, who:
    • are the contact point for whole-of-government AI coordination;
    • must engage in whole-of-government AI forums and processes; and
    • must keep up to date with changing requirements as they evolve over time.

The key principles of the policy are aimed at ensuring that:

  • Australians are protected from harm;
  • AI risk mitigation is proportionate and targeted; and
  • AI use is ethical, responsible, transparent and explainable to the public.

The press release is found here and the policy here.

The press release provides:

The Australian Government needs a coordinated approach if it’s to embrace the opportunities of AI. The Digital Transformation Agency has released the Policy for the responsible use of AI in government, an important step to achieve this goal while building public trust.

Coming into effect 1 September 2024, the Policy for the responsible use of AI in government positions the Australian Government to be an exemplar of safe, responsible use of AI.

Designed to evolve with technology and community expectations, it sets out how the Australian Public Service (APS) will:

  • embrace the benefits of AI by engaging with it confidently, safely and responsibly
  • strengthen public trust through enhanced transparency, governance and risk assurance
  • adapt over time by embedding a forward-learning approach to changes in both technology and policy environments.

‘This policy will ensure the Australian Government demonstrates leadership in embracing AI to benefit Australians,’ states Lucy Poole, General Manager for Strategy, Planning, and Performance.

‘Engaging with AI in a safe, ethical and responsible way is how we will meet community expectations and build public trust.’

Enable, engage and evolve

The policy is driven by the ‘enable, engage and evolve’ framework to introduce principles, mandatory requirements and recommended actions.

Enable and prepare

Agencies will safely engage with AI to enhance productivity, decision-making, policy outcomes and government service delivery by establishing clear accountabilities for its adoption and use.

Every agency will need to identify accountable officials and provide them to the DTA within 90 days of the policy effect date.

Engage responsibly

To protect Australians from harm, agencies will use proportional, targeted risk mitigation and ensure their use of AI is transparent and explainable to the public.

Agencies will need to publish a public transparency statement outlining their approach to adopting and using AI within 6 months of the policy effect date.

Evolve and integrate

Flexibility and adaptability are necessary to accommodate technological advances, requiring ongoing review and evaluation of AI uses, and embedding feedback mechanisms throughout government.

Supporting agencies standards and guidance

To help implement the policy, the DTA has published a standard for accountable officials (AOs) to lead their agency to:

  • uplift its governance of AI adoption
  • embed a culture that fairly balances risk management and innovation
  • enhance its response and adaptation to AI policy changes
  • be involved in cross-government coordination and collaboration.

‘We’re encouraging AOs to be the primary point of partnership and cooperation inside their agency and between others,’ outlines Ms Poole.

‘They connect the appropriate internal areas to responsibilities under the policy, collect information and drive agency participation in cross-government activities.’

‘Whole-of-government forums will continue to support a coordinated integration of AI into our workplaces and track current and emerging issues.’

The DTA will also soon release a standard for AI transparency statements, setting out the information agencies should make publicly available such as the agency’s:

  • intentions for why it uses or is considering adoption of AI
  • categories of use where there may be direct public interaction without a human intermediary
  • governance, processes or other measures to monitor the effectiveness of deployed AI systems
  • compliance with applicable legislation and regulation
  • efforts to protect the public against negative impacts.

‘Statements must use clear, plain language and avoid technical jargon,’ stresses Ms Poole.

Further guidance on additional opportunities and measures will be issued over the coming months.

Continuing our significant work on responsible AI

The last 12 months saw important work to better posture the APS for emerging AI technologies including the AI in Government Taskforce, co-led by the DTA and Department of Industry, Science and Resources (DISR), which concluded on 30 June 2024. 

The taskforce brought together secondees and stakeholders from across the APS for an unprecedented level of consultation, collaboration and knowledge-sharing. Its outputs directly informed this new policy and even more, continuing work to ensure a consistent, responsible approach to AI by government.

‘Our AI in Government Taskforce was crucial in demonstrating that we need a centralised approach to how government embraces AI, if it wishes to mitigate risks and increase public trust,’ states Ms Poole.

The Australian Information Commissioner has commenced civil penalty proceedings against Australian Clinical Labs Limited in the Federal Court

November 20, 2023

After coming off some serious questioning in Senate Estimates about poor enforcement practices, the Commissioner announced on 3 November 2023 that the Office of the Information Commissioner had launched proceedings against Australian Clinical Labs on 2 November 2023 (file number NSD1287/2023). The Commissioner has filed a Concise Statement and Originating Application and Australian Clinical Labs Limited has filed a Notice of Address for Service. The Commissioner is represented by DLA Piper, out of its Brisbane office. Previously the Commissioner has been represented by HWL Ebsworth. Gilbert & Tobin, out of its Sydney office, is representing Australian Clinical Labs. Gilbert & Tobin represented RI Advice in the Federal Court case of Australian Securities and Investments Commission v RI Advice Group Pty Ltd [2022] FCA 496. That case has been heralded as a positive development in enforcing data security as an obligation of financial services licensees under the Corporations Act 2001, being section 912A. While RI Advice was the subject of compliance orders and penalties, it is fair to say that Gilbert & Tobin did a good job in keeping the stringency of the orders and penalty to a moderate level. Compared to overseas penalties by the European regulators, the UK Information Commissioner’s Office and Read the rest of this entry »

Legal and Constitutional Affairs Legislation Committee questions Office of Information Commissioner in Senate Estimates on 23 October 2023

October 27, 2023

Senate Estimates are an invaluable way of scrutinising government departments and asking questions on issues that do not find their way into Government reports. So it was when the Senate Legal and Constitutional Affairs Legislation Committee asked some long overdue questions of the Information Commissioner on 23 October 2023. With the Information Commissioner, top of the list of questions was the delay in investigating complaints and the lack of vigorous enforcement by the Commissioner. Compared to other privacy regulators the Australian Information Commissioner’s Office is tardy and timid.

Senator Shoebridge asked questions relating to those very issues. The answers were not particularly inspiring. The good Senator highlighted what privacy practitioners have long suspected: that the Commissioner doesn’t do enforcement. This extract is revealing:

Senator SHOEBRIDGE: How could it be that 1,748 data breaches are referred to your office with not a single penalty over two years? What has gone wrong?

Ms Falk: It’s not a matter of something going wrong. It’s about regulatory strategy. It’s about ensuring that we’re using the right tool in the right circumstances.

Senator SHOEBRIDGE: It’s about never using the stick, isn’t it—never.

Ms Falk: That’s not the case. You’ll be aware that I do have proceedings before the Federal Court in relation to Facebook and also aware of the time that it takes for these matters to progress.

The regulatory strategy is not to take enforcement action. In the US or the UK enforcement would very much be to the fore. Here it is not the “right tool.” Little wonder that there is a very poor privacy culture. If enforcement is off the table there is Read the rest of this entry »

Turner v Bayer Australia Ltd (No 6) [2023] VSC 244 (10 May 2023): consideration by Victorian Court of GDPR obligations on a party whose discovery may contain personal information collected in the EU.

May 22, 2023

Justice Keogh in Turner v Bayer Australia Ltd (No 6) [2023] VSC 244 considered the application of Victorian law and European privacy law, the General Data Protection Regulation (GDPR). The issue was whether releasing and reporting on personal information of individuals in documents generated in the EU attracts protections that the Court should consider in the context of media reporting of a Victorian proceeding.

FACTS

The  proceeding is a product liability action concerning implanted permanent contraceptive medical devices identified collectively as the Essure device [1].

The trial commenced on 11 April 2023 and is estimated to run for 12 weeks [2].

Media organisations sought access to transcript and some of the documents relied on by the parties at trial.

The second defendant, Bayer Aktiengesellschaft,  is a corporation registered in Germany [4].

Some of the defendant’s discovery was of documents that originated from Germany (‘EU documents’), some of which contained personal data of natural persons residing in the European Union (‘EU’), including:

  • names,
  • job titles,
  • signatures,
  • business email addresses,
  • personal email addresses, and
  • street addresses and phone numbers (‘EU data’) [4].

The defendants opposed the media having general access to transcript and EU documents used at trial because, they argue, the release of EU data would be a breach of the GDPR [4].

The defendants sought orders requiring that media apply to the Court for release of transcript and any EU documents tendered at trial and give details of the context and purpose underpinning their request when applying for access, provide the parties with time to object to media access, and provide the parties further time  to redact personal information from documents to be released [5].

The defendants relied on a report of Professor Dr Gregor Thüsing, a jurist and professor at the University of Bonn in Germany who has expertise in the European law of data protection and data security [12].

The court summarised Read the rest of this entry »

Re Lifestyle Residences Hobsons Bay Pty Ltd (recs & mgrs apptd) [2023] VSC 179 (6 April 2023): statutory demand, service under section 109X(1)(a), service outside the statutory period, whether director can make application on behalf of company when receivers appointed

April 23, 2023

The Victorian Supreme Court in Re Lifestyle Residences Hobsons Bay Pty Ltd (recs & mgrs apptd) [2023] VSC 179 considered a range of issues: whether a director can bring an application when receivers have been appointed, the operation of section 109X(1)(a) of the Corporations Act 2001 (Cth), and the calculation of time for service. It makes clear that an application filed outside the statutory period cannot be cured; the application is a nullity.

FACTS

The facts relating to service were:

  • on 22 November 2022, Ms Celia Luki, the solicitor with carriage of the matter for the defendant, ascertained the registered office address of the Company from an Australian Securities and Investments Commission (‘ASIC’) company search [35].
  • Luki requested the Office Services Clerk in her firm in Redfern, New South Wales, to organise for the documents to be couriered to Melbourne for delivery to the registered office address.
  • a Client Services Assistant at McCullough Robertson received Luki’s instructions on the service of the statutory demand in the sum of $213,166.89 in an email forwarded to her by the Office Services Clerk, who also provided the statutory demand and accompanying affidavit.
  • the assistant logged into the Toll Priority (Aus) system and inputted those details, recording Luki’s email address as the contact person to receive email updates on the progress of the delivery of the demand. She printed a label from the Toll system, which included all of the recipient’s details, affixed the label onto a Toll Express Services priority satchel, and obtained a tracking number and manifest document.
  • in the afternoon of 22 November 2022, a courier from Toll attended the McCullough Robertson office and collected the sealed envelope and two copies of the manifest document [35]
  • on 16 December 2022 the tracking log recorded that the documents were delivered to the company at the registered office address on 23 November 2022 at 9:46am. The proof of delivery document clearly records the registered office at which delivery occurred and the signature of Paula accepting delivery of the envelope [36]. Paula was a receptionist at an accounting firm engaged by the company, whose business address is the registered office address of the company.
  • Paula was unsure who to forward the demand to and sought confirmation from her principal, Mr Sam Cimino. However, because Cimino was extremely busy that day, she was only able to email him and unable to speak to him in person [37].
  • on 24 November 2022, Paula had a discussion with Cimino, who instructed her to immediately send the statutory demand to Mr Burgess, Mr Dale Harrison and Mr Peter Van De Steeg, who are nominated contact people at the company. 
  • Paula emailed the nominated people at the company, attaching an electronic copy of the statutory demand but erroneously stated the demand had arrived by courier at the registered office address on 24 November 2022 when, in fact, it was delivered by courier the day prior [38]. 

Read the rest of this entry »

Media Watch has a segment on “Media and privacy”, focusing on the tort of interference with privacy. The venerable Paul Barry in full stentorian mode opines against it. Quelle surprise!

April 17, 2023

Tonight ABC’s Media Watch broadcast a segment on the Attorney General’s Report on a Review of the Privacy Act, titled “Media and privacy”, with a focus on a proposed statutory tort of privacy. The coverage followed the traditional line adopted by media commentators in Australia: yes, there are breaches, but a tort of privacy would suppress free speech and so reform is a bad idea. Being Media Watch it was a reasonably comprehensive story, within the time allotted, but still quite predictable and overall not particularly sophisticated. The usual suspects came out against, such as Justin Quill with the usual lines about how such a reform will help the rich and kill investigative journalism. The supporters were also predictably supportive, being Michael Douglas and Barbara McDonald, but a good deal less shrill. Between now and the release of a draft bill expect strident stories from the participants in the Right to Know Coalition. In the past Chris Merritt (Privacy tort a blow to free speech, 18 March 2009), Ainslie Van Onsolen (Push for a tort is misguided and wrong, The Australian, 21 September 2012) and Michael Stutchbury (Lawsuits no way to defend privacy or free speech, 26 July 2011), among many others, have dipped their thumbs into the ink barrel when a privacy tort is mentioned and penned jeremiads about the end of journalism, the end of freedom of speech and no more public interest exposés if such a privacy tort is enacted. There is a sameness about the columns: pictures of a grim future with judges wielding their gavels with abandon, crushing story after story, and villainous reprobates being protected. The offerings tended to be long on emotion and short on analysis. That does not mean they have not had an effect. Governments of both persuasions have steered clear of adequate privacy law reform for decades.

It is entirely understandable that the media would have an interest in privacy reform. The problem is that it does not accept that the defence of public interest and freedom of expression in any tort will be given any weight. That is fear based on emotion, not logic. On a more practical level, given the gaping lacuna in the law regarding privacy and the practical inability of the aggrieved to take any legal action for invasions of their privacy, it is in the media’s interests to keep the status quo.

The Media Watch report is quite a reasonable analysis, albeit limited by the fact that, as the title suggests, it focuses on media and privacy, which is not the whole issue. What is lost in this story is that there are many circumstances where the media is not involved: the interference with privacy is one person intruding on the seclusion of another, or interfering government officials, or organisations and businesses surveilling customers or just ordinary individuals. With new and increasingly intrusive technology, not having legal recourse is a failure of public policy. None of this will convince the media, and the fact that Australia is an outlier in this area of law causes it no concern at all.

The transcript of the story Read the rest of this entry »

Commonwealth Attorney General Privacy Act 1988 Review Report Part 1, chapters 3 & 4. Some observations about the analysis and proposals.

April 16, 2023

Submissions to the Attorney General’s Review of the Privacy Act Report closed on 31 March 2023.

I will be undertaking a detailed review of the Report, by related chapters, between now and when the draft Bill is released by the Government, probably before or after the Winter Recess.

This analysis relates to Chapters 3 and 4. The proposals contained in both chapters are not controversial and address weaknesses in the Privacy Act drafting that have been identified for some time. The recommendations regarding de-identified and anonymised information attempt to address what remains a very difficult issue. The extent to which de-identification is possible in a practical sense is a matter of significant debate. Those issues may come into sharp relief if a data breach involves theft of de-identified information which is subsequently re-identified.

CHAPTER 3 OBJECTS OF THE ACT

The Report notes that Privacy is not defined in the Act. It is a concept that can be broadly construed and may be understood as comprising a number of related concepts including informational privacy, bodily privacy, privacy of communications, and territorial privacy.

The Report proposes:

3.1 Amend the objects of the Act to clarify that the Act is about the protection of personal information.

The rationale for the amendment is that as the focus of the Act is to provide a framework for the handling and protection of personal information, the objects should more clearly reflect this.

The Report then states that the Act implements Australia’s international obligations in relation to privacy in part by providing a framework for regulating the collection, use, storage, disclosure and destruction of personal information but does not cover all aspects of privacy as the term is commonly understood.

The Report recommends:

3.2 Amend the objects of the Act to recognise the public interest in protecting privacy.

The Report notes that:

  • protection of privacy sits alongside other important interests: this is recognised in Article 17 of the International Covenant on Civil and Political Rights (ICCPR) and reflected in paragraph 2A(b) of the objects; these interests are sometimes, but not always, in tension.
  • paragraph 2A(b) of the objects should continue to recognise that the protection of the privacy of individuals is balanced with the interests of entities in carrying out their functions or activities.
  • the recognition of a public interest, as well as individual interest, in privacy will inform the balancing exercise, retaining sufficient flexibility for ‘countervailing interests to be given the weight they deserve’.
  • the protection of privacy and the interests of entities in carrying out their functions and activities, including private commercial activities, are not necessarily in conflict. It is not a zero-sum game.
  • businesses that use data in a fair and responsible manner may serve the public interest indirectly, and deliver benefits to individuals and the broader economy, as well as their own commercial interests.

CHAPTER 4 PERSONAL INFORMATION, DE-IDENTIFICATION AND SENSITIVE INFORMATION

The Report identifies a problem with the principles-based definition: a lack of understanding of how to apply it to information in practice.

The Report notes that the definition has to be seen in context in the Act and as such the Act:

  • does not prohibit the collection, use and disclosure of personal information.
  • requires that the principles around personal information handling set out in the APPs must be followed, including only collecting reasonably necessary information and only using or disclosing it for the purposes for which it was collected unless the individual consents or another exception applies.

The definition of personal information is intentionally broad, which ensures that APP entities keep privacy and risk-based personal information handling at the forefront of their minds when conducting their functions or activities.

Section 6 of the Privacy Act defines personal information as follows:

personal information means information or an opinion about an identified individual, or an individual who is reasonably identifiable:

(a) whether the information or opinion is true or not; and

(b) whether the information or opinion is recorded in a material form or not.

Individual is defined as a ‘natural person’.

The current definition of personal information has two limbs:

  • the information is about an individual, and
  • the individual is identified or reasonably identifiable.

The Report identifies two categories of uncertainty about the definition:

  1. it is unclear which types of information can be personal information. For example, there is confusion about whether technical information that records service details about a device is the personal information of the owner of the device. Further, there is uncertainty about whether inferred information about an individual, for example in an online profile, will be personal information.
  2. there should be more clarity about how to ‘reasonably identify’ an individual and correspondingly how to know when an identifiable individual becomes ‘de-identified’.

The Report proposes to clarify the two categories of uncertainty through proposals that address the two limbs of the test for Read the rest of this entry »

High Court revokes Facebook’s special leave application on the day of hearing. Information Commissioner’s civil penalty proceeding will now proceed beyond the service stage…almost 3 years after the originating application was filed

March 7, 2023

The High Court today revoked Facebook’s grant of special leave to appeal. The transcript is not available yet and reasons have not been published, but the key argument for this volte face was a change to the Federal Court Rules on overseas service.

The Information Commissioner released a media release providing:

The Office of the Australian Information Commissioner (OAIC) today welcomed the Full Court of the High Court of Australia’s decision to revoke Facebook Inc’s special leave to appeal to the High Court.

The High Court granted the Commissioner’s application to revoke special leave due to a change in the Federal Court Rules in relation to overseas service.

This clears the way for proceedings to return to the Federal Court. The substantive proceeding seeking civil penalties against Facebook Ireland and Facebook Inc over the Cambridge Analytica matter will now progress.

“Today’s decision is an important step in ensuring that global digital platforms can be held to account when handling the personal information of Australians,” Australian Information Commissioner and Privacy Commissioner Angelene Falk said.

“Entities operating in Australia are accountable for breaches of Australian privacy law, and must ensure that their operations in Australia comply with that law,” Commissioner Falk said.

Background

On 9 March 2020, the Commissioner lodged proceedings against US-based Facebook Inc and Facebook Ireland (collectively, Facebook) in the Federal Court, alleging the social media platform had committed serious and/or repeated interferences with privacy in contravention of Australian privacy law.

The Commissioner alleges that from 12 March 2014 to 1 May 2015: Read the rest of this entry »

Federal Trade Commission commences enforcement action against GoodRx for extraordinary privacy breaches involving sharing consumers’ sensitive health information for advertising purposes

February 8, 2023

The Federal Trade Commission (the “FTC”) has announced enforcement action against GoodRx for a range of significant breaches of customers’ information. This is the first time it has used its powers under the Health Breach Notification Rule.

This case highlights the temptations of monetising personal information to generate sales, even if that means disclosing personal health-related information. It also demonstrates that large operations can and often do ignore privacy and data security obligations when using data for financial gain. When the regulator takes action the flaws become very apparent and often make a bad situation much worse.

While the law differs in Australia, it is very useful to consider these actions because of the methodology the FTC deploys in framing its cases. The technology is the same in Australia and the United States. The issues are the same.

According to the FTC:

  • since 2011, GoodRx Holdings, Inc has been a “consumer-focused digital healthcare platform” based in Santa Monica, California.
  • GoodRx advertises, distributes, and sells:
    • health-related products and services directly to consumers, including purported prescription medication discount products branded as “GoodRx” and “GoodRx Gold.”
    • telehealth services, branded as “GoodRx Care,” and previously as “HeyDoctor by GoodRx,” and “HeyDoctor,” through its subsidiary HeyDoctor, LLC (“HeyDoctor”) [2].
  • since at least 2017, GoodRx  promised its users that it would share their personal information, including their personal health information, with limited third parties and only for limited purposes; that it would restrict third parties’ use of such information; and that it would never share personal health information with advertisers or other third parties [3]
  • GoodRx offers a platform, available through its website (www.GoodRx.com) or mobile application (“Mobile App”), to search for and compare prescription medication pricing at nearby pharmacies, and to obtain prescription discount cards (the “GoodRx Coupon”). Since January 2017, 55.4 million consumers have visited or used GoodRx’s website or Mobile App [16]
  • GoodRx  collects:
    • users’ personal and health information, and prompts users to provide their email address or phone number, to access electronic coupons and refill reminders [19].
    • personal and health information when users register for an account, which is required for GoodRx Gold, the product charging a monthly subscription fee. [20]
    • personal and health information from pharmacy benefit managers (“PBMs”). When users purchase medication using GoodRx Coupons, the PBM processes the transaction and sends a claims record to GoodRx (“Medication Purchase Data”), containing name, date of birth, and information about the prescription filled [21]

On February 25, 2020, Consumer Reports published Read the rest of this entry »
