ASIC commences action against FIIG Securities for cyber security failures

March 14, 2025


The Australian Securities and Investments Commission (ASIC) announced yesterday that it is suing FIIG Securities for “systemic and prolonged cyber security failures” from March 2019 until 8 June 2023. As a result, hackers entered FIIG’s IT system and stole personal information, which was released onto the dark web. ASIC specifically referred to the Federal Court decision in Australian Securities and Investments Commission v RI Advice Group Pty Ltd (No 3) [2022] FCA 84, the first case in which a failure to manage cyber risk was found to be a breach of a licensee’s financial services obligations. That case was resolved by the parties proposing consent orders containing declarations and consequential orders. Given the nature of the repeated breaches, RI Advice’s legal representatives negotiated quite a favourable outcome, notwithstanding that orders were made against their client. In the United States or the UK the penalties would have been much more severe.

Helpfully, ASIC has provided a concise statement of facts and the Originating Process. From those documents ASIC alleges that between 13 March 2019 and 8 June 2023, FIIG did not comply with its AFSL obligations under section 912A(1) of the Corporations Act 2001 (Cth) to:

  1. do all things necessary to ensure that financial services were provided efficiently, honestly and fairly (s 912A(1)(a)), by failing to have in place adequate measures to protect its clients from the risks and consequences of a cyber incident;
  2. have available adequate resources (including financial, technological, and human resources) to, amongst other things, ensure that it had in place adequate cyber security measures required by its licence (s 912A(1)(d)); and
  3. have in place a risk management system that adequately identified and evaluated the risks faced by FIIG and its clients; adopt controls adequate to manage or mitigate those risks to a reasonable level; and implement those controls (s 912A(1)(h)).

ASIC alleges that FIIG failed to have the following cybersecurity measures:

  • Planning and training: there was no cyber incident response plan communicated to and accessible by employees and tested at least annually, and no mandatory cyber security training (at commencement of employment and annually);
  • Access restrictions: there was no
    • proper management of privileged access to accounts, including revoking access that was no longer required, and no greater protection for privileged accounts; and
    • configuration of group policies to disable legacy and insecure authentication protocols;
  • Technical monitoring, detection, patches and updates: there was a failure to have, or there were inadequate:
    • vulnerability scanning, involving tools deployed across networks and endpoints, and processes run at least quarterly with results reviewed and actions taken to address vulnerabilities;
    • next-generation firewalls (including rules preventing endpoints from accessing file transfer protocol services);
    • endpoint detection and response software on all endpoints and servers, with automatic updates and daily monitoring by a sufficiently skilled person;
    • patching and software update plans (with critical or high importance patches applied within 1 month of release, and 3 months for all others), and a practice of updating all operating systems, with compensating controls to systems incapable of patching or updates; and
    • security incident event management software configured to collect and consolidate security information across all of FIIG’s systems with appropriate analysis of the same (daily monitoring);
  • Testing: there was a lack of
    • processes to review and evaluate efficacy of technical controls at least quarterly; and
    • penetration and vulnerability tests from internal and external points.
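The patching timeframes ASIC alleges were absent (critical or high-importance patches applied within one month of release, all others within three months) amount to a simple service-level rule. A minimal sketch of such a check is below; the field names, severity labels and SLA values are illustrative assumptions, not a description of FIIG's actual systems.

```python
from datetime import date, timedelta

# Hypothetical SLA windows mirroring the timeframes alleged in the claim:
# critical/high patches within 1 month, everything else within 3 months.
PATCH_SLA = {"critical": timedelta(days=30), "high": timedelta(days=30)}
DEFAULT_SLA = timedelta(days=90)

def is_overdue(severity: str, released: date, today: date) -> bool:
    """Return True if an unapplied patch has passed its SLA deadline."""
    deadline = released + PATCH_SLA.get(severity.lower(), DEFAULT_SLA)
    return today > deadline
```

For example, a critical patch released on 1 January 2025 and still unapplied on 15 February 2025 would be overdue, while a low-severity patch of the same age would still be within its three-month window.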


NIST announces a review of its cyber security framework in light of developments in AI

Artificial Intelligence is becoming the great disrupter, and its impact on privacy and cyber security is especially acute. The National Institute of Standards and Technology (“NIST”) has announced the process to develop a new Cyber AI Profile.


EU release pseudonymisation guidelines

March 13, 2025

On 16 January 2025 the European Data Protection Board (EDPB) adopted Guidelines 01/2025 on Pseudonymisation, which took effect on 17 January 2025. Pseudonymisation is poorly understood by organisations and some practitioners, yet it is an important means of data protection.

It should be noted that OVIC has undertaken a very detailed assessment of de-identification and highlighted the problems with it.

The guidelines set out detailed guidance on the use and benefits of pseudonymisation under the General Data Protection Regulation (GDPR). Importantly, they clarify:

  • what pseudonymisation means,
  • how to use it to meet data protection requirements, and
  • how to implement it.

Australia operates under the Privacy Act and is not bound by the GDPR. That said, many organisations in Australia operate in Europe and to that extent are bound by the operation of the GDPR. Further, guidelines from the EU, like the NIST publications, provide valuable assistance in dealing with privacy issues.

What is Pseudonymisation?

Art. 4(5) of the GDPR defines pseudonymisation as “the processing of personal data in such a manner that it can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures to ensure that it is not attributed to an identified or identifiable natural person.”

Pseudonymisation can be implemented through various techniques, such as the use of tables that map pseudonyms to original identifiers while keeping pseudonyms and original identifiers separate and secure (e.g., in the hands of two separate organizations). 

Pseudonymisation should at least cover direct identifiers (e.g. passport or social security numbers, but also the combination of a person’s full name with his or her date of birth), which alone allow a data subject to be identified. The pseudonymising entity should also be mindful of indirect identifiers, which may still allow a data subject to be identified despite the pseudonymisation, and should address them, for example by deleting, generalising or randomising them.
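The mapping-table technique described above can be illustrated with a short sketch. This is not code from the EDPB guidelines; it is a minimal illustration, assuming a hypothetical record layout, of how direct identifiers are replaced with random pseudonyms while the lookup table is held separately (and securely) from the pseudonymised dataset.

```python
import secrets

def pseudonymise(records, identifier_field):
    """Replace a direct identifier in each record with a random pseudonym.

    Returns the pseudonymised records plus a lookup table mapping
    pseudonyms back to the original identifiers. The lookup table is the
    "additional information" under Art. 4(5) GDPR and must be kept
    separately, under technical and organisational safeguards.
    """
    lookup = {}
    output = []
    for record in records:
        pseudonym = secrets.token_hex(8)  # random, non-derivable pseudonym
        lookup[pseudonym] = record[identifier_field]
        safe = dict(record)
        safe[identifier_field] = pseudonym
        output.append(safe)
    return output, lookup
```

The key design point is separation: the pseudonymised dataset alone no longer attributes data to a specific person; re-identification requires the separately held lookup table, which may even sit with a different organisation.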

The EU Commission announces the publication of general purpose AI code of practice

March 12, 2025

The European Commission has released the third draft of the General-Purpose AI Code of Practice. It includes commitments by providers of general-purpose artificial intelligence (AI) models, including:

  • documentation: the signatories commit to drawing up and keeping up-to-date model documentation, including ensuring quality, security, and integrity of the documented information and providing it to providers of AI systems and to the AI Office upon request; and
  • copyright policy

Providers of general-purpose AI models with systemic risk must commit to:

  • adopting and implementing a Safety and Security Framework that will apply to the AI models with systemic risk, as well as detail the systemic risk assessment;
  • conducting systemic risk assessment systematically at appropriate points along the entire model lifecycle;
  • selecting and further characterizing systemic risks;
  • determining the acceptability of the systemic risks;
  • implementing technical safety mitigations along the entire model lifecycle of the model, and ensuring they are proportionate and state-of-the-art;
  • mitigating systemic risks that could arise from unauthorized access to unreleased models;
  • reporting to the AI Office on the safety and security of the models;
  • carrying out adequacy assessments;
  • implementing systemic risk responsibility allocation;
  • obtaining independent external systemic risk assessments, including model evaluations;
  • keeping track of, documenting, and reporting serious incidents to the AI Office and, as appropriate, to national competent authorities;
  • ensuring protections on non-retaliation against any worker providing information about systemic risks;
  • notifying the AI Office of relevant information and the implementation of commitments;
  • carrying out documentation, as prescribed by the code of practice and the Artificial Intelligence Act (AI Act); and
  • implementing public transparency on systemic risks stemming from their AI models with systemic risk.

The AI Office will:

  • report on the feedback received from stakeholders on the template for an adequate public summary of the training data under Article 53(1)d) of the AI Act and outline the next steps for adopting the template; and
  • publish guidance clarifying the scope of the AI Act rules for general-purpose AI, including information on:
    • the definitions of general-purpose AI models;
    • placement of models on the market and providers;
    • exemptions for models provided under free and open-source licenses; and
    • the effects of the AI Act on models placed on the market before August 2025.


Office of the Information Commissioner attend Estimates

March 1, 2025


Senate Estimates is an annual event. For governments it is a necessary evil. For oppositions it promises to reveal a cornucopia of information to embarrass the government and burnish their credentials. For the agencies, in particular the public servants who front the various Estimates Committees, it is a burden to be carried as part of the job. This year the Information Commissioner’s attendance before the Legal and Constitutional Affairs Legislation Committee proved to be no different. The Commissioner’s opening statement was the usual anodyne, nothing-to-see-here affair, providing:

With the chair’s leave I take this opportunity to acknowledge the committee’s role and in doing so provide a brief opening statement outlining the important work of the Office of the Australian Information Commissioner (OAIC).

I appear today with the assistance of the FOI Commissioner Ms Toni Pirani and with the chair’s leave the Privacy Commissioner Ms Carly Kind appearing via link and Executive General Manager, Information Rights Ms Ashleigh McDonald.

Supported by our new organisational structure we are better positioned to operate as a contemporary and proactive regulator. Some of our recent initiatives and outcomes demonstrate our future direction. We have:

    • commenced preliminary inquiries into the privacy impacts of connected vehicles
    • commenced the development of a Children’s Online Privacy Code
    • developed a public facing dashboard to ensure that agency freedom of information (FOI) data is reported and presented more effectively
    • We will shortly deliver a report examining the use of messaging apps by Australian government agencies
    • We are building our strategic intelligence capabilities.

To deliver a proactive and contemporary regulatory approach to benefit the Australian community, agencies and industry alike, we will also focus on building staffing capabilities through an investment in new ways of working and professional development. Within our budgetary parameters, our technology and systems will also be a focus to support our new direction.

However, we are also mindful to deal with our core case management responsibilities and reduce our backlog in both FOI and privacy cases. Our resources are challenged by a 25% increase in FOI Information Commissioner review (IC review) applications compared to the same period last year. This is against a backdrop of an increase in FOI IC review applications over the last 5 years that is estimated to double the number of FOI IC review applications received in 2019–20. We also face an overall growth in privacy case work and increasing complexity in our case work arising from digital services and emerging technologies. This has a particular impact on our privacy case work.

Our enforcement capabilities have been assisted by an increase of funding in recognition of the complexities of enforcement. Similarly designated funding has been provided to the OAIC to develop the Children’s Online Privacy Code and guidance regarding the social media age limit.

Our appearance and preparatory papers are informed by data as at 15 January 2025.  However, to assist the committee, as at 23 February 2025 the OAIC 2024–25 case statistics are as follows:

    • 1,279 FOI review applications were received and 1,494 finalised.
    • 196 FOI complaints were received and 216 finalised.
    • 1,966 privacy complaints were received and 1,687 finalised.

During this period, we also finalised a number of complex privacy matters that have delivered a strong enforcement message and importantly established our expectations of the regulated community. In doing so, we are upholding the rights of privacy and information access enshrined in statute by the Australian Parliament and better serving the values and expectations of the Australian community.

I wish to acknowledge the significant work and expertise of the OAIC leadership in taking forward this major change program and recognise with gratitude OAIC staff for their dedication and commitment as we secure the fundamental human rights of privacy and information access in an increasingly complex environment.

The hearing before the Estimates Committee focused on the reduction in staffing in the office from just over 200 to 138, a reduction of roughly 31%. Also of interest was the Privacy Commissioner’s admission that the findings of the Property Lovers determination are not being complied with. In short, the behaviour complained of is continuing. The Privacy Commissioner is investigating what to do next.

An understaffed office is bad news for effective regulation, and that has been a chronic problem for this office. Fortunately there will be a statutory tort as of June 2025, so in many cases individuals will not need to rely on the Commissioner taking up a complaint from a member of the public.

The Transcript provides:

CHAIR: With 20 minutes to go in our hearing, we’re going to politely and apologetically, dismiss the Australian Human Rights Commission. We won’t get to them this evening. We thank them for their time and for travelling. We do have questions for them, but we won’t have time to put them. We thank them for their ongoing work, particularly in the current environment. I know they’re working very hard. So thank you very much.

Welcome, commissioners. Do you have an opening statement you’d like to table?

Ms Tydd : I do have a very brief opening statement and I’m happy to table that.

CHAIR: Thank you very much. That will be circulated to senator so they can read from that when they have it in front of them. In the meantime, I’ll pass the call to Senator Scarr.

  Senator SCARR: Commissioner, how many staff have left the OAIC since August last year?

Ms Tydd : I don’t think I could speak with authority from the date of August, but I can give you the very high-level numbers of staffing pre and post our organisational redesign.

  Senator SCARR: Can you give me the dates for the organisational redesign, so I can calibrate that with my August date.

Ms Tydd : Yes. That was finalised in mid-November, about 17 November. The organisational redesign responded to our significant budgetary situation, in which we would be operating at a deficit. Action was taken around that. At the time, in July, we had an FTE of just over 200. Our organisational redesign that allowed us to operate within our budgetary parameters—

  Senator SCARR: Sorry; it’s late. I’ve got to get these numbers right. In July your FTE was just over 200?

Ms Tydd : Correct. And our ASL cap came down to 173. We knew that within our budgetary parameters we’d need to operate at around 165. We didn’t purely look at staffing levels in relation to meeting our budgetary parameters; we looked at a range of measures. They included external supply costs. Legal costs were something that we focused on as well. So, yes, we were required to reduce staffing in response to our revised budgetary parameters, and that process was completed around mid-November.

  Senator SCARR: Okay. What were the FTE numbers as at mid-November, when you completed that process?

Ms Tydd : There probably was still some lag. I’d say it would be about 175. I’ll see if I have any dates that will help you further. I can tell you that as at 18 December, as we were still working through that process, our staffing level was 175.

  Senator SCARR: Do you have the data as at today or the most recent data as at the end of the month? Do you have any most recent data?

Ms Tydd : As at 29 January, it was 138.4.

  Senator SCARR: So you went from 175 as at 18 December—that was the figure you gave?—

Ms Tydd : Correct.

  Senator SCARR: to 138.4 as at 29 January?

Ms Tydd : That’s correct, with a headcount of 156.

  Senator SCARR: Okay, so you’ve got part-time—

  Senator SHOEBRIDGE: So as we don’t have to traverse across this, do you mind if I ask: you’ve been talking FTE all the time through, so these have all been the same dataset of FTE, full-time equivalents?

Ms Tydd : Yes.

  Senator SCARR: So you went from—we’ll try and use the common terminology—FTE as at 18 December of 175 to FTE as at 29 January, which is only a month later, of 156. Is that correct?

Ms Tydd : The figure I have is 138.4.

  Senator SCARR: 175 to 138.4?

Ms Tydd : Yes. They’re the figures I have before me.