Commonwealth Attorney-General's Privacy Act 1988 Review Report, Part 1, chapters 3 & 4: some observations about the analysis and proposals.

April 16, 2023

Submissions to the Attorney-General's Review of the Privacy Act Report closed on 31 March 2023.

I will be undertaking a detailed review of the Report, in groups of related chapters, between now and the release of the draft Bill by the Government, probably around the Winter Recess.

This analysis relates to Chapters 3 and 4. The proposals in both chapters are not controversial and address weaknesses in the Privacy Act's drafting that have been identified for some time. The recommendations regarding de-identified and anonymised information attempt to address what remains a very difficult issue. The extent to which de-identification is possible in a practical sense is a matter of significant debate. Those issues may come into sharp relief if a data breach involved the theft of de-identified information which was subsequently re-identified.


The Report notes that privacy is not defined in the Act. It is a concept that can be broadly construed and may be understood as comprising a number of related concepts, including informational privacy, bodily privacy, privacy of communications, and territorial privacy.

The Report proposes:

3.1 Amend the objects of the Act to clarify that the Act is about the protection of personal information.

The rationale for the amendment is that as the focus of the Act is to provide a framework for the handling and protection of personal information, the objects should more clearly reflect this.

The Report then states that the Act implements Australia’s international obligations in relation to privacy in part by providing a framework for regulating the collection, use, storage, disclosure and destruction of personal information but does not cover all aspects of privacy as the term is commonly understood.

The Report recommends:

3.2 Amend the objects of the Act to recognise the public interest in protecting privacy.

The Report notes that:

  • protection of privacy sits alongside other important interests that are sometimes, but not always, in tension: this is recognised in Article 17 of the International Covenant on Civil and Political Rights (ICCPR) and reflected in paragraph 2A(b) of the objects.
  • paragraph 2A(b) of the objects should continue to recognise that the protection of the privacy of individuals is balanced with the interests of entities in carrying out their functions or activities.
  • the recognition of a public interest, as well as an individual interest, in privacy will inform the balancing exercise, retaining sufficient flexibility for ‘countervailing interests to be given the weight they deserve’.
  • the protection of privacy and the interests of entities in carrying out their functions and activities, including private commercial activities, are not necessarily in conflict. It is not a zero-sum game.
  • businesses that use data in a fair and responsible manner may serve the public interest indirectly, and deliver benefits to individuals and the broader economy, as well as their own commercial interests.

4.   Personal information, de-identification and sensitive information

The Report identifies a problem with the principles-based definition: a lack of understanding of how to apply it to information in practice.

The Report notes that the definition has to be seen in context in the Act and as such the Act:

  • does not prohibit the collection, use and disclosure of personal information.
  • requires that the principles around personal information handling set out in the APPs must be followed, including only collecting reasonably necessary information and only using or disclosing it for the purposes for which it was collected unless the individual consents or another exception applies.

The definition of personal information is intentionally broad, which ensures that APP entities keep privacy and risk-based personal information handling at the forefront of their minds when conducting their functions or activities.

Section 6 of the Privacy Act defines personal information as follows:

personal information means information or an opinion about an identified individual, or an individual who is reasonably identifiable:

(a) whether the information or opinion is true or not; and

(b) whether the information or opinion is recorded in a material form or not.

Individual is defined as a ‘natural person’.

The current definition of personal information has two limbs:

  • the information is about an individual, and
  • the individual is identified or reasonably identifiable.

The Report identifies two categories of uncertainty about the definition:

  1. it is unclear which types of information can be personal information. For example, there is confusion about whether technical information that records service details about a device is the personal information of the owner of the device. Further, there is uncertainty about whether inferred information about an individual, for example in an online profile, will be personal information.
  2. there should be more clarity about how to ‘reasonably identify’ an individual and correspondingly how to know when an identifiable individual becomes ‘de-identified’.

The Report proposes to clarify the two categories of uncertainty through proposals that address the two limbs of the test for personal information:

  • Reforms to clarify the types of information that can be personal information by:
    • replacing ‘about’ an individual in the definition of personal information to ‘relates to’. This would not significantly change the definition, but would make it clearer that technical and inferred information can be personal information
    • adding a non-exhaustive list of information that can be personal information to aid interpretation of the definition, and
    • amending the definition of ‘collects’ to make clear that inferred information is collected at the point the inference is made.
  • Reforms to clarify when an individual will be reasonably identifiable from information by:
    • introducing a list of factors to consider when determining whether an individual is reasonably identifiable
    • amending the definition of de-identify to make clear that whether information remains de-identified can change depending on the context, and
    • extending protections to de-identified information that are proportionate to the risk of the information being re-identified.

The Report recommends proceeding with the Discussion Paper’s proposal of replacing the test that for information to be personal information it must be ‘about’ an individual, with a test that the information must ‘relate to’ an individual.

The Proposal is:

4.1 Change the word ‘about’ in the definition of personal information to ‘relates to’. Ensure the definition is appropriately confined to where the connection between the information and the individual is not too tenuous or remote, through drafting of the provision, explanatory materials and OAIC guidance.

The Report noted recommendation 16(a) in the ACCC’s DPI Report, which was to update the definition of ‘personal information’ to clarify that it captures technical data such as IP addresses, device identifiers, location data and any other online identifiers that identify an individual. This was because of uncertainty over whether these types of information constitute personal information following the decision in Privacy Commissioner v Telstra Corporation Ltd (the Grubb case).

The Report produced a good short summary of the AAT’s abominable analysis in the Grubb case, stating (footnotes omitted):

In the Grubb case, the issue before the Administrative Appeals Tribunal (AAT) was whether telecommunications metadata was personal information which Mr Grubb had a right to access. The AAT determined the case on the basis that it did not think the data was ‘about’ Mr Grubb, but was about the way that Telstra delivers a call or message and the service Telstra provided to Mr Grubb. The Privacy Commissioner appealed to the Full Court of the Federal Court of Australia, but this appeal was dismissed. The Full Court did not rule on whether the specific metadata in dispute before the AAT was about Mr Grubb as that question was not put before them, but their Honours did conclude that whether information is ‘about’ an individual would ‘require an evaluative conclusion, depending upon the facts of any individual case’.

Since the Grubb decision, there has been confusion around whether technical information is or can be personal information. The OAIC considers it would be concerning if technical information could not be captured by the definition, given that ‘online and device identifiers are increasingly being used to track individuals and are rivalling names and addresses as key identifiers’. The ACCC in its DPI Report considered that there would be significant benefits to updating the definition of personal information to cover the realities of how data is collected from individuals in the digital economy and to align the Australian privacy regime with overseas standards.

The rationale for the proposal to replace the word ‘about’ in the definition of ‘personal information’ with the words ‘relates to’ is that it better highlights that there needs to be a relationship between the information and the individual. The change is not intended to significantly expand the meaning of the word ‘about’, which was always intended to be very expansive. It would clarify that personal information includes technical information, inferred information and any other information that relates to the individual, in the sense that it can be seen to provide details about their activities or their identity and the connection is not too tenuous or remote. It would also bring the Act into line with terminology and practice in international data protection regimes, such as the GDPR, and other Commonwealth legislation such as the Consumer Data Right.

The Report notes that the proposed change would bring Australia closer to the terminology used in GDPR jurisdictions, and guidance in those jurisdictions highlights that the change would not expand the definition beyond how it is currently intended to operate. The United Kingdom’s Data Protection Act 2018, s 3(2), defines ‘personal data’ as any information relating to an individual. However, the UK Information Commissioner’s Office (UK ICO), in its guidance on the meaning of ‘relates to’ in the GDPR, uses the word ‘about’ interchangeably.

Technical information about an individual’s activities may relate to an individual notwithstanding that, technically, the data can also be viewed as being about the individual’s service usage. That information is directly used to engage with the individual and records activities and services accessed by the individual. However, not all information able to be linked to an individual will ‘relate to’ them.

The Report recommends that concerns that the change to ‘relates to’ would make the definition of personal information too broad should be addressed when enacting the change. ‘Relates to’ could be defined in the Act as a connection to the individual that is not too tenuous or remote and which requires an evaluative exercise based on the context and circumstances of the particular case.

The Report recommends that the OAIC publish guidance on the relevant context and circumstances APP entities should have regard to when considering if information relates to an individual. The OAIC could have regard to the following relevant considerations in drafting guidance:

  • the extent to which the APP entity or a third party seeks to collect and use or is likely to use information to learn about or to evaluate an individual, or to treat them in a certain way, or seek to influence their behaviour or decisions
  • the extent to which the information records an individual’s features, activities or preferences, and
  • the extent to which the information is part of a record about one individual in particular, and not aggregated with other individuals’ data.

The Report recommends publishing a list of information that may be personal information, through the following Proposal:

4.2 Include a non-exhaustive list of information that may be personal information to assist APP entities to identify the types of information that could fall within the definition.

Supplement this list with more specific examples in the explanatory materials and OAIC guidance.

The Discussion Paper’s proposed non-exhaustive list included:

  • an identifier such as a name
  • an identification number
  • location data
  • an online identifier, or
  • one or more factors specific to the physical, physiological, genetic, mental, behavioural (including predictions of behaviours or preferences), economic, cultural or social identity or characteristics of that person.

The Report still recommends a non-exhaustive list, now including:

  1. name, date of birth or address
  2. an identification number, online identifier, or pseudonym
  3. contact information
  4. location data
  5. technical or behavioural data in relation to an individual’s activities, preferences, or identity
  6. inferred information, including predictions of behaviour or preferences, and profiles generated from aggregated information
  7. one or more features specific to the physical, physiological, genetic, mental, behavioural, economic, cultural or social identity or characteristics of a person.

It has been the fashion in legislative drafting for some time to include examples and lists to clarify what is meant by the section to which the list or examples are linked. It is not always helpful. The list proposed by the Attorney-General is at a high enough level of abstraction as not to date or be too limiting. But any list can be seen as

4.2.4 Define ‘collection’ to clearly cover inferred information

The Discussion Paper proposed bringing OAIC guidance on inferences into the Act’s definition of ‘collection’ to ensure that personal information that is inferred or generated by APP entities triggers obligations under APPs 3, 4 and 5.

It proposes:

4.3 Amend the definition of ‘collects’ to expressly cover information obtained from any source and by any means, including inferred or generated information.

Notwithstanding submissions accurately noting that inferences are already captured the moment an inference meets the definition of personal information, the Report still prefers to include ‘inferred’ in the definition of ‘collect’ to remove confusion about when the APPs begin to have effect for the collection and use of inferred personal information, including in data analytics and machine learning processes: the collection would occur from the point the inference is generated. The rationale is that this would enhance trust in entities that use these techniques in the Australian economy, and clarify that reasonable steps to give notice apply where personal information is inferred from unidentified information, such as when actively inferring identity from geolocation data. This belt-and-braces approach to drafting is in and of itself inoffensive; however, it is not conducive to simple and clean drafting. The rationale offered by the Attorney-General’s Department is tosh. The amendment will neither enhance trust nor clarify the reasonable steps required to give notice.

The Report commendably rejected the timeworn complaint that this amendment would cause “practical challenges for notification and consent, or lead to ‘notification fatigue’”. It noted that these issues already exist, though the extent to which there is notification fatigue is highly contested. Quite extravagant stories have circulated in the United States of people being deluged with notifications; the reality, upon investigation, is markedly different. APP 5 would require notice (and, if applicable, consent) at the time of collection, or as soon as practicable afterwards. The steps to give notice, or otherwise ensure the individual is aware of relevant matters, are those steps (if any) that are reasonable in the circumstances. If an inference is one the individual would naturally expect, or is inseparable from the original information, then an APP 5 notice would not likely be required. Organisations that have sophisticated means of collecting data have the capacity to provide notifications.

4.3 When is an individual identified or reasonably identifiable?

The second limb of the definition, that an individual must be ‘identified or reasonably identifiable’ for any information that relates to an individual to be ‘personal information’, was introduced into the definition of personal information in the Act in 2012. It replaced the phrase ‘identity is apparent, or can reasonably be ascertained’.

The Report rejected further defining when the reasonableness threshold would be met as the range of circumstances in which entities deal with information is broad. Each entity will need to conduct the assessment in their own context and address the reasonableness of identification in that context.

The Report proposed:

4.4 ‘Reasonably identifiable’ should be supported by a non-exhaustive list of circumstances to which APP entities will be expected to have regard in their assessment.

The Discussion Paper proposed defining ‘reasonably identifiable’ to cover circumstances in which an individual could be identified, directly or indirectly, to emphasise that information may become reasonably identifiable indirectly through linking it with other information.

The Report cited the determinations in the Clearview and 7-Eleven cases for the proposition that, generally speaking, an individual is ‘identifiable’ where they are ‘distinguished from all others in a group’. An individual can be identifiable where the information can be linked with other information that identifies them, or where the linkage forms an ensemble that identifies them. The test does not require that an individual’s legal identity be known, provided the information could be linked back to that specific person.

The Report described the terms ‘identified’, ‘reasonably identifiable’ and ‘de-identified’ as points on a spectrum of identification:

  • one end of the spectrum begins at information unrelated to an individual
  • it then moves through various degrees of unidentified information
  • before reaching reasonable identifiability in the middle
  • while at the other side of the spectrum are the degrees of de-identification
  • until an individual can no longer be distinguished and the information can no longer be linked with other information.

Recital 26 of the GDPR indicates that in determining whether an individual is identifiable, ‘account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly’.

The location of information on the identifiability spectrum can be fluid over time and across contexts. Information that:

  • in one context does not allow for identification of an individual may, in a different context, identify them
  • has been quarantined from other information, or has had linking identifiers removed to prevent identification, will only remain non-identifying while those circumstances persist, and
  • is later linked with other information may come to identify an individual.

The obligation to mitigate privacy risks in APP 1.2 extends to mitigating the risks of identifiability when APP entities are collecting, using, disclosing and storing unidentified or de-identified information.

De-identified information may be disclosed to a third party who holds information which enables the individual to be identified. Disclosing information from an environment in which it is personal information into a new context where an individual is no longer identifiable, will not be a disclosure of personal information.
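The linkage risk described above can be illustrated with a small sketch. Everything here is hypothetical (the datasets, field names and values are invented for illustration): records stripped of direct identifiers are joined to auxiliary information on shared quasi-identifiers, and an individual is thereby ‘distinguished from all others in a group’.

```python
# Hypothetical sketch of re-identification by linkage; all data invented.

# "De-identified" records: direct identifiers removed, but quasi-identifiers
# (postcode, birth year) remain.
deidentified = [
    {"postcode": "2000", "birth_year": 1984, "diagnosis": "asthma"},
    {"postcode": "2612", "birth_year": 1991, "diagnosis": "diabetes"},
]

# Auxiliary information held by the recipient (e.g. a public register).
register = [
    {"name": "A. Citizen", "postcode": "2612", "birth_year": 1991},
]

def link(deidentified, register):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for d in deidentified:
        for r in register:
            if (d["postcode"], d["birth_year"]) == (r["postcode"], r["birth_year"]):
                matches.append({**r, "diagnosis": d["diagnosis"]})
    return matches

# The second record is re-identified: the ensemble of postcode and
# birth year distinguishes the individual from all others in the group.
reidentified = link(deidentified, register)
```

Real linkage attacks work the same way at scale: the more quasi-identifier columns two datasets share, the more likely a given combination of values is unique to one person.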

The Report stated that the circumstances to aid assessment of reasonable identifiability, could include:

  1. the nature and volume of the information
  2. who holds or has access to the information
  3. how and why the information is collected, used, stored and disclosed
  4. the other information that is available (or known) to the recipient, and the practicability of using that information to identify an individual, and
  5. the context in which information is handled, including the context into which information will be disclosed.

4.4 De-identified, anonymised and pseudonymised information

The term ‘de-identified’ is currently defined by the Privacy Act as a state of information in which an individual’s personal information has been treated in such a way that the individual is no longer reasonably identifiable. De-identified information sits outside the protections of the Act.

The Report concludes that:

  • the impracticality of achieving irreversible anonymisation makes a complete anonymisation standard unwarranted.
  • de-identification should not be viewed as a static condition. The level of technical de-identification is determined by the context:
    • in which the information is held, used or disclosed
    • the risk of re-identification.
  • de-identified information for the purposes of the principles-based Privacy Act should be defined to make it clear that de-identifying information is a process that involves treating it in such a way that an individual is not reasonably identifiable while those circumstances persist.
  • de-identification is subject to its circumstances, and when circumstances change an APP entity cannot rely on past de-identification: it must conduct a proportional reassessment and possibly further de-identification.

As the Act gives effect to the principle of data minimisation, APP entities:

  • should be encouraged to collect and keep only the personal information they need
  • should use de-identified rather than raw personal information where the latter is not required, and
  • may be able to engage in ‘functional de-identification’, with strict organisational and technical controls, so that identifying information is separated.

De-identification is different to true anonymisation which may only be achievable by aggregating individuals’ data together.
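The ‘functional de-identification’ described above can be sketched as keyed pseudonymisation, where the key is the separated, organisationally controlled element. This is a hypothetical illustration of the idea, not a technique prescribed by the Report:

```python
# Hypothetical sketch of "functional de-identification": a direct
# identifier is replaced with a stable keyed pseudonym, and the key is
# held separately under organisational and technical controls.
import hashlib
import hmac

# In practice the key would live in a separate, access-controlled store.
SECRET_KEY = b"held-separately-under-strict-controls"

def pseudonymise(identifier: str, key: bytes = SECRET_KEY) -> str:
    """Map a direct identifier to a stable keyed pseudonym."""
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"name": "A. Citizen", "service": "mobile", "usage_gb": 42}
functional = {**record, "name": pseudonymise(record["name"])}
```

Because anyone holding the key (or the mapping) can re-link the pseudonyms, the information is only de-identified while the separation of key and data persists, which is precisely why the Report treats de-identification as a context-dependent process rather than a fixed state.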

The Report proposes:

4.5 Amend the definition of ‘de-identified’ to make it clear that de-identification is a process, informed by best available practice, applied to personal information that involves treating it in such a way that no individual is identified or reasonably identifiable in the current context.

4.5 Protections for de-identified information

Because disclosure of de-identified information into an environment where it can be linked with other information may enable an individual to be identified or become reasonably identifiable the Report supports the approach taken by privacy laws in other countries in placing restrictions on disclosure of de-identified or pseudonymised information.

The Report regards it as not appropriate to apply all of the protections under the Act to de-identified information, as there is insufficient justification to hinder activities that seek to maximise the utility and productivity of de-identified data. The Privacy Act should ensure that functional or incomplete de-identification is afforded the necessary protections, so that the public can have confidence that entities are appropriately managing the privacy risks associated with handling de-identified information.

The Report proposes that:

  • APP 11.1 should apply to de-identified information. The reasonable steps required to protect de-identified information will be those steps that reinforce the quality and continuation of the de-identification, including further reasonable steps to protect it from motivated third parties where the information is sensitive, voluminous, or valuable.
  • APP 8 should apply to de-identified information. It would undermine the protections in APP 11.1 if APP entities could simply disclose de-identified information to a partner overseas where it may be re-identified without breaching the APPs.  APP 8 should be amended to require that APP entities take reasonable steps when disclosing de-identified information overseas to ensure that the receiving entity does not re-identify the information or further disclose the information in such a way as to undermine the effectiveness of the de-identification.
  • the NDB scheme in Part IIIC of the Act would also extend to de-identified information where the access or disclosure would be likely to result in serious harm because of the risk of re-identification together with the sensitivity of the information and other relevant harm factors.

The Report proposes:

4.6 Extend the following protections of the Privacy Act to de-identified information:

  • APP 11.1 – require APP entities to take such steps as are reasonable in the circumstances to protect de-identified information: (a) from misuse, interference and loss; and (b) from unauthorised re-identification, access, modification or disclosure.
  • APP 8 – require APP entities when disclosing de-identified information overseas to take steps as are reasonable in the circumstances to ensure that the overseas recipient does not breach the Australian Privacy Principles in relation to de-identified information, including ensuring that the receiving entity does not re-identify the information or further disclose the information in such a way as to undermine the effectiveness of the de-identification.
  • Targeting proposals – the proposed regulation of content tailored to individuals should apply to de-identified information to the extent that it is used in that act or practice.

4.7 Sensitive information

Sensitive information is defined as:

  • information or an opinion about an individual’s:
    • racial or ethnic origin
    • political opinions
    • membership of a political association
    • religious beliefs or affiliations
    • philosophical beliefs
    • membership of a professional or trade association
    • membership of a trade union
    • sexual orientation or practices, or
    • criminal record

that is also personal information.

  • health information about an individual
  • genetic information (that is not otherwise health information)
  • biometric information that is to be used for the purpose of automated biometric verification or biometric identification, or
  • biometric templates.

Regarding biometric information, the Report states:

  • it is sensitive information if it is to be used for the purpose of automated biometric verification or biometric identification.
  • other biometric information that is not a template, and is not used for automated biometric verification or identification, would be personal information where it meets the definition.
  • Biometric templates form a separate category of sensitive information and are covered regardless of whether they first meet the definition of personal information.
  • Biometric templates and biometric information used to verify identity are unique and inalienable to the individual such that they warrant particular protection.
  • OAIC guidance and IC determinations indicate that biometric information can include both physiological features (like fingerprints, iris, or face geometry) and behavioural attributes (like a person’s gait or keystroke pattern).
  • the definition of sensitive information in the Act can encompass a broad range of biometric information, including behavioural biometrics, where used for automated verification or identification or a template is created. However, the definition is also deliberately limited.
  • currently, genetic but not genomic information is included in the definition of sensitive information.
  • an amendment to add genomic information within the definition of sensitive information is recommended.

The Report proposes:

a. Amend the definition of sensitive information to include ‘genomic’ information.

b. Amend the definition of sensitive information to replace the word ‘about’ with ‘relates to’ for consistency of terminology within the Act.

c. Clarify that sensitive information can be inferred from information that is not sensitive information.

The Report addressed the sensitivity of location data raised in the Discussion Paper. It sees merit in treating precise geolocation tracking data as a new consent-dependent category of personal information, requiring express consent for tracking and storage over time. It does not support including location data as a category of sensitive information, as the risk arising from geolocation tracking data does not stem from the geolocation data per se but from what it reveals.

Regarding precise geolocation data the Report stated:

  • it should not normally include IP address or address/postcode-type information, but rather the precise location of an individual (by reference to GPS or equivalent) at a particular time or when undertaking a particular activity.
  • limiting the category to precise geolocation tracking will limit the regulatory burden on incidental location data required for the delivery of services.
  • it should also be limited to tracking data, which is data collected repeatedly over time to record movements or activity.
  • it would likely include an app tracking movement throughout the day for the purposes of marketing.
  • it would include tracking movement for the purposes of rideshare services or health apps. Such apps would need to rely on valid, concurrent consent when using the app to authorise collection, and could not store that information without consent.

The Report does not recommend that the collection of location data be prohibited. As the issue is transparency, it is necessary to obtain proper, valid, informed consent if APP entities are to collect a history or diary of an individual’s activity.

4.10 Recognise collection, use, disclosure and storage of precise geolocation tracking data as a practice that requires consent.

Define ‘geolocation tracking data’ as personal information that shows an individual’s precise geolocation, that is collected and stored by reference to the particular individual at particular places and times, and tracked over time.
