Attorney General’s Department releases discussion paper on reform to the Privacy Act 1988
October 27, 2021
On 25 October the Attorney-General’s Department released its long-awaited Privacy Act Review Discussion Paper (the “Paper”). It is something of a behemoth, at 217 pages, or about half a lever arch folder. That said, as a veteran of reading many privacy reform papers over the years, it is not the longest or most comprehensive. That honour falls to the Australian Law Reform Commission’s 2008 report, For Your Information: Australian Privacy Law and Practice (ALRC Report 108), which filled more than 3 lever arch folders over 3 volumes. The ALRC’s 2014 report, Serious Invasions of Privacy in the Digital Era (ALRC Report 123), at 332 pages, was modest by comparison and built on the earlier ALRC report. The ACCC Digital Platforms Inquiry, coming in at 623 pages, also considered privacy-related matters, in particular endorsing and recommending a statutory tort of interference with privacy. And there are reports on privacy from the Victorian Law Reform Commission and the New South Wales Law Reform Commission. The point is not that I have read a lot of reports. I have. Nor is it that the size of the reports matters. It doesn’t. It is that this Paper is just another in a long line of reports on the need for reform of privacy legislation. And those previous reports were prepared by much more learned authors and were more thorough than this Paper.
The Paper is a constrained work, making many generally uncontroversial recommendations to clarify interpretation, make the operation of the APPs more relevant and give the Information Commissioner some increased powers. It is far from comprehensive. It avoids making recommendations about a statutory tort of privacy. Rather it continues the familiar policy loop in which governments of every persuasion push the issue into further review, then consultation, then bury it in a report and hope it goes away until it is recommended or otherwise finds itself before the Government again. It has been a hugely expensive, time-intensive waste of effort. Any body outside government that looks into the issue recognises the need for a statutory tort of privacy.
The Paper discusses the small business exemption from the operation of the Privacy Act in broad, on-the-one-hand-then-the-other terms, as it does the Employment Records, Political Parties and Journalism carve-outs, but goes no further. Each exemption is anomalous to a greater or lesser degree, and the restricted coverage of the Act, which reaches only 5% of businesses, is a matter that should have been addressed with a firm proposal. Those carve-outs make it regulation that is quite limited in scope.
The Paper did not consider the many exceptions to and limitations upon the APPs. There are too many exceptions, which permit agencies in particular to avoid proper scrutiny.
It is interesting that the Paper quotes the GDPR’s definitions and practices quite liberally and endorses aspects of the GDPR, but refrains from adopting, by way of amendment to the Privacy Act 1988, those parts of the regulation which make the GDPR a much more effective privacy regime.
The Paper does not consider the role in proceedings of the Guidelines prepared by the Office of the Australian Information Commissioner. The Guidelines are important in giving context and detail to the broadly drawn Australian Privacy Principles (APPs). But they are not regulations. As such the Administrative Appeals Tribunal and the Federal Court are quite able to have no regard to them, which has happened in cases. This has made submissions on the interpretation of the Principles a fraught affair before the AAT and the Federal Court, where applicants have had a poor record of success. And not because they had weak cases.
Where major revision was warranted the Paper recommends modest improvements. An improvement is just that, so it is to be welcomed. But only to that degree. What the Paper does not address, and cannot consider, is the general failure of the Office of the Australian Information Commissioner to properly regulate compliance with the Privacy Act 1988, flawed though that legislation may be. Until recently it was a hapless and hopeless litigator and an otherwise timid regulator. It had a limited public presence and was of little concern to organisations. As a result there has been a culture of non-compliance and apathy towards proper privacy and cyber security protections. To be fair, the current Commissioner has been more active than her predecessors, but that is not saying much, and the office remains largely subterranean in its presence in the business community and before government agencies. It remains a weak regulator. Governments must be prepared to appoint assertive and active Commissioners, preferably from outside the public service, and then provide the office with adequate resources. It needs to shed its reputation for timidity and for being a cul-de-sac for those who want the quieter life.
The AG’s Department seeks submissions in response to the Discussion Paper, or on other matters relevant to the review’s Terms of Reference, by 10 January 2022. We are told submissions and feedback will inform the review’s final report. That would be a nice change. The reality is that a Federal Election is due between March and May 2022. It is possible that the final report will be released by the Government before the election, but there is every chance that it will not be ready before Parliament is prorogued. What is almost certain is that there will not be any legislative amendments during the course of this Parliament. And Australian parliamentary practice being what it is, it is unlikely that this process will proceed apace after the election. The incoming Government will have its own agenda, which will absorb the first year; there may be a new Attorney-General who will take time getting across the portfolio; and priorities change. In short, reform of privacy legislation will follow its usual pattern of small, inadequate steps taken occasionally when doing nothing becomes a non-option. While that may be the usual scenario, what is changing at a rapid pace are the instances of data breaches and interferences with privacy.
The proposals are:
Objects
The Paper proposes amending the objects at section 2A to clarify the Act’s scope and introduce the concept of public interest. The proposed provisions are:
- to promote the protection of the privacy of individuals with regard to their personal information, and
- to recognise that the protection of the privacy of individuals is balanced with the interests of entities in carrying out their functions or activities undertaken in the public interest.
Definition of personal information
The Paper proposes amending the definition of personal information. That is necessary given the Federal Court’s analog decision for a digital age in Privacy Commissioner v Telstra Corporation Limited [2017] FCAFC 4, which made the definition of personal information unworkable and meaningless for information on the internet.
The current definition is:
“personal information” means information or an opinion about an identified individual, or an individual who is reasonably identifiable:
(a) whether the information or opinion is true or not; and
(b) whether the information or opinion is recorded in a material form or not.
The proposal is to:
- delete the word ‘about’ and insert ‘relates to’ in its place.
- include a non-exhaustive list of the types of information capable of being covered by the definition of personal information.
- define ‘reasonably identifiable’ to cover circumstances in which an individual could be identified, directly or indirectly. Include a list of factors to support this assessment.
Definition of collection
The Paper proposes amending the definition of collection to expressly cover information obtained from any source and by any means, including inferred or generated information.
Anonymity
The Paper proposes amending the Act to require personal information to be anonymised before it ceases to be protected by the Act, and recommends re-introducing the Privacy Amendment (Re-identification Offence) Bill 2016 with appropriate amendments. This Bill was a bad idea in 2016. It will not get better if reintroduced. Criminalising re-identification will not change any behaviour. It does not punish or even censure those who ineffectively de-identify documents. It is legislative virtue signalling and wishful thinking.
Flexibility of the APPs
The Paper recommends providing the Information Commissioner with increased powers to make APP Codes by amending the Act to:
- allow the Commissioner to make an APP code on the direction or approval of the Attorney-General:
- where it is in the public interest to do so without first having to seek an industry code developer, and
- where there is unlikely to be an appropriate industry representative to develop the code
- allow the Commissioner to issue a temporary APP code on the direction or approval of the Attorney-General if it is urgently required and where it is in the public interest to do so.
Proposal 3.3 recommends amending Part VIA of the Act to allow Emergency Declarations to be more targeted by prescribing their application in relation to:
- entities, or classes of entity
- classes of personal information, and
- acts and practices, or types of acts and practices.
Proposal 3.4 is more controversial in permitting organisations to disclose personal information to state and territory authorities when an Emergency Declaration is in force. Given the vagueness and potential breadth of declarations, this has the potential to undermine the integrity of the operation of the Privacy Act. In this era of widespread public health orders and the extensive use of emergency powers, it is likely not to attract the concern it should.
Notice of collection of personal information
The Paper proposes amending APP 5 by:
- introducing an express requirement that privacy notices must be clear, current and understandable.
- limiting notices to the following matters:
- the identity and contact details of the entity collecting the personal information
- the types of personal information collected
- the purpose(s) for which the entity is collecting and may use or disclose the personal information
- the types of third parties to whom the entity may disclose the personal information
- if the collection occurred via a third party, the entity from which the personal information was received and the circumstances of that collection
- the fact that the individual may complain or lodge a privacy request (access, correction, objection or erasure), and
- the location of the entity’s privacy policy which sets out further information.
- strengthening the requirement for when an APP 5 collection notice is required. That will involve requiring notification at or before the time of collection, or if that is not practicable as soon as possible after collection, unless:
- the individual has already been made aware of the APP 5 matters; or
- notification would be impossible or would involve disproportionate effort.
- require APP 5 notices to be clear, current and understandable, in particular for any information addressed specifically to a child.
- considering standardised privacy notices in the development of an APP code, such as the OP code, including standardised layouts, wording and icons. Consumer comprehension testing would be beneficial to ensure the effectiveness of the standardised notices.
These amendments and proposals are not controversial.
Consent to the collection, use and disclosure of personal information
Consent is a huge issue in privacy and in the collection of data. The amendments proposed are a good first step. Enforcement is necessary to instil in organisations what is required, because at the moment consent is cursory, inferred by spurious logic and sometimes non-existent.
The Paper:
- recommends defining consent as being voluntary, informed, current, specific, and an unambiguous indication through clear action.
- suggests considering standardised consents in the development of an APP code, including standardised layouts, wording, icons or consent taxonomies.
Additional protections for collection, use and disclosure of personal information
The Paper proposes amending the Act to:
- require that collection, use or disclosure of personal information under APP 3 and APP 6 must be fair and reasonable in the circumstances.
- incorporate legislated factors relevant to whether a collection, use or disclosure of personal information is fair and reasonable in the circumstances, including:
- Whether an individual would reasonably expect the personal information to be collected, used or disclosed in the circumstances
- The sensitivity and amount of personal information being collected, used or disclosed
- Whether an individual is at foreseeable risk of unjustified adverse impacts or harm as a result of the collection, use or disclosure of their personal information
- Whether the collection, use or disclosure is reasonably necessary to achieve the functions and activities of the entity
- Whether the individual’s loss of privacy is proportionate to the benefits
- The transparency of the collection, use or disclosure of the personal information, and
- If the personal information relates to a child, whether the collection, use or disclosure of the personal information is in the best interests of the child.
- include an additional requirement in APP 3.6 to the effect that where an entity does not collect information directly from an individual, it must take reasonable steps to satisfy itself that the information was originally collected from the individual in accordance with APP 3.
- define a ‘primary purpose’ as the purpose for the original collection, as notified to the individual.
- define a ‘secondary purpose’ as a purpose that is directly related to, and reasonably necessary to support, the primary purpose.
Restricted and prohibited acts and practices
The options the Paper recommends for reforming this part of the Privacy Act are:
Option 1
That APP entities must take reasonable steps to identify privacy risks and implement measures to mitigate those risks if they engage in the following restricted practices:
- Direct marketing, including online targeted advertising on a large scale
- The collection, use or disclosure of sensitive information on a large scale
- The collection, use or disclosure of children’s personal information on a large scale
- The collection, use or disclosure of location data on a large scale
- The collection, use or disclosure of biometric or genetic data, including the use of facial recognition software
- The sale of personal information on a large scale
- The collection, use or disclosure of personal information for the purposes of influencing individuals’ behaviour or decisions on a large scale
- The collection, use or disclosure of personal information for the purposes of automated decision making with legal or significant effects, or
- Any collection, use or disclosure that is likely to result in a high privacy risk or risk of harm to an individual.
Option 2
Increase an individual’s capacity to self-manage their privacy in relation to the specified restricted practices.
The possible measures include:
- consent (by expanding the definition of sensitive information),
- granting absolute opt-out rights in relation to restricted practices
- ensuring that explicit notice for restricted practices is mandatory.
Pro-privacy default settings
Regarding implementing some form of pro privacy default settings the Paper suggests two options:
Option 1
Have pro-privacy settings enabled by default. Where an entity offers a product or service that contains multiple levels of privacy settings, the entity must pre-select the most restrictive privacy settings.
Option 2
Require easily accessible privacy settings. Entities must provide individuals with an obvious and clear way to set all privacy controls to the most restrictive, such as through a single click mechanism.
Children and vulnerable individuals
Consent by children, and the protection of children, is at the forefront of the government’s mind in the context of social media and the internet in general. It is not surprising then that the Paper proposes to amend the Act to require consent to be provided by a parent or guardian where a child is under the age of 16.
The Review wants feedback on:
- whether APP entities should be permitted to assess capacity on an individualised basis where it is practical to do so.
- the circumstances in which parent or guardian consent must be obtained with 2 options being floated:
- Option 1 – Parent or guardian consent to be required before collecting, using or disclosing personal information of the child under the age of 16.
- Option 2 – In situations where the Act currently requires consent, including before the collection of sensitive information or as an available mechanism to undertake a secondary use or disclosure of personal information.
It is not surprising that the Paper makes no recommendation. The best regulations are those that do not differentiate between people by age or sex or any other point of differentiation. That said, children and those with a disability are rightly recognised as persons requiring special protection. The difficulty is crafting protections which are effective and easily understood.
Right to object and portability
The Paper proposes amendments to provide that:
- an individual may object or withdraw their consent at any time to the collection, use or disclosure of their personal information.
- on receiving notice of an objection, an entity must take reasonable steps to stop collecting, using or disclosing the individual’s personal information and must inform the individual of the consequences of the objection.
- the right to object includes an unqualified right to object to any collection, use or disclosure of personal information by an organisation for the purpose of direct marketing. If an organisation provides marketing materials to an individual, it must notify the individual of their right to object in relation to each marketing product provided.
Right to erasure of personal information
The Paper proposes that an individual may only request erasure of personal information where (subject to exceptions):
- the personal information must be destroyed or de-identified under APP 11.2
- the personal information is sensitive information
- an individual has successfully objected to personal information handling through the right to object
- the personal information has been collected, used or disclosed unlawfully
- the entity is required by or under an Australian law, or a court/tribunal order, to destroy the information, and
- the personal information relates to a child and erasure is requested by a child, parent or authorised guardian.
An APP entity must respond to an erasure request within a reasonable period and, if it refuses to erase the personal information because an exception applies, it must give the individual a written notice that sets out the reasons for the refusal and the mechanisms available to complain about it, unless it is unreasonable to do so.
This is a quite ineffective proposal. It is likely to be user-unfriendly.
Direct marketing, targeted advertising and profiling
The Paper recommends that:
- the use or disclosure of personal information for the purpose of influencing an individual’s behaviour or decisions must be a primary purpose notified to the individual when their personal information is collected.
- an APP entity would be required to include the following additional information in its privacy policy:
- whether it is likely to use personal information, alone or in combination with any other information, for the purpose of influencing an individual’s behaviour or decisions and if so, the types of information that will be used, generated or inferred to influence the individual, and
- whether it uses third parties in the provision of online marketing materials and if so, the details of those parties and information regarding the appropriate method of opting-out of those materials.
- repeal APP 7 in light of existing protections in the Act and other proposals for reform.
Automated decision-making
The Paper recommends requiring privacy policies to include information on whether personal information will be used in automated decision-making which has a legal, or similarly significant effect on people’s rights. This will not be the last word on AI or automated decision making. That will be a very significant issue and matter for government consideration in the future. It is not surprising that the Paper is so general in its commentary.
Accessing and correcting personal information
The Paper proposes the following:
- that an organisation must identify the source of personal information that it has collected indirectly, on request by the individual, unless it is impossible or would involve disproportionate effort;
- Introducing an additional ground on which an APP organisation may refuse a request for access to personal information:
the information requested relates to external dispute resolution services involving the individual, where giving access would prejudice the dispute resolution process.
- clarifying the existing access request process in APP 12 to the effect that:
an APP entity may consult with the individual to provide access to the requested information in an alternative manner, such as a general summary or explanation of personal information held, particularly where an access request would require the provision of personal information that is highly technical or voluminous in nature; and
where personal information is not readily understandable to an ordinary reader, an APP entity must provide an explanation of the personal information by way of a general summary of the information on request by an individual
These are improvements and good as far as they go.
Security and destruction of personal information
The Paper recommends amendments to:
- APP 11.1 to state that ‘reasonable steps’ includes technical and organisational measures, and to include a list of factors that indicate what reasonable steps may be required.
- APP 11.2 to require APP entities, where they no longer need the information, to take all reasonable steps to:
- destroy the information, or
- ensure that the information is anonymised.
Organisational accountability
The Paper recommends amending the Act:
- to introduce further organisational accountability requirements into the Act, targeting measures to where there is the greatest privacy risk, and
- to expressly require under APP 6 that APP entities determine, at or before using or disclosing personal information for a secondary purpose, each of the secondary purposes for which the information is to be used or disclosed and to record those purposes.
Overseas data flows
The Paper recommends:
- amending the Act to introduce a mechanism to prescribe countries and certification schemes under APP 8.2(a).
- having standard Contractual Clauses for transferring personal information overseas be made available to APP entities to facilitate overseas disclosures of personal information.
- removing the informed consent exception in APP 8.2(b).
- strengthening the transparency requirements regarding potential overseas disclosures to include:
- the countries to which personal information may be disclosed, and
- the specific personal information that may be disclosed overseas, in the entity’s up-to-date APP privacy policy required to be kept under APP 1.3.
- introducing a definition of ‘disclosure’ that is consistent with the current definition in the APP Guidelines.
- amending the Act to clarify what circumstances are relevant to determining what ‘reasonable steps’ are for the purpose of APP 8.1.
These amendments are improvements.
Enforcement
The Paper recommends amending the Act:
- to create tiers of civil penalty provisions. The purported reason is to give the OAIC more options so it can better target regulatory responses, including:
- A new mid-tier civil penalty provision for any interference with privacy, with a lesser maximum penalty than for a serious and repeated interference with privacy.
- A series of new low-level and clearly defined breaches of certain APPs with an attached infringement notice regime.
- to clarify what is a ‘serious’ or ‘repeated’ interference with privacy. This will be a very welcome development as the Commissioner has taken a very conservative approach to the meaning. I believe unreasonably so. She has not tested the meaning in the Federal Court despite having these powers since 2014. Whether this will prompt more assertive action from a famously timid regulator is another matter.
- so that the powers in Part 3 of the Regulatory Powers (Standard Provisions) Act 2014 (Regulatory Powers Act) would apply to investigations of civil penalty provisions in addition to the IC’s current investigation powers.
- to provide the Commissioner with the power to undertake public inquiries and reviews into specified matters. This may be of some use.
- at paragraphs 52(1)(b)(ii) and 52(1A)(c) to require an APP entity to identify, mitigate and redress actual or reasonably foreseeable loss. The amendment would be as underlined:
a declaration that the respondent must perform any reasonable act or course of conduct to identify, mitigate and redress any actual or reasonably foreseeable loss or damage suffered by the complainant/those individuals.
- give the Federal Court the power to make any order it sees fit after a section 13G civil penalty provision has been established.
The Paper also recommends introducing an industry funding model similar to ASIC’s incorporating two different levies:
- A cost recovery levy to help fund the OAIC’s provision of guidance, advice and assessments, and
- A statutory levy to fund the OAIC’s investigation and prosecution of entities which operate in a high privacy risk environment.
A very good recommendation is to amend the annual reporting requirements to increase transparency about the outcome of all complaints lodged including numbers dismissed under each ground. That will be welcome. The Information Commissioner’s Annual Reports on complaints and resolutions have been models of opacity. The office has trumpeted the number of complaints resolved but not given a proper breakdown on how they were resolved, including those dismissed.
The Paper posits 3 alternative regulatory models:
Option 1
Encourage greater recognition and use of EDRs. APP entities that handle personal information could be required to participate in an EDR scheme. APP entities that are not part of a recognised EDR scheme could be required to pay a fee for service to the OAIC as the default complaint handling provider if a complaint is made against them.
Option 2
Create a Federal Privacy Ombudsman that would have responsibility for conciliating privacy complaints in conjunction with relevant EDR schemes.
Option 3
Establish a Deputy Information Commissioner – Enforcement within the OAIC.
A direct right of action
In a welcome, if belated, proposal the Paper recommends creating a direct right of action with the following design elements:
- the action would be available to any individual or group of individuals whose privacy has been interfered with by an APP entity. A good idea.
- the action would be heard by the Federal Court or the Federal Circuit Court. The appropriate forum.
- the claimant would first need to make a complaint to the OAIC and have their complaint assessed for conciliation, either by the OAIC or by a recognised EDR scheme such as a relevant industry ombudsman. This is not a good idea. The lethargic approach of the Commissioner is a practical concern. The Commissioner has not been noted as an advocate for the consumer/complainant. The emphasis is on getting a resolution; whether it was as good for the complainant as it should have been is secondary. This is the model used in the Victorian Privacy and Data Protection Act 2014. It is questionable whether it is a good model in practice, even if it seems reasonable on paper.
- the complainant could then elect to initiate action in court where the matter is deemed unsuitable for conciliation, conciliation has failed, or the complainant chooses not to pursue conciliation. The complainant would need to seek leave of the court to make the application. Seeking leave of the court is not a good idea. It interposes a preliminary process, with the costs that come with that, that will involve an assessment at a very preliminary stage of the merits of the claim and the significance of any damage. It creates an artificial hurdle which should not exist.
- the OAIC would have the ability to appear as amicus curiae to provide expert evidence at the request of the court. Remedies available under this right would be any order the court sees fit, including any amount of damages. These processes are welcome.
A statutory tort of privacy
Notwithstanding that wiser and more learned minds at the Australian, Victorian and New South Wales Law Reform Commissions, as well as the ACCC, have recommended a statutory tort of privacy, the Paper still dilly-dallies with the proposal and floats options. It is as if this issue is on a permanent loop, with no body politic interceding to recognise the lacuna in the law in this area and formulate a statutory cause of action. In the meantime the tired exercise of floating options continues with:
Option 1:
Introduce a statutory tort for invasion of privacy as recommended by the ALRC Report 123. This is the best model and most appropriate course.
Option 2
Introduce a minimalist statutory tort that recognises the existence of the cause of action but leaves the scope and application of the tort to be developed by the courts. This is a reasonable second-best option, although Australian courts have found dealing with privacy as a concept quite difficult and have referenced equitable principles to the detriment of claims.
Option 3:
Do not introduce a statutory tort and allow the common law to develop as required. However, extend the application of the Act to individuals in a non-business capacity for collection, use or disclosure of personal information which would be highly offensive to an objective reasonable person. This is not a dreadful option; however, there is no guarantee that the law will develop organically. So far the Federal Court has made a dreadful mess of the privacy-related cases it has considered. It has demonstrated difficulty applying the Principles and the general definitions to the claims brought before it. There is every chance that this option will result in continuing stagnation in this area of law.
Option 4:
In light of the development of the equitable duty of confidence in Australia, states could consider legislating that damages for emotional distress are available for equitable breach of confidence. A terrible idea. The development of the equitable duty of confidence has been lethargic, and it is a poor vessel for privacy-related matters. The UK courts developed a stand-alone tort of interference with privacy because the equitable cause of action was never a good mechanism for properly adjudicating such disputes.
Notifiable Data Breaches scheme
The Paper makes some minor recommendations about the complex and generally ineffective Notifiable Data Breaches provisions of the Privacy Act by recommending:
- amendments to subsections 26WK(3) and 26WR(4) to the effect that a statement about an eligible data breach must set out the steps the entity has taken or intends to take in response to the breach, including, where appropriate, steps to reduce any adverse impacts on the individuals to whom the relevant information relates. All quite reasonable as far as it goes, but it does not tackle the real problem with the Scheme: the vagueness, and the unworkable balancing self-assessment exercise set out in the legislation, which favours non-disclosure.
- that the Attorney-General’s Department develop a privacy law design guide to support Commonwealth agencies when developing new schemes with privacy-related obligations. That would be nice.
- encouraging regulators to continue to foster regulatory cooperation in enforcing matters involving mishandling of personal information. A classic public service bromide.