UK and US issue statement on the protection of children online

October 22, 2024

One of the key challenges of regulating the internet is how to protect the interests of children, including their privacy, in a way that is effective. Protecting children covers a very broad spectrum of activities: protecting their personal information, shielding them from damaging images, minimising the adverse effects of social media on children’s mental and physical health, eradicating the transmission of explicit images from adults to children or between children, and stopping child pornography. It is difficult enough to regulate within a country, let alone deal with extraterritoriality. South Australia is considering banning social media for children. The report by former Chief Justice French proposes a Children (Social Media Safety) Bill 2024, which would impose a positive obligation on social media platforms to prevent access by children under the age of 14. It is complex.

The UK and the US have issued a joint statement on the protection of children online.

The statement provides:

The United Kingdom and the United States share fundamental values and a commitment to democracy and human rights, including privacy and freedom of expression. Both the United Kingdom and the United States, alongside our international partners, are taking steps to support children’s online safety.

To make the internet safer for children, we should aim to ensure all users have the skills and resources they need to make safe and informed choices online and advance stronger protections for children. The United States and the United Kingdom intend to work with our national institutions and organisations to support these goals and shared values. To help further these aims, both countries plan to establish a joint children’s online safety working group to advance the aims and principles of this statement.

UK Information Commissioner fines Police Service of Northern Ireland £750,000 for exposing the personal information of its entire workforce

October 21, 2024

When it comes to poor data security practices and serious data breaches, police and health service providers are generally among the worst performers. Both have serious cultural problems in treating personal information as confidential. Both often have serious system problems, especially with their IT. The UK Information Commissioner’s £750,000 fine of the Police Service of Northern Ireland is the most recent example. Here the breach was the very common human error of uploading a document onto a webpage. That happens quite regularly. Here the document contained the personal information of all employees of the Police Service of Northern Ireland. The consequences were baleful. The quality assurance processes failed. While the personal information was viewable for only three hours, the Police Service is working on the assumption that the information was accessed by dissident republicans, who would use it to intimidate.

The media release provides:

We have fined Police Service of Northern Ireland (PSNI) £750,000 for exposing the personal information of its entire workforce, leaving many fearing for their safety.
Our investigation found that simple-to-implement procedures could have prevented the serious breach, in which hidden data on a spreadsheet released as part of a freedom of information request revealed the surnames, initials, ranks and roles of all 9,483 PSNI officers and staff.
Mindful of the current financial position at PSNI and not wishing to divert public money from where it is needed, the Commissioner used his discretion to apply the public sector approach in this case. Had this not been applied, the fine would have been £5.6 million.

Summary of the breach

On 3 August 2023, PSNI received two freedom of information requests from the same person via WhatDoTheyKnow (WDTK). The first asked for “… the number of officers at each rank and number of staff at each grade …”, the second asking for a distinction between “how many are substantive / temporary / acting …”.
The information was downloaded as an Excel file with a single worksheet from PSNI’s human resources management system (SAP). The data included: surnames and first name initials, job role, rank, grade, department, location of post, contract type, gender and PSNI service and staff number.

As the information was analysed for disclosure, multiple other worksheets were created within the downloaded Excel file. On completion, all visible onscreen worksheet tabs were deleted from the Excel file. The original worksheet, containing the personal details, remained unnoticed and this was also not picked up despite quality assurance. The file was subsequently uploaded to the WDTK website at 14:31 hours on 8 August.
PSNI was alerted to the breach by its own officers at approximately 16:10 hours the same day. The file was hidden from view by WDTK at 16:51 hours and deleted from the website at 17:27 hours.
Six days later, PSNI announced they were working on the assumption that the file was in the hands of dissident republicans and that it would be used to create fear and uncertainty and for intimidation.
John Edwards, UK Information Commissioner said:

The Australian Information Commissioner releases AI guidance

AI presents a major regulatory challenge across a range of governmental and private activities, and that is especially the case with privacy. The UK Information Commissioner’s Office has issued detailed guidance and other resources on artificial intelligence. The US Federal Trade Commission has raised issues with AI by way of a Big Data report in 2016, by a post in 2017, by guidance in Q&A form in 2020, and by a finding on the use of artificial intelligence, In the Matter of DoNotPay, Inc., Matter Number 2323042, 25 September 2024. Which brings us to the Australian Information Commissioner’s release of AI guidance today. There are actually two guides: the first covers the use of commercially available AI products; the second relates to developers using personal information to train AI models.

AI needs personal information to work properly. Lots of it. Each of the guides highlights the care that needs to be taken in considering the operation of the Privacy Act when using and developing artificial intelligence.

The media release provides:

New guides for businesses published today by the Office of the Australian Information Commissioner (OAIC) clearly articulate how Australian privacy law applies to artificial intelligence (AI) and set out the regulator’s expectations.

The first guide will make it easier for businesses to comply with their privacy obligations when using commercially available AI products and help them to select an appropriate product. The second provides privacy guidance to developers using personal information to train generative AI models.

“How businesses should be approaching AI and what good AI governance looks like is one of the top issues of interest and challenge for industry right now,” said Privacy Commissioner Carly Kind.

“Our new guides should remove any doubt about how Australia’s existing privacy law applies to AI, make compliance easier, and help businesses follow privacy best practice. AI products should not be used simply because they are available.

“Robust privacy governance and safeguards are essential for businesses to gain advantage from AI and build trust and confidence in the community,” she said.

The new guides align with OAIC focus areas of promoting privacy in the context of emerging technologies and digital initiatives, and improving compliance through articulating what good looks like.

“Addressing privacy risks arising from AI, including the effects of powerful generative AI capabilities being increasingly accessible across the economy, is high among our priorities,” Commissioner Kind said.

“Australians are increasingly concerned about the use of their personal information by AI, particularly to train generative AI products.

“The community and the OAIC expect organisations seeking to use AI to take a cautious approach, assess risks and make sure privacy is a key consideration. The OAIC reserves the right to take action where it is not.”

While the guidance addresses the current situation – concerning the law, state of technology and practices – Commissioner Kind said an important focus remains how AI privacy protections could be strengthened for the benefit of society as a whole.

“With developments in technology continuing to evolve and challenge our right to control our personal information, the time for privacy reform is now,” said Commissioner Kind.

“In particular, the introduction of a positive obligation on businesses to ensure personal information handling is fair and reasonable would help to ensure uses of AI pass the pub test.”

The OAIC has published a blog post with further information about the privacy guidance for developers using personal information to train generative AI models.

The first guide

With the Federal Government proposing a statutory tort of interference with privacy, a story about a homeowner pointing CCTV into a neighbour’s backyard

October 17, 2024

Nine News reports, in ‘Am I justified?’: Homeowner installs CCTV camera pointing straight into neighbour’s backyard, on a homeowner installing a camera pointing into a neighbour’s yard. At the moment the legal options are cumbersome and generally ineffective. There is no tort of harassment, it would be difficult to successfully argue nuisance, and it is not possible to argue trespass. A tort of interference with privacy would, however, deal with such egregious conduct. As the story makes clear, the Privacy Act does not apply. Not so in the UK, where the Information Commissioner does have powers and has issued guidance on the placement of CCTV cameras. The Commissioner has stated, in summary, that:

Where possible owners should position their cameras to only capture their own property. However, if this isn’t possible and the CCTV captures someone else’s property, a public area or communal space, then data protection law applies. This is because CCTV can capture images and voices of other people, and this counts as their personal information.

It is not theoretical. In 2021, in Dr Mary Fairhurst v Mr Jon Woodard, an Oxford County Court ordered the defendant to pay £100,000 for breach of the Data Protection Act in collecting

UK Information Commissioner’s Office reprimands UK law firm Levales Solicitors for poor protection of data affected by a data breach

October 16, 2024

Law firms are prime targets for data breaches. One need only look at the recent massive data breach at HWL Ebsworth. Entry into law firms can be through a range of third-party providers, such as IT services. The UK Information Commissioner has reprimanded UK law firm Levales Solicitors for breaching the General Data Protection Regulation. The incident affected 8,234 UK individuals, of whom 863 were deemed at high risk because of the nature of the data involved.

According to the reprimand:

  • The breach occurred after an unknown threat actor gained access to the secure cloud based server via legitimate credentials, later publishing the data on the dark web
  • 8,234 UK data subjects were affected, of which 863 were deemed to be at ‘high-risk’ of harm or detriment due to the special category of data including criminal data pertaining to ‘homicide, terrorism, sexual offences, offences involving children or particularly vulnerable adults’.
  • the data involved was:
    • Name
    • Date of Birth
    • Address
    • National Insurance Number
    • Prisoner Number
    • Health Status
    • Details of Criminal allegations not charged
    • Details of Criminal allegations prosecuted
    • Outcomes of investigations and prosecutions
    • Details of complainants and victims both adult and children
    • Previous Convictions
    • Legally privileged information and advice
  • Levales did not implement appropriate technical and organisational measures to ensure their systems were secure because, while outsourcing their IT management to a third party, they were unaware of the security measures in place, such as detection, prevention and monitoring.
  • Levales had not reviewed whether the technical measures associated with the contract were appropriate for the personal data they were processing since the contract was first signed in 2012.


Privacy and Other Legislation Amendment Bill 2024 – second reading speeches begin

Three further second reading speeches have been published: from Paul Fletcher (Liberal) on 8 October 2024, Graham Perrett (Labor) and Max Chandler-Mather (Greens). None is particularly illuminating. All follow predictable paths. Perrett recounts what is in the bill and why that is for the good. Fletcher makes fair criticisms about the selective approach to reform, less fair criticisms about the delay in banning doxxing, and a generally confused complaint about the statutory tort, as much about the process as about the benefit or otherwise of having a tort. The problem with the process argument is that a statutory tort has been recommended by the Australian Law Reform Commission since 2008. Its 2014 report also recommended such a tort, as did the Attorney-General’s Department’s report. There can be no serious complaint about ambush or lack of knowledge. The reality is that the Coalition has always been hostile to a statutory tort. At least it is reserving its position until the completion of the Senate committee process, where there will be long and loud complaining by the business sector.

The crossbench has proposed amendments:

By Kylea Tink:

(1)  Schedule 2, item 10, page 67 (line 19), after “privacy was”, insert “expressly”.
[defences]
(2)  Schedule 2, item 10, page 71 (line 13), after “journalistic material”, insert “about matters of public interest”.
[public interest journalism]
(3) Schedule 2, item 10, page 72 (lines 6 to 8), omit all the words from and including “reasonably believes” to the end of clause 16, substitute:
(a) reasonably believes that the invasion of privacy is reasonably necessary for one or more enforcement related activities conducted by, or on behalf of, an enforcement body; and
(b) is conducting a lawful investigation in respect of a serious crime.
[enforcement bodies]
(4) Schedule 2, item 10, page 72 (line 15), at the end of clause 17, add:
; to the extent that the intelligence agency is conducting a lawful national security operation.
[intelligence agencies]

By Zoe Daniel:

(1) Clause 2, page 2 (after table item 7), insert:
7A. Schedule 1, Part 16
The day after this Act receives the Royal Assent.
[commencement]
(2) Schedule 1, page 58 (after line 27), at the end of the Schedule, add:
Part 16—Miscellaneous amendments
Privacy Act 1988
90 Subsection 6(1) (definition of consent)
Repeal the definition, substitute:
consent means voluntary, informed, current, specific, and unambiguous indication through clear action, which has not since been withdrawn.
91 Subsection 6(1) (definition of personal information)
Repeal the definition, substitute:
personal information: see section 6AAA.
92 After section 6
Insert:
6AAA Meaning of personal information
(1) In this Act, personal information means information or an opinion that relates to an identified individual, or an individual who is reasonably identifiable:
(a) whether the information or opinion is true or not; and
(b) whether the information or opinion is recorded in a material form or not.
Note: Section 187LA of the Telecommunications (Interception and Access) Act 1979 extends the meaning of personal information to cover information kept under Part 5-1A of that Act.
(2) For the purposes of this section, an individual is reasonably identifiable if they are capable of being distinguished from all other individuals, regardless of whether or not their identity is known.
93 Application of amendments
The amendments of section 6 of the Privacy Act 1988 made by this Part, and section 6AAA of the Privacy Act 1988 as inserted by this Part, apply in relation to acts done, or practices engaged in, after the commencement of this item.
[definitions]

Fletcher’s second reading speech provides:

I rise to speak on the Privacy and Other Legislation Amendment Bill 2024. This is a bill that’s been in the pipeline for some time, yet it is a very curious creation. It seems to have been cobbled together from a range of different parts. Each of these parts does something different. They have different objectives, and they respond to different stakeholders. They are all somehow related to privacy, but they each have their own merits and drawbacks. It just does not sit together well as a whole. All the indications are that this bill was hastily stitched together at the last minute.

Cyber Security Bill 2024 introduced into the House of Representatives yesterday

October 10, 2024

Yesterday the Government introduced into the House of Representatives the Cyber Security Bill 2024. The Minister’s Second Reading Speech set out the operation of the Bill.

Features of the Bill include:

  • provisions relating to victims of “ransomware” – malicious software cyber criminals use to block access to crucial files or data until a ransom has been paid. Victims of ransomware attacks who make payments must report the payment to authorities.
  • new obligations for the National Cyber Security Coordinator and Australian Signals Directorate on how they can use information provided to them by businesses and industry about cyber security incidents.
  • a requirement that organisations in critical infrastructure – such as energy, transport, communications, health and finance – strengthen the programs used to secure individuals’ private data.
  • increased investigative powers for the Cyber Incident Review Board. It will be able to conduct “no-fault” investigations after significant cyber attacks and share findings to promote improvements in cyber security practices.
  • new minimum cyber security standards for all smart devices, such as watches, televisions, speakers and doorbells. Those standards will include secure default settings, unique device passwords, regular security updates and encryption of sensitive data.

The Second Reading speech provides:

In introducing this legislation, I acknowledge the work done in its development from the former Minister for Home Affairs, now the Minister for Housing, and also acknowledge the work of the very large number of members of the Department of Home Affairs in the cybersecurity section, who have worked for some years in the development of the legislation in the national interest that I present to the House today.

This bill, alongside the Intelligence Services and Other Legislation Amendment (Cyber Security) Bill and the Security of Critical Infrastructure and Other Legislation Amendment (Enhanced Response and Prevention) Bill, form the cybersecurity legislative reforms package. This package will collectively strengthen our national cyber defences and build cyber-resilience across the Australian economy.

Privacy and Other Legislation Amendment Bill 2024 – Government moves the Second Reading and publishes Second Reading speech

October 8, 2024

The Government has published the Second Reading Speech and adjourned debate on the Bill. The Second Reading Speech is dated 12 September 2024; however, the Daily Program lists the Speech as being moved today. It only recently appeared on the Bill’s homepage.

The Bill provides the Privacy Commissioner with more flexibility in enforcement, allowing for infringement notices and new civil penalties. The real issue there is getting the Commissioner to use those powers. The existing civil penalty provisions have been used only twice, and then only very recently; neither case has reached resolution.

The statutory tort for serious invasions of privacy is welcome; however, the exemption carve-outs for journalism, law enforcement and security limit its effectiveness. There is no consideration of whether the actions of a journalist are excessive and irresponsible in breaching a person’s privacy. In the UK there is a balancing between Article 8, the right to privacy, and Article 10, freedom of expression, as it applies to the media.

There is specific provision for the development of a Children’s Privacy Code. According to the Attorney-General, that is designed to align the protections with those that exist overseas.

Doxxing will be criminalised.

There are other provisions which clarify the sharing of information when there are data breaches and during emergencies and regarding overseas data flows.

The amendments are conservative and modest but a move in the right direction. These changes will not make Australia’s Privacy Act the gold standard, but if the further reforms proposed by the Attorney-General’s Department are implemented, the level of protection will allow for more effective regulation.

The Second Reading provides:

Introduction

The digital economy has unleashed enormous benefits for Australians. But it has also increased the privacy risks we face through the collection and storage of enormous amounts of our personal data.

The Privacy Act 1988 represented the first time that a comprehensive, integrated set of legal rules protecting interests in privacy existed in Australia. On introducing it, Attorney-General Lionel Bowen told the parliament that ‘enormous developments in technology for the processing of information are providing new and, in some respects, undesirable opportunities for the greater use of personal information.’

In that respect, little has changed. Evolutions in technology and the way people use it continue to vex those who share information online, and those charged with regulating it. It is essential that Australians are protected by a legal framework that is flexible and agile enough to adapt to changes in the world around them.

The Privacy Act has not kept pace with the adoption of digital technologies. The vast data flows that underpin digital ecosystems have also created the conditions for significant harms—like major data breaches that have revealed the sensitive information of millions of Australians, exposing us to the risk of identity fraud and scams.

US Department of Health and Human Services fines Providence Medical Institute $240,000 after ransomware attacks

In the United States the fines for breaches of data security can be quite heavy, much heavier than in Australia. As in Australia, more than one regulator can take action against organisations on various grounds for breaches of data security. The US Department of Health and Human Services (HHS) Office for Civil Rights (OCR) announced, found here, the notice of final determination imposing a civil penalty of $240,000 against Providence Medical Institute for potential violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Security Rule, following a ransomware attack breach report investigation. The final determination is found here.

As can be the way, the background has quite a long history. In July 2016, Providence acquired the Center for Orthopedic Specialists and initiated a two-year transition plan linking the Center for Orthopedic Specialists’ IT system into Providence’s IT structure. In April 2018 Providence filed a breach report, which resulted in an investigation. The breach report concerned the unauthorized access and encryption of the Center for Orthopedic Specialists’ systems on February 18, 2018, February 25, 2018, and March 4, 2018. The attacks compromised

Court of Justice of the European Union rules that Meta must minimise the amount of personal information used for personalised advertising, in this case information about sexual orientation

October 7, 2024

Max Schrems has struck again. He has been successful in his claim against Meta over the use of data about a user’s sexual orientation in personalised advertising, as reported by the BBC in Meta must limit data for personalised ads – EU court and by Breaking News in Activist wins privacy case against Meta over personal data on sexual orientation.

Meta and other social media platforms use data to drive the effectiveness of personalised ads. That means the collection of data, especially personal information, is a priority. In practice, sensitive information, such as sexual orientation, may assist in refining the nature of the ads directed at a person.

The final judgment has not been published as yet. 

The BBC article provides:

Facebook-owner Meta must minimise the amount of people’s data it uses for personalised advertising, the EU’s highest court says.

The Court of Justice for the European Union (CJEU) ruled in favour of privacy campaigner Max Schrems, who complained that Facebook misused his personal data about his sexual orientation to target ads at him.

In complaints first heard by Austrian courts in 2020, Mr Schrems said he was targeted with adverts aimed at gay people despite never sharing information about his sexuality on the platform.

The CJEU said on Friday that data protection law does not unequivocally allow the company to use such data for personalised advertising.

“An online social network such as Facebook cannot use all of the personal data obtained for the purposes of targeted advertising, without restriction as to time and without distinction as to type of data,” it said.

Data relating to someone’s sexual orientation, race or ethnicity or health status is classed as sensitive and carries strict requirements for processing under EU data protection law.

Meta says it does not use so-called special category data to personalise adverts.

“We await the publication of the Court’s judgment and will have more to share in due course,” said a Meta spokesperson responding to a summary of the judgement on Friday.

They said the company takes privacy “very seriously” and it has invested more than five billion Euros “to embed privacy at the heart of all of our products”.

Facebook users can also access a wide range of tools and settings to manage how their information is used, they added.

“We are very pleased by the ruling, even though this result was very much expected,” said Mr Schrems’ lawyer Katharina Raabe-Stuppnig.

“Following this ruling only a small part of Meta’s data pool will be allowed to be used for advertising – even when users consent to ads,” they added.
