Royal Free London NHS Foundation Trust enters into undertaking after breaching the Data Protection Act by turning over sensitive medical data of around 1.6 million patients to DeepMind

July 15, 2017

The UK Information Commissioner’s Office (the “ICO”) has its detractors; however, as a regulator it has been far more energetic than its Australian equivalent.  The legislative structure is different, as is the resourcing.  The UK Data Protection Act provides more scope for enforcement action, and the penalties can be swingeing.  That said, the ICO’s approach of combining education, the carrot, with high-profile and tough regulatory action, monetary penalty notices, highlights a difference with the Office of the Australian Information Commissioner, which has been all about the education and very little about the enforcement.  That has had a deleterious effect on privacy and data protection compliance in Australia.

The ICO took action against the Royal Free London NHS Foundation Trust for failing to comply with the Data Protection Act when it provided patient details to Google DeepMind.  The ICO found shortcomings in the Trust’s handling of the personal data of 1.6 million patients, including its failure to advise patients that their data would be used.  This is a timely decision given the pressure across the world to apply big data analytics to health and other sensitive records.

The ICO’s media release provides:

The ICO has ruled the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google DeepMind.

The Trust provided personal data of around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for acute kidney injury.

But an ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.

The Trust has been asked to commit to changes ensuring it is acting in line with the law by signing an undertaking.

Elizabeth Denham, Information Commissioner, said:

“There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.

“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.

“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”

Following the ICO investigation, the Trust has been asked to:

  • establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
  • set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
  • complete a privacy impact assessment, including specific steps to ensure transparency; and
  • commission an audit of the trial, the results of which will be shared with the Information Commissioner, and which the Commissioner will have the right to publish as she sees appropriate.

The undertaking we have asked the Trust to sign, and the letter outlining the conclusions of the ICO’s investigation, have both been published today.

Interestingly, the ICO followed up the media release and the publication of the undertaking with a blog post highlighting the four lessons that can be learned from this incident:

It’s not a choice between privacy or innovation


It’s welcome that the trial looks to have been positive. The Trust has reported successful outcomes. Some may reflect that data protection rights are a small price to pay for this.

But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I’ve every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work. This will also be true for the wider NHS as deployments of innovative technologies are considered.

Don’t dive in too quickly

Privacy impact assessments are a key data protection tool of our era, as evolving law and best practice around the world demonstrate. They play an increasingly prominent role in data protection, and they’re a crucial part of digital innovation. Our investigation found that the Trust did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data. This is not how things should work.

The vital message to take away is that you should carry out your privacy impact assessment as soon as practicable, as part of your planning for a new innovation or trial. This will allow you to factor in your findings at an early stage, helping you to meet legal obligations and public expectations.

New cloud processing technologies mean you can, not that you always should

Changes in technology mean that vast data sets can be made more readily available and processed faster, using more powerful data processing technologies. That’s a positive thing, but just because evolving technologies allow you to do more doesn’t mean these tools should always be fully utilised, particularly during a trial initiative.

In this case, we haven’t been persuaded that it was necessary and proportionate to disclose 1.6 million patient records to test the application. NHS organisations, perhaps more than any other sector, need to remember that we are talking about the medical information of real patients. This means you should consider whether the benefits are likely to be outweighed by the data protection implications for your patients. Apply the proportionality principle as a guiding factor in deciding whether you should move forward.
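The proportionality point above is, in engineering terms, a data minimisation problem: a test extract should contain only the fields a trial actually needs, and only as many records as the test requires. Purely by way of illustration (the field names, cohort size, and sampling approach below are assumptions for the sketch, not anything drawn from the ICO’s findings or the Streams trial), a minimisation step might look like:

```python
import random

# Hypothetical field whitelist: the attributes the test algorithm actually
# needs, rather than the full patient record.
REQUIRED_FIELDS = {"patient_id", "creatinine_results", "admission_date"}

def minimise_records(records, sample_size, seed=0):
    """Return a de-scoped, down-sampled extract of `records`.

    `records` is a list of dicts. Only REQUIRED_FIELDS are retained in each
    record, and at most `sample_size` records are released for testing.
    """
    rng = random.Random(seed)  # fixed seed so the test cohort is reproducible
    sample = rng.sample(records, min(sample_size, len(records)))
    return [{k: r[k] for k in REQUIRED_FIELDS if k in r} for r in sample]

# Example: 1,000 full records reduced to a 50-record, 3-field test extract.
full = [
    {"patient_id": i, "name": f"Patient {i}",
     "creatinine_results": [], "admission_date": "2015-01-01"}
    for i in range(1000)
]
extract = minimise_records(full, sample_size=50)
assert len(extract) == 50
assert all(set(r) <= REQUIRED_FIELDS for r in extract)  # e.g. "name" is dropped
```

The design point, whatever the implementation, is that the burden sits with the data controller: decide up front which fields and how many records the test genuinely requires, and document that decision, rather than defaulting to the full data set.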

Know the law, and follow it

No-one suggests that red tape should get in the way of progress. But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason. Just as you wouldn’t ignore the provisions of the Health and Social Care Act, or any other law, don’t ignore the Data Protection Act: you need a legal basis for processing personal data. Whether you contact the ICO or obtain expert data protection advice as early as possible in the process, get this right from the start and you’ll be well-placed to make sure people’s information rights aren’t the price of improved health.

It is a very useful approach.  One that the Privacy Commissioner in Australia does not follow.  Pity.

The Undertaking relevantly provides:

(12) The Commissioner’s investigation determined that DeepMind processed approximately 1.6 million partial patient records to enable the clinical safety testing of the Streams application by the Trust. However, it is the Commissioner’s view that patients were not adequately informed that their records would be processed for the purpose of clinical safety testing.

(14) Further, the Commissioner is not satisfied that the Trust has, to date, properly evidenced a condition for processing that would otherwise remove the need for the Trust to obtain the informed consent of the patients involved for the processing of personal data for the clinical safety testing of the application prior to live deployment. As a result, during the Commissioner’s investigation and to the Commissioner’s satisfaction, the data controller has not been able to evidence a valid condition for processing personal data under Schedule 2 to the Act during the clinical safety testing phase of the application, or to evidence a valid condition for processing sensitive personal data under Schedule 3 to the Act during that phase. The Commissioner has therefore required the Trust to provide evidence that any future testing arrangements with DeepMind will comply with a processing condition in Schedules 2 and 3 to the Act.

(16) An estimated 1.6 million partial patient records were processed by DeepMind on the Trust’s behalf. The Commissioner has considered the Trust’s representations as to why it was necessary for so many records to be used to support the clinical safety testing of the application. The Commissioner is not persuaded that proper consideration was given to the necessity of processing so many patients’ records. As such, the Commissioner is of the view that the Trust has failed to demonstrate that the processing of such a large number of partial records was both necessary and proportionate to the purpose pursued by the data controller, and that the processing was potentially excessive. The Commissioner did not receive evidence of whether lower volumes of records could have been used during the testing phase. Whilst the rationale for using the full range of records in the live clinical setting is now clearer, the Commissioner emphasises the importance of assessing proportionality in future iterations of the application for testing or clinical purposes.

(17) The Commissioner’s investigation has determined that patients were not provided with sufficient information about the processing and, as a result, would have been unable to exercise their rights to prevent the processing of their personal data under section 10 of the Act. As set out above, the Trust has now taken further steps to ensure patients are aware of the use of their data for clinical safety testing and of their ability to opt out from such testing. In the Commissioner’s view, this was not the case in 2015 and early 2016.

(18) Principle Seven requires that where a data processor carries out processing on behalf of a data controller, a contract evidenced in writing must be in place. Although there was a written information sharing agreement in place at the time DeepMind was given access to the data, which set out the parties’ roles and imposed security obligations on the processor, the Commissioner’s investigation has determined that this agreement did not, in the Commissioner’s view, go far enough to ensure that the processing was undertaken in compliance with the Act. Specifically, it is the Commissioner’s view that the information sharing agreement of 30 September 2015 did not contain enough detail to ensure that only the minimal possible data would be processed by DeepMind and that the processing would only be conducted for limited purposes. It is the Commissioner’s view that the requirements DeepMind must meet and maintain in respect of the data were not clearly stated. The Commissioner is also concerned to note that the processing of such a large volume of records containing sensitive health data was not subject to a privacy impact assessment ahead of the project’s commencement.
The Trust issued a statement, well massaged by the PR department, seeking to extract the best from what was a melancholy experience.  It provides:

We passionately believe in the power of technology to improve care for patients and that has always been the driving force for our Streams app.

We are pleased that the information commissioner supports this approach and has allowed us to continue using the app which is helping us to get the fastest treatment to our most vulnerable patients – potentially saving lives.

We have co-operated fully with the ICO’s investigation which began in May 2016 and it is helpful to receive some guidance on the issue about how patient information can be processed to test new technology. We also welcome the decision of the Department of Health to publish updated guidance for the wider NHS in the near future.

We have signed up to all of the ICO’s undertakings and accept their findings. We have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.

We look forward to working with the ICO to ensure that other hospitals can benefit from the lessons we have learnt.
