Royal Free London NHS Foundation Trust enters into undertaking over breach of the Data Protection Act in turning over sensitive medical data of around 1.6 million patients to DeepMind
July 15, 2017
The UK Information Commissioner’s Office (the “ICO”) has its detractors, but as a regulator it has been far more energetic than its Australian equivalent. The legislative structure is different, as is the resourcing. The UK Data Protection Act provides more scope for enforcement action, and the penalties can be swingeing. That said, the ICO’s approach of combining education, the carrot, with high-profile and tough regulatory action, such as monetary penalty notices, highlights a difference with the Office of the Australian Information Commissioner, which has been all about education and very little about enforcement. That has had a deleterious effect on privacy and data protection compliance in Australia.
The ICO took action against the Royal Free London NHS Foundation Trust for failing to comply with the Data Protection Act when it provided patient details to Google DeepMind. The ICO found that the Trust mishandled the personal data of 1.6 million patients and failed to advise patients that their data would be used. This is a timely decision given the pressure across the world to use big data analytics on health and other sensitive records.
The ICO’s media release provides:
The ICO has ruled the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided patient details to Google DeepMind.
The Trust provided personal data of around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for acute kidney injury.
But an ICO investigation found several shortcomings in how the data was handled, including that patients were not adequately informed that their data would be used as part of the test.
The Trust has been asked to commit to changes ensuring it is acting in line with the law by signing an undertaking.
Elizabeth Denham, Information Commissioner, said:
“There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.
“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.
“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”
Following the ICO investigation, the Trust has been asked to:
- establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
- set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
- complete a privacy impact assessment, including specific steps to ensure transparency; and
- commission an audit of the trial, the results of which will be shared with the Information Commissioner, and which the Commissioner will have the right to publish as she sees appropriate.
The undertaking we have asked the Trust to sign, and the letter outlining the conclusions of the ICO’s investigation, have both been published today.
Interestingly, the ICO followed up the media release and the publication of the undertaking with a blog post highlighting four lessons that can be learnt from this incident:
It’s not a choice between privacy or innovation
It’s welcome that the trial looks to have been positive. The Trust has reported successful outcomes. Some may reflect that data protection rights are a small price to pay for this.
But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I’ve every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work. This will also be true for the wider NHS as deployments of innovative technologies are considered.
Don’t dive in too quickly
Privacy impact assessments are a key data protection tool of our era, as evolving law and best practice around the world demonstrate; they play an increasingly prominent role in data protection, and they’re a crucial part of digital innovation. Our investigation found that the Trust did carry out a privacy impact assessment, but only after Google DeepMind had already been given patient data. This is not how things should work.
The vital message to take away is that you should carry out your privacy impact assessment as soon as practicable, as part of your planning for a new innovation or trial. This will allow you to factor in your findings at an early stage, helping you to meet legal obligations and public expectations.
New cloud processing technologies mean you can, not that you always should
Changes in technology mean that vast data sets can be made more readily available and processed faster, using more powerful data processing technologies. That’s a positive thing, but just because evolving technologies allow you to do more doesn’t mean these tools should always be fully utilised, particularly during a trial initiative.
In this case, we haven’t been persuaded that it was necessary and proportionate to disclose 1.6 million patient records to test the application. NHS organisations, perhaps more than any other sector, need to remember that we are talking about the medical information of real patients. This means you should consider whether the benefits are likely to be outweighed by the data protection implications for your patients. Apply the proportionality principle as a guiding factor in deciding whether you should move forward.
Know the law, and follow it
No-one suggests that red tape should get in the way of progress. But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason. Just as you wouldn’t ignore the provisions of the Health and Social Care Act, or any other law, don’t ignore the Data Protection Act: you need a legal basis for processing personal data. Whether you contact the ICO or obtain expert data protection advice as early as possible in the process, get this right from the start and you’ll be well-placed to make sure people’s information rights aren’t the price of improved health.
It is a very useful approach, and one that the Privacy Commissioner in Australia does not follow. Pity.
The Trust’s statement in response to the investigation relevantly provides:
We passionately believe in the power of technology to improve care for patients and that has always been the driving force for our Streams app.
We are pleased that the information commissioner supports this approach and has allowed us to continue using the app which is helping us to get the fastest treatment to our most vulnerable patients – potentially saving lives.
We have co-operated fully with the ICO’s investigation which began in May 2016 and it is helpful to receive some guidance on the issue about how patient information can be processed to test new technology. We also welcome the decision of the Department of Health to publish updated guidance for the wider NHS in the near future.
We have signed up to all of the ICO’s undertakings and accept their findings. We have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.
We look forward to working with the ICO to ensure that other hospitals can benefit from the lessons we have learnt.