
Google DeepMind in hot water after privacy law breached in AI trial

Google’s DeepMind has emerged largely unscathed after the UK’s Information Commissioner’s Office (ICO) condemned the Royal Free NHS Foundation Trust for breaching patient privacy by sharing 1.6m records with DeepMind for an AI-augmented application. No fines have been issued, but DeepMind has received a very public warning about overstepping the mark when it comes to handling sensitive data – one that might have set the industry back a few paces.

Such concerns were voiced at the time of the trial’s announcement, but the ICO has now confirmed that the NHS trust violated the UK Data Protection Act when 1.6m patient records were used as the basis for a machine-learning program designed to spot Acute Kidney Injury (AKI).

The goal was a noble one: use an advanced computing system to diagnose AKI in patients earlier, potentially saving lives, while also freeing up time and resources that doctors could spend elsewhere in the clinic. However, the year-long investigation has now found that the 1.6m patients were not properly informed about the use of their personal healthcare data in the trial.
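To make the clinical task concrete: AKI alerts are typically driven by how far a patient’s serum creatinine has risen above their own baseline. Below is a minimal Python sketch of that kind of threshold check, using KDIGO-style staging ratios. The function name and simplifications are ours, and this is not DeepMind’s actual Streams logic, which has not been published.

    def aki_stage(current_creatinine, baseline_creatinine):
        """Return an AKI stage (1-3) or None, given serum creatinine in umol/L.

        Illustrative only: real alerting algorithms (such as the NHS AKI
        e-alert) also weigh absolute rises, time windows and patient context.
        """
        ratio = current_creatinine / baseline_creatinine
        if ratio >= 3.0:
            return 3   # severe: creatinine has at least tripled
        if ratio >= 2.0:
            return 2   # moderate: doubled against baseline
        if ratio >= 1.5:
            return 1   # mild: a 50% rise triggers the first alert
        return None    # within normal variation, no alert

    # Example: creatinine rising from 70 to 120 umol/L flags stage 1
    print(aki_stage(120, 70))

The value of automating even a simple rule like this is speed: a system watching every incoming blood test can raise the flag hours before a busy clinician reviews the results.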

Health records are probably the most sensitive pieces of information that a person generates. Their contents are never a matter of public record, and often contain things that people would never want known – including drug histories, STDs, abortions, miscarriages, and non-visible disabilities. It is assumed that these records are protected by doctor-patient confidentiality, and suitably secure IT systems.

However, there was a lot of trepidation when the deal came to light and it transpired that complete patient records were flowing into the Streams application – because of Google’s reputation for massive data collection, curating user profiles that help it sell more effective advertising across its dominant search and online-video platforms. It now appears that these highly sensitive records, covering any patient passing through three London hospitals, were improperly shared with the Google subsidiary.

So has the finding set the industry back? Almost certainly, but it appears that all it would have taken to stay within the rules was to adequately inform patients that their records were going to be used – either by letter, or at the point of service. Not that patients would have had much power on receiving such a notice: healthcare isn’t exactly a free market in which a customer can simply go elsewhere for treatment.

Still, the news is unsettling for many patients, especially given what the industry knows about anonymized data sets: that it takes a surprisingly small number of attributes to accurately identify an individual among what should be an overwhelming number of records.
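As a toy illustration of that re-identification risk, consider how quickly a handful of quasi-identifiers pins down individual records. The data below is synthetic and has no connection to the trial:

    from collections import Counter

    # Synthetic 'anonymized' records: postcode district, birth date, sex.
    records = [
        ("NW3", "1984-02-17", "F"),
        ("NW3", "1991-11-02", "M"),
        ("N19", "1984-02-17", "F"),
        ("NW3", "1984-02-17", "M"),
    ]

    counts = Counter(records)
    unique = [r for r, n in counts.items() if n == 1]
    print(f"{len(unique)} of {len(records)} records are unique on just "
          "postcode district, birth date and sex")

In this tiny sample every record is unique on only three attributes, and well-known analyses of real census data have shown that most of a population can be singled out the same way.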

The good news for Google is that DeepMind isn’t being fined – although we imagine it has plenty of cash left over from its $500m acquisition in 2014. In fact, the ICO is blaming the Trust, as the Trust was acting as the ‘data controller’ here – and the data was stored by a third party, not on ‘data processor’ DeepMind’s premises.

The initial AKI deal was signed in September 2015 and became public in April 2016, following coverage from New Scientist that exposed the data-sharing agreement and found that the sharing went far beyond what had been publicly reported.

The Trust said that it “accepts the ICO’s findings, and has already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times, and has never been used for anything other than delivering patient care or ensuring their safety.”

DeepMind said “we welcome the ICO’s thoughtful resolution of the case, which we hope will guarantee the ongoing safe and legal handling of patient data for Streams. Although today’s findings are about the Royal Free, we need to reflect on our own actions too.”

It added that “in our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as potential fears about a well-known tech company working in health. We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public, and the NHS as a whole. We got that wrong, and we need to do better.”

The Information Commissioner, Elizabeth Denham, said “there’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.”

Denham added that “our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”

The ICO is now asking the NHS Trust to sign an undertaking: a promise to make the changes needed to bring its projects into compliance. The undertaking requires the Trust to set out how it will comply in current and future projects, to complete a privacy impact assessment, and to commission an audit of the trial, the results of which will be shared with the ICO.
