A deal which led to the sharing of the healthcare records of 1.6 million UK patients with Google’s AI company DeepMind has been judged by the UK data protection watchdog, the Information Commissioner’s Office (ICO), to have failed to comply with the Data Protection Act.
Back in May 2016, a data-sharing agreement between Google’s AI company DeepMind and the Royal Free NHS Trust granted Google access to five years of patient data, up to 2017, from three London hospitals: Barnet, Chase Farm and the Royal Free Hospital.
The information was intended to be used by Google for the specific purpose of developing an app called ‘Streams’ to alert doctors when a person is at risk of developing acute kidney injury (AKI). NHS figures at the time underlined the need for such an app: kidney injuries were believed to cause 40,000 deaths a year in the UK.
What Went Wrong?
A member of the public complained, the ICO investigated, and it reportedly found shortcomings in how the data was handled. For example, some patients were not adequately informed that their data would be used as part of the deal. This raised concerns about transparency for patients over how their records were being used.
The Royal Free Trust originally stated that the patient data to which Google would be given access would be encrypted, and that the Google DeepMind employees working on the project would not be able to identify any individuals from it.
There were also assurances that Google could not use the data in any other part of its business, that the data would be stored in the UK by a third party, and that all data would be deleted when the agreement expired at the end of September 2017.
Despite concerns being raised in the media when the deal was first announced, the Royal Free NHS Trust pointed out that information-sharing agreements of this kind were not unusual, and that this was one of 1,500 agreements it had with third-party organisations that process NHS patient data.
The ICO has now asked the Trust to sign an undertaking committing to changes that will ensure it acts within the law. Specifically, the Trust has been asked to establish a proper legal basis under the Data Protection Act for the Google DeepMind project (and for future such projects), to complete a privacy impact assessment, to commission an audit of the trial and share the results with the ICO, and to show how it will comply with its duty of confidence to patients in any future trial involving personal data.
What Does This Mean For Your Business?
If your organisation works in a medical field or develops products or services with medical applications or inputs, an agreement of this nature with the NHS or a private health company could represent an R&D opportunity. As the National Data Guardian, Fiona Caldicott, pointed out in this case, the creative use of data has huge potential to improve patient care and deliver clinical improvements.
This story is, however, a reminder that companies and project partners should always be very clear on data protection law (and the GDPR, which applies from next year) before embarking on a project. It also illustrates that privacy impact assessments are an important data protection tool in digital innovation, and that just because new technologies enable businesses to do more, it does not follow that these tools should always be fully exploited. The price of innovation shouldn’t be the erosion of legally guaranteed fundamental privacy rights, and the costs for companies that fail to take account of this could be great.