Artificial intelligence, with its potential to use data to make healthcare more efficient and effective, could, some say, save an NHS faced with rising demand, workforce shortages, and a funding crisis.
However, the possibilities for AI come at a time of increased scrutiny of the use of personal data. The EU’s General Data Protection Regulation (GDPR) is forcing organisations to consider how they collect, store, and process an individual’s data.
GDPR raises several questions about how personal data can be used with AI technology, questions the NHS needs to address if it is to make the most of the opportunity.
A question of consent
For example, patients may expect to be asked for consent for their data to be used as part of an AI algorithm. This may not be the best way of complying with GDPR.
“While you may need consent from somebody to take part, and for ethical reasons you may still get that, that may not be the basis under which you process data,” explains Nicola Perrin, lead for Understanding Patient Data.
“So you may have a two track approach where you have consent for ethical reasons, but then the legal basis for processing data is actually different and would be more likely to be a public interest test, which gets quite complicated.”
GDPR gives the public clear rights on how their data is used, and working with patients will be crucial. We need to ‘take the public with us’ in the development of AI, as National Data Guardian Fiona Caldicott recently told the Lords Committee on the topic.
“Do not forget that this is my data, my body, my data, so it starts with me giving permission,” says ‘digital patient’ Michael Seres.
Being open and compliant
Seres’ own company 11Health uses AI and algorithms to enable patients and healthcare professionals to make faster decisions. Under GDPR, patients have a right to an explanation of such decisions.
This could be a real challenge, as some say opening the black box of algorithmic decision-making faces major legal and technical barriers. Seres is pragmatic.
“In my view it is not about sharing a ton of algorithms, but really about sharing the process and the mechanics of how the machine makes decisions. I think it is perfectly natural for patients to want to know how any decision is made. I see AI as no different. As long as decisions are explained and processes open, what is the issue?”
Transparency is one thing, but who will monitor compliance? The Lords noted that existing regulators, such as the Information Commissioner’s Office (ICO), are best placed to do so. But several regulators could be involved.
“It is unclear which regulator will take the lead,” says Eleonora Harwich, co-author of Reform’s Thinking on its own: AI in the NHS report. “Currently, for example, the MHRA does not require an explanation on medtech devices it regulates. It requires that they are within a certain amount of accuracy. This is different under GDPR. But the MHRA does not regulate data.”
Engaging the public
More inclusive approaches to information governance may be required under GDPR. The case of the Royal Free and DeepMind, in which 1.6m patient records were shared to help find a better way to identify acute kidney injury, showed why. DeepMind is now proactively engaging with patients, the NHS and the public to discuss how it and other companies could better address data-sharing issues.
“Clearly mistakes were made in that case,” says Perrin. “But it’s been a wake-up call for everyone to make sure they get the governance arrangements right in future.”
Working with the public will be critical. “I think it is time for the NHS to work directly with patients on this issue,” says Seres. “To not deliver directives to us, but to create them with us.”
The Lords’ recommendation that the NHS outline data sharing plans for AI by the end of the year should support this, and introduce a much-needed consistent approach.
Multiple conversations are happening across the NHS about how it could exploit AI. These need to be informed by agreeing on the purposes for which we want to use AI.
What do we want to achieve with AI in the NHS?
“We need to come together as a health system, join disparate conversations that are happening, and work out what we want to achieve. Do we want voice algorithms to help with patient FAQs, for example? Or are we looking at diagnostics, workforce, logistics?” says Dominic Cushnan, digital information manager and AI lead at NHS Horizons.
“This can then inform other areas, such as how to involve patients, engage the market, and how we process the data. But we shouldn’t take on too much; we can’t solve everything. But we need to have that conversation.”
It’s called starting with the end in mind. AI may save the NHS, but as with all technology, its success depends on having the people behind it, and the right processes in place. Through compliance and conversation, GDPR can enable a vision for AI that will benefit generations to come – and potentially save the NHS.