Google DeepMind's use of 1.6m Brits' medical records to test app was 'legally inappropriate'

Google's use of Brits' real medical records to test a software algorithm that diagnoses kidney disease was legally "inappropriate," says Dame Fiona Caldicott, the National Data Guardian at the UK's Department of Health. In April 2016 it was revealed the web giant had signed a deal with the Royal Free Hospital in London to …


It is my view, and that of my panel, that the purpose for the transfer of 1.6 million identifiable patient records to Google DeepMind was for the testing of the Streams application, and not for the provision of direct care to patients

...I may have misunderstood, but I thought the data was used to train an AI model, not test an existing application? Without the data, there would have been no application.

(Written by Reg staff)

Re: ratfox

Edit: Story updated - it's not an AI system. It's a fixed algorithm. We've tidied up the piece.

C.

TRT

Re: ratfox

Surely, though, you need some sort of feedback into the AI in order to train it? And if the only way to test the quality of the AI's predictive ability is to conduct further tests on those patients identified by the AI as at risk but where they were not picked up by the medics, then you'll only end up with an AI as good as the medics, not better than them.
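
A minimal Python sketch of that ceiling, with entirely made-up numbers: a model that is genuinely better than the medics still looks worse than them when the medics' own decisions are the only labels available to score against, because every disagreement is counted as a model error.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
truth = rng.integers(0, 2, n)                              # real at-risk status (unknowable in practice)
medics = np.where(rng.random(n) < 0.90, truth, 1 - truth)  # medics correct 90% of the time
model = np.where(rng.random(n) < 0.95, truth, 1 - truth)   # model correct 95% of the time

print("model scored against reality:", (model == truth).mean())   # ~0.95
print("model scored against medics: ", (model == medics).mean())  # ~0.86, i.e. apparently worse than the medics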

Anonymous Coward

adverts

I'm guessing the production version pops up adverts on the screen like most other Slurp products?

"Dave, I can see that you have a terminal illness - here are some local undertakers..."


Re: ratfox

Actually, you end up with an AI as good as all the medics who use the system combined.

Plus, on top of this, DeepMind is no doubt going back over the ones Streams missed to work out why it missed those patients, and refining and improving the AI. So over time Streams has the potential to become many times better than a medic. Plus it won't get tired, it won't be rushed and it won't just have an off day like humans do.


Re: ratfox

"... it's not an AI system. It's a fixed algorithm. ..."

I like how that doesn't even matter, just like handing personal medical information to MegaCorps apparently doesn't.

Doubt my need for tinfoil if you like, but I'm still not clear on who requested the computation. It sounds like typical Google on one hand, but on the other it sounds like it was somehow requested by doctors (with no ethical training).

Anonymous Coward

Paper Tigers

"There were legitimate ways for DeepMind to develop the app they wanted to sell. Instead they broke the law, and then lied to the public about it. Such gross disregard of medical ethics by commercial interests – whose vision of 'patient care' reaches little further than their business plan – must never be repeated."

Three words: Prosecution, Struck Off

Colour me cynical, but I'll bet they (the ICO, BMA, NHS and whatever other bodies are concerned) don't use them... I foresee, however, an outbreak of the usual 'mistakes made, lessons to be learned'.


Re: Paper Tigers

Couldn't agree more. Although it would be the GMC that fails to do the striking off, not the BMA.


Re: Paper Tigers

The NDG has no legal powers, and Dame Fiona Caldicott has no legal training from what I can tell. So actually this is far from the end of this.


2 more words

Performing Rights.

As a patient I own performing rights in any data that has been created as a result of any medical examination or procedure I have taken part in.

So whack the basturds with a whacking great fee for every patient abused.

That'll learn 'em.


Re: 2 more words

I like it, but I'd rather treat it like a class action suit. Sue them, cover your legal fees with the award and then hand the rest to the 1.6 million people whose records were abused. A combination of showing some regulatory backbone and buying public support (using the offender's money).


Google

will, no doubt, be quaking in their boots.

#sarcasm.

Again, it proves that if you have enough financial resources you can make any problem disappear.

Anyone who thinks Google will delete that treasure trove of private, confidential data is deluded.


Re: Google

Or, better than financial resources, they could just show that their app actually works and is working to save lives: showing that Google and the Royal Free's approach to introducing AI actually works, saving lives and saving doctors time.


Re: Google

And if you think that data is a financial treasure trove, you're equally deluded. Anon-mapped retina scans? Would you like to give me a viable business case?

Google stand accused of .. wait.. the >Trust< stand accused of using Deepmind's tool *too quickly* because it .. worked?

I do not like the way reality is being defined by the glorious and righteous flames of quasi-religious hatred..

EDIT: Aha, that's why my post was limbo'd - I'm not accurate. It looks like DeepMind had the correct data permissions *if* it was being used to help treat patients, but although it was being used to help treat patients, it was *meant* to be being tested. And testing requires more strenuous data approval than treating, because of course it does. *wibble*


AI actually works, saving lives, saving doctors time...

... saving insurers money.

[Ah Mr Jones. Thanks for all your premia over the years, but we won't be paying your claim as Google gave us your name before you came in; we've had time to think up a few excuses...]


Re: Google

"I do not like the way reality is being defined by the glorious and righteous flames of quasi-religious hatred.."
Have an upvote daggerchild...


This post has been deleted by its author


Google's use of Brits' medical records to train and test its AI was legally "inappropriate," says Dame Fiona Caldicott

What does this actually mean? Did the hospital or Google actually break the law?


RE: Did the hospital or Google

Well the hospital certainly did, for not protecting the patients' data and identities.

I'd be happy for my non-identifiable data to be used in an experiment of this form so long as the full results are returned to the NHS.


Re: 'inappropriate' or 'illegal'?

Exactly - as AC says below, stop the pussyfooting. Was it illegal for the hospital to give Google 'identifiable patient records'? Or was it illegal for Google to then use those records beyond its remit? Or both?

Anonymous Coward

Re: RE: Did the hospital or Google

"I'd be happy for my non-identifiable data"

YEP, but don't trust anyone to really make data non-identifiable in this case. As soon as you start combining a few data sets from different sources, patterns emerge and people become identifiable. :-(
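
A hypothetical illustration of that point in Python: two datasets, neither of which pairs a name with a diagnosis on its own, re-identify people after one join on quasi-identifiers such as date of birth and postcode. All records below are invented.

import pandas as pd

# "De-identified" clinical extract: no names.
hospital = pd.DataFrame({
    "dob":       ["1970-01-02", "1981-07-15", "1962-03-09"],
    "postcode":  ["NW3 2QG", "E1 6AN", "SW1A 1AA"],
    "diagnosis": ["AKI stage 2", "CKD", "AKI stage 1"],
})

# Public dataset with names (an electoral roll, say).
electoral_roll = pd.DataFrame({
    "name":     ["A. Patient", "B. Voter"],
    "dob":      ["1970-01-02", "1981-07-15"],
    "postcode": ["NW3 2QG", "E1 6AN"],
})

# One merge and the "anonymous" diagnoses have names attached again.
print(electoral_roll.merge(hospital, on=["dob", "postcode"]))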


Re: RE: Did the hospital or Google

"I'd be happy for my non-identifiable data to be used in an experiment of this form so long as the full results are returned to the NHS."

I'd be happy only given another caveat: that the data, and any results of research using it, remain the IP of the NHS, and subject to the same confidentiality restrictions as the original data is (I mean.... should have been).

Or, possibly, that private companies could use this kind of data to provide useful analysis, for payment of a large fee, representing the real value of the data. Fee to be used to swell the NHS's coffers for spending on healthcare.

Where did this assumption that data simply belongs to whoever can get hold of it come from? Answer: it's a convenient lie which serves enormous commercial interests like Google and FB.


Legally it doesn't mean anything, though it does sound rather good if you're a participant in the sport called bashing Google. We will have to wait for the ICO to offer us a proper insight into whether this was legal or illegal. My guess is that DeepMind did comply with all of the relevant laws at the time, and while it may have taken an unorthodox approach, it didn't breach patient data or confidentiality or break any laws.

It will probably make a recommendation that the Department of Health should construct some rules and regulations around this.


Re: 'inappropriate' or 'illegal'?

It's always been illegal for a company to use information beyond the scope of its intended purpose; it has been since they first passed the Data Protection Act. I'm pretty sure analysing patient data to improve patient care doesn't go beyond that, and is allowed by the declaration patients sign when they join up with their GP or sign forms at hospital.

The issue may be that Google and the Royal Free are stretching that declaration to the maximum, though. The ICO will have to decide.


@Korev

Actually both.

It was illegal for Google to be in possession of the data.

At a minimum, they should now provide an audit of how they used the data and where they stored it, and then verify its destruction.

The sad thing... their gall and disregard for the law is prevalent in Silicon Valley and is continually being taught in schools.


Re: 'inappropriate' or 'illegal'?

... the declaration patients sign when they join up with their GP or sign forms at hospital.

I don't recall ever signing any data protection stuff with my GP, but then when I last signed up with them, they were still on paper records.

Ditto when I've been to hospital - they've created records without asking my consent. They've also ignored my letters on the subject, but that's another matter!

Anonymous Coward

Streams is showing real patient benefits.

Yes, the patients really do appreciate the improved advertising they now get.


Re: Streams is showing real patient benefits.

Google is a totally inappropriate partner.

Also no evidence that any such "AI" actually does much. It's no different from 1980s "Expert Systems" for medicine, just more data.

An American hospital got into trouble doing a project with the arguably more expert IBM "AI" system, Watson. It never delivered.

Anonymous Coward

Re: Streams is showing real patient benefits.

Google is a totally appropriate partner, they have huge expertise at big data analysis, and if it saves someone YOU personally care about, you will be thankful.


Re: Streams is showing real patient benefits.

@AC

"if it saves someone YOU personally care about, you will be thankful."

Ah, the usual "if it saves ONE life..." fallacy. Always deployed when there's an argument about public-health ethics. Always deployed as if it overrides any other considerations. Didn't have to wait long for it to pop up here.


Re: Streams is showing real patient benefits.

But what if it saves a CHILD'S LIFE?!


Re: Streams is showing real patient benefits.

Ah, the usual "if it saves ONE life..." fallacy

I notice you didn't actually say he was wrong. Quite possibly because, in the case in question, the Trust were using it to try and save lives (before it had gone through formal trials), and you're unable to point out the Greater Evil you say is hiding behind it. You just 'know' it exists.

I'm afraid it's true - Google really are good at this stuff. Have you tried actually *buying* any of this data 'Google sell' about people?

Every hacker in the world is trying to get into Google with almost nothing to show for it. How's the NHS doing on that score at the moment?


Re: Streams is showing real patient benefits.

The Greater Evil here is perfectly clear.

Our confidential data is being provided to a profit-making company for nothing or next to nothing. Whether it (overtly) has so far or not, the company has no obligation whatsoever to respect privacy, to use the data strictly for the purpose intended, or to do anything other than pursue its own profit. It's a company that has a track-record of building income streams from data.

The fact this exercise may have helped some people is a distraction. It's great that it did, but that's no excuse to brush the evils under the carpet, as if there was no better way to achieve the same outcome.

Anonymous Coward

Re: Streams is showing real patient benefits.

"Also no evidence that any such "AI" actually does much. It's no different from 1980 "Expert Systems" for medicine, just more data."

Right off the commentard bingo card.

Neural nets are absolutely nothing like expert systems. If you limit the definition of "expert system" to "makes a decision" you have a point, but the underlying mechanics for how that decision is formulated are radically different. More importantly, the mechanism for *developing the formulation mechanism* is about as different as it is possible to be. Expert systems were big long lists of fixed if-then-else. Neural networks are definitively not.
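
A toy contrast in Python, to make that concrete (nothing to do with DeepMind's actual code; the creatinine rule below is invented, not the NHS algorithm): the expert-system style is an explicit, human-written if-then, while the neural-network "rule" is just whatever the trained weights encode. Streams, per the story update above, is reportedly the former kind.

import numpy as np

# Expert-system style: a fixed, auditable, human-written rule.
def expert_rule(creatinine_now, creatinine_baseline):
    return creatinine_now >= 1.5 * creatinine_baseline  # if-then, written by a person

# Neural-network style: no readable rule inside, only learned weights.
def tiny_net(x, W1, b1, W2, b2):
    h = np.maximum(0.0, x @ W1 + b1)                 # hidden layer with ReLU
    return float(1 / (1 + np.exp(-(h @ W2 + b2))))   # risk probability

# Untrained random weights, just to show the mechanics.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=8), 0.0
print(expert_rule(180.0, 100.0))                     # True: the rule fires
print(tiny_net(rng.normal(size=4), W1, b1, W2, b2))  # some probability between 0 and 1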

Watson being a steaming pile of turd is mainly due to it being a rebranded collection of acquired/legacy tools. Ask your local IBMer to explain what Watson is in less than 100 words. Proceed to smirk.


Re: Streams is showing real patient benefits.

"The Greater Evil here is perfectly clear" - Then why were you unable to show it?

"Our confidential data is being provided to a profit-making company for nothing or next to nothing" - Or, we could use the truth. Anon-mapped retina scans are being provided in return for a diagnosis tool that helps save lives.

"the company has no obligation whatsoever" - apart from the law, the contract they signed, etc etc..

"It's a company that has a track-record of" - playing Go, and Chess. You're thinking of the sister company, Google. Guilt by proximity isn't a thing.

Hatred is not reason. Fear is not proof. This is not how you make the world better. Confirm your target. Aim.


Re: Streams is showing real patient benefits.

Is there a better way to achieve the same outcome? Hospitals across the NHS run trials, so it's not like running trials and monitoring the results is new to them. I very much doubt they'd be expanding the roll-out of the service to more doctors and staff if it were leading to poorer outcomes for patients.


Re: Streams is showing real patient benefits.

"the company has no obligation whatsoever" - apart from the law, the contract they signed, etc etc..

So, this article is about them not following their contract. They were supposed to use the data to train and discard it. They are now running a service using that data.

Ignore whether it is a good or a bad thing; evidently they are not following their contract now so what happens in the future?


Re: Streams is showing real patient benefits.

"So, this article is about them not following their contract"

Actually, this article is about the Trust and DeepMind not drawing up the *correct* contract. Technically.

Practically, it seems they drew up the correct contract for *live use* (and then used it live, where it highlighted things), not for testing use, which it officially was (where they may not have been able to act?). This looks more like a paperwork squabble than anything actually evil.


Re: Streams is showing real patient benefits.

It isn't just a contractual issue between Google and the trust concerned. It is a question of whether data protection laws were broken. If patients' data was used without their consent, or for purposes for which Google and the trust did not have their consent, then it is likely that both Google and the trust have acted illegally.


Re: Streams is showing real patient benefits.

"But what if it saves a CHILD'S LIFE?!"

But what if that child is the next Pol Pot?

(What ifs are fun!)


Re: Streams is showing real patient benefits.

"But what if it saves a CHILD'S LIFE?!"

But what if KITTENS were hurt?


Re: Streams is showing real patient benefits.

Google really are good at this stuff

And therein lies the heart of the problem - we know darn well what Google are good at. They are very good at ignoring the law and using their size to avoid the repercussions. They are very good at mining large volumes of data.

Thus, we can have little (or no) confidence that they won't take this data, which should be kept in its own secure silo, never leaving UK (or at least EU) control and jurisdiction, and then mine it along with other data that would probably de-anonymise it.

So far, I have not read anything to suggest that Google has the corporate structures in place to respond as MS have done with the Irish emails case - ie tell the US authorities to sod off as the US company & staff don't physically have the access to provide them with the data which is held by a different legal entity on Irish soil.

But most of all, I have seen nothing (but plenty to the contrary) to suggest that Google wouldn't pause even a second to consider mining the data along with everything else it holds.


Re: Streams is showing real patient benefits.

But... you just complained Google don't have corporate/geographic separation, and so can't be trusted with this kind of data, without noticing that Deepmind is a legally separate and UK-based corporate entity, fulfilling your exact criteria.


Re: Streams is showing real patient benefits.

But if that's the case, it's likely all trusts have been breaking the law for decades, as they all regularly process data in ways that are unrelated to patient care.

Anonymous Coward

Oh please, stop the pussyfooting...

Can we stop the BS already, dear watchdog?

It's not "inappropriate", it's quite simply illegal. I don't care whose political toes you'll stand on calling a spade a spade, but if they were involved they deserve the bruises. Stop the euphemisms and doublespeak already.


And why did Google have to use UK medical data?

Nothing to do with the fact that no one in America dared to, because of the amount in fines/damages it would cost them...

As usual, Google will lie and lie again until they are forced to delete/destroy the data they should never have had access to in the first place...

Anonymous Coward

Probably because the American health care system is not organized enough to give them that much data. Security through obscurity, if you will.

EDIT: Actually, I now remember that DeepMind is a British company that Google bought. That might explain why they got the data from here rather than there.


Because the NHS is one of the best in the world at collecting and organising this data, and at proactively using it to run services and to guide the changes that need to be made to achieve better outcomes. That's compared to other healthcare service providers, not compared to other industries.


There's a more interesting ethical question than just "the rules"

So it was just to train the model. And then, having trained the model, they realised they had identified people who needed kidney treatment.

Would it have been ethical for them to have ignored the fact that people needed treatment and not told those people?

If it was me, I would have preferred to have known and been treated.

Anonymous Coward

Re: There's a more interesting ethical question than just "the rules"

Although the same argument could be used to justify intelligence gained through torture if it turned out to be life saving. This should just be treated as serendipity.

I would assume, though, that once the AI had seen the data, the data should be discarded, and what's left is simply rules.

At some point, to validate the rules, a real dataset has to be provided somehow too, in order both to validate the results and indeed to find the few that missed a diagnosis.

I don't see any reasonable way of delivering such an AI without the full dataset being provided at some point to train it; unless we either want no AI-based diagnostics at all, or it has to practise on people individually and make catastrophic mistakes as it learns from incorrect decisions. Neither of these options seems sound either.
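
For what that validation step might look like, here is a minimal Python sketch on entirely synthetic data: hold out part of the dataset, then count specifically the missed diagnoses (false negatives) mentioned above. The features, labels and model are hypothetical stand-ins, not anything Streams actually does.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))                               # stand-in for blood-test features
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 1).astype(int)  # stand-in for "at risk" labels

# Train on one part of the data, validate on data the model has never seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

tn, fp, fn, tp = confusion_matrix(y_test, model.predict(X_test)).ravel()
print(f"missed diagnoses (false negatives): {fn} of {fn + tp} true cases")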

