Audit of DeepMind deal with NHS trust: It checks out, nothing to see here

An audit of the Royal Free NHS Trust and Google DeepMind's controversial app to detect kidney disease has deemed its current use of confidential data from real patients lawful – going so far as to suggest findings from other watchdogs were misplaced. The audit of the Streams app – which uses a fixed algorithm to help detect …

Anonymous Coward

More 'Lawyered' answers - Heads-We-lose Tails-They-Win Big-Tech dancing between 'legal' raindrops

"The scope, though, is limited just to the current use of Streams and, as emphasised repeatedly in the report, does not include a historical assessment of the app, which means the findings will be a frustration for critics who want to see the initial data gathering put under the microscope."

"Another concern is the lack of a formal retention period for the data stored by the app: although only data from the previous 12 months is needed to generate an AKI alert on the app, Streams now contains information up to eight years old."

~~~~~

"It's still clinical care through a mass surveillance lens," said Eerke Boiten, professor of cybersecurity at De Montfort University. "They need data (now grown to eight years' worth) on all potential patients 'just in case' – even though they admit 'the AKI event might only occur in the future or not occur at all'.

"This is justified by drawing an analogy with the hospital's regular data systems: they hold all the data on all past patients, so why shouldn't Streams too?"

~~~~~

The hospital might say the vast amounts of data collected are necessary for "vital interests" of patients, he said, but: "The only 'vital interest' protected here is Google's, and its desire to hoard medical records it was told were unlawfully collected."

3
0
Silver badge
FAIL

Docs can get this info anyway, so...

Docs are also bound by patient confidentiality laws and don't pull every legal loophole known to man in order to sell your data.

12
0
FAIL

Who pays the piper

So the Trust paid a company to audit them and the audit said "Yeh, you're fine!"

There's a shock.

PwC said the same about BHS etc. etc. etc.

26
0
Silver badge

Re: Who pays the piper

If you pay a lawyer or solicitor to say something for you it must be true.

1
0
Paris Hilton

you often get the happy ending you want when you pay for it


4
0

Re: you often get the happy ending you want when you pay for it

You must go to different massage parlours to me.

3
0
Anonymous Coward

So we are now at the stage where medical data can be siphoned off the NHS by companies. I also doubt it being anonymous because what's the point? You detect someone at risk but have no way of telling them. I don't see why you need to do this anyway, just send a questionnaire to the patient then doctor and add a score to determine risk. Job done.

What will they sell next? Police records to determine risk of re-offending? Parking ticket data to determine if someone takes risks? School records for employers? Welcome to the world of tomorrow where all your data can and will be used.

6
0
Silver badge

"I don't see why you need to do this anyway, just send a questionnaire to the patient then doctor and add a score to determine risk. Job done." - AC

"Everybody lies" - House

"I also doubt it being anonymous because what's the point? You detect someone at risk but have no way of telling them."

Deepmind detects that 10231940214 is at risk. The hospital takes this information and provides information to the GP to perform test X on 1023194024 in future checkups.

1
0
Anonymous Coward

Of course, but then it's not anonymous, as the person can be identified. And do we really believe that the NHS made it anonymous before they gave it to Google, or did they just dump whole databases to save time and money?

6
0
Silver badge

What makes you think that the symptoms table of the database includes PII? I was under the impression that any database worth its salt would assign an ID to a person and use that as a primary key instead of anything else.

The anonymous person can be identified by the hospital, not by Google. From Google's perspective, 13891231 is anonymous. From the hospital's perspective, what's the point in having medical data that they can't access when that person comes back to the hospital?
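A minimal sketch of the split described above, with hypothetical names (nothing here is from the actual Streams design): the hospital keeps the lookup table between real identifiers and random pseudonyms, so the processor only ever sees an opaque ID, and only the hospital can map a flagged ID back to a patient.

```python
import secrets

class PseudonymVault:
    """Hospital-side mapping between real identifiers and pseudonyms."""

    def __init__(self):
        self._to_pseudo = {}  # NHS number -> pseudonym
        self._to_real = {}    # pseudonym  -> NHS number

    def pseudonymise(self, nhs_number):
        """Return a stable random pseudonym, not derived from the ID itself."""
        if nhs_number not in self._to_pseudo:
            pseudo = secrets.token_hex(8)
            self._to_pseudo[nhs_number] = pseudo
            self._to_real[pseudo] = nhs_number
        return self._to_pseudo[nhs_number]

    def reidentify(self, pseudo):
        """Only the hospital can perform this step."""
        return self._to_real[pseudo]

vault = PseudonymVault()
# The record sent to the processor carries only the pseudonym plus clinical values:
record = {"patient": vault.pseudonymise("943 476 5919"), "creatinine_umol_l": 185}
# The processor flags the pseudonym; the hospital maps it back to the real patient:
flagged = record["patient"]
assert vault.reidentify(flagged) == "943 476 5919"
```

From the processor's side the token is meaningless on its own; the privacy question in the thread is whether it stays meaningless once linked with other data.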

0
0
Silver badge
Facepalm

I was thinking the same thing, just use an anonymised ID number, and you can reasonably keep old data to re-run your tweaked algorithm against in case new factors indicate that more people are at risk.

But ArrZarr's example was quite telling:

...detects that 0214 is at risk. ... perform test X on 024...

1
0
Anonymous Coward

Time to get your own back?

If it's an AI, there are two ways to make a nice profit from this.

The easy, but not probable way is to tell your Dr/GP you have a really bad pain in the "please send a check to [my address] on your elbow", and hope the data syphons through and accidentally ends up on an AI working the accounting software.

Or you do that single pixel hack that they did with the image recognition software, that thought a turtle was a gun... But I've been shouting random words at my GP, and Google Search has not given me next week's lottery numbers yet.

Oh look, white-coated men stepping out of a van!

/JK

0
0
Silver badge

Google is tracking your phone.

They will know trips to the doctor's and/or which department, which year, etc. Enough data to build a "person" out of it.

Any amount of data can be correlated. Any. All that limits it is how fine-grained the data is versus how broad the pool of people is. The fewer the people and the finer the data, the better the chance of finding an exact match.
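The trade-off described above (finer data plus a smaller pool means better correlation) is essentially the k-anonymity question: how small is the smallest group of records sharing the same quasi-identifier values? A toy sketch with made-up fields:

```python
from collections import Counter

def smallest_group(records, quasi_ids):
    """Size of the smallest group of records sharing the same quasi-identifier
    values. A group of size 1 means someone is uniquely re-identifiable."""
    groups = Counter(tuple(r[k] for k in quasi_ids) for r in records)
    return min(groups.values())

records = [
    {"postcode": "NW3", "year_of_birth": 1970, "dept": "renal"},
    {"postcode": "NW3", "year_of_birth": 1970, "dept": "renal"},
    {"postcode": "NW3", "year_of_birth": 1984, "dept": "cardio"},
]

smallest_group(records, ["postcode"])                           # 3: coarse data, big groups
smallest_group(records, ["postcode", "year_of_birth", "dept"])  # 1: fine data, unique match
```

The coarser the fields, the bigger the groups and the safer the dataset; add enough fine-grained columns and every row ends up in a group of one.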

3
0
Silver badge

From Google's perspective, 13891231 is anonymous right up to the point where they compare information in the record with other information they hold and whittle the number of possible individuals down to a small number*.

* One is a small number.

1
0
Anonymous Coward

Data Protection Law isn't clear cut

But the Vital Interests justification is, and the ultimate arbiter of it is the ICO – what they say goes, and their guidance is pretty good on this one.

3
0
Silver badge

Suspect...

Fixed this for you : "the Trust commissioned (i.e. PAID) Linklaters LLP to conduct an audit of its use of Streams to give it the legal conclusion it wanted"

2
0
Silver badge

This kind of research should only be done in-house

If there's no budget then the point needs to be made that this is a future investment, with the in-house staff gaining the expertise and that hopefully leading to higher morale within a profession that is commonly used as a punchbag.

You can see what's coming in the current scenario... monetisation. Down the line the external developers are going to screw the medical profession for access to what is their data. The staff that have pulled out all the stops to advise, collect, collate and upload this data will be frozen out of the equation – we have no further use for you.

4
0
Bronze badge

Illegal data retention and access, just a complete whitewash. The ICO should crack its knuckles, come in with both fists flying, do the right thing and prosecute Google and the NHS trust!

4
0

The work is before the lawful event.

But it's new, we'll send you an invoice anyway, as we're making it up. What do you mean it's half a century late! #conceptualjurisprudence

0
0
Anonymous Coward

A few years back

I saw a portable device on a rolling stand at a hospital that was used for taking patients' vitals.

It had a built-in Facebook app that was visible on the screen, with the official FB icon.

When I questioned the nurse about it she confirmed that it was indeed a Facebook app, but said they never use it.

1
0
Anonymous Coward

"Genossen, wir müssen alles wissen!" ("Comrades, we must know everything!")

Erich Mielke, Ministerium für Staatssicherheit

0
0
Anonymous Coward

Re: "Genossen, wir müssen alles wissen!"

Funny how Google starts sending me "Unusual Activity" Captcha verification pages when looking up that quote after visiting this comments page.

0
0
Silver badge
Black Helicopters

She's not dead yet!

In times past, the test for a Witch was to try and drown her. If she drowned, she wasn't a witch.

I notice everyone here has already decided guilt, devoting paragraphs to questioning the motivations of an AI company doing medical research, despite medical research being Practical Application Number 1 in AI research even 20 years ago...

This is all going a bit "But Her Emails" - everyone knows something was done wrong, but nobody understands what, and nobody knows what the negative effects *actually* are, and nobody can find anything evil, but the ambient fear and hatred is now self sustaining.

Google do not want anyone's kidneys. Google do not HAVE anyone's kidneys. Please do not stab medical people just in case they could be coveting your kidneys.

0
0
Silver badge
Mushroom

Re: Practical Application Number 1

Just because a technology is perfectly suited to "solve" a given problem doesn't mean it should actually be used to solve that particular problem.

Hiroshima kinda stands out as a typical example.

0
0

anonymised data is not anonymous

While a sample of one event from each patient may be anonymisable, a full history from all patients quickly becomes much easier to re-identify. You can scramble the DOB, but anyone with access to other patient records (a health provider or private health insurer in Australia, say) can match them more easily. If you convert DOB to an age at the time of each attendance/treatment event, then with the full history of health events you can narrow the DOB right down. Google et al. can guess which family has been sick (from the keywords you've been using for searches, map searches); Facebook et al. data can be used if you've been posting that you or others have been sick or injured. It may not be open slather, but some companies can potentially exploit the data available.
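The DOB-narrowing step above can be sketched (function names hypothetical): each (event date, age) pair pins the birth date to a one-year window, and intersecting the windows from a full event history shrinks that window fast.

```python
from datetime import date, timedelta

def _minus_years(d, n):
    """Shift a date back n years (clamping 29 Feb to 28 Feb)."""
    try:
        return d.replace(year=d.year - n)
    except ValueError:
        return d.replace(year=d.year - n, day=28)

def dob_window(event_date, age_years):
    """All birth dates consistent with being age_years old on event_date."""
    latest = _minus_years(event_date, age_years)
    earliest = _minus_years(event_date, age_years + 1) + timedelta(days=1)
    return earliest, latest

def narrow_dob(events):
    """Intersect the one-year windows implied by (event_date, age) pairs."""
    lo, hi = date.min, date.max
    for event_date, age in events:
        e, l = dob_window(event_date, age)
        lo, hi = max(lo, e), min(hi, l)
    return lo, hi

# Two visits six months apart, same stated age: the possible DOB window
# shrinks from a full year to about six months.
lo, hi = narrow_dob([(date(2018, 3, 1), 40), (date(2018, 9, 1), 40)])
```

With eight years of attendance events, the intersection often collapses to a few days, at which point the "scrambled" DOB offers little protection.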

The ethical way is to let clients and users know which external parties have access to their data and its potential uses. Not just a generic warning, but specifics, with the access logged. And allow users to block access until they are specifically asked to grant it, before any data is shared.

0
0


Biting the hand that feeds IT © 1998–2018