Another brick in the wall
The wall is in a cellar, behind the wall is a coffin that is having the final nails hammered in; inside the coffin is privacy.
In a refreshing change, the British government yesterday appointed an NHS technology advisory panel with almost no medics or tech experts on board. Today, it announced the names of expert advisors to the nebulous "Centre for Data Ethics and Innovation", one of two new bodies set up this year. This one is intended to assure the …
To paraphrase Arthur Dent in the first episode of the HHGTTG radio play:
Let's appoint a bunch of cronies from industry to write the rules in such a way as to optimise profit, and hang the consequences for the general public. What could possibly go right?
Sounds like another sell-out by government
My coat please, I want to get out of here!
"The slurping is one thing... it's what is done with the slurped data that's the troubling part."
I consider both of those aspects about equally troubling, but the slurp is arguably the more important part. If the data isn't slurped, it can't be abused. Once it is slurped, there is no way of knowing what is being done with it, so you can't know whether or not it's being abused. That makes the whole thing a matter of trust.
But note that when I use the term "slurp", what I mean is "spy" -- data which is collected without my consent. If I have given my informed consent, then I have decided that I trust whoever I give consent to. Without consent, there can be no trust.
By definition, AI may not be slurping your data today but, because it's AI, may decide to completely autonomously slurp data tomorrow ...
It's supposed to be AI, we're putting tax-payers' money into AI, at the very least it should demonstrate some semblance of AI ...
Or is it just a set of interpretation algorithms, lots of data and some deep data mining tools that people have rebadged ...? /cynical mode off/
"If large data processors like Google and Facebook help write the "ethics", they are writing the law that governs themselves."
Having two of the least ethical companies in the tech industry (where there is plenty of competition for the "unethical" crown) help to determine what is and is not ethical sounds like a fantastic idea!
Whether some medical AI has some medical data of mine is not the really important question; the real one is where else that AI passes the data, probably without my agreement and outside the scope of what the NHS trust thought it was agreeing to.
I don't want to find myself getting spam for hypertension pills, or my mortgage rate going up, or being denied a job, or ... We all know that private data will end up in all sorts of places not envisaged; flimsy excuses will be invented to give a veneer of justification, but the real reason will be some organisation's profit at my expense.
You forgot to mention the insurance companies, who can't wait to get their sticky fingers on your medical file (they already have your consumption habits). Combine this with the utter stupidity of people giving away their DNA to companies like 23andme and you will see the magnitude of the sh*%t we're getting into. If this is the future, please stop right here so I can disembark!
"That's BS for people who want to make money out of your data without paying for it."
I don't think paying for it is the important qualifier here -- I think getting your permission is (paying for it might be a condition you require to give permission, though).
Without them doing that, they aren't just people who want to make money from your data, they're straight-up thieves.
Data slurping is the big hit, because you can do it for cheap and sell the harvested data for good money, over and over again. Add to this the fact that the vast majority of victims don't really mind, and you have a big winner. Just watch as the rest of the world jumps onto that bandwagon: it is as profitable as spam, and like spam it's not going to go away that easily.
Sorry to be blunt, but it's not a few people muttering "Uh, I'd rather not" in some forum that will change that. The only thing that could (somewhat) help is a big public outcry, making politicians start worrying about their votes - more than about their future cushy public-sector jobs.
Honestly, how likely is that?
"Honestly, how likely is that?"
In the long run? Very likely. It is all but inevitable that the ubiquitous surveillance being forced on everyone will cause, at some point, widespread harm. When that happens, the shit will truly hit the fan.
At this point in time, though, I figure the only reasonable approach to take is that of self-preservation. When the disaster comes, I would prefer not to be a part of it, which means subverting as much surveillance as possible right now. There's no point in expecting the law to protect you at this point in time. Only tears lie at the end of that path.
And now they're redefining the notion of ethics.
Well, here's the thing: the public at large doesn't care about how the word AI is bandied about; it makes no difference to them. That public will, however, care about how their personal data is (ab)used, especially when those who know are educating the rest about the risks.
Not to mention the banking sector, which did a fine job in 2018 of educating its customers on the risks of surrendering too much data to people who are not worthy of having it.
Privacy is theft? That's where they want us to go? Well then, call me Lupin. Arsène Lupin.
"That public will, however, care about how their personal data is (ab)used, especially when those who know are educating the rest about the risks."
By the time the public at large care enough to (try and) influence the liars, ignoramuses, shysters and crooks of Westminster, the free-for-all data gravy train will have long left the station. The reason the government has appointed a panel of snout-in-the-trough rent-a-gobs to provide "technical advice" is specifically to create definitions arguing that the public sector giving people's data to private companies is in the best interests of the population at large, and thus of the individuals concerned, and wholly out of scope of the UK's GDPR-compliant rules.
Take a look at the fabricated business cases or farcical consultations associated with any government policy, and you'll see that they are NEVER about doing things correctly; they are merely abused to post-justify a poor decision taken by (at best) an Oxbridge arts graduate who didn't know anything about the decision they were taking. For starters, if the twerps of government don't like the advice, they simply ignore it - which is why Professor David Nutt was sacked by the last Labour government, for applying science and data to the failed fifty-year-plus "war on drugs".

More generally, all the tinsel of technical advice, consultations and performance audits is merely the civil service going through a process before doing what it wanted to do in the first place. Wait a few days until the latest NAO smart meter progress report lands, then watch how BEIS denies that the entire programme is a shambolic mess, denies that this is due to incompetent programme design by government, and further denies that the programme will deliver only a fraction of the benefits alleged in the business case (while refusing even to consider revising the business case and concluding that the whole thing should be binned).
Actually not. One recognised definition of ethics is 'the rules of conduct recognised by certain limited departments of human life' (1789) [Shorter OED 1933]. So in fact nothing much has changed since at least the late 18th century. What we should be talking about is morality.
According to 'Fowler's', ethics is the principle, morals the practice; from which it is concluded that ethical and moral (as adjectives) are much the same. Additionally, the term 'ethical' is often preferred because 'moral' has connotations of sexual 'misbehaviour'. Oh for the days when UK public servants read and quoted these (thought-provoking, if not necessarily absolutely definitive) sources.
Dear El'Reg, before you contact me: no, I won't go on record with any of this.
Right now in the NHS it's a war, with researchers, universities and private companies all wanting patient information, all wanting it faster, all wanting it in ways more convenient for themselves. A limited number of those hated NHS middle managers are fighting back, reminding directors of their legal and ethical obligations. Those middle managers are mostly nearing retirement.
What we need is one NHS Trust to be HAMMERED by the ICO to put the fear of god into the rest of them; softly-softly will ultimately only result in more information going out as we are pushed to meet government-set targets or initiatives (internet access to your patient record, anyone?). Otherwise it's inevitable that YOUR information will end up in the hands of just about anyone who fancies having a look.