Well I suggest that we compare the answers we get from AI to those from government's Natural Stupidity systems.
Algorithms should not be solely responsible for criminal sentencing, while a change in law may be required to open up public data sets involving health information. These were just some of the topics Information Commissioner Elizabeth Denham touched on today in a wide-ranging Parliamentary hearing about the use of AI in …
What bearing does this have on credit checks when applying for things like mortgages and credit cards?
I have a number of cards and pay the balance every month, a mortgage and no other debts, yet got refused a re-mortgage online with the company I've had for the last 3 homes, and my current home for 2 years. A visit to the branch and an internal call to their appraisal team discovered the reference agency didn't have my correct previous address. As they'd moved my mortgage from the old house to the new one, they were happy to ignore the wrong info from the reference agency and approve my re-mortgage. It only took 4 hours in branch to sort out :( Equifax have taken over a year to correct the info. I'll only know if they've done it correctly if I get refused something because of incorrect information they hold on me, and the automated checking fails to understand that the info I'm declaring is correct and the info Equifax has is wrong.
"A visit to the branch and an internal call to their appraisal team discovered the reference agency didn't have my correct previous address."
It's good that you sorted it out. But if an adverse decision causing demonstrable financial loss were made on demonstrably incorrect information from a credit reference agency, wouldn't that be grounds for a libel suit?
@Doctor Syntax: to file a libel suit in the UK, you need to be ready and willing to meet all your own legal expenses. To anyone who doesn't have a six-figure bank balance, it's not really on the cards. (Though I suppose you could use sponsorship or crowdfunding, if you think it'll work.)
This is pretty much by design: the law protects rich people, because obviously they need it more than us plebs.
I considered this point: how do you determine that an algorithm has had an effect?
Let's take their current case of voter manipulation. First I would need to identify all the users targeted, then look at why they were targeted and what parameters were used to target them, and then determine the algorithm used, whether it amounted to voter manipulation, and whether it was successful.
I could probably save them the time and say don't, because all those questions would be insanely difficult to quantify accurately, especially the last one. I also predict that if an algorithm was used to identify specific voters, it won't be the one UKIP hands over.
I think it was more than likely targeted ads using the tools readily available on social media: you can narrow down targets based on things like "likes" or who they follow to work out whether someone is possibly on the fence. But then, is that wrong? Would you not describe it as the same as politicians targeting seats with a low majority? I don't know, but I don't think the government is going to fix it or even work it out.
>I would need to look at why they were targeted, what parameters were used to target them and then determine the algorithm used, if it was voter manipulation and if it was successful.
I don't think the ICO are interested in any of those. They're interested in an individual being identified (without adequate permissions).
That is basically what is being said whenever some meatsack hides behind "The computer/process/algorithm/system has declined your request for <whatever>".
I saw something that nicely sums it up:
"Discrimination is prejudice in action."
Algorithmic "decision" making is institutionalized discrimination baked into software.
"Software has been developed to predict how likely a criminal will reoffend and is used by the courts to hand out appropriate punishments."
I've no doubt that someone has developed some piece of software, based on statistical analysis of age, place of residence, income, race, crime committed, etc., that will calculate a probability of reoffending from the reoffending rates of people in each combination of those categories. But a statistical probability based on mass profiling absolutely should NOT be used to determine "appropriate" sentences in a system where everybody is treated as an individual and is equal under the law.
If this really is being used right now, why is it not a massive scandal? I know this is how insurance companies work when calculating premiums and spreading risk, but this is the fining or imprisonment of citizens we are talking about, not the profits of insurance companies, and the length of sentences should not be "spread around" in the same way by race, class, etc., as you are basically punishing people more harshly simply for belonging to a particular race or class. Let every sentence be considered on its own merits, depending on the seriousness of each particular case.
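Purely to illustrate the mechanism being objected to, here is a minimal sketch of that kind of actuarial scoring. Every category name and rate below is invented for the example; real sentencing tools use proprietary statistical models, not anything this crude.

```python
# Invented historical reoffending rates per attribute bucket.
# These numbers and categories are made up for illustration only.
RATES = {
    "age_under_25": 0.30,
    "age_25_plus": 0.15,
    "prior_convictions": 0.25,
    "no_priors": 0.10,
}

def risk_score(attributes):
    """Average the bucket rates for this person's category memberships.

    The point of the sketch: the score depends only on group membership,
    so two entirely different individuals who happen to fall into the
    same buckets get identical scores -- the "mass profiling" objection.
    """
    return sum(RATES[a] for a in attributes) / len(attributes)

# Two different people, same buckets, same score:
person_a = risk_score(["age_under_25", "prior_convictions"])
person_b = risk_score(["age_under_25", "prior_convictions"])
assert person_a == person_b
```

Nothing in the score reflects the circumstances of the individual case, which is exactly why it sits so badly with sentencing on the merits.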
"If this really is being used right now, why is it not a massive scandal"
Because of the massive amount of money to be made from awarding such contracts for commercially confidential computer algorithms. And notice how the ICO is going to allow companies to regulate themselves and decide what information to release. In short, a pretend investigation: a deception designed to conceal in plain sight that your personal records are being handed over to unaccountable commercial entities, and there's nothing you can do about it.
"Denham told MPs on the Science and Technology Committee that companies and organisations must be able to explain how decisions are made by machines, but stopped short of saying they should be required to make that process available to the public, citing concerns over commercial confidentiality."
"the use of AI in decision-making" .. yet another pretext for some functionary to deflect blame for the next cock-up and say 'computer says no'.
"She said the body has served an information notice on UKIP to compel it to release information relating to the EU referendum"
That one almost slipped past: under the pretext of engaging in "algorithmic auditing", the ICO is intimidating a political party whose policies they don't agree with.