Don't they realise that without an ethics committee to govern these things we're practically inviting Skynet in...
That's cute. AI and IoT need 'ethics regulation', mumbles Lib Dem baron
A Liberal Democrat peer has suggested that the Internet of Things needs government regulation in the UK. Speaking in Parliament yesterday, Baron Timothy Clement-Jones said that artificial intelligence, as well as IoT, needs "huge consideration" of its "ethics". "It may be that we need to construct a purpose-built regulator …
COMMENTS
-
Thursday 9th February 2017 13:34 GMT The Mole
Re: Nooo
Really? I guess the Medical Research Council with their Research Ethics Committees don't exist then? There is a significant amount of regulation around medical research ethics, for good reason.
The Data Protection Act and GDPR are regulations around the ethics of data handling, mandating that privacy must be taken into account. In fact, these already place restrictions on what AI and IoT algorithms are allowed to do with data (proportionate and consensual use of data). Additionally, the use of algorithms can't be discriminatory: you can't use an algorithm which rejects job applications based on someone's name commonly being used by terrorist suspects, for instance. Between these, there may be a good case to say we have sufficient regulation. Job done.
-
Thursday 9th February 2017 14:42 GMT Steve Davies 3
Re: Nooo
The GDPR is all well and good, but you can bet your bottom dollar that all the usual suspects will be lined up ready to meet the various Gubbermint Ministers ASAP after the BREXIT deal is done, to ensure that none of it gets implemented here. All that lovely data just going to waste, and all those lovely snooping IoT devices that will get sold to an unsuspecting public. The ad men etc. will be drooling with anticipation.
I'm also sure that a few directorships may well be heading in directions you can guess only too easily.
-
Thursday 9th February 2017 15:59 GMT Anonymous Coward
Re: Nooo
> Additionally, the use of algorithms can't be discriminatory: you can't use an algorithm which rejects job applications based on someone's name commonly being used by terrorist suspects, for instance.
You want to bet? Given the questions I have been asked by inHuman Remains over the years...
-
Thursday 9th February 2017 13:04 GMT Matthew Taylor
IoT
Seems fair enough to me. OK, strong AI is still a way off, but facial recognition (for example) is real enough. And companies are already using deep neural architectures for data prediction. Both of those areas could raise ethical issues. As for IoT, how well are those devices secured? Could these voice-activated home entertainment things get compromised by some nefarious third party?
That said, tasking Ofcom with "analysing the algorithms involved" doesn't seem like a particularly sound approach.
-
Thursday 9th February 2017 14:40 GMT Natalie Gritpants
"I had not realised that an algorithm, programmed by an engineer, can, for example, take the decision to bin an application,"
Probably get better results than, say, a busy HR person just binning the bottom two thirds of the CV pile. But heaven forbid, how can I go to one of my old school chums with my child's CV if it's going to be looked at by a thing that didn't go to a nice school?
-
Friday 10th February 2017 11:20 GMT dajames
Probably get better results than, say, a busy HR person just binning the bottom two thirds of the CV pile. But heaven forbid, how can I go to one of my old school chums with my child's CV if it's going to be looked at by a thing that didn't go to a nice school?
Do not fear ... the algorithms can be programmed to keep only applications from candidates who did go to the Right School. It'll happen automatically, and there'll be no need to fork out for a nice bottle for your old chum.
The wonders of modern technology, eh?
Icon because ... School!
-