Regulate This! Time to subject algorithms to our laws

Algorithms are almost as pervasive in our lives as cars and the internet. And just as these modes and mediums are considered vital to our economy and society, and are therefore regulated, we must ask whether it's time to also regulate algorithms. Let's accept that the rule of law is meant to provide solid ground upon which our …

  1. Zog_but_not_the_first
    Unhappy

    Booting up my Mystic Meg algorithm, I predict...

    This won't happen.

  2. Tom 64
    Facepalm

    what...

    The implementation of an algorithm is usually just the automation of an existing process.

    If the process is illegal, that's hardly the fault of the programmers is it?

    1. Anonymous Coward
      Anonymous Coward

      Re: what...

      "If the process is illegal, that's hardly the fault of the programmers is it?"

      IIRC the use of Formal Definition Techniques was proposed as a way of proving that sets of rules were consistent.

      The English parliament these days even seems to bypass close scrutiny of any draft legislation. The Executive prefers to have vague bills passed - whose details are later set arbitrarily by unscrutinised "secondary legislation" powers of decree.

      1. Steven Jones

        Institution unknown

        "The English parliament these days"

        I was unaware that there was such a thing as an English Parliament. Pray, where does it convene?

        1. Anonymous Coward
          Anonymous Coward

          Re: Institution unknown

          "I was unaware that there was such a thing as an English Parliament. Pray, where does it convene?"

          As distinguished from the Scottish/Welsh/Northern Ireland Assemblies. They are separate law-making bodies whose laws do not always align with those passed by Westminster that affect only England. Hence the vexed problem of "the West Lothian question".

          1. Trigonoceps occipitalis

            Re: Institution unknown

            The West Lothian Question was identified by Tam Dalyell, the Labour MP for the Scottish constituency of West Lothian. It is specifically the right of Scottish MPs, as opposed to MSPs, to vote on bills debated in Westminster affecting England only, while having no right to vote on bills on the same matter in the Scottish Parliament. English MPs have no right to vote on Scottish devolved matters, as is reasonable. There is a democratic deficit inherent in the arrangement that one day will have to be addressed.

        2. Pen-y-gors

          Re: Institution unknown

          I was unaware that there was such a thing as an English Parliament. Pray, where does it convene?

          At Westminster. Let's face it, any decisions made there are purely in the interests of English Tories, they don't give a damn about the interests of Wales, Scotland and Norn Ireland, even on non-devolved matters, so I think 'English parliament' is pretty accurate.

        3. Geoffrey W

          Re: Institution unknown

          RE: "I was unaware that there was such a thing as an English Parliament. Pray, where does it convene?"

          This kind of response is so tedious; nit-picking on some trivial point or mis-chosen words, when it's perfectly obvious what is meant. It serves no purpose except to demonstrate what a clever dick the responder is and to divert the discussion down a totally irrelevant path. Stop it at once, you naughty boy!

      2. Anonymous Coward
        Anonymous Coward

        Re: what...

        The E̶n̶g̶l̶i̶s̶h̶ Henry VIII parliament.

        TFTFY.

      3. David Shaw
        Flame

        Re: what...

        The {national} parliament{s} these days even seems to bypass close scrutiny of any draft legislation.

        I seem to recall one of the early ILETS data-retention laws being passed entirely by fax!

        One noble Lord in the UK briefly noticed, but he was told to calm down as "it wasn't that important" - though at present it seems to be illegal per the ECJ.

        As for the algos, by the time the Amazon Cloud has finished training my software defined architecture, can even I understand the rules, never mind explain them to the Palatial incumbents?

    2. Anonymous Coward
      Anonymous Coward

      Re: what...

      > The implementation of an algorithm is usually just the automation of an existing process.

      And many decisions made by humans are pretty arbitrary anyway - such as the binning of applications based on a cursory scan of a CV. Are all such decisions to be regulated, even in the absence of a computer? Will you be able to challenge why you weren't called up for an interview?

      If your bank decides not to offer you a loan, will the law compel it to do so? This implies not only that the bank will have to reveal its reasons not to offer the loan - the so-called "algorithm" under discussion here - but also for those reasons to be challenged and potentially overridden.

      This in turn implies that you would have a statutory right to receive a loan from a bank, if you meet some criteria decided in law or by a judge - not those criteria chosen by the bank itself.

      1. shrdlu

        Re: what...

        > And many decisions made by humans are pretty arbitrary anyway - such as the binning of applications based on a cursory scan of a CV. Are all such decisions to be regulated, even in the absence of a computer? Will you be able to challenge why you weren't called up for an interview?

        Your CV is sensitive personal data and any processing of it is already required to be done accurately and fairly. It doesn't matter whether it is done by eye or by algorithm. Agencies that routinely use search algorithms should be required to prove that the algorithm is fair and accurate. The Information Commissioner should be auditing these.

        > If your bank decides not to offer you a loan, will the law compel it to do so? This implies not only that the bank will have to reveal its reasons not to offer the loan - the so-called "algorithm" under discussion here - but also for those reasons to be challenged and potentially overridden.

        The same rules apply. Failure to process the data accurately and fairly is an offence. If your application is rejected unfairly then the bank must either grant the application or pay compensation. They may also need to consider whether they have a taste for cocoa and porridge. The ICO should be auditing these decisions.

        > This in turn implies that you would have a statutory right to receive a loan from a bank, if you meet some criteria decided in law or by a judge - not those criteria chosen by the bank itself.

        That is how courts work.

        1. Anonymous Coward
          Anonymous Coward

          Re: what...

          > That is how courts work.

          There are specific laws about discrimination on grounds of race/religion/gender/sexual orientation. If the algorithm used any of those factors as inputs then it would be at odds with those laws. Care would be required for anything which might be a proxy for those attributes (such as surname).

          But apart from those, is there a general statutory duty of "fairness" in business decision-making? And if there is, why are all the more specific laws required?

          1. Anonymous Coward
            Happy

            Re: what...

            There are specific laws about discrimination on grounds of race/religion/gender/sexual orientation. If the algorithm used any of those factors as inputs then it would be at odds with those laws. Care would be required for anything which might be a proxy for those attributes (such as surname).

            It's not that simple. A set of features which individually weakly select for particular characteristics can be combined together to strongly select for race/religion/gender/sexual orientation and thus inadvertently discriminate against these groups. As a trivial and rather obvious example, a retailer might use information on, amongst many other things, makeup and clothing colours purchased. But I bet there are some really subtle things that are not at all obvious because individually they only have a weak effect.
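            The combination effect is easy to demonstrate. A toy simulation (pure Python; the feature count and the 55% agreement rate are invented for illustration) shows many individually weak proxies combining into a strong predictor of a hidden protected attribute:

```python
import random

random.seed(42)

N_FEATURES = 101   # many weak proxies (invented numbers, for illustration)
AGREEMENT = 0.55   # each proxy matches the hidden attribute only 55% of the time

def make_person():
    """One person: a hidden protected attribute plus many weak proxy features."""
    group = random.random() < 0.5
    features = [group if random.random() < AGREEMENT else (not group)
                for _ in range(N_FEATURES)]
    return group, features

people = [make_person() for _ in range(10_000)]

# A single weak proxy is barely better than a coin flip...
single = sum(g == f[0] for g, f in people) / len(people)

# ...but a majority vote over all of them recovers the attribute strongly.
combined = sum(g == (sum(f) > N_FEATURES // 2) for g, f in people) / len(people)

print(f"single proxy accuracy: {single:.2f}")
print(f"combined accuracy:     {combined:.2f}")
```

            Individually each feature would pass a naive "is this discriminatory?" check; collectively they reconstruct exactly the attribute the law forbids using.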

    3. John Smith 19 Gold badge
      Unhappy

      "If the process is illegal, that's hardly the fault of the programmers is it?"

      Apparently you've never heard the expression "Ignorance of the law is no defense."

      Your attitude is why a lot of people think all devs are nothing but bodge merchants.

      By analogy with the construction industry, that would be like designing a building below the known safety standards or building it with materials which don't meet their specs.

      So yes it is illegal.

      1. Yes Me Silver badge
        Headmaster

        I am not a lawyer but...

        Re: "If the process is illegal, that's hardly the fault of the programmers is it?"

        The computer's defence is clear: it was only obeying orders. The programmer's defence is less clear. If she was obeying orders but by doing so told the computer to break the law, the defence that she was only obeying orders or that she was ignorant of the law doesn't hold water. She's as guilty as her boss.

        In any case, the idea of regulating algorithms is a nonsense. Blame the humans, not the machines.

        1. Anonymous Coward
          Anonymous Coward

          Re: I am not a lawyer but...

          Can a computer break the law? Literally speaking. I don't think any inanimate object can be treated as capable of understanding the law. (Neither is any human being, but that's another rant).

          This is actually a very deep and potentially very embarrassing inquiry, given how many human systems and organisations - such as governments and corporations - are largely designed to diffuse blame and prevent any specific person or people from being held legally responsible.

          When decisions are embodied in a computer program, they become definite, exact and undeniable. But the program, and the computer that executes it, are not the kind of entities that are capable of legal or illegal behaviour.

          So the computer program becomes a kind of "confession in advance" by those who can be held legally responsible if anything goes wrong. Once this doctrine becomes established and widely understood, there may be a very noticeable decrease in the amount of automation.

    4. Anonymous Coward
      Anonymous Coward

      Re: what...

      The two programmers who worked for Bernie Madoff were convicted. They accepted suggestions about desired output and then wrote programs that produced that output, which was used to mislead people into handing over their money to Madoff.

      Programmers can be held accountable.

      It would be a great idea if the legal lack of clarity (input can always contain errors, yet we try to make programmers responsible for all output) were reduced. The practice of "giving a programmer files and letting him make something of it" is still quite prevalent.

    5. Anonymous Coward
      Meh

      Re: what...

      "But if he had been 36 instead of 19, he would have received a more lenient sentence, though by any reasonable metric, one might expect a 36-year-old to receive a more punitive sentence."

      Because ageism rocks!

      1. Anonymous Coward
        Anonymous Coward

        Re: what...

        "But if he had been 36 instead of 19, he would have received a more lenient sentence, though by any reasonable metric, one might expect a 36-year-old to receive a more punitive sentence."

        One presumes that the result of the algorithm is because, statistically, a 19-year-old is a greater risk, and may require more emphatic punishment to reliably change behavior.

        In that case, the algorithmic result is, logically, the most reasonable choice.

    6. Anonymous Coward
      Meh

      Re: what...

      If the process is illegal, that's hardly the fault of the programmers is it?

      It would be so simple if there were a line in the code which said, e.g.

      if (sex == MALE && skinColour == WHITE) price *= 2;

      You might expect that such an algorithm would be illegal. (Or maybe not, since discrimination is generally a one way street, but that is another discussion).

      However, AI isn't like that, generally the programmer doesn't design the algorithm. The computer learns the algorithm itself, in the case of neural networks by adjusting internal weights to optimise the results towards the desired outcome. It is hard to understand what the effects of the individual weights are, and the bigger the network the harder it is. In any case, few legal professionals have the necessary graduate level mathematical training. So the "algorithm" is a black box, which has tuned itself to maximise the number of desired outcomes in a large sample of test cases, with perhaps a dozen factors (and often many more). You can't easily work out how it is operating internally, and the only way to find undesirable outcomes from specific combinations is to test them. Which is going to take a while if you want to fully explore the 12-dimension (or whatever) space.
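      The cost of that exhaustive testing is easy to sketch. Taking the dozen factors above and, as an assumption, a coarse ten-point grid per factor:

```python
# Back-of-the-envelope cost of exhaustively probing a black-box model.
# The factor count and grid size are illustrative assumptions.
factors = 12            # "perhaps a dozen factors"
levels = 10             # a coarse ten-point grid per factor
test_cases = levels ** factors

print(f"{test_cases:,} probes needed")        # 1,000,000,000,000

# Even at a generous one million predictions per second:
days = test_cases / 1_000_000 / 86_400
print(f"roughly {days:.0f} days of testing")  # roughly 12 days
```

      And each additional factor multiplies the runtime by ten: at twenty factors the same coarse sweep takes over three million years.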

      1. j.bourne

        Re: what...

        Nail - head - on - hit. The problem is not the algorithm per se, it's the data that's allowed to be used to base the outcome on. E.g. if gender, age, ethnicity, etc. weren't parameters in the first place then they wouldn't be available to discriminate on....

        Oh, so it's just short people you pick on then .... Yep that's ok nothing on the statute about height discrimination (or is there?).

      2. Anonymous Coward
        Anonymous Coward

        Re: what...

        Nevertheless, however the computer works and regardless of whether any human being can understand how it arrives at decisions, the people responsible for using it to make decisions must carry the can legally. The buck cannot stop with a machine, so it must stop with the people who installed the machine as part of their system. In principle, I suspect it's not very different from hitting someone with a spade. It's not the spade's fault!

      3. EBG

        Re: what...

        Differential pricing based on the customer as a variable, rather than being rooted in the cost of product variation, should be illegal. E.g. if we want to subsidise OAPs' travel, up their pensions and let them make choices. A big threat comes from the degradation of the fundamentals of money and un-impeded, non-discriminatory choice as to how we use our money once we have earned it.

    7. Oh Homer
      Headmaster

      "Swapping liberty"?

      What an odd characterisation of labour.

      An equitable exchange of labour for money or goods is not a loss of liberty, it's the voluntary utilisation of one's liberty for personal gain.

      I am not somehow "less free" because I choose to work for someone else, to earn a wage so I can buy goods, rather than work for myself to make or grow those goods directly.

      Either way I still have to work. To characterise such work as a loss of liberty is like saying that merely being born is comparable to slavery, as if the only true "freedom" is being strapped to a couch for 75 years, being spoon fed jelly by a nurse.

  3. Anonymous Coward
    Anonymous Coward

    Not just computers

    "When an organisation doesn't know what it wants from an algorithm, how can it measure what the results are? "

    Many of the algorithmic rules that affect people inconsistently in England are in laws set by politicians. They are algorithms that are being devised without a thorough investigation of their consistency when applied.

    Examples are the minimum sentencing rules - capable of being as unjust as the USA recidivism machine rules. It appears that they tie a judge's hands even when they recognise the injustice in a particular case. The weighing of evidence itself is by a human algorithm. For example - is a picture illegal? There are algorithms people are required to follow to make that subjective judgement.

    Every day there are examples of people in the benefits system being unfairly disadvantaged by a particular combination of the rules.

    People applying for residency in the UK - who have lived here most of their lives - are being denied that privilege because there is a requirement for "evidence" that they could not meet, e.g. children who cannot produce utility bills in their own name.

    To illustrate the general ignorance: the BBC "Thinking Allowed" programme recently mentioned the problem of computers making such decisions - however the presenter called them "logarithms". That is a word of Greek origin that has nothing to do with the Arabic derived "algorithm".

    1. P. Lee

      Re: Not just computers

      >is a picture illegal?

      I wonder if this article is off the back of the BBC article which suggested that Google's algorithms are racist because if you ask for baby pictures you only get white baby pictures? I'm just going to take a moment to laugh at the SJWs... ok, I'm back.

      I'm with you in that it "isn't just computers" but mostly processes are relatively observable and have audit trails. However I don't think "processes" is where the article goes. We're into knowledge/decision systems. The problem I see with these is not the "AI" or whatever, but the massive consolidation in many industries and the lack of competition. This may be via corporate consolidation or merely that all the corporates are running the same software. Either way, that is unhealthy and intervention may be required to stir things up and bring back competition. Perhaps we do need to think about splitting Google, AWS or MS up. Hmmm... I'm guessing that won't happen for the next four years.

      The point about cartels offloading decisions to software with the objective of maintaining the cartel is interesting, but again, I think we're talking about business practice rather than particular algorithms.

      1. chelonautical

        Re: Not just computers

        >The problem I see with these is not the "AI" or whatever, but the massive consolidation in many industries and the lack of competition. This may be via corporate consolidation or merely that all the corporates are running the same software. Either way, that is unhealthy and intervention may be required to stir things up and bring back competition.

        Good point. Another consolidation concern is consolidation in the data sources that record people's lives. Companies like Google and Facebook hold such vast quantities of data about the general public that their datasets will probably end up being given a very high weighting in any decision-making process. These companies know so much about us, why wouldn't every employer, bank, insurance company or any other business use them to find out much more about our habits and risk profiles? There is a lot of power in a few hands.

        As a result of this consolidation of personal data into a small number of internet services, any errors or unfavourable entries in the records of Google/Facebook/etc. could easily follow us around. If someone once said something foolish on Facebook or Googled something risky, these things could be added to their digital "permanent record" and result in life-long disadvantage in employment, credit, insurance and many more areas. This is already the case to some extent, as people can be found online but it used to require a degree of manual effort and patience on the part of a human. The next human at a different organisation might not bother or may search less thoroughly, so you have less chance of any past online embarrassment becoming permanently life-ruining.

        What's new is that large-scale automated information sharing and deep AI-based analytics of people's life history is becoming possible such that organisations can automatically judge people's entire digital lives and reject them in a matter of milliseconds. As a simple example, imagine going to an online car insurance comparison website and being told "all companies declined to quote for you" without knowing why. It might be because you posted a couple of things about social drinking on Facebook so they have all wrongly concluded you are a drunk and therefore a bad risk. It might be for some other reason entirely. Will it be possible to find out? Will companies admit any responsibility or will they just pass the buck to their third-party "lifestyle analytics" provider in another jurisdiction off-shore? Will there be an ombudsman who will help you find the culprit? Ultimately, who can you sue for the damages caused by incorrect automated decisions?

        These issues already occur today on a smaller scale (e.g. several times a call centre operator has told me "the computer says X" without being able to explain why), but on a small scale it's easier to handle or shop around elsewhere. The danger is that unexplained incorrect or biased decisions become automated and repeatable to the extent that you can never escape them in any aspect of your life. If everyone uses the same software and the same data sources that becomes increasingly likely.

  4. Anonymous Coward
    Anonymous Coward

    Even if

    These crusty old law makers could look at the algorithm, they probably wouldn't understand it anyway.

    > "require organisations using algorithms to retain records on all of the data they are using".

    This looks to be the real golden egg, access to the data.

    1. Doctor Syntax Silver badge

      Re: Even if

      "These crusty old law makers could look at the algorithm, they probably wouldn't understand it anyway."

      Let's examine your ageism.

      First of all, look at the summary from the table here http://parliamentarycandidates.org/news/the-age-of-the-new-parliament dating from 2015

      18 - 29: 2%
      30 - 39: 14%
      40 - 49: 32%
      50 - 59: 32%
      60 - 69: 16%
      over 70: 4%

      How does this compare with your concept of "crusty old"? BTW, without looking it up, how do you think those 4% over 70 are distributed between parties?

      Now let's think what we might consider as an ideal age distribution. I think most of us would like our MPs to have some practical experience of the world they're trying to administer. My least ideal candidate would be a newly graduated or even younger policy wonk who has no concept of life outside of their own party machine. Such an MP isn't going to come into Parliament without being well into that age distribution, is going to spend some extra years broadening their experience in dealing with governance at all levels from constituency matters upwards, and then should remain there so that their experience adds value. Does that distribution seem particularly unreasonable?

      There's also the notion implicit in the A/C's statement that somehow it's only the young who are aware of algorithms. So here, my young A/C, is a little research exercise for you. Who are Whitfield Diffie and Martin Hellman? How old are they? Why do you think they should be unable to understand what algorithms are, much less devise them? And, if you bothered to look up the answer to the question I posed earlier about the over-70 MPs' parties, how did that fit your preconceptions?

    2. Anonymous Coward
      Anonymous Coward

      Re: Even if

      > These crusty old law makers could look at the algorithm, they probably wouldn't understand it anyway.

      If it was a traditional algorithm (like a flowchart or decision-tree) it would be fine.

      If it's some newfangled AI "machine learning" algorithm, where you just blast a load of data into a neural net and somehow "train" it to do the right thing, that's a lot harder to scrutinise for anyone - IT professionals included.
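      The contrast is easy to make concrete. A sketch, using a hypothetical loan decision (all names, thresholds and weights here are invented): the traditional version can be read branch by branch, while the trained counterpart is just a bag of numbers whose logic is implicit.

```python
# A flowchart-style decision (auditable branch by branch) versus the
# trained model that might replace it. All names and numbers are invented.

def loan_decision(income, debts, years_employed):
    """Traditional algorithm: every rule can be read, cited and challenged."""
    if years_employed < 1:
        return "decline"          # no employment history
    if debts > 0.5 * income:
        return "decline"          # debt ratio too high
    return "approve"

# The "machine learning" counterpart: the logic lives in opaque weights,
# and nothing here explains *why* it decides as it does.
weights = [0.031, -0.584, 0.212]  # hypothetical learned weights
bias = -0.117

def nn_decision(income, debts, years_employed):
    score = (weights[0] * income + weights[1] * debts
             + weights[2] * years_employed + bias)
    return "approve" if score > 0 else "decline"

print(loan_decision(50_000, 10_000, 3))   # approve
```

      A regulator can audit the first function line by line; auditing the second means probing it with test inputs and hoping the bad cases show up.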

      1. Anonymous Coward
        Anonymous Coward

        Re: Even if

        It's certainly true (or very plausible) that some neural networks or even computer programs may reach reasonable decisions by methods that no human being can ascertain (or understand).

        But anything arising from such automated decision-taking is still entirely the responsibility of whoever used the computer to make decisions. It can't be any other way.

        If you can't be sure exactly how your decision-making system will work in ALL circumstances that might possibly arise, don't deploy it. If you do, you are like someone firing off a gun in random directions and hoping you never hit anyone.

  5. hplasm
    Big Brother

    Treading on politicians' toes?

    "...in the hands of a few programmers who have no accountability for the decisions that they're making,"

    and

    "in the hands of a few politicians who have no accountability for the decisions that they're making,"

    Seems like the former offends the latter - which is the status quo...

    1. Nick Kew

      Re: Treading on politicians' toes?

      Is it the politicians? They're just doing what they've long done: bewailed the new that isn't under their control.

      It's more the old media (including to a great extent organs like El Reg which still use an old-fashioned Journalist/Editor model) crying about their own loss of the minds of their followers.

  6. Ken Hagan Gold badge

    Please stop using the word algorithms

    To someone who actually develops algorithms for a living, your use of the word as a short-hand for "using a computer as a legal or PR fig-leaf" really grates.

    Algorithms are intellectual constructs and their form is constrained (if not determined, in simple cases) by what they are intended to do. Calling for algorithms to be regulated makes about as much sense as calling for mathematical theorems to be regulated.

    You can regulate whether people can *use* particular algorithms for particular purposes, but I think you'll find that hard to regulate in the case where someone chooses to run the algorithm on neurons rather than silicon. (Societies that try to regulate what goes on inside someone's head have a Bad Track Record, historically.)

    If I'm reading you correctly, your gripe is not the algorithm, nor even the fact that it is running on a computer, but simply the fact that the people who choose to run those algorithms on the computer are using the computer to put themselves at arm's length from the legal consequences of the algorithm delivering an anti-social or illegal answer.

    Happily, I believe that even in this case there is ample legal machinery and precedent already in place. If a corporation directs an employee to perform an algorithm and that employee ends up breaking some law, the corporation carries the can. Directors are liable, etc. This system has been tested on several generations of corporate shysters and crooks and appears to work. Using a computer rather than an employee merely increases the calibre of the cannon directed at the corporation's feet.

    1. TRT Silver badge

      Re: Please stop using the word algorithms

      And on top of that the algorithm may only be sensible over a limited range of input conditions. Now, you might be able to open up the mechanism of the algorithm to inspection, but you can't release all the possible input values because these might well contain personally identifiable information.

    2. Pen-y-gors

      Re: Please stop using the word algorithms

      Calling for algorithms to be regulated makes about as much sense as calling for mathematical theorems to be regulated.

      Isn't that exactly what dim-but-crazy Amber and barking-mad Theresa are wanting to do with encryption theorems?

      1. Doctor Syntax Silver badge

        Re: Please stop using the word algorithms

        "Isn't that exactly what dim-but-crazy Amber and barking-mad Theresa are wanting to do with encryption theorems?"

        Looking at the state of governance of nuclear powers around the world I'm starting to think that by comparison BoJo is a rational and diplomatic negotiator, Rudd is a competent technocrat and May a benevolent internationalist. We're doomed, I tell you, dooomed.

    3. Frumious Bandersnatch

      Re: Please stop using the word algorithms

      I totally agree, Ken. We should be talking about "automated processes" or the like.

      It seems to me that the only thing that needs legislating here is in the realm of data protection (or FoI) requests. Let's say that someone is refused insurance cover. I think that it's quite possible and reasonable to make a data request asking the organisation to clarify the factors leading to the decision. I'm pretty sure, though not certain, that this sort of request is allowable and that it should receive a reply.

      However, once you start using automated processes, there is a great risk that the organisation being asked for such information will, deliberately or not, seek to obfuscate what their processes are. You'll just get a response "computer says no". If you kick this up to the ombudsman or whatever, there's every likelihood that the organisation will argue two main points: first, they'll say that their algorithms are a trade secret, and second, they'll say that the cost of satisfying the request is excessive. I don't think that the first point needs much comment, but for the second, it's quite possible that they'll be able to make a good excuse: since software is so much more complicated than manual processes (which they'll no doubt have documented as part of their quality certification or whatever), the cost to audit it will be so much more. Since data requests can legally be refused on grounds of cost, this will end up with more data requests being refused, with little or no recourse.

      So, as a result, I think that the only changes that need to come about are to ensure that the same transparency standards are applied to automated processes as manual ones. This needs to happen both in terms of privacy/FoI legislation and non-legislative areas, such as ISO quality standards (which I assume are immune to Brexit).

  7. Pat 11

    confidence intervals

    These kinds of algorithms use statistics to estimate parameters and make decisions. One way to make them more accountable would be to require all such processes to produce confidence intervals. For example, predicting recidivism - if the algorithm says X has a 68% chance of committing another crime, that sounds worrying, but if it also says that estimate has a 95% confidence interval of 26-81% then it looks much less certain. And if they can't generate confidence intervals, it's a shit algorithm that should not be trusted.
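    Producing one is not hard when you know how many comparable cases an estimate rests on. A minimal sketch using the standard Wilson score interval (stdlib Python; the 68% score and the 25-case sample size are invented for illustration):

```python
import math

def wilson_interval(p_hat, n, z=1.96):
    """95% Wilson score interval for a proportion estimated from n cases."""
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# A confident-sounding 68% "risk score" backed by only 25 comparable cases:
lo, hi = wilson_interval(0.68, 25)
print(f"point estimate 68%, 95% CI roughly {lo:.0%} to {hi:.0%}")
```

    With only 25 similar past cases behind it, that confident-sounding 68% turns out to span roughly from a coin flip to five-in-six, which is how it ought to be presented to a judge.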

    1. allthecoolshortnamesweretaken

      Re: confidence intervals

      Good point.

      Also a good argument for the "let's look inside the black box" suggestion.

      1. Anonymous Coward
        Anonymous Coward

        Re: confidence intervals

        IIRC when Expert Systems were getting popular - it was considered essential that the way they reached their conclusion in each particular case was clearly tracked for human verification.

    2. Doctor Syntax Silver badge

      Re: confidence intervals

      "if they can't generate confidence intervals, it's a shit algorithm that should not be trusted."

      And even if they can but make no distinction between different offences and the circumstances in which they were committed then it's still a shit algorithm.

    3. Nick Kew

      Re: confidence intervals

      You only get confidence intervals from real data.

      You only get real data after a system has been operating long enough to collect them.

      Then there's the joker in the pack: someone's sure to mess with the "other things being equal" part of any study.

  8. Anonymous Coward
    Anonymous Coward

    But more importantly...

    When are they going to replace MPs with algorithms?

    At least algorithms can be programmed to have ethics and not lie...

    1. This post has been deleted by its author

    2. Primus Secundus Tertius

      Re: But more importantly...

      MPs could be replaced at any time by men with guns.

      Most of the time in most parts of the world the MPs are less worse than the men with guns.

      1. Steve Davies 3 Silver badge

        Re: But more importantly...

        MPs? Ah, you mean Military Police. They have guns.

    3. allthecoolshortnamesweretaken

      Re: But more importantly...

      "When are they going to replace MPs with algorithms?

      At least algorithms can be programmed to have ethics and not lie..."

      As Ken has hinted at already, algorithms can also run on neurons instead of silicon. And there are enough experiments (B F Skinner comes to mind among others) that demonstrate that humans can be conditioned very much like lab rats. Therefore it is entirely possible to program MPs.

      In fact, most of them are, although calling simple if/then conditions or goto loops algorithms is stretching things a bit.

      1. Charles 9

        Re: But more importantly...

        "As Ken has hinted at already, algorithms can also run on neurons instead of silicon."

        I think the problem here is there's no assurance it'll run consistently and precisely on neurons.

    4. elDog

      Re: But more importantly...

      Which came first, the politician or the algorithm driving the politician passing the laws that govern the algorithms?

      And what do we call the first human programmer of the first algorithm? God?

  9. lightman

    Another opportunity for the government to do all the wrong things for (possibly?) the right reasons !

    Perhaps the first step in this process is legislating that the lawmakers are required to have a more than "just adequate" understanding of the technology they are regulating, or we shall see repeats of wholly unacceptable situations like the head of a government agency not knowing what the thing they were legislating against actually was... anyone remember cookies?

  10. MK_E

    The mortgage one is hilarious. When I was buying my first house, the bank told me at first that the computer didn't think I could afford the repayments month to month.

    The fact that the repayment was a clear £100/mo less than the rent I was already paying, while also managing to save up for a deposit, should have made it self-evident that I could afford the smaller amount.

    1. Nick Kew

      MK_E, you were lucky. In my youth I was denied a mortgage, despite the repayments being lower than the rent on a ****hole room without luxuries like hot water in the communal bathroom.

      I fled the country to escape that, and so missed my generation's chance to buy at a reasonable price in the 1990s. But at least today's rental market is much improved.

    2. Anonymous Coward
      Anonymous Coward

      Mortgage

      That's quite a good case in point - precisely because it is so simple. For a start, as soon as the bank people said anything like, "The computer doesn't think you can afford the payments", they revealed their utter ignorance of what was really happening.

      Computers do not think. One day they possibly might, but as of today they don't. What they should have said was, "We have done some predetermined sums on our computer, and we don't think you can afford the payments".

      The decisions were all made by bank staff - probably managers - and then programmed into the software. If any mistakes (or legal offenses) resulted, they were the fault of those people.

  11. Charlie Clark Silver badge
    Coat

    Trite nonsense

    Algorithms are almost as pervasive in our lives as cars and the internet

    Algorithms are far more pervasive than either cars or the internet. But it seems the author is not sure as to what an algorithm is.

    Mine's the one with the pocket Knuth in the pocket.

  12. cambsukguy

    Precedent will occur

    We will have precedent and case law in a few years.

    A few trips up to the highest courts and lawyers will be quoting cases for and against blaming the vehicle (and thus the SW and HW therein) for various kinds of incidents.

    The Tesla/truck crash which killed the driver must already have started the ball rolling - the system can't just allow Tesla to state 'Hmm, didn't see that truck - will tweak the algo' despite requiring the driver to essentially never use the actual 'self-driving' tech.

    Once they start blaming the car tech, the floodgates will open and other tech will start getting blamed for stuff: "I didn't WANT two tons of creamed corn, it was Alexa's fault!"

    Obligatory xkcd

  13. Boris the Cockroach Silver badge

    Remember

    one of the oldest rules in computing

    Garbage in , garbage out.

    You can have the most wonderful algorithms ever, completely proven to do what they are supposed to be doing. Sadly, the data entry gets a minus sign in the wrong place, and the next thing you know Mariner 1 does a 180-degree turn at Mach 2.

    But the idea of regulation does raise some concerns: are they qualified to do the job? How long will they take to do the job? How much will it add to the cost? And finally, what happens when they f*** up as well?

    1. Charles 9

      Re: Remember

      But what about garbage input that didn't look like garbage at the time (as in, it seemed to make sense)? Like, say, someone who only ever had one name.

  14. Will 28

    Feedback

    I think this article would have benefitted greatly if at the start you had defined what you considered an "algorithm" to be. Without this, especially given your apparent definition differs from what most programmers would use (I think, I can't be sure because you didn't give a definition), it isn't really possible to give any value to the subsequent points made about them.

    By the end of the article I got the feeling you could have just replaced the word "algorithm" with "scary thing".

    1. Anonymous Coward
      Anonymous Coward

      Re: Feedback

      "Algorithm - a process or set of rules to be followed in calculations or other problem-solving operations".

      The eponymous origin being the approach to solving mathematical problems by Mohammed ibn-Musa al-Khwarizmi (780-850). Applications in electronic computers are only a recent use - algorithms have been a recognised method of processing information for over a thousand years.

      1. Wensleydale Cheese

        Re: Feedback

        "Algorithm - a process or set of rules to be followed in calculations or other problem-solving operations".

        I once had some university professor ask me what algorithms I was using on my current project, which was a simple data transfer from one system to another.

        No calculations or problem-solving as far as I could see, it was simply shifting data from A to B.

        As such, I didn't understand the question.

        I suppose it did demonstrate that I wasn't used to obfuscating stuff by the use of academic language...

        1. Anonymous Coward
          Anonymous Coward

          Re: Feedback

          "No calculations or problem-solving as far as I could see, it was simply shifting data from A to B."

          There was a problem to solve - how to move the data. No matter how simple - you had to make decisions about how to do it - and decide the order of the actions.

          1. Charles 9

            Re: Feedback

            Which presents a bigger problem: what if you're never told what data you're supposed to move, nor the criteria? How can you play when you're never even told the rules?

  15. Primus Secundus Tertius

    Regulation versus Appeal

    One cannot hope to regulate every detailed thing. The answer must be a right of appeal to legal precedent or common sense, with the latter occasionally allowed to prevail.

    1. Anonymous Coward
      Anonymous Coward

      Re: Regulation versus Appeal

      "The answer must be a right of appeal to legal precedent [...]"

      Unfortunately in the UK you need deep pockets to take a case all the way through the appeals process. Only an Appeal Court ruling forms future legal precedent - no matter how many times juries refuse to convict people for the same offence.

      Too often these days the law makers rush bills through with insufficient scrutiny, leaving some poor sod to be prosecuted by the police and CPS, who will push the limits of a law's vague wording.

      1. Doctor Syntax Silver badge

        Re: Regulation versus Appeal

        "Too often these days the law makers rush bills through with insufficient scrutiny."

        It's called agile.

  16. Anonymous Coward
    Anonymous Coward

    Gas & Electricity Companies

    Clearly use algorithms that are stupid

    When working out your monthly payments they assume that your energy use will be the same in December as it is in June. Clearly this is not the case for 99.99999% of the population, but still they use it.

    It would not be rocket science to introduce an ALGORITHM that worked on the basis of actual average data obtained from your meter readings, but will they? Like heck they will. Using a direct debit allows them to adjust your payments over the year depending upon your use. The only changes they will make would be to increase your payments. Like petrol prices: up in a flash and down sometime never.
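
    For the record, the "rocket science" involved is one sum and one division (the usage figures and unit price below are made up):

```python
# Hypothetical kWh meter readings for Jan..Dec, showing the winter/summer swing:
monthly_kwh = [420, 380, 340, 260, 200, 150, 140, 160, 220, 300, 380, 430]
unit_price = 0.15  # illustrative £/kWh

annual_cost = sum(monthly_kwh) * unit_price
direct_debit = annual_cost / 12  # flat payment that actually matches usage
print(f"£{direct_debit:.2f} per month")  # £42.25 per month
```

    Seasonal smoothing from real readings, rather than pretending December equals June.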

    That money of yours they have when you are in credit is free cash and can even earn them interest. Do they give that back to us?

    Like heck they do.

    1. GrapeBunch

      Re: Gas & Electricity Companies

      adjust your payments over the year depending upon your use.

      I misread as:

      adjust your payments over the year depending up your arse.

      Really I did. And so did the gas company.

  17. handleoclast
    Angel

    The fix is obvious

    All we need to do is implement an AI system to check the algorithms used by other AI systems.

    What could possibly go wrong?

    1. Charles 9

      Re: The fix is obvious

      Turtles all the way down is what you'd get. What checks the AI that checks all the other AIs?

  18. morenewsfromnowhere

    I may be wrong, but I think the article is confused, in the way that all or almost all people are when it comes to law and ethics.

    Law is neither here nor there, and quite often it is a problem in and of itself.

    The key principle is that everything two entities - companies, people, governments, whathaveyou - do together must be voluntary (they must both agree to it) and well-informed (they must know what they're agreeing to). So you can't coerce, and you can't deceive. The sole exception is self-defence, and then all bets are off.

    Where law is compatible with this principle, it's good and right; where it's not, then law is being used by one group to enforce its will on another group, without their consent and not in self-defence.

    When we talk about algorithms and all this - it's kinda neither here nor there. What matters is *disclosure*. If you sign up for something, and you've been well-informed about whatever it is, and you agree to it, *then by definition it is ethical*.

    One case where there are problems is when there is no choice - so let's say there's only one electricity provider in an area, and they want to use smart meters which are known to be incorrect in their readings. You then have Hobson's choice - electricity and bad meters, or no electricity. What I usually find in these situations is that *prior* to this point, someone or something else was imposing non-voluntary contracts - such as regulating an industry and reducing competition - which in turn *led* to this problem, further downstream.

    In such situations then you are in fact being coerced, at one remove; all bets are off. You are free to act in whatever ways are necessary to ensure you are not coerced or deceived. Hack the meter.

    1. Charles 9

      "The key principle is that everything two entities - companies, people, Governemnts, whathaveyou - do together must be voluntary (they must both agree to it) and well-informed (they must know what they're agreeing to). So you can't coerce, and you can't deceive. The sole exception of self-defence and then all bets are off."

      Which is usually never the case. Each side is usually trying to hide things from the other: either to outplay the other or as a defense against the other trying to backstab them. Confidence is something that usually only comes with trust, and trust is the exception, not the rule. Unless there's a crisis, one man usually doesn't trust the other and will tend to act in competition. That's why theoretical things like pure capitalism don't work in reality (asymmetry of knowledge) and why you have thought experiments like the Prisoner's Dilemma.

      "In such situations then you are in fact being coerced, at one remove; all bets are off. You are free to act in whatever ways are necessary to ensure you are not coerced or deceived. Hack the meter."

      Nope, because the electricity companies tend to have government mandate on their side. IOW, if all bets are off, what do you do when it's the OTHER side that has all the guns? Oh, and a willingness to use scorched earth tactics?

  19. Anonymous Coward
    Anonymous Coward

    We should not be using algorithms to determine sentences for criminals; the judiciary is there to determine the facts of the case and the intent of the criminal. I understand it is still an opinion, but it's better than something that does not know or see what is in front of it and is basing its opinion on information fed into it.

    Mortgages are based on ability to pay so I have no problem with them, money in - money out = amount you can afford with an adjustment to factor in interest rate changes.
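
    As a sketch, that "money in - money out" check with a crude allowance for rate rises (all figures, and the 25% stress uplift, are invented):

```python
def can_afford(net_income, outgoings, repayment, stress_uplift=1.25):
    """Money in minus money out, with headroom in case interest rates rise."""
    disposable = net_income - outgoings
    return disposable >= repayment * stress_uplift

# A £600/mo repayment against £800/mo disposable passes even when stressed:
print(can_afford(net_income=2000, outgoings=1200, repayment=600))  # True
```

    The uplift is the "adjustment to factor in interest rate changes"; everything else is arithmetic a lender already has the data for.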

    Google's algorithms should be open to scrutiny by a legally backed watchdog because they can make or break a company/product/person/opinion.

    At the end of the day the box (Pandora's) is now open and it's not going to get shut.

    1. Doctor Syntax Silver badge

      "Google's algorithms should be open to scrutiny by a legally backed watchdog because they can make or break a company/product/person/opinion."

      That might be easier said than done. We keep getting told that even the authors of the learning programs don't know how they work after they've been subject to training.

      In any event I'm not sure even the results would pass scrutiny. Google seems to specialise in presenting hits which are irrelevant to what I actually want too much of the time. The likes of Amazon & eBay all too often present prompts along the lines that "You might also be interested in ${What I just bought or something similar and won't need to buy again for a long time}" or "People who looked at $Thing also looked at ${Amazing variety of other things which just goes to show that looking at $Thing doesn't correlate well with looking at anything else}". I'd have thought that anyone who's experienced this would think very carefully about devolving any serious decision to such processes. Clearly, however, too many in business can't understand what lies in front of them.

    2. Charles 9

      "We should not be using algorithms to determine sentences for criminals, the judiciary is there to determine the facts of the case and the intent of the criminal. I understand it is still an opinion but it's better than something that does not know or see what is in front of it and is basing its opinion on information fed into it."

      But if you depend on humans, what happens with a charismatic suspect?

      "Mortgages are based on ability to pay so I have no problem with them, money in - money out = amount you can afford with an adjustment to factor in interest rate changes."

      But since mortgages tend to be long-term things, they also have to take vulnerability into account. How likely is the borrower to suffer a significant event that severely alters his/her ability to fulfil his/her end of the deal (say, the industry he/she is in is prone to collapse leaving him/her not just unemployed but unemployABLE).

      "Google's algorithms should be open to scrutiny by a legally backed watchdog because they can make or break a company/product/person/opinion."

      But Google is multinational. They can probably play foreign sovereignty against you. What will you do then? Block Google and get complaints up the wazoo?

      1. Anonymous Coward
        Anonymous Coward

        ""We should not be using algorithms to determine sentences for criminals, the judiciary is there for that to determine the facts of the case and the intent of the criminal. "

        The politicians in England have imposed rules for determining minimum sentences on conviction. If someone pleads guilty immediately there is a fixed discount on the sentence. If they insist on a jury trial and are found guilty - then there is no discount. If an innocent person changes their plea to guilty because of the fear of a subjective jury finding them guilty - then apparently the judge is not allowed to impose a lesser sentence than the minimum that the rules prescribe.

        Minimum sentences rules were introduced because tabloid-fearing politicians didn't want judges exercising any intelligent discretion based on the evidence presented.

        1. Charles 9

          "Minimum sentences rules were introduced because tabloid-fearing politicians didn't want judges exercising any intelligent discretion based on the evidence presented."

          I thought it was out of fear a charismatic criminal would get off light.

        2. Doctor Syntax Silver badge

          "Minimum sentences rules were introduced because tabloid-fearing politicians didn't want judges exercising any intelligent discretion based on the evidence presented."

          The tabloids' algorithms didn't match the judges' - and didn't have as much data.

  20. jake Silver badge

    Sounds like the makings of a witch hunt to me.

    Fear of the unknown on a societal level can be a very ugly thing.

    Out of curiosity, has anybody put these politicians on the spot and asked them point-blank to define exactly what they mean by "algorithm"? The answer (or lack thereof) might be amusing ...

    1. Charles 9

      Re: Sounds like the makings of a witch hunt to me.

      And if many of them CAN answer the question accurately? Wouldn't that be even scarier?

      1. deathOfRats

        Re: Sounds like the makings of a witch hunt to me.

        Yeeesss... that would be scary, but, how many would many be? Something like one, two, three, many, many-one, many-two, many-three, LOTS?

        Even then, I would be scared, though...

      2. jake Silver badge

        Re: Sounds like the makings of a witch hunt to me.

        It wouldn't be scary, per se ...but my gaster would be well and truly flabbered.

    2. Tom 64
      Mushroom

      Re: Sounds like the makings of a witch hunt to me.

      > "The answer (or lack thereof) might be amusing"

      Or it might make my blood boil and result in me yelling at the telly.

      1. jake Silver badge

        Re: Sounds like the makings of a witch hunt to me.

        Tom, if you have enough energy to boil your blood & yell at DearOldTelly, you'd probably be better off spending your copious free time educating the voters in your neighborhood. If we all did, something might actually get done.

        (Can you imagine fifty people a day? I said FIFTY people a day walkin' in, singin' a bar ... Oh, wait, that was a different protest entirely. But it just might work ... )

        1. Charles 9

          Re: Sounds like the makings of a witch hunt to me.

          But as a comedian once said, "You can't fix Stupid."

  21. Anonymous Coward
    Anonymous Coward

    Quantum Computing

    And when this finally _really_ materialises, then what?

    1. Doctor Syntax Silver badge

      Re: Quantum Computing

      "And when this finally _really_ materialises, then what?"

      I'm uncertain about that.

  22. User McUser
    Headmaster

    No, no they don't.

    Some laws stop us taking each other's stuff (property, liberty, lives)

    Whoa, slow down there fellow - laws do NOT stop anyone from doing anything.

    Murder and theft have been officially illegal for at least 4,100 years that we know of[1] and yet stuff still gets stolen and people are still murdered every single day in every single country on the planet. Laws merely establish a fixed and uniform punishment for specific acts so that everyone knows ahead of time what the consequences are should they be caught committing one of said acts.

    While some people *might* weigh the punishment against the crime and choose not to murder their coworker or steal that shiny-shiny from the jewelry store, there are plenty of other people who don't perform such calculus, or reach a different conclusion, and thus steal and murder as they please.

    It would be more appropriate to say that laws discourage us from taking each other's stuff.

    [1] The "Code of Ur-Nammu" dates to ~2100-2050 BCE and specifies punishment for (among other crimes) murder, robbery, adultery and rape. Spoiler alert - the penalty for all of them is death.

    1. Charles 9

      Re: No, no they don't.

      "[1] The "Code of Ur-Nammu" dates to ~2100-2050 BCE and specifies punishment for (among other crimes) murder, robbery, adultery and rape. Spoiler alert - the penalty for all of them is death."

      I suspect given the conditions of that society (no such things as jails, for example, and no practical destination for an exile, etc.), death was basically the only option that would stick.

      PS. In the end, laws are just ink on a page. What matters law to a charismatic sociopath able to raise an army big enough to overrun you?

  23. Doctor Syntax Silver badge

    AFAICS this lies at the heart of data protection legislation, which includes control of processing. A good deal of the intent of such legislation is to prevent inappropriate processing. As I said in a comment on another topic, maybe tightening up here might lead banks to re-evaluate branch closures.

  24. Anonymous Coward
    Anonymous Coward

    It is a nice idea however unlikely to ever be implemented

    Sadly the people who will ultimately make the decision on this are virtually guaranteed to have obtained their positions via a way around the rules that apply to everyone else.

    They are not going to make things fair, they have to maintain the status quo else they find themselves unable to compete.

    This is very much what they fear, along with being found out. So no, it will never be allowed to happen, and if ever it did, things would return to how they are now within one generation, and become even more biased within two.

    It doesn't matter where or how you live; only in desperation are people not biased against the stranger, no matter what they claim.

  25. Anonymous Coward
    Coat

    It's much worse...

    "And yet, if the last decades of open-source software have taught us anything, it is that simple availability does not incentivise investigation."

    You don't have to look at open source for that; just look at much more substantial examples. How about projects which are basically built upon pseudo-science and which can be proven to be bollocks by merely applying some simple mathematics to them.

    For example: a project which will allegedly solve the world's water scarcity by extracting it out of the air. People made a project and a nice presentation featuring unrealistic claims ("it'll easily extract 40 liters of water per day"), and as a result they managed to gain a lot of funding, including government funding. Even though it can be proven that the whole concept is flawed and won't work.

    The project even made the news, and hardly any reporter bothered to also look at this from a scientific perspective, or to get someone to do that for them.

    This is about something in plain sight, fully out in the open, yet people still manage to allow themselves to be conned by it.

    So then someone thinks that algorithms need to be more transparent? Uhm, right...

  26. Anonymous Coward
    Anonymous Coward

    Could this idea be more backwards?

    "Let's accept that the rule of law is meant to provide solid ground upon which our society can function. Some laws stop us taking each other's stuff (property, liberty, lives) while others help us swap our stuff in a way that's fair to the parties involved (property, liberty, time)."

    I'm sorry; you must be new here.

    The law is simply a generally-accepted monopoly on the initiation of violence. This may, in theory, be used to help society function as described above. In practice, it throws as few crumbs in that direction as is necessary to maintain the violence monopoly. The overwhelming majority of its energy is directed towards further enriching and empowering those who control the violence monopoly. This attracts people who prefer to improve their position through the unilateral application or threat of violence, as opposed to other groups who prefer (or are at least willing) to improve their position through other means, ranging from fair trade to outright fraud.

    In the US, people complain about having to choose between the openly sociopathic Donald Trump and the openly sociopathic Hillary Clinton - but the system only admits people like this. There is an enormous apparatus of local, state, and federal government that acts as a training and grooming program to weed out the weak and idealistic. Some people do get as high as the US Congress and keep their values mostly intact, but these people are so rare that they don't really mean anything. On the (R) side you have someone like Ron Paul or Justin Amash, and on the (D) side you have someone like Dennis Kucinich. These people are denigrated as loons, and the system tries very hard to reject them. I'm not saying that their positions are right or wrong, merely that they are or were fairly immune to corrupting influences against their values. Amash's own party spent an insane amount of money trying to force him out during the last election. People who cannot be bought and controlled are not wanted in government (note that I specifically exclude Bernie Sanders, who is just as corrupt as the rest when people aren't paying attention - which is most of the time - from the good guy list).

    So then we have laws. Laws are designed to further enrich or empower the people writing them. This is sometimes done directly in a pay-for-play fashion, and other times indirectly by buying votes or otherwise paying enough attention to the outrage of the day to make the unwashed masses shut up and go away. But the kicker is that most of the time it's both - legislation designed to help "protect" the Morlocks from the big bad business people who are... actually the ones who write the laws. Big business then complains loudly, bitterly, and publicly about how unfair the legislation is, the politicians claim Victory for the People, and the media (almost entirely comprised of pathetic, self-important rubes) gushes in admiration. The end result is that Big Business has fewer competitive worries and few (if any) meaningful costs or restraints placed on them. Every once in a while a Sacrifice Must Be Made For the Greater Good (a company gets whacked hard), but again, this is relatively rare and almost always done in favor of some competing oligarchy so that money and power are not lost overall by the political class.

    A lot of people say that this description of laws is grossly unfair, unrealistic, etc. Nope. In my younger days I was rather involved with it. I've literally been in the room where businesses helped write regulations to "improve public (whatever - safety, health, etc.)" in ways that covered what they were already doing and specifically hurt or eliminated their competitors that couldn't afford to keep up with them. The word "monopoly" was originally coined as a privilege granted or sold by governments to give an organization exclusive rights to some sort of business practice. These days, we call it legislating for the public good.

    There are a lot of metaphors for politics; the most useful one I have found is that it's just a specialized form of gang warfare controlling the most rarified form of turf.

    Now we bring in AI, which some people fear will be even smarter and more evil than Big Business. And we should have bigger and badder laws to deal with it! But... if AI is that big and smart, then why won't it game the system even better than Big Business has? Especially when you consider the fact that all of the big, bad AIs will be controlled (at least initially) by Big Business? Does anybody have a plan to make this work that doesn't involve unicorns and fairies?

    We can't even control dumb megacorporations with nation-states. We're pretty thoroughly screwed trying to apply the same techniques to these hypothetical ultra-smart ones. Nation-states themselves are the biggest polluters on the planet, and the worst of the pollution that they don't create directly is the result of them selling off the rights to pollute. They kill more people through violence - by many orders of magnitude - than even the worst corporations. DuPont has killed a lot of people, but they're pikers compared to even a mid-tier national government. Democide (the slaughter of people by their own governments, excluding war) has killed around a quarter of a billion people in the last century alone. It's hard to figure out whether the number of people locked in cages for political or victimless crimes is in the 8-figure range or the 9-figure range. In exchange we get schools (that graduate illiterates by the tens of millions), roads (that cost exorbitant amounts of money to build and maintain, if and when they're actually maintained), national health care (that is escaped by anyone who can afford to), national defense (that creates more enemies and terrorists than it eliminates), and a justice system (whose results are more dictated by the wealth of the defendant than the guilt of the defendant), etc. I realize that a lot of people are quite concerned about losing these "essential benefits," but personally I think that we're already living pretty close to the worst-case scenario there.

    Time to start cracking on the next stage of organizing humanity....

    1. Charles 9

      Re: Could this idea be more backwards?

      But then comes the armor-piercing question.

      "Can you think of any better without changing the human race as a whole?"

      In other words, what you describe sounds like the absolute pits...until we start looking at the alternatives.

      Otherwise, we may be better off just waiting for the Taelons or whatever to come and become "beneficent guardians" for our own protection.

      1. Anonymous Coward
        Anonymous Coward

        Re: Could this idea be more backwards?

        But then comes the armor-piercing question.

        "Can you think of any better without changing the human race as a whole?"

        Oh good grief, how can you not? Nothing is provable in a comment (that requires tons of a priori reasoning followed by experimentation), but there is an absolute mountain of theory to throw at the subject:

        1) A system of ethics (and government is just a practical application of this) should function unilaterally: that is to say, my ability to coexist and interact peacefully with others should not require that the others share my values. Tons of stuff has been written on this.

        2) Ethics are simple. The problem is that people want things or want to do things that can't be accomplished ethically, so we introduce rationalization. Rationalization is complicated. If ethics appear to be complicated, then start eliminating rationalization.

        3) Representative government does not scale well. I don't pretend to know where the line is where it starts to fall apart (and that would vary by culture), but my wild guess would be 50,000 people for an overall entity.

        4) I'm discarding direct democracy, as that's just an ISO-Certified methodology for mob rule.

        5) Forcibly grouping people into blocs based on which map coordinates of the planet they happened to be at when they exited their mother's womb is perhaps not the worst way to organize societies, but it has to be close. Lumping people together based on geography stopped making sense at least 50 years ago, and gets worse every day. Government should be something you join, like a church or a Rotary Club or something, because it actually reflects your values. And you should be reasonably free to change governments fairly easily. If you, in your heart of hearts, want to be a hardcore communist you should be able to go be a hardcore communist. If you want to be the opposite (anarcho-capitalist?), then go do that. If you want something in between, you should be able to find something that suits you. Some people love leisure, and some people love work. Some people love to work with others, some prefer to work alone or in small groups. None of these approaches is inherently wrong, but mixing these types in the same government is problematic at best - people then are incentivized to force others to be Their Type.

        6) Any organizational methodology that involves being constantly at war is almost certainly defective. The rationalization for this is that the world is full of assholes that must be dealt with through military might, but if this never ends then chances are you're the asshole. One thing I've learned through world travel is that most people just want to be left alone in peace. That doesn't mean that they don't get angry or have grievances, but that almost never drives them to violence on its own because most stuff just isn't worth getting you or your family and friends killed over. It's governments (and occasionally religion) that rile them up, under pretenses that almost invariably turn out to be dishonest.

        7) Somebody will always get screwed in the end. If you try to eliminate this entirely, you only guarantee screwing everyone. I would personally value systems that are more quick and flexible with self-correction over systems that are slow.

        And I could probably go on for another 20 pages, but this should be enough to get somebody going.

        1. Charles 9

          Re: Could this idea be more backwards?

          Except every single method you propose has fatal flaws.

          1-2) People will CHEAT and then hide the fact they're cheating. Humans will be bastards if it'll give them a leg up on their neighbors. It's damned near instinct.

          3-4) NOTHING scales well for 7 billion (and direct democracy pretty much doesn't scale past tribal size), and unless you have a system that can encompass EVERY human, one side or the other's going to feel slighted and want revenge.

          5) Eventually, two such blocs will end up at odds. Usually over resources like arable land or women. If it can happen to two people, it can certainly happen to two blocs of people. Geography WILL matter at some point because the Earth is finite. EVERYTHING is finite at some scale.

          6) Except when there are too many people. You eventually end up with a "Baker's Dozen in an Egg Carton" situation: too many people for whatever geography can accommodate. At that point, war isn't just desirable, it's inevitable. Either SOME die or ALL die from sheer exhaustion of resources.

          7) And many humans hold grudges. If you screw someone, you risk an act of revenge, and some people see Mutual Assured Destruction as an acceptable scenario in that context.

  27. Brian Miller

    Article title should be: "Subjecting our laws to algorithms"

    From the text of the article, the laws and sentences are being subjected to algorithms. The problem is that the humans who should be giving the results a second thought and using them as a guideline are instead rubber stamping the results.

  28. Anonymous Coward
    Anonymous Coward

    I think that algorithms are already largely regulated...

    In that they are embedded in products and services. If you sell a product or service with an algorithm that makes that product unsafe, then your company is liable. If you create an algorithm that misrepresents data, as in a financial scam or your average VW diesel, you are liable. If your algorithm unjustly impacts the economic opportunities of one or more racial/ethnic/gender groups, then you are liable.

    Yes, you have to be careful when writing or incorporating algorithms, but the same can be said for code in general, or even many physical components of a tangible product. I worked at a medical device company where our materials people changed out the type of plastic used in a device, (stupidly) without testing the toxicity of the new plastic--oops. The Food and Drug Administration slapped us into next week over that one.

    In terms of public services/government, you can in many cases also sue if an algorithm impacts your benefits/employability/economic or property rights. You can also vote to "kick the bums out" if a specific elected official or group of officials is involved.

    1. amanfromMars 1 Silver badge

      Re: I think that algorithms are already largely regulated...

      Howdy Marketing Hack,

      Do you think an algorithm was responsible for Theresa May calling for a snap General Election today? Or/And is it also a 0day gone renegade rogue?

      1. Anonymous Coward
        Black Helicopters

        Re: I think that algorithms are already largely regulated...

        It's the KGB hacking the British election process!

  29. ocratato
    Headmaster

    Exams

    There are two categories of algorithm that need to be distinguished in this discussion. The first is the traditional algorithm that uses a well-defined set of steps to produce some result, such as an encryption algorithm. The second comprises those using the new deep-learning neural networks, as used by Google for image classification.

    The first category is the software equivalent of a mechanical device - it does exactly what its user asks it to do (assuming no bugs). The second, which is what I think the article's author is concerned about, is more like the software equivalent of a dog - it can be trained to do what we want, but it is not entirely under our control.

    While we can test, or even examine, the first category of software, this is fundamentally impossible for the second. Its learning is distributed across myriad connections, which makes it impossible to examine for "correctness".

    The solution, I suspect, is to test these programs in the same way we would test a living organism - by giving it an exam. An autonomous vehicle, for example, would need to be given a comprehensive driving test. Software for giving financial advice should also be put through tests similar to what a human doing the same job would need to go through. So, perhaps the author is correct - these programs need to be subject to the same sort of laws that apply to us.
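    The "exam" idea above can be sketched in a few lines of Python. This is only an illustration: the `model` function here is a hypothetical stand-in for an opaque trained network, and `give_exam` simply grades it against labelled cases without ever inspecting its internals, the way a driving test grades a learner driver.

    ```python
    def model(x):
        # Hypothetical stand-in for an opaque learned function: in practice
        # this would be a trained network whose internals we cannot inspect.
        return "positive" if x > 0 else "non-positive"

    def give_exam(predict, cases, pass_mark=0.9):
        """Grade an opaque predictor against labelled cases.

        Returns (score, passed) without ever looking inside `predict`.
        """
        correct = sum(1 for x, expected in cases if predict(x) == expected)
        score = correct / len(cases)
        return score, score >= pass_mark

    # A (made-up) exam paper: inputs paired with expected answers.
    exam = [(5, "positive"), (-3, "non-positive"), (0, "non-positive"),
            (42, "positive"), (-1, "non-positive")]

    score, passed = give_exam(model, exam)
    print(f"score={score:.0%} passed={passed}")  # score=100% passed=True
    ```

    The point of the sketch is that the pass mark and the exam paper are the regulatory surface: you certify behaviour, not implementation.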

  30. geoffb_au

    Already Catered For

    Any algorithm is only in place because a human wanted it there. Even considering a sufficiently advanced AI that could create its own, the AI is only there because a human wanted it.

    Our existing legal frameworks, as stated in the article, are tried and tested (although not always perfect or completely fair) against protecting one person from the decisions and actions of another. Any algorithm is just an extension of a person's (or group's) decision to bring it about.

    Got denied a job because an algorithm on LinkedIn found a tweet you made years ago saying Geena Davis has a banging rack? You already have recourse.

    Car crashed into an orphanage because the ABS algorithm failed as you attempted to take a corner? There's already recourse.

    Medical diagnosis algorithm your doctor used wound up prescribing medication that compelled you to respond to articles on the internet? You've already got recourse... oh.

  31. geoffb_au

    Also...

    Could you imagine the nightmare of trying to make your software compatible with the legislation of multiple different jurisdictions? Gawd. Even just good ol' America. Each state would have its own set of ridiculous standards to apply, as well as federal, then the UK, then the EU. By the time it wormed its way down here to Australia, nobody would give a pickled rat's arse. It would be the end of us. We'd be back using rotary dial phones and riding bicycles, even getting our news from bits of paper!

    1. Charles 9

      Re: Also...

      And there are some who WANT that. Slow down the rat race, control the population and the exploitation of the Earth, and all that.

  32. amanfromMars 1 Silver badge

    AIMinds are a'Boggle at the Arrogance of Simple Minds

    A can of quite alien worms has been opened wide right at the heart of core systems of SCADA administrations, Alexander J Martin, and the prognosis for the state of executive health is already decided in the correct interaction and proper engagement with what the future offers via A.N.Other Means and Advanced IntelAIgent Memes.?! ……. AI@ITsWork

    And shared as a question exclaiming fact because as a fiction would it be too stealthy to steer and/or influence with concerned input to output. Although to be perfectly honest, would that choice be a lock only to be opened with the demonstration of one being worthy of handling the key.

    Do you not all find it somewhat perverse and naive of historic politically biased systems imagining they will ever have command and control over future eventing machines with access to more information and intelligence than will ever be designed to flow to and/or through traditional closed elite executive order systems?

  33. GrapeBunch

    Weapons of Math Destruction

    I've opened, but not got too far along with, a book with the above title. It deals with exactly the topic being discussed here.

    I think that a regulatory approach would be as pointless as outlawing stupidity. You have to keep redefining stupidity, and as soon as you think you're done, somebody moves the yardsticks. Even a requirement to file flowcharts, founders on the reasonable contention that the algorithm and its flowchart are trade secrets of the company.

  34. EBG

    for a start

    Hold corporations accountable for their actions, whether generated by human, or by algos ( or a bit of both ).

    I.e. ** enforce existing law **.

    I had a months-long battle with TalkTalk. They sent threatening letters that I could not easily reply to (they instructed me to phone a premium line). Couldn't easily reply, as they did not list the registered address on the communications. This is against the law, and if it is being done routinely by a major company, it really should be an absolute no-brainer for government to prosecute punitively. Why isn't that happening!?

    1. Anonymous Coward
      Anonymous Coward

      Re: for a start

      They don't care.

      For further explanation see comment above: #Could this idea be more backwards?

  35. Anonymous Coward
    Anonymous Coward

    Hmm...

    Presumably they'll use an algorithm to regulate algorithms. So we'll have a collection of algorithms that have one set of rules and the algorithms that we write that have another set of rules.

    So like the Police. Seems logical.
