Dr Hannah Fry: We need to be wary of algorithms behind closed doors

Sure, algorithms are insanely useful, but we need to watch that we don't become complacent and unable to question them, University College London's Dr Hannah Fry warned in an interview with The Register. Dr Fry is a lecturer in the mathematics of cities at the Centre for Advanced Spatial Analysis at UCL, where her research " …

  1. Daniel von Asmuth
    Thumb Up

    Algorithms that sit behind closed doors

    That sounds like 'Closed Source' to me:-(

    1. Anonymous Coward
      Anonymous Coward

      Re: Algorithms that sit behind closed doors

      That sounds like 'Closed Source' to me:-(

      Not at all. I work for a company that uses open source software, but the algorithms I write determine weather or not you get that all important first interview for a job.

      If I balls this up I can totally fuck up your career.

      PS you as the person trying to get the job have no access to the code or the rules that determine weather you are selected or rejected.

      1. Commswonk

        Re: Algorithms that sit behind closed doors

        I work for a company that uses open source software, but the algorithms I write determine weather or not you get that all important first interview for a job.

        I find your involvement in such a project to be deeply disturbing, given that you have used the wrong "whether" twice.

        I sincerely hope your algorithms don't discriminate against people who can't spell correctly, or use the wrong version of words that sound the same, e.g. their / there, its / it's, your / you're and so on.

        Ah yes, homonyms; I knew there was a word for them.

        1. Anonymous Coward
          Anonymous Coward

          Re: Algorithms that sit behind closed doors

          > I sincerely hope your algorithms don't discriminate against people who can't spell correctly

          I doubt it. How would he test it?

        2. Wensleydale Cheese

          Re: Algorithms that sit behind closed doors

          "I sincerely hope your algorithms don't discriminate against people who can't spell correctly,"

          By the form he shows here, I sincerely hope that his algorithms don't discriminate against people who can spell correctly.

      2. DavCrav

        Re: Algorithms that sit behind closed doors

        "If I balls this up I can totally fuck up your career.

        PS you as the person trying to get the job have no access to the code or the rules that determine weather [sic] you are selected or rejected."

        So how do we know you haven't ballsed it up? This sounds like a totally unaccountable, closed-source system that has real-life consequences with no oversight whatsoever. If we don't know your criteria then we cannot be certain you aren't breaking the law with regard to discrimination. And you probably are, since you are apparently writing algorithms to decide whether someone is interviewed based on their personal information.

      3. Anonymous Coward
        Anonymous Coward

        Re: Algorithms that sit behind closed doors

        Whether you use weather or whether matters not to me. I'd like to add that, from what it sounds like, and as with most large software products, you are providing a large framework for the customer to work out the details in their data set; in this case a resume database and analysis system of some sort or other. With this type of arrangement, I think what Dr Fry is alluding to is how that type of customer can alter the conditions and configurations of the analysis to be unfair to any group, based on criteria that may or may not be relevant. And without any outside visibility there could be misuse of the data, or there could be waste. What if their setup is not efficient enough and lets highly qualified candidates slip through?

        Knowing about these types of irregularities could also influence their customers (the job candidates, I suspect) not to bother with them, if they can't manage their quality, or can't do their job placement without being wrong the majority of the time. That sort of thing. These are things you could deduce as a job applicant after several dealings with this type of company, or you could already know through word of mouth, or through independent analysis of the company and how good they are overall.

        For the most part we are stuck with manual assessment of recruiter companies: rather than having a detailed report of who the most successful players are in a given field/market, we just have to try and see. I'd much rather have a system that offers me the best player up front, without me having to judge them individually. Access to the data that holds this info would help that decision, if it were available openly.

      4. Tom 38
        WTF?

        Re: Algorithms that sit behind closed doors

        Not at all. I work for a company that uses open source software, but the algorithms I write determine weather or not you get that all important first interview for a job.

        If I balls this up I can totally fuck up your career.

        PS you as the person trying to get the job have no access to the code or the rules that determine weather you are selected or rejected.

        Algorithms are code. Code can be either open or closed source. If the source code is inaccessible, it is closed source code, even if portions of it are open source code.

        Presuming this isn't software from somewhere like North Korea, then you cannot "totally fuck up" someone's career, because you are not the only people doing this. If your algorithms are bad, then you will not be supplying the best candidates to your clients, and others will be able to supply the good candidates your algorithm rejected; your business would suffer, but the candidates you reject will be perfectly fine.

        1. BebopWeBop
          Thumb Down

          Re: Algorithms that sit behind closed doors

          Rubbish. You assume a perfect market - where is this, eh?

        2. ChrisPv

          Re: Algorithms that sit behind closed doors

          This is all fine and dandy, until this algorithm is included in "industry standard" HR software.

    2. Anonymous Coward
      Anonymous Coward

      Re: Algorithms that sit behind closed doors

      "Closed source" doesn't mean nobody but the developer can look at the code. It just means the code is not open to everybody - I have commercial, "closed" source tools and libraries for which I also have the source code, of course under an NDA. I can't distribute or publish it, but I can inspect it.

      Even Windows code can be inspected if you fulfill the requirements.

      Frankly, the algorithms the article is about require a high level of expertise to inspect - and you can still test a "black box" for bias if needed, as sketched below.

      Stallman's political agenda has nothing to do with the review of algorithms.
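
      To show what black-box bias testing can look like, here's a minimal, hypothetical Python sketch (score_cv is a made-up stand-in for whatever opaque screen is under test, with a bias deliberately baked in for the demo). Hold every field of a CV constant, vary one attribute, compare pass rates:

      import random

      def score_cv(cv: dict) -> bool:
          """Stand-in for the opaque screening model under test."""
          # Hidden rule with a bias baked in, purely for demonstration.
          return cv["years_experience"] >= 3 and cv["postcode"] not in {"E1", "B6"}

      random.seed(1)
      postcodes = ["SW1", "E1", "M4", "B6", "G2"]
      results = {pc: 0 for pc in postcodes}
      trials = 1000

      for _ in range(trials):
          base = {"years_experience": random.randint(0, 10)}
          for pc in postcodes:  # identical CV, only the postcode differs
              if score_cv({**base, "postcode": pc}):
                  results[pc] += 1

      for pc, passed in results.items():
          print(f"postcode {pc}: {passed}/{trials} passed the screen")

      Any large gap between otherwise-identical inputs is evidence of discrimination on that attribute, without ever seeing the source.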

  2. SW10
    Mushroom

    It's exceptional when the exceptions prove the exception.

    these biases and unintended consequences that can end up having a damaging effect

    Modelling the generic case is straightforward; it's the exceptions that cause the heartache. Truth is, of course, there are many more exceptions than you ever realize...

  3. Semtex451
    Thumb Up

    Finally a Fry that actually knows what they're talking about.

    1. DavCrav

      "Finally a Fry that actually knows what they're talking about."

      Were you thinking Stephen Fry? Or Elizabeth Fry? Or Fry from Futurama?

      1. Anonymous Coward
        Anonymous Coward

        Well Elizabeth might pass the test, but assuming they were talking about the article (and the delectable Fry therein)...

      2. Wibble

        His algorithm was being Fryist

  4. TRT Silver badge

    The most obvious one for me...

    would be in the stock market. Trying to predict all those volatile markets based on news reports and quarterly data and the way other brokers are buying and selling. And the speed with which it can all happen too. Wipe out money equivalent to the GDP of a country like Luxembourg within a few milliseconds.

    1. LionelB Silver badge

      Re: The most obvious one for me...

      Trying to predict all those volatile markets based on news reports ...

      That's actually been around for quite a while in financial (algorithmic) trading circles, under the name "sentiment analysis". Whether it actually works is another issue. In the early 2000s I worked for a hedge fund on design of algorithmic trading systems. We investigated sentiment analysis (as it was at the time) and quickly rejected it. It didn't work. Then again, that was pre-Twitter, etc., so maybe there's better mileage in it now.

      To expand slightly on "didn't work" - the basic conundrum with financial (or in fact any) prediction is this: you might think that adding more streams of information ("variables") as input to a predictive model will inevitably increase predictive power. In fact the opposite is more often than not the case. More variables means more model parameters to fit, usually resulting in poorer overall model fit and poorer prediction. To be useful in a predictive sense, an information stream has to overcome this effect; if it doesn't, you are basically just throwing noise into your model. But how to tell whether an information stream will turn out to be useful or just noise - especially when the "rules of the game" are continuously changing, so that you may only base prediction on limited historical data? The answer to this is... voodoo... or "suck it and see" (which can be costly).
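
      To make that concrete, here's a toy sketch (numpy only, made-up numbers, nothing to do with any real trading model): fit ordinary least squares on one weakly informative predictor plus k pure-noise predictors, and watch the out-of-sample error climb as k grows.

      import numpy as np

      rng = np.random.default_rng(0)
      n_train, n_test = 100, 1000

      def oos_error(n_noise):
          """Fit OLS on training data; return mean squared error on fresh test data."""
          def make(n):
              signal = rng.normal(size=(n, 1))
              noise_vars = rng.normal(size=(n, n_noise))  # pure-noise "information streams"
              X = np.hstack([np.ones((n, 1)), signal, noise_vars])
              y = signal[:, 0] + rng.normal(scale=2.0, size=n)  # weak true signal
              return X, y
          Xtr, ytr = make(n_train)
          Xte, yte = make(n_test)
          beta, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
          return np.mean((Xte @ beta - yte) ** 2)

      for k in (0, 10, 40, 80):
          print(f"{k:3d} noise variables -> test MSE {oos_error(k):.3f}")

      Every extra noise variable is another parameter fitted to chance patterns in the training window, so prediction on new data gets worse, not better.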

      ... and quarterly data and the way other brokers are buying and selling

      That's standard algorithmic trading and is ubiquitous, because it does work (or at least it's easier to make it work).

  5. Nolveys

    President Joe once had a dream. The world held his hand, gave their pledge so he told them his scheme for a saviour machine. They called it the prayer, it's answer was law. It's logic stopped war, gave them food, oh how they adored it until it cried in its boredom.

    Please don't believe in me, please disagree with me. Life is too easy. A plague seems quite feasible now or maybe a war or I may kill you all.

    Don't let me stay, don't let me stay. My logic says burn so send me away. Your minds are two green, I despise seen. You can't stake your lives on a Saviour machine.

    I need you flying and I'll show that dying is living beyond reason. Sacred dimension of time. I perceive every sign, I can steal every mind.

    Don't let me stay, don't let me stay. My logic says burn so send me away. Your minds are two green, I despise all I've seen. You can't stake your lives on the Saviour machine.

    1. Nolveys

      ...transcription by Siri.

  6. Dagg Silver badge
    Pint

    The Computer says "No!"

  7. JLV

    So, how does this play with "non-algorithmic" stuff like neural networks? The decisions are embodied in data structures that are not easy to interpret, even for experts, even if they were open to scrutiny.

    An interesting discussion, for sure, but I'm also not sure what exactly is to be done about it. I guess being aware of the potential for problems is a very good start.

    As far as, say, the "wrong hiring rule engine" goes, one would hope that customer firms eventually wise up that they are not being advised in the optimal manner for their needs and shop elsewhere. Though, having seen firms rely on the oddest schemes for hiring advice (handwriting analysis is/was a biggy in France), I wouldn't hold my breath.

    1. LionelB Silver badge

      So, how does this play with "non-algorithmic" stuff like neural networks?

      Damn good question. Strictly, neural network-type systems are not "non-algorithmic" - the network is certainly running an algorithm - but an inscrutable one. Of course the design of the network learning system will be a known algorithm, and the data used to train the network will be known, but the actual decision-making details may well not be. Put simply, it may well be impossible to analyse how/why a network has reached a decision. I find that pretty scary:

      Recruiter: I'm afraid your application was rejected.

      Recruitee: Oh. Why is that?

      Recruiter: Our sophisticated interview application analysis software has rejected you as a suitable candidate.

      Recruitee: Oh... I see. On what grounds?

      Recruiter: We can't say for sure. It's very sophisticated though, based on cutting-edge Deep-Learning techniques.
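
      To make that concrete, a toy sketch (scikit-learn on synthetic data; everything here is hypothetical): the trained network happily emits accept/reject decisions, but the only "explanation" on offer is a pile of weight matrices.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 20))  # 20 anonymous applicant features
      y = (X @ rng.normal(size=20) + rng.normal(size=500) > 0).astype(int)

      clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=2000).fit(X, y)

      applicant = rng.normal(size=(1, 20))
      print("decision:", "interview" if clf.predict(applicant)[0] else "reject")
      print("grounds :", [w.shape for w in clf.coefs_])  # ~5,400 opaque weights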

      1. sebt

        "Recruiter: Our sophisticated interview application analysis software has rejected you as a suitable candidate."

        It's basically about people being too pathetic to actually take responsibility and make the decision themselves. (Because they're desperate not to make decisions: make a decision and you could get sued!). If I don't get a job, there can be all kinds of reasons. Maybe my potential future manager just didn't like me. Or maybe the interviewers decided I wouldn't be a good fit into their current situation or team. These are perfectly valid reasons: it's probably better to act on that at the start rather than trying to work together.

        But people seem to prefer handing over responsibility to something "greater" than themselves, so that they can't be blamed, and to provide some illusion of objectivity and authority. If they shanghaied God into this role, they'd be laughed at. But what they're doing with their systems is no different.

        (deliberately not posting as AC so that these moron "systems" put a black mark against my name)

  8. Banksy
    Angel

    Love her...

    Nothing useful to add on the article but I think Dr Fry is great. Where is the love heart icon?

    1. 0765794e08
      Joke

      Re: Love her...

      Indeed. I must admit that Tori Amos' warm-up, before the piano arrived, was exceptionally thought-provoking and intelligent!

  9. Richard Barnes

    It doesn't have to be computerised

    Surely, in the world at large, public and private sector administrators use rule sets every day whose rules are not open to inspection by the general public? It doesn't really matter whether the processing of these rule sets is automated or not; there is still the possibility of perceived unfairness.

    For example, underwriting rule sets may or may not be computerised, and they will determine whether I can buy insurance or get a mortgage. I don't have access to the rules and I cannot question them.

    What exactly is new here?

    1. 's water music

      Re: It doesn't have to be computerised

      What exactly is new here?

      The scale of the problem. Weapons of Math Destruction by Cathy O'Neil is an interesting lightweight read on the subject.

  10. Hans 1
    Big Brother

    "I would like people to know more that there are limitations. Algorithms and data should support the human decision, not replace it." ®

    https://youtu.be/ARJ8cAGm6JE?t=1m3s

  11. mwnci

    I feel like such a douche... The geek in me is like "Why U no listen?"... I was listening, not to the content, just that voice...

    1. Anonymous Coward
      Anonymous Coward

      Check out the Radio 4 programmes "Computing Britain" and "The Curious Cases of Rutherford and Fry" on BBC iPlayer or Apple iTunes.
