IWF shares 'hash list' with web giants to flush out child sex abuse images online

The UK's telco-backed Internet Watch Foundation has distributed a hash list of child abuse images to the likes of Google, Facebook and Twitter – in an attempt to hasten the removal of such content across the globe. Microsoft's PhotoDNA was one of the technologies used by the IWF to create the hashes, which serve as digital …

  1. Vimes

    Isn't MD5 subject to certain issues? The term 'collision attack' comes to mind, where two different inputs can produce the same hash. Presumably this sort of weakness means that it's possible for innocent images to be caught up in the dragnet.

    And as for anything to do with capabilities provided by 3rd parties, will those 3rd parties be accepting responsibility for any false positives or negatives found as a result of using their systems? Will those using said systems face any consequences if/when problems occur?

    Or (as I would personally expect) will everybody get off scot-free, and all because it's 'for the children'?

    1. Anonymous Coward
      Anonymous Coward

      So you propose they do what?

      1. Vimes

        Avoid using mechanisms that have proven weaknesses for one thing.

        And have some well established way of dealing with errors that doesn't leave webmasters at the mercy of such systems with no way of appealing mistakes (as appears to have happened with internet filtering).

        I'm not objecting to this happening; I just object to this continual lack of consideration of what should happen when things go wrong (as they inevitably do, eventually). Nobody ever seems to face any consequences, and as a result the same sorts of mistakes are made again and again.

        1. Brian Morrison

          As you no doubt realise, if it was flagged as CP originally then that simple fact can be used to make everyone else care not-a-jot about the consequences of that mis-identification.

          It was pr0n then, so it's pr0n now! Filthy, mucky stuff! Ick!!!

          1. Anonymous Coward
            Anonymous Coward

            It was pr0n then, so it's pr0n now! Filthy, mucky stuff! Ick!!!

            Not pornography.

            Images of child abuse. Children, being raped. Children, being forced to carry out sexual acts. Nasty, horrific, criminal stuff.

            Images that are then downloaded and uploaded time after time after time after time. Describing it as pornography denigrates the victim, validates the abuser and conflates your common-or-garden muck viewer with these criminals.

            1. Graham Marsden

              @AC - "Not Pornography"

              "Nasty, horrific, criminal stuff."

              ORLY?

              Samantha Fox and other models like her posed for photographs when they were 16. This was perfectly legal and above board with no exploitation, nothing horrific and no criminality.

              Then, some years ago, the Government of The Vicar of St Albions decided that this sort of thing wasn't acceptable to their prudish moralistic standards and decided to redefine "child" from "someone under 16" to "someone under 18".

              So now such images are classed as "child pornography" or, if you prefer "abuse".

              Of course our witch-burning tabloids were delighted that images of someone who is over the legal age of consent for sexual activity were now criminalised...

              1. Bernard M. Orwell Silver badge

                Re: @AC - "Not Pornography"

                "Samantha Fox and other models like her posed for photographs when they were 16."

                And if you now have a copy of any of those pictures, in physical or electronic format, you are committing an offence against children and may be charged as a sexual offender.

                Can't help but wonder if the images have been redacted in the newspapers own archives....or the British Library service....

                ...Think I may go and throw out or burn that old stack of newspapers I keep in the garden shed. Just in case....

            2. Bernard M. Orwell Silver badge

              And deleting/filtering those images does nothing for the victim and does nothing to apprehend the abuser.

              Again; carpet meets broom and nothing more.

          2. Roq D. Kasba

            Collisions - not really a big deal, as the most fleeting visual confirmation will easily identify one as such

    2. Camilla Smythe

      You might hope or expect that should a match be found it will be flagged for verification by a human operator and that would include a broader investigation for other material that might be collocated with or close to the possibly offending image along with the gathering of other possible evidence.

      1. Vimes

        @Camilla Smythe

        You'd have hoped so, however some companies love automated processes. It saves them money.

        https://torrentfreak.com/google-fiber-sends-automated-piracy-fines-to-subscribers-150520/

        1. Camilla Smythe

          Re: @Camilla Smythe

          I take your point in respect of 'bully boy' tactics, but would hope that given the serious nature of what is being sought and discovered, the bar will be set higher and the follow-up action will be more robust.

          Presumably the hashing is used to identify things like fluffycat10a.jpg as possibly offending material without relying on the file name. I would again hope that it is first verified as offending, and then that someone does a bit of thinking and looks for fluffycat10b.jpg and similar others in the same location.

          Perhaps El Reg can contact them for comment...

          https://www.iwf.org.uk/about-iwf/contact-us

          Journalists with any media enquiries please call +44 (0) 7929 553679.

          "IWF chief Susie Hargreaves said that the hash list "could be a game-changer" in helping to hunt down child sex abuse image peddlers online."

          1. Anonymous Coward
            Anonymous Coward

            Re: @Camilla Smythe

            Isn't the other point, not covered in this article, that these aren't hash keys for images of child porn... at least not in any legal definition? These, at best, are images the IWF considers to be child porn, or likely to be child porn.

            You'll excuse my worries about giving an NGO responsibility for this with no oversight at all.

          2. Vimes

            Re: @Camilla Smythe

            serious nature of what is being sought and discovered then the bar will be set higher and the follow up action be more robust.

            Governments & charities are more worried about missing cases than about creating problems for innocent people or potentially destroying their lives. If you want evidence of that, look on the other side of the pond at their approach to 'no fly' lists.

            http://www.theguardian.com/us-news/2015/aug/10/us-no-fly-list-predictive-assessments

    3. durandal

      Or (as I would personally expect) will everybody get off scot-free, and all because it's 'for the children'?

      While I'm not one especially worried about the Paedogeddon that is always around the corner, this system is working to remove actual images of child abuse. You know, the pictures of children being raped and whatnot.

      Even if the scheme was put together with no consideration of the possibility that a hash of a given image might raise a false positive somewhere, the very nature of the work means that the image would need to be looked at by a live operator, who is quite capable of telling the difference between a tourist picture of the Colosseum and an image depicting child abuse. Remember, the hashes work at file level; it's not an automated comparison of the actual image content, so you're not going to get odd false positives from the computer being unable to work out the context of a given image.

      If the worst comes to the worst, a hypothetical image sharing site decides that they'll simply disallow uploads if a file hits the checklist, and you'll end up not being able to share that photo of Our Margaret with that horrific sunburn, how we laughed when Our Kev swapped the factor 50 for tanning oil.

      1. Vimes

        @durandal

        then the very nature of the work means that the image would need to be looked at by a live operator who is quite capable of telling the difference between a tourist picture of the Colosseum and an image depicting child abuse.

        What do Microsoft or Google do between receiving the hash & finding a page with images that match and then getting around to examining them? Just de-list whatever page is hosting them?

      2. JP19

        "You know, the pictures of children being raped and whatnot."

        Or maybe just album covers.

        You say "you know", but he doesn't know; hardly anyone does, because we are not allowed to.

    4. Anonymous Coward
      Anonymous Coward

      Strictly, a "collision attack" means deliberately crafting two documents that share the same MD5 hash (crafting a document to match one specific existing hash is a "second-preimage attack"), rather than two random documents happening to have the same hash. The only use for MD5's vulnerability in this particular context would be for a producer of abuse images to flood the Internet with perfectly innocent images crafted to have the same hashes as the illegal ones.
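For the accidental case, a back-of-the-envelope birthday-bound sketch in Python shows just how unlikely a chance MD5 collision is (the database size here is an assumed order of magnitude, taken from figures quoted elsewhere in this thread):

```python
# Birthday-bound approximation: the probability that at least two of n
# random inputs share the same 128-bit MD5 hash is roughly n^2 / 2^129.
n = 30_000_000  # assumed order of magnitude of the hash list discussed here
p_accidental = n ** 2 / 2 ** 129

print(f"{p_accidental:.2e}")  # ~1.3e-24: effectively never by chance
```

In other words, any collision that matters here would have to be deliberately engineered, not stumbled into.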

    5. Adam 1 Silver badge

      > The term 'collision attack' comes to mind where two values can produce the same hash.

      A hash algorithm by definition MUST permit collisions where the size of the hash is smaller than the size of the input data.

      Let's use small numbers to illustrate. If your hash was just 1 byte in length, and your input was 4 bytes, you have 256 possible hashes to share amongst 4-billion-odd input possibilities. SHA-1 is, from memory, 160 bits, which gives 1.4615016e+48 hashes. That is a big number* but much, much smaller than the number of possible arrangements of bytes in a valid JPEG file.

      * citation needed

  2. Anonymous Coward
    Anonymous Coward

    Very Noble

    But IWF's watch list is infamous for cock-ups (knocked out Wikipedia) and potentially open to abuse by big content by having it's remit expanded.

    1. Anonymous Coward
      Anonymous Coward

      Re: Very Noble

      "IWF analysts, who have the gruelling task of sifting through photos and videos showing children being sexually exploited"

      Maybe some of the analysts get off on it. In fact, maybe it is a good form of employment for pedophiles. With the right brain monitoring equipment there is no need to click the "abuse detected" button. Less analysts, more images.

      1. Anonymous Coward
        Anonymous Coward

        Re: Very Noble

        @AC

        Hardly, it's like scraping up dead road accident victims then informing their next of kin they are deceased; a dirty job but someone has to do it.

  3. ratfox Silver badge
  4. Suricou Raven

    Ok, pedos:

    You all know what a password-protected zip is, right?

    Seriously, anyone with child abuse imagery would have to be a moron of legendary scale to upload the pics to Facebook.

  5. JimmyPage Silver badge
    WTF?

    Am I being a bit thick here

    Or could this whole castle of sand be skewed by changing, at the very least, a byte or two of data in the source image?

    I appreciate this might have some - limited - effect. But it's hardly the dragon-slayer it's being touted as.

    1. Camilla Smythe

      Re: Am I being a bit thick here

      "As of the end of 2014, the Home Office's CAID national database held 4.4 million images and over 30 million hashes."

      Given that the number of hashes exceeds the number of images, it would seem they are in some way dealing with possible duplicates. The article info-graphic also mentions PhotoDNA hashes "to identify images even if the image has been altered".

      "Dragon-Slayer"... OK it might be flawed or subject to avoidance by other means but then 'crime investigation' often relies on someone making a dumb mistake resulting in leads elsewhere.
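One plausible explanation (an assumption on my part, not something the article confirms) for the hash count exceeding the image count is that several digest types are stored per image, as this Python sketch illustrates with made-up image bytes:

```python
import hashlib

image_bytes = b"\xff\xd8\xff\xe0 stand-in JPEG payload"  # hypothetical image data

# One image can yield several database entries if multiple hash types are kept.
digests = {name: hashlib.new(name, image_bytes).hexdigest()
           for name in ("md5", "sha1", "sha256")}

for name, value in digests.items():
    print(name, value)  # three distinct hash entries for a single image
```

Add PhotoDNA hashes of altered variants on top of that and 4.4 million images could plausibly account for many times that number of hash entries.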

      1. Vimes

        Re: Am I being a bit thick here

        The article info-graphic also mentions PhotoDNA hashes "to identify images even if the image has been altered".

        Which in turn suggests the possibility of false positives.

        For that matter who gets to oversee the work of the IWF?

        1. Camilla Smythe

          Re: Am I being a bit thick here @Vimes

          Granted, that one popped up in my head as I was typing. Again it needs human intervention in order to make sure and investigate further. Apparently you cannot FOI them, which seems a bit strange given they are supposedly delivering a 'public service' which will have a serious impact on others should they fuck things up.

          https://petition.parliament.uk/petitions/105322

          Not mine, someone else's and it makes sense to me.

          1. Vimes

            Re: Am I being a bit thick here @Vimes @Camilla Smythe

            It's not just FoI that's the problem here. There seems to be a lack of regulation & oversight, even by government never mind the general public.

            Who oversees the IWF? What rules do they have to abide by? What consequences do they face when they publish erroneous entries on their list? (without any such consequences you can bet there's no incentive to stop making those mistakes)

            This is the sort of thing I'm talking about: it looks like there has been zero consideration put into what should happen when mistakes happen.

            And they will happen - that much is inevitable.

    2. Old Handle

      Hashes

      The MD5 and SHA-1 hashes will be totally useless if the image is changed, but the PhotoDNA version is apparently designed to resist this. Of course that kind of fuzzy matching will necessarily have a greater chance of false positives.
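PhotoDNA's algorithm is proprietary, but the general idea of a robust image hash can be sketched with a toy "average hash" in pure Python (the pixel values are made up for illustration, and this is emphatically not PhotoDNA itself):

```python
def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255); returns a bit string."""
    avg = sum(pixels) / len(pixels)
    return "".join("1" if p >= avg else "0" for p in pixels)

def hamming(a, b):
    """Number of positions at which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 210, 25, 230,
            12, 205, 35, 215, 20, 225, 18, 208]
tweaked = original[:]
tweaked[0] += 5  # a tiny edit, e.g. recompression noise

h1, h2 = average_hash(original), average_hash(tweaked)
print(hamming(h1, h2))  # small distance, so still treated as a match;
                        # an MD5 of the raw bytes would differ completely
```

A real system compares that distance against a threshold, which is exactly where the false-positive risk comes from: set the threshold too loose and unrelated images start to match.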

    3. Adam 1 Silver badge

      Re: Am I being a bit thick here

      > changing at very least a byte or two of data in the source image

      It wouldn't even take a full byte. Even a change as subtle as #FFFFFF to #FEFFFF would almost certainly* produce radically different MD5 and SHA-1 signatures.

      * it is possible that the signature won't change, in the same way you might win the lotto, then on the way to pick up your winnings an asteroid shoots down toward the spot where you are standing, only to be blown to smithereens by a coincidental lightning strike.
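The claim is easy to verify in Python, using two byte strings that differ in a single character:

```python
import hashlib

a = b"#FFFFFF rest of the image data"
b = b"#FEFFFF rest of the image data"  # one character changed

print(hashlib.md5(a).hexdigest())
print(hashlib.md5(b).hexdigest())
# The two digests bear no visible resemblance: cryptographic hashes are
# designed so that any change at all scrambles the whole output.
```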

      1. hugo tyson
        Go

        Re: Am I being a bit thick here

        I think that if you change one bit, it is practically impossible for the signature/hash not to change. With a proper sig/hash/crypto, if you change any single bit of the input, an unpredictable (to an attacker) 50% of the bits in the hash will change (invert).

        If you change 2 bits, it's the same as changing 1 bit and then another: first 50% of the output bits change; then some proportion p of those (mean 1/2) change back, while a proportion 1-p of the others change, again making a 50% change on average. Repeat for each further bit, and by recursion any change to the input, up to and including complete replacement (whatever that means), changes, on average, half the bits in the output.
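That ~50% figure (the "avalanche effect") is easy to measure in Python; the ASCII characters '0' (0x30) and '1' (0x31) differ in exactly one bit:

```python
import hashlib

def bit_diff(x, y):
    """Count differing bits between two equal-length byte strings."""
    return sum(bin(a ^ b).count("1") for a, b in zip(x, y))

# Two inputs differing by exactly one bit ('0' is 0x30, '1' is 0x31).
d1 = hashlib.sha1(b"flip the last bit: 0").digest()
d2 = hashlib.sha1(b"flip the last bit: 1").digest()

# A one-bit input change flips, on average, half of SHA-1's 160 output bits.
print(bit_diff(d1, d2), "of 160 bits flipped")  # typically near 80
```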

  6. adnim Silver badge

    "in an attempt to hasten the removal of such content across the globe."

    Nope, it just stops it being indexed and forces it further underground.

    Here comes controversy...

    I myself do not understand how an adult can be sexually aroused by images of naked children, let alone by images of children being abused. Unfortunately there are those that do get aroused by such images. Simple images of naked children are generally harmless to the child. Images of children being abused are something else entirely. Sexual abuse of a child causes long-term psychological damage to the child's understanding of love and sexual relationships, and indeed serious problems with the formation of relationships, which often last into late adulthood.

    I do not have a problem with a paedophile masturbating to images of naked children; perhaps this release will stop some physical abuse, or perhaps it will exacerbate the problem and cause those with such desires to go on to commit abuse of children. Then again, not everyone seeking a high goes on to the next level, else everyone that had a pint or a spliff would be an alcoholic or a heroin addict. I do have a very serious issue with an adult physically abusing a child for sexual gratification, and believe they should be kept away from children.

    The view of society is to vilify those that do get aroused by images of naked children. It is certainly right to vilify those that get off on images of child abuse, or perhaps even to treat them; I see it as a total lack of empathy and a mental illness.

    I thought long and hard before I posted this, because I expect outrage at some of what I have written. Before all the down votes flood in, I wish to state that I speak from experience: not as a paedophile, but as a victim.

    Can't find a suitable icon, beginning to get depressed now, I should fuck off, shut up and think of something else.

    1. JimmyPage Silver badge
      Childcatcher

      No outrage here

      But sadly, morals and evidence-based policy hardly (if ever) coincide.

      Drugs and sex are two of the most legislated areas of human behaviour, despite being, in the main, activities which take place in private.

      Bear in mind, the more hysterical and pitchfork-wielding society becomes, the harder we make it for people who might realise they need help to seek it. Thus increasing the risk to children.

    2. Anonymous Coward
      Anonymous Coward

      "Simple images of naked children are generally harmless to the child."

      IIRC even clothed pictures of children are officially in the illegal categories if someone regards the pose as "provocative". Most family albums - at least in previous generations - would contain something that could be judged technically to fall into that category.

      The problem with human nature is that a person's ego is driven by rewards. Tell them that they are doing a good thing by finding suggestive pictures and they will get their mental high every time they read a picture in that way. Even if their emotion is purported to be disgust, they are still getting an emotional kick out of it. In extremes they deliberately seek out that which they can loudly claim disgusts them. One Chief Constable said that if anyone on his specialist team started to judge pictures dispassionately, then he took them off the team.

      Power over others corrupts, as most jobsworths' behaviour will confirm.

      1. Anonymous Coward
        Anonymous Coward

        @A/C

        You should see the images of young children some sports governing bodies put out in the name of publicity. They make the children look like whores in their skimpy outfits and layers of make-up, and then they wonder why paedos are attracted to coaching children.

      2. Bernard M. Orwell Silver badge

        Doesn't even have to be a real child/person; a drawing will suffice....

    3. JP19

      "The view of society is to vilify those that do get aroused by images naked children."

      Yes because we are no longer allowed to vilify or hate anyone else.

      Peedyfiles are the only race/creed/sexual orientation we are allowed to hate with full blessing of the authorities.

      Just look at how much time and money the authorities expend finding new peedyfiles for us to hate, even dead ones.

      1. Bernard M. Orwell Silver badge

        Re: "The view of society is to vilify those that do get aroused by images naked children."

        "we are no longer allowed to vilify or hate anyone else."

        Nah, we get to hate the Terr'sts too. And the poor. And the lazy scroungers. And disabled scroungers. And immigrunts. And furreners who are weird. And lefties.

        Lots of material available for the Mandatory Daily Hate.

    4. Anonymous Coward
      Anonymous Coward

      I am afraid that you will not find much sympathy for your ideas on access to non-sexual media of children. The ideal would have been media where real children did not have to be involved at all, i.e. drawn media. But in the UK this was made just as illegal and objectionable as media showing real children, even though there were no victims.

      I believe there was a Freakonomics-type study on this which found that as such material was restricted in various countries, the incidence of abuse went up.

      If people really cared about protecting children, then discussions along these lines, and some studies to confirm or disprove the above, could be had. But no, this is all vile and has to be torched, even if a more pragmatic approach may actually result in less abuse.

      Also, people have no interest in the details of what is being done to protect against abuse. They just need to be told that something will prevent abuse and they are happy, even if it may not when you look into the details (i.e. it may give false positives, etc.). Decreasing abuse should be the only goal, even if the action needed to get that result is counter-intuitive.

      1. Anonymous Coward
        Anonymous Coward

        "If people really cared about protecting children, then discussions along these lines, and some studies to confirm or disprove the above, could be had. But no, this is all vile and has to be torched, even if a more pragmatic approach may actually result in less abuse."

        The over-emotional voter is repeatedly prodded by the Government seeking an easy majority. Thus even as cannabis is heading for legalisation in the States, in the UK it is still ranked with heroin, and the police will still smash your door in at dawn because the Cabinet pointedly wants them to. And though medically it is a wonder-drug, the sick and dying can get fucked. It is (and always has been) about rabble-rousing and only rabble-rousing, for votes.

    5. Graham Marsden

      @adnim

      Thank you for that post.

      Of course the fact is that, just as with groups such as Alcoholics Anonymous, Gamblers Anonymous etc., there are ways of helping people come to terms with their desires and control them with the support of others. Regrettably, these come up against the twin problems of the NIMBYs, who would say "well, yes, I suppose these people should be able to get treatment, but not anywhere within 100 miles of a child", and the tabloid media, who would take great delight in "outing" any such organisation and broadcasting its location and membership to every witch-burner and vigilante out there.

    6. Camilla Smythe

      @adnim

      "in an attempt to hasten the removal of such content across the globe."

      Nope just stops it being indexed and forces it further underground.

      Back in the day there was alt.svens.house.of.12.year.old.lust via Deja News. Somewhere along the line Google took usenet over in order for people to go Nikie Drop Shop on 'The Eight'

      Errm..

      https://www.google.co.uk/search?hl=en-GB&source=hp&biw=&bih=&q=alt.svens&btnG=Google+Search&gbv=1

      Seriously WTF?!1!?

      Been there, do not fully remember it, and it does alter your life even if you do not remember it.

      Can't find a suitable icon, beginning to get depressed now, I should fuck off, shut up and think of something else.

      Hang Tough.

    7. Anonymous Coward
      Anonymous Coward

      >The view of society is to vilify those that do get aroused by images naked children. It is certainly right to vilify those that get off on images of child abuse or perhaps even treat them [...]

      Out of interest, what about those who have a sexual attraction to children, but know it's wrong to act on it, e.g. by molesting them and/or possessing photos and recorded footage of abuse?

      I personally have no problem with people who have an attraction to children but only masturbate to a fantasy of their neighbour's daughter (or whatever) in their head, and who know full well that it's wrong to act on it.

      And "treating" (a.k.a. suppression through force and bigotry) people like that just seems to me to be a form of social pressure and thought crime. ("How you feel is wrong!", "You need help!", etc.)

      It's also no different to how most people in backwards-ass places feel about those who have an attraction to members of their own gender. (And for those of you who didn't bother to read: I'm specifically talking about attraction, not sex, which would be an act.)

  7. Anonymous Coward
    Anonymous Coward

    I personally can't help but wonder how far this will go.

    If it's going after photos of children being sexually abused then fine.

    But if they start targeting drawn Japanese artwork and the like under some pathetic excuse that it's "child pornography," which it isn't, regardless of what the law says, then I don't agree with that.

    Then who's to say they won't go after other things they don't like?

    I just foresee this being abused.

    1. Anonymous Coward
      Anonymous Coward

      "I personally can't help wonder how far this will go."

      The answer is Thought Crime. It is already the case that a picture can be judged not on its obviously innocent content - but on what is alleged to be in an accused person's mind when they look at it.

      In previous times that would have made some shopping catalogues illegal for certain people to have in their house.

    2. This post has been deleted by its author

    3. Anonymous Coward
      Anonymous Coward

      I think that they would have a problem with targeting drawings, as they are only illegal in a minority of countries, and nowhere to the same extent as in the UK, where anything that 'looks' under 18 is illegal. So identifying this material would be pointless, as the country where the image is posted would just ignore any notices saying the material should be taken down.

      Blocking such images for UK users also presents a problem. The idea is that the blocks are supposed to be temporary and limited: most countries agree that non-drawn CP should be removed from the internet, so there is normally not a lot out there and the block list stays small.

      If the IWF's block list had to include drawn material, which is rather more prevalent since it is not illegal elsewhere and is not going to be removed, the blocklist would become huge. A huge block list is unmanageable and processor-intensive, and on the IWF's budget implementing it would cause a big internet slowdown. They would also have to block pretty much all .jp domains.

  8. Anonymous Coward
    Anonymous Coward

    Slippery Slope

    1) Journo publishes politically embarrassing photo.

    2) Politician gets the hash of the image added to the list.

    While a visual inspection might do the job, why are you visually inspecting presumed CP?

  9. Anonymous Coward
    Anonymous Coward

    Partially-funded ...

    ... by a bunch who cannot be trusted to police themselves.

    1. Anonymous Coward
      Anonymous Coward

      Re: Partially-funded ...

      to divert attention from themselves?

  10. Tiglath

    Flawed understanding of the database!

    The article authors have a flawed understanding of hashing. Each image in the database possesses its own hash value; therefore if the database holds 4.4 million images it should only hold 4.4 million hashes. So what are the other 25.6 million hashes of, if they are not tied to an image? And if law enforcement has the ability to find child porn, they also possess the ability to delete it with those same hash values, without the Gothic Melodrama of arrest.
