Google flings another £1m at online child sex abuse vid CRACKDOWN

Google has pledged a further $2m (£1.27m) to encourage the development of tools that seek and destroy online child sex abuse material. The advertising giant is keen to prove to the British government and the popular press that it's doing all it can to eradicate the vile images and videos from the web. By throwing money at the …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Anonymous Coward

    Eh?

    The company said it had "used 'hashing' technology to tag known child sexual abuse images".

    I'm not sure how this could possibly be legal. In order to identify an image as child abuse, a human would have to look at it, and the only people who are allowed to see these images are the police. I also assume that using "hashing technology to tag" really means maintaining a database of hashes; if they were actually tagging images, they'd have to be uploading them back to the Internet, which would be an extremely serious offence.

    1. noboard

      Re: Eh?

      I was thinking a similar thing. "Google has pledged a further $2m (£1.27m) to encourage the development of tools that seek and destroy online child sex abuse material." How are you meant to create a system to identify and block suspect images without having a boat load of images to test with in the first place?

      Plus, how would you test that a solution works? I pity the poor soul who has to search for those kinds of images to make sure they're blocked.

      1. Anonymous Coward
        Anonymous Coward

        Re: Eh?

        Maybe they just run the algorithm over the database that the IWF will already hold. No human would then need to "view" the images.

        It doesn't take much to add a bit of common sense to this news item. Or are people intentionally misreading it because it involves Google?

        I work in IT support. Often in small business and people's homes. If I spotted an image *I* am not the one who gets arrested. Which is what you are implying. My job is to notify the authorities - either directly or anonymously. That is common sense. Here Google takes that extra step of trying to automate a system that allows them to automatically spot images without a human having to view them.

        1. Yet Another Anonymous coward Silver badge

          Re: Eh?

          The problem is that - even if this were possible - the list is supplied by a "charity" with unknown political aims, unknown backers, and no review process.

          They already blocked wikipedia over an album cover you can buy in HMV, their counterparts in other countries have blocked gay marriage and breastfeeding sites.

          1. NogginTheNog
            Thumb Down

            Re: Eh?

            "They already blocked wikipedia over an album cover you can buy in HMV, their counterparts in other countries have blocked gay marriage and breastfeeding sites."

            But they've also blocked hundreds, probably thousands, of pages of sick shit as well. So after the odd cock-up or dubious judgment you want to throw the baby out with the bathwater (sorry for the questionable metaphor there!)?

            1. Yet Another Anonymous coward Silver badge

              Re: Eh?

              I want some official government authority with judicial review and a known political agenda - not some "charity"/religious group/think-tank with quasi-government power to ban anything they don't like.

              "Won't somebody think of the children" is not a valid reason.

    2. Anonymous Coward
      Anonymous Coward

      Think it through...

      This is an IT Technology site, not the Daily Mail, so think it through a bit.

      If Google spots an image, they will make a hash from it - reducing that image to a unique number using an algorithm. Once they have a record of the image as a NUMBER they would logically report the image to IWF \ Police etc and then block it from a search.

      Now if every image indexed is passed through that same algorithm by the computer, any time it comes out with the same result the computer can flag it up for the human to check, block, and pass on to the IWF.

      Yes, a human would have "viewed" the image. But upon viewing, they report it, which complies with the law. It is what the human who views the image does next that makes all the difference in law.

      This isn't some kind of Facebook tagging of images to upload to the net. They mean they are keeping a record of the bad images in a format that means they can be spotted automatically, without the need for some poor guy to keep looking at this kind of nasty smut.
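The hash-database idea described in this comment can be sketched in a few lines. This is a minimal illustration, not any real product's implementation: the blocklist, function names, and digest value are made up for the example (the digest shown is simply the SHA-256 of the bytes `b"test"`). The point is that a file can be checked against a list of known-bad digests without the file itself ever being stored or redistributed.

```python
import hashlib

# Hypothetical blocklist of hex digests of known-bad files.
# The entry below is just the SHA-256 of b"test", for illustration.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Reduce a file's bytes to a fixed-length digest (the 'number')."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Flag a file if its digest matches a known entry; the file is never kept."""
    return sha256_of(data) in KNOWN_BAD_HASHES
```

Note the limitation raised elsewhere in this thread: a cryptographic digest like this matches only byte-identical files, so any re-encode or edit defeats it.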

    3. Old Handle

      Re: Eh?

      The IWF aren't police either. How do they get away with it?

  2. Thomas 4

    This isn't the DM's style

    I'm fairly sure the DM would rather that £1,000,000 was used to hire hitmen to murder suspected "paedo scum" before they get to court.

    1. Anonymous Coward
      Anonymous Coward

      Re: This isn't the DM's style

      Given the number of "OOh! Look at X...all grown up!" (ie, has sprouted tits) stories in the DM; I'm not totally convinced that they are the right people to be policing something like this.

  3. Anonymous Coward
    Anonymous Coward

    Surely the issue lies with the hosting firms rather than the list compilers?

    1. Lamont Cranston

      ^ Very much this.

      Holding Google somehow responsible for these images seems like a massive cop-out to me. Then again, I wouldn't be massively surprised to learn that the UK government (or the Daily Mail) doesn't really know how the internet works.

      1. Waspy
        Joke

        Re: ^ Very much this.

        The Daily Mail not having a clue about any subject you care to choose yet having a very strong opinion on it? Never!

      2. tomjol

        Re: ^ Very much this.

        Does Maria Miller strike anyone as the type of person who understands how the Internet works?

      3. Anonymous Coward
        Anonymous Coward

        Re: ^ Very much this.

        Agreed,

        it's like holding BT responsible for dirty phone calls. Mass censorship isn't going to catch those responsible; the only way is to catch them through well-funded international criminal investigations.

        As ever the content MAFIAA will try to hijack this and use it for their own purpose which I find almost as distasteful as the crimes the IWF is seeking to prevent.

        1. Anonymous Coward
          Anonymous Coward

          Re: ^ Very much this.

          Actually, Google/the ISPs are being put in exactly the same situation that BT are in. If someone makes nuisance phone calls, BT are obliged to put in place filtering and interception to do their best to prevent the calls from getting through.

      4. James 51

        Re: ^ Very much this.

        You beat me to it. It's a personal bugbear when I hear someone on the radio (usually a middle-aged or elderly talking head) say it must be possible for Google to stop this (they did it for China!).

      5. ratfox

        Re: ^ Very much this.

        The reason people accused Google is because they have money to spare, and they care about their public image… And hey, it worked.

        Google probably had a look at the subject and immediately decided to shell out the dough rather than have these people nag them any more.

    2. Anonymous Coward
      Anonymous Coward

      If you index the web then surely you are obliged to omit the illegal parts?

      I don't remember the Yellow Pages having brothels and drug dens listed last time I looked.

      Of course, if you see a search engine like a phone book then yes there are dodgy people listed in the phone book. But it lists their name, address and number, it doesn't say "drug dealer" next to it.

      1. Mark .

        Should maps and street view blank out those dodgy places too? Your Yellow Pages example is where people buy commercial advertising, which doesn't seem relevant here. The analogy would be more if someone was using a phone number for naughty purposes, and expecting Yellow Pages to have to check every phone number just in case.

    3. Anonymous Coward
      Anonymous Coward

      Hosters not listers

      Maybe, but see how many of them you'd get to come and talk to the government about doing something...

  4. JimmyPage Silver badge
    FAIL

    Yet curiously ...

    we are told it's not possible to identify orphan works ...

    1. Gordon Pryra

      Re: Yet curiously ...

      hmmmmm

      I'm guessing they just forgot that one. It would be pretty funny to see Google in the dock, accused of stealing someone's work, with the evidence against them compiled by their own product.

    2. Anonymous Coward
      Anonymous Coward

      Re: Yet curiously ...

      "we are told it's not possible to identify orphan works"

      Owww, that was baaaaad :). Thumbs up, I saw what you did there.

  5. Anonymous Coward
    Anonymous Coward

    £1m? I suspect that will come off their tax bill.

    That barely covers Schmitty's bonus.

  6. Chazmon

    humans

    The poor sods who look at this stuff to establish whether it is illegal are generally students. Apparently student nurses last the longest before giving up, as it is revolting work. The time they last is more likely to be measured in minutes than hours, and in hours than days.

    There is currently a lot of work going into creating detectors to spot when the same image has been modified or file type changed to save some poor sap from looking at it again.
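The "same image, modified" detection mentioned in the comment above is usually approached with perceptual hashing rather than cryptographic hashing. Below is a toy average-hash sketch over a hypothetical 8x8 grayscale grid; the data, names, and thresholding are invented for illustration, and real systems (PhotoDNA and the like) are far more sophisticated and robust.

```python
# Toy "average hash": a perceptual fingerprint that survives small edits,
# unlike a cryptographic hash, which changes completely if one byte changes.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    with one bit per pixel: 1 if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")

# A synthetic gradient "image" and a slightly brightened copy of it.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
tweaked = [[min(255, v + 3) for v in row] for row in original]
```

Here the uniform brightness shift moves every pixel and the mean together, so the bit pattern, and hence the Hamming distance, is unchanged, whereas a byte-level hash of the two images would differ completely.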

    1. Anonymous Coward
      Anonymous Coward

      Re: humans

      I read something a few years ago about some sort of "recognition software" that was being developed and had a good hit/miss ratio. It still involved some human confirmation, but it massively reduced the amount of exposure the checkers had to have. I think it had the features you mention as well.

      A colleague of mine moved jobs to a police computer forensics place, apparently his job included recovering stuff like this. I said to him they couldn't pay me enough.

    2. Anonymous Coward
      Anonymous Coward

      Re: humans

      And yet, some humans must be looking at those pictures voluntarily, or they wouldn't exist in the first place. Isn't it ironic that people who don't want to see the pictures have to see them to make sure people who do want to see them can't? I realize there are certain problems with the "obvious" solution... it's just funny how the world works sometimes.

  7. Anonymous Coward
    Anonymous Coward

    Finding these images seems like the perfect job for

    sociopaths. They are unlikely to be affected, and it would keep them away from the rest of us.

  8. Crisp
    WTF?

    Don't we pay the police to bring criminal scum to justice?

    Why do we need to pay charities to do the job that we're already paying the police to do out of our taxes?

    1. Anonymous Coward
      Anonymous Coward

      Re: Don't we pay the police to bring criminal scum to justice?

      You might as well ask why the financial services sector has to pay the police to actually investigate cheque and card fraud.

      The answer is that the police weren't doing it; in the case of CP images, possibly because they weren't being hosted in countries where we could do anything about it. The solution is to set up a QUANGO with a remit to make a list of images or servers which host this content.

  9. Tim Worstal

    Hmm.

    Given that Google doesn't actually spider the entire web or internet, this will keep such stuff out of Google's index, sure, but it won't wipe it off the internet/web.

    And which sick porn host leaves robots.txt open to search engine bots anyway?

    1. Henry Wertz 1 Gold badge
      Devil

      Re: Hmm.

      "And which sick porn host leaves robots.txt open to search engine bots anyway?"

      Wouldn't you like to know? Just kidding.

  10. Henry Wertz 1 Gold badge

    Alrighty then

    "If Google spots an image, they will make a hash from it - reducing that image to a unique number using an algorithm. Once they have a record of the image as a NUMBER they would logically report the image to IWF \ Police etc and then block it from a search."

    We did think it through. At the point of seeing the image (to report it), the person filing the report has already committed several offences... 1) Possession of child pornography (it's in the browser cache). 2) Transmission of child pornography (the image has almost certainly bounced around a bit in Google's "cloud" as the site is crawled, the image seen, and the hash generated). 3) A possible further offence when the link is sent on to the police and/or the IWF. Obviously the cops are not going to go bust Google and the IWF for massive pornography violations. But for the sake of argument, what legally makes them not just a bunch of giant perverts who want all the illegal pornography on the internet?

    Anyway...

    Regarding uk.gov's question, it doesn't seem like Google has any obligation whatsoever. They are a search engine, not a hosting provider, and not an ISP. That said, if they wish to throw in a mil for this it's OK by me and I do hope it's put to good use.

  11. sena.akada
    Facepalm

    Here we go..

    Yep, here we bloody go. It starts off about child porn, and now it's on to 'hate speech'. What next? Governments just cannot be trusted with this stuff can they? Ugh..

    1. Anonymous Coward
      Anonymous Coward

      Re: Here we go..

      Yes, imagine that politicians - the people we elect to rule us - think that the Internet should have the same publishing laws as the real world.

      Both child porn and hate speech are illegal, why the Internet is some special case that shouldn't be covered by "real world" laws is something I just don't understand.

      1. Tim Jenkins

        Re: Here we go..

        "why the Internet is some special case that shouldn't be covered by "real world" laws is something I just don't understand"

        But "child porn and hate speech" are culturally defined, NOT global absolutes, and the Internet is, well, global, so which "real world" laws do we apply?

        What the UK defines as "hate speech" is constitutionally protected in the USA. Images defined as "child porn" in the UK are available in Japanese newsagents. Meanwhile Sky broadcasts films which end up fuelling murder (http://www.dailymail.co.uk/news/article-2332571/April-Jones-murder-Mark-Bridger-watched-violent-slasher-film-rape-scene-obsessed-child-porn.html), but apparently that's OK because it wasn't viewed via the interweb...

        I don't know how this all gets dealt with, but simplistic thinking doesn't help.

        (Sorry about using the DM URL there; the same story was in all the others, but it's always worth it to bring up the sidebar of shame alongside whatever issue they are currently being disgusted by)

        1. Anonymous Coward
          Anonymous Coward

          Re: Here we go..

          We use local laws to enforce local internet use. You wouldn't expect to be able to buy a gun over the internet here, even though guns are legal in America, so why are child porn and hate speech any different?

  12. MigMig

    And so the internet war rages on

    Forget the fact that there are people out there that are willing to post this stuff online. There's nothing more noble than sweeping things under the rug, in this case with censorship. It's this type of noble behavior that's inevitably followed by witch hunts.

    1. Anonymous Coward
      Anonymous Coward

      Re: And so the internet war rages on

      How do you suggest that publication of child porn in another country is dealt with? Should people in this country just be allowed to access it, because it's the Internet and the Internet is special? Or should it be blocked from access, because we can't go into other countries and get the people posting it? Even if it's people from the UK posting the material, it's vanishingly unlikely that our police will be able to get server logs from the sort of organisations that are happy to host child porn.

      So, allow any material in any country to be accessed in the UK? Or make an attempt to prevent material which our society has decided is illegal from being accessed in the UK, and prosecute those who post it from/in the UK?

  13. BlindFaith
    Big Brother

    Going Underground

    Read the reg for years, but this is my first ever posting.

    One thing sprang to mind reading this. Let's assume that the image is somehow encoded (presumably based on its size and pixels). This metadata surely becomes redundant the moment anyone alters the image.

    This whole thing is utter lunacy. It's about taking over the web, not child safety. And the tragic irony is that it will just hasten the creation of an 'under-web': anyone who actually wants to express themselves (philosophically, religiously or politically) - remember, it's political reasons that are behind all this - will be driven to a different underground platform.

    It's prohibition all over again.
