Yahoo! couldn't! detect! hackers! in! its! network! but! can! spot! NSFW! smut! in! your! office?

Having laid bare over half a billion usernames and passwords through meager funding and witless indifference, Yahoo! is putting its faith in artificial intelligence to protect people from bare skin. Yahoo! engineers Jay Mahadeokar and Gerry Pesavento said in a blog post on Friday that the company has released an open-source model …

  1. Tomato42
    Boffin

    False detections?

    I'm betting that false positive and false negative rates are above 50%

    1. Anonymous Coward
      Anonymous Coward

      Re: False detections?

      The number of pictures of chairs and tables we had to unblock was madness... and if you liked persons of a darker shade, you were good to go.

  2. Gene Cash Silver badge

    Old joke

    I'll bet Yahoo "can't tell an ass from a hole in the ground"

  3. Simon Harris

    Automatic porn detector.

    Can I have one of those please?

    Saves me the trouble of having to go and look for the stuff myself.

  4. Kevin McMurtrie Silver badge

    Open sourcing my version too

    import java.util.regex.Pattern;

    // Anything served from a *.yahoo.* domain is, by definition, not safe for work.
    private static final Pattern NSFW_MATCHER =
        Pattern.compile(".*\\.yahoo\\..*", Pattern.CASE_INSENSITIVE);

    public static boolean isNsfw(String url) {
        return NSFW_MATCHER.matcher(url).matches();
    }

    It could probably use some more training data but it's working fairly well.
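
    A quick sanity check, for anyone who wants to run it (the wrapper class name and test URLs are my own additions, not part of the snippet above):

    import java.util.regex.Pattern;

    public class YahooNsfwFilter {
        // Wrapper class name is arbitrary; the pattern and isNsfw() are from the comment above.
        private static final Pattern NSFW_MATCHER =
            Pattern.compile(".*\\.yahoo\\..*", Pattern.CASE_INSENSITIVE);

        public static boolean isNsfw(String url) {
            return NSFW_MATCHER.matcher(url).matches();
        }

        public static void main(String[] args) {
            System.out.println(isNsfw("https://www.yahoo.com/news"));     // prints true
            System.out.println(isNsfw("https://www.theregister.co.uk/")); // prints false
        }
    }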

  5. Anonymous Coward
    Anonymous Coward

    This has been worked on for a LONG time

    When I worked at a university in the mid/late 90s, a couple of the professors were working on image processing to determine nude pictures. Then there's the question of what is "NSFW", which might not require full nudity. Even wearing clothes that are see-through or too tight, or a suggestive pose, could be NSFW, while a naked person in the right setting might be OK. It is a judgment call even for humans; an algorithm will never master it.

    1. Anonymous Coward
      Anonymous Coward

      Re: This has been worked on for a LONG time

      Everyone is nude under their clothes.

      1. Anonymous Coward
        Anonymous Coward

        Re: This has been worked on for a LONG time

        Also under their feathers ... right, Sam?

      2. CAPS LOCK

        "Everyone is nude under their clothes."

        Well, you may be naked under your clothes, but proper people such as myself have a thick coat of woad.

    2. Anonymous Coward
      Anonymous Coward

      "I know it when I see it"

      The finest of legal minds gave us this from the US Supreme Court:

      I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description ["hard-core pornography"], and perhaps I could never succeed in intelligibly doing so. But I know it when I see it...

      and that's not for want of trying. So the idea that a bunch of image filters will usefully judge whether Michael Phelps poised on the starting block is more or less unsafe than something plainly salacious but better clad is, shall we say, arousingly provocative...

      But, as so often, it will come down to legal tickboxing: run our special script and then bask in the security of having adopted "industry best practice", regardless of actual outcomes.

      1. Destroy All Monsters Silver badge
        Gimp

        Re: "I know it when I see it"

        A typical American problem.

        Also, I fap to pictures of fully clothed, deliciously gender-ambivalent black Olympic runners.

    3. Anonymous Coward
      Anonymous Coward

      Re: This has been worked on for a LONG time

      "Then there's the question of what is "NSFW","

      As elastic as the notion of "work" itself. For the algorithm to work, it would need to know the HR policies of the viewer's company and the status of the viewers themselves.

      One place I worked, an employee was viewing porn. It turned out that he was having to stay late because the collection lorry was frequently late, but being staff, he wasn't being paid overtime. And there was no official HR policy.

      We showed him how to delete the browser cache in future.

    4. Anonymous Coward
      Happy

      A couple of the professors were working on image processing to determine nude pictures

      That's what they told you anyway. Funny how, despite the huge volume of input data amassed, it never worked.

    5. Anonymous Coward
      Anonymous Coward

      Re: This has been worked on for a LONG time

      Then there's the question of what is "NSFW",

      Reminds me of the days when I used to stand in for an IT director at an SME.

      We acquired a new HR director from one well known (for their reverse Midas touch) UK telecoms operator. On her second day in the job she realised that I was not running any "NSFW" filters on web traffic, only AV and malware detection. She came to me insisting that we deploy them, because "you know, some people may be looking at stuff".

      I told her: 1. NO; 2. Enumerate the "some people". Five minutes later she came back with the CEO in tow.

      At that point I took them into the kitchenette, opened the copy of the Sun that had been left in the middle of the breakfast table to page 3, and explained to her that _THIS_ _WILL_ _BE_ the gold standard for _ALLOWED_ (this was in the days when there was still a page 3), unless she banned it, put the order officially banning the Sun from the premises in writing, and stapled it to the kitchenette notice board.

      She was the color of a plum tomato by the end of the discussion, with half of the company (present and watching) laughing their asses off.

  6. Mark 85

    Given the reports from Facebookers that I know, this may already have been handed to FB. It seems war, injury, and death pics are OK and never get taken down, but any hint of a nekid human gets swift attention.

    1. Anonymous Coward
      Anonymous Coward

      The Facebook presence of ISIS will collapse if ever they change their opinion on modest dress and start beheading naked people.

  7. Nolveys

    That's right, Yahoo shareholders, this is where your money is going.

    At least I won't have to look at those horrible pictures of crying, naked children running down Vietnamese streets anymore.

  8. ultrastarx1

    nsfw, I'm living the dream: every day my boss shows up and she's nsfw, grrrrrrrrrrrr

  9. allthecoolshortnamesweretaken

    "The NSFW model is designed to take an image and output a smut probability between zero and one ..."

    Smut ptobability... now there's a phrase.

    Can I have 100 pictures that are 38.9% NSFW, please?

    1. Commswonk

      Smut ptobability... now there's a phrase.

      Indeed there is. Perhaps you can tell us what it means.

      Read it carefully before replying...

    2. Wensleydale Cheese

      "The NSFW model is designed to take an image and output a smut probability between zero and one ..."

      I'll have a 0.5 please, Carol.

  10. A Long Fellow

    A solution without a problem

    Let's take this off the computer screen and see how much sense it makes:

    "We're going to post a guard at the entrance of the building who will check everybody's person and carried items to ensure that NSFW items do not enter the premises."

    If it's senseless in meatspace, then it's senseless on the computer screen.

    Ffs.

  11. Snake Silver badge

    Meh

    I have no problem with this as long as *I* get to say whether it is on or off for my account. In that case it is just another tool; if not, it's enforced censorship and any and all companies that enforce it can stuff off.

    1. Anonymous Coward
      Anonymous Coward

      Re: Meh

      How is a company mandating what their computers can and cannot be used for censorship?

      1. P. Lee

        Re: Meh

        >How is a company mandating what their computers can and cannot be used for censorship?

        BYOD, 4G.

        Hmmm, can someone still be sued?

        Probably.

        The issue isn't really censorship, it's "bother, the IT industry has nothing left to sell, so we gotta make some snake oil and find a way to flog it." Selling people things they want to buy into is probably fairly easy. Hence, "A technical solution for X" is an easy sell.

        1. Anonymous Prime

          Re: Meh

          FTFY:

          'Hence, "A technical solution for XXX" is an easy sell.']

      2. Suricou Raven

        Re: Meh

        It's a tool. From a technical perspective, there's not much difference, other than scale, between the corporate filter that keeps employees off Facebook and the national firewall that keeps the people from reading news about how corrupt and oppressive their government is. I'm sure Saudi Arabia, Iran, Pakistan and (what's left of) Syria are looking at how to plug this technology into their filters to make sure all those corrupting Western women with low-cut tops don't get through. They might need to retrain it to local standards.

  12. anthonyhegedus Silver badge

    Rather than output a probability between 0 and 1, perhaps an angle between 0 and 90 degrees?

    1. Hans 1
      Thumb Up

      >Rather than output a probability between 0 and 1, perhaps an angle between 0 and 90 degrees?

      Thanks, made my day ... though you did forget the joke icon ... probably why so few noticed ...

      Here's a thumb!

    2. Darth.0

      @anthonyhegedus

      And with this comment, you, sir, win the Internet today. Thanks for the chuckle.

  13. Pete 2 Silver badge

    What it needs most is a good UI

    > a classifier for NSFW detection, and provide feedback to us on ways to improve the classifier."

    So essentially Yahoo are building a search engine for porn?

    I'm surprised it took someone all this time to get around to that.

    All it needs now is a snappy name.

  14. Paul Woodhouse

    How many years ago was it that Censornet was doing exactly this? And very, very well too...

  15. Nameless Faceless Computer User

    why is this still a thing?

    Why is looking at porn at work still a thing? Back in ancient times, porn seekers would use the company Internet connection since "the net" was either unavailable at home or was only available on dial-up. Today we've got 3G, 4G, and LTE on our mobile devices - which are often faster than the company network because of all the damn filtering.

    1. Anonymous Coward
      Anonymous Coward

      Re: why is this still a thing?

      ... Just a hint from your friendly BOFH that you would probably want to switch WiFi *off* before doing a pr0n trawl on the company-supplied mobile!

      If the smut doesn't go over the company network, the BOFH might not have to do anything about it.

  16. CAPS LOCK

    Oh thank Dog Yahoo are here to save us from internet porn...

    ... what have we done until now?
