It's widely believed that Google search results are produced entirely by computer algorithms - in large part because Google would like this to be widely believed. But in fact a little-known group of home-worker humans plays a large part in the Google process. The way these raters go about their work has always been a mystery. …
Interesting job requirements
"This is a Personalized Search Engine Evaluator position. As a Personalized Search Engine Evaluator, you will be given tasks that are generated from your personalized content based on your Google account linked to your Gmail address that you use to register with Leapforce. Ideal candidates will be highly active users of Google's search engine and other products; use Google Play at least once per week; use Google+ more than once per month and have more than 11 people per circle and have a Gmail account with web history turned on."
Interesting that you have to have web history turned on to get the job; I wonder why. Are they going to look it up?
Also interesting that they only list these requirements for jobs outside the US, Canada and Egypt. Employment law issues? Maybe rejecting candidates for their web history or Google+ posts isn't well received by some authorities?
Alas I miss on all of these, shame.
Re: Interesting job requirements
Google will undoubtedly know all about your personal life anyway given all the data they collect.
It's a wonder the above job specification doesn't say "must be an Android phone owner and spend at least 2 hours a day posting annoying crap about iOS and other Android rivals on websites".
post rated quality
moving on - elapsed time 2s
Maybe the ratings these people give pages are used to measure the effectiveness of the main Google ranking algorithm? By comparing how humans rate a page with how their algorithm rates a page, they can look for differences, and hence areas where the algorithms can be improved.
Just a thought....
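The comparison that commenter describes can be sketched in a few lines. This is a minimal illustration with entirely hypothetical data and a hand-rolled Spearman rank correlation (no tie handling): score how well the algorithm's ordering of results agrees with human ratings for the same query, and flag queries where agreement is low.

```python
# Hypothetical sketch: compare human relevance ratings with the
# algorithm's scores for the same results, via rank correlation.
# Low agreement would mark a query as an area to improve.

def ranks(values):
    """Map each value to its rank (1 = highest value)."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rank correlation (no tie correction, for brevity)."""
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical: five results for one query, rated by a human (0-10)
# and scored by the ranking algorithm (arbitrary units).
human = [9, 7, 3, 8, 2]
algo = [0.91, 0.85, 0.80, 0.40, 0.10]

print(round(spearman(human, algo), 2))  # prints 0.7
```

Nothing here claims to be Google's actual method; it just shows that "compare how humans rate a page with how the algorithm rates it" reduces to a simple agreement statistic once you have both sets of scores.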
Re: Devil's Advocate
My thought exactly ... 1,500 humans would never be able to rate the whole Internet, or even a representative part of it ...
Re: Devil's Advocate
The linked article (interview with LionBridge user) states:
One thing I think the SEO community is missing is that this program has nothing to do with SEO or rankings. What this program does is help Google refine their algorithm. For example, the Side-by-Side tasks show the results as they are next to the results with the new algorithm change in them. Google doesn’t hire these raters to rate the web; they hire them to rate how they are doing in matching users queries with the best source of information.
I expect the statistical tooling used to decide where to apply human labellers is indeed glamorous (from a statistician's point of view).
If it looks too good to be true..
Judging from the general thrust of comments on one of the blogs linked in the article, a job as a rater doesn't seem much different from the "work from home" horror stories of yesteryear: the ones where poor or desperate people get sucked into investing time and money in "home-assembling ballpoint pens" or "stuffing envelopes for marketing companies", only to find they either make no money for a lot of effort, or in fact lose money through up-front investment in time that could be spent more productively, purchasing materials, or paying for compulsory 'training'.
Prospective raters appear to be required to take an initial simple test based on the instruction manual (probably to weed out those who can't actually read). After this they are then required to take a 140+ question 'test' evaluating actual sites. What is interesting is that a lot of the posters appear to have to "wait" until some test data is available (why aren't they using a bank of standardised tests?), and when they inevitably fail they are either immediately hit with a request to re-take the test with new data (some several times), or, after a long period of begging, are suddenly 'allowed' to re-take it (also presumably reflecting whether there is a load of actual work at that time, or whether they have to wait until more becomes available). Of course they don't get paid for any of this testing!
A suspicious person might conclude that most of the 'testing' is unpaid processing of new test data supplied by actual customers, in a bid to keep overheads down and profits up. After all, they already have a basic ability to weight the candidate's findings based on the short initial test, and since they don't have to pay these chumps, they are probably amalgamating the results across a whole bunch of them, plus those of a small number of paid testers whom they have already found to be reliable: classic crowdsourcing with a twist.
There are a few really positive comments that reek of astroturfing, and a few more genuine-looking ones from people who are getting paid, most claiming they book 8 or 9 hours of work a week, but that they have to invest significantly more than that in the research side of the evaluation, which they cannot book. One guy claims the going rate is 9 euros an hour (i.e. about a quid more than the UK minimum wage), plus one suspects the 'contractor' is responsible for all tax, NI or other equivalents, plus 30-60-day payment terms. Factor in the unpaid research overheads and you'd probably be better off getting a paper round.
Not the first
This story reminds me of a dodgy website a few years ago that claimed it could robotically convert voice mails to text messages. It turned out they were relying on human drudges in third world countries to do it.
Re: Not the first
Well - "robot" does derive from robota, the Czech word for forced labour.
Really interesting article. Of course, there's still some clever technology involved, but it is interesting to see how humans factor in, even if it's a little frustrating, and adds to the mystique of the algorithm, that we don't know exactly how their input is applied!
recently we've been trending towards looking at the salaries of human content filterers and saying they are underpaid.
are we gonna start beating google with the same "slave labour" stick as we used on apple?
i hope so.
so if you're a sad scumbag with no social links at all you can do this
sums it up
Re: so if you're a sad scumbag with no social links at all you can do this
One of us has this backwards. But I'm not going to re-read the article to find out who.
Re: so if you're a sad scumbag with no social links at all you can do this
though you need at least a few people in your circles, so you need at least some social links
I'm curious as to what this 'research' is that needs to be done and isn't paid time. Is this to understand things that more savvy people already would? Or is it to understand the relevance of the search result if the field is not one that you know? Those are the only reasons I can think of.
There are lots of ways a page could require research, such as having hidden links and search terms that require HTML-level inspection, having lots of links to mostly spam sites, or having suspicious information or references. BTW, your arrogance is offensive. The reasons you can think of don't even scratch the surface.
So to sum up...
The "secret sauce" is made of PEOPLE!
Re: So to sum up...
Oh dear. Just like Soylent Green.
And the question in the back of every Reg reader's mind:
"How do I include the 'porn' rating in my Google searches?"
(Doubtless someone will tell me within seconds!)
Could We Possibly Be More Confused?
The humans rate web-pages primarily as a means of evaluating the effectiveness of the search engine, NOT the quality of the web-site in itself. A good search engine needs to interpret the query and return the most relevant results first. For example, suppose that a domain gets a rep for serving up trash; then search engines might get a mod to downgrade the findings from that domain. This change should not impact the findings from more legitimate sites, other than to improve their placement.
"Google is sensitive to the accusation that contractors could game the system. Matt Cutts insisted last year that "even if multiple search quality raters mark something as spam or non-relevant, that doesn't affect a site's rankings or throw up a flag". So, Google employs a network of site raters, devises a complex manual for them to follow, then ignores their judgements?"
It explains this right there in your link to http://searchengineland.com/interview-google-search-quality-rater-108702
"One thing I think the SEO community is missing is that this program has nothing to do with SEO or rankings. What this program does is help Google refine their algorithm. For example, the Side-by-Side tasks show the results as they are next to the results with the new algorithm change in them. Google doesn’t hire these raters to rate the web; they hire them to rate how they are doing in matching users queries with the best source of information."
So they're using it to compare their 'magical' algorithms. Is that really hard to understand? As they're subjective, how can they rate them _other_ than with humans?
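The Side-by-Side tasks quoted above amount to a pairwise preference test: raters see results from the current algorithm next to results from the candidate change and vote for the better side. A minimal sketch, with entirely hypothetical vote data and labels (`"A"` for the current algorithm, `"B"` for the change, `"same"` for a tie):

```python
# Hypothetical sketch of tallying side-by-side judgements into win
# rates, so a candidate algorithm change can be accepted or rejected.
from collections import Counter

def tally(votes):
    """Summarise side-by-side votes into a win-rate per side."""
    counts = Counter(votes)
    total = len(votes)
    return {side: counts[side] / total for side in ("A", "B", "same")}

# Hypothetical rater votes across ten query tasks.
votes = ["B", "B", "A", "same", "B", "A", "B", "same", "B", "B"]

print(tally(votes))  # prints {'A': 0.2, 'B': 0.6, 'same': 0.2}
```

This is only an illustration of the subjective-comparison point the commenter makes: humans supply the preference judgements, and the statistics over many such votes tell Google whether the change helped.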
My own private Google
The thing that bothers me every time I read a story about how clever (or otherwise) Google's algorithms are is the thought that it is telling me what it thinks I want or ought to hear, rather than the way it is, even in response to non-commercial queries.
For example, presumably it already tells Chinese users that nothing newsworthy happened on 4 June 1989. Does it also only connect Texans with sites that say that the World was created 6000 years ago, that Climate Change is a fraud and that they will be Raptured anytime soon?
Suppose I search for something that is related to what I myself do. Someone in business would do this regularly to find out about their competition. When Google shows me my own pages and those of my immediate colleagues near the top of the list, is this because they are genuinely the important ones, or because it thinks that I will like that?