Last week, connectivity-hardware maker Belkin admitted that one of its employees had been using Amazon's Mechanical Turk hiring service to pay for positive - and false - reviews of Belkin products. While we congratulate Belkin for quickly admitting its employee's unethical behavior, and while we can only assume that said …
Three kinds of "reviews" to avoid...
"I think the product photo looks really cool, and I'm gonna buy one if I ever get a job!" -- The majority of user-generated reviews have owned the product for less than a month. Of COURSE they think it's great: it takes longer than that to realise you've been conned by slippery marketing.
"I m a journalist with "Well Known Newspaper" and my Editor has given me three days to install this product, test it, and write the review. The manufacturer says I can keep the software if they like the review. But no, I don't use this kind of software in my daily job, so I really have no clue what I am writing about." As a former journalist, I received such "opportunities" more often than I care to think. But sadly, none of my 'reviews' were ever published after I included a disclaimer similar to the above :-)
"As a Senior Editor for "Well-Known-Website" I can tell you this product is the bee's knees. Joe Sleaze, Marketing Director for Universal Widget, says the product may work after it is unpacked. Mr. Sleaze said considerable usability testing had been conducted. "Seven out of the 12 users we tried it with were able to get the product to display the splash screen," Mr. Sleaze said. "Less than half of our selected user testers had a PhD in Computing Science. It's a re-written press release. Read the Microsoft Reviewer's Kit for their latest product, then count the number of direct quotes from it that you see in the national press.
At least Paris Hilton has been rated by multiple users who have actually owned the product... As well as hundreds of thousands that haven't...
Of course both matter. Expert reviews are definitely necessary for expensive purchases. Opinions of friends count too. User reviews are also helpful, especially if you want to see how fast stuff breaks down.
I found that a lot of experts don't take price into account, e.g. the more expensive phones generally score higher even if you pay relatively way too much for what you get.
Oh and I never listen to the guy in the shop and never trust a single site. Sorry.
Most UGC is barely useful: you have to dig through pages of crap for a review that says more than just 'it's good/bad'. That's not a problem on sites like Amazon or Play, but those 'shopping' sites that tout this crap as 'reviews' need shutting down. They pollute the search results and are factually incorrect or incomplete (if you're lucky!).
User generated reviews
UGR's are complete rubbish and you should never take any notice of them.
I didn't really think that through, did I?
There are other ways
Metacritic is great. You can usually sort the wheat from the chaff by not only looking at the average score, but by reading the comments.
Several games I've bought over the years have received a less than favourable review score on Metacritic, but if you read the fan comments (and disregard the stupid ones like "Sony lolzors 111!!1111!!!Roflcopter!!!" etc.), you get a more rounded approach.
Read the data, do some analysis using your brain and see if you can get something you enjoy, rather than something that some reviewer (and editor!) was paid to say they enjoyed.
Sense of Clarkson
As Jeremy Clarkson once wrote:
"When presented with the opportunity to be a reviewer, people think they have to either gush or damn. Hand them a choice of giving a rating of anything from one to 10 and all you get are ones and 10s. Six, in the world of amateur reviewing, does not exist".
He's not wrong. Exhibit A: the iPhone.
Both can suck
Like on Dabs, for example. I once read a comment where someone gave a hard drive 1* because of the 1000->1024 thing. You also get comments like "I bought this product, put it in the dishwasher, hit it with a sledgehammer and then lit it on fire. Now it doesn't work. Shoddy product, never buying again." Admittedly, anyone reading those can tell that the writer is a dribbling moron to be ignored, but if you are in the list view and sort by rating, a few of those reviews can sink a product to the bottom of the list.
That's not to say expert reviews can be trusted either: all you need is an expert "mac fanboi" to review a Windows PC and you will get a bad review even if it's the best PC on the planet.
takes great pictures
DPReview is a good example of a topic that absolutely requires some kind of formal methodology. The site's forums are packed with people who (a) bought a camera the day before and think it's great because they're blinded by the novelty or (b) spent £2,000+ on a camera system and are hardly going to admit that they made a terrible mistake. There are a few good blogs that post thorough tests of lenses and cameras, but you can never be entirely sure that they haven't made a mistake, or that their testing was flawed.
But then again DPReview is not without controversy. They have made a few boobs over time, although they generally admit this and fix it; there was an ongoing saga a while back regarding the site's poor rating of the Canon 50D's high ISO performance, but I suspect that there are so many variables involved in testing photographic gear that a dedicated individual could cast doubt on any test. On the balance of probabilities, sites such as DPReview and Photozone.de are more reliable than "this cam takes great pictures"-type user reviews.
As Mr Sentient points out above, they do tend to ignore price; I am sure that e.g. the Canon 24-70mm f/2.8L is a wonderful lens, but it's out of reach of most people. The kind of people likely to buy it are either professionals who were in the queue when it was announced, and are capable of making their own judgement, or they are rich amateurs who just want a big, expensive-looking lens. On that level a review would be futile.
On the other hand...
UGC reviews are not tainted with backhanders or marketing bull.
Users do just that - use the products - whereas so-called expert reviewers will only use the product for a very short period (if at all).
I trust UGC over so-called experts, but one of the best review sources is the support forum for each product. Whilst the initial posts in these places are usually about a problem, the resolution can tell you much more about a product than any of the hands of hacktards that have already been crossed with silver.
Experts can be just as biased because of bribery. Look at the videogames industry - so corrupt it's never worth reading any reviews from the big game magazines, they're all in EA's pocket.
Tech and gadgets: funnily enough, most 'expert' reviewers gloss over any faults and present items as amazing (just so they can keep their free one) - then the user generated reviews point out all the faults!
Presumably if you're after spending big money on something, it's better to find out about it yourself (by knowing the subject and getting your hands on it) than rely on some 'expert' to pull the wool over your eyes so he can get his brown envelope.
Got your dolby turntable and your bag on your head!
UGC is often just noise
One of the things I most often find myself trying to find is a comparison between products A and B, to see which one is probably going to be a better buy for me.
Most consumers won't have, say, two different SLRs from different brands and be able to give an objective opinion on them. Instead, most of the reviews for A will say 'This thing is awesome! A symbol of the second coming! 10/10 *****' and most of the reviews for B will say 'This is sooo much better than A. It is brilliant in all the ways that A is not. 10/10 *****', along with a smattering of disappointed punters who will bitch about some minor problem that befell them and award 1/10.
So asking most people to rate something is pointless. Unless a product is genuinely awful, all you'll get is a bunch of knee-jerk reactions, worthless anecdotes and fanboy tribalism.
Moreover, the popularity of customer reviews, combined with the vast number of execrable price comparison websites that choke Google's search results, makes searching for product reviews largely futile. Everyone seems to be hoping for a slice of the amazing, omniscient and, most importantly, *free* UGC pie, and what everyone seems to be getting is a thousand pages of 'Be the first to review this product'...
This and that
I find the lack of knowledge about the reviewer a problem on some items: e.g. when reviewing an entry-level camera, it would be very useful to know whether the reviewer has ever owned a digital camera before; both groups give good but different ideas about the camera. So perhaps a few background checkboxes on reviews would help - nobody seems to bother, though; easiest is best.
As regards false reviews, I suspect that on Amazon there may be review wars: to get your review to the top, slag off the others. I'd hope there were tools in place to spot that, but I doubt it. Cheapest is best likely rules again.
I love it
this story is awesome, it comes on a page with a little red banner, really neat. Can't recommend it enough, in fact The Register as a whole is great. 5 stars. (Can I have my laptop now please.....)
When I started messing about with 'pooters, all we had were magazine reviews (actually, when I started there weren't any magazines - still have issue #1 of PCW somewhere +10 saddo points).
One memorable review I saw in *another* mag was for a device that they freely admitted didn't work - yet it still got 3 stars out of 5. Given that magazines are entirely self-serving and rely heavily on advertising and the goodwill of producers to provide equipment for review, I formed the conclusion that they are, shall we say, exceptionally generous in their judgements.
One review format I did like was in a Spanish computer magazine, where they took the overall star rating they had assigned and divided it by the price, to give a value-for-money rating.
Online reviews can be the exact opposite - extremely negative regarding all aspects of a product merely because the package arrived a day late. They can also be utterly uncritical of a product (cough, Apple, cough) if the reviews are done by the good ole fanboys. You also have to be aware of "drive-by" reviews, where people simply click on a yes/no vote for products based on their own biases without even using the product - GAOTD gets lots of complaints about that.
When all is said and done, I suppose the best route is to follow some of the more respected online reviewers. Plus, of course, knowing what it is you actually want the product to do.
Stop it. Please.
Can we have a moratorium on Yelp quotes please?
Every time Ms. Ichinose gets a credit round here, the Monty Python "Smoketoomuch" skit plays in my head, gives me the insane giggles and worries my colleagues.
pinch of salt!
Like anything, take people's comments with a pinch of salt. Most reviews don't actually describe the product, but merely people's interaction with it. Think of the iPhone: everyone raves about it until it comes to text messaging, where they suddenly become strangely quiet. Why? Because they don't like it, but never use the function.
Personally, I find it strange that a Belkin employee NEEDED to write reviews, as their stuff is pretty good anyway, just a bit overpriced.
Importance of Forums attached to review sites
DPR is an interesting example. The reviewers and people who run the site are well versed with cameras and do their best to come up with objective comparisons. However, even they have their particular slant on things - not a bias as such, but placing more emphasis on some aspects of camera performance than others. For example, they place a lot more emphasis on JPEG processing on top-end DSLRs than do most serious enthusiasts I know who use RAW.
What adds huge strength to the site (and, no doubt, a lot of annoyance) is the very active forums which are associated with the site. Inevitably they contain a huge range of people - some simple brand-fans, but many with lots of expertise and experience. In many cases quite a lot more than the people running the site.
It is certainly the case that expert reviewers with access to laboratory resources can produce more consistent reviews, but that doesn't mean that they always come to the correct conclusions or make appropriate recommendations.
Yelp is a counter-example...
It's worth bearing in mind that user-generated reviews are generally very polarised - essentially, people write when they have something they want to say, be that bad or good. Hence for many products you will generally see more 1-star and 5-star reviews than 2-4 star reviews; averaged out, though, the net effect is broadly the same.
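The averaging point is easy to demonstrate with a toy sketch (all ratings below are invented for illustration): a heavily polarised distribution and a moderate one can produce exactly the same average score, so the headline number hides the shape of the opinions.

```python
# Toy example: two invented rating distributions with very different
# shapes but identical averages - one is all 1s and 5s, the other
# clusters in the 2-4 range.

polarised = [1] * 40 + [5] * 60            # love-it-or-hate-it reviews
moderate = [2] * 20 + [3] * 20 + [4] * 60  # measured, middling reviews

def average(ratings):
    return sum(ratings) / len(ratings)

print(average(polarised))  # 3.4
print(average(moderate))   # 3.4
```

Which is why a histogram of star ratings tells you more than the average alone.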
The other thing you didn't mention is something that El Reg has itself covered in some detail: when review systems are gamed by the system-owner. In particular, Yelp has (undenied) form for (claiming to) game the system so that bad reviews or good reviews may appear at the top. This opens a shady door, and one wonders what other actions sites such as Amazon (which has already removed low-ranking reviews before) take.
Not one mention of Apple?
The Appstore is the pinnacle of useless user reviews. Even with their recent "you must have downloaded this to review it" patch the content is mind boggling.
The greatest app ever can crash ONCE and it gets one star, which can quickly drop it off the top 50 if it's one of the first reviews. The "reviews" read like YouTube comments most of the time.
Amazon's system is pretty good; the users have often prevented me from falling for manufacturers' hype and lies. And they are often good for a laugh, like this fine piece:
Hoist by his own Bayard
"we can only assume that said employee - identified by Engadget as bizdev rep Michael Bayard - has either joined the growing ranks of the unemployed or is now on a very short lease"
Why would you assume that? If there was the slightest chance they'd fired his ass over this, they would have wasted no time in telling us so. Quite why they haven't, and why they think they can get away with that, is a total mystery to me.
"As Ichinose says, 'The community self-polices themselves'"
Always better than self-policing someone else entirely.
How can you trust anyone or any site?
Even if they have 25 positive reviews on one site and 50 on another with no bad reviews anywhere - all it means is that there were at least 75 people who were paid to give good reviews.
Things are getting bad when people are prepared to change the opinions which they reached through reason, just because someone offers them a little money.
Madness of crowds
"The madness of crowds" predates "the wisdom of crowds" by a couple of hundred years, is far more readable and provides much more evidence to support it's central thesis: that mob behaviour rapidly falls to the level of the lowest common denominator.
I've never understood why anyone gives the phrase any attention at all; a brief look around proves that humans function even less logically when operating in groups.
I only really take notice of the negative comments
Most negative review points are fairly petty, 100 people saying "I don't like the on button's blue light, it should be red" or something equally banal is easily ignored. 5 people saying that it blew up after 10 minutes is worth taking note of.
The positive reviews tend to enthuse (or should that be euthanise?) about things without saying much.
Oh, and most importantly, never watch The Gadget Show if you want to find out how good something is. Each time I've watched that execrable programme, I've been appalled by its sheer pointlessness. My two favourite/most hated reviews: one for the Wii, where the bald one leapt around for two minutes like a giddy schoolgirl and that was the review - they never showed the Wii doing anything - and the other the MP3 player review, where the only judging criterion was "How quickly can you copy a track from a CD to the player?" Sound quality, battery life, interface design, capacity, supported playback formats, the colour of the power cable, the length of the headphone cables... nope, none of these matter. Fools.
Not only is UGC often done by interns at the company, but how often are so-called expert reviews done with an obvious bias - cameras get glowing reviews when they take out double-page spreads for adverts; games are a hit when the reviewer is jetted off somewhere exotic to play them. These reviewers know that even a slightly tarnished review means no free junkets for them in the future. Magazine editors know that it's the bottom line that counts, not whether product A is actually better than product B.
UGC is just more cost-effective for the marketing department, but it's no less (or more) truthful than professionals writing about a product.
Paris - what the fcuk happened to her marketing department ?
I agree with Sentient
Both are useful.
User reviews are good provided you read enough to get a reasonably balanced view.
Expert reviews give a lot more detail and are more likely to cover specific areas of interest (e.g. if I buy a camera to do macro shots, I don't care if its quality in panoramas is crap).
It's all about the reviewer's objectivity and credentials
Both professional and user reviews have their place (although it is a great pity that so few publications now do long-term tests), but both can be misused: in the case of the professional, by political factors and time; in the case of the user, by inexperience.
A decent reviewer should be :
1) Experienced. Both in the field the product/service is aimed at, and also in a range of similar products. Most people don't like to admit their own purchases were poor, and can't afford to test lots of kit.
2 a) Incorruptible. The number of professional reviewers swayed by a nice lunch or long-term loans is probably non-trivial.
b) independent. Their editor should not be able to overrule their decisions based on political factors such as advertising.
3) Not into hifi, av or wine. If you can find a truly decent and objective review of any of those, you're quite lucky.
4) Intelligent and motivated (rather than stupid and lazy). Go look at computer-modding reviews of heatsinks and fan controllers. Now realise how many reviewers a) don't test heatsinks inside cases and b) rate stupid bling that needs adjusting manually over a non-flashy controller that does it for you. Muppets.
It is left as an exercise for the reader to decide if 4) is actually a combination of a lack of 1-3.
I personally wouldn't be without either, as I use an average of all reviews to let me decide if something is worth buying.
Bad reviews are more useful
It's generally easy enough to read a bad review and decide if the poster is either mad or unlucky, but any good review could be a plant. If no one can think of anything substantial to say against a product that's usually a good sign.
Best of all is to ask a friend who has one and forget about what the crowd thinks either way.
Like most of the web, it has its place...
I have never gained anything worthwhile from user-generated reviews of popular products, but quite often, when you get into a more niche area, the people who are ready to write reviews are members of the community interested in that topic and tend to have a bit more insight about the area. This is also an area where in-house editors typically don't exist or don't have enough knowledge about the field to be helpful.
So yes, the wisdom of crowds may be basically the intellectual output of imbeciles, but once you get into less crowded areas, you find nooks and corners where the people who participate in communities know enough about the activities involved that user-generated content really comes into its own. This way of connecting people outside the mainstream has always been a significant facet of the power of the web.
It's always wise to remember
That people are idiots.
For example the best selling 'newspaper' in the UK is 'The Sun'.
So if you believe what the majority of people say about a product, you are basing your analysis on the thoughts of a group that predominantly thinks that the Sun provides an accurate and intelligent view of the world.
On the flipside though, 'experts' can also be idiots.
Take the reviews for 'Indiana Jones and the Kingdom of the Crystal Skull'...
Veteran movie mag 'Empire' gave it 4/5, 'Total Film' gave it a still respectable 3/5.
The average user review on the Total Film site gave it a far more realistic 2/5.
So what conclusion can we draw from this? The public at large are in essence a bunch of semi-literate idiots... But often still know what they are talking about better than some self-proclaimed 'experts' !
Like PC game reviews in magazines?
We know how reliable THEY are, don't we?
"The Appstore is the pinnacle of useless user reviews. Even with their recent "you must have downloaded this to review it" patch the content is mind boggling."
You beat me to it. Christ, those reviews are a load of shit. I can only assume that they are from spoilt 13-year-olds with iPhones.
DONT GET THIS GAME!
I dont believe how much I hate this game! I actually hate it more than the sound grenade app!!
For starters, the sound grenade app isn't a game, it's a gimmick - amusing, but not a game. Secondly, the game in question is being sold for £0.00 and is fairly good, but because it's not Halo, it's shit.
or for iStethoscope (Free too)
I don't like it at all. I don't even know where this "mic" is. It is useless to me
That's probably because you are on an iPod Touch and it doesn't have one. Thanks for bashing this guy's application (which you can't use) and letting the world know you are a tit.
Amazon's legendary No. 1 Reviewer!
Anyone ever looked into the legendary Harriet Klausner? Perfect example.
She is the No. 1 reviewer for Amazon. She posts on average 30+ book reviews... a DAY! She says in interviews that she is a very good speed reader. One day, I think, 100 reviews went up.
However, yes, it is too good to be true. Her reviews are now seen as pure comedy value, as folks then rip her reviews to pieces. All the books she reviews get 5 stars, but often the reviews don't even relate to the book or are just cribbed from the back of the jacket (and not that well, either).
It's quite obvious she doesn't read these books at all.
The words fraud and hack spring to mind, but it's caused no end of publicity for her and just slowly destroys Amazon's value as a consumer site. Why someone hasn't stopped her, I don't know.
Yes, but they can be relied on to tell you if it will still work after it's been hit with a missile or dropped into an abyss.
Funny it should be Belkin, eh?
I think it's fair to say that over time, those who have the capability will be able to smell the bad reviews (syntax, grammar, badly formed arguments, drool on the text, etc) and those who don't will continue to post crap reviews. So as well as the UGC hosters having some form of filtering, we'll all get some degree of it for ourselves. There's evidence of this in the comments above, with plenty of people noticing subtle trends between sites and their approaches' effects on the review outputs.
Oh - the title - every piece of Belkin networking hardware I've ever bought has been total shit. No joke, I'm not griping: I have a Bluetooth print hub which is frankly dreck, I've fried an ADSL router just by configuring and rebooting it a few times, and I have a WiFi USB stick which doesn't work reliably with any OS.
0 out of 5 for Belkin. I'm not surprised they're bending the review systems with cash; it's their last gasp to try to get a good reputation back after the mound of lamentable plastic outsourced bollocks they've been peddling for the last few years!
" According to Stephanie Ichinose of Yelp, "Our posters take a certain amount of pride in the reviews they post, and in their community - New York, Chicago, San Francisco, and the rest." "
And isn't that part of the problem? Users here have buy-in, they're involved - a vested interest. They're less likely to write negative reviews, and this can mean one (or both) of two things: writing overly positive, non-objective reviews, or not posting reviews about bad places at all. Neither of these paints a complete picture.
User-reviewers are always biased in some regards, and nowhere is this more obvious than on User-Generated-Content sites with user reviews.
Take for example Lionhead's near-forgotten game "The Movies". On their website, every reviewer was also a content generator. Rate someone anything less than 5 stars and you risk a review war.
You had films from 13-year-olds being rated good because they were excellent "for a 13-year-old", and in the end getting rated higher than stuff of higher quality produced by 30-year-olds. Yes, I can see the value in "encouraging the youngsters", but it all went into one great ratings bucket and it didn't help anyone find the stuff they wanted to watch.
eBay & Apple
eBay first: I wanted some batteries for my camera. The manufacturer's ones were rated 720 mAh and most of the unbranded ones were rated 600 mAh. One eBay seller was advertising them as 1200 mAh. Sounded very suspicious, but I was curious enough to buy one and put it through some tests. The capacity was 600 mAh. Possible innocent explanation: the battery pack contains two cells in series; we all know that you don't add the mAh numbers together in that case, but maybe they didn't. Anyway, the point is that this seller had 100% AAA++++ feedback; most likely, people submit their feedback as soon as they've "unboxed" the item and rate the seller's prompt despatch and good price. By the time they've actually used it in their camera and discovered it's no better than all the others, it's too late, and EBAY DOESN'T LET YOU GO BACK AND CHANGE YOUR REVIEW.
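The "innocent explanation" checks out arithmetically: for cells wired in series, voltage adds but capacity in mAh does not. A small sketch (the 600 mAh figure is from the comment; the 1.2 V per-cell value is an assumed typical NiMH rating):

```python
# Series pack arithmetic: capacity (mAh) stays that of a single cell,
# while voltage is the sum of the cells' voltages.

cell_mah = 600          # measured capacity per cell (from the comment)
cell_volts = 1.2        # assumption: typical NiMH cell voltage
cells_in_series = 2

pack_mah = cell_mah                         # series: mAh unchanged
pack_volts = cell_volts * cells_in_series   # series: volts add

print(pack_mah, pack_volts)  # 600 2.4
```

Advertising the pack as 1200 mAh looks like someone wrongly summed the two cells' mAh ratings.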
In contrast the Apple iPhone App Store does - I think - let you go back and change your review. (Or at least you can submit additional reviews of the same app). But they have got other things wrong: you can only see reviews submitted by users in the same country as you, which is problematic if there are only 3 reviews in total and you're in a small country. Worse, even the APP DEVELOPERS CAN'T SEE REVIEWS FROM OTHER COUNTRIES so most of those "it crashes when I do X" comments don't reach the person who could fix it.
I've also seen a lot of App reviews that say "great" but give it only one star. I have the feeling that perhaps many users think that "1" is the best rating. I'm not sure how they get that impression, but I've seen it enough times that there must be something systematically wrong.
Neither of these sites has the "was this review useful?" feature that Amazon has. I would like to think that that would help, but it is of course open to abuse.
@Steven Jones - Yes, I agree. Forums related to products are the best place to find out about a particular product, but not particularly useful when making direct comparisons between individual products. What I think particularly useful is that forum folks tend to use the product for real. Also, an active forum really adds value.
As for user generated reviews - it depends. Some of the sites I look at, I'll see some user names crop up again and again. You start to treat them like you would "real" reviewers. Some you trust more than others.
Nearly all reviews, whether expert or UGC, are based on a sample size of one. That is why UGC can be useful, because if 20 people say something is good and an expert says it isn't, this might be evidence that the latter just got a duff model for review. I am always amused by organisations such as Which?, testing one of each thing and thinking they can tell which is better in general.
This is demonstrated perfectly by audiophile sites, where the sonic differences between bits of high-end kit are probably so subtle (if they are there at all) that the difference between two amplifiers of exactly the same type may well be just as large as the 'clear difference' between the 1st and 2nd ranked products. (But then, would you take any notice of an audiophile site once you realised that they compare CD recorders to see which copy CDs most faithfully, or digital audio cables to see which are best?)
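The sample-size-of-one point can be made concrete with a toy simulation (all numbers invented): if unit-to-unit variation is comparable to the true gap between two products, a review of a single unit will often rank them the wrong way round.

```python
# Invented example: product A is genuinely better than B on average,
# but each reviewer only sees one randomly-varying unit of each.
import random

random.seed(42)

TRUE_QUALITY = {"A": 7.0, "B": 6.5}  # A's true edge is half a point
UNIT_SPREAD = 1.0                    # unit-to-unit standard deviation

def review_one_unit(product):
    """One review = a score for one random unit off the line."""
    return random.gauss(TRUE_QUALITY[product], UNIT_SPREAD)

trials = 10_000
wrong = sum(review_one_unit("B") > review_one_unit("A")
            for _ in range(trials))
print(f"Reviews ranking B above A: {100 * wrong / trials:.0f}%")
```

With these made-up numbers, roughly a third of single-sample comparisons pick the "wrong" product, which is why twenty concordant user reports can outweigh one lab test of one unit.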
There is only one answer: critical thinking. Wikipedia is excellent for encouraging this --- you know you have to be sceptical when you read it, and it gets you into the habit of being sceptical about everything, even things written by 'authoritative experts'.
Gadget blog sites (g*******, s******** e*******) seem to be getting more and more fanboyastic, with drool about the finish etc. rather than any real information about how good, reliable or usable things are.
The Win7 drivel is mindless.
Someone ought to do
My first real use of Yelp
Was on a visit to San Francisco. It (mis)led me to a mediocre restaurant in Chinatown that had gotten great reviews - I even based my order on recommended dishes from the reviews. *However* - in the absence of Yelp, I would've had essentially nothing, maybe just a directory listing at best. So even if it leads you astray, I think bad information can be better than none at all. Also, as a side-effect, you can find useful information not kosher to any actual reviewing (in my case, getting tipped off about the unusual reservation/seating arrangement for a restaurant with the dubious name 'Canteen' 8)
I mostly read reviews to see if there are any common problems with something before I buy it. Fewer than 10 reviews doesn't tell me much at all.
My faith in the system was tested when I noticed my review rating had shot through the floor. Since I'd only posted about five reviews, I thought this was odd. I quickly tracked it down to a low-scoring review I gave of the film Advent Children. I gave a reasonably long review, pointing out the bad acting and nonsensical plot but praising the direction and graphics. A hundred fanboys of the FF game claimed it was an unhelpful review; I can only think it was because they disagreed with it. The number of 5/5 scores for the film is incredible considering its major flaws, and I have since readjusted my opinion on 5/5 vs 1/5 scores. I give far more credence to a 4/5 review because it shows that the reviewer acknowledges that there are flaws, and isn't someone just trying to promote their favourite actor/band/game/console by giving even the bad stuff the maximum score.
Yelp is actually pretty interesting to use. It is pretty accurate, if a bit generous, to places I live near. I reviewed a half dozen places near where I lived and worked where I thought I could be helpful. I even had the satisfaction of savaging a really bad pub.
However, when I use it in medium-sized places (particularly college towns) when visiting, it becomes somewhat comical. I roll into a town of 100,000 people and apparently they have the best dry cleaner in the world (who knew?), the best pizza "I have ever tasted", and the best martinis in the history of time. I find it helpful to think of Yelp in certain places as letting me know what a few young people think.
But the problem is that in smaller places, or even in large places where you are not just a faceless consumer, the review can be a problem. I gave my favourite lunch place, where I had been a known regular, 4 stars. I had not thought through the fact that they would be reading my review, and even though it was positive, I could not help but notice that the friendly staff were a little less attentive. This is why, in smaller places, there is such a temptation toward extremes of politeness and pride of place - ultimately, you're reviewing yourself.
User Generated Anything
Although the quote attributed to Jeremy Clarkson above is valid, I personally think it should really be something like:
"When presented with the opportunity to be a reviewer, only the people who gush or damn take up the offer. People who would have given Six, in the world of amateur reviewing, cannot be arsed".
As with any poll, phone-in or whatever, the vast majority of people, whilst they almost certainly have an opinion one way or another, do not consider the question important enough (or the medium appropriate) to invest the time, effort and possibly actual cost to make their point heard.
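The self-selection effect described above can be sketched with a toy simulation. All the numbers here are hypothetical: assume true opinions cluster around 6/10, and only people with strong feelings (very low or very high) bother to post a review.

```python
import random

random.seed(42)

# Hypothetical population: true opinions roughly normal around 6/10,
# clipped to the 1..10 rating scale.
population = [min(10, max(1, round(random.gauss(6, 2)))) for _ in range(100_000)]

# Self-selection: only people who gush (>= 9) or damn (<= 2) post a review;
# the "Sixes" cannot be arsed.
posted = [score for score in population if score <= 2 or score >= 9]

true_mean = sum(population) / len(population)
posted_mean = sum(posted) / len(posted)

print(f"True mean opinion:      {true_mean:.2f}")
print(f"Mean of posted reviews: {posted_mean:.2f}")
print(f"Fraction who posted:    {len(posted) / len(population):.1%}")
```

Under these assumptions only a small fraction of the population posts at all, the posted scores pile up at the extremes, and the visible average drifts well away from the true one.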
Why I Gave Amazon $400 Hard-Earned Dollars
I just got a Gretag/X-Rite ColorMunki.
I read several professional reviews (by photographers and designers with various biases) and got the numbers and quality indicators; then I went and looked at the Amazon UGC reviews.
I immediately went to the most negative review:
It looked alarming. However, what sold me on the product was this comment on that review:
My name is Stephen Rankin and I am the product manager responsible for ColorMunki at X-Rite. I saw your post and just wanted to help clarify a few items.
First, you're correct that the original software installation CD did prompt the user to download the very latest version of software made available on our website. However, beginning in December, we've begun replacing this installation CD with one containing the most recent release of ColorMunki software. This means newer stock will contain the CD and an Internet connection is no longer required for initial software installation. (Obviously, an Internet connection would still be required to receive future software update notifications and downloads.)
Next, with respect to the 3 seat license agreement, while it's true that this is the current license that applies to most end users (camera clubs and educational institutions are actually granted an extended site license), it should be noted that our software installation & activation procedure is based on the "honor system". This means that an Internet connection is not required for activation and no user information is ever transferred or captured during software installation and activation. Instead, our software activation scheme simply requires that a user plug-in their ColorMunki instrument so we can verify that they actually possess the instrument. If you ever need to install or use the software on additional computers, X-Rite would only ask that you honor our license agreement and simply delete the ColorMunki software from one of your previous computers before you install and activate the software on additional computers. It's that simple.
As for your display profiling troubles, it's tough to say what's happening here from the information you've provided in this post. However, I invite you to please contact our technical support department and let us help you resolve this issue.
Again, thanks for your comments and let us know if we can help.
If there's one thing that user reviews are good for, it's finding bugs in tech gadgets. New tech gadgets too often get glowing reviews by professional testers who did little more than regurgitate the marketing booklet. It's the real users who find the bugs. How many pros found the Seagate firmware bug, or the Late 2008 MBP weak hinge, or noticed that the Kenmore He2 clothes washer sometimes stops balancing or aborts with a random temperature sensor error? None. I wish I'd seen customer reviews for the clothes washer sooner.
Pitching for positive reviews doesn't always work...
A few years back, I had the misfortune to need to synchronise Outlook contacts & folders on a couple of machines for somebody. At the time, the best solution looked to be a third-party product which promised the earth.
It was useless.
Could I get any support from the developer? I could not.
Could I get any sensible response from the developer? I could not.
Could I get a refund for the useless pile from the developer? I could not.
You can imagine the reaction, then, when a few months later I received an email asking me, as a "satisfied user", for a positive review on some site.
I'm not entirely sure he received what he was hoping for.
Review by any other name
Alas, I'm too poor nowadays to do much purchasing of cool products like cameras.
But I do tend to troll the recipe sites. A man has to eat. And what do I see attached to the ingredients I may or may not spend my hard-earned cash on tonight?
"My mouth is watering!"
"I'm going to go home and make this RIGHT NOW."
Translation: "I have not made this recipe, and know nothing about it."
Try as I might, I have YET to find a comment on a recipe site where the commenter has actually prepared the dish in question.
My point is that there is an underlying presumption--even in this article--that if you have ENOUGH user reviews saying "this is good" or "this is bad," then you have useful information.
This is not true.
You have NO information. None. Your data was not gathered in a controlled way under controlled circumstances. You have nothing. All 7000 comments saying "delicious!" are worth nothing.
Statistics must be meaningfully collected in order to be meaningfully applied. This is not just pedantic mumbo-jumbo. It's actually true and relevant in your day-to-day life.
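The point about uncontrolled collection can be illustrated with a small sketch. The setup is entirely hypothetical: suppose commenters react only to a recipe's photo and never to how the dish actually tastes, and that photo appeal and taste are independent.

```python
import random

random.seed(1)

# Hypothetical data: each recipe has an independent "photo appeal"
# and "actual taste", both uniform on [0, 1).
recipes = [(random.random(), random.random()) for _ in range(2000)]

# Commenters only see the photo, so comment counts track appeal, not taste.
comments = [round(appeal * 100) for appeal, taste in recipes]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

appeal = [a for a, t in recipes]
taste = [t for a, t in recipes]
print(f"comments vs. photo appeal: {corr(comments, appeal):.2f}")
print(f"comments vs. actual taste: {corr(comments, taste):.2f}")
```

Under these assumptions the comment volume correlates almost perfectly with photo appeal and not at all with taste: thousands of "delicious!" comments, zero information about the dish.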
I always love MP3 player reviews....
...they go on endlessly about the features, the politics, the size, weight, colour, hooking it up, codec support, blah blah blah.
As for the really important bit - how it sounds - well, you are lucky if that gets summed up in half a sentence, if at all.
Thanks for that link; it kept me amused all afternoon.