I could not agree more - the devil is in the details. However, you need to look more closely at the details you are quoting to refute my argument.
ExaGrid may have taken 3 hours to back up and deduplicate, but it still may have outperformed its competitors, depending on which benchmark you look at. As you are probably aware, inline deduplication solutions deduplicate little or not at all on the first backup, so even though they completed in 30 minutes, how much deduplication really occurred? Then on the second pass, if all of the data is the same or very nearly the same, it may only be indexed and then discarded. So in that sense, yes, ExaGrid lost.
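To make the first-pass versus second-pass point concrete, here is a toy sketch of block-level deduplication - not any vendor's actual implementation, just an illustration of the principle. On the first backup every block is new and must be stored; on a second backup of identical data, every block is already known, so it is merely indexed and nothing new is written:

```python
import hashlib

def dedupe_store(data: bytes, store: dict, block_size: int = 8) -> int:
    """Toy block-level deduplication: split data into fixed-size blocks,
    keep each unique block once (keyed by its hash), and return how many
    NEW blocks actually consumed storage on this pass."""
    new_blocks = 0
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            # Unseen block: it must be physically stored.
            store[digest] = block
            new_blocks += 1
        # Seen block: only an index reference is kept; no new storage used.
    return new_blocks

# 100 distinct 8-byte blocks ("block000" .. "block099").
backup_data = b"".join(f"block{i:03d}".encode() for i in range(100))

store = {}
first_pass = dedupe_store(backup_data, store)   # first backup: all blocks are new
second_pass = dedupe_store(backup_data, store)  # identical second backup: nothing new

print(first_pass)   # 100 blocks stored
print(second_pass)  # 0 blocks stored
```

The point: a product can "complete" a first backup quickly while having deduplicated almost nothing, because there was nothing yet to deduplicate against; the real storage savings only show up on subsequent passes.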
But before we throw ExaGrid under the dedupe bus, why did it take 3 hours? Is it possible, because it does post-process deduplication, that it was actually deduplicating the data that was backed up?! What a concept! A deduplication product that deduplicates data!
So this raises the question: who "performed" the best? The one that completed the backup the fastest? (Both products may have completed the backup itself in the same amount of time - I would need to check on that.) Or the one that deduplicated data after the first pass and used less storage capacity?
So maybe ExaGrid did outperform its competitors on this benchmark, but maybe the tester was predisposed to favor inline deduplication, or he simply did not have the time, or did not think, to look at it from this perspective. Who knows for sure?
This is why I wrote a blog entry over on my site this morning discussing the issues associated with product testing and why DCIG only evaluates features and does not test products in its Buyer's Guides.