1 post • joined 10 Mar 2009
Fundamental assessment rules ignored
The first rule of any technical assessment is 'never trust the vendor's metrics'.
The first rule of any performance assessment is 'performance means something different to everyone'.
The statements made in this write-up are pretty pointless, apart from the valid (but well known) material on the different types of de-dupe available. If you have a dataset that is largely unique to a single server but contains daily repeated patterns (a DB dump, for example), then an inline engine like DD is likely to win out. If you have a globally repeating dataset (e.g. OS backups or distributed code trees), then a global post-process engine like FS is likely to win out… or maybe a client-side dedupe engine like Avamar or PureDisk. There is no single winner here - it is TOTALLY dependent on the specific scenario you are using it for.
Net: test in your own environment, and disregard pointless performance write-ups like this one, which break the first rules of both technical and performance assessment.
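To see why the dataset matters so much, here is a toy sketch (my own illustration, not any vendor's engine) of hash-based fixed-chunk dedupe. A "daily DB dump" workload - the same data written night after night - dedupes almost perfectly, while a week of genuinely unique data would not:

```python
import hashlib
import random

def dedupe_ratio(stream: bytes, chunk_size: int = 4096) -> float:
    """Split a byte stream into fixed-size chunks, identify duplicates
    by SHA-256 digest, and return the logical:physical size ratio."""
    seen = set()
    total = 0
    for i in range(0, len(stream), chunk_size):
        seen.add(hashlib.sha256(stream[i:i + chunk_size]).hexdigest())
        total += 1
    return total / len(seen)

rng = random.Random(42)
daily_dump = rng.randbytes(1 << 20)   # 1 MiB of pseudo-random "dump" data
week = daily_dump * 7                 # the same dump backed up 7 nights running
print(dedupe_ratio(week))             # -> 7.0 (repeated data dedupes ~7:1)
```

Real engines differ enormously from this sketch (variable-length chunking, where the hashing happens, inline vs post-process), which is exactly why the same engine can look brilliant on one dataset and mediocre on another.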