1 post • joined 10 Mar 2009
Fundamental assessment rules ignored
The first rule of any technical assessment is 'never trust the vendor's metrics'.
The first rule of any performance assessment is 'performance means something different to everyone'.
The statements made in this write-up are pretty pointless other than the valid (but well known) stuff about the different types of dedupe available. If you have a dataset that is largely unique to a single server but contains daily repeated patterns (a DB dump, for example), then an inline engine like DD is likely to win out. If you have a globally repeating dataset (e.g. OS backups or distributed code trees), then a global post-process engine like FS is likely to win out… or maybe a client-side dedupe engine like Avamar or PureDisk. There is no single winner here - it is TOTALLY dependent on the specific scenario that you are using it for.
Net: test in your environment and ignore pointless performance write-ups like this which ignore the first rules of technical and performance assessment.