1 post • joined 10 Mar 2009
Fundamental assessment rules ignored
The first rule of any technical assessment is 'never trust the vendor's metrics'.
The first rule of any performance assessment is 'performance means something different to everyone'.
The statements made in this write-up are pretty pointless, other than the valid (but well-known) material about the different types of dedupe available. If you have a dataset that is largely unique to a single server but contains daily repeated patterns (a DB dump, for example), then an inline engine like DD is likely to win out. If you have a globally repeating dataset (e.g. OS backups or distributed code trees), then a global post-process engine like FS is likely to win out… or maybe a client-side dedupe engine like Avamar or PureDisk. There is no single winner here: it is TOTALLY dependent on the specific scenario you are using it for.
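The point about dedupe ratios being dataset-dependent can be illustrated with a minimal sketch. This toy fixed-size-chunk deduper (`dedupe_ratio` is a hypothetical helper, not any vendor's engine) shows why a weekly retention of the same DB dump dedupes near-perfectly while globally unique data barely dedupes at all:

```python
import hashlib
import os

def dedupe_ratio(data: bytes, chunk_size: int = 4096) -> float:
    """Logical size / physical size after naive fixed-size chunk dedupe.

    Splits data into fixed-size chunks, fingerprints each with SHA-256,
    and counts how many distinct chunks must actually be stored.
    """
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    unique = {hashlib.sha256(c).hexdigest() for c in chunks}
    return len(chunks) / len(unique)

# A single day's DB dump, retained for 7 days: same bytes stored 7 times.
daily_dump = os.urandom(4096 * 64)
week_of_dumps = daily_dump * 7

# A dataset with no repeated patterns at all.
unique_data = os.urandom(4096 * 64 * 7)

print(dedupe_ratio(week_of_dumps))  # repeated pattern dedupes ~7:1
print(dedupe_ratio(unique_data))    # unique data dedupes ~1:1
```

Real engines differ in where the chunking and hashing happen (inline at the target, post-process across a global pool, or client-side before the data hits the wire), which is exactly why the winner changes with the dataset.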
Net: test in your own environment, and ignore pointless performance write-ups like this one, which ignore the first rules of technical and performance assessment.