* Posts by JL

1 post • joined 10 Mar 2009

Falconstor/Sun wins speediest dedupe race


Fundamental assessment rules ignored

The first rule of any technical assessment is 'never trust the vendor's metrics'.

The first rule of any performance assessment is 'performance means something different to everyone'.

The statements made in this write-up are pretty pointless, apart from the valid (but well-known) points about the different types of dedupe available. If you have a dataset that is largely unique to a single server but contains daily repeated patterns (a DB dump, for example), then an inline engine like DD (Data Domain) is likely to win out. If you have a globally repeating dataset (e.g. OS backups or distributed code trees), then a global post-process engine like FS (FalconStor) is likely to win out… or maybe a client-side dedupe engine like Avamar or PureDisk. There is no single winner here - it is TOTALLY dependent on the specific scenario you are using it for.
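To see why the winner is dataset-dependent, here is a minimal Python sketch of block-level dedupe (not any vendor's actual engine, and using fixed-size chunks where real products often use variable-size chunking): split a stream into chunks, hash each one, and keep only the unique chunks. The same engine gets a huge ratio on repetitive data and essentially nothing on unique data.

```python
# Minimal fixed-block dedupe sketch: raw-to-unique chunk ratio.
# This is an illustration only, not how DD/FS/Avamar actually work.
import hashlib
import os

CHUNK = 4096  # fixed chunk size; real engines often chunk variably

def dedupe_ratio(stream: bytes) -> float:
    """Return (total chunks) / (unique chunks) for a byte stream."""
    chunks = [stream[i:i + CHUNK] for i in range(0, len(stream), CHUNK)]
    unique = {hashlib.sha256(c).digest() for c in chunks}
    return len(chunks) / len(unique)

# Dataset A: one server's daily DB dump, near-identical day after day.
day = bytes(range(256)) * 64      # 16 KiB standing in for a dump
dataset_a = day * 7               # seven repeated daily backups

# Dataset B: unique data with no repeated patterns at all.
dataset_b = os.urandom(len(dataset_a))

print(dedupe_ratio(dataset_a))    # large ratio: repeats collapse
print(dedupe_ratio(dataset_b))    # close to 1.0: nothing to dedupe
```

Run it and the repetitive dataset collapses dramatically while the random one barely dedupes at all, which is exactly why a single benchmark number tells you nothing about your own workload.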

Net: test in your own environment, and ignore pointless performance write-ups like this one, which break both of those first rules of technical and performance assessment.



Biting the hand that feeds IT © 1998–2017