HP reckons it can claim the dedupe speed king crown, ingesting at 100TB/hour and spitting it out at 40TB/hour, faster by far than the dedupe dominator, Data Domain. The enhanced dedupe performance was announced at the HP Discover event in Las Vegas on Monday. HP gets to such giddy heights by combining its StoreOnce Catalyst …
So BOOST and Catalyst are effectively the same in theory. But what dedupe ratio did HP use to calculate 100TB/hr?
Symantec does the same thing with their client-side dedupe, quoting that throughput assuming a 10-15X level of compression.
Data Domain, on the other hand, is not client-direct (well, in small cases it is). How about vendors just quote pure raw ingest numbers?
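To see why the quoted ratio matters, here is a back-of-the-envelope sketch (my own arithmetic, not vendor data): with client-side dedupe, only unique chunks cross the wire, so a vendor can quote the logical rate (raw data protected per hour) rather than the physical bytes actually written.

```python
def physical_rate(logical_tb_per_hr: float, dedupe_ratio: float) -> float:
    """Physical TB/hr actually written to the target, given a quoted
    logical ingest rate and an assumed dedupe ratio (10 means 10:1)."""
    return logical_tb_per_hr / dedupe_ratio

# A quoted 100 TB/hr at an assumed 10:1 ratio implies only ~10 TB/hr
# of unique data hitting disk; at 15:1 it would be under 7 TB/hr.
print(physical_rate(100, 10))   # 10.0
print(physical_rate(100, 15))
```

The same headline number can therefore hide very different hardware performance, which is the point of asking for raw ingest figures.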
Dedupe the answer to the wrong problem?
Where are the billions spent on not pointlessly replicating data to start with?
Have we given up on data normalisation and the correct use of caching?
I know, not a real-world question, but I get antsy when someone says, "buy a server" and then "buy this to fix the problems from incorrect use of the server."
Re: Dedupe the answer to the wrong problem?
I don't think you understand what dedupe does. In a backup scenario, a lot of the data you back up is rightly duplicated: file headers for files of the same type, executables which appear on more than one server, emails which contain forwarded text, etc. You aren't going to be able to prevent the duplication of these at source, and nor should you.
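The mechanism behind this is easy to sketch: identical blocks (shared file headers, the same executable on many servers, a forwarded email body) hash to the same digest, so only one copy is stored. This toy example uses fixed-size chunking for simplicity; real products typically use variable-size, content-defined chunking.

```python
import hashlib

def dedupe(data: bytes, chunk_size: int = 8) -> tuple[dict, list]:
    """Split a byte stream into chunks, storing each unique chunk once.
    Returns (store: digest -> chunk, recipe: ordered digests to rebuild)."""
    store = {}
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # keep only the first copy
        recipe.append(digest)
    return store, recipe

# Two "servers" backing up the same 8-byte executable plus one unique block:
stream = b"EXEC0001" * 2 + b"uniqdata"
store, recipe = dedupe(stream)
print(len(recipe), len(store))  # 3 chunks referenced, only 2 stored
```

Rebuilding the original stream is just `b"".join(store[d] for d in recipe)`, which is why the duplication at source is harmless: the target only pays for it once.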