Dell has jumped on the Quantum/EMC de-dupe bandwagon and is developing a single block-level de-dupe and replication architecture spanning its own, Quantum and EMC storage arrays. De-duplication involves detecting and removing repeated block-level patterns of data in a file and replacing them with pointers, thus reducing the …
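The block-and-pointer mechanism described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation: it assumes fixed-size blocks and SHA-256 fingerprints (real products often use variable-size chunking), with a plain dict standing in for the block store.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size for illustration

def dedupe(data: bytes, store: dict) -> list:
    """Split data into blocks; store each unique block once, keyed by its
    SHA-256 hash, and return the list of hashes (the 'pointers')."""
    pointers = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # only the first copy is kept
        pointers.append(digest)
    return pointers

def rehydrate(pointers: list, store: dict) -> bytes:
    """Reassemble the original data from its pointers and the block store."""
    return b"".join(store[d] for d in pointers)

# Two 'backups' that share most of their content
store = {}
backup1 = b"A" * 8192 + b"B" * 4096
backup2 = b"A" * 8192 + b"C" * 4096
p1 = dedupe(backup1, store)
p2 = dedupe(backup2, store)

# The shared 8 KB of 'A' blocks is stored once: 3 unique blocks, not 6
print(len(store))                       # 3
print(rehydrate(p1, store) == backup1)  # True
```

Note that both backups rehydrate correctly even though the duplicated blocks exist only once on disk — which is exactly the single-point-of-failure worry raised in the comments below.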
I'm a programmer and what goes on in the data centre is a mystery to me, so this question might be very very stupid.
If you have multiple backups of the same data, and you "de-dupe" it, doesn't that mean you now only have one backup of that piece of data, so only one point of failure?
How does that help? Surely if you're running full backups you want discrete copies rather than de-duped copies, which are more-or-less equivalent to an incremental backup?
And, conversely, if you want de-duped backup data you just run an incremental backup in the first place, without paying anyone a dime for de-dupe technology.
What am I missing here? De-dupe sounds like a stupid idea for backup data.
Paris, because I wanted an icon with a question mark on it, and I can see the IT angle.
SMARTer Enabling QuantumWwwware ..... for the Virtual Space Time Traveller
"If you have multiple backups of the same data, and you "de-dupe" it, doesn't that mean you now only have one backup of that piece of data, so only one point of failure?" ... By Eddie Edwards Posted Monday 3rd November 2008 16:13 GMT
Eddie, You also have one point of Source thus is all Progress of Mutual Mind and Mutually MetaDataMined. Which is QuITe Clever.
Monkey Gland Extract
Can I has Fuse?
I used to think I was... not stupid... until I read this article. Is this like non-lossy JPEG? Surely squashing the data down further and further until it reaches a quantum singularity will drag everyone's data into a binary black hole. I can see the Reg headline now:
Cloud computing disappears up its own backside.
That's it, I'm off to rebuild a PC (faster, stronger, better - we can rebuild you), and make myself feel better.
"And, conversely, if you want de-duped backup data you just run an incremental backup in the first place, without paying anyone a dime for de-dupe technology."
Where you're backing up dozens, hundreds or thousands of machines/VMs, including their OS system files, that still leaves a lot of duplicated data.
I would imagine you'd still want to keep multiple backups, maybe on different sets of disks or in different locations.