An IDC study sponsored by storage giant EMC tells us that, surprise, we need to buy more storage because the digital universe will reach 40 zettabytes by 2020. That's 40 trillion gigabytes and is estimated to be all the data created, replicated and consumed in that year. It's the result of the amount of digital information in …
No reason to interpolate: for 2020 you would have 28.8ZB × √2 = 40.7ZB, which is exactly the figure they are giving...
Yes, that's a pretty tragic arithmetic fail on the Reg's part. With 1.8ZB in 2011 and a doubling every two years, f(t) = (9/5) · 2^((t - 2011)/2)
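For what it's worth, that formula checks out. A quick sketch, assuming IDC's 1.8ZB baseline for 2011 and a two-year doubling period:

```python
def digital_universe_zb(year):
    """Estimated digital universe in zettabytes: f(t) = (9/5) * 2**((t - 2011) / 2)."""
    return (9 / 5) * 2 ** ((year - 2011) / 2)

print(round(digital_universe_zb(2019), 1))  # 28.8
print(round(digital_universe_zb(2020), 1))  # 40.7 -- the study's figure
print(round(digital_universe_zb(2021), 1))  # 57.6
```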
Eight-year projections are doubly worthless
If only there were a simple software solution that could reliably tell you how much redundant, out-of-date, massively over-duplicated shite is cluttering up your drives that you could either delete or at least dump into an archive. Wonder how much storage we'd need then?
1.44 MB .. you lucky thing!
180 K single sided floppies !!!!
( rapidly deteriorating into the 4 yorkshiremen sketch )
> massively over-duplicated shite
In theory, if your storage de-duplicates all your VDIs, then it doesn't really matter how many copies of the same thing you have.
Only 33% needs protecting? So 67% is just dross?
Should be even lower than that if you can switch off youtube.
I would be amazed if even 33% was actually unique and valuable enough to protect.
Re: Re: 33%?
Most businesses would spend more time and money trying to figure out which was the valuable 33% than it would cost just to expand the storage network.
And 90% will be used to record all those phone calls, emails and tweets
that all the governments seem hell-bent on recording and keeping for a gazillion years.
Re: And 90% will be used to record all those phone calls, emails and tweets
And 90% will be used for pr0n and spam. Fixed that for you.
Only 40 zettabytes by 2020?
Or is that the estimate for what Google will use?
Sent to me anonymously and posted here as the quickest way to get the comment known:-
In your story, "Are you ready for the 40-zettabyte year?", you write, "The amount for 2020 would be 43.2ZB by interpolation." The increase doubles every 2 years, so you cannot use linear interpolation.
Halfway between 2019 and 2021 the data will be closer to the 2019 amount than the 2021 amount. So 40 zettabytes looks reasonable for a continued doubling every two years.
Is he right? Am I innumerate?
I gotta say it... as a 25-year IT vet, my solution to big data: delete it. 75% of it is crap anyway.