Organisations could cut their cloudy costs by as much as 50 per cent by bringing AWS apps back into a private cloud, but repatriation is a tricky task requiring more than just tinkering with APIs, according to outspoken Cloudscaling founder Randy Bias. Bias made a few headlines back in July when he predicted that the open source …
Savings of up to 50% over 3-5 years? How about savings of 50% within the first year, and that's not even bothering to use shitbox equipment with the complexities of OpenStack. That's tier 1 enterprise hardware and software with 24x7x4-hour response times from the vendors. No need to re-design your application for a built-to-fail architecture.
At least that's the story I've seen time and time again. One guy I talked to just last week was at a company that was at one point paying $500,000/mo to Joyent; they later brought things in house for closer to an average of ~$50k/mo.
I've done the same with Amazon, and done the math for others (for one that was paying $300k/mo to EC2, the ROI was about 4 months). The cost is rarely the issue; it's normally bullshit politics and braindead CxOs and managers who see shiny cloud and think it will save their ass, when instead it will sink their budget. I talked to another guy recently whose company wants to move their shit to Amazon because they think it will make them look more attractive to investors (they are aware it will cost them 3-5x+ more, but they will be in "cloud" and that is shiny).
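The ROI figure above is just a payback-period calculation: one-time repatriation capex divided by the monthly saving. A minimal sketch, with all dollar figures below being illustrative assumptions rather than numbers from the comment:

```python
def payback_months(cloud_monthly: float, inhouse_monthly: float, capex: float) -> float:
    """Months until a one-time hardware/migration spend is recouped
    by the difference between the cloud bill and in-house running costs."""
    monthly_savings = cloud_monthly - inhouse_monthly
    if monthly_savings <= 0:
        raise ValueError("in-house running costs meet or exceed the cloud bill")
    return capex / monthly_savings

# Illustrative figures only: a $300k/mo cloud bill, assumed $60k/mo in-house
# running costs, and an assumed $1M up-front spend pay back in ~4.2 months.
print(round(payback_months(300_000, 60_000, 1_000_000), 1))  # → 4.2
```

Any combination of capex and in-house running costs summing to roughly four months of the monthly delta would match the "about 4 months" claim; the commenter's actual inputs aren't given.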
I'm hoping the pending implosion of the current tech bubble will shake some reality into how much waste is going on with public cloud spending. It's just absolutely staggering. It's one thing to pay a lot for a high-quality service, but the service provided by the public clouds is about as basic as you can get. It's depressing.
I'm sure there are some exceptions... but across all the companies I've worked at, all the friends I have at other companies, and the VARs I work with who have customers using cloud services, I have yet to come across any *personally*.
Potential financial savings from deploying any tech are always radically overestimated. If they were remotely accurate, the tech manufacturers would be paying me to use their products, not the other way around. If all the promises actually added up, my costs should have been reduced by ~9,000% over the last 10 years. That, unfortunately, isn't the case.
There are certainly efficiencies to be gained with tech, but it is rare that those translate directly into dollars. The increased staffing costs for skilled IT people to manage it all, the licensing costs, and the general pain-in-the-ass factor mean that you're lucky to get a financial break-even.
I'm not saying it's all worthless (far from it), but I am saying that financial savings are a terrible metric for judging a technology.