Eric Schmidt reanimates el cheapo PC zombie

Last week, Eric Schmidt ran his mouth off again at the Morgan Stanley Technology Conference in San Francisco. Schmidt commented that one of the new business models in the pipe for internet businesses is giving out free or subsidized computers to users and stacking paper on the ad revenue. He didn't actually say that this is …

COMMENTS

This topic is closed for new posts.
  1. Frank Bough
    Thumb Down

    What a load of crap

    ...and 'netbooks' ARE being given away to anyone who's willing to sign up to a monthly 3G data plan right now. It's the mobile phone model, and it seems to work. Ted, give it up.

  2. Ian
    Unhappy

    @Frank

    I don't think that's Ted's point. The model of giving away something cheap to encourage a user to pay a monthly premium (priced higher than it otherwise would be, to recoup the cost of the free item) is pretty common. Most of Ted's argument was against the whole cloud computing thing, and giving away a free netbook to encourage you to use cloud computing for your apps would be a fairly pointless exercise, since the users interested in a free low-grade netbook probably wouldn't be interested in apps such as Google's.

  3. jon
    Joke

    Oh Well

    Schmidt happens!

  4. myxiplx
    Thumb Down

    I agree, absolute nonsense

    I mean, even after you ignore the spelling mistakes that litter the article, it's talking complete garbage. Web 2.0 died? Somebody had better tell Facebook quick! Web apps aren't compelling? Yes they are, and they're widely used for a whole bunch of stuff. They might not be ideal for corporate use, but that's a whole other ball game.

    And as for Google harping on about Moore's Law to justify their business. Say what? I don't remember Google having to justify anything. Ever. They made a damn good search engine, are happily making billions off it, and now have a whole host of other software available on their website too.

    And their 'arrogant' approach? Not one I've seen personally, and with the amount of money they're making, I think I'd just call it 'confidence'.

    The whole article smells of sour grapes to me.

  5. Anonymous Coward
    Thumb Down

    It is already happening

    Smartphones are the devices that are leading us to cloud computing. Many business travelers today travel with just their BlackBerry, because it is very portable & it offers them secure access to their essential business applications.

    The netbooks that the mobile operators are beginning to subsidize are essentially smartphones with a bigger screen & keyboard.

    Perhaps cloud computing doesn't fit YOUR usage model, but it fits a lot of other people & companies, because of its convenience & cost savings.

  6. Dave

    Clouds?

    I wouldn't trust my sensitive data to a Google-hosted app. I don't even have a gmail account.

  7. jake Silver badge

    Ted, that's a gold.

    That was one of the most completely wacko and/or trollish articles I've ever read, outside Usenet.

    How many buttons were you trying to push, and why? I think I counted 16, several at cross-purposes ... did I miss any? ... Surely you have better things to do with your time?

    As a side note, my network didn't see any of the ads that were attached to your article ... which kinda makes your purpose in life pretty much useless, no?

  8. Anonymous Coward
    Thumb Up

    @frank

    Different business model, old boy. The article discussed machines given away in the arrogant hope that ad revenue would cover the cost and make a profit. Trouble is, ads are invasive, they bug the hell out of users, and users give up on the free PC.

    The netbook model differs because it has a fixed monthly revenue stream from the 3G contract, mostly on 18-month terms. As such, these will happily turn a profit, no matter what happens to the netbook.

    Both are a bit dubious, as they don't emphasize the hidden cost (your sanity in the former; £30 x 18 months in the latter, for example).
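
    Spelled out as a rough sketch (the netbook cost here is my guess, not a quoted figure):

        # Hypothetical figures for the 3G-contract model
        monthly_fee = 30      # GBP per month
        months = 18           # contract length
        netbook_cost = 300    # assumed operator subsidy per netbook

        revenue = monthly_fee * months     # 540 GBP, guaranteed by contract
        margin = revenue - netbook_cost    # 240 GBP before network costs
        print(revenue, margin)             # 540 240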

  9. Antony Riley
    Thumb Down

    Yes Ted.....

    But what's the point?

    To quote the Finnish cartoon "Pasila":

    LÄÄKKEET! (Finnish for "medicines!")

  10. Michael H.F. Wilkinson Silver badge

    It's not just wanting to stick with MS-Office

    (which I do not use (LaTeX is WAY easier for science writing)). I do not want MY data and docs stuck on some machine somewhere on the web where I do not have full control over it, NOR do I want my data on MY machine only accessible through a web app I may not be able to reach (try working for a week in Uganda, as I did recently). No, I want MY data and docs on MY machine, accessible through apps on MY machine, AND good backups.

    Where cloud or grid computing (or insert new catchphrase for distributed computing here) is good is in massively parallel computing for LARGE scientific and business jobs. It does not always pay to have your own iron if you do not use it often; better to rent time on a grid than have your own supercluster stand idle. A bit like not having your own web server, but renting space at a provider. That is a business model that does seem to work.

  11. Anonymous Coward
    Pirate

    Re: What a load of crap

    Ah yes, sign up to one of those unlimited data contracts that lets you download as much as you want, except instead of being handed out by ISPs these are being handed out by the phone companies, who are still encumbered by massive debt from buying all the 3G licenses, facing EU regulation that'll kill their roaming data charges, and watching mid-tier phones get WiFi and VoIP clients. Yes, being a reseller of unlimited bandwidth and subsidising laptops for people is a great business opportunity! What could possibly go wrong?

  12. Anonymous Coward
    Stop

    re: Moore's Law

    "Google loves to harp on Moore's law. Computers get twice as powerful every year"

    If you don't know what Moore's Law is, please refrain from quoting it.

  13. Eric Van Haesendonck

    Subsidies only work with real revenue.

    Subsidies will only work with real "guaranteed" revenue, such as a 3G or internet connection contract, not with advertising.

    The difference is that with a contract you know that you will get revenue from the subsidized computer, while with advertising you can only hope.

    The other difference from what was done previously is that the price of netbooks is much lower than the price of computers during the internet bubble. With Linux as the OS you can probably get a computer for less than $300, and toward the end of the year you'll probably be able to get an ARM-based one for less than $200 (well, you already can, but since these are based on older ARM generations, processing power is somewhat lacking).

  14. Anonymous Coward
    Anonymous Coward

    @Frank

    But mobile phones, which cost more than netbooks, have been given away with mobile voice telephony services for years; it's a totally different market. Google are talking about giving away netbooks with nothing more than a vague 'please use our services and we'll claw the cash back through advertising', which has been shown not to work on many occasions. If it was a subscription model, it'd probably work. The problem is that Google don't seem to like the idea of subscription services.

    Also, if you get a netbook from Google, you can bet it won't have the OS that you want on it (unless you want (at a guess) a chopped-down Linux with Chrome) and it won't be changeable. They aren't going to give something away if they can't control how you use it.

  15. Anonymous Coward
    Dead Vulture

    Moore's Law. Twice as powerful every year?

    C'mon, this is a basic one. Try Googling next time.

    "Intel co-founder Gordon Moore is a visionary. In 1965, his prediction, popularly known as Moore's Law, states that the number of transistors on a chip will double about every two years. And Intel has kept that pace for nearly 40 years. Today, we continue to help move the industry forward by delivering:"

    http://www.intel.com/technology/mooreslaw/
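
    And the difference between the two readings compounds fast - a quick sketch, pure arithmetic, nothing Intel-specific:

        years = 10
        moore = 2 ** (years / 2)   # doubling every 2 years: ~32x in a decade
        hype = 2 ** years          # "twice as powerful every year": 1024x
        print(moore, hype)         # 32.0 1024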

  16. Andy
    Boffin

    I agree to a point

    You make a very good point: people will come to a point of deciding whether they need that 4GHz processor. Although I disagree that they will not need it. The average person may use their computer for very low processor-intensive activities, but the applications they use are not the same software used 5 years ago. The more bells and whistles a programmer adds to an application, the more resources that software will need. Unless that application is using cloud computing. In other words, consumers demand more from their applications but never realize that each function takes up more resources. If I wanted MS Word to type what I'm speaking and never make a mistake, I have now caused the footprint of that application to grow. That's why we have seen the Windows OS grow in size over the years. Yes, we could point out that some applications are bloated and require some cleaning, but it's simple: the more functionality you have in an application, the more code you need. We will always need more hardware resources (processor, memory, hard drive...) as long as we continue to demand more from our applications.

  17. Anonymous Coward
    Anonymous Coward

    Good 'ole Ted

    read the article, think the opposite, profit :)

  18. Hironaka
    Coat

    Typo

    First page: The problem is that users just didn't care enough. Bu_t_ why not?

    Besides that, yet another brilliant Ted rant. But I'm probably the only one who likes them.

    Is that a pointer in my pocket, or am I just happy to see you?

  19. jubtastic1

    It's Inevitable all right...

    Just not any time soon. You made the point yourself: we've long passed the computing requirements for text processing and spreadsheets, and it's only a matter of time before the chippery needed to power an office workstation is cheap enough to stick in a cereal box. Ditto for the intertubes, which get faster and more mobile each year.

    The clock is ticking on this concept, regardless of whether it sucked the last n times people tried.

  20. Anonymous Coward
    Paris Hilton

    This article is totally

    way off in the musical references department. The Mamas and the Papas? Michelle Phillips? WTF? My mum says my dad remembers them. That Paris Hilton, she's quite a dish, but it's Lindsay Lohan who can really sing....

  21. Anonymous Coward
    Thumb Up

    Best analogy ever

    The world economy has had "a baseball bat to the wind pipe".

    I know there are people who don't like him, but just based on that sentence I'd like Ted to read the evening news every night.

  22. Martin Lyne

    Typo?

    "The problem is that users just didn't care enough. Buy why not?"

    "But why not" mayhaps?

  23. Jason Bloomberg Silver badge
    Paris Hilton

    You said it, girlfriend!

    Way to go Ted, that was a mighty fine rant, grammatical errors and all, but you are right.

    Thankfully most users seem to be aware that TANSTAAFL: if something is cheaper than it should be, or subsidised, someone will simply be trying to take their money by some other means, and often seeking to take more. Users aren't as naive or stupid as those pushing the deals would seem to hope they are; most can see 'the con' coming. It won't be long before a free service starts charging, and they'll be locked in to a deal that costs them dear.

    The Cloud has its place - I won't judge its merits or its financial viability - but people just don't trust third parties to look after their data, handle their needs, or even look after their money these days. Fails like Gmail going down show just how fragile things can be and what the risks are. Users don't give up cars for public transport, and it's going to be an equally hard sell to get them away from the traditional desktop PC model.

    Paris - 'Cos she wouldn't take anyone for a ride.

  24. John Smith Gold badge
    Joke

    The major paradigm of our time @ Frank Bough

    "Selling s&*t to people"

    And as MS demonstrate, it doesn't even have to be very good s&*t either.

    @ Frank Bough

    "willing to sign up to a monthly 3G data plan "

    But what do they do with it that actually translates into a revenue stream?

  25. Robert Grant

    "I mean, this business model has already been proven - Proven to suck."

    I can't even begin to describe how bad this sentence is. If only it weren't representative of Ted's writing in general!

  26. Anonymous Coward
    Thumb Up

    Michelle Phillips

    I for one think that Michelle Phillips has been subject to her own form of Moore's Law, getting better looking every 18 months since the late '60s. MILFs FTW !

  27. David S

    Um...

    "There's going to come a point, likely very soon, where people will begin to ask why they need a 4GHz processor and 8 gigabytes of RAM to do word processing or spreadsheet manipulations – the same word processing and spreadsheet manipulations they have been doing for the last decade"

    Didn't that happen, like, about a year ago? I seem to remember an attractive lady on a beach going on and on about it.

    Ted, I love your work. It's entertaining as all hell to read. It'd be nice if it could be a little more relevant, but we can't have everything. Kudos for the metric ass-ton; that's a concept I'd like to borrow if that's okay. Just, for future reference, the metric ton is spelt "tonne".

  28. Anonymous Coward
    Thumb Up

    @What a load of crap

    Yeah, because 3G take-up has been so good, hasn't it!

    And the mobile phone model works because of a little thing called fashion. Nobody cares about the new features of a phone, and even fewer know how to use them. They just want the latest handset because employee x in cubicle y has it.

    AND they gouge you for handset price, and then gouge you for contract prices.

    When I can work in MS Word for free, why exactly am I going to sign up to a shitty cloud service, get a free crappy notebook and then pay service fees on top?

    Exactly, I'm not.

  29. Ray
    Flame

    4GHz CPU and 8 gigs of RAM

    Of course, wildly high specs are not necessary for word processing. But that's the whole point of having a proper computer, as opposed to a StarWriter - I mean netbook - it's general purpose. It does loads *more* shit (sic) than just word processing. You want the specs for video editing, games, all that. Oh, and 'cos Dixons sold you a bundled copy of Vista that chews your RAM like it was addicted.

    The reason the computer subsidy thing hasn't taken off yet is not because it's a bad model, but because it's crap value. Let's face it - you have to be pretty short on smarts to sign up to the PC World plan that gives you a £400 Acer and capped 3G (data only, kidz) for £40 a month when you could get a £600 iPhone and unlimited 3G for £30 - and that's at O2 monopoly prices!

    People like Schmidt don't really care whether we go cloudy or not; what they really want are new markets. In which case they should offer a decent deal and some decent kit, without the restrictive licensing agreements of which most tech biz knobs seem so terribly fond.

  30. Andus McCoatover

    Lääkkeet?

    Medicines??? Taking some??

    But, I have to agree with one part of the article. We just don't need more power to do the same as we've been used to for ages. If that's all we need to do.

    At home, I use a Compaq mini-Deskpro (ES?) running Ubuntu: 700MHz processor, 20(?)GB HDD and 512MB of RAM - upgraded last week from 256MB for a tenner.

    I browse, write CVs, use Skype, print, do a bit o'Gimp - OK, that's slooow - but not a fat lot else outside the standard 'accessories' suite. Bloody thing's about 8 years old, and it works fine. Fast enough. I don't do video games - they're for 14-year-old pizza-chomping, Coke-guzzling "visitors of 'left-handed websites'".

    Faster than the 2GHz, 1GB Vista (looks like XP for some reason?) thing I'm writing this diatribe on in the pub.

    Christ, I prefer my eee to this!

    (For those 5.98 billion (potential) readers NOT in Finland, the Pasila cartoon looks a bit like... http://epe.deviantart.com/art/PASILA-54109742). Pasila is a place near Helsinki.

  31. nicholas22
    Flame

    NICE ONE

    Nice article; I agree with most points, as they are true, despite a large percentage of people still thinking otherwise, having not realised that the browser is not an operating system, and generally being IT illiterate and lazy.

    I'm starting, though, to get fed up with Ted-haters; if you have personal issues with the author, don't fucking post moronic comments such as "he's misquoting Moore's Law", "Facebook is great so the point about Web 2.0 is pointless", etc. Read between the lines.

  32. Anonymous Coward
    Anonymous Coward

    @"Frank Bough"

    Small difference: Mobile phone companies make a lot of money from their users, so can afford to bribe customers to take the service. They'd make more if they didn't give away laptops, but then their competitors would do it.

    In contrast, the Web2.0 folk make their money from their VC backers. Acquiring users/customers is an incidental step in the process: they need to get X,000 customers to get the next tranche of investors' money. A fine example of learning the rules and missing the point of the game.

  33. Watashi

    Web 2.0 alive and well despite El Reg

    El Reg really doesn't like Web 2.0, does it? If it's not Andrew Orlowski's supercilious attitude towards anyone who actually uses Web 2.0 or who wants to make money from Web 2.0, then it's general claims like this about Web 2.0 being dead.

    Of course, in the real world (as opposed to the journalistic world or the business world) Web 2.0 is thriving. It's impossible to listen to the radio or watch the TV without someone going on about Twittering, and there are now several TV shows devoted to showing harvested Web 2.0 user-created video content. Not only this, but more than a few marriages have collapsed as a result of social networking and virtual reality sites. As for making money - the Apple iPhone App Store seems to be doing quite well when it comes to selling user-created content over the web for people to use on the web.

    Like any new technology, there will be many investors who think that consumers will sign up to any new Web 2.0 product simply because it's new. Of course, this isn't true, but the hype put out by companies looking for investment gives a false sense of what Web 2.0 is supposed to be like. Web 2.0 should be judged on the same grounds as other sectors. If you want Web 2.0 to make money, you need to do what other industries do: sell something that people want to pay for and have no choice but to pay for. And journalists should expect this, and not be all superior when the hype turns out to be nothing but hype.

    The reason Web 2.0 is so popular with web users is that it is cheap to run, cheap to use, and focused on services that users don't mind going down every now and again. Trying to make a lot of money in an open market on a product that has a tiny cost-per-user overhead (e.g. Friends Reunited or Facebook) is always going to be very difficult. The same is true when selling something that people can already do better and more reliably on their local systems (e.g. Google web applications).

    Web 2.0 products that do make money are not those that people merely like to use, but those that people have to pay to use, because they can only be accessed through one channel, or where advertisers will stump up because of a single channel's dominance. Would Apple make money on iPhone apps if you didn't have to buy them through Apple's own service? Not a chance. And would advertising space on Google cost anywhere near as much if there was genuine competition in the search-engine marketplace? I doubt it. This is where the innovation will be focused - not on the products themselves, but on making money from those products.

    Web 2.0 has become immensely popular in a very short time, so it's far too early to judge how much money can be made from it.

  34. mark adrian bell
    Happy

    Thanks, Ted!

    "Perhaps that's why the only Web 2.0 companies still left alive are communication tools, and not apps useful outside of your lunch hour."

    Oh Ted, thank you, thank you, thank you!

  35. Sarah Bee (Written by Reg staff)

    Re: Web 2.0 alive and well despite El Reg

    The trouble with Web 2.0 is all the damn comments.

    :)

  36. Anonymous Coward
    Stop

    RE: I agree to a point

    Andy wrote "The more bells and whiles a programmer adds to an application the more resources that one software will need."

    True.

    "Unless that application is using cloud computing."

    No no no.

    If an application is using cloud computing, the "features" used will still require "resources". They might not be resources on the user's PC, depending upon the application and the feature. An easy way to decrease the processing power required by users' PCs would be to take all the shite out of applications. (Microsoft, I'm thinking of you and things like your annoying paperclips.) Word and Excel have reached a point where I think of the size of the program and know I'm only using a few MB of it - the core functionality of the application could be squeezed onto floppies...

    "In other words consumers demand more from their applications but never realize that each function takes up more resources. If I wanted MS Word to type what I'm speaking and never make a mistake, I have now caused the footprint of that application to grow"

    Hmmm. My experience is that users don't really want the new feature and never use it. The application grows. Now, it's obvious that hard drives grow as well; the available space may be larger, but the ratio of "bells and whistles" code to "vital functionality" code increases and increases and ... and ... 99% of what the user thinks of as the application is functionality that they'll never use and don't need at all.

    For users who want to do nothing more than browse the web and write the occasional Word document, the cloud is worth diddly-squat. They probably got a free word processor with their computer anyway!

    So yes, there are going to be people who are all excited by the cloud. It strikes me that these are all people who haven't actually got a grip on reality and don't seem to have learned the lessons of the past. Has anyone stopped to consider why I am not currently typing this message on a dumb terminal...?

  37. alan

    re nicholas22

    "don't fucking post moronic comments, such as "he's misquoting Moore's law, etc""

    It's not a moronic comment to point out that the author has got one of the most fundamental laws in computing wrong. This is a computing site, and I for one would expect that law to be correctly quoted.

    Alan

  38. Big Bear

    @jubtastic1

    “Ditto for the intertubes, which get faster and more mobile each year.”

    First of all, a pre-emptive apology if that is the case in your country, but as a UK resident I can say that the net is definitely not getting faster every year, and the more mobile bit works only so long as uptake remains low… quite simply, the backbone of this country cannot handle the extra traffic that the ISPs have sold! Besides, I think a big side-effect of Web 2.0 is the immense f***ing bloat on every page - even El Reg has jumped from 50KB to 300KB - and this will only get worse, with every script kiddy chucking Flash and animation onto every damn page, and Devil-worshipping web advertisers deciding that that 4MB viral advert is cool and every page needs it displayed, because then they can max out that client's advertising costs and thus earn maximum Web2.0profiteering…

    Once more, further proof that people love to jump on the form-over-function bandwagon by adding pointless bloat onto stuff that was perfectly good as it was… but where's the profit in that, eh?

  39. Elmer Phud
    Pirate

    Coming Soon

    To me it's just a matter of time before the idea of 'cloud computing' becomes ideal in some eyes. Forget about your one-man (or whatever) business; think of some that are a little bit bigger. As money starts to become available and start-ups are encouraged, with the emphasis on 'mobility' to reduce office costs, some form of common storage would be ideal for a small company. Folks are also getting used to the 'internet' now, and not so afraid of storing photos etc on someone else's server. Whether it's a good idea or not doesn't come into it; it's whether the idea can be sold properly this time around.

    Just because the sea looks almost empty it doesn't mean there aren't sharks hatching out there.

  40. Anonymous Coward
    Anonymous Coward

    Moore's Law

    For those complaining about the misuse of Moore's Law, he's using it in the sense that the web2.0rrhea folks commonly use it. It's proper in context. It's been years since I've seen a non-hardware engineer properly refer to the number of transistors in a given piece of silicon doubling.

  41. alan

    re Moore's Law AC 15:06

    As my mum would say

    If they jumped off a bridge, would you?

    No? Well then, just because some iTards can't use it properly doesn't mean that we shouldn't.

  42. Kev K
    Thumb Up

    Ahhh mondays

    I don't know if I enjoy the fail articles more, or the comments that accompany them.

    Carry on; a perfect end to a $hitty day.

    Kev

  43. Giles Jones Gold badge

    The price of computing

    Computers aren't expensive; second-hand, ex-corporate machines are cheap.

    Sure, a Mac or a high-end PC isn't cheap. But if you can't afford one, you do without.

    This business model exists for one of two reasons: firstly, there are people who want stuff for free; secondly, there are marketing people who find ways to get people to consume more adverts.

  44. Anonymous Coward
    Anonymous Coward

    re alan

    You know iTard really isn't a word, right? And why do you use the word mum? The proper English would be ácennicge.

    Oh wait, you say communication is all about shared context and terms understandable to each other? Language moves on? But no, feel free to vent some righteous NERDRAGE about a term used that is perfectly understandable to others.

    For your future screeds, feel free to borrow these.

    ,,,,,,'''''''''.......

  45. Fred

    Good way to frighten off newcomers

    From time to time I still encounter people who see computers as something scary, and the internet as something that 'other' people do.

    It does not take a large amount of foresight to see that when something is complicated, any failure is a function of that complexity.

    The only level of 'cloud' computing I use is a Skype bot from my Skype phone (a document forwarder); that's all I need. And if you need more complexity, then you're obviously just getting paid a backhander from supplier X!

  46. Daniel B.
    Boffin

    Web2.0 is *financially* dead

    Geeze, it looks like Dziuba's articles always attract flames. Anyway, it seems that some commenters forget the whole point of the "Web2.0 is dead" statement.

    The only "Web2.0" company actually making a profit is Google.

    There are plenty of other Web2.0 sites that are very popular, like Twitter or Facebook, but these guys *aren't* making any profits, and are in fact on life support. I'd even go as far as saying that the only "Web2.0" sites I've seen at least breaking even would be the "web MMOs" like OGame and such, which give "special powers" to those who convert to paid subscriptions.

    The funniest thing about Web2.0 is that no one even agrees on what "Web2.0" is! It used to mean "social networking", but it now has another meaning for some people: crappy Flash, AJAX and JavaScript-based sites using "clever" CSS to look like a Mac OS X wannabe app.

    @Moore's Law - I've seen "Moore's Law" used to mean "processing power doubles" for at least 10 years; Dziuba's probably referencing this kind of misuse, common in the IT and Web2.0 world.

    @Jason - "Users don't give up cars for public transport": depends on where you live. Even car-hugging US residents will give up their cars for daily commutes if they happen to live in big, congested cities like New York or Boston.

  47. Anonymous Coward
    Thumb Down

    web 2.0 and cloud computing must be a failure...

    ...because they have not put Microsoft and Microsoft Office out of business? Give them time; it took the dinosaurs (and GM) a long time to die.

  48. Anonymous Coward
    Thumb Up

    Ted is totally right again

    People can moan on all they like about not liking his style - but he is correct.

    As PCs become more and more powerful, they actually make cloud computing less likely. Cloud computing genuinely working relies on a few things. Firstly, the network (and the cloud) has to be more reliable than the desktop - and more reliable by a big margin, since when my PC fails I can walk over to another one in the office. Secondly (and more importantly), it relies on the PC being very expensive. In the early days of desktop computing the PC was expensive enough to make this a possibility - but the network wasn't up to it. Now the network is getting close (although it isn't there yet), but PCs are just too cheap.

    If I just want to run an office app I can get a PC for a few hundred bucks. My desktop PC (which is pretty powerful) costs less than my office chair (seriously - check the cost of Herman Miller Aerons). It costs less than the desk it sits on. It actually costs less than the monitor it is connected to (30" monitors aren't cheap). Whether I'm computing on the cloud or not, the cost of my desktop is still pretty much the same - ergo cloud computing makes no sense any more.

    At home, the only reason my PC gets upgraded is to play games and record music. Otherwise the only reason for an upgrade would be because the old one was broken. At work the only reason I upgrade my desktop is when the old one is likely to become unreliable. It's not to get the latest and greatest any more.

  49. Maty

    Cloud computing

    Because I was once doing a job in London and my laptop died. I got a netbook (Asus Eee) from Tottenham Court Road - right next to the B. Mus., where I was working - and was back at work in 90 minutes. No need to install apps or backups - I could check for details of emails that I'd sent two years ago and forgotten, and catch up with what my students were doing on their eLearning course. (Actually, where does Moodle fit in this rather sneering article? Why not give someone a 250 quid computer to do a $1500 course online?)

    Cloud computing has also saved me hours of hassle, having a Google doc that I, my editor, and a collaborator can work on at once without emailing multiple copies to each other. And when said editor is in London, I'm in Canada and the collaborator is in Italy, and we all move around, the only network that can hack it is the internet.

    Seriously. Cloud computing isn't for everyone, but if you need it, there's nothing better. And it's free. This might not be a great business model for the providers, but for users it's great.

    And um, the reason someone quoting Moore's Law should do so correctly is so we know he knows what he's talking about. If he can't get the basics right, how can we believe the rest?

  50. Anonymous Coward
    Flame

    @Daniel B

    "I've seen "Moore's Law" used as "doubling processing power" for at least 10 years; Dzuiba's probably referencing this kind of misuse common in the IT and Web2.0 world."

    Oh well, that's all right then. Except that it's not. To defend it under the mantra of "everybody's doing it" is not a tenable position; it's just an excuse for being sloppy. If everybody decided that the definition of the law of gravity had changed, would you be prepared to jump off a cliff on the strength of that?

    As a later commenter points out - if you can't get the basics right, how can we believe the rest?

  51. jake Silver badge

    @AC 20:31 & @Maty

    "In the early days of desktop computing the PC was expensive enough to make this a possibility - but the network wasn't up to it."

    Eh? In the early 80s, my DEC Rainbow had Ethernet access to the VAX cluster ... granted, you had to reboot the machine into terminal mode (VT102, if I recall correctly). Prior to that, in the late 70s, I had a TokenRing connection to the WAITS machines at SAIL in my dorm ... of course, the TokenRing terminated at what we would today call a "router", from which I had access to many other campus computers. And prior to THAT I had access to misc. computers at Berkeley via ARCnet ... It took about two weeks of hacking to get the hardware to work with my Heath H11-A :-)

    "No need to install apps or backups - I could check for details of emails that I'd sent two years ago and forgotten, and catch up with what my students were doing on their eLearning course."

    I was doing that with a Panasonic Sr. Partner "luggable" in 1984-ish. From there, I had telnet access to anywhere on the Internet ... The difference is that I was dialing into my own "home cloud", which in turn allowed me out onto the Internet ... My "home cloud" still exists today. It's grown a trifle in the last 25 years, but I have never had to trust other people with my data while on the road ... Today, I set up corporations in a similar manner.

    It's true that in the early days it was all text, but the concepts are the same. And one could make a case that most of what is truly valuable about being connected is text-only.

  52. foo_bar_baz
    Thumb Down

    The real fail

    Is conflating SaaS with cloud computing.

  53. Anonymous Coward
    Thumb Down

    What's 'arrogant hubris'?

    Is that like 'female woman'?

  54. jake Silver badge

    @What's 'arrogant hubris'?

    "Is that like 'female woman'?"

    More like "male man", at least in my experience ...

  55. Sep
    Thumb Down

    No

    People have the wrong assumption that in the NEAR future everybody will own only ONE computer. In reality most people will have two: a smartphone/netbook hybrid for the road and a desktop/server at home. The average person on the road rarely needs computing power beyond what's provided by a smartphone or a netbook, so as long as they have good connectivity they will settle for an inexpensive gadget. When you need to do some video editing, play graphics-intensive games, or archive important documents, you do it on your home machine.

    Google is going after the portable machine market, which is why this has something to do with Android. If they can structure the deal smartly (like cellular providers in the US giving away subsidized phones when you sign a contract), it has the potential to take off.

  56. kissingthecarpet
    Stop

    Writers like Dziuba

    are *not* the reason I read The Reg.

    Unless I'm missing the gag & he's a satire on Web commentators *exactly* like Shelley the Republican is a satire on Republicans.

  57. Pirate Dave Silver badge
    Pirate

    @Alan

    "its not a moronic comment to point out that the author has got one of the most fundamental laws in computing wrong. "

    Maybe I missed something, but I don't remember Moore's "Law" holding the same weight as, say, Newton's Law of Gravity or even Ohm's Law. No, Moore's "Law" was more an observation Moore made, which Intel (eventually) pushed to the fore of their PR machine (since he was an Intel guy) and struggled to keep valid. Had all the engineers at Intel and elsewhere stopped designing new processor circuits, Moore's "Law" would have immediately become invalid - and thus proved itself not to be a Law.
