When I first saw this article
I only read the first line on the Reg's front page, so I thought Intel were dipping into cryogenics. This isn't nearly so exciting. Bah.
265 publicly visible posts • joined 31 Jul 2009
"If your experiment is that unpredictable, then it's probably not very good science."
Make a 0.1% adjustment involving aerodynamics at 13,000 km/h, and your results get unpredictable very fast. Good science doesn't mean that failing the first time tells you everything you need to know to succeed the second time.
For a while I've been thinking something very similar to the point Matt makes in the last couple of paragraphs, but with music rather than text - I find it quite disruptive that when I arrive home and take out my earphones, I then have to instruct my computer to play. It surprised me greatly during my iPhone-owning period that a company as focused on seamless user experience as Apple should not already have integrated this into iTunes.
Stick this in your pipe and smoke it. Nuclear IS our best option right now. Tidal/wave generation has its own limitations that make it unfeasible to produce more than a small percentage of our power, and this report confirms my suspicions about wind - its economic 'viability' is completely artificial and in a free market it would be a dismal failure. The greenies would have us suffer brownouts galore to stave off an eventuality that nuclear plants don't contribute to in the slightest.
The reason Snow Leopard cost so little is that it was a largely behind-the-scenes update for performance. Apple was concerned its main user base wouldn't be willing to pay the normal OS upgrade price for something that didn't add many visible features. Previous incarnations such as Tiger and Leopard were about £79 ($99) if I recall correctly. I expect Lion to be priced around this level.
Speaking from the student side of this debate, I think one of the best points being made here is how poorly the pre-university education system prepares people (damn that's a lot of ps) for actual computing work. My school was billed as a "technology college" and yet the number of computing-specific courses was zero. The closest A-level equivalents were in Electronics, which did involve the basics of digital theory but little more, and Business. Now there's an appealing title, especially to geeks [/sarcasm]. Not only this, but the careers advisors were useless; I doubt any of the students who were enthusiastic about the paths they chose derived any benefit from the careers staff.
I am curious, however, about the numbers here, and wonder whether the OP is referring only to courses specifically labelled "Computer Science" or whether he has investigated other computing courses. I'm currently studying Forensic Computing at the University of Central Lancashire; in our first year we were required to do basic C++, PHP and a little SQL, and could choose either C# or further C++ for the second semester. In the second year we could choose between OO-based C++ (until that point it had been entirely procedural) or SQL (Oracle based). Furthermore, the networking module has already covered well beyond the difference between switches and routers, and this module is required for quite a few of the courses.
Granted, the majority of the students chose SQL because it was seen as the easy option, and not because they were more interested; but the general outline of skills learnt does not match up with the poor understanding displayed by the graduates he has spoken to.
On the other hand, I achieved the second-highest mark in the year for both the C++ and PHP/SQL modules in the first year, despite not even enjoying C++ and never having touched either language before. That does seem to suggest that, whilst they are taught more broadly than the examples that have been the despair of the OP, my coursemates may not be a shining example of all that is best and brightest in British universities.
Is it just me or does anyone else see a future where cloud computing is simply another public utility, like water, electricity and gas? Billed by the number of processing cycles used, or data throughput or some other more obscure measurement? Google is already moving its way into the utility market in a slightly different manner. Maybe we'll see some kind of merging of our broadband service where our connection package includes time on the cloud, like mobile phone minutes.
That's the reason the future might look dull, folks - because it's cloudy.
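The metered-billing idea above can be sketched in a few lines. This is purely illustrative: the `cloud_bill` function, the rates and the usage figures are all invented for the sake of the example, not drawn from any real provider's tariff.

```python
# Hypothetical sketch of cloud computing billed like a public utility:
# a meter reading of compute and throughput, charged at per-unit rates.
# All rates and figures below are made up for illustration.

def cloud_bill(cpu_seconds, gb_transferred,
               rate_per_cpu_s=0.0001, rate_per_gb=0.05):
    """Return a total charge, utility-style: usage multiplied by unit rates."""
    return cpu_seconds * rate_per_cpu_s + gb_transferred * rate_per_gb

# A month's usage: 3600 CPU-seconds of processing and 20 GB of throughput.
print(round(cloud_bill(3600, 20), 2))  # ~1.36
```

A broadband bundle with included "cloud minutes", as speculated above, would just be this with a free allowance subtracted before the rates kick in.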
So, let me get this straight - where they've been ordered to do so by the relevant authority, they've deleted the data, and everywhere else, they haven't. Furthermore, although they "want to" delete the rest of it, the only apparent difference between places where they have and haven't is whether they've been ordered to do it.
Seems Google requires my services as a translator, so to summarise:
"We want people to think we're going to delete the data without having our arm twisted"
What are the odds that they'll find a way to keep some of the data they've promised to erase?
Apple consumer helpdesk used to BE my job, and this kind of problem cropped up with every other major OS revision. Probably no more than a few hundred individuals in the whole country for each update, given how many calls we took each time, but I began to love (read: hate) the phrase "archive and install", especially given the propensity of callers to make a mistake with its use and delete something important.
I found the concept of the Zombie Bank quite entertaining. John Dee's comment about the flexibility of the English language is technically correct, although I think that most people's difficulty with the production of new words by our (sort of) friends across the Atlantic is less to do with their newness and more to do with resentment that it wasn't us who made them. We Brits often consider the language our personal possession and are offended by changes imposed from the "outside". Xenophobia, inferiority complex or <insert psychosis here>? You choose.