870 posts • joined 8 May 2009
Re: What next?
Actually, even some of the most rabid Apple sites are saying that Office for iPad is pretty bloody good for a v1.0 release.
Yes, you need an Office 365 subscription to unlock the full feature-set. So what? Did you expect Microsoft to just give it away for free? And if the "Freemium" model is good enough for games, why shouldn't Microsoft be allowed to use it too?
I was raised bilingual, as are many others. Linus' parents were Swedish-speaking Finns, so Swedish would most likely have predominated in the home, but Finnish would have been the language used in educational institutions, with a strong understanding of English required for most computer-related courses due to the Anglo-centric nature of most programming languages.
That said, the Nordic countries teach multiple languages in school as a matter of course, starting from the UK equivalent of primary school and up, as the younger you are, the easier it is to learn new languages. Most pupils will be taught their native language plus two others, almost always including English, with German also popular.
For what it's worth, I've always considered programming as mere translation, nothing more. The trick is to understand how the target audience – i.e. the computer – 'thinks', and work within their frames of reference, but that's a given for any language. I used to get weird looks from colleagues when I told them I really could think in the programming languages I was using.
I was wondering what happened to Demis Hassabis – a child prodigy whose name seemed to be everywhere for a while. "Theme Park" (Bullfrog) and "Republic: The Revolution" (Elixir) were some of the games he was responsible for.
Interesting that he's gotten into much the same field as Jeff Hawkins. Shame his company was bought up by Google as that only goes to support Orlowski's (and my) view of the "Silicon Roundabout" hype.
You know, what the world of IT really, really needs is for every OS to be based on the same, 40-year-old, UNIX design.
Because that's clearly the answer to every technical problem from yesterday, through today, and forever into the future. UNIX is perfect. All hail f*cking UNIX.
One thing Microsoft has managed to achieve – beyond all reasonable expectations – is to maintain a viable alternative to that ancient UNIX design. Granted, they often mess up the GUI design – Windows 1-3.0, anyone? Windows ME? Windows Vista? – but then, Canonical and GNOME don't exactly have an unblemished record in this field either. Even Apple make mistakes: their old "Dashboard Widgets" technology hasn't exactly taken the world by storm, for example.
Furthermore, WIMP GUIs are designed for new users. There is no excuse for claiming to be a professional or expert in IT and not knowing the keyboard shortcuts. Those shortcuts have NOT changed in Windows 8; it took me all of five minutes to learn the additions, because Windows 8 has added new shortcuts, not taken away the old ones. Hell, it even comes with a tutorial. If you still can't work out that the Desktop mode is basically that of Windows 7 with flatter icons and better performance, the problem is with you, not the OS.
If there's one thing nobody working in technology should ever be afraid of, it's change.
Re: What I don't understand is why bother?
Thunderbolt does for PCI Express and DisplayPort buses what eSATA does for SATA: It's basically PCIe and DisplayPort on a wire, so you no longer need to provide space inside a computer's case for traditional expansion slots. (Hence the recent Mac Pro redesign.)
For external storage, PCIe offers another advantage: it operates largely independently of the CPU. USB keeps its price low by making the CPU do much of the heavy lifting, taking valuable processing power away from your applications.
What the benchmarks in the article don't show – and they should – is the additional processing overheads imposed by using USB. Yes: the raw speed looks identical, but if you're having to give up a CPU core to achieve it, it's going to brutally hammer the performance of any high-end video editing suite you're using at the same time. Even Aperture and Lightroom will be noticeably more sluggish.
Trust me: if you're working in a field where processing large lumps of media is a core activity, you will care about this. It's why Apple redesigned their Mac Pros the way they did: that machine has the equivalent of 18 PCIe expansion slots. (Or 15 + 3 x 4K displays if you prefer.) All that's changed is that those slots are now on the outside of the machine, allowing the engineers to optimise the hell out of the arrangement of the core components inside the case.
That is what Thunderbolt can do that USB cannot. USB isn't even playing the same game, let alone in the same league.
Thunderbolt comes into its own in the high-end professional markets, where the cost of the actual computer itself is tiny compared to all the storage and other peripherals you need to connect to it. No, most readers here won't need that level of power, but it most definitely has a market.
"Call me a snob..."
Fine: You're a snob. Who also can't spell "bass".
The reviewer made it crystal clear that he was comparing the Sonos to an older TV. Given that's what most people will have, it's a perfectly valid comparison to make. And, yes, pretty much anything would beat that.
As for why anyone would pay £500+ for something like this: you do know that the UK has some of the smallest homes on Earth, right? Many of us can barely find space on the wall for the TV, let alone for a subwoofer, amplifier, and veritable multitude of speakers. (And let's not forget the wiring involved too.)
Yes, there are compromises made with these small form-factor designs, but they're still plenty good enough for the 99% of customers who don't think they're so special that they can hear the difference between FLAC and high-bitrate AAC audio files. (Despite all the research showing that practically nobody actually can.)
So, a Chinese manufacturer is going to make increasing use of automation on a production line that makes iPhone batteries? Given that increasing automation is standard practice – and has been for years – among manufacturers, I'm not sure why this is considered news.
Also, last time I checked, Apple didn't own any factories in China, despite constant, ignorant media bollocks to the contrary. I expect such headlines from the BBC, Stephen Fry, or the Daily Mail, but not from a website that claims to be aimed at the Information Technology industry! If I wanted to read that kind of childishness, ignorance and FUD, I'd read your article comments.
Apple isn't "switching" a damned thing: Foxconn (and other Chinese suppliers) are.
Robots aren't new to the world; they're only new to China, which has, until very recently, been able to rely on a very cheap workforce instead. That China would have to adopt them eventually was always a given: it's the nature of a "developing" nation to aim to become a "developed" nation, but the price of doing so invariably includes losing your dirt-cheap workforce.
Please, for the love of Codd, stop heading down the plughole to click-baitism. The Register ought to be better than that.
Re: Berners-Lee and media luvvies?
The problem is that there are already plenty of Great Charters that are supposed to be protecting basic rights. The US even has a couple that are rather well known.
Like the endless flood of pointless new laws by knee-jerkist career politicians with no clue how the real world works, the hard part isn't telling people what they should and shouldn't do, but enforcing it.
And then there's the small matter of the Internet not being a public space. It's a network of mostly private networks, often connected by private infrastructure*. Freedom of Speech and all that jazz only apply to public spaces, not to private ones. There's a reason why an audience member can't just stand up in the middle of a stage play and start reciting bits of Shakespeare at the actors: their right to freedom of speech ended the moment they stepped into the theatre. Private property; their gaff, their rules. The Internet is no different.
Neither Facebook nor Google are doing anything wrong or evil. All their users have been notified, often repeatedly, that they are the product those companies are selling. If you don't like how they run their businesses, stop using them. It really is that simple.
* (No, you don't get to trot out the "British Telecom was once publicly owned!" cliché either: BT was split off from the Post Office in the early '80s and privatised in 1984, and the GPO didn't have anything like today's infrastructure at the time.)
Re: As if this will make people happy!
No, he's pointing out that Windows 8 was *more* keyboard-friendly than Windows 7 even in its 8.0 incarnation. It's right there in the second paragraph; is English not your native language?
There is nothing "new" for you to jump through in Windows 8 if you've actually bothered to learn the keyboard shortcuts. Which is, incidentally, what you're supposed to do. (Mice are the biggest cause of RSI, not keyboards. You really aren't supposed to use them all the time.)
If you haven't been mentioning that rather crucial bit of information to your "friends/family/non-tech users", then the fault in their training is entirely yours and yours alone. You don't get to blame Microsoft for your own ignorance.
WIMP GUIs have always been designed to provide neophytes a way to discover functionality for themselves and learn the keyboard shortcuts as they do so.
There are textbooks explaining this core principle that date as far back as the 1970s. (The WIMP GUI concept was first mooted in the 1960s, but the first implementations had to wait until some core technologies became available in the 1970s.)
Re: Limit climate change?
"But climate change isn't happening. You've posted about a thousand articles saying so."
No. The climate is changing. That is not – and never has been – in dispute. It's been changing ever since the Earth formed.
The "debate" – and I use that term in its loosest possible sense – is about humanity's contribution towards it, and the resulting level of catastrophism – if any – that will result.
The opposing sides of the debate – and there really are more than two sides to it – are:
1. Who the f*ck cares what the human contribution is? Surely all that matters is what we should do about it?
2. Okay, if humans are contributing towards Climate Change, is it really as much as the media says it is? If not, see 1.
3. Are things really going to be as catastrophic as the Chicken Littles are claiming? So far, there is little evidence to suggest that the skies really are about to fall on our heads.
4. Where's the evidence that we have a real handle on how this complex system works in the first place? We hear endless talk about computer models, yet we also see articles like the one Mike Lewis just reported on in this very item that prove we don't have all the information necessary to create accurate climate models.
That last point is also the reason why Mike Lewis is reporting on this debate in a website called "The Register" aimed at IT professionals: the AGW camp's incessant blethering on about "computer models"...
Anyone who has ever programmed a computer knows that a computer model is merely an interactive illustration. Illustrations prove nothing. They are also only as good as the data that went into their construction. Ergo, a computer model cannot be used to support a hypothesis. It can only be used to illustrate it.
The fact that we are still seeing articles in major peer-reviewed journals like Nature that reveal previously unknown facts about how the Earth's complex climate actually works is sufficient proof that we do not have all the data needed to create wholly accurate predictive climate models. Which means the old IT "Garbage In, Garbage Out" cliché applies in spades.
The more complex your computer model, the more bloody accurate the data it's derived from needs to be. Given how easily a model can spin off into the realms of utter bollocks from even slightly incorrect data or algorithms, anyone claiming to have cracked this stratospherically difficult nut is, for the present at least, either lying, ignorant, or both (i.e. a politician).
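The sensitivity point is easy to demonstrate. Below is a toy sketch using the logistic map – a textbook chaotic system, chosen purely for illustration and emphatically not a claim about any real climate model – showing how an input error in the ninth decimal place eventually swamps the output entirely:

```python
# Toy demonstration of "Garbage In, Garbage Out" in an iterative model.
# The logistic map x -> r*x*(1-x) with r = 4 is a textbook chaotic system;
# it illustrates sensitivity to input error, nothing more.

def run_model(x0, steps=50, r=4.0):
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        trajectory.append(x)
    return trajectory

a = run_model(0.400000000)   # the "true" initial measurement
b = run_model(0.400000001)   # the same measurement, off by one part in 400 million

# The tiny input error roughly doubles every step until the two runs
# bear no relation to each other at all.
divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"error after 1 step:   {divergence[1]:.1e}")
print(f"error after 50 steps: {divergence[50]:.1e}")
```

Real climate models are vastly more sophisticated than this, of course, but the underlying arithmetic truth stands: small errors in inputs or algorithms compound across iterations.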
"No wonder their security record is so crap if they don't have the decency to notify their userbase that products have gone the way of the dodo."
Where does it say, in any of the documentation, EULAs, etc., that any manufacturer is obliged to support your purchase indefinitely? Usually, support is optional (and certainly voluntary) after the legally mandated guarantee period (which isn't exactly the same as the warranty itself) expires. That's two years in the EU. After that, you're on your own.
HP and Dell happily tell consumers to sod off after a few short years; their corporate customers pay them support money and get support for much longer. Nobody gets that level of support for free.
This is IT, which moves damned quickly. Nobody offers a 7-year guarantee like some automobile manufacturers do. Not even Apple, who, like every other consumer electronics manufacturer, is also tied to the support provided by their component suppliers.
A corporation is just a collection of people run by a dictator, or a cabal, called "shareholders", who elect their own dictator to represent their interests in the corporation.
A community is... just a collection of people run by either a charismatic leader (i.e. a dictator in all but name) – or a baying mob whose intelligence is on a par with its dumbest member.
In both cases, you can have a benevolent dictatorship, or your bog-standard tyranny, but the nature of the entity itself makes little difference: either way, it's a bunch of people working together towards a common cause. So I've never cared about the distinction. A community-driven project is no less "proprietary" than a commercial one: either way, if support for a format is dropped, I'm screwed. (No, I'm not going to wade through hundreds of pages of ISO documentation and code up my own conversion software. Life's too bloody short.)
My interests may be met by one, other, or both of these two kinds of entities. I will pick whichever solution best fits my needs.
Whatever you may think about Microsoft, I don't consider them any more "evil" than Samsung, and they're a bloody sight less "evil" than Google. But I also have no time for the constant bickering and squabbling of GNU radicals and extremist FOSS nutters either.
"Throw in the fact that MacOS is not exactly the operating system enterprise software vendors make their first development priority..."
Really? You are aware that OS X is basically a UNIX-based OS with a decent GUI, right? If it'll run on GNU / Linux or FreeBSD, there's no reason why it can't be recompiled to run on OS X as well. You can even get X11 and MacPorts for it if you want.
Wasn't platform portability a key feature of UNIX?
Re: Unfair Tax?
"The BBC tax is unfair (technically: regressive). The reason being that everyone pays the same amount, irrespective of their ability to pay."
The BBC Licence Fee is not a tax. The clue's in the name. It's part of the price of owning a TV; the TV itself merely provides the machinery with which to receive the broadcasts.
The BBC is a Corporation with a Royal Charter, not a subsidiary of a government ministry. It is a legal entity that has been granted very specific powers. This is an uncommon form of business these days, but it's the same mechanism that was used to create the University of Cambridge, the East India Company, and the Bank of England.
Also, how is everyone paying the same amount "unfair"? I don't get a discount on shoes, clothing or food based on my "ability to pay", so why should I expect a discount on a luxury item like a television set? Does Sky give you a discount on its subscriptions based on your level of income? Does Tesco give you an "I never watch TV adverts" discount on the stuff you've bought from them?
A television is a luxury, not a basic necessity. I haven't owned a TV since 1996, so it is most certainly possible to live without one.
Re: fund it from general taxation
The BBC iPlayer Global app – iOS only – lets you watch archived BBC content for a subscription. It's been available since 2011.
That's little consolation if you don't have an iOS device, but there are strong hints that the BBC are planning to roll out an international version of their web-based system through the bbc.com website, rather than building umpteen mobile / tablet / desktop apps instead. (Apparently, this is in direct response to Hulu and Netflix's success.)
Re: Please stop for a moment and look around
I've lived and worked in a number of countries and *nothing* comes close to the BBC in terms of quality. And, yes, most continental European countries not only have their own state-funded TV stations, but those TV stations _also_ have ads, and produce nowhere near as much quality content. (http://en.wikipedia.org/wiki/Television_licence – check out Norway, Sweden and Switzerland. Still think the BBC is terrible value for money?)
Considering the tiny budgets the BBC makes its programming for – even Doctor Who is made for a fraction of the cost of typical US telefantasy productions – it's a miracle they produce as much as they do, let alone content that many other countries, including the US, are willing to pay for. And at least you're not getting ads *and* paying a licence fee regardless, as Italian, German and French TV viewers do.
While I agree that the licence fee is not an ideal form of payment, it is by far the least-worst option available at present.
And no, the Queen does not count as "state interference". Yes, she's the head of state, but she's apolitical – she has to be, given how long she's been in the job. The Windsors, for all their faults, do at least provide a level of long-term continuity, countering the short-termism endemic to elected representatives. This is one of the few advantages of a royal family, and one that shouldn't be dismissed out of hand. The UK could certainly do a better job of making use of this feature of a monarchy, but it absolutely should not remove it. Not unless they can come up with something better.
The article explains the reason: the music.
The BBC have a blanket licence to use such music in their broadcasts, and I think they can still show the programme on iPlayer, as that comes under the 'broadcast' umbrella too.
A DVD or Blu-ray release is not a broadcast, so the music licences would need to be renegotiated at some expense. As this was a low-budget BBC 4 documentary, it's probably just not worth it. They could substitute library music instead, but it wouldn't be the same.
Re: School Reform
Oh, the irony of someone insinuating that teaching and education are fair game for the armchair ignoramus.
"Aside from a 'life skills' class, the only other reform I'd like to see is a change in school hours, change it from 9-3:15 (or whatever it is right now) to 8 - 5."
Good luck with that. Leaving aside the fact that teaching is actually a hell of a lot harder than you appear to think it is, the human brain can only process so much new data in a day. The brain suffers from fatigue, just like your muscles do.
There's a good reason why so many EU nations have shorter hours for their schools than the UK's. (The trick is to demand more homework, rather than more Victorian-style spoon-feeding in a classroom.)
"1: Daily PE rather than weekly, with no BS excuses to get out of it."
With all due respect, [WORD THAT RHYMES WITH 'LUCK'] you and the high horse you rode in on. I hated PE with a passion, the depth of which you couldn't even begin to conceive. I've always preferred working out on my own in a gym, listening to music. Team sports – professional and otherwise – have never remotely interested me.
"2: Bringing options selection forward to year 9 or even 8 in senior school to allow a better focus earlier for those who know what they want to do."
And what about those who don't? Have you spent even a microsecond considering the logistical and timetabling nightmare you're proposing here?
"PE would help with the obesity problem, as well as concentration at schools."
Firstly, know this: There is no such thing as an "obesity problem". There is a media obsession with scaremongering – because fear sells – but there has been no appreciable increase in actual childhood obesity measured using accurate metrics. Fact. (And, dear lord, if I hear the phrase "obesity epidemic" – as if obesity were somehow contagious – I swear I won't be held responsible for my actions.)
Any article or report you read that mentions using Body-Mass Index ("BMI") as a key measurement can be trivially dismissed as fact-free scaremongering. BMI has long been criticised precisely because it utterly fails to account for body composition: muscle is denser than fat, yet BMI treats the two identically, so a heavily muscled athlete like Arnold Schwarzenegger in his prime would be classed as "obese" under that system.
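For reference, BMI is just weight divided by height squared (kg/m²). A quick sketch shows how it misclassifies a muscular athlete; the height and weight figures below are illustrative, roughly matching a competition-era bodybuilder, not official statistics:

```python
# BMI = weight (kg) / height (m) squared. It knows nothing about body
# composition, so dense muscle counts exactly the same as fat.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

def who_category(value):
    # Standard WHO adult cut-offs.
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

# Illustrative figures for a heavily muscled 1.88 m, 107 kg athlete
# at very low body fat -- not "obese" in any meaningful sense.
athlete = bmi(107, 1.88)
print(f"BMI {athlete:.1f} -> {who_category(athlete)}")
```

The formula simply cannot distinguish that athlete from a genuinely obese person of the same height and weight, which is the whole objection.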
"Almost every kid I went to school with who had ADHD was almost perfectly behaved after PE, at least for a few hours anyway."
ADHD has surprisingly little to do with being "well behaved" or otherwise: plenty of kids with ADHD diagnoses manage not to disrupt their classes. What it does affect is concentration and focus, making it easier to get distracted.
I found doodling on a pad and creating little flick-book animations in the corners of my exercise books helped relieve the boredom of being taught about 1066 and all that in history lessons. (Or Computer Studies O Level: those of us who went into IT careers already knew more than our teacher did by then in any case.)
While those kids that do tend to disrupt lessons may or may not have ADHD, what they definitely have is a dire need for a bloody good clout round the ear. And better parenting.
Re: This doesn't alter the fact...
"It incenses me that she won't use the perfectly good laptop with LinuxMint on it because "she doesn't like it"."
Christ, talk about egotistical.
I suspect your girlfriend won't be losing any sleep over the prospect of never becoming your wife. Good on her.
Re: Old systems still working
"What about all those old systems that are still working but simply can't run Vista/7/8.1."
What about them?
Microsoft didn't make all those PCs! All they did back then was sell software, a games console and some peripherals like keyboards and mice. Why should they be expected to go out of their way to support someone else's products? It's not as if information about Vista and Windows 7 wasn't widely available well before they were released, and you've certainly had plenty of time to save up for suitable replacements.
XP's expiry date has been known about since 2011 – 1000 days' notice – or a little under three whole years. Even setting aside just $150 a year would have left you with enough money now to buy a decent replacement, so there really is no excuse for all the whining.
As for the ecological argument: last time I checked, older PCs tend to use more energy – quite a lot more if the kit has a Pentium 4 (or related CPU) inside it. They're also rather more recyclable than people realise. It's using the things to death and then chucking them, in their entirety, into the nearest bin that's ecologically unsound. Recycling them responsibly is the right thing to do, and the WEEE regulations even require PC manufacturers to take back their old kit for recycling.
Mozilla CTO Eich: If your browser isn't open source (ahem, ahem, IE, Chrome, Safari), DON'T TRUST IT
What's the point of auditing the source code...
... when you have absolutely no way to audit the build process itself?
A handful of deliberate bugs that make a system easy to compromise is all you'd need. Those could be added to a low-level library anywhere – i.e. they would affect any application linked against it. When someone spots and fixes a bug, you simply insert another one somewhere else. It becomes a never-ending game of "Whack-a-Mole".
Such deliberate bugs are indistinguishable from ordinary programming errors, so this is not an easy problem to identify.
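As a contrived sketch of how deniable such a "bug" can be, consider a key generator whose output looks perfectly normal but is trivially brute-forceable. This is entirely hypothetical code, not drawn from any real library; the point is only that the sabotaged version would pass casual review and most tests:

```python
import secrets

def gen_key(n_bytes=16):
    # Honest version: n_bytes of real entropy (2**128 possibilities).
    return secrets.token_bytes(n_bytes)

def gen_key_sabotaged(n_bytes=16):
    # The "bug": a refactor that accidentally-on-purpose stretches just
    # 3 bytes of real entropy to fill the whole key. The output still
    # *looks* random and is the right length, so it passes casual
    # inspection and any unit test that merely checks the key's size.
    seed = secrets.token_bytes(3)
    return (seed * (n_bytes // 3 + 1))[:n_bytes]

# The sabotaged key space is only 2**24 (~16.7 million) possibilities,
# versus 2**128 for the honest version -- brute-forceable in seconds.
key = gen_key_sabotaged()
print(len(key), "bytes; repeating structure:", key[:3] == key[3:6])
```

Spotting that this was sabotage rather than sloppiness, after the fact, is essentially impossible, which is the whole point.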
Remember, the NSA, GCHQ, the CIA, etc. are all intelligence agencies. That basically boils down to spying, and intelligence operatives have been doing undercover work for decades. Find the right person with the right leverage and nobody would ever know your organisation had even been compromised – not even the managers.
Re: Who is Woz?
The man basically designed a couple of 1970s-era microcomputers – at a time when every Tom, Dick and Sally was doing the same thing, so big sodding deal – and, aside from a switching power supply, that's pretty much it. It was Jobs who suggested putting one of them in a case and offering a pre-built model for sale.
Mr. Wozniak had precisely f*ck all to do with Apple's transformation from a near-bankrupt basket case in the late '90s to its current status as one of the most successful businesses on the planet. Nothing. Nada. Zip. Naff all. Quite why anyone pays him any heed escapes me. Like Richard Stallman, his views are anachronistic, harking back to a bygone era before computers became a commodity consumer product.
Woz has become an embarrassing, parasitical media tart. A "rent-a-nerd" whose views represent the minority of the mercifully dwindling old guard.
Microsoft were #1 back when Jobs returned to Apple. However painful it may be to admit to many of you, it's Jobs you have to thank for finally knocking Microsoft down a peg or two and forcing them to get off their collective arses and actually innovate. (No, they didn't get ModernUI right first time, but it took them three goes to make Windows acceptable too. MS are good at playing the long game, so I'm not inclined to write them off yet.)
"How do Wikipedia get money when they don't advertise, if they are not going to ask for money?"
Wikipedia is what Yahoo! always dreamed of becoming: a curated catalogue of links that can act as a springboard to the rest of the Internet.
So... spin off a search engine built around their content. Make it a non-profit, but allow ads keyed to categories, etc. Allow 'sponsored articles' paid for by the corporations that want them (but mark them clearly as such).
The Wikipedia site itself would continue as a separate entity, but it would now receive funding from that non-profit search engine business. (Who knows, maybe they could even start paying for contributions.)
I sometimes suspect that the above is precisely why Google decided to cosy up to them: Wikipedia is a potentially huge threat to their business model.
Christ, it's like a children's playground in here. "My bullshit and ignorance is better than your bullshit and ignorance! So there!"
It's forums like this that make me wonder how the fuck we had the gall to name our own species "The Wise Man".
Windows Phone and Windows 8 are perfectly fine operating systems. As is iOS. (My antipathy towards Android isn't about its technology, so much as its politics. I avoid anything tainted by Google like the plague.)
No platform is perfect. Each has its pros and cons. Buy the one that best fits your needs. No need to start another sodding religion over it.
Re: @Andrew Orlowski
Really? It took you long enough. I've been saying this for ages.
The Internet was designed to be inherently 'trusting'. It's never been fit for its current purposes, and there are no signs of this changing. I therefore don't put anything on the Internet that I want to keep secret.
The NSA, GCHQ and their peers spy on people? Who knew? Oh right: I did. So did anyone else with more than two brain cells to bang together. They're spy agencies! Spying is what they do! Spying on their own citizens was also a wholly predictable result of the US PATRIOT Act and its foreign equivalents: it's hard to spot home-grown terrorists within your own borders if you don't do it. And the UK certainly has form: the IRA and the UDA were rather keen on setting off bombs and murdering civilians until relatively recently, and both groups operated inside the UK's own borders. Spain and France have similar experiences, with the former having to face the Basque separatist group, ETA.
The only thing surprising about Snowden's "leaks" is that so many people were so shockingly ignorant about what these agencies actually did for a living. What did you people think they were doing all day in those vast buildings? Watching porn?
All those GPL variants so beloved of the GNU and wider FOSS communities? Without IP laws, they're not worth the rusty iron they're stored on: Copyleft cannot exist without Copyright, and without the pillars that support Copyright law, no licence agreement can be enforced – counterfeiting would be effectively legal for everyone. Even the Creative Commons movement relies on existing IP law to enforce its own licences.
So, yes, IP law is needed – or a very near facsimile offering similar features. (I'm not convinced of the validity of software patents, for example. And the USPTO really needs a major overhaul.)
What we need are standardised Open Formats. If we can store our personal profiles in standard formats, they become much, much easier to trade. We could trivially leverage our personal data, and there's no need for micro-payments to do so either: "Do ut des" – I give so that you may give. The personal data is the payment. Make this the price for "pro" services and we can choose whether to pay it. Offer a cut-down, genuinely "free" service tier, then use it as a 'teaser' to entice users to make that choice of their own volition. Give people the choice.
(Okay, I won't be interested myself, but I'm sure plenty of people will be more than happy to do so. As anyone who's ever looked at Facebook or Twitter can attest.)
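As a sketch of what such a portable profile might look like: the `ProfileV1` type, the version tag and every field name below are invented purely for illustration – no such standard exists – but they show how a plain, documented format makes the data trivially portable between services:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical "open profile" record. The field names and version tag are
# invented for illustration; the point is only that a plain, documented
# format can be read or written by any service, with no lock-in.
@dataclass
class ProfileV1:
    format: str
    display_name: str
    interests: list
    shared_with: list  # services the user has chosen to "pay" with data

profile = ProfileV1(
    format="open-profile/1.0",
    display_name="A. N. Example",
    interests=["retro computing", "hi-fi"],
    shared_with=["example-service"],
)

# A standard JSON round-trip: serialise, ship to another service,
# reconstruct -- nothing proprietary anywhere in the pipeline.
blob = json.dumps(asdict(profile), indent=2)
restored = ProfileV1(**json.loads(blob))
assert restored == profile
```

The `shared_with` field is where the "data as payment" choice would live: the user, not the service, decides which names appear in it.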
Am I the only person who understands the meaning of a certain three-letter word?
The NSA, CIA and their ilk are SPY agencies!
What makes you think the NSA or CIA (etc.) don't have agents inside these companies? It's a lot easier to find flaws in software or firmware when you can actually read the commented source code!
It also means that it matters not one whit whether WD, HP, etc. are "aware" of any shenanigans as one of the golden rules of being a successful spy is that nobody knows you are one! All it would take is to install / bribe one or two employees in the right positions within each company and you're golden. Nobody else in the company would even know.
Microsoft had (at last count) over 100,000 employees. Even HP and Dell have thousands of employees spread all over the world. And, of course, the rise in outsourcing will have helped immensely, as a single, well-placed spy in the right outsourcing company could give you any number of businesses on a plate.
They're spy agencies! Spying is what these people do for a living. All day. All night. All the time. They're spying. Get it? What the blue blazes did you all think those thousands of spies actually do all day? Iron shirts? Mend wooden horses? What?
Jesus Horatio Fogharty Christ on a flying fuckstick. This is a bloody IT website. You're supposed to be intelligent* readers! Even allowing for the intelligence-battering effects of the Internet, I can't seriously be the only one who wasn't even remotely surprised by any of these so-called "revelations"?
* (Clearly for very small values of "intelligent".)
Emulation doesn't give you the full experience, as you admit. You lose the context if you rely solely on emulation, and that's my point. It does not solve the preservation problem, because preserving the code on its own is pointless: Jet Set Willy on a modern HDTV has nothing like the same feel.
As for failed 1970s / 1980s hardware: something tells me that, by the time hardware rot becomes a real issue, it'll be trivial to just print out a replacement unit. We can almost print the bloody things now using 3D printing technology. Take a look at the PCBs on an Atari VCS or a Sinclair ZX Spectrum: they're not exactly difficult to duplicate. Besides, we've seen people willing to voluntarily (re)build Babbage's Difference Engine and even Turing's Bombes. What makes you think nobody would be interested in servicing old tech?
The hard part will be finding CRT televisions to plug 'em into, but I suspect there'll be enough nostalgia sloshing about by then to support production in small volumes. They'll cost a mint, but collectors are often very willing to pay.
We have national, state-funded art galleries right now that still display paintings from over 1000 years ago. First-edition books are still kept in our libraries. The National Railway Museum in York still has working steam locomotives that date right back to the 1800s. Even London's Transport Museum has old trolleybuses and horse-drawn trams in its collection – and, yes, they're all maintained. Because they're museums, and preservation is what they're for!
The UK needs an equivalent for its games industry – an industry it still manages to kick serious arse in, thanks precisely to the likes of Sinclair Research, Acorn Computers (and the folks behind today's ARM). Not to mention Rockstar, DMA Design, Psygnosis, Gremlin Graphics, Ocean, Matthew Smith, Messrs. Braben & Bell, and all their contemporaries.
I'm aware that there are some attempts at archiving this stuff by existing organisations, but having a small department within another museum's sub-department doesn't really count. Resources are very finite for museums and preservation in general: what the UK needs is a dedicated institution. Where's the games industry's equivalent of Tate Modern? Why does a pile of bricks or an unmade bed get its own power station-sized space, while the UK's huge contribution to the games industry is tucked away in a basement and barely even mentioned in polite company?
Games and play have been an integral part of growing up for Homo sapiens since before the days of recorded history. They play such an important role in how our species learns – even other mammals play for educational purposes – that it's shocking the subject has had so little attention. Why is this important field not getting the recognition it deserves? Toys have their own museums already, but even they tend to focus on static toys like dolls and teddy bears.
As a one-time game designer and developer myself, I have strong feelings about this. Preservation isn't just about retaining the code and graphics, any more than history should be limited to lists of kings and dates.
We need a National Games Museum. And not one limited solely to video games – hence my dislike of hanging such an institution off the side of a video or film archive – but one that takes a broad, holistic view, covering board games, tabletop / war-games (which, believe it or not, were actually a hobby of a certain Herbert G. Wells; he even wrote two books on the subject), and going right back to pinball machines and Victorian end-of-pier games.
Did you not read the article? The man has both a wife and children.
Just as philatelists don't spend their every waking hour licking stamps and sticking them on envelopes, so games collectors don't just slob in front of a TV all day playing their games. The playing isn't the point: it's the having that matters.
As for emulators: oh, hell no. An emulator does not give the same look and feel of the original: running a ZX Spectrum or Commodore 64 emulator on a PC plugged into a modern flat-screen display just isn't the same.
You need the flicker, poor regulation and dot-crawl you can only get from a consumer-grade TV of the period, like an ex-rental Ferguson TX colour TV; the oddly hypnotic squeals while the game's loading screen streams in from the tape, or the short screech and burst of the US national anthem as a US Gold title's Novaload fast-loader kicks in. The sound through the mono, mediocre loudspeaker of the TV set, or the piezo-electric transducer of a ZX Spectrum 48K...
That 3-4 minutes of waiting for the game to load (with the possibility of it failing to load and having to be restarted – often after a few moments of twiddling with the tape head alignment) has the exact same effect as the queue before a theme park ride: the ride teases you with glimpses of what's going to happen; the game's loading screen similarly teases you and builds up the anticipation. By the time the actual title screen appears, you're already invested in the game, your imagination having had plenty of time to visualise what to expect.
And then there's the feel of the dead-flesh ZX Spectrum keys under your fingers, or the digital Kempston and Cruiser joysticks of the '80s. (Or the analogue controllers of the BBC Micros and some arcades.)
Any true preservation effort must preserve all of that context! Sterile emulators on modern hardware can provide some of that for you, but you need to preserve the hardware, for that, too, is a key part of this fading history. It not only gives you a much better idea of how the game felt to play, but it also shows the design ideals of the era, such as the materials used, the colours – the orange and black of a Binatone "Pong" clone console, or the wood veneer trim, as seen on the original Atari VCS – as well as the manuals and marketing.
Modern LCDs are harsh critics of games designed to be played on old, often second-hand, and frequently imperfect bedroom CRT TVs. That dot-crawl, the convergence and geometry issues, etc. also helped smooth those chunky, low-res graphics.
Preservation is not, and never has been, about merely keeping the code itself.
Google would have me drive to Viterbo through no less than three small medieval villages, along the old medieval route. Said route includes multiple hairpin bends and roads that sometimes become too narrow for two small cars to pass each other, so you can imagine what it's like when HGVs try and do likewise.
Apple's maps, on the other hand, correctly sends you along the bypass built over 20 years ago to relieve said three villages. It even gets the names and numbers right, as well as showing the area in rather more detail than Google's photos do.
Mapping the Earth accurately is hard. Neither company gets it spot on all the time, but this weird internet meme that only Apple's Maps ever get it wrong suggests that either TomTom are deliberately selling Apple a dodgy database (which seems extremely unlikely), or the media are being ridiculously selective about their memories. Google's mapping data wasn't exactly stellar during its first few years either.
"Both Google and Apple just seem to buy up tech these days. Do they make anything themselves?"
Why the hell would you piss money away on reinventing the wheel when there's a perfectly good wheel over there with a "For Sale" sign stuck to it?
When did "Not Invented Here Syndrome" become a desirable business policy?
Re: Bet Apple have already gotten around this in advance...
"The charge brick is that little doodad with the USB A slot you plug the cable into. I'm pretty sure you got one in your box, everybody else did..."
Actually, Apple have been selling mobile devices without PSUs for some time now. In fact, I'm not sure even iPads come with them any more, though my original iPad did as it required a higher output than standard USB ports could provide at the time. Their laptops do, obviously, but most people just charge their Apple mobile devices from the nearest handy USB socket. Same as everyone else.
This has been the case for nigh-on ten years now. Hence the "WTF?" reactions from Apple customers to The Register's click-bait headline. You can buy separate chargers if you want – and, yes, Apple will charge a hefty mark-up, as is their wont* – but the iPhone and iPods all come with a Lightning (or, for some older models, 30-pin) cable that has a perfectly standard USB A plug at the other end.
As for "overpriced" chargers: I have a twin-USB charger – a Belkin one, I think – I use for my old iPod classic and my iPhone 4. Works just fine. And cost a whopping, usurious, er... €7.99. Including two cables.
For the life of me, I have no idea what the article is actually trying to say here. The only part the EU can justify legislating on is the connector type and power output on the transformer itself. Most companies – including Apple – have already standardised on a USB socket for that, though the power output varies quite wildly due to the rather obvious fact that manufacturers tend to use batteries of differing capacity depending on context.
(E.g. the iPad 3 cannot charge from a standard laptop USB socket unless it's in standby. Switch it on and the power flowing into the device isn't enough to allow it to operate and charge at the same time. Hence the Lightning connector, which can also talk to USB sockets on Apple's own computers and request a higher wattage when the computer itself is connected to the mains.)
Legally mandating a specific socket on all mobile devices themselves effectively limits their design, and I can't see that going down well with any manufacturer, let alone Apple. This is also a road the EU really doesn't want to go down given the rapid pace of change in IT. The iPhone itself is barely six years old; the iPad is less than 4. Who knows what's coming next? If we move into wearable technology, or flexible screens, do you really think manufacturers will want to be tied to a (relatively) chunky connector design?
* (Last time I checked, an official Sony PSU wasn't cheap either. Neither were Samsung's. But, as with Apple kit, there's no shortage of respectable third-party alternatives that cost a lot less.)
Re: The world does not revolve around Apple
Re: About as stupid as ...
You do realise all those programming 'paradigms' that go in and out of fashion every ten minutes are just some random tosser's opinion, right? There's no law engraved in stone that requires all programming languages to support every damned fashion under the sun.
That's the mistake C++ made: it tries to be all things to all programmers, and fails quite spectacularly at the one thing that is required of all programming languages: being human-readable.
No CPU I've ever used gives a flying toss about templates, classes, lambdas, etc.; OOP, Functional Programming, and so on, are all just so much structural scaffolding that is frequently so poorly designed that it gets in the way of the code itself. Such scaffolding has no place in the programming language itself and should have been shifted into the IDE UIs, where it belongs, a long time ago.
Re: My how far NASA has fallen :(
NASA's funding has never, ever been anything more than a rounding error compared to the staggering sums of cash hurled at the military / defence industries over the same period.
Politicians do so love a good, lucrative, distracting war. It's practically a tradition.
As for the BBC: there's a myth that only the UK has a TV licence. Not only is that untrue, but many EU countries have both a TV licence and adverts on their publicly funded TV stations. Oh, and those licences are often more expensive too.
Seriously, try Italian TV sometime. It makes the BBC look like HBO and Netflix combined! RAI used to be pretty good at home-grown content, but Berlusconi's machinations put paid to that: 99% of RAI's output today consists of cheap talk shows, archive clip shows, and the like. Dramas are almost entirely imported (and dubbed), while the very few (admittedly quite decent) home-grown dramas tend to be the usual detective / cop show variety based on existing novels. So hardly a creative stretch.
The French and Germans are a little better, but the BBC is, as far as I'm aware, the only European TV broadcaster to have no advertising at all, regardless of funding.
I can only assume you've not noticed the new Mac Pro.
It's the perfect solution to the "Apple should ALSO make a mini-tower!" demands over the years: you can add as much expansion as you damned well please, and you don't need to pay for a computer the size of a hotel mini-bar fitted with a power-hungry PSU and cooling system that both have to be designed for the maximum potential load, regardless of whether you ever intend to expand it at all.
Yes, it's pricey, but quality kit usually is, regardless of the label. Take a look at the pricing for Intel's Xeon CPUs and those two AMD graphics cards, not to mention the PCI-e flash storage. Good luck building an exactly equivalent PC, with the same expansion ports, for anywhere near the same price. And no, a SATA-6 SSD doesn't even get to see the mustard, let alone cut it.
Traditional tower cases have barely changed in nigh on 30 years, so this really does count as proper "innovation" too. It'll be interesting to see if it does well, or ends up as another G4 Cube.
For those pointing out that the lyrics are different: song = lyrics + music. Without the music, all you have is poetry.
However, what GoldieBlox have done isn't "parody", because they haven't actually changed the intent of the original song. All they've done is change a more subtle lyric into a 9lb lump hammer that bludgeons the 'message' into the listener with all the grace and elegance of being repeatedly struck in the face by a Steinway grand piano.
Had they stuck with the original lyrics, the ad would have actually worked better, with the visuals clearly contrasting with the words. But the Beastie Boys made it clear years ago that they did not wish to have their music used for such purposes. At the very least, GoldieBlox's failure to simply ask another artist – it's not as if the Beastie Boys' song is the only one ever written on the subject – instead of violating the last wishes of a dead man speaks volumes about the company's management.
(And I speak as someone who was never into the Beastie Boys, so I'm not exactly biased here.)
Re: No, Liam, I won't be using a fondleslab as my primary computer.
"Windows tablets [...] gave up on having a proper multitasking OS for that bullshit 8.11 for Fondlegroups "two things at a time, tops" crapfest."
You've clearly never actually used a Surface Pro then. See that tile that looks like desktop wallpaper? There's your WIMP GUI right there, same as it always was. Everything Windows 7 can do, Windows 8 can do too – in some instances, it even does it a bit faster. You can even get two types of cover with integrated keyboards in them. I'm writing this on a three-year-old MacBook Pro, but even I'm seriously tempted by a Windows 8 (not "RT", which is definitely too half-baked) machine. It's a solid OS, and it even has a nice, shiny app launcher. Granted, the latter doesn't appear to be to everyone's taste, but I found the old menu system a pain to use – mice are a major cause of RSI problems in a way trackpads aren't.
What we're seeing isn't the death of the WIMP metaphor, but its sidelining into niche markets as the vast majority of computers these days really aren't being used for much more than email, Facetwit and web browsing.
I'm a mild-mannered translator by day and have found my iPad invaluable for dictionaries and reference works. In my business, that's not "consuming", it's a real, actual, bona-fide work tool. When I upgraded my old Mk. 1 to a Mk. 3 iPad, it paid for itself in just under two weeks. I haven't bought a printed book or magazine since my first iPad, in 2010, and I can't say I miss them.
Liam's one and only mistake with his article was in not adding "Your mileage may vary" at the end.
Re: I'm always surprised at the naivity of people
"I mean seriously, what do you expect to happen if you download software the creator refuses to give you the source code? Why would anybody keep the source code from you other than wanting to defraud you?"
What earthly use is the bloody source code to someone who has no clue about programming?
People who can read source code and understand what it does – and who have also lucked out in becoming expert in the same programming language(s) as the original developer – are unlikely to be ignorant enough to install such malware in the first place!
However, NOBODY can be an expert in every field of human endeavour. IT is just one field among many. How would you like to be told that you got exactly what was coming to you every damned time your ignorance of a particular subject betrayed you? How would you like it if every time you failed to make a multinational corporation compliant with the likes of Sarbanes-Oxley and ISO 20001, you saw someone pointing and laughing at your ignorance and calling you a "n00b"?
So much for your accusation of naïveté: We are ALL ignorant. We're just ignorant about different things.
Most people don't want to build their cars from scratch, nor do they particularly care how they work. They'll happily buy a Ford Fiesta, or a BMW, or whatnot, and simply drive the thing. All cars share one common feature: their core user interface. Some details will change from car to car, but if you've learned to drive in a Vauxhall Astra, it's a fair bet you can work out how to drive a FIAT Punto or any other make and model of car built since the 1950s.
For every James May, who could cite chapter and verse from the relevant Haynes manual for each car, there are a hundred Jeremy Clarksons, who couldn't give a toss how the bloody machine actually works. Yet most developers still believe everyone who has any contact at all with a computer should be like James May.
The IT industry has moved on quite a bit since the 1960s and '70s.
Open Source has become an anachronism. It is very much part of the problem, not the solution. Forget GNU, Stallman and the FOSS movements: they're yesterday's causes. The problem today isn't source code, but interfaces.
Not just in the software, but across the entire chain – from box art to silicon chip, from API to documentation – it's all about interfaces, not code.
End users should not be required to read complicated EULAs to determine whether the code they've downloaded actually does what it says on the tin. Why shouldn't they be able to pay for virtual gatekeepers to screen such things on their behalf? This is exactly why companies like Apple and Amazon have opted to provide such "gated communities" for their users.
Developers – and the IT community in general – really have only themselves to blame for this: you'll have massive flamewars over trivialities like tabs vs. spaces, while criticising the poor bloody users who have to put up with the ill-designed, barely usable, and barely-supported tripe you expect them to learn how to use. And then you think nothing of bundling in someone else's crap with your "free" software, because your definition of "free" isn't the same as the one in the dictionary.
The IT industry's problems aren't Apple's, Google's, Microsoft's or anyone else's fault but yours. You've had half a century of power, but you've chosen to ignore all the responsibilities that come with it. It's time that changed.
There is a veritable Babel of programming languages out there, and merely reading some books and tinkering about with each of them does not make you an expert.
This will come as a shock, but some of you clearly haven't understood what the "I" in "API" actually stands for. Or the purpose of good documentation. Similarly, a published data format is also an interface. Interfaces are everywhere.
Google Play is the only "walled garden" out there. It has gardeners who react to problems after they've happened, not gatekeepers who stop the problems getting in in the first place.
Re: I will get one because....
'But if I purchase a PS4 I can expect the wife to say "so what does it do different?"'
May I suggest you ask your wife why she is fine with spending hundreds – if not thousands – of quid on shiny lumps of rock artfully nailed into equally shiny bits of metal, or ooh-ing and aah-ing over a pretty arrangement of vegetable genitalia... yet she believes YOU are the shallow one for wanting to spend a few hundred quid on a new home entertainment* centre that actually does something besides merely looking shiny.
* a concept that also includes games. There's no need to make a distinction.
Re: A fine line between Vision and Arrogance
"Never quite understood whether the new user interfaces being foisted on us are because *we* (the punters) lack vision to understand it, or *they* are just being arrogant and treating us like cash cow cattle."
It's the former. Windows 8.x replaced the rather weird Start Menu with a proper app launcher, which also runs big widgets. All the keyboard shortcuts – which everyone calling themselves a seasoned veteran or professional should know – are unchanged. ALT+F4 will close a ModernUI app just as it closes a conventional Windows GUI app, for example.
There are textbooks on this. Many of them written as far back as the late '60s and early '70s, when the R&D phase for the WIMP desktop metaphor we still see on desktop GUIs today was still in its infancy.
That the above is clearly a surprise to many so-called "professionals" is shocking to me; it was very basic stuff when I was studying Computer Science in the 1980s.
@Buck Futter, Tannin, et al:
Many people seem to have a problem with Windows 8.x, but I've actually found it's easier to get newbies into it than it was with previous versions. Rather than presenting you with a pretty picture and some cryptic icons, it actually starts with an application launcher that shows a bunch of very clear tiles, each of which tells you what it does and even gives you some basic information before you've even clicked on it.
As for myself: according to every WIMP GUI rulebook, the GUI is there for *newbies*. Nobody else. Intermediate and advanced users are supposed to learn the bloody keyboard shortcuts!
If, like me, you had done just that, Windows 8.x would pose no difficulties whatsoever. Want to close an application – or even bring up the shutdown dialog box? ALT+F4. Each new release has added new shortcuts, but many of the existing ones have been there since Windows for Workgroups!
The problem is that nobody's teaching this any more. When so-called "professionals" proclaim themselves grizzled veterans with umpteen years of expertise in a platform, yet admit to being bamboozled by changes to what is, when you get down to it, a glorified app launcher, you have to wonder what they're teaching kids at university these days.
Such people are, at best, amateurs, not professionals. Their blatant ignorance of basic GUI usage rules is proof enough of that. If you're still relying heavily on a mouse or trackpad to get your quotidian work done, and you're not an artist or architect, you're doing it wrong. By definition. There are actual textbooks explaining all this.
That tiled GUI really is piss-easy for neophytes to understand. It's easy to forget that we had to *learn* to navigate the (original hierarchical) menus and drill down to our application – never mind having to remember *which* application we needed to open! Now, my aunt need only look for the "Mail" tile, see that there's a message or three waiting for her, and click on it. It's all there right in front of her. And this is a Good Thing™ as it means she needs to rely rather less heavily on her failing memory.
iOS and Android – hardly surprising given the former's influence on the latter – led the way, but Windows' ModernUI picked up the widgets idea and ran with it, making them the central feature. Trevor Pott's point about separating this from the old Windows GUI is a valid one, though: Windows 8.x is very much a transitional release, and it's likely Windows 9.x will be too, given the glacial pace of upgrading in the corporate field.
iOS was the first mainstream GUI to break with the old WIMP formula, so the keyboard shortcuts point doesn't apply to that. (Or to Android.) Microsoft also needs to make that transition, but whether beating Windows into submission with the multi-touch GUI stick over a number of transitional releases is the best way to achieve that is a question only the market can answer. In fairness, Windows 8.x is a pretty good choice for people, like myself, who have to do a lot of typing. Some of the hybrid Windows 8 "tabtop" devices out there are a perfect fit for my needs.
Wacom's Companion Pro (essentially a Wacom digitiser and stylus nailed onto a tablet very similar to the Surface Pro 2) is looking very attractive to me right now. It's flashing its ports seductively at me as I type this. Cease, you Jezebel! You tablet of the night!
Nurse! The screens!
I agree with part of the article: Steve Jobs was exactly what Apple needed in their hour of need, but that's largely due to his previous involvement with the company. The only equivalent for Microsoft would be the return of Bill Gates, who has already made it clear he's not interested in retreading old ground. (For all Jobs' later success, Gates didn't need to mess it up and spend years in the wilderness to learn the necessary skills. Gates nailed it first time around.)
However, I disagree with the tiresome repetition of a pointless meme: what Apple has is a *gated community*. It's not the walling-in that's the point here, but the *curation*. Android has barely any curation at all, hence its frequent security issues. iOS' App Store, on the other hand, *is* curated, which is less like a gardener wandering around a walled garden and occasionally reacting with an, "OI! Gerroff the lawn!", and more like the guards of a gated community who stop undesirables getting in in the first place. (No, they're not 100% successful, but they're close enough.)
What Microsoft needs is *focus*. It's making the same mistake Apple made in the mid-90s: doing too much. They sell no fewer than three flavours of desktop Windows, each with multiple variants. They sell server variants too. They make a games console (also with its own OS), they make games, they sell a major office suite, own a bunch of cloud services, and they sell industrial-strength software development tools too. They even make keyboards and mice.
Jobs was right to slash Apple's massive, and very confusing, product range when he returned: focus is a common factor in very successful businesses. You can't do that when you have a portfolio even wider-ranging than Apple's under Gil Amelio's tenure.
Unlike Jobs' scorched Apple policy, Microsoft could slash its portfolio by simply spinning off the profitable units into separate entities. Microsoft needs to make itself agile enough to react more quickly and effectively to the ever-changing world of IT – an industry that is, almost by definition, in a state of perpetual transition.
THAT is the hard part: changing Microsoft's management and corporate structure entirely. Given the present corporate structure, it may be that what Microsoft really needs today isn't so much a Steve Jobs, as a Genghis Khan.
Re: The Goons, really?
The original 23rd November 1963 broadcast was indeed followed immediately by an episode of the Telegoons ("The Canal" episode).
The first episode of Doctor Who was also repeated the following week, on the 30th of November, due to the assassination of JFK on the 23rd overshadowing the first broadcast. The next episode of the Telegoons was not broadcast until the 7th of December – skipping the 30th. I can't find any scans of the Radio Times for the 30th though, so I'm only speculating that it was likely a casualty of the decision to show the repeat.
Re: 80's Doctors
Tom Baker had a few great episodes, but his theatrical acting style hasn't aged well. And there were a hell of a lot of duds too, not to mention an over-reliance on 'homages' to old gothic horror movies.
In fairness to Michael Grade, I think Colin Baker's portrayal suffered mainly from coming after Peter Davison's. Colin gave it his all – a little too much so – but the problem is that he and Tom Baker both have a strong theatrical acting background and it *really* shows when watching their stories on DVD. They both come perilously close to channeling Brian Blessed at times.
Peter Davison was very much a TV actor first and foremost and understood the medium's strengths and weaknesses. He gave a much more subtle, nuanced portrayal. Casting Colin felt like a step backwards. Despite some decent episodes – I'm very partial to "Vengeance on Varos" and "The Two Doctors"* – Colin's tiresome schtick of repeating the same word three times, each louder than before – "Hammy? Hammy?! HAMMY?!!!" – grated very quickly. Davison basically made the two Bakers' theatrical acting style obsolete.
McCoy was an inspired choice though, so it's a shame he was saddled with a bunch of scripts originally written for Colin Baker's portrayal for his first season. (And some very odd choices of scripts for the second.) His final season, on the other hand, holds up pretty well despite the series' shoestring budget and the BBC's utter indifference to the series itself.
All that said, I'm still amazed the series survived the casting of Bonnie Langford. Poor Colin Baker. He never had a chance.
* (Patrick Troughton and John Stratton were clearly having a whale of a time, and we also got Jacqueline "Servalan" Pearce thrown in as well.)
There's a good reason why Apple kit tends not to have much upgradeability: eBay.
Apple hardware tends to hold its price very well – especially at the "pro" end. My 17" MBP (2010 model) has 8GB RAM and a 512 GB SSD. It screams. And it's still worth over £600 on eBay, despite being closer to its 4th birthday than its 3rd.
I've yet to see the HP, Dell or Lenovo laptop that can boast the same.
Many owners of Apple kit are well aware of this and tend to effectively trade-in their old model for a new one every couple of years. Why the hell would they waste their time going through all the bother of upgrading?
(Before you reply: being a reader of The Register does not define you as a professional computer user, any more than being able to strip down and rebuild a V8 engine makes you a "professional" car driver.)
Re: As an ex- tech pubs* guy...
I used to write docs myself.
My niche was the game development tools industry: I wrote the user guide for Criterion's "Renderware 3" / "Renderware Graphics" (for which I can only apologise), and also the docs for the first few releases of Allegorithmic's "Substance" suite. I did some odds and ends for the Unity folks too some years ago. It's amazing how easy it is to find shockingly overpriced documentation tools that make even Emacs look like a simple, elegant editor by comparison.
"The engineers (as noted) hate documentation because they KNOW that their products are brilliantly intuitive and so NEED no documentation;"
This. Oh, so very this. (I lost count of the number of times I had to remind some of my colleagues what the "I" in "API" stood for. Developers can be end users too, so even an API should have basic UI design rules applied to it.)
Thing is, writing documentation really is a truly thankless task. Everybody hates what you do and considers your entire job pointless. End users have become so used to manuals being either missing, or abject shite, that they assume it's safe to expect this to be true of all applications. Not only do they never read your 500+ pages of bloody hard work, but they'll actually be surprised such a source of information even exists. And your own colleagues also see you as some kind of parasitic life form whose job appears to be ask them to explain what they consider blindingly obvious. There is nobody so blind as an expert.
Good technical authoring is bastard hard to do. Not only do you have to become an expert in the entire product you're documenting, you also need to be able to explain it to your readers without overwhelming them with information. Just enough background information – plus links to more in-depth info – and no more. And complex software can have features that rely heavily on other features too, so you need to organise the teaching to ensure you have full coverage.
Not everyone can do it effectively, despite many developers' belief to the contrary. Frankly, most developers I've met – including many with "Ph.D." after their names – seem to be either flat-out illiterate, or insist on making everything read like a particularly obtuse scientific paper. They sure as hell don't understand how to teach well.
Never mind that the greatest feature in the world is of no f*cking use to anybody if they can't work out how to use it. (Ironically, this is precisely how Apple have crawled out of near-bankruptcy to the top of the IT heap: they truly grok user education.)
I got the hell out when I realised that most of my potential clients had begun to see support as a chargeable feature, turning it into a revenue stream: a decent manual suddenly becomes a bad idea as it reduces support calls and, consequently, revenues from that particular stream. (Indeed, this appears to be how most GNU / FOSS applications are actually supported financially: write something that's genuinely useful, but give it a cryptic UI that effectively forces your customers to pay you for training and support, and you can make a decent profit.)
I do translation now, which is a whole fresh hell of WTF in its own right, but at least people actually appreciate your work. They also don't tend to claim that "it's just writing! Anyone can write!"
Re: Really really basic computers
Procedural languages, Functional languages, OOP, etc., are mostly just different kinds of organisational scaffolding. No mainstream CPU actually gives a toss about any of that stuff; it all ends up as machine code in the end – often on an Intel or ARM CPU, neither of which cares one whit whether you like to organise the original code in your source files as subroutines or as objects. Same meat, different gravy.
None of that stuff matters.
What matters is understanding how computers "think", because programming is just a synonym for "translation" and is actually pretty easy to learn at that level. I was far more productive writing Z80 or MC680x0 assembly than I ever was writing in C++. I used to be proud of writing bug-free code, and it really *was* bug-free. But those days are long gone. The hardware has become orders of magnitude more powerful and capable, but the tools we use to program it all have barely changed since the flint axes of the 1970s.
It's 2013 and we're *still* writing code using artificial languages that require us to walk on eggshells due to their cruel and unusual punctuation. My *phone* can render a Mandelbrot set in real-time and run full-on 3D First-Person Shooters at HD resolutions, and yet we insist on forcing *humans* to do stupid grunt-work like adding a semicolon at the end of a line to save the compiler from a picosecond of calculation? How the hell is this even acceptable? How is this not front-page news in The Register and all its rivals? THIS is the scandal of our time.
No wonder today's software comes wrapped in legalese instead of warranties.
Re: Storage indeed
"No SD card in the nexus, but at least you can use a USB stick, or 20, or a USB microSD card reader if you like..."
So, er, just like an iPad with the Camera Connection Kit then?
No, it doesn't have a MicroSD slot built in, but most users don't need it. Why compromise a product's design and usability* for the sake of a relatively tiny number of edge cases?
If you want a tablet with stacks of storage, buy a Windows 8 model instead. Some of those come with 256 (and even 512) GB of storage.
* (removable storage is a royal pain in the arse from a UI perspective. There's a bloody good reason why Apple went with software-eject systems for their Mac floppy disk drives back in the day.)
Re: Many tools for the job?
"and Microsoft isn't going to fund actual education."
Black & Decker aren't going to fund metalwork or woodworking classes either.
It's Microsoft's job to supply the tools. Anything else they do is icing on the cake. It's the school's job to provide the actual 'education' part. By offering industry-standard tools that students will actually encounter in the real world at a massive discount (and effectively for free for many students), Microsoft are reducing the total funds required for that education.
Sure, you could use Libre/OpenOffice, but it simply isn't that good. If it was, businesses would have standardised on them long ago. They are, after all, free. If you can't even gain market share by giving your product away for nothing, the problem isn't your competition. It's you.
Support counts for a lot. As does (relatively) easy customisation and extensibility, as well as a huge ecosystem of third-party applications that plug into the Microsoft Office suite. Companies like SDL, who create translation software, rely on MS Office being installed to handle the preview feature for Word-supported file formats, for example.
Libre/OpenOffice (as well as Apple's own "iWork" suite) originally competed with Microsoft Works, not Microsoft Office. Sadly, Microsoft axed Works a long time ago, but it seems the *Office communities haven't really understood what it is that makes businesses so willing to pay licenses for Microsoft's products regardless. It's not merely inertia.