861 posts • joined 8 May 2009
Re: Limit climate change?
"But climate change isn't happening. You've posted about a thousand articles saying so."
No. The climate is changing. That is not – and never has been – in dispute. It's been changing ever since the Earth formed.
The "debate" – and I use that term in its loosest possible sense – is about humanity's contribution towards it, and the level of catastrophe – if any – that will result.
The opposing sides of the debate – and there really are more than two sides to it – are:
1. Who the f*ck cares what the human contribution is? Surely all that matters is what we should do about it?
2. Okay, if humans are contributing towards Climate Change, is it really as much as the media says it is? If not, see 1.
3. Are things really going to be as catastrophic as the Chicken Littles are claiming? So far, there is little evidence to suggest that the skies really are about to fall on our heads.
4. Where's the evidence that we have a real handle on how this complex system works in the first place? We hear endless talk about computer models, yet we also see articles like the one Mike Lewis just reported on in this very item that prove we don't have all the information necessary to create accurate climate models.
That last point is also the reason why Mike Lewis is reporting on this debate in a website called "The Register" aimed at IT professionals: the AGW camp's incessant blethering on about "computer models"...
Anyone who has ever programmed a computer knows that a computer model is merely an interactive illustration. Illustrations prove nothing. They are also only as good as the data that went into their construction. Ergo, a computer model cannot be used to support a hypothesis. It can only be used to illustrate it.
The fact that we are still seeing articles in major peer-reviewed journals like Nature that reveal previously unknown facts about how the Earth's complex climate actually works is sufficient proof that we do not have all the data needed to create wholly accurate predictive climate models. Which means the old IT "Garbage In, Garbage Out" cliché applies in spades.
The more complex your computer model, the more bloody accurate the data it's derived from needs to be. Given how easily a computer model can spin off into the realms of utter bollocks when fed even slightly incorrect data or a flawed algorithm, anyone claiming to have cracked this stratospherically difficult nut is, for the present at least, either lying, ignorant, or both (i.e. a politician).
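The GIGO point is trivial to demonstrate. Here's a toy illustration – emphatically not a climate model, just the textbook logistic map, a deliberately simple non-linear system – showing how a one-in-a-billion error in the input data swamps the output after a few dozen iterations:

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, a classic chaotic system."""
    return r * x * (1.0 - x)

# Two runs with "the same" input, bar a one-in-a-billion measurement error.
a, b = 0.2, 0.2 + 1e-9
max_diff = 0.0
for step in range(100):
    a, b = logistic(a), logistic(b)
    max_diff = max(max_diff, abs(a - b))

# The billionth-part input error has grown to an output error of order 1.
print(max_diff > 0.1)  # → True
```

If a two-line equation behaves like that, imagine what a model with thousands of coupled parameters does when one of them is slightly wrong.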
"No wonder their security record is so crap if they don't have the decency to notify their userbase that products have gone the way of the dodo."
Where does it say, in any of the documentation, EULAs, etc., that any manufacturer is obliged to support your purchase indefinitely? Usually, support is optional (and certainly voluntary) after the legally mandated support period (which isn't exactly the same as the warranty itself) expires. That's three years in the EU. After that, you're on your own.
HP and Dell happily tell consumers to sod off after a few short years; their corporate customers pay them support money and get support for much longer. Nobody gets that level of support for free.
This is IT, which moves damned quickly. Nobody offers a 7-year guarantee like some automobile manufacturers do. Not even Apple, who, like every other consumer electronics manufacturer, is also tied to the support provided by their component suppliers.
A corporation is just a collection of people run by a dictator, or a cabal, called "shareholders", who elect their own dictator to represent their interests in the corporation.
A community is... just a collection of people run by either a charismatic leader (i.e. a dictator in all but name) – or a baying mob whose intelligence is on a par with its dumbest member.
In both cases, you can have a benevolent dictatorship, or your bog-standard tyranny, but the nature of the entity itself makes little difference: either way, it's a bunch of people working together towards a common cause. So I've never cared about the distinction. A community-driven project is no less "proprietary" than a commercial one: either way, if support for a format is dropped, I'm screwed. (No, I'm not going to wade through hundreds of pages of ISO documentation and code up my own conversion software. Life's too bloody short.)
My interests may be met by one, other, or both of these two kinds of entities. I will pick whichever solution best fits my needs.
Whatever you may think about Microsoft, I don't consider them any more "evil" than Samsung, and they're a bloody sight less "evil" than Google. But I also have no time for the constant bickering and squabbling of GNU radicals and extremist FOSS nutters either.
"Throw in the fact that MacOS is not exactly the operating system enterprise software vendors make their first development priority..."
Really? You are aware that OS X is basically a UNIX-based OS with a decent GUI, right? If it'll run on GNU/Linux or FreeBSD, there's no reason why it can't be recompiled to run on OS X as well. You can even get X11 and MacPorts for it if you want.
Wasn't platform portability a key feature of UNIX?
Re: Unfair Tax?
"The BBC tax is unfair (technically: regressive). The reason being that everyone pays the same amount, irrespective of their ability to pay."
The BBC License Fee is not a tax. The clue's in the name. It's a part of the price of owning a TV; the TV itself merely provides the machinery with which to receive the broadcasts.
The BBC is a Corporation with a Royal Charter, not a subsidiary of a government ministry. It is a legal entity that has been granted very specific powers. This is an uncommon form of business these days, but it's the same mechanism that was used to create the University of Cambridge, the East India Company, and the Bank of England.
Also, how is everyone paying the same amount "unfair"? I don't get a discount on shoes, clothing or food based on my "ability to pay", so why should I expect a discount on a luxury item like a television set? Do Sky give you a discount on their subscriptions based on your level of income? Do Tescos give you an "I never watch TV adverts" discount on the stuff you've bought from them?
A television is a luxury, not a basic necessity. I haven't owned a TV since 1996, so it is most certainly possible to live without one.
Re: fund it from general taxation
The BBC iPlayer Global app – iOS only – lets you watch archived BBC content for a subscription. It's been available since 2011.
That's little consolation if you don't have an iOS device, but there are strong hints that the BBC are planning to roll out an international version of their web-based system through the bbc.com website, rather than building umpteen mobile / tablet / desktop apps instead. (Apparently, this is in direct response to Hulu and Netflix's success.)
Re: Please stop for a moment and look around
I've lived and worked in a number of countries and *nothing* comes close to the BBC in terms of quality. And, yes, most continental European countries not only have their own state-funded TV stations, but those TV stations _also_ have ads, and produce nowhere near as much quality content. (http://en.wikipedia.org/wiki/Television_licence – check out Norway, Sweden and Switzerland. Still think the BBC is terrible value for money?)
Considering the tiny budgets the BBC makes its programming for – even Doctor Who is made for a fraction of the cost of typical US telefantasy productions – it's a miracle they produce as much as they do, let alone produce content that many other countries, including the US, are willing to pay for. And at least you're not getting ads *and* having to pay a license fee regardless, as Italian, German and French TV viewers do.
While I agree that the license fee is not an ideal form of payment, it is by far the least worst option available at present.
And no, the Queen does not count as "state interference". Yes, she's the head of state, but she's apolitical – she has to be, given how long she's been in the job. The Windsors, for all their faults, do at least provide a level of long-term continuity, countering the short-termism endemic to elected representatives. This is one of the few advantages of a royal family, and one that shouldn't be dismissed out of hand. The UK could certainly do a better job of making use of this feature of a monarchy, but it absolutely should not remove it. Not unless they can come up with something better.
The article explains the reason: the music.
The BBC have a blanket license to use such music for their broadcasts, and I think they can still show the programme on iPlayer as that comes under the 'broadcast' umbrella too.
A DVD or Blu-ray release is not a broadcast, so the music licenses would need to be renegotiated at some expense. As this was a low-budget BBC 4 documentary, it's probably just not worth it. They could simply substitute library music instead, but it wouldn't be the same.
Re: School Reform
Oh, the irony of someone insinuating that teaching and education are fair game for the armchair ignoramus.
"Aside from a 'life skills' class, the only other reform I'd like to see is a change in school hours, change it from 9-3:15 (or whatever it is right now) to 8 - 5."
Good luck with that. Leaving aside the fact that teaching is actually a hell of a lot harder than you appear to think it is, the human brain can only process so much new data in a day. The brain suffers from fatigue, just like your muscles do.
There's a good reason why so many EU nations have shorter hours for their schools than the UK's. (The trick is to demand more homework, rather than more Victorian-style spoon-feeding in a classroom.)
"1: Daily PE rather than weekly, with no BS excuses to get out of it."
With all due respect, [WORD THAT RHYMES WITH 'LUCK'] you and the high horse you rode in on. I hated PE with a passion, the depth of which you couldn't even begin to conceive. I've always preferred working out on my own in a gym, listening to music. Team sports – professional and otherwise – have never remotely interested me.
"2: Bringing options selection forward to year 9 or even 8 in senior school to allow a better focus earlier for those who know what they want to do."
And what about those who don't? Have you spent even a microsecond considering the logistical and timetabling nightmare you're proposing here?
"PE would help with the obesity problem, as well as concentration at schools."
Firstly, know this: There is no such thing as an "obesity problem". There is a media obsession with scaremongering – because fear sells – but there has been no appreciable increase in actual childhood obesity measured using accurate metrics. Fact. (And, dear lord, if I hear the phrase "obesity epidemic" – as if obesity were somehow contagious – I swear I won't be held responsible for my actions.)
Any article or report you read that mentions using Body-Mass Index ("BMI") as a key measurement can be trivially dismissed as 100% fact-free scaremongering. BMI was discredited long ago because it utterly fails to account for muscle mass: muscle is considerably denser than fat, yet BMI treats the two identically. Arnold Schwarzenegger would be classed as "obese" under that system.
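A quick back-of-the-envelope illustration. The height and weight here are the oft-quoted figures for Schwarzenegger in his competition days – roughly 107 kg at 1.88 m – and are my assumptions, not gospel:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-Mass Index: weight in kg divided by height in metres, squared."""
    return weight_kg / height_m ** 2

def who_category(b: float) -> str:
    """Standard WHO adult BMI bands."""
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# Roughly Arnie at competition weight (assumed figures, see above).
b = bmi(107, 1.88)
print(round(b, 1), who_category(b))  # → 30.3 obese
```

A man at a few percent body fat, "obese" by the metric the scare stories lean on. That's the entire problem in four lines of arithmetic.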
"Almost every kid I went to school with who had ADHD was almost perfectly behaved after PE, at least for a few hours anyway."
ADHD has surprisingly little to do with being "well behaved" or otherwise: plenty of kids with ADHD diagnoses manage not to disrupt their classes. All it does is affect concentration and focus, making it easier for you to get distracted.
I found doodling on a pad and creating little flick-book animations in the corners of my exercise books helped relieve the boredom of being taught about 1066 and all that in history lessons. (Or Computer Studies O Level: those of us who went into IT careers already knew more than our teacher did by then in any case.)
While those kids that do tend to disrupt lessons may or may not have ADHD, what they definitely have is a dire need for a bloody good clout round the ear. And better parenting.
Re: This doesn't alter the fact...
"It incenses me that she won't use the perfectly good laptop with LinuxMint on it because "she doesn't like it"."
Christ, talk about egotistical.
I suspect your girlfriend won't be losing any sleep over the prospect of never becoming your wife. Good on her.
Re: Old systems still working
"What about all those old systems that are still working but simply can't run Vista/7/8.1."
What about them?
Microsoft didn't make all those PCs! All they did back then was sell software, a games console and some peripherals like keyboards and mice. Why should they be expected to go out of their way to support someone else's products? It's not as if information about Vista and Windows 7 wasn't leaked well before they were released, and you've certainly had plenty of time to put some money aside for suitable replacements.
XP's expiry date has been known about since 2011 – 1000 days' notice – or a little under three whole years. Even setting aside just $150 a year would have left you with enough money now to buy a decent replacement, so there really is no excuse for all the whining.
As for the ecological argument: last time I checked, older PCs tend to use more energy. Quite a lot more if using kit with a Pentium IV (or related CPU) inside it. They're also quite a bit more recyclable than people realise. It's using the things to death and then chucking them into the nearest bin in their entirety that's ecologically unsound. Recycling them responsibly is actually the right thing to do. The WEEE regulations actually require PC manufacturers to take back their old kit for recycling.
Mozilla CTO Eich: If your browser isn't open source (ahem, ahem, IE, Chrome, Safari), DON'T TRUST IT
What's the point of auditing the source code...
... when you have absolutely no way to audit the build process itself?
A handful of deliberate bugs that make compromise easy is all you'd need. Those could be added to a low-level library anywhere, which would affect every application linked against it. When someone spots a bug and fixes it, you simply insert another somewhere else. It becomes a never-ending game of "Whack-a-Mole".
Such bugs are indistinguishable from ordinary programming errors, so it's not an easy problem to spot.
Remember, the NSA, GCHQ, the CIA, etc. are all intelligence agencies. That basically boils down to spying, and intelligence operatives have been doing undercover work for decades. Find the right person with the right leverage and nobody would ever know your organisation had even been compromised – not even the managers.
Re: Who is Woz?
The man basically designed a couple of 1970s-era microcomputers – at a time when every Tom, Dick and Sally was doing the same thing, so big sodding deal – and, aside from a switching power supply, that's pretty much it. It was Jobs who suggested putting one of them in a case and offering a pre-built model for sale.
Mr. Wozniak had precisely f*ck all to do with Apple's transformation from a near-bankrupt basket case in the late '90s to its current status as one of the most successful businesses on the planet. Nothing. Nada. Zip. Naff all. Quite why anyone pays him any heed escapes me. Like Richard Stallman, his views are anachronistic, harking back to a bygone era before computers became a commodity consumer product.
Woz has become an embarrassing, parasitical media tart. A "rent-a-nerd" whose views represent the minority of the mercifully dwindling old guard.
Microsoft were #1 back when Jobs returned to Apple. However painful it may be to admit to many of you, it's Jobs you have to thank for finally knocking Microsoft down a peg or two and forcing them to get off their collective arses and actually innovate. (No, they didn't get ModernUI right first time, but it took them three goes to make Windows acceptable too. MS are good at playing the long game, so I'm not inclined to write them off yet.)
"How do Wikipedia get money when they don't advertise, if they are not going to ask for money?"
Wikipedia is what Yahoo! always dreamed of becoming: a curated catalogue of links that can act as a springboard to the rest of the Internet.
So... spin off a search engine built around their content. Make it a non-profit, but allow ads set around categories, etc. Allow 'sponsored articles' paid for by the corporations that want them (but mark them clearly as such).
The Wikipedia site itself would continue as a separate entity, but it would now receive funding from that non-profit search engine business. (Who knows, maybe they could even start paying for contributions.)
I sometimes suspect that the above is precisely why Google decided to cosy up to them: Wikipedia is a potentially huge threat to their business model.
Christ, it's like a children's playground in here. "My bullshit and ignorance is better than your bullshit and ignorance! So there!"
It's forums like this that make me wonder how the fuck we had the gall to name our own species "The Wise Man".
Windows Phone and Windows 8 are perfectly fine operating systems. As is iOS. (My antipathy towards Android isn't about its technology, so much as its politics. I avoid anything tainted by Google like the plague.)
No platform is perfect. Each has its pros and cons. Buy the one that best fits your needs. No need to start another sodding religion over it.
Re: @Andrew Orlowski
Really? It took you long enough. I've been saying this for ages.
The Internet was designed to be inherently 'trusting'. It's never been fit for its current purposes, and there are no signs of this changing. I therefore don't put anything on the Internet that I want to keep secret.
The NSA, GCHQ and their peers spy on people? Who knew? Oh right: I did. So did anyone else with more than two brain cells to bang together. They're spy agencies! Spying is what they do! Spying on their own citizens was also a wholly predictable result of the US PATRIOT Act and its foreign equivalents: it's hard to spot home-grown terrorists within your own borders if you don't. The UK certainly has form here: the IRA and the UDA were rather into setting off bombs and murdering civilians until relatively recently, and both groups operated inside the UK's own borders. Spain and France have had similar experiences, with the former having to face the Basque separatist group, ETA.
The only thing surprising about Snowden's "leaks" is that so many people were so shockingly ignorant about what these agencies actually did for a living. What did you people think they were doing all day in those vast buildings? Watching porn?
All those GPL variants so beloved of the GNU and many members of the FOSS communities? Without IP laws, they're not worth the rusty iron they're stored on: Copyleft cannot exist without Copyright. Without IP laws, without the pillars that support Copyright Law, no license agreement can be enforced: counterfeiting would be effectively legal for everyone. Even the Creative Commons movement relies on existing IP law to enforce its own licenses.
So, yes, IP law is needed – or a very near facsimile offering similar features. (I'm not convinced of the validity of software patents, for example. And the USPTO really needs a major overhaul.)
What we need are standardised Open Formats. If we can store our personal profiles in standard formats, they become much, much easier to trade. We could trivially leverage our personal data, and there's no need for micro-payments to do so either: "Do ut des" – I give so that you may give. The personal data is the payment. Make this the price for "pro" services and we can choose whether to pay that price. Offer a cut-down, genuinely "free" service tier, then use it as a 'teaser' to entice users to make that choice of their own volition. Give people the choice.
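For what it's worth, a hypothetical sketch of what such a portable profile might look like. The format name, field names and structure here are entirely my own invention – no such standard exists – but the point is that it's plain, readable, and goes wherever the user does:

```python
import json

# A hypothetical open profile format (invented field names, not a real spec).
profile = {
    "format": "open-profile/1.0",
    "display_name": "J. Bloggs",
    "interests": ["retro computing", "cycling"],
    # The user decides which parts constitute the "payment" for pro services.
    "sharing": {
        "interests": "public",
        "display_name": "services-only",
    },
}

# Plain JSON: any service can read it, and the user can take it elsewhere.
blob = json.dumps(profile, indent=2)
assert json.loads(blob) == profile
print("round-trips cleanly")
```

No lock-in, no reverse-engineering someone's proprietary database dump: the profile itself is the tradable artefact.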
(Okay, I won't be interested myself, but I'm sure plenty of people will be more than happy to do so. As anyone who's ever looked at Facebook or Twitter can attest.)
Am I the only person who understands the meaning of a certain three-letter word?
The NSA, CIA and their ilk are SPY agencies!
What makes you think the NSA or CIA (etc.) don't have agents inside these companies? It's a lot easier to find flaws in software or firmware when you can actually read the commented source code!
It also means that it matters not one whit whether WD, HP, etc. are "aware" of any shenanigans as one of the golden rules of being a successful spy is that nobody knows you are one! All it would take is to install / bribe one or two employees in the right positions within each company and you're golden. Nobody else in the company would even know.
Microsoft had (at last count) over 100,000 employees. Even HP and Dell have thousands of employees spread all over the world. And, of course, the rise in outsourcing will have helped immensely, as a single, well-placed spy in the right outsourcing company could give you any number of businesses on a plate.
They're spy agencies! Spying is what these people do for a living. All day. All night. All the time. They're spying. Get it? What the blue blazes did you all think those thousands of spies actually do all day? Iron shirts? Mend wooden horses? What?
Jesus Horatio Fogharty Christ on a flying fuckstick. This is a bloody IT website. You're supposed to be intelligent* readers! Even allowing for the intelligence-battering effects of the Internet, surely I can't be the only one who wasn't even remotely surprised by any of these so-called "revelations"?
* (Clearly for very small values of "intelligent".)
Emulation doesn't give you the full experience, as you admit. You lose the context if you rely solely on emulation, and that's my point. It does not solve the preservation problem, because preserving the code on its own is pointless. Jet Set Willy on a modern HDTV has nothing like the same feel.
As for failed 1970s / 1980s hardware: something tells me that, by the time hardware rot becomes a real issue, it'll be trivial to just print out a replacement unit. We can almost print the bloody things now using 3D printing technology. Take a look at the PCBs on an Atari VCS or a Sinclair ZX Spectrum: they're not exactly difficult to duplicate. Besides, we've seen people willing to voluntarily (re)build Babbage's Difference Engine and even Turing's Bombes. What makes you think nobody would be interested in servicing old tech?
The hard part will be finding CRT televisions to plug 'em into, but I suspect there'll be enough nostalgia sloshing about by then to support production in small volumes. They'll cost a mint, but collectors are often very willing to pay.
We have national, state-funded art galleries right now that still display paintings from over 1000 years ago. First-edition books are still kept in our libraries. The National Railway Museum in York still has working steam locomotives that date right back to the 1800s. Even London's Transport Museum has old trolleybuses and horse-drawn trams in its collection – and, yes, they're all maintained. Because they're museums, and preservation is what they're for!
The UK needs an equivalent for its games industry – an industry it still manages to kick serious arse in, thanks precisely to the likes of Sinclair Research, Acorn Computers (and the folks behind today's ARM). Not to mention Rockstar, DMA Design, Psygnosis, Gremlin Graphics, Ocean, Matthew Smith, Messrs. Braben & Bell, and all their contemporaries.
I'm aware that there are some attempts at archiving this stuff by existing organisations, but having a small department within another museum's sub-department doesn't really count. Resources are very finite for museums and preservation in general: what the UK needs is a dedicated institution. Where's the games industry's equivalent of Tate Modern? Why does a pile of bricks or an unmade bed get its own power station-sized space, while the UK's huge contribution to the games industries is tucked away in a basement and barely even mentioned in polite company?
Games and Play have been an integral part of growing up for Homo Sapiens since before the days of recorded history. They play such an important role in how our species learns – even other mammals play for educational purposes – it's shocking that it's had so little attention. Why is this important field not getting the recognition it deserves? Toys have their own museums already, but even they tend to focus on static toys like dolls and teddy bears.
As a one-time game designer and developer myself, I have strong feelings about this. Preservation isn't just about retaining just the code and graphics, any more than history should be limited to lists of kings and dates.
We need a National Games Museum. And not one limited solely to video games – hence my dislike of hanging such an institution off the side of a video or film archive – but one that takes a broad, holistic view, covering board games, tabletop / war-games (which, believe it or not, were actually a hobby of a certain Herbert G. Wells; he even wrote two books on the subject), and going right back to pinball machines and Victorian end-of-pier games.
Did you not read the article? The man has both a wife and children.
Just as philatelists don't spend their every waking hour licking stamps and sticking them on envelopes, so games collectors don't just slob in front of a TV all day playing their games. The playing isn't the point: it's the having that matters.
As for emulators: oh, hell no. An emulator does not give the same look and feel of the original: running a ZX Spectrum or Commodore 64 emulator on a PC plugged into a modern flat-screen display just isn't the same.
You need the flicker, poor regulation and dot-crawl you can only get from a consumer-grade TV of the period, like an ex-rental Ferguson TX colour TV; the oddly hypnotic squeals while the game's loading screen streams in from the tape, or the short screech and burst of the US national anthem as a US Gold title's Novaload fast-loader kicks in. The sound through the mono, mediocre loudspeaker of the TV set, or the piezo-electric transducer of a ZX Spectrum 48K...
That 3-4 minutes of waiting for the game to load (with the possibility of it failing to load and having to be restarted – often after a few moments of twiddling with the tape head alignment) has the exact same effect as the queue before a theme park ride: the ride teases you with glimpses of what's going to happen; the game's loading screen similarly teases you and builds up the anticipation. By the time the actual title screen appears, you're already invested in the game, your imagination having had plenty of time to visualise what to expect.
And then there's the feel of the dead-flesh ZX Spectrum keys under your fingers, or the digital Kempston and Cruiser joysticks of the '80s. (Or the analogue controllers of the BBC Micros and some arcades.)
Any true preservation effort must preserve all of that context! Sterile emulators on modern hardware can provide some of that for you, but you need to preserve the hardware, for that, too, is a key part of this fading history. It not only gives you a much better idea of how the game felt to play, but it also shows the design ideals of the era, such as the materials used, the colours – the orange and black of a Binatone "Pong" clone console, or the wood veneer trim, as seen on the original Atari VCS – as well the manuals and marketing.
Modern LCDs are harsh critics of games designed to be played on old, often second-hand, and frequently imperfect bedroom CRT TVs. That dot-crawl, and those convergence and geometry issues, also helped smooth those chunky, low-res graphics.
Preservation is not, and never has been, about merely keeping the code itself.
Google would have me drive to Viterbo through no less than three small medieval villages, along the old medieval route. Said route includes multiple hairpin bends and roads that sometimes become too narrow for two small cars to pass each other, so you can imagine what it's like when HGVs try and do likewise.
Apple's Maps, on the other hand, correctly sends you along the bypass built over 20 years ago to relieve said three villages. It even gets the names and numbers right, as well as showing the area in rather more detail than Google's photos do.
Mapping the Earth accurately is hard. Neither company gets it spot on all the time, but this weird internet meme that only Apple's Maps ever get it wrong suggests that either TomTom are deliberately selling Apple a dodgy database (which seems extremely unlikely), or the media are being ridiculously selective about their memories. Google's mapping data wasn't exactly stellar during its first few years either.
"Both Google and Apple just seem to buy up tech these days. Do they make anything themselves?"
Why the hell would you piss money away on reinventing the wheel when there's a perfectly good wheel over there with a "For Sale" sign stuck to it?
When did "Not Invented Here Syndrome" become a desirable business policy?
Re: Bet Apple have already gotten around this in advance...
"The charge brick is that little doodad with the USB A slot you plug the cable into. I'm pretty sure you got one in your box, everybody else did..."
Actually, Apple have been selling mobile devices without PSUs for some time now. In fact, I'm not sure even iPads come with them any more, though my original iPad did as it required a higher output than standard USB ports could provide at the time. Their laptops do, obviously, but most people just charge their Apple mobile devices from the nearest handy USB socket. Same as everyone else.
This has been the case for nigh-on ten years now. Hence the "WTF?" reactions from Apple customers to The Register's click-bait headline. You can buy separate chargers if you want – and, yes, Apple will charge a hefty mark-up, as is their wont* – but the iPhone and iPods all come with a Lightning (or, for some older models, 30-pin) cable that has a perfectly standard USB A plug at the other end.
As for "overpriced" chargers: I have a twin-USB charger – a Belkin one, I think – I use for my old iPod classic and my iPhone 4. Works just fine. And cost a whopping, usurious, er... €7.99. Including two cables.
For the life of me, I have no idea what the article is actually trying to say here. The only part the EU can justify legislating on is the connector type and power output on the transformer itself. Most companies – including Apple – have already standardised on a USB socket for that, though the power output varies quite wildly due to the rather obvious fact that manufacturers tend to use batteries of differing capacity depending on context.
(E.g. the iPad 3 cannot charge from a standard laptop USB socket unless it's in standby. Switch it on and the power flowing into the device isn't enough to allow it to operate and charge at the same time. Hence the Lightning connector, which can also talk to USB sockets on Apple's own computers and request a higher wattage when the computer itself is connected to the mains.)
Legally mandating a specific socket on all mobile devices themselves effectively limits their design, and I can't see that going down well with any manufacturer, let alone Apple. This is also a road the EU really doesn't want to go down given the rapid pace of change in IT. The iPhone itself is barely six years old; the iPad is less than four. Who knows what's coming next? If we move into wearable technology, or flexible screens, do you really think manufacturers will want to be tied to a (relatively) chunky connector design?
* (Last time I checked, an official Sony PSU wasn't cheap either. Neither were Samsung's. But, as with Apple kit, there's no shortage of respectable third-party alternatives that cost a lot less.)
Re: The world does not revolve around Apple
Re: About as stupid as ...
You do realise all those programming 'paradigms' that go in and out of fashion every ten minutes are just some random tosser's opinion, right? There's no law engraved in stone that requires all programming languages to support every damned fashion under the sun.
That's the mistake C++ made: it's trying to be all things to all programmers, and fails quite spectacularly at doing the one thing that is required of all programming languages: to be human-readable.
No CPU I've ever used gives a flying toss about templates, classes, lambdas, etc.; OOP, Functional Programming, and so on, are all just so much structural scaffolding that is frequently so poorly designed that it gets in the way of the code itself. Such scaffolding has no place in the programming language itself and should have been shifted into the IDE UIs, where it belongs, a long time ago.
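To make that concrete with a deliberately trivial sketch (Python purely for brevity; the names are mine, not from any real codebase): a method call is just an ordinary function call with the object passed as the first argument. The class is organisational scaffolding; by the time it runs, it's the same work.

```python
# A "method" is only a function that receives its object explicitly.
# The class wrapper is organisational scaffolding; the CPU never sees it.

def inc_counter(state):
    """Procedural version: take the state, return the new state."""
    return state + 1

class Counter:
    """OOP version: the same arithmetic, wrapped in a class."""
    def __init__(self):
        self.state = 0

    def inc(self):
        self.state = inc_counter(self.state)

c = Counter()
c.inc()         # syntactic sugar for...
Counter.inc(c)  # ...a plain function call with 'c' as the argument
print(c.state)  # both forms did identical work: the counter is now 2
```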
Re: My how far NASA has fallen :(
NASA's funding has never, ever been anything more than a rounding error compared to the staggering sums of cash hurled at the military / defence industries over the same period.
Politicians do so love a good, lucrative, distracting war. It's practically a tradition.
As for the BBC: there's a myth that only the UK has a TV licence. Not only is this untrue, but many EU countries have both a TV licence and adverts on their publicly funded TV stations. Oh, and those licences are often more expensive too.
Seriously, try Italian TV sometime. It makes the BBC look like HBO and Netflix combined! RAI used to be pretty good at home-grown content, but Berlusconi's machinations put paid to that: 99% of RAI's output today consists of cheap talk shows, archive clip shows, and the like. Dramas are almost entirely imported (and dubbed), while the very few (admittedly quite decent) home-grown dramas tend to be the usual detective / cop show variety based on existing novels. So hardly a creative stretch.
The French and Germans are a little better, but the BBC is, as far as I'm aware, the only European TV broadcaster to have no advertising at all, regardless of funding.
I can only assume you've not noticed the new Mac Pro.
It's the perfect solution to the "Apple should ALSO make a mini-tower!" demands over the years: you can add as much expansion as you damned well please, and you don't need to pay for a computer the size of a hotel mini-bar fitted with a power-hungry PSU and cooling system that both have to be designed for the maximum potential load, regardless of whether you ever intend to expand it at all.
Yes, it's pricey, but quality kit usually is, regardless of the label. Take a look at the pricing for Intel's Xeon CPUs and those two AMD graphics cards, not to mention the PCI-e flash storage. Good luck building an exactly equivalent PC, with the same expansion ports, for anywhere near the same price. And no, a SATA-6 SSD doesn't even get to see the mustard, let alone cut it.
Traditional tower cases have barely changed in nigh on 30 years, so this really does count as proper "innovation" too. It'll be interesting to see if it does well, or ends up as another G4 Cube.
For those pointing out that the lyrics are different: song = lyrics + music. Without the music, all you have is poetry.
However, what Goldiblox have done isn't "parody", because they haven't actually changed the intent of the original song. All they've done is change a more subtle lyric into a 9lb. lumphammer that bludgeons the 'message' into the listener with all the grace and elegance of being repeatedly struck in the face by a Steinway grand piano.
Had they stuck with the original lyrics, the ad would have actually worked better, with the visuals clearly contrasting with the words. But the Beastie Boys made it clear years ago that they did not wish to have their music used for such purposes. At the very least, Goldiblox's failure to simply ask another artist – it's not as if the Beastie Boys' song is the only one ever written on the subject – instead of violating the last wishes of a dead man speaks volumes about the company's management.
(And I speak as someone who was never into the Beastie Boys, so I'm not exactly biased here.)
Re: No, Liam, I won't be using a fondleslab as my primary computer.
"Windows tablets [...] gave up on having a proper multitasking OS for that bullshit 8.11 for Fondlegroups "two things at a time, tops" crapfest."
You've clearly never actually used a Surface Pro then. See that tile that looks like desktop wallpaper? There's your WIMP GUI right there, same as it always was. Everything Windows 7 can do, Windows 8 can do too – in some instances, it even does it a bit faster. You can even get two types of cover with integrated keyboards. I'm writing this on a three-year-old MacBook Pro, but even I'm seriously tempted by a Windows 8 (not "RT", which is definitely too half-baked) machine. It's a solid OS, and it even has a nice, shiny app launcher. Granted, the latter doesn't appear to be to everyone's taste, but I found the old menu system a pain to use – mice are a major cause of RSI problems in a way trackpads aren't.
What we're seeing isn't the death of the WIMP metaphor, but its sidelining into niche markets as the vast majority of computers these days really aren't being used for much more than email, Facetwit and web browsing.
I'm a mild-mannered translator by day and have found my iPad invaluable for dictionaries and reference works. In my business, that's not "consuming", it's a real, actual, bona-fide work tool. When I upgraded my old Mk. 1 to a Mk. 3 iPad, it paid for itself in just under two weeks. I haven't bought a printed book or magazine since my first iPad, in 2010, and I can't say I miss them.
Liam's one and only mistake with his article was in not adding "Your mileage may vary" at the end.
Re: I'm always surprised at the naivity of people
"I mean seriously, what do you expect to happen if you download software the creator refuses to give you the source code? Why would anybody keep the source code from you other than wanting to defraud you?"
What earthly use is the bloody source code to someone who has no clue about programming?
People who can read source code and understand what it does – and who also happen to have lucked out in becoming expert in the same programming language(s) as the original developer – are unlikely to be ignorant enough to install such malware in the first place!
However, NOBODY can be an expert in every field of human endeavour. IT is just one field among many. How would you like to be told that you got exactly what was coming to you every damned time your ignorance of a particular subject betrayed you? How would you like it if every time you failed to make a multinational corporation compliant with the likes of Sarbanes-Oxley and ISO 20001, you saw someone pointing and laughing at your ignorance and calling you a "n00b"?
So much for your accusation of naïveté: We are ALL ignorant. We're just ignorant about different things.
Most people don't want to build their cars from scratch, nor do they particularly care how they work. They'll happily buy a Ford Fiesta, or a BMW, or whatnot, and simply drive the thing. All cars share one common feature: their core user interface. Some details will change from car to car, but if you've learned to drive in a Vauxhall Astra, it's a fair bet you can work out how to drive a FIAT Punto or any other make and model of car built since the 1950s.
For every James May, who could cite chapter and verse from the relevant Haynes manual for each car, there are a hundred Jeremy Clarksons, who couldn't give a toss how the bloody machine actually works. Yet most developers still believe everyone who has any contact at all with a computer should be like James May.
The IT industry has moved on quite a bit since the 1960s and '70s.
Open Source has become an anachronism. It is very much part of the problem, not the solution. Forget GNU, Stallman and the FOSS movements: they're yesterday's causes. The problem today isn't source code, but interfaces.
Not just in the software, but across the entire chain – from box art to silicon chip, from API to documentation – it's all about interfaces, not code.
End users should not be required to read complicated EULAs to determine whether the code they've downloaded actually does what it says on the tin. Why shouldn't they be able to pay for virtual gatekeepers to screen such things on their behalf? This is exactly why companies like Apple and Amazon have opted to provide such "gated communities" for their users.
Developers – and the IT community in general – really have only themselves to blame for this: you'll have massive flamewars over trivialities like tabs vs. spaces, while criticising the poor bloody users who have to put up with the ill-designed, barely usable, and barely-supported tripe you expect them to learn how to use. And then you think nothing of bundling in someone else's crap with your "free" software, because your definition of "free" isn't the same as the one in the dictionary.
The IT industry's problems aren't Apple's, Google's, Microsoft's or anyone else's fault but yours. You've had half a century of power, but you've chosen to ignore all the responsibilities that come with it. It's time that changed.
There is a veritable Babel of programming languages out there, and merely reading some books and tinkering about with each of them does not make you an expert.
This will come as a shock, but some of you clearly haven't understood what the "I" in "API" actually stands for. Or the purpose of good documentation. Similarly, a published data format is also an interface. Interfaces are everywhere.
Google Play is the only "walled garden" out there. It has gardeners who react to problems after they've happened, not gatekeepers who stop the problems getting in in the first place.
Re: I will get one because....
'But if I purchase a PS4 I can expect the wife to say "so what does it do different?"'
May I suggest you ask your wife why she is fine with spending hundreds – if not thousands – of quid on shiny lumps of rock artfully nailed into equally shiny bits of metal, or ooh-ing and aah-ing over a pretty arrangement of vegetable genitalia... yet she believes YOU are the shallow one for wanting to spend a few hundred quid on a new home entertainment* centre that actually does something besides merely looking shiny.
* a concept that also includes games. There's no need to make a distinction.
Re: A fine line between Vision and Arrogance
"Never quite understood whether the new user interfaces being foisted on us are because *we* (the punters) lack vision to understand it, or *they* are just being arrogant and treating us like cash cow cattle."
It's the former. Windows 8.x replaced the rather weird Start Menu with a proper app launcher, which also runs big widgets. All the keyboard shortcuts – which everyone calling themselves a seasoned veteran or professional should know – are unchanged. ALT+F4 will close a ModernUI app just as it closes a conventional Windows GUI app, for example.
There are textbooks on this. Many of them written as far back as the late '60s and early '70s, when the R&D phase for the WIMP desktop metaphor we still see on desktop GUIs today was still in its infancy.
That the above is clearly a surprise to many so-called "professionals" is shocking to me; it was very basic stuff when I was studying Computer Science in the 1980s.
@Buck Futter, Tannin, et al:
Many people seem to have a problem with Windows 8.x, but I've actually found it's easier to get newbies into it than it was with previous versions. Rather than presenting you with a pretty picture and some cryptic icons, it actually starts with an application launcher that shows a bunch of very clear tiles, each of which tells you what it does and even gives you some basic information before you've even clicked on it.
As for myself: according to every WIMP GUI rulebook, the GUI is there for *newbies*. Nobody else. Intermediate and advanced users are supposed to learn the bloody keyboard shortcuts!
If, like me, you have done just that, Windows 8.x poses no difficulties whatsoever. Want to close an application – or even bring up the shutdown dialog box? ALT+F4. Each new release has added new shortcuts, but many of the existing ones have been there since Windows for Workgroups!
The problem is that nobody's teaching this any more. When so-called "professionals" proclaim themselves grizzled veterans with umpteen years of expertise in a platform, yet admit to being bamboozled by changes to what is, when you get down to it, a glorified app launcher, you have to wonder what they're teaching kids at university these days.
Such people are, at best, amateurs, not professionals. Their blatant ignorance of basic GUI usage rules is proof enough of that. If you're still relying heavily on a mouse or trackpad to get your quotidian work done, and you're not an artist or architect, you're doing it wrong. By definition. There are actual textbooks explaining all this.
That tiled GUI really is piss-easy for neophytes to understand. It's easy to forget that we had to *learn* to navigate the (original hierarchical) menus and drill down to our application – never mind having to remember *which* application we needed to open! Now, my aunt need only look for the "Mail" tile, see that there's a message or three waiting for her, and click on it. It's all there right in front of her. And this is a Good Thing™ as it means she needs to rely rather less heavily on her failing memory.
iOS and Android – hardly surprising given the former's influence on the latter – led the way, but Windows' ModernUI picked up the widgets idea and ran with it, making it the central feature. Trevor Potts' point about separating this from the old Windows GUI is a valid one, though: Windows 8.x is very much a transitional release, and it's likely Windows 9.x will be too, given the glacial pace of upgrading in the corporate field.
iOS was the first mainstream GUI to break with the old WIMP formula, so the keyboard shortcuts point doesn't apply to that. (Or to Android.) Microsoft also needs to make that transition, but whether beating Windows into submission with the multi-touch GUI stick over a number of transitional releases is the best way to achieve that is a question only the market can answer. In fairness, Windows 8.x is a pretty good choice for people, like myself, who have to do a lot of typing. Some of the hybrid Windows 8 "tabtop" devices out there are a perfect fit for my needs.
Wacom's Companion Pro (essentially a Wacom digitiser and stylus nailed onto a tablet very similar to the Surface Pro 2) is looking very attractive to me right now. It's flashing its ports seductively at me as I type this. Cease, you Jezebel! You tablet of the night!
Nurse! The screens!
I agree with part of the article: Steve Jobs was exactly what Apple needed in their hour of need, but that's largely due to his previous involvement with the company. The only equivalent for Microsoft would be the return of Bill Gates, who has already made it clear he's not interested in retreading old ground. (For all Jobs' later success, Gates didn't need to mess it up and spend years in the wilderness to learn the necessary skills. Gates nailed it first time around.)
However, I disagree with the tiresome repetition of a pointless meme: what Apple has is a *gated community*. It's not the walling-in that's the point here, but the *curation*. Android has barely any curation at all, hence its frequent security issues. iOS' App Store, on the other hand, *is* curated, which is less like a gardener wandering around a walled garden and occasionally reacting with an, "OI! Gerroff the lawn!", and more like the guards of a gated community who stop undesirables getting in in the first place. (No, they're not 100% successful, but they're close enough.)
What Microsoft needs is *focus*. It is making the same mistake Apple was making in the mid-'90s: it's doing too much. They sell no fewer than three flavours of desktop Windows, each with multiple variants. They sell server variants too. They make a games console (also with its own OS), they make games, they sell a major office suite, own a bunch of cloud services, and they sell industrial-strength software development tools too. They even make keyboards and mice.
Jobs was right to slash Apple's massive, and very confusing, product range when he returned: focus is a common factor in very successful businesses. You can't do that when you have a portfolio even wider-ranging than Apple's under Gil Amelio's tenure.
Unlike Jobs' scorched Apple policy, Microsoft could slash its portfolio by simply spinning off the profitable units into separate entities. Microsoft needs to make itself agile enough to react more quickly and effectively to the ever-changing world of IT – an industry that is, almost by definition, in a state of perpetual transition.
THAT is the hard part: changing Microsoft's management and corporate structure entirely. Given the present corporate structure, it may be that what Microsoft really needs today isn't so much a Steve Jobs, as a Genghis Khan.
Re: The Goons, really?
The original 23rd November 1963 broadcast was indeed followed immediately after by an episode of the Telegoons ("The Canal" episode).
The first episode of Doctor Who was also repeated the following week, on the 30th of November, due to the assassination of JFK on the 23rd overshadowing the first broadcast. The next episode of the Telegoons was not broadcast until the 7th of December – skipping the 30th. I can't find any scans of the Radio Times for the 30th though, so I'm only speculating that it was likely a casualty of the decision to show the repeat.
Re: 80's Doctors
Tom Baker had a few great episodes, but his theatrical acting style hasn't aged well. And there were a hell of a lot of duds too, not to mention an over-reliance on 'homages' to old gothic horror movies.
In fairness to Michael Grade, I think Colin Baker's portrayal suffered mainly from coming after Peter Davison's. Colin gave it his all – a little too much so – but the problem is that he and Tom Baker both have a strong theatrical acting background and it *really* shows when watching their stories on DVD. They both come perilously close to channeling Brian Blessed at times.
Peter Davison was very much a TV actor first and foremost and understood the medium's strengths and weaknesses. He gave a much more subtle, nuanced portrayal. Casting Colin felt like a step backwards. Despite some decent episodes – I'm very partial to "Vengeance on Varos" and "The Two Doctors"* – Colin's tiresome schtick of repeating the same word three times, each louder than before – "Hammy? Hammy?! HAMMY?!!!" – grated very quickly. Davison basically made the two Bakers' theatrical acting style obsolete.
McCoy was an inspired choice though, so it's a shame he was saddled with a bunch of scripts originally written for Colin Baker's portrayal for his first season. (And some very odd choices of scripts for the second.) His final season, on the other hand, holds up pretty well despite the series' shoestring budget and the BBC's utter indifference to the series itself.
All that said, I'm still amazed the series survived the casting of Bonnie Langford. Poor Colin Baker. He never had a chance.
* (Patrick Troughton and John Stratton were clearly having a whale of a time, and we also got Jacqueline "Servalan" Pearce thrown in as well.)
There's a good reason why Apple kit tends not to have much upgradeability: eBay.
Apple hardware tends to hold its price very well – especially at the "pro" end. My 17" MBP (2010 model) has 8GB RAM and a 512 GB SSD. It screams. And it's still worth over £600 on eBay, despite being closer to its 4th birthday than its 3rd.
I've yet to see the HP, Dell or Lenovo laptop that can boast the same.
Many owners of Apple kit are well aware of this and tend to effectively trade-in their old model for a new one every couple of years. Why the hell would they waste their time going through all the bother of upgrading?
(Before you reply: being a reader of The Register does not define you as a professional computer user, any more than being able to strip down and rebuild a V8 engine makes you a "professional" car driver.)
Re: As an ex- tech pubs* guy...
I used to write docs myself.
My niche was the game development tools industry: I wrote the user guide for Criterion's "Renderware 3" / "Renderware Graphics" (for which I can only apologise), and also the docs for the first few releases of Allegorithmic's "Substance" suite. I did some odds and ends for the Unity folks too some years ago. It's amazing how easy it is to find shockingly overpriced documentation tools that make even Emacs look like a simple, elegant editor by comparison.
"The engineers (as noted) hate documentation because they KNOW that their products are brilliantly intuitive and so NEED no documentation;"
This. Oh, so very this. (I lost count of the number of times I had to remind some of my colleagues what the "I" in "API" stood for. Developers can be end users too, so even an API should have basic UI design rules applied to it.)
Thing is, writing documentation really is a truly thankless task. Everybody hates what you do and considers your entire job pointless. End users have become so used to manuals being either missing, or abject shite, that they assume it's safe to expect this to be true of all applications. Not only do they never read your 500+ pages of bloody hard work, but they'll actually be surprised such a source of information even exists. And your own colleagues also see you as some kind of parasitic life form whose job appears to be asking them to explain what they consider blindingly obvious. There is nobody so blind as an expert.
Good technical authoring is bastard hard to do. Not only do you have to become an expert in the entire product you're documenting, you also need to be able to explain it to your readers without overwhelming them with information. Just enough background information – plus links to more in-depth info – and no more. And complex software can have features that rely heavily on other features too, so you need to organise the teaching to ensure you have full coverage.
Not everyone can do it effectively, despite many developers' belief in the contrary. Frankly, most developers I've met – including many with "Ph.D." after their names – seem to be either flat-out illiterate, or insist on making everything read like a particularly obtuse scientific paper. They sure as hell don't understand how to teach well.
Never mind that the greatest feature in the world is of no f*cking use to anybody if they can't work out how to use it. (Ironically, this is precisely how Apple have crawled out of near-bankruptcy to the top of the IT heap: they truly grok user education.)
I got the hell out when I realised that most of my potential clients had begun to see support as a chargeable feature, turning it into a revenue stream: a decent manual suddenly becomes a bad idea as it reduces support calls and, consequently, revenues from that particular stream. (Indeed, this appears to be how most GNU / FOSS applications are actually supported financially: write something that's genuinely useful, but give it a cryptic UI that effectively forces your customers to pay you for training and support and you can make a decent profit.)
I do translation now, which is a whole fresh hell of WTF in its own right, but at least people actually appreciate your work. They also don't tend to claim that "it's just writing! Anyone can write!"
Re: Really really basic computers
Procedural languages, Functional languages, OOP, etc., are mostly just different kinds of organisational scaffolding. No mainstream CPU actually gives a toss about any of that stuff; it all ends up as machine code in the end. Often on an Intel or ARM CPU, neither of which care one whit about whether you like to organise the original code in your source files as subroutines or as objects. Same meat, different gravy.
None of that stuff matters.
What matters is understanding how computers "think", because programming is just a synonym for "translation" and is actually pretty easy to learn at that level. I was far more productive writing Z80 or MC680x0 assembly language than I ever was writing in C++. I used to be proud of writing bug-free code, and it really *was* bug-free. But those days are long gone. The hardware has become orders of magnitude more powerful and capable, but the tools we use to program it all have barely changed since the flint axes of the 1970s.
It's 2013 and we're *still* writing code using artificial languages that require us to walk on eggshells due to their cruel and unusual punctuation. My *phone* can render a Mandelbrot set in real-time and run full-on 3D First-Person Shooters at HD resolutions, and yet we insist on forcing *humans* to do stupid grunt-work like adding a semicolon at the end of a line to save the compiler from a picosecond of calculation? How the hell is this even acceptable? How is this not front-page news in The Register and all its rivals? THIS is the scandal of our time.
No wonder today's software comes wrapped in legalese instead of warranties.
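And since the Mandelbrot set got a mention: the per-pixel test behind all that real-time rendering is a few lines of grunt-work arithmetic. A rough Python sketch of the standard escape-time check (illustrative only; nothing phone-optimised about it):

```python
def mandelbrot_iters(c, max_iters=100):
    """Standard escape-time test: iterate z -> z*z + c from zero and
    count how many steps |z| stays within 2. Points in the set never
    escape; everything else does, sooner or later."""
    z = 0j
    for n in range(max_iters):
        if abs(z) > 2.0:
            return n           # escaped: c is outside the set
        z = z * z + c
    return max_iters           # never escaped: treat c as inside

print(mandelbrot_iters(0j))       # the origin is famously inside the set
print(mandelbrot_iters(2 + 2j))   # while 2+2j escapes after one step
```

Colour each pixel by the returned count and you have the classic picture; the point stands that the hard part is the sheer number of pixels, not the maths.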
Re: Storage indeed
"No SD card in the nexus, but at least you can use a USB stick, or 20, or a USB microSD card reader if you like..."
So, er, just like an iPad with the Camera Connection Kit then?
No, it doesn't have a MicroSD slot built in, but most users don't need it. Why compromise a product's design and usability* for the sake of a relatively tiny number of edge cases?
If you want a tablet with stacks of storage, buy a Windows 8 model instead. Some of those come with 256 GB (and even 512 GB) of storage.
* (removable storage is a royal pain in the arse from a UI perspective. There's a bloody good reason why Apple went with software-eject systems for their Mac floppy disk drives back in the day.)
Re: Many tools for the job?
"and Microsoft isn't going to fund actual education."
Black & Decker aren't going to fund metalwork or woodworking classes either.
It's Microsoft's job to supply the tools. Anything else they do is icing on the cake. It's the school's job to provide the actual 'education' part. By offering industry-standard tools that students will actually encounter in the real world at a massive discount (and effectively for free for many students), Microsoft are reducing the total funds required for that education.
Sure, you could use Libre/OpenOffice, but it simply isn't that good. If it was, businesses would have standardised on them long ago. They are, after all, free. If you can't even gain market share by giving your product away for nothing, the problem isn't your competition. It's you.
Support counts for a lot. As does (relatively) easy customisation and extensibility, as well as a huge ecosystem of third-party applications that plug into the Microsoft Office suite. Companies like SDL, who create translation software, rely on MS Office being installed to handle the preview feature for Word-supported file formats, for example.
Libre/OpenOffice (as well as Apple's own "iWork" suite) originally competed with Microsoft Works, not Microsoft Office. Sadly, Microsoft axed Works a long time ago, but it seems the *Office communities haven't really understood what it is that makes businesses so willing to pay licenses for Microsoft's products regardless. It's not merely inertia.
Re: If you get them young and you will have them for life
"should openly promote FLOSS because it's the right thing to do."
Wrong answer: Discrimination is wrong no matter which side you discriminate against. Either way, it's still discrimination. The correct answer is to demand that schools use both. It's not as if installing LibreOffice would add much, if anything, to the costs, and both suites can use the same file formats now. It'd also be a lot cheaper for students, although Microsoft's offering goes a long way to help there.
Software is a damned tool. I don't need to know how a hammer works if all I intend to do with it is hang some pictures on a wall. Open Source only has value if there is sufficient interest and competence in your target market to understand it and work with it. It's worth pointing out at this stage that, when the GNU movement first began, there were a damned sight fewer programming languages and philosophies around too. Today, there's a veritable Babel of such languages, with new ones seemingly being invented every other day, so the chances of your audience actually being sufficiently competent in the language(s) you're developing in are shrinking, not growing.
Open data formats are far more useful and valuable than mere access to source code. What matters is that my data remains accessible if development on the tool that created it should end. Given that MS Office and the Open/LibreOffice suites can both read ODF files now, this is no longer an issue; all schools should have to do is teach the value of open data formats.
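To underline the open-formats point: an ODF file is simply a ZIP archive whose `mimetype` entry declares the document type, with the text itself in `content.xml`. Even Python's standard library can inspect one, no office suite required. A minimal sketch (building a toy stand-in archive in memory rather than assuming a real .odt file is to hand):

```python
import io
import zipfile

# Build a minimal stand-in for an ODF archive in memory. (A real .odt
# contains more entries; these two are enough to show the principle.)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("mimetype", "application/vnd.oasis.opendocument.text")
    z.writestr("content.xml", "<office:document-content/>")

# Any tool that understands ZIP and XML can now identify and read the
# data: no vendor's suite required. That is what an open format buys you.
with zipfile.ZipFile(buf) as z:
    kind = z.read("mimetype").decode("ascii")

print(kind)
```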
Re: even my kids with learning disabilities can manage that
It helps if you check the anchor and text-wrap settings for tables and pictures. Once I understood what each of those settings actually did, I had no further issues.
That said, Mr. Stross needs to learn that he should use the right tool for the job. MS Word isn't designed for writing novels. It's a *corporate document* creation tool. (Which I use frequently and have never had any problems with. I translate for a living, so MS Word is unavoidable: unlike most professions, translation has no concept of a "standard" document format, so I have to be able to deal with DOC, PDF, XLS, CSV, AI, InDesign files, PO files and more.)
For book design and development, I use tools like Ulysses III or Scrivener. Both are highly recommended and a far better fit for writers and novelists than MS Word. The only reason for using MS Word as an author is either wilful* ignorance or masochism. There's no need for it. I know professional novelists who swear by Scrivener, for example.
I believe Scrivener has a Windows version now. I'm not sure about Ulysses III; I suspect not. Then again, even Screenwriter 6 has a "Novel" template now, though it's mostly aimed (not surprisingly) at screenwriters.
* (Google exists, Mr. Stross. A novelist whining about how hard it is to write with MS Word is like an architect whining about how hard it is to design buildings in Logic Pro X. All you do is make yourself look like an ignorant twit.)
To be fair, such a digression could easily fill another 3-4 articles on its own.
Magnetic recording triggered the birth of the technique many (incorrectly) refer to as "sampling" today. (The term is "sample loops" or just "loops" – "sampling" refers to the process of recording the actual sounds digitally.) The looping of those sampled sounds underlies almost all music today. It's most obvious in the work of Norman Cook, but even 'live' acts use it as a matter of routine now in their studio recordings.
This technique was first used in the 1940s and led directly to the "Musique Concrète" movement that was so influential on the likes of the BBC Radiophonic Workshop. Most of their output during the 1950s and '60s was Musique Concrète, albeit often with very early synthesised sounds added.
The term "loop" came from these pioneers: they had to do it the hard way, by recording sounds onto tape, re-recording them as many times as required onto another tape – at different speeds, in order to get the necessary notes – then chopping the resulting tape up into individual notes. (This was all worked out mathematically, hence the maths or engineering backgrounds of many of those involved.)
These individual snippets of sound – bass lines, rhythm / percussion cycles, even short sections of melody – were then joined together to form the necessary tape loops needed to produce the final piece. For example, the percussion sequence might be quite short and result in a fairly short loop of tape that could be repeated over the full track. (Most music is filled with repeated patterns and motifs. Listen to any dance track and it's pretty obvious, but you'll hear it even in Beethoven and Bach.)
A complex piece might require multiple, very long, loops of tape, all played in synchrony, with some loops so long that the team would have to jerry-rig a system of reels and pulleys to run the tape out of the machine, out of the door, down a corridor, and all the way back again.
Multiple tape machines were used for this. Even "bouncing" tracks is a term descended from this era.
Digital audio workstations ("DAWs") hide all the fiddly mathematics and tape editing today, but the underlying concepts and principles haven't changed.
Most of this can be gleaned from the BBC's own (belated) celebration of the defunct BBC Radiophonic Workshop, "The Alchemists of Sound". It's a shame the BBC never expanded it into a series covering the wider context of electronic music in general.
Re: @Sean Timmarco Baggaley - "Yet he, too, chose to stay and work with Jobs."
Seriously? Tim Cook – whose core skill set is in optimising supply chains – has "non-transferrable skills"? Jony Ive had his own UK-based design company ("Tangerine") before he moved to California to work for Apple. With his reputation, I doubt he'd struggle to find clients if he decided to strike out on his own again. So why didn't he do that?
Quite a few people did leave Apple. Some of them came back. How can that be if Jobs was such a git?
Jobs' own friends have described him as "mercurial", but that's hardly a unique trait in a CEO. You have to be pretty ruthless to turn a company from a near-bankrupt basket case into the most successful business on the planet in just ten years.
Most of the people who give the keynotes at Apple launches today could have left at any time – the likes of Ive and Federighi could easily write their own tickets given their track records – yet they have chosen not to. Jony Ive wasn't even a Jobs hire: he was already at Apple long before Jobs returned and had never worked as an employee of either NeXT or Pixar. So you can't claim cronyism either.
So, again: if Jobs was such a nasty piece of work, why the hell did so many people who could have easily walked right into a new job (or set up on their own) choose to stay and work with him?
Answer: he wasn't as big an "asshole" as he's made out to be – usually by people who never actually met him, or spent any substantial time with him. Jobs was certainly a control freak and a perfectionist, so it's not difficult to see why he hated doing anything that he could not have any control over. He practiced some of those keynotes for weeks.
Reporters and journalists tend to be interested in people, but Jobs didn't talk about himself much. He was no Richard Branson, whose entire career has mostly involved blowing his own trumpet and pimping his "Virgin" brand. The latter gave good interviews, but Branson is unlikely to go down in history as anything other than a famous, self-publicising beard who got very lucky and milked it for all it was worth.
You might want to read the Techies with Asperger's article elsewhere on the site before making such comments. There's strong evidence that Jobs had mental health issues and I suspect that those who hated him were mostly people who couldn't make allowances for that. Even as a child, he tended to think digitally: everything was either "awesome" or "shit". There was never any middle ground. So those traits were there right from the start.
Asperger's (or some other part of the Autism spectrum) would certainly explain a lot, as would OCD.
Apple has been the biggest contributor to the "Product Red" charity for some time now, so the notion that Apple never donated any money at all under Steve's watch is patent nonsense. Jobs' estate has also pointed out that they preferred to donate anonymously.
Jonathan Ive, Tim Cook, and many, many others at Apple have had plenty of opportunities to leave the company and either go work for others, or set out on their own. (Some have, in fact, done precisely that, as a look at Woz's full CV will attest.) Furthermore, some Apple people left, then willingly chose to return.
Let me repeat that, in case you missed it: these people chose to work at Apple with Steve Jobs. Jobs was in charge of Apple for 13 years or so. Jony Ive was hired in the early '90s – long after Jobs' initial departure and some years before his return. Yet he, too, chose to stay and work with Jobs.
Thirteen years is plenty of time for all those people to update their CVs and send them out. So what stopped them? Did Jobs hire heavies to point guns at their heads? Did he kidnap their families? If Jobs was truly the complete arsehole he's often made out to be by his detractors, why didn't all those people who worked for him leave?
The man clearly inspired a surprising amount of loyalty from his friends, so he must have had something going for him. He was even married to the same woman for 20 years, with whom he'd had three children, when he died. If he was that hard a man to work with, how did that happen? When they married in 1991, Pixar's first success was still four years away and NeXT was hardly a money-spinner either, so Laurene Powell certainly didn't marry him for his money.
Re: "Sorry" is the hardest word.
No, this is what happens when a publication has more than one writer working for it. Different writers = different perspectives.
Also, some forms of what was once known as Asperger's Syndrome are linked to high levels of paranoia and destructive behaviour. The term "ASD" was created for a reason and this is one of them. A person closer to the High Functioning end of the ASD will be able to cope with "normals" with a bit of help. As you get closer to the other end of the ASD, you get people who simply cannot function at all in ordinary society. Such people are often completely isolated from the people around them and, yes, paranoia and destructive behaviours are not uncommon.
And this is a "spectrum" – think of it as like a timeline, with one end of the line being "normal" and the other end labelled "completely hatstand". People with Asperger's Syndrome tend to be closer to the "normal" end, but the boundary between "Asperger's" and full-cream "Autism" was never satisfactorily and unanimously defined. Hence the replacement of the multiple variants with just one: Autism Spectrum Disorder.
Describing McKinnon as "paranoid, unreliable and destructive" would be quite valid if his diagnosis puts him closer to the full "hatstand" end of that X-axis. The worse your Autism, the harder it is to cope with other people in any way at all – even your own parents.
An appearance of paranoia is not uncommon as you move away from the 'high functioning' end of the spectrum, but you need to understand that we typically apply such descriptions based purely on outward behaviour; it can be difficult to tell if what we call "paranoia" is simply a manifestation of a more general, deep-rooted, terror of that constant torrential flood of data an autistic person is faced with during their every waking hour. Similarly, what an observer would consider "destructive behaviour" might have a perfectly rational explanation for the sufferer.
If any readers here have never read Oliver Sacks' seminal "The Man Who Mistook His Wife for a Hat" and related books, I strongly recommend you do so.
The "emotionless = violent psychopathic killer" connection is made by people (and TV and movie writers) who appear incapable of understanding that even anger is itself an emotion.
Why would someone who doesn't react much to any emotion decide to suddenly make an exception and explode with rage and fury? It's not that emotions aren't there, it's just that they're felt nowhere near as intensely, be it joy or sadness, love or hate.
Such people tend to do well in jobs where what they have to do would turn any other person into a gibbering emotional wreck.
How you express your emotions is a far better indicator of how violent you're likely to be. If you're someone who feels every emotion intensely, you will feel both joy and sadness, love and hate intensely. If you're also prone to bottling up your emotions and venting only when that metaphorical bottle is full to bursting, you're likely to be far more dangerous than anyone on the Autism Spectrum.
I suspect that such emotional issues are more closely linked to current theories on depression and related disorders than to the data-processing / management problems characteristic of autism. (Indeed, there's no reason to assume someone cannot have both an ASD and, say, clinical depression.)
The mammalian brain is a complex machine – far more so than any computer. Yet we've had the latter around for about 60 years now and, despite a CPU being essentially a collection of transistors, the behaviour of each of which should be entirely predictable, we still struggle to write bug-free software to this day. It's not the hardware that's difficult, but the code that runs on it.
The theories and hypotheses we have on the brain itself are pretty much at a similar level: we have a pretty good idea what individual synapses and neurones are, but have barely scratched the surface of what it is they actually do all day.
Re: Aspies are 13 to the dozen.
Actually, as another poster mentioned earlier, the term "Asperger's" is no longer used.
Spectra and continua are increasingly being used in diagnoses, instead of the older, 'digital' approach that required ticking a very specific list of boxes to obtain a diagnosis.
The human brain is such a complex organ* that the very idea that "normal" people even exist is just bizarre. It's far more likely that people with fully-functioning brains are actually quite rare, while the rest of us range from the extremely knackered to the only slightly buggered. And that buggeredness could be anything, from a poor sense of direction, through to something like colour-blindness or a potentially terminal inability to read user guides (or see the bloody great "Help" menu right up there at the top of the window. Look! See? That! What do you think it's for?)
Asperger's is not – and never has been – a license to be an asshole, not least because "asshole" is a very subjective description and depends on your point of view. Most people seem to believe the late Steve Jobs was a right tosser, but Jony Ive, Tim Cook and their colleagues don't appear to have thought so.
However, "high functioning" sufferers can also learn to be sociable. Many of the things most "normal" people seem to do subconsciously, such as reading body language in real time, are things we "data-processing impaired" have to concentrate on consciously – often to the point where we end up with splitting headaches from trying to process all the data.
I'm one of those who also struggle with noisy environments. I've found lip-reading helps, and also goes a long way towards countering the eye-contact avoidance problem too, but it often makes moi brain 'urt!
Which is why I hate parties.
* You, sir, have a filthy, filthy mind.
Re: What hasn't been mentioned....
'That would be an absolute nightmare for app developers.'
Tough. It's not the user's job to make life easier for the developer. It's the developer's job to make life easier for the user.
Android's APIs clearly need a serious rethink if this is such a chore for developers to deal with. iOS app developers have to deal with this kind of thing too and most do so without kicking up a big fuss. (It helps that the relevant iOS APIs are pretty easy to use. Perhaps Google should be aware that the "I" in "API" stands for "Interface" – i.e. developers need good UIs too!)
'How do you deal with an angry user who's blocked a fundamentally required permission for your app and then starts reviewing it poorly because "it doesn't work"?'
Oh, I don't know... how about being better at app design and development, catching the errors caused by disabled permissions, and failing gracefully with suitably clear messages and notices to the user explaining why a feature isn't working?
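That "fail gracefully" approach needn't be complicated. This isn't the real Android API – the class, permission string, and messages below are invented purely for illustration – but the pattern of checking a grant and degrading with a clear explanation, rather than crashing, looks something like:

```java
import java.util.Set;

public class CameraFeature {
    // Stand-in for the platform's set of runtime-granted permissions.
    private final Set<String> granted;

    public CameraFeature(Set<String> granted) {
        this.granted = granted;
    }

    // Degrade gracefully: if the user has blocked the permission,
    // tell them plainly why the feature is unavailable and how to fix it,
    // instead of throwing an exception or silently doing nothing.
    public String takePhoto() {
        if (!granted.contains("CAMERA")) {
            return "Photos are disabled: grant the Camera permission "
                 + "in Settings to use this feature.";
        }
        return "photo taken";
    }
}
```

An app built this way turns the "angry one-star review" scenario into a self-explanatory notice, which is exactly what the comment above is asking developers to do.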