* Posts by Sean Timarco Baggaley

1038 publicly visible posts • joined 8 May 2009

Note-jotting and mapapp firms VANISH into Apple's MAW

Sean Timarco Baggaley

Google would have me drive to Viterbo through no fewer than three small medieval villages, along the old medieval route. Said route includes multiple hairpin bends and roads that sometimes become too narrow for two small cars to pass each other, so you can imagine what it's like when HGVs try to do likewise.

Apple's Maps, on the other hand, correctly sends you along the bypass built over 20 years ago to relieve said three villages. It even gets the names and numbers right, as well as showing the area in rather more detail than Google's photos do.

Mapping the Earth accurately is hard. Neither company gets it spot on all the time, but this weird internet meme that only Apple's Maps ever get it wrong suggests that either TomTom are deliberately selling Apple a dodgy database (which seems extremely unlikely), or the media are being ridiculously selective about their memories. Google's mapping data wasn't exactly stellar during its first few years either.

Sean Timarco Baggaley

"Both Google and Apple just seem to buy up tech these days. Do they make anything themselves?"

Why the hell would you piss money away on reinventing the wheel when there's a perfectly good wheel over there with a "For Sale" sign stuck to it?

When did "Not Invented Here Syndrome" become a desirable business policy?

Fanbois, prepare to lose your sh*t as BRUSSELS KILLS IPHONE dock

Sean Timarco Baggaley

Re: Bet Apple have already gotten around this in advance...

"The charge brick is that little doodad with the USB A slot you plug the cable into. I'm pretty sure you got one in your box, everybody else did..."

Actually, Apple have been selling mobile devices without PSUs for some time now. In fact, I'm not sure even iPads come with them any more, though my original iPad did as it required a higher output than standard USB ports could provide at the time. Their laptops do, obviously, but most people just charge their Apple mobile devices from the nearest handy USB socket. Same as everyone else.

This has been the case for nigh-on ten years now. Hence the "WTF?" reactions from Apple customers to The Register's click-bait headline. You can buy separate chargers if you want – and, yes, Apple will charge a hefty mark-up, as is their wont* – but the iPhone and iPods all come with a Lightning (or, for some older models, 30-pin) cable that has a perfectly standard USB A plug at the other end.

As for "overpriced" chargers: I have a twin-USB charger – a Belkin one, I think – I use for my old iPod classic and my iPhone 4. Works just fine. And cost a whopping, usurious, er... €7.99. Including two cables.

For the life of me, I have no idea what the article is actually trying to say here. The only part the EU can justify legislating on is the connector type and power output on the transformer itself. Most companies – including Apple – have already standardised on a USB socket for that, though the power output varies quite wildly due to the rather obvious fact that manufacturers tend to use batteries of differing capacity depending on context.

(E.g. the iPad 3 cannot charge from a standard laptop USB socket unless it's in standby. Switch it on and the power flowing into the device isn't enough to allow it to operate and charge at the same time. Hence Apple's beefier chargers, and the way the device can also talk to USB sockets on Apple's own computers and request a higher wattage when the computer itself is connected to the mains.)
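(Rough numbers, mine rather than the article's: a standard USB 2.0 port is only specified to supply 500 mA at 5 V, whereas the iPad's bundled charger is rated at around 10 W:

$$ 5\ \text{V} \times 0.5\ \text{A} = 2.5\ \text{W} \quad \text{vs.} \quad \approx 10\ \text{W} $$

With its screen lit, the tablet draws more than that on its own, so there's nothing left over to top up the battery.)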

Legally mandating a specific socket on all mobile devices themselves effectively limits their design, and I can't see that going down well with any manufacturer, let alone Apple. This is also a road the EU really doesn't want to go down given the rapid pace of change in IT. The iPhone itself is barely six years old; the iPad is less than 4. Who knows what's coming next? If we move into wearable technology, or flexible screens, do you really think manufacturers will want to be tied to a (relatively) chunky connector design?

* (Last time I checked, an official Sony PSU wasn't cheap either. Neither were Samsung's. But, as with Apple kit, there's no shortage of respectable third-party alternatives that cost a lot less.)

Sean Timarco Baggaley

Re: The world does not revolve around Apple

WebKit?

Darwin?

Microsoft now using next-gen Roslyn C#, Visual Basic compilers in house

Sean Timarco Baggaley

Re: About as stupid as ...

You do realise all those programming 'paradigms' that go in and out of fashion every ten minutes are just some random tosser's opinion, right? There's no law engraved in stone that requires all programming languages to support every damned fashion under the sun.

That's the mistake C++ made: it tries to be all things to all programmers, and fails quite spectacularly at the one thing required of all programming languages – being human-readable.

No CPU I've ever used gives a flying toss about templates, classes, lambdas, etc.; OOP, Functional Programming, and so on, are all just so much structural scaffolding that is frequently so poorly designed that it gets in the way of the code itself. Such scaffolding has no place in the programming language itself and should have been shifted into the IDE UIs, where it belongs, a long time ago.

SpaceX beats off Bezos' rocket for rights to historic NASA launch pad

Sean Timarco Baggaley

Re: My how far NASA has fallen :(

NASA's funding has never, ever been anything more than a rounding error compared to the staggering sums of cash hurled at the military / defence industries over the same period.

Politicians do so love a good, lucrative, distracting war. It's practically a tradition.

As for the BBC: there's a myth that only the UK has a TV licence. Not only is that untrue, but many EU countries have both a TV licence and adverts on their publicly funded TV stations. Oh, and those licences are often more expensive too.

Seriously, try Italian TV sometime. It makes the BBC look like HBO and Netflix combined! RAI used to be pretty good at home-grown content, but Berlusconi's machinations put paid to that: 99% of RAI's output today consists of cheap talk shows, archive clip shows, and the like. Dramas are almost entirely imported (and dubbed), while the very few (admittedly quite decent) home-grown dramas tend to be the usual detective / cop show variety based on existing novels. So hardly a creative stretch.

The French and Germans are a little better, but the BBC is, as far as I'm aware, the only European TV broadcaster to have no advertising at all, regardless of funding.

El Reg's contraptions confessional no.3: the Apple G4 Cube

Sean Timarco Baggaley

Re: Sigh

@Bad Beaver:

I can only assume you've not noticed the new Mac Pro.

It's the perfect solution to the "Apple should ALSO make a mini-tower!" demands over the years: you can add as much expansion as you damned well please, and you don't need to pay for a computer the size of a hotel mini-bar fitted with a power-hungry PSU and cooling system that both have to be designed for the maximum potential load, regardless of whether you ever intend to expand it at all.

Yes, it's pricey, but quality kit usually is, regardless of the label. Take a look at the pricing for Intel's Xeon CPUs and those two AMD graphics cards, not to mention the PCI-e flash storage. Good luck building an exactly equivalent PC, with the same expansion ports, for anywhere near the same price. And no, a SATA-6 SSD doesn't even get to see the mustard, let alone cut it.

Traditional tower cases have barely changed in nigh on 30 years, so this really does count as proper "innovation" too. It'll be interesting to see if it does well, or ends up as another G4 Cube.

You gotta fight for your copyright ... Beastie Boys sue toymaker over TV ad

Sean Timarco Baggaley

For those pointing out that the lyrics are different: song = lyrics + music. Without the music, all you have is poetry.

However, what GoldieBlox have done isn't "parody", because they haven't actually changed the intent of the original song. All they've done is change a more subtle lyric into a 9 lb lump hammer that bludgeons the 'message' into the listener with all the grace and elegance of being repeatedly struck in the face by a Steinway grand piano.

Had they stuck with the original lyrics, the ad would have actually worked better, with the visuals clearly contrasting with the words. But the Beastie Boys made it clear years ago that they did not wish to have their music used for such purposes. At the very least, GoldieBlox's decision to violate the last wishes of a dead man rather than simply ask another artist – it's not as if the Beastie Boys' song is the only one ever written on the subject – speaks volumes about the company's management.

(And I speak as someone who was never into the Beastie Boys, so I'm not exactly biased here.)

Inside Steve Ballmer’s fondleslab rear-guard action

Sean Timarco Baggaley

Re: No, Liam, I won't be using a fondleslab as my primary computer.

"Windows tablets [...] gave up on having a proper multitasking OS for that bullshit 8.11 for Fondlegroups "two things at a time, tops" crapfest."

You've clearly never actually used a Surface Pro then. See that tile that looks like desktop wallpaper? There's your WIMP GUI right there, same as it always was. Everything Windows 7 can do, Windows 8 can do too – in some instances, a bit faster. You can even get two types of cover with integrated keyboards. I'm writing this on a three-year-old MacBook Pro, but even I'm seriously tempted by a Windows 8 (not "RT", which is definitely too half-baked) machine. It's a solid OS, and it even has a nice, shiny app launcher. Granted, the latter doesn't appear to be to everyone's taste, but I found the old menu system a pain to use – mice are a major cause of RSI problems in a way trackpads aren't.

What we're seeing isn't the death of the WIMP metaphor, but its sidelining into niche markets as the vast majority of computers these days really aren't being used for much more than email, Facetwit and web browsing.

I'm a mild-mannered translator by day and have found my iPad invaluable for dictionaries and reference works. In my business, that's not "consuming", it's a real, actual, bona-fide work tool. When I upgraded my old Mk. 1 to a Mk. 3 iPad, it paid for itself in just under two weeks. I haven't bought a printed book or magazine since my first iPad, in 2010, and I can't say I miss them.

Liam's one and only mistake with his article was in not adding "Your mileage may vary" at the end.

That toolbar you downloaded is malware? Tough, read the EULA

Sean Timarco Baggaley

Re: I'm always surprised at the naivity of people

@Christian Berger:

"I mean seriously, what do you expect to happen if you download software the creator refuses to give you the source code? Why would anybody keep the source code from you other than wanting to defraud you?"

What earthly use is the bloody source code to someone who has no clue about programming?

People who can read source code and understand what it does – and who also happen to have lucked out in becoming experts[1] in the same programming language(s) as the original developer – are unlikely to be ignorant enough to install such malware in the first place!

However, NOBODY can be an expert in every field of human endeavour. IT is just one field among many. How would you like to be told that you got exactly what was coming to you every damned time your ignorance of a particular subject betrayed you? How would you like it if every time you failed to make a multinational corporation compliant with the likes of Sarbanes-Oxley and ISO 20001, you saw someone pointing and laughing at your ignorance and calling you a "n00b"?

So much for your accusation of naïveté: We are ALL ignorant. We're just ignorant about different things.

Most people don't want to build their cars from scratch, nor do they particularly care how they work. They'll happily buy a Ford Fiesta, or a BMW, or whatnot, and simply drive the thing. All cars share one common feature: their core user interface. Some details will change from car to car, but if you've learned to drive in a Vauxhall Astra, it's a fair bet you can work out how to drive a FIAT Punto or any other make and model of car built since the 1950s.

For every James May, who could cite chapter and verse from the relevant Haynes manual for each car, there are a hundred Jeremy Clarksons, who couldn't give a toss how the bloody machine actually works. Yet most developers still believe everyone who has any contact at all with a computer should be like James May.

The IT industry has moved on quite a bit since the 1960s and '70s.

Open Source has become an anachronism. It is very much part of the problem, not the solution. Forget GNU, Stallman and the FOSS movements: they're yesterday's causes. The problem today isn't source code, but interfaces[2].

Not just in the software, but across the entire chain – from box art to silicon chip, from API to documentation – it's all about interfaces, not code.

End users should not be required to read complicated EULAs to determine whether the code they've downloaded actually does what it says on the tin. Why shouldn't they be able to pay for virtual gatekeepers to screen such things on their behalf? This is exactly why companies like Apple and Amazon have opted to provide such "gated communities[3]" for their users.

Developers – and the IT community in general – really have only themselves to blame for this: you'll have massive flamewars over trivialities like tabs vs. spaces, while criticising the poor bloody users who have to put up with the ill-designed, barely usable, and barely-supported tripe you expect them to learn how to use. And then you think nothing of bundling in someone else's crap with your "free" software, because your definition of "free" isn't the same as the one in the dictionary.

The IT industry's problems aren't Apple's, Google's, Microsoft's or anyone else's fault but yours. You've had half a century of power, but you've chosen to ignore all the responsibilities that come with it. It's time that changed.

[1] There is a veritable Babel of programming languages out there, and merely reading some books and tinkering about with each of them does not make you an expert.

[2] This will come as a shock, but some of you clearly haven't understood what the "I" in "API" actually stands for. Or the purpose of good documentation. Similarly, a published data format is also an interface. Interfaces are everywhere.

[3] Google Play is the only "walled garden" out there. It has gardeners who react to problems after they've happened, not gatekeepers who stop the problems getting in in the first place.

PS-PHWOARRR: We review Sony’s next-gen PlayStation 4

Sean Timarco Baggaley

Re: I will get one because....

'But if I purchase a PS4 I can expect the wife to say "so what does it do different?"'

May I suggest asking your wife why she is fine with spending hundreds – if not thousands – of quid on shiny lumps of rock artfully nailed into equally shiny bits of metal, or ooh-ing and aah-ing over a pretty arrangement of vegetable genitalia... yet she believes YOU are the shallow one for wanting to spend a few hundred quid on a new home entertainment* centre that actually does something besides merely looking shiny.

* a concept that also includes games. There's no need to make a distinction.

Why Microsoft absolutely DOESN'T need its own Steve Jobs

Sean Timarco Baggaley

Re: A fine line between Vision and Arrogance

"Never quite understood whether the new user interfaces being foisted on us are because *we* (the punters) lack vision to understand it, or *they* are just being arrogant and treating us like cash cow cattle."

It's the former. Windows 8.x replaced the rather weird Start Menu with a proper app launcher, which also runs big widgets. All the keyboard shortcuts – which everyone calling themselves a seasoned veteran or professional should know – are unchanged. ALT+F4 will close a ModernUI app just as it closes a conventional Windows GUI app, for example.

There are textbooks on this, many of them written as far back as the late '60s and early '70s, when the R&D phase for the WIMP desktop metaphor we still see on desktop GUIs today was in its infancy.

That the above is clearly a surprise to many so-called "professionals" is shocking to me; it was very basic stuff when I was studying Computer Science in the 1980s.

Sean Timarco Baggaley

@Buck Futter, Tannin, et al:

Many people seem to have a problem with Windows 8.x, but I've actually found it's easier to get newbies into it than it was with previous versions. Rather than presenting you with a pretty picture and some cryptic icons, it actually starts with an application launcher that shows a bunch of very clear tiles, each of which tells you what it does and even gives you some basic information before you've even clicked on it.

As for myself: according to every WIMP GUI rulebook, the GUI is there for *newbies*. Nobody else. Intermediate and advanced users are supposed to learn the bloody keyboard shortcuts!

If, like me, you had done just that, Windows 8.x would pose no difficulties whatsoever. Want to close an application – or even bring up the shutdown dialog box? ALT+F4. Each new release has added new shortcuts, but many of the existing ones have been there since Windows for Workgroups!

The problem is that nobody's teaching this any more. When so-called "professionals" proclaim themselves grizzled veterans with umpteen years of expertise in a platform, yet admit to being bamboozled by changes to what is, when you get down to it, a glorified app launcher, you have to wonder what they're teaching kids at university these days.

Such people are, at best, amateurs, not professionals. Their blatant ignorance of basic GUI usage rules is proof enough of that. If you're still relying heavily on a mouse or trackpad to get your quotidian work done, and you're not an artist or architect, you're doing it wrong. By definition. There are actual textbooks explaining all this.

That tiled GUI really is piss-easy for neophytes to understand. It's easy to forget that we had to *learn* to navigate the (original hierarchical) menus and drill down to our application – never mind having to remember *which* application we needed to open! Now, my aunt need only look for the "Mail" tile, see that there's a message or three waiting for her, and click on it. It's all there right in front of her. And this is a Good Thing™ as it means she needs to rely rather less heavily on her failing memory.

iOS and Android – hardly surprising given the former's influence on the latter – led the way, but Windows' ModernUI picked up the widgets idea and ran with it, making it the central feature. Trevor Pott's point about separating this from the old Windows GUI is a valid one, though: Windows 8.x is very much a transitional release, and it's likely Windows 9.x will be too, given the glacial pace of upgrading in the corporate field.

iOS was the first mainstream GUI to break with the old WIMP formula, so the keyboard shortcuts point doesn't apply to that. (Or to Android.) Microsoft also needs to make that transition, but whether beating Windows into submission with the multi-touch GUI stick over a number of transitional releases is the best way to achieve that is a question only the market can answer. In fairness, Windows 8.x is a pretty good choice for people, like myself, who have to do a lot of typing. Some of the hybrid Windows 8 "tabtop" devices out there are a perfect fit for my needs.

Wacom's Companion Pro (essentially a Wacom digitiser and stylus nailed onto a tablet very similar to the Surface Pro 2) is looking very attractive to me right now. It's flashing its ports seductively at me as I type this. Cease, you Jezebel! You tablet of the night!

Nurse! The screens!

Sean Timarco Baggaley

I agree with part of the article: Steve Jobs was exactly what Apple needed in their hour of need, but that's largely due to his previous involvement with the company. The only equivalent for Microsoft would be the return of Bill Gates, who has already made it clear he's not interested in retreading old ground. (For all Jobs' later success, Gates didn't need to mess it up and spend years in the wilderness to learn the necessary skills. Gates nailed it first time around.)

However, I disagree with the tiresome repetition of a pointless meme: what Apple has is a *gated community*. It's not the walling-in that's the point here, but the *curation*. Android has barely any curation at all, hence its frequent security issues. iOS' App Store, on the other hand, *is* curated, which is less like a gardener wandering around a walled garden and occasionally reacting with an, "OI! Gerroff the lawn!", and more like the guards of a gated community who stop undesirables getting in in the first place. (No, they're not 100% successful, but they're close enough.)

*

What Microsoft needs is *focus*. It is making the same mistake Apple made in the mid-'90s: it's doing too much. They sell no fewer than three flavours of desktop Windows, each with multiple variants. They sell server variants too. They make a games console (also with its own OS), they make games, they sell a major office suite, own a bunch of cloud services, and they sell industrial-strength software development tools too. They even make keyboards and mice.

Jobs was right to slash Apple's massive, and very confusing, product range when he returned: focus is a common factor in very successful businesses. You can't do that when you have a portfolio even wider-ranging than Apple's under Gil Amelio's tenure.

Unlike Jobs' scorched Apple policy, Microsoft could slash its portfolio by simply spinning off the profitable units into separate entities. Microsoft needs to make itself agile enough to react more quickly and effectively to the ever-changing world of IT – an industry that is, almost by definition, in a state of perpetual transition.

THAT is the hard part: changing Microsoft's management and corporate structure entirely. Given the present corporate structure, it may be that what Microsoft really needs today isn't so much a Steve Jobs, as a Genghis Khan.

Doctor Who: From Edwardian grump to Malcolm Tucker and back again

Sean Timarco Baggaley

Re: The Goons, really?

Depends...

The original 23rd November 1963 broadcast was indeed followed immediately by an episode of the Telegoons ("The Canal" episode).

Evidence: http://www.bbc.co.uk/archive/doctorwho/6405.shtml

The first episode of Doctor Who was also repeated the following week, on the 30th of November, due to the assassination of JFK on the 23rd overshadowing the first broadcast. The next episode of the Telegoons was not broadcast until the 7th of December – skipping the 30th. I can't find any scans of the Radio Times for the 30th though, so I'm only speculating that it was likely a casualty of the decision to show the repeat.

Sean Timarco Baggaley

Re: 80's Doctors

This.

Tom Baker had a few great episodes, but his theatrical acting style hasn't aged well. And there were a hell of a lot of duds too, not to mention an over-reliance on 'homages' to old gothic horror movies.

In fairness to Michael Grade, I think Colin Baker's portrayal suffered mainly from coming after Peter Davison's. Colin gave it his all – a little too much so – but the problem is that he and Tom Baker both have a strong theatrical acting background and it *really* shows when watching their stories on DVD. They both come perilously close to channeling Brian Blessed at times.

Peter Davison was very much a TV actor first and foremost and understood the medium's strengths and weaknesses. He gave a much more subtle, nuanced, portrayal. Casting Colin felt like a step backwards. Despite some decent episodes – I'm very partial to "Vengeance on Varos" and "The Two Doctors"* – Colin's tiresome schtick of repeating the same word three times, each louder than before – "Hammy? Hammy?! HAMMY?!!!" – grated very quickly. Davison basically made the two Bakers' theatrical acting style obsolete.

McCoy was an inspired choice though, so it's a shame he was saddled with a bunch of scripts originally written for Colin Baker's portrayal for his first season. (And some very odd choices of scripts for the second.) His final season, on the other hand, holds up pretty well despite the series' shoestring budget and the BBC's utter indifference to the series itself.

All that said, I'm still amazed the series survived the casting of Bonnie Langford. Poor Colin Baker. He never had a chance.

* (Patrick Troughton and John Stratton were clearly having a whale of a time, and we also got Jacqueline "Servalan" Pearce thrown in as well.)

Apple MacBook 13in with Retina display

Sean Timarco Baggaley

There's a good reason why Apple kit tends not to have much upgradeability: eBay.

Apple hardware tends to hold its price very well – especially at the "pro" end. My 17" MBP (2010 model) has 8GB RAM and a 512 GB SSD. It screams. And it's still worth over £600 on eBay, despite being closer to its 4th birthday than its 3rd.

I've yet to see the HP, Dell or Lenovo laptop that can boast the same.

Many owners of Apple kit are well aware of this and tend to effectively trade-in their old model for a new one every couple of years. Why the hell would they waste their time going through all the bother of upgrading?

(Before you reply: being a reader of The Register does not define you as a professional computer user, any more than being able to strip down and rebuild a V8 engine makes you a "professional" car driver.)

MANUAL STIMULATION: Whack me with some proper documentation

Sean Timarco Baggaley

Re: As an ex- tech pubs* guy...

I used to write docs myself.

My niche was the game development tools industry: I wrote the user guide for Criterion's "Renderware 3" / "Renderware Graphics" (for which I can only apologise), and also the docs for the first few releases of Allegorithmic's "Substance" suite. I did some odds and ends for the Unity folks too some years ago. It's amazing how easy it is to find shockingly overpriced documentation tools that make even Emacs look like a simple, elegant editor by comparison.

"The engineers (as noted) hate documentation because they KNOW that their products are brilliantly intuitive and so NEED no documentation;"

This. Oh, so very this. (I lost count of the number of times I had to remind some of my colleagues what the "I" in "API" stood for. Developers can be end users too, so even an API should have basic UI design rules applied to it.)

Thing is, writing documentation really is a truly thankless task. Everybody hates what you do and considers your entire job pointless. End users have become so used to manuals being either missing, or abject shite, that they assume it's safe to expect this to be true of all applications. Not only do they never read your 500+ pages of bloody hard work, but they'll actually be surprised such a source of information even exists. And your own colleagues also see you as some kind of parasitic life form whose job appears to be asking them to explain what they consider blindingly obvious. There is nobody so blind as an expert.

Good technical authoring is bastard hard to do. Not only do you have to become an expert in the entire product you're documenting, you also need to be able to explain it to your readers without overwhelming them with information. Just enough background information – plus links to more in-depth info – and no more. And complex software can have features that rely heavily on other features too, so you need to organise the teaching to ensure you have full coverage.

Not everyone can do it effectively, despite many developers' belief to the contrary. Frankly, most developers I've met – including many with "Ph.D." after their names – seem to be either flat-out illiterate, or insist on making everything read like a particularly obtuse scientific paper. They sure as hell don't understand how to teach well.

Never mind that the greatest feature in the world is of no f*cking use to anybody if they can't work out how to use it. (Ironically, this is precisely how Apple have crawled out of near-bankruptcy to the top of the IT heap: they truly grok user education.)

I got the hell out when I realised that most of my potential clients had begun to see support as a chargeable feature, turning it into a revenue stream: a decent manual suddenly becomes a bad idea as it reduces support calls and, consequently, revenues from that particular stream. (Indeed, this appears to be how most GNU / FOSS applications are actually supported financially: write something that's genuinely useful, but give it a cryptic UI that effectively forces your customers to pay you for training and support and you can make a decent profit.)

I do translation now, which is a whole fresh hell of WTF in its own right, but at least people actually appreciate your work. They also don't tend to claim that "it's just writing! Anyone can write!"

RETRO-GASM: The Fuze electronics kit for the Raspberry Pi

Sean Timarco Baggaley

Re: Really really basic computers

(sigh)

Procedural languages, Functional languages, OOP, etc., are mostly just different kinds of organisational scaffolding. No mainstream CPU actually gives a toss about any of that stuff; it all ends up as machine code in the end. Often on an Intel or ARM CPU, neither of which care one whit about whether you like to organise the original code in your source files as subroutines or as objects. Same meat, different gravy.

None of that stuff matters.

What matters is understanding how computers "think", because programming is just a synonym for "translation" and is actually pretty easy to learn at that level. I was far more productive writing Z80 or MC680x0 assembly language than I ever was writing in C++. I used to be proud of writing bug-free code, and it really *was* bug-free. But those days are long gone. The hardware has become orders of magnitude more powerful and capable, but the tools we use to program it all have barely changed since the flint axes of the 1970s.

It's 2013 and we're *still* writing code using artificial languages that require us to walk on eggshells due to their cruel and unusual punctuation. My *phone* can render a Mandelbrot set in real-time and run full-on 3D First-Person Shooters at HD resolutions, and yet we insist on forcing *humans* to do stupid grunt-work like adding a semicolon at the end of a line to save the compiler from a picosecond of calculation? How the hell is this even acceptable? How is this not front-page news in The Register and all its rivals? THIS is the scandal of our time.

No wonder today's software comes wrapped in legalese instead of warranties.

Wozniak: Please, whatever you do, DON'T buy me an iPad Air

Sean Timarco Baggaley

Re: Storage indeed

"No SD card in the nexus, but at least you can use a USB stick, or 20, or a USB microSD card reader if you like..."

So, er, just like an iPad with the Camera Connection Kit then?

No, it doesn't have a MicroSD slot built in, but most users don't need it. Why compromise a product's design and usability* for the sake of a relatively tiny number of edge cases?

If you want a tablet with stacks of storage, buy a Windows 8 model instead. Some of those come with 256 GB (and even 512 GB) of storage.

* (removable storage is a royal pain in the arse from a UI perspective. There's a bloody good reason why Apple went with software-eject systems for their Mac floppy disk drives back in the day.)

Kids hooked up with free Office subs at Microsoft-addicted schools

Sean Timarco Baggaley

Re: Many tools for the job?

"and Microsoft isn't going to fund actual education."

Black & Decker aren't going to fund metalwork or woodworking classes either.

It's Microsoft's job to supply the tools. Anything else they do is icing on the cake. It's the school's job to provide the actual 'education' part. By offering industry-standard tools that students will actually encounter in the real world at a massive discount (and effectively for free for many students), Microsoft are reducing the total funds required for that education.

Sure, you could use Libre/OpenOffice, but they simply aren't that good. If they were, businesses would have standardised on them long ago. They are, after all, free. If you can't even gain market share by giving your product away for nothing, the problem isn't your competition. It's you.

Support counts for a lot. As does (relatively) easy customisation and extensibility, as well as a huge ecosystem of third-party applications that plug into the Microsoft Office suite. Companies like SDL, who create translation software, rely on MS Office being installed to handle the preview feature for Word-supported file formats, for example.

Libre/OpenOffice (as well as Apple's own "iWork" suite) originally competed with Microsoft Works, not Microsoft Office. Sadly, Microsoft axed Works a long time ago, but it seems the *Office communities haven't really understood what it is that makes businesses so willing to pay for Microsoft's licences regardless. It's not merely inertia.

Sean Timarco Baggaley

Re: If you get them young and you will have them for life

"should openly promote FLOSS because it's the right thing to do."

Wrong answer: Discrimination is wrong no matter which side you discriminate against. Either way, it's still discrimination. The correct answer is to demand that schools use both. It's not as if installing LibreOffice would add much, if anything, to the costs, and both suites can use the same file formats now. It'd also be a lot cheaper for students, although Microsoft's offering goes a long way to help there.

Software is a damned tool. I don't need to know how a hammer works if all I intend to do with it is hang some pictures on a wall. Open Source only has value if there is sufficient interest and competence in your target market to understand it and work with it. It's worth pointing out at this stage that, when the GNU movement first began, there were a damned sight fewer programming languages and philosophies around too. Today, there's a veritable Babel of such languages, with new ones seemingly being invented every other day, so the chances of your audience actually being sufficiently competent in the language(s) you're developing in are shrinking, not growing.

Open data formats are far more useful and valuable than mere access to source code. What matters is that my data remains accessible if development on the tool that created it should end. Given that both MS Office and the Open/LibreOffice suites can read ODF files now, this is no longer an issue; all schools should have to do is teach the value of open data formats.

MS Word deserves DEATH says Brit SciFi author Charles Stross

Sean Timarco Baggaley

Re: even my kids with learning disabilities can manage that

It helps if you check the anchor and text-wrap settings for tables and pictures. Once I understood what each of those settings actually did, I had no further issues.

That said, Mr. Stross needs to learn that he should use the right tool for the job. MS Word isn't designed for writing novels. It's a *corporate document* creation tool. (Which I use frequently and have never had any problems with. I translate for a living, so MS Word is unavoidable: unlike most professions, translation has no concept of a "standard" document format, so I have to be able to deal with DOC, PDF, XLS, CSV, AI, InDesign files, PO files and more.)

For book design and development, I use tools like Ulysses III or Scrivener. Both are highly recommended and a far better fit for writers and novelists than MS Word. The only reason for using MS Word as an author is either wilful* ignorance or masochism. There's no need for it. I know professional novelists who swear by Scrivener, for example.

I believe Scrivener has a Windows version now. I'm not sure about Ulysses III; I suspect not. Then again, even Screenwriter 6 has a "Novel" template now, though it's mostly aimed (not surprisingly) at screenwriters.

* (Google exists, Mr. Stross. A novelist whining about how hard it is to write with MS Word is like an architect whining about how hard it is to design buildings in Logic Pro X. All you do is make yourself look like an ignorant twit.)

Video thrilled the radio star: Tracking the history of magnetic tape

Sean Timarco Baggaley

Re: mellotron?

To be fair, such a digression could easily fill another 3-4 articles on its own.

Magnetic recording triggered the birth of the technique many (incorrectly) refer to as "sampling" today. (The term is "sample loops" or just "loops" – "sampling" refers to the process of recording the actual sounds digitally.) The looping of those sampled sounds underlies almost all music today. It's most obvious in the work of Norman Cook, but even 'live' acts use it as a matter of routine now in their studio recordings.

This technique was first used in the 1940s and led directly to the "Musique Concrète" movement that was so influential on the likes of the BBC Radiophonic Workshop. Most of their output during the 1950s and '60s was Musique Concrète, albeit often with very early synthesised sounds added.

The term "loop" came from these pioneers: they had to do it the hard way, by recording sounds onto tape, re-recording them as many times as required onto another tape – at different speeds, in order to get the necessary notes – then chopping the resulting tape up into individual notes. (This was all worked out mathematically, hence the maths or engineering backgrounds of many of those involved.)

These individual snippets of sound – bass lines, rhythm / percussion cycles, even short sections of melody – were then joined together to form the necessary tape loops needed to produce the final piece. For example, the percussion sequence might be quite short and result in a fairly short loop of tape that could be repeated over the full track. (Most music is filled with repeated patterns and motifs. Listen to any dance track and it's pretty obvious, but you'll hear it even in Beethoven and Bach.)

A complex piece might require multiple, very long, loops of tape, all played in synchrony, with some loops so long that the team would have to jerry-rig a system of reels and pulleys to run the tape out of the machine, out of the door, down a corridor, and all the way back again.
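(Again, illustrative arithmetic of my own, assuming the common 15 inches-per-second studio tape speed: a two-second percussion pattern needs only

$$ 2\ \text{s} \times 15\ \text{in/s} = 30\ \text{inches} $$

of tape, but a sixty-second loop needs $60 \times 15 = 900$ inches – a shade under 23 metres of tape snaking out of the door.)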

Multiple tape machines were used for this. Even "bouncing" tracks is a term descended from this era.

Digital audio workstations ("DAWs") hide all the fiddly mathematics and tape editing today, but the underlying concepts and principles haven't changed.

Most of this can be gleaned from the BBC's own (belated) celebration of the defunct BBC Radiophonic Workshop, "The Alchemists of Sound". It's a shame the BBC never expanded it into a series covering the wider context of electronic music in general.

Cook invokes GHOST of STEVE JOBS in Apple-wide memo

Sean Timarco Baggaley

Re: @Sean Timmarco Baggaley - "Yet he, too, chose to stay and work with Jobs."

Seriously? Tim Cook – whose core skill set is in optimising supply chains – has "non-transferrable skills"? Jony Ive had his own UK-based design company ("Tangerine") before he moved to California to work for Apple. With his reputation, I doubt he'd struggle to find clients if he decided to strike out on his own again. So why didn't he do that?

Quite a few people did leave Apple. Some of them came back. How can that be if Jobs was such a git?

Jobs' own friends have described him as "mercurial", but that's hardly a unique trait in a CEO. You have to be pretty ruthless to turn a company from a near-bankrupt basket case into the most successful business on the planet in just ten years.

Most of the people who give the keynotes at Apple launches today could have left at any time – the likes of Ive and Federighi could easily write their own tickets given their track records – yet they haven't done so. They have chosen not to do so. Jony Ive wasn't even a Jobs hire: he was already at Apple long before Jobs returned and had never worked as an employee of either NeXT or Pixar. So you can't claim cronyism either.

So, again: if Jobs was such a nasty piece of work, why the hell did so many people who could have easily walked right into a new job (or set up on their own) choose to stay and work with him?

Answer: he wasn't as big an "asshole" as he's made out to be – usually by people who never actually met him, or spent any substantial time with him. Jobs was certainly a control freak and a perfectionist, so it's not difficult to see why he hated doing anything that he could not have any control over. He practiced some of those keynotes for weeks.

Reporters and journalists tend to be interested in people, but Jobs didn't talk about himself much. He was no Richard Branson, whose entire career has mostly involved blowing his own trumpet and pimping his "Virgin" brand. The latter gave good interviews, but Branson is unlikely to go down in history as anything other than a famous, self-publicising beard who got very lucky and milked it for all it was worth.

Sean Timarco Baggaley

Re: Spit!

You might want to read the Techies with Asperger's article elsewhere on the site before making such comments. There's strong evidence that Jobs had mental health issues and I suspect that those who hated him were mostly people who couldn't make allowances for that. Even as a child, he tended to think digitally: everything was either "awesome" or "shit". There was never any middle ground. So those traits were there right from the start.

Asperger's (or some other part of the Autism spectrum) would certainly explain a lot, as would OCD.

That said...

Apple has been the biggest contributor to the "Product Red" charity for some time now, so the notion that Apple never donated any money at all under Steve's watch is patent nonsense. Jobs' estate has also pointed out that they preferred to donate anonymously.

Jonathan Ive, Tim Cook, and many, many others at Apple have had plenty of opportunities to leave the company and either go work for others, or set out on their own. (Some have, in fact, done precisely that, as a look at Woz's full CV will attest.) Furthermore, some Apple people left, then willingly chose to return.

Let me repeat that, in case you missed it: these people chose to work at Apple with Steve Jobs. Jobs was in charge of Apple for 13 years or so. Jony Ive was hired in the early '90s – long after Jobs' initial departure and some years before his return. Yet he, too, chose to stay and work with Jobs.

Thirteen years is plenty of time for all those people to update their CVs and send them out. So what stopped them? Did Jobs hire heavies to point guns at their heads? Did he kidnap their families? If Jobs was truly the complete arsehole he's often made out to be by his detractors, why didn't all those people who worked for him leave?

The man clearly inspired a surprising amount of loyalty from his friends, so he must have had something going for him. He was even married to the same woman for 20 years, with whom he'd had three children, when he died. If he was that hard a man to work with, how did that happen? When they married in 1991, Pixar's first success was still four years away and NeXT was hardly a money-spinner either, so Laurene Powell certainly didn't marry him for his money.

Sean Timarco Baggaley

Jobs was the one who suggested creating a business out of making and selling computers. If Woz had had his way, Apple would never have happened and it's doubtful if Woz would be as well-known today.

Jobs was the driving force behind Apple from its inception. Woz clearly wanted so little to do with it that he effectively left the company as a full-time employee in 1987. (Apple still pays him a $120K / year salary and Woz also has stocks in the company.)

The philanthropy issue is interesting: Jobs' widow has stated that Jobs did donate to charity, but preferred to do so anonymously.

And consider how many people have jobs because of Apple: not just Apple's own employees, but the thousands of developers, support businesses and all those involved in the production and distribution chains. In today's economy, having a steady income is very much a Good Thing. This is as much a benefit of Apple as any amount of pissing money into the black holes of the WWF, Oxfam and their ilk.

It's also worth mentioning that donating to charity is often considered an excellent way to reduce your tax burden. This is one of the most popular reasons for a business to donate money. Altruism often has bugger all to do with it.

Techies with Asperger's? Yes, we are a little different...

Sean Timarco Baggaley

Re: "Sorry" is the hardest word.

No, this is what happens when a publication has more than one writer working for it. Different writers = different perspectives.

Also, some forms of what was once known as Asperger's Syndrome are indeed linked to high levels of paranoia and destructive behaviour. The term "ASD" was created for good reasons, and this is one of them. A person closer to the High Functioning end of the spectrum will be able to cope with "normals" with a bit of help. As you get closer to the other end, you get people who simply cannot function at all in ordinary society. Such people are often completely isolated from the people around them and, yes, paranoia and destructive behaviours are not uncommon.

And this is a "spectrum" – think of it as a line, with one end labelled "normal" and the other end labelled "completely hatstand". People with Asperger's Syndrome tend to be closer to the "normal" end, but the boundary between "Asperger's" and full-cream "Autism" was never satisfactorily and unanimously defined. Hence the replacement of the multiple variants with just one: Autism Spectrum Disorder.

Describing McKinnon as "paranoid, unreliable and destructive" would be quite valid if his diagnosis puts him closer to the full "hatstand" end of that X-axis. The worse your Autism, the harder it is to cope with other people in any way at all – even your own parents.

An appearance of paranoia is not uncommon as you move away from the 'high functioning' end of the spectrum, but you need to understand that we typically apply such descriptions based purely on outward behaviour; it can be difficult to tell if what we call "paranoia" is simply a manifestation of a more general, deep-rooted, terror of that constant torrential flood of data an autistic person is faced with during their every waking hour. Similarly, what an observer would consider "destructive behaviour" might have a perfectly rational explanation for the sufferer.

If any readers here have never read Oliver Sacks' seminal "The Man Who Mistook His Wife for a Hat" and related books, I strongly recommend you do so.

Sean Timarco Baggaley

@Peter2:

The "emotionless = violent psychopathic killer" connection is made by people (and TV and movie writers) who appear incapable of understanding that even anger is itself an emotion.

Why would someone who doesn't react much to any emotion decide to suddenly make an exception and explode with rage and fury? It's not that emotions aren't there, it's just that they're felt nowhere near as intensely, be it joy or sadness, love or hate.

Such people tend to do well in jobs where what they have to do would turn any other person into a gibbering emotional wreck.

How you express your emotions is a far better indicator of how violent you're likely to be. If you're someone who feels every emotion intensely, you will feel both joy and sadness, love and hate intensely. If you're also prone to bottling up your emotions and venting only when that metaphorical bottle is full to bursting, you're likely to be far more dangerous than anyone on the Autism Spectrum.

I suspect that such emotional issues are more closely linked to current theories on depression and related disorders rather than the data-processing / management problems characteristic to autism. (Indeed, there's no reason to assume someone cannot have both an ASD and, say, clinical depression.)

The mammalian brain is a complex machine – far more so than any computer. Yet we've had the latter around for about 60 years now and, despite a CPU being essentially a collection of transistors, the behaviour of each of which should be entirely predictable, we still struggle to write bug-free software to this day. It's not the hardware that's difficult, but the code that runs on it.

The theories and hypotheses we have on the brain itself are pretty much at a similar level: we have a pretty good idea what individual synapses and neurones are, but have barely scratched the surface of what it is they actually do all day.

Sean Timarco Baggaley

Re: Aspies are 13 to the dozen.

Actually, as another poster mentioned earlier, the term "Asperger's" is no longer used.

Spectra and continua are increasingly being used in diagnoses, instead of the older, 'digital' approach that required ticking a very specific list of boxes to obtain a diagnosis.

The human brain is such a complex organ* that the very idea that "normal" people even exist is just bizarre. It's far more likely that people with fully-functioning brains are actually quite rare, while the rest of us range from the extremely knackered to the only slightly buggered. And that buggeredness could be anything, from a poor sense of direction, through to something like colour-blindness or a potentially terminal inability to read user guides (or see the bloody great "Help" menu right up there at the top of the window. Look! See? That! What do you think it's for?)

Asperger's is not – and never has been – a license to be an asshole, not least because "asshole" is a very subjective description and depends on your point of view. Most people seem to believe the late Steve Jobs was a right tosser, but Jony Ive, Tim Cook and their colleagues don't appear to have thought so.

However, "high functioning" sufferers can also learn to be sociable. Many things most "normal" people seem to be able to do subconsciously, such as reading body language in real time, is something we "data-processing impaired" have to concentrate on consciously – often to the point where we end up with splitting headaches from trying to process all the data.

I'm one of those who also struggle with noisy environments. I've found lip-reading helps, and also goes a long way towards countering the eye-contact avoidance problem too, but it often makes moi brain 'urt!

Which is why I hate parties.

* You, sir, have a filthy, filthy mind.

Android adware that MUST NOT BE NAMED threatens MILLIONS

Sean Timarco Baggaley

Re: What hasn't been mentioned....

'That would be an absolute nightmare for app developers.'

Tough. It's not the user's job to make life easier for the developer. It's the developer's job to make life easier for the user.

Android's APIs clearly need a serious rethink if this is such a chore for developers to deal with. iOS app developers have to deal with this kind of thing too and most do so without kicking up a big fuss. (It helps that the relevant iOS APIs are pretty easy to use. Perhaps Google should be aware that the "I" in "API" stands for "Interface" – i.e. developers need good UIs too!)

'How do you deal with an angry user who's blocked a fundamentally required permission for your app and then starts reviewing it poorly because "it doesn't work"?'

Oh, I don't know... how about being better at app design and development, catching the errors caused by disabled permissions, and failing gracefully with suitably clear messages and notices to the user explaining why a feature isn't working?
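To illustrate what "failing gracefully" might look like in practice – a minimal sketch using Android's (later) runtime-permission API; the "nearby results" feature name and the user-facing wording are made up for the example:

```kotlin
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import android.widget.Toast
import androidx.core.content.ContextCompat

// Sketch only: check whether a permission is still granted and degrade
// gracefully instead of crashing. "Nearby results" is a hypothetical feature.
fun refreshNearbyResults(activity: Activity) {
    val locationGranted = ContextCompat.checkSelfPermission(
        activity,
        Manifest.permission.ACCESS_FINE_LOCATION
    ) == PackageManager.PERMISSION_GRANTED

    if (locationGranted) {
        // Normal path: run the feature as advertised.
        // e.g. queryNearbyPlaces(activity)  // hypothetical call
    } else {
        // Graceful path: tell the user *why* the feature is off, rather than
        // failing silently and earning a one-star "it doesn't work" review.
        Toast.makeText(
            activity,
            "Nearby results are off because location access is blocked. " +
                "You can re-enable it in your device's app settings.",
            Toast.LENGTH_LONG
        ).show()
    }
}
```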

Apple's new iPhones dope-slap Samsung in US

Sean Timarco Baggaley

Re: 5S and 5C impressive for different reasons

@MrXavia:

Seriously?

It's a different kind of sensor to the crap sensors used in the past! It's been explained in any number of articles that this is a capacitive sensor with the equivalent of a 500 dpi resolution that looks beneath the surface of the skin. It's not an old optical surface scanner as used on those old, less reliable, sensors.

Your assertion is like claiming that the iPhone 4's "retina" display was just a gimmick because "every phone has a display".

The iPhone 5s isn't the first phone to include a fingerprint sensor, no, but it is the first to include one that is (a) not shit, and (b) actually integrated right into the OS at a very deep level, making it far more useful.

Apple are all about creating a seamless user experience. They only include a technology if it will actually enable them to improve that. Any new technology they include in their products therefore has to be integrated at a very deep level in order to provide that appearance of seamlessness.

(And, before the usual bunch of ignorant haters jump in: no, they're not perfect. Last time I checked, neither were any of Apple's competitors.)

Nothing I've written above should be even remotely surprising to regular readers of El Reg as all the evidence has been right there, out in the open: Neither Jobs nor Apple have ever hidden this information from the public. They've even produced TV spots explaining all this. And not just that one either. Many of their ads are there to educate people about Apple's philosophy of making design and the user experience – not merely shovelling raw, unprocessed technology into a box for its own sake – their top priority.

They've been doing this for at least fifteen years now.

What more do they have to do? How long will it take for some of you to grok that Apple are not Microsoft or LG and are successful precisely because of how they differ in their approaches to product design and production?

Ubuntu 13.10: Meet the Linux distro with a bizarre Britney Spears fixation

Sean Timarco Baggaley

Re: Please please PLEASE ...

He's well aware that PCs and tablets are two different things.

Which form-factor is selling like hot cakes?

Which form-factor is seeing sales drop off a cliff?

75% of all PCs sold three years ago were laptops. That percentage is rather higher now. Laptops typically come with trackpads, not mice. Even Windows laptops have had trackpads that support multi-touch gestures for some time now.

The traditional "separates" PC design is already a niche market. That form-factor is limited primarily to high-end workstations.

Dell sold about 9 million PCs last year. Asus sold about 8.6 million. Acer sold a little over 6 million. In a year.

Apple sold over 9 million iPhone 5s and 5c models over a single weekend.

Canonical are well aware of the future of IT and consumer electronics. At least they're doing something. They might not get it right first time – Microsoft have a tendency to iterate too; the first reasonably popular version of Windows was v3.1 – but it's the attempt that matters.

Both the GNU / FOSS communities and Microsoft have proved that you can cock up spectacularly regardless of your corporate / community culture and politics. How long have we been waiting for the "Year of Linux on the desktop" again? And yet, there's Apple with a full-fat *BSD UNIX OS that's been made so user-friendly, most of their customers aren't even aware of its UNIX heritage. And they pulled that feat off not once, but twice, with OS X and, later, with a completely redesigned GUI for iOS.

Without vision, effective guidance and focus, you're screwed. Whatever your personal views on Mr. Shuttleworth and his vision, at least he has one. The GNU / FOSS movements do not. All they have is a tired, anachronistic political dogma of no relevance to anyone who doesn't actually program computers for a living. (And even then, it's only a tiny subset.)

US House Republicans: 'End net neutrality or no debt ceiling deal' – report

Sean Timarco Baggaley

Re: End the Drug War or No More Debt

"All one can say for sure is that hundreds of millions of dollars have been spent on legalizing marijuana - who has that kind of money?"

Tens of millions of people?

There's nothing writ in stone that says funding has to be provided in one gigantic lump sum from a single donor.

USB 3.1 demo shows new spec well on its way towards 1.2GB/sec goal

Sean Timarco Baggaley

Thunderbolt isn't competing with USB.

The two are apples and oranges: Thunderbolt 2 (which is the version in the announced, but not yet launched, new Mac Pro) is already at 20 Gbits/sec and is part video port, part data port.
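(Back-of-the-envelope, using nothing but the headline figures and ignoring protocol overhead:

$$ 20\ \text{Gbit/s} \div 8\ \text{bits/byte} = 2.5\ \text{GB/s} $$

i.e. roughly double the 1.2 GB/s USB 3.1 goal in the headline.)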

However, the crucial difference is in the way they connect with devices at each end: Thunderbolt is, to all intents and purposes, a PCI Express bus on a rope. That's why each port requires its own PCIe lanes. It interfaces with the computer (or peripheral) at a lower level than USB.

Thunderbolt, like FireWire, also does all the heavy lifting itself; your computer's CPU is therefore not bothered with running the protocol at all. USB, on the other hand, requires the CPU to do some of the work. For consumers, the difference is academic, but for many professionals, anything that takes valuable CPU capacity away from their own projects is a big no-no. Why spend eye-watering sums of cash on a high-end Xeon CPU, then waste even a fraction of that power on USB processing overheads?

With USB, the CPU not only has to handle video data compression for my video editor application, it also has to set aside some of its time to handle USB's share of the work during file transfers. That lengthens the time required for the video work – which is the work that actually *pays my bills*.

With FireWire or Thunderbolt, the CPU is *entirely* free to work on *my* stuff, not the connector's. My computer is therefore helping to make me more money per second. If I have to pay a one-off €29.99 for a cable to ensure that, I won't even blink at the cost, as I know I'll make it back within a day or two.

So, no, Thunderbolt won't be a big consumer connection standard. Neither was FireWire. But the latter had its place in the high-end prosumer and professional markets, and so will the former.

Last time I checked, Apple were rather fond of targeting both those markets. That's where the profits are.

Office 365 goes to work on an Android

Sean Timarco Baggaley

Re: Keyboards for the win, till speech really is recognised

Matias do a lovely version of their Pro series which explicitly supports connecting to mobile devices via Bluetooth. Might be worth a look.

(I'm quite fond of my Unicomp IBM-based keyboard, but I have an older Matias Tactile Pro sitting next to it for when I fancy a change of feel.)

Ex-CEO Elop's plunder to total $25m in voyage from Nokia to Microsoft

Sean Timarco Baggaley

Did anyone *force* Nokia to bring in Elop?

No?

Then Nokia's woes are entirely their own damned fault.

They had Symbian... and let it wither and die through under-investment.

Their in-house Series 40 OS has seen precious little love over the years too.

They mucked about with various Linux derivatives and... f*cked those up too.

By the time Android had reached the point where it could no longer be accused of being a flagrant iOS rip-off, it was too late for Nokia.

The writing was on the wall when Apple launched their first iPhone and subsequently proceeded to eat Nokia's high-end lunch. Nokia *should* have seen multi-touch devices coming. They didn't. Neither did most of the other incumbents of the day: Whither Sony Ericsson? Where is LG? Even Motorola is now just a department of Google, Inc. Nokia's management became complacent and, ultimately, incompetent. Today, all they have left is their mobile networks business; their mobile arm was effectively junk long before Elop knocked on their door, so offloading it now is more a case of "What the hell took you so long?" than "Ooh! Elop was a Microsoft mole!"

This is what competition *means* in a (mostly) Capitalist society: Company A seizes the castle and gets to play king for a while, which makes it the target for all its rivals. Company A then only has to make one serious blunder before one of those rivals comes along, spanks its corporate arse, and takes over the castle. It then becomes that new company's turn to rule for a bit.

Rinse and repeat. This corporate success / failure churn is cyclic and fairly predictable.

It's also why the most successful CEOs don't tend to be nice, friendly little doormats. You don't get to demand such telephone-number deals and salaries unless you have a ruthless streak wider than an airport runway.

Rotten Apple iOS 7 fury: Glitchy audio or is today's music really that bad?

Sean Timarco Baggaley

Dear El Reg...

... still no "Edit" button? In 2013? Seriously?

I see you also still haven't fixed the bug that deletes empty lines between paragraphs when copying to the clipboard in Safari on OS X. This only seems to affect comment posts; articles copy just fine.

Considering the industry you're supposed to be reporting on, this is a very poor show.

Sean Timarco Baggaley

Must be a slow news day.

Apple don't have anywhere near the same number of data centres as Google does, nor do they own their own undersea / cross-border cables – not that I'm aware of, anyway. Google own some of those too.

Sadly, Apple and Google had a bit of a falling-out a few years ago and they haven't been on speaking terms of late. I understand El Reg is similarly in Apple's bad books. The upshot is that Apple are almost entirely at the mercy of the middlemen who connect you with their servers: your ISP, its peering arrangements, interconnects like Telehouse, and so on, all along the line. It only takes one bottleneck to slow things down for everybody else.

And yet, every single bloody time a major online roll-out like this happens, we *always* get the same tiresome filler pieces bemoaning the fact that—shock!—the internet is not, in fact, an exception to the laws of physics.

(Besides: what happened to waiting a day or so to see what issues all those early adopters have run into first?)

Sean Timarco Baggaley

Re: Well at least the phone update went OK

Wait for your drum machine app to be updated.

Many developers prefer to wait until the GM (i.e. final) developer seed of any new OS before working on major updates to take advantage of changes. iOS 7 introduces a number of new features to the core libraries, so it's always better to wait for these to bed in.

This has always been the case with previous iOS releases, so it shouldn't really come as a great shock that many apps haven't been updated for iOS 7 just yet. Expect a rapid series of point updates – it's got a *lot* of new code under the hood; the cosmetic changes are just the tip of the iceberg – as well as updated apps over the next couple of months. My money's on iOS 7.1 appearing around the same time as the new iPads and OS X 10.9 are released.

(Note that many apps are still following the old iOS 6-and-earlier UI guidelines, so it'll take a while before the app launcher starts to look a bit less like an explosion in a Technicolor® Yawn factory.)

iOS developers have had *months* to check out the beta versions, new APIs and developer docs, so it shouldn't take long for your favourite app to get an update. If they seem to be dragging their feet, switch to an app written by a team that actually gives a damn about their customers.

Peak Apple: Has ANYONE at all ordered a new iPhone 5c?

Sean Timarco Baggaley

Re: About as un-Jobsian as you can get

"I can say with absolute certainty that Steve Jobs would never in a million years have sanctioned the release of the aesthetic disaster that is the iPhone 5C."

The 5c is a coloured iPhone. You know: like the coloured *iPods* Apple have been selling for years now. Did you not notice those?

Have you also forgotten the multicoloured *plastic* iMacs that helped bring Apple back from the brink, nearly 15 years ago?

All of those were released with Steve Jobs' explicit approval while he was still alive.

Sean Timarco Baggaley

Re: Tim Cook needs to go. He is destroying Apple.

Because, of course, the iPhone 5s' sensor is *exactly* the bloody same as those useless little finger-swipe models that never actually worked reliably.

Oh, wait, it isn't, and you're talking bollocks. Again.

Incidentally, the *only* people expecting Apple to suddenly change the habit of a lifetime and cater to the low- and mid-range markets with a "cheap" iPhone were the same pundits who continually insist that they know _exactly_ what the late Steve Jobs would have done, despite never having actually met him. Apparently, all the people who actually *worked* with Jobs for many years don't have a clue what he wanted.

Come to think of it, *any* pundit who was actually any bloody good would be more than wealthy enough not to have to write link-bait bollocks for third-rate websites for a living. So those who do can be safely ignored.

ZTE Open: This dirt-cheap smartphone is a swing and a miss

Sean Timarco Baggaley

Re: The comparison will be made

Sorry, but I disagree.

I recently helped a relative replace her ageing Nokia. She was on an extremely tight budget, so we originally planned to replace it with, possibly, a feature-phone at best. Instead, we ended up with the snappily-named Samsung Galaxy Star s5280. Cost? €80 in-store, SIM-free.

(More info here: http://www.samsung.com/it/consumer/mobile-devices/smartphones/smartphones/GT-S5280RWAITV – [NOTE: Italian site, but the specs should be understandable] ).

That's an Android "Jellybean" 4.1.2 smartphone for about the same price as the ZTE Open.

(Eagle-eyed readers will have noticed the lack of 4G and even 3G support, but these are useless outside of Italian urban areas. Out here in the Italian countryside, you're lucky to get even basic 2G signals unless you live right inside a town or village. On the other hand, home broadband with WiFi is easy to find. The next Android phone up was over the €100 spending limit.)

The Galaxy Star is very much a low-end Android smartphone, but despite its low specs, it still points and laughs at the ZTE Open, while kicking sand in its face. This despite being, by Android phone standards, a weedy little thing with pipe-cleaner arms, bottle-lensed glasses and an allergy to sports.

Relying on what are, fundamentally, just grids of website bookmarks for your apps is a bloody stupid idea, not just in the West, but *especially* in developing nations that this phone is apparently supposed to be aimed at. Many potential customers barely have *clean running water*, let alone access to the mobile internet infrastructure needed to use such a phone. How are they supposed to run those apps?

(Also, if the web is so full of open standards, why aren't there more apps for existing mobile platforms already? Last time I checked, even an iPhone 1 or that Samsung Galaxy Star could run such apps just as easily as this Firefox-based device. Yet nobody seems to be jumping onto that bandwagon with any alacrity.)

Apple to uncloak new iPads, iMacs at October 15 event?

Sean Timarco Baggaley

Re: @Steven Raith 2013-09-16 22:03 Can't Cook.. Stupid Cook..

This.

Programmers typically write to an *API*, not to the bare metal of each machine. Switching to another architecture could easily be as simple as a recompile for most apps, while games and other hardware-pushing applications might need some additional tweaks to take account of differences in OpenGL features. The PowerPC-to-Intel switch was pretty damned painless and was achieved in less than a year. For most developers, it really was as simple as selecting "Intel" from a drop-down "Build target" menu.

The iPhone 5s uses a very different ARM core to the 5c and earlier iPhones: as one of the developers they wheeled onto the stage said, it took a mere two hours to support the 64-bit ARM processor, and it's probably fair to assume it was mostly a recompile, with some minor tweaks to low-level code to help it take advantage of the wider registers and data bus.
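
To illustrate what "writing to an API, not the bare metal" means in practice – and this is purely a generic sketch of my own, not anything from Apple's toolchain or from the developer who was on stage – the snippet below compiles unchanged whether the build target is 32-bit ARM, 64-bit ARM or x86-64; the only differences are properties the compiler already knows about, such as pointer width:

```
#include <stdio.h>
#include <stdint.h>

/* Plain, portable C written against the standard library API.
 * Nothing here cares which CPU it will eventually run on. */
static uint32_t checksum(const uint8_t *data, size_t len)
{
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += data[i];
    return sum;
}

int main(void)
{
    const uint8_t sample[] = { 1, 2, 3, 4, 5 };

    printf("checksum = %u\n", checksum(sample, sizeof sample));

    /* The compiler fills in the architecture-specific details:
     * pointers are 32 bits on armv7, 64 bits on arm64 or x86-64. */
    printf("pointer width on this target: %zu bits\n",
           sizeof(void *) * 8);

#if defined(__LP64__)
    puts("built for a 64-bit target");   /* e.g. arm64, x86_64 */
#else
    puts("built for a 32-bit target");   /* e.g. armv7, i386 */
#endif
    return 0;
}
```

Only code that bakes in assumptions about those widths, or that drops down to hand-written assembly or architecture-specific intrinsics, needs any attention when the target changes – which is why a port measured in hours rather than months is entirely plausible.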

Even a version of OS X with iOS app support isn't that difficult to imagine: iOS apps could simply open up into a 'Space' in landscape mode. I suspect it would also make sense for Apple to bump up their "TouchBook Air" displays to 'retina' resolutions too, as the 11" MacBook Air's display might prove problematic. iOS developers will simply have one more aspect ratio to contend with (16:10, as opposed to the iPad's native 4:3), but this is hardly a showstopper.

iPhone 5S: Fanbois, your prints are safe from the NSA, claim infosec bods

Sean Timarco Baggaley

Re: Believe Apple? Erm no.

Okay, answer this:

If the NSA already have a backdoor into my phone, why the hell would they even *need* the fingerprint hash data? A fingerprint scanner is a means to an end. As far as the NSA are concerned, they already have the master keys to every US-made / designed phone, so they don't need *our* keys at all!

Either way, there's no reason for Apple to lie about that fingerprint scanner and how it works. The fingerprint hash is of no interest to the NSA: they're _already_ in. Their interest is primarily in your communications, not your biometrics. If they genuinely think you're a threat to national security, they'll send the boys round to take your prints in person, whether you want them to or not.

The NSA's activities are an entirely predictable symptom of declaring war on an *emotion*. "Terrorist"-type attacks in most countries tend to be carried out from within: the Oklahoma City bombing, the 11 September 2001 attacks and the London bombings of 2005 were all carried out by people already inside the target nation's borders. Given this, it's hardly a big shock that the NSA (and their peers in other countries) were spying on their own citizens as part of their assigned duties.

For PITY'S SAKE, DON'T BUY an iPHONE 5S, begs FSF

Sean Timarco Baggaley

Re: Refreshing truth.

Strange. I can drop any ePub or PDF file I want into iBooks. Anything that's not directly compatible, I can convert with Calibre. I can also drop many standard music file formats into iTunes – even MP3. It'll offer to convert some formats, while others can be converted by other (free) tools. Same goes for video, which I tend to stream off a QNAP NAS: AVI, MKV, FLV, MPEG-2, MP4 – you name it. (You are aware you can even get VLC for iOS, right?)

The *only* restriction Apple's iDevices have is that there is only the one App Store, and it's the one Apple created. Yes, it's curated, but so is every shop in the high street: nobody can walk into a John Lewis department store and demand that they sell their products without the permission of the Head Buyer. Curation is *normal*. It is not some form of control-freakery.

A "store" that lets anyone come in and set up their own stall is called a "bazaar". Perhaps you're unaware of this, but in the countries I've lived in, bazaars are surprisingly rare. It turns out most people like to know they can bring an item to a store and not find the item's seller has done a runner!

*

I'm not sure how a focus on good design and usability makes Jobs a "malign influence". Most successful CEOs tend to be abrasive and even a little OCD and Jobs was no exception. Neither, it seems, is Richard Stallman, so he certainly doesn't get to criticise Jobs on that front.

However, it is clear that Stallman is unaware that "freedom" is a two-way street:

Jony Ive and Apple have just as much right and *freedom* to follow their own design philosophy – which they've never made any attempt to hide: http://www.youtube.com/watch?v=VpZmIiIXuZ0 – as Stallman does. Apple aren't forcing you to buy their stuff. You have the freedom to choose one of the many competing products instead. Apple won't mind: they have a very specific target market and they're sticking with it.

Stallman, however, is a hypocrite: he seems hell-bent on *forcing* the entire planet to kowtow to his own, rather peculiar, views and philosophy. This is the exact *opposite* of freedom.