833 posts • joined Friday 8th May 2009 16:41 GMT
Re: I'm always surprised at the naivity of people
"I mean seriously, what do you expect to happen if you download software the creator refuses to give you the source code? Why would anybody keep the source code from you other than wanting to defraud you?"
What earthly use is the bloody source code to someone who has no clue about programming?
People who can read source code and understand what it does – and who also happen to have lucked out in becoming expert in the same programming language(s) as the original developer – are unlikely to be ignorant enough to install such malware in the first place!
However, NOBODY can be an expert in every field of human endeavour. IT is just one field among many. How would you like to be told that you got exactly what was coming to you every damned time your ignorance of a particular subject betrayed you? How would you like it if every time you failed to make a multinational corporation compliant with the likes of Sarbanes-Oxley and ISO 20001, you saw someone pointing and laughing at your ignorance and calling you a "n00b"?
So much for your accusation of naïveté: We are ALL ignorant. We're just ignorant about different things.
Most people don't want to build their cars from scratch, nor do they particularly care how they work. They'll happily buy a Ford Fiesta, or a BMW, or whatnot, and simply drive the thing. All cars share one common feature: their core user interface. Some details will change from car to car, but if you've learned to drive in a Vauxhall Astra, it's a fair bet you can work out how to drive a FIAT Punto or any other make and model of car built since the 1950s.
For every James May, who could cite chapter and verse from the relevant Haynes manual for each car, there are a hundred Jeremy Clarksons, who couldn't give a toss how the bloody machine actually works. Yet most developers still believe everyone who has any contact at all with a computer should be like James May.
The IT industry has moved on quite a bit since the 1960s and '70s.
Open Source has become an anachronism. It is very much part of the problem, not the solution. Forget GNU, Stallman and the FOSS movements: they're yesterday's causes. The problem today isn't source code, but interfaces.
Not just in the software, but across the entire chain – from box art to silicon chip, from API to documentation – it's all about interfaces, not code.
End users should not be required to read complicated EULAs to determine whether the code they've downloaded actually does what it says on the tin. Why shouldn't they be able to pay for virtual gatekeepers to screen such things on their behalf? This is exactly why companies like Apple and Amazon have opted to provide such "gated communities" for their users.
Developers – and the IT community in general – really have only themselves to blame for this: you'll have massive flamewars over trivialities like tabs vs. spaces, while criticising the poor bloody users who have to put up with the ill-designed, barely usable, and barely-supported tripe you expect them to learn how to use. And then you think nothing of bundling in someone else's crap with your "free" software, because your definition of "free" isn't the same as the one in the dictionary.
The IT industry's problems aren't Apple's, Google's, Microsoft's or anyone else's fault but yours. You've had half a century of power, but you've chosen to ignore all the responsibilities that come with it. It's time that changed.
There is a veritable Babel of programming languages out there, and merely reading some books and tinkering about with each of them does not make you an expert.
This will come as a shock, but some of you clearly haven't understood what the "I" in "API" actually stands for. Or the purpose of good documentation. Similarly, a published data format is also an interface. Interfaces are everywhere.
Google Play is the only "walled garden" out there. It has gardeners who react to problems after they've happened, not gatekeepers who stop the problems getting in in the first place.
Re: I will get one because....
'But if I purchase a PS4 I can expect the wife to say "so what does it do different?"'
May I suggest you ask your wife why she is fine with spending hundreds – if not thousands – of quid on shiny lumps of rock artfully nailed into equally shiny bits of metal, or ooh-ing and aah-ing over a pretty arrangement of vegetable genitalia... yet she believes YOU are the shallow one for wanting to spend a few hundred quid on a new home entertainment* centre that actually does something besides merely looking shiny.
* a concept that also includes games. There's no need to make a distinction.
Re: A fine line between Vision and Arrogance
"Never quite understood whether the new user interfaces being foisted on us are because *we* (the punters) lack vision to understand it, or *they* are just being arrogant and treating us like cash cow cattle."
It's the former. Windows 8.x replaced the rather weird Start Menu with a proper app launcher, which also runs big widgets. All the keyboard shortcuts – which everyone calling themselves a seasoned veteran or professional should know – are unchanged. ALT+F4 will close a ModernUI app just as it closes a conventional Windows GUI app, for example.
There are textbooks on this. Many of them written as far back as the late '60s and early '70s, when the R&D phase for the WIMP desktop metaphor we still see on desktop GUIs today was still in its infancy.
That the above is clearly a surprise to many so-called "professionals" is shocking to me; it was very basic stuff when I was studying Computer Science in the 1980s.
@Buck Futter, Tannin, et al:
Many people seem to have a problem with Windows 8.x, but I've actually found it's easier to get newbies into it than it was with previous versions. Rather than presenting you with a pretty picture and some cryptic icons, it actually starts with an application launcher that shows a bunch of very clear tiles, each of which tells you what it does and even gives you some basic information before you've even clicked on it.
As for myself: according to every WIMP GUI rulebook, the GUI is there for *newbies*. Nobody else. Intermediate and advanced users are supposed to learn the bloody keyboard shortcuts!
If, like me, you had done just that, Windows 8.x would pose no difficulties whatsoever. Want to close an application – or even bring up the shutdown dialog box? ALT+F4. Each new release has added new shortcuts, but many of the existing ones have been there since Windows for Workgroups!
The problem is that nobody's teaching this any more. When so-called "professionals" proclaim themselves grizzled veterans with umpteen years of expertise in a platform, yet admit to being bamboozled by changes to what is, when you get down to it, a glorified app launcher, you have to wonder what they're teaching kids at university these days.
Such people are, at best, amateurs, not professionals. Their blatant ignorance of basic GUI usage rules is proof enough of that. If you're still relying heavily on a mouse or trackpad to get your quotidian work done, and you're not an artist or architect, you're doing it wrong. By definition. There are actual textbooks explaining all this.
That tiled GUI really is piss-easy for neophytes to understand. It's easy to forget that we had to *learn* to navigate the (original hierarchical) menus and drill down to our application – never mind having to remember *which* application we needed to open! Now, my aunt need only look for the "Mail" tile, see that there's a message or three waiting for her, and click on it. It's all there right in front of her. And this is a Good Thing™ as it means she needs to rely rather less heavily on her failing memory.
iOS and Android – hardly surprising given the former's influence on the latter – led the way, but Windows' ModernUI picked up the widgets idea and ran with it, making them the central feature. Trevor Pott's point about separating this from the old Windows GUI is a valid one, though: Windows 8.x is very much a transitional release, and it's likely Windows 9.x will be too, given the glacial pace of upgrading in the corporate field.
iOS was the first mainstream GUI to break with the old WIMP formula, so the keyboard shortcuts point doesn't apply to that. (Or to Android.) Microsoft also needs to make that transition, but whether beating Windows into submission with the multi-touch GUI stick over a number of transitional releases is the best way to achieve that is a question only the market can answer. In fairness, Windows 8.x is a pretty good choice for people, like myself, who have to do a lot of typing. Some of the hybrid Windows 8 "tabtop" devices out there are a perfect fit for my needs.
Wacom's Companion Pro (essentially a Wacom digitiser and stylus nailed onto a tablet very similar to the Surface Pro 2) is looking very attractive to me right now. It's flashing its ports seductively at me as I type this. Cease, you Jezebel! You tablet of the night!
Nurse! The screens!
I agree with part of the article: Steve Jobs was exactly what Apple needed in their hour of need, but that's largely due to his previous involvement with the company. The only equivalent for Microsoft would be the return of Bill Gates, who has already made it clear he's not interested in retreading old ground. (For all Jobs' later success, Gates didn't need to mess it up and spend years in the wilderness to learn the necessary skills. Gates nailed it first time around.)
However, I disagree with the tiresome repetition of a pointless meme: what Apple has is a *gated community*. It's not the walling-in that's the point here, but the *curation*. Android has barely any curation at all, hence its frequent security issues. iOS' App Store, on the other hand, *is* curated, which is less like a gardener wandering around a walled garden and occasionally reacting with an, "OI! Gerroff the lawn!", and more like the guards of a gated community who stop undesirables getting in in the first place. (No, they're not 100% successful, but they're close enough.)
What Microsoft needs is *focus*. It is making a mistake Apple was making in the mid-90s: it's doing too much. They sell no less than three flavours of desktop Windows, each with multiple variants. They sell server variants too. They make a games console (also with its own OS), they make games, they sell a major office suite, own a bunch of cloud services, and they sell industrial-strength software development tools too. They even make keyboards and mice.
Jobs was right to slash Apple's massive, and very confusing, product range when he returned: focus is a common factor in very successful businesses. You can't do that when you have a portfolio even wider-ranging than Apple's under Gil Amelio's tenure.
Unlike Jobs' scorched Apple policy, Microsoft could slash its portfolio by simply spinning off the profitable units into separate entities. Microsoft needs to make itself agile enough to react more quickly and effectively to the ever-changing world of IT – an industry that is, almost by definition, in a state of perpetual transition.
THAT is the hard part: changing Microsoft's management and corporate structure entirely. Given the present corporate structure, it may be that what Microsoft really needs today isn't so much a Steve Jobs, as a Genghis Khan.
Re: The Goons, really?
The original 23rd November 1963 broadcast was indeed followed immediately after by an episode of the Telegoons ("The Canal" episode).
The first episode of Doctor Who was also repeated the following week, on the 30th of November, due to the assassination of JFK on the 23rd overshadowing the first broadcast. The next episode of the Telegoons was not broadcast until the 7th of December – skipping the 30th. I can't find any scans of the Radio Times for the 30th though, so I'm only speculating that it was likely a casualty of the decision to show the repeat.
Re: 80's Doctors
Tom Baker had a few great episodes, but his theatrical acting style hasn't aged well. And there were a hell of a lot of duds too, not to mention an over-reliance on 'homages' to old gothic horror movies.
In fairness to Michael Grade, I think Colin Baker's portrayal suffered mainly from coming after Peter Davison's. Colin gave it his all – a little too much so – but the problem is that he and Tom Baker both have a strong theatrical acting background and it *really* shows when watching their stories on DVD. They both come perilously close to channeling Brian Blessed at times.
Peter Davison was very much a TV actor first and foremost and understood the medium's strengths and weaknesses. He gave a much more subtle, nuanced, portrayal. Casting Colin felt like a step backwards. Despite some decent episodes – I'm very partial to "Vengeance on Varos" and "The Two Doctors"* – Colin's tiresome schtick of repeating the same word three times, each louder than before – "Hammy? Hammy?! HAMMY?!!!" – grated very quickly. Davison basically made the two Bakers' theatrical acting style obsolete.
McCoy was an inspired choice though, so it's a shame he was saddled with a bunch of scripts originally written for Colin Baker's portrayal for his first season. (And some very odd choices of scripts for the second.) His final season, on the other hand, holds up pretty well despite the series' shoestring budget and the BBC's utter indifference to the series itself.
All that said, I'm still amazed the series survived the casting of Bonnie Langford. Poor Colin Baker. He never had a chance.
* (Patrick Troughton and John Stratton were clearly having a whale of a time, and we also got Jacqueline "Servalan" Pearce thrown in as well.)
There's a good reason why Apple kit tends not to have much upgradeability: eBay.
Apple hardware tends to hold its price very well – especially at the "pro" end. My 17" MBP (2010 model) has 8GB RAM and a 512 GB SSD. It screams. And it's still worth over £600 on eBay, despite being closer to its 4th birthday than its 3rd.
I've yet to see the HP, Dell or Lenovo laptop that can boast the same.
Many owners of Apple kit are well aware of this and tend to effectively trade-in their old model for a new one every couple of years. Why the hell would they waste their time going through all the bother of upgrading?
(Before you reply: being a reader of The Register does not define you as a professional computer user, any more than being able to strip down and rebuild a V8 engine makes you a "professional" car driver.)
Re: As an ex- tech pubs* guy...
I used to write docs myself.
My niche was the game development tools industry: I wrote the user guide for Criterion's "Renderware 3" / "Renderware Graphics" (for which I can only apologise), and also the docs for the first few releases of Allegorithmic's "Substance" suite. I did some odds and ends for the Unity folks too some years ago. It's amazing how easy it is to find shockingly overpriced documentation tools that make even Emacs look like a simple, elegant editor by comparison.
"The engineers (as noted) hate documentation because they KNOW that their products are brilliantly intuitive and so NEED no documentation;"
This. Oh, so very this. (I lost count of the number of times I had to remind some of my colleagues what the "I" in "API" stood for. Developers can be end users too, so even an API should have basic UI design rules applied to it.)
Thing is, writing documentation really is a truly thankless task. Everybody hates what you do and considers your entire job pointless. End users have become so used to manuals being either missing, or abject shite, that they assume it's safe to expect this to be true of all applications. Not only do they never read your 500+ pages of bloody hard work, but they'll actually be surprised such a source of information even exists. And your own colleagues also see you as some kind of parasitic life form whose job appears to be asking them to explain what they consider blindingly obvious. There is nobody so blind as an expert.
Good technical authoring is bastard hard to do. Not only do you have to become an expert in the entire product you're documenting, you also need to be able to explain it to your readers without overwhelming them with information. Just enough background information – plus links to more in-depth info – and no more. And complex software can have features that rely heavily on other features too, so you need to organise the teaching to ensure you have full coverage.
Not everyone can do it effectively, despite many developers' belief to the contrary. Frankly, most developers I've met – including many with "Ph.D." after their names – seem to be either flat-out illiterate, or insist on making everything read like a particularly obtuse scientific paper. They sure as hell don't understand how to teach well.
Never mind that the greatest feature in the world is of no f*cking use to anybody if they can't work out how to use it. (Ironically, this is precisely how Apple have crawled out of near-bankruptcy to the top of the IT heap: they truly grok user education.)
I got the hell out when I realised that most of my potential clients had begun to see support as a chargeable feature, turning it into a revenue stream: a decent manual suddenly becomes a bad idea as it reduces support calls and, consequently, revenues from that particular stream. (Indeed, this appears to be how most GNU / FOSS applications are actually supported financially: write something that's genuinely useful, but give it a cryptic UI that effectively forces your customers to pay you for training and support and you can make a decent profit.)
I do translation now, which is a whole fresh hell of WTF in its own right, but at least people actually appreciate your work. They also don't tend to claim that "it's just writing! Anyone can write!"
Re: Really really basic computers
Procedural languages, functional languages, OOP, etc., are mostly just different kinds of organisational scaffolding. No mainstream CPU actually gives a toss about any of that stuff; it all ends up as machine code in the end – often on an Intel or ARM CPU, neither of which cares one whit about whether you like to organise the original code in your source files as subroutines or as objects. Same meat, different gravy.
None of that stuff matters.
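The "same meat, different gravy" claim is easy to demonstrate without dropping to actual machine code. A quick sketch using CPython's `dis` module (the names `add_proc` and `Adder` are my own, purely illustrative) shows that a plain subroutine and an object method with the same body compile down to the same sequence of bytecode operations:

```python
import dis

# "Procedural" style: a bare subroutine.
def add_proc(a, b):
    return a + b

# "OOP" style: the same logic wrapped in a class.
class Adder:
    def add(self, a, b):
        return a + b

# Compare the operation names each one compiles to.
proc_ops = [i.opname for i in dis.get_instructions(add_proc)]
oop_ops = [i.opname for i in dis.get_instructions(Adder.add)]

print(proc_ops == oop_ops)
```

The organisational scaffolding lives entirely in the source file; by the time the interpreter (or a compiler) is done with it, the distinction has evaporated.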
What matters is understanding how computers "think", because programming is just a synonym for "translation" and is actually pretty easy to learn at that level. I was far more productive coding Z80 or MC680x0 code in assembly language than I ever was writing in C++. I used to be proud of writing bug-free code, and it really *was* bug-free. But those days are long gone. The hardware has become orders of magnitude more powerful and capable, but the tools we use to program it all have barely changed since the flint axes of the 1970s.
It's 2013 and we're *still* writing code using artificial languages that require us to walk on eggshells due to their cruel and unusual punctuation. My *phone* can render a Mandelbrot set in real-time and run full-on 3D First-Person Shooters at HD resolutions, and yet we insist on forcing *humans* to do stupid grunt-work like adding a semicolon at the end of a line to save the compiler from a picosecond of calculation? How the hell is this even acceptable? How is this not front-page news in The Register and all its rivals? THIS is the scandal of our time.
No wonder today's software comes wrapped in legalese instead of warranties.
Re: Storage indeed
"No SD card in the nexus, but at least you can use a USB stick, or 20, or a USB microSD card reader if you like..."
So, er, just like an iPad with the Camera Connection Kit then?
No, it doesn't have a MicroSD slot built in, but most users don't need it. Why compromise a product's design and usability* for the sake of a relatively tiny number of edge cases?
If you want a tablet with stacks of storage, buy a Windows 8 model instead. Some of those come with 256 GB (and even 512 GB) of storage.
* (removable storage is a royal pain in the arse from a UI perspective. There's a bloody good reason why Apple went with software-eject systems for their Mac floppy disk drives back in the day.)
Re: Many tools for the job?
"and Microsoft isn't going to fund actual education."
Black & Decker aren't going to fund metalwork or woodworking classes either.
It's Microsoft's job to supply the tools. Anything else they do is icing on the cake. It's the school's job to provide the actual 'education' part. By offering industry-standard tools that students will actually encounter in the real world at a massive discount (and effectively for free for many students), Microsoft are reducing the total funds required for that education.
Sure, you could use Libre/OpenOffice, but it simply isn't that good. If it was, businesses would have standardised on them long ago. They are, after all, free. If you can't even gain market share by giving your product away for nothing, the problem isn't your competition. It's you.
Support counts for a lot. As does (relatively) easy customisation and extensibility, as well as a huge ecosystem of third-party applications that plug into the Microsoft Office suite. Companies like SDL, who create translation software, rely on MS Office being installed to handle the preview feature for Word-supported file formats, for example.
Libre/OpenOffice (as well as Apple's own "iWork" suite) originally competed with Microsoft Works, not Microsoft Office. Sadly, Microsoft axed Works a long time ago, but it seems the *Office communities haven't really understood what it is that makes businesses so willing to pay for licences for Microsoft's products regardless. It's not merely inertia.
Re: If you get them young and you will have them for life
"should openly promote FLOSS because it's the right thing to do."
Wrong answer: Discrimination is wrong no matter which side you discriminate against. Either way, it's still discrimination. The correct answer is to demand that schools use both. It's not as if installing LibreOffice would add much, if anything, to the costs, and both suites can use the same file formats now. It'd also be a lot cheaper for students, although Microsoft's offering goes a long way to help there.
Software is a damned tool. I don't need to know how a hammer works if all I intend to do with it is hang some pictures on a wall. Open Source only has value if there is sufficient interest and competence in your target market to understand it and work with it. It's worth pointing out at this stage that, when the GNU movement first began, there were a damned sight fewer programming languages and philosophies around too. Today, there's a veritable Babel of such languages, with new ones seemingly being invented every other day, so the chances of your audience actually being sufficiently competent in the language(s) you're developing in are shrinking, not growing.
Open data formats are far more useful and valuable than mere access to source code. What matters is that my data remains accessible if development on the tool that created it should end. Given that both MS Office and the Open/LibreOffice suites can read ODF files now, this is no longer an issue; all schools should have to do is teach the value of open data formats.
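The point about open data formats is worth making concrete. An ODF document is just a documented zip archive containing XML, so even if the application that wrote it vanishes, a few lines of bog-standard library code can recover the content. A toy sketch (the file contents here are a simplified stand-in; a real ODF content.xml uses XML namespaces):

```python
import io
import xml.etree.ElementTree as ET
import zipfile

# Build a toy ODF-style archive in memory: a zip containing XML.
# (Stand-in for a real .odt file saved by some long-dead application.)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("content.xml",
               "<document><body><p>Hello, open formats.</p></body></document>")

# Years later, original tool long gone: the format is documented,
# so any generic zip + XML tooling can get the data back out.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    root = ET.fromstring(z.read("content.xml"))
    text = "".join(root.itertext())

print(text)
```

No source code for the original application was needed at any point – only the published description of the format.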
Re: even my kids with learning disabilities can manage that
It helps if you check the anchor and text-wrap settings for tables and pictures. Once I understood what each of those settings actually did, I had no further issues.
That said, Mr. Stross needs to learn that he should use the right tool for the job. MS Word isn't designed for writing novels. It's a *corporate document* creation tool. (Which I use frequently and have never had any problems with. I translate for a living, so MS Word is unavoidable: unlike most professions, translation has no concept of a "standard" document format, so I have to be able to deal with DOC, PDF, XLS, CSV, AI, InDesign files, PO files and more.)
For book design and development, I use tools like Ulysses III or Scrivener. Both are highly recommended and a far better fit for writers and novelists than MS Word. The only reason for using MS Word as an author is either wilful* ignorance or masochism. There's no need for it. I know professional novelists who swear by Scrivener, for example.
I believe Scrivener has a Windows version now. I'm not sure about Ulysses III; I suspect not. Then again, even Screenwriter 6 has a "Novel" template now, though it's mostly aimed (not surprisingly) at screenwriters.
* (Google exists, Mr. Stross. A novelist whining about how hard it is to write with MS Word is like an architect whining about how hard it is to design buildings in Logic Pro X. All you do is make yourself look like an ignorant twit.)
To be fair, such a digression could easily fill another 3-4 articles on its own.
Magnetic recording triggered the birth of the technique many (incorrectly) refer to as "sampling" today. (The term is "sample loops" or just "loops" – "sampling" refers to the process of recording the actual sounds digitally.) The looping of those sampled sounds underlies almost all music today. It's most obvious in the work of Norman Cook, but even 'live' acts use it as a matter of routine now in their studio recordings.
This technique was first used in the 1940s and led directly to the "Musique Concrète" movement that was so influential on the likes of the BBC Radiophonic Workshop. Most of their output during the 1950s and '60s was Musique Concrète, albeit often with very early synthesised sounds added.
The term "loop" came from these pioneers: they had to do it the hard way, by recording sounds onto tape, re-recording them as many times as required onto another tape – at different speeds, in order to get the necessary notes – then chopping the resulting tape up into individual notes. (This was all worked out mathematically, hence the maths or engineering backgrounds of many of those involved.)
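The mathematics in question is simple enough to sketch. Under equal temperament, each semitone is a factor of the twelfth root of two apart, so re-recording a tape at the right speed multiplier shifts the pitch by the desired number of semitones (the function name below is my own, purely illustrative):

```python
# Equal temperament: each semitone is a factor of 2^(1/12) in frequency.
# Playing a tape faster raises the pitch by exactly that speed ratio,
# which is the arithmetic the Radiophonic Workshop did by hand.

def tape_speed_ratio(semitones: float) -> float:
    """Playback-speed multiplier needed to shift pitch by `semitones`."""
    return 2.0 ** (semitones / 12.0)

print(round(tape_speed_ratio(12), 3))  # one octave up: play at 2x speed
print(round(tape_speed_ratio(7), 3))   # up a perfect fifth: roughly 1.498x
print(round(tape_speed_ratio(-12), 3))  # one octave down: half speed
```

Hence all those engineering and maths backgrounds: every note in a piece meant computing a speed, re-recording, and measuring out the right length of tape.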
These individual snippets of sound – bass lines, rhythm / percussion cycles, even short sections of melody – were then joined together to form the necessary tape loops needed to produce the final piece. For example, the percussion sequence might be quite short and result in a fairly short loop of tape that could be repeated over the full track. (Most music is filled with repeated patterns and motifs. Listen to any dance track and it's pretty obvious, but you'll hear it even in Beethoven and Bach.)
A complex piece might require multiple, very long, loops of tape, all played in synchrony, with some loops so long that the team would have to jerry-rig a system of reels and pulleys to run the tape out of the machine, out of the door, down a corridor, and all the way back again.
Multiple tape machines were used for this. Even "bouncing" tracks is a term descended from this era.
Digital audio workstations ("DAWs") hide all the fiddly mathematics and tape editing today, but the underlying concepts and principles haven't changed.
Most of this can be gleaned from the BBC's own (belated) celebration of the defunct BBC Radiophonic Workshop: "The Alchemists of Sound". It's a shame the BBC never expanded it into a series covering the wider context of electronic music in general.
Re: @Sean Timmarco Baggaley - "Yet he, too, chose to stay and work with Jobs."
Seriously? Tim Cook – whose core skill set is in optimising supply chains – has "non-transferrable skills"? Jony Ive had his own UK-based design company ("Tangerine") before he moved to California to work for Apple. With his reputation, I doubt he'd struggle to find clients if he decided to strike out on his own again. So why didn't he do that?
Quite a few people did leave Apple. Some of them came back. How can that be if Jobs was such a git?
Jobs' own friends have described him as "mercurial", but that's hardly a unique trait in a CEO. You have to be pretty ruthless to turn a company from a near-bankrupt basket case into the most successful business on the planet in just ten years.
Most of the people who give the keynotes at Apple launches today could have left at any time – the likes of Ive and Federighi could easily write their own tickets given their track records – yet they haven't done so. They have chosen not to do so. Jony Ive wasn't even a Jobs hire: he was already at Apple long before Jobs returned and had never worked as an employee of either NeXT or Pixar. So you can't claim cronyism either.
So, again: if Jobs was such a nasty piece of work, why the hell did so many people who could have easily walked right into a new job (or set up on their own) choose to stay and work with him?
Answer: he wasn't as big an "asshole" as he's made out to be – usually by people who never actually met him, or spent any substantial time with him. Jobs was certainly a control freak and a perfectionist, so it's not difficult to see why he hated doing anything that he could not have any control over. He practiced some of those keynotes for weeks.
Reporters and journalists tend to be interested in people, but Jobs didn't talk about himself much. He was no Richard Branson, whose entire career has mostly involved blowing his own trumpet and pimping his "Virgin" brand. The latter gave good interviews, but Branson is unlikely to go down in history as anything other than a famous, self-publicising beard who got very lucky and milked it for all it was worth.
Re: "Sorry" is the hardest word.
No, this is what happens when a publication has more than one writer working for it. Different writers = different perspectives.
Also, some forms of what was once known as Asperger's Syndrome are linked to high levels of paranoia and destructive behaviour. The term "ASD" was created for a reason, and this is one of them. A person closer to the High Functioning end of the ASD will be able to cope with "normals" with a bit of help. As you get closer to the other end of the ASD, you get people who simply cannot function at all in ordinary society. Such people are often completely isolated from the people around them and, yes, paranoia and destructive behaviours are not uncommon.
And this is a "spectrum" – think of it as like a timeline, with one end of the line being "normal" and the other end labelled "completely hatstand". People with Asperger's Syndrome tend to be closer to the "normal" end, but the boundary between "Asperger's" and full-cream "Autism" was never satisfactorily and unanimously defined. Hence the replacement of the multiple variants with just one: Autism Spectrum Disorder.
Describing McKinnon as "paranoid, unreliable and destructive" would be quite valid if his diagnosis puts him closer to the full "hatstand" end of that X-axis. The worse your Autism, the harder it is to cope with other people in any way at all – even your own parents.
An appearance of paranoia is not uncommon as you move away from the 'high functioning' end of the spectrum, but you need to understand that we typically apply such descriptions based purely on outward behaviour; it can be difficult to tell if what we call "paranoia" is simply a manifestation of a more general, deep-rooted, terror of that constant torrential flood of data an autistic person is faced with during their every waking hour. Similarly, what an observer would consider "destructive behaviour" might have a perfectly rational explanation for the sufferer.
If any readers here have never read Oliver Sacks' seminal "The Man Who Mistook His Wife for a Hat" and related books, I strongly recommend you do so.
The "emotionless = violent psychopathic killer" connection is made by people (and TV and movie writers) who appear incapable of understanding that even anger is itself an emotion.
Why would someone who doesn't react much to any emotion decide to suddenly make an exception and explode with rage and fury? It's not that emotions aren't there, it's just that they're felt nowhere near as intensely, be it joy or sadness, love or hate.
Such people tend to do well in jobs where what they have to do would turn any other person into a gibbering emotional wreck.
How you express your emotions is a far better indicator of how violent you're likely to be. If you're someone who feels every emotion intensely, you will feel both joy and sadness, love and hate intensely. If you're also prone to bottling-up your emotions and venting only when that metaphorical bottle is full to bursting, you're likely to be far more dangerous than anyone on the Autism Spectrum.
I suspect that such emotional issues are more closely linked to current theories on depression and related disorders rather than the data-processing / management problems characteristic to autism. (Indeed, there's no reason to assume someone cannot have both an ASD and, say, clinical depression.)
The mammalian brain is a complex machine – far more so than any computer. Yet we've had the latter around for about 60 years now and, despite a CPU being essentially a collection of transistors, the behaviour of each of which should be entirely predictable, we still struggle to write bug-free software to this day. It's not the hardware that's difficult, but the code that runs on it.
The theories and hypotheses we have on the brain itself are pretty much at a similar level: we have a pretty good idea what individual synapses and neurones are, but have barely scratched the surface of what it is they actually do all day.
Re: What hasn't been mentioned....
'That would be an absolute nightmare for app developers.'
Tough. It's not the user's job to make life easier for the developer. It's the developer's job to make life easier for the user.
Android's APIs clearly need a serious rethink if this is such a chore for developers to deal with. iOS app developers have to deal with this kind of thing too and most do so without kicking up a big fuss. (It helps that the relevant iOS APIs are pretty easy to use. Perhaps Google should be aware that the "I" in "API" stands for "Interface" – i.e. developers need good UIs too!)
'How do you deal with an angry user who's blocked a fundamentally required permission for your app and then starts reviewing it poorly because "it doesn't work"?'
Oh, I don't know... how about being better at app design and development, catching the errors caused by disabled permissions, and failing gracefully with suitably clear messages and notices to the user explaining why a feature isn't working?
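To make that concrete, here's a minimal, platform-agnostic sketch of the pattern (plain Java, not the real Android permission API – the `PermissionChecker` interface, `describeNearbyStores` and the "LOCATION" name are all illustrative assumptions):

```java
public class LocationFeature {
    // Stand-in for a platform permission check (hypothetical, not the real Android API).
    interface PermissionChecker {
        boolean hasPermission(String name);
    }

    static String describeNearbyStores(PermissionChecker checker) {
        if (!checker.hasPermission("LOCATION")) {
            // Fail gracefully: tell the user *why* the feature is unavailable,
            // instead of crashing or silently doing nothing.
            return "Nearby stores unavailable: location access is disabled for this app. "
                 + "You can re-enable it in Settings.";
        }
        return "Showing stores near you.";
    }

    public static void main(String[] args) {
        System.out.println(describeNearbyStores(name -> false));
        System.out.println(describeNearbyStores(name -> true));
    }
}
```

The point isn't the three lines of logic; it's that the denied-permission path is treated as a designed-for state with its own clear, user-facing message, rather than an unhandled error.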
Re: Aspies are 13 to the dozen.
Actually, as another poster mentioned earlier, the term "Asperger's" is no longer used.
Spectra and continua are increasingly being used in diagnoses, instead of the older, 'digital' approach that required ticking a very specific list of boxes to obtain a diagnosis.
The human brain is such a complex organ* that the very idea that "normal" people even exist is just bizarre. It's far more likely that people with fully-functioning brains are actually quite rare, while the rest of us range from the extremely knackered to the only slightly buggered. And that buggeredness could be anything, from a poor sense of direction, through to something like colour-blindness or a potentially terminal inability to read user guides (or see the bloody great "Help" menu right up there at the top of the window. Look! See? That! What do you think it's for?)
Asperger's is not – and never has been – a license to be an asshole, not least because "asshole" is a very subjective description and depends on your point of view. Most people seem to believe the late Steve Jobs was a right tosser, but Jony Ive, Tim Cook and their colleagues don't appear to have thought so.
However, "high functioning" sufferers can also learn to be sociable. Much of what most "normal" people seem to be able to do subconsciously, such as reading body language in real time, is something we "data-processing impaired" have to concentrate on consciously – often to the point where we end up with splitting headaches from trying to process all the data.
I'm one of those who also struggle with noisy environments. I've found lip-reading helps, and also goes a long way towards countering the eye-contact avoidance problem too, but it often makes moi brain 'urt!
Which is why I hate parties.
* You, sir, have a filthy, filthy mind.
You might want to read the Techies with Asperger's article elsewhere on the site before making such comments. There's strong evidence that Jobs had mental health issues and I suspect that those who hated him were mostly people who couldn't make allowances for that. Even as a child, he tended to think digitally: everything was either "awesome" or "shit". There was never any middle ground. So those traits were there right from the start.
Asperger's (or some other part of the Autism spectrum) would certainly explain a lot, as would OCD.
Apple has been the biggest contributor to the "Product Red" charity for some time now, so the notion that Apple never donated any money at all under Steve's watch is patent nonsense. Jobs' estate has also pointed out that they preferred to donate anonymously.
Jonathan Ive, Tim Cook, and many, many others at Apple have had plenty of opportunities to leave the company and either go work for others, or set out on their own. (Some have, in fact, done precisely that, as a look at Woz's full CV will attest.) Furthermore, some Apple people left, then willingly chose to return.
Let me repeat that, in case you missed it: these people chose to work at Apple with Steve Jobs. Jobs was in charge of Apple for 13 years or so. Jony Ive was hired in the early '90s – long after Jobs' initial departure and some years before his return. Yet he, too, chose to stay and work with Jobs.
Thirteen years is plenty of time for all those people to update their CVs and send them out. So what stopped them? Did Jobs hire heavies to point guns at their heads? Did he kidnap their families? If Jobs was truly the complete arsehole he's often made out to be by his detractors, why didn't all those people who worked for him leave?
The man clearly inspired a surprising amount of loyalty from his friends, so he must have had something going for him. When he died, he'd been married to the same woman for 20 years, and they had three children together. If he was that hard a man to work with, how did that happen? When they married in 1991, Pixar's first success was still four years away and NeXT was hardly a money-spinner either, so Laurene Powell certainly didn't marry him for his money.
Jobs was the one who suggested creating a business out of making and selling computers. If Woz had had his way, Apple would never have happened and it's doubtful if Woz would be as well-known today.
Jobs was the driving force behind Apple from its inception. Woz clearly wanted so little to do with it that he effectively left the company as a full-time employee in 1987. (Apple still pays him a $120K / year salary and Woz also has stocks in the company.)
The philanthropy issue is interesting: Jobs' widow has stated that Jobs did donate to charity, but preferred to do so anonymously.
And consider how many people have jobs because of Apple: not just Apple's own employees, but the thousands of developers, support businesses and all those involved in the production and distribution chains. In today's economy, having a steady income is very much a Good Thing. This is as much a benefit of Apple as any amount of pissing money into the black holes of the WWF, Oxfam and their ilk.
It's also worth mentioning that donating to charity is often considered an excellent way to reduce your tax burden. This is one of the most popular reasons for a business to donate money. Altruism often has bugger all to do with it.
Re: 5S and 5C impressive for different reasons
It's a different kind of sensor to the crap sensors used in the past! It's been explained in any number of articles that this is a capacitive sensor with the equivalent of a 500 dpi resolution that looks beneath the surface of the skin. It's not an old optical surface scanner as used on those old, less reliable, sensors.
Your assertion is like claiming that iPhone 4's "retina" display was just a gimmick because "every phone has a display".
The iPhone 5s isn't the first phone to include a fingerprint sensor, no, but it is the first to include one that is (a) not shit, and (b) actually integrated right into the OS at a very deep level, making it far more useful.
Apple are all about creating a seamless user experience. They only include a technology if it will actually enable them to improve that. Any new technology they include in their products therefore has to be integrated at a very deep level in order to provide that appearance of seamlessness.
(And, before the usual bunch of ignorant haters jump in: no, they're not perfect. Last time I checked, neither were any of Apple's competitors.)
Nothing I've written above should be even remotely surprising to regular readers of El Reg as all the evidence has been right there, out in the open: Neither Jobs nor Apple have ever hidden this information from the public. They've even produced TV spots explaining all this. And not just one, either. Many of their ads are there to educate people about Apple's philosophy of making design and the user experience – not merely shovelling raw, unprocessed technology into a box for its own sake – their top priority.
They've been doing this for at least fifteen years now.
What more do they have to do? How long will it take for some of you to grok that Apple are not Microsoft or LG and are successful precisely because of how they differ in their approaches to product design and production?
Re: Please please PLEASE ...
He's well aware that PCs and tablets are two different things.
Which form-factor is selling like hot cakes?
Which form-factor is seeing sales drop off a cliff?
75% of all PCs sold three years ago were laptops. That percentage is rather higher now. Laptops typically come with trackpads, not mice. Even Windows laptops have had trackpads that support multi-touch gestures for some time now.
The traditional "separates" PC design is already a niche market. That form-factor is limited primarily to high-end workstations.
Dell sold about 9 million PCs last year. Asus sold about 8.6 million. Acer sold a little over 6 million. In a year.
Apple sold over 9 million iPhones 5s and 5c models over a single weekend.
Canonical are well aware of the future of IT and consumer electronics. At least they're doing something. They might not get it right first time – Microsoft have a tendency to iterate too; the first reasonably popular version of Windows was v3.1 – but it's the attempt that matters.
Both the GNU / FOSS communities and Microsoft have proved that you can cock up spectacularly regardless of your corporate / community culture and politics. How long have we been waiting for the "Year of Linux on the desktop" again? And yet, there's Apple with a full-fat *BSD UNIX OS that's been made so user-friendly, most of their customers aren't even aware of its UNIX heritage. And they pulled that feat off not once, but twice, with OS X and, later, with a completely redesigned GUI for iOS.
Without vision, effective guidance and focus, you're screwed. Whatever your personal views on Mr. Shuttleworth and his vision, at least he has one. The GNU / FOSS movements do not. All they have is a tired, anachronistic political dogma of no relevance to anyone who doesn't actually program computers for a living. (And even then, it's only a tiny subset.)
Re: End the Drug War or No More Debt
"All one can say for sure is that hundreds of millions of dollars have been spent on legalizing marijuana - who has that kind of money?"
Tens of millions of people?
There's nothing writ in stone that says funding has to be provided in one gigantic lump sum from a single donor.
Thunderbolt isn't competing with USB.
The two are apples and oranges: Thunderbolt 2 (which is the version in the announced, but not yet launched, new Mac Pro) is already at 20 Gbits/sec and is part video port, part data port.
However, the crucial difference is in the way they connect with devices at each end: Thunderbolt is, to all intents and purposes, a PCI bus on a rope. That's why each port requires a PCI lane. It interfaces with the computer (or peripheral) at a lower level than USB.
Thunderbolt, like Firewire, also does all the heavy lifting itself; your computer's CPU is therefore not bothered with running the protocol at all. USB, on the other hand, requires the CPU to do some of the work. For consumers, the difference is academic, but for many professionals, anything that reduces valuable CPU capacity for their own projects is a big no-no. Why spend eye-watering sums of cash on a high-end Xeon CPU and waste even a fraction of that power on USB processing overheads?
With USB, the CPU has to not only handle video data compression for my video editor application, but it also has to set aside some of its time to handle some of USB's duties during file transfers. This lengthens the time required for that video work, which is the work that actually *pays my bills*.
With Firewire or Thunderbolt, the CPU is *entirely* free to work on *my* stuff, not the connector's. My computer is therefore helping to make me more money per second. If I have to pay a one-off €29.99 for a cable to ensure that, I won't even blink at the cost as I know I'll make it back within a day or two.
So, no, Thunderbolt won't be a big consumer connection standard. Neither was Firewire. But the latter had its place in the high end prosumer and professional markets, and so will the former.
Last time I checked, Apple were rather fond of targeting both those markets. That's where the profits are.
Re: Keyboards for the win, till speech really is recognised
Matias do a lovely version of their Pro series which explicitly supports connecting to mobile devices via Bluetooth. Might be worth a look.
(I'm quite fond of my Unicomp IBM-based keyboard, but I have an older Matias Tactile Pro sitting next to it for when I feel like a change of feel.)
Did anyone *force* Nokia to bring in Elop?
Then Nokia's woes are entirely their own damned fault.
They had Symbian... and let it wither and die through under-investment.
Their in-house Series 40 OS has seen precious little love over the years too.
They mucked-about with various Linux derivatives and... f*cked those up too.
By the time Android had reached the point where it could no longer be accused of being a flagrant iOS rip-off, it was too late for Nokia.
The writing was on the wall when Apple launched their first iPhone and subsequently proceeded to eat Nokia's high-end lunch. Nokia *should* have seen multi-touch devices coming. They didn't. Neither did most of the other incumbents of the day: Whither Sony-Ericsson? Where is LG? Even Motorola is now just a department of Google, Inc. Nokia's management became complacent and, ultimately, incompetent. Today, all they have left is their mobile networks business; their mobile arm was effectively junk long before Elop knocked on their door, so offloading it now is more a case of "What the hell took you so long?" than "Ooh! Elop was a Microsoft mole!"
This is what competition *means* in a (mostly) Capitalist society: Company A seizes the castle and gets to play king for a while, making it the target for all its rivals. Unlike those rivals, Company A only has to make one serious blunder and one of those rivals will come along, spank their corporate arses, and take over that castle. It then becomes that new company's turn to rule for a bit.
Rinse and repeat. This corporate success / failure churn is cyclic and fairly predictable.
It's also why the most successful CEOs don't tend to be nice, friendly little doormats. You don't get to demand such telephone-number deals and salaries unless you have a ruthless streak wider than an airport runway.
Dear El Reg...
... still no "Edit" button? In 2013? Seriously?
I see you also still haven't fixed the bug that deletes empty lines between paragraphs when copying to the clipboard in Safari on OS X. This only seems to affect comment posts; articles copy just fine.
Considering the industry you're supposed to be reporting on, this is a very poor show.
Must be a slow news day.
Apple don't have anywhere near the same number of data centres Google does, nor do they own their own undersea / cross-border cables – not that I'm aware of, anyway. Google own some of those too.
Sadly, Apple and Google had a bit of a falling-out a few years ago and they haven't been on speaking terms of late. I understand El Reg is similarly in Apple's bad books. The upshot of which is that Apple are almost entirely at the mercy of the middlemen who connect you with their servers: your ISP, their peering, interconnections like Telehouse, and so on, all along the line. It only takes one bottleneck to slow things down for everybody else.
And yet, every single bloody time a major online roll-out like this happens, we *always* get the same tiresome filler pieces bemoaning the fact that—shock!—the internet is not, in fact, an exception to the laws of physics.
(Besides: what happened to waiting a day or so to see what issues all those early adopters have run into first?)
Re: Well at least the phone update went OK
Wait for your drum machine app to be updated.
Many developers prefer to wait until the GM (i.e. final) developer seed of any new OS before working on major updates to take advantage of changes. iOS 7 introduces a number of new features to the core libraries, so it's always better to wait for these to bed in.
This has always been the case with previous iOS releases, so it shouldn't really come as a great shock that many apps haven't been updated for iOS 7 just yet. Expect a rapid series of point updates – it's got a *lot* of new code under the hood; the cosmetic changes are just the tip of the iceberg – as well as updated apps over the next couple of months. My money's on iOS 7.1 appearing around the same time as the new iPads and OS X 10.9 are released.
(Note that many apps are still following the old iOS 6-and-earlier UI guidelines, so it'll take a while before the app launcher starts to look a bit less like an explosion in a Technicolor® Yawn factory.)
iOS developers have had *months* to check out the beta versions, new APIs and developer docs, so it shouldn't take long for your favourite app to get an update. If they seem to be dragging their feet, switch to an app written by a team that actually gives a damn about their customers.
Re: About as un-Jobsian as you can get
"I can say with absolute certainty that Steve Jobs would never in a million years have sanctioned the release of the aesthetic disaster that is the iPhone 5C."
The 5c is a coloured iPhone. You know: like the coloured *iPods* Apple have been selling for years now. Did you not notice those?
Have you also forgotten the multicoloured *plastic* iMacs that helped bring Apple back from the brink, nearly 15 years ago?
All of those were released with Steve Jobs' explicit approval while he was still alive.
Re: Tim Cook needs to go. He is destroying Apple.
Because, of course, the iPhone 5s' sensor is *exactly* the bloody same as those useless little finger-swipe models that never actually worked reliably.
Oh, wait, it isn't, and you're talking bollocks. Again.
Incidentally, the *only* people expecting Apple to suddenly change the habit of a lifetime and suddenly decide to cater to the low- and mid-range markets with a "cheap" iPhone were the same pundits who continually insist that they know _exactly_ what the late Steve Jobs would have done despite having never actually met him. Apparently, all the people who actually *worked* with Jobs for many years don't have a clue what he wanted.
Come to think of it, *any* pundit who was actually any bloody good would be more than wealthy enough not to have to write link-bait bollocks for third-rate websites for a living. So those who do can be safely ignored.
Re: The comparison will be made
Sorry, but I disagree.
I recently helped a relative replace her ageing Nokia. She was on an extremely tight budget, so we originally planned to replace it with, possibly, a feature-phone at best. Instead, we ended up with the snappily-named Samsung Galaxy Star s5280. Cost? €80 in-store, SIM-free.
(More info here: http://www.samsung.com/it/consumer/mobile-devices/smartphones/smartphones/GT-S5280RWAITV – [NOTE: Italian site, but the specs should be understandable] ).
That's an Android "Jellybean" 4.1.2 smartphone for about the same price as the ZTE Open.
(Eagle-eyed readers will have noticed the lack of 4G and even 3G support, but these are useless outside of Italian urban areas. Out here in the Italian countryside, you're lucky to get even basic 2G signals unless you live right inside a town or village. On the other hand, home broadband with WiFi is easy to find. The next Android phone up was over the €100 spending limit.)
The Galaxy Star is very much a low-end Android smartphone, but despite its low specs, it still points and laughs at the ZTE Open, while kicking sand in its face. This despite being, by Android phone standards, a weedy little thing with pipe-cleaner arms, bottle-lensed glasses and an allergy to sports.
Relying on what are, fundamentally, just grids of website bookmarks for your apps is a bloody stupid idea, not just in the West, but *especially* in developing nations that this phone is apparently supposed to be aimed at. Many potential customers barely have *clean running water*, let alone access to the mobile internet infrastructure needed to use such a phone. How are they supposed to run those apps?
(Also, if the web is so full of open standards, why aren't there more apps for existing mobile platforms already? Last time I checked, even an iPhone 1 or that Samsung Galaxy Star could run such apps just as easily as this Firefox-based device. Yet nobody seems to be jumping onto that bandwagon with any alacrity.)
Re: @Steven Raith 2013-09-16 22:03 Can't Cook.. Stupid Cook..
Programmers typically write to an *API*, not to the bare metal of each machine. Switching to another architecture could easily be as simple as a recompile for most apps, while games and other hardware-pushing applications might need some additional tweaks to take account of differences in OpenGL features. The PowerPC >> Intel switch was pretty damned painless and was achieved in less than a year. For most developers, it really was as simple as selecting "Intel" from a drop-down "Build target" menu.
The iPhone 5s uses a very different ARM core to the 5c and earlier iPhones: as one of the developers they wheeled onto the stage said, it took a mere two hours to support the 64-bit ARM processor, and it's probably fair to assume it was mostly a recompile, with some minor tweaks to low-level code to help it take advantage of the wider registers and data bus.
Even a version of OS X with iOS app support isn't that difficult to imagine: iOS apps could simply open up into a 'Space' in landscape mode. I suspect it would also make sense for Apple to bump up their "TouchBook Air" displays to 'retina' resolutions too, as the 11" MacBook Air's display might prove problematic. iOS developers will simply have one more aspect ratio to contend with (16:10, as opposed to the iPad's native 4:3), but this is hardly a showstopper.
Re: Believe Apple? Erm no.
Okay, answer this:
If the NSA already have a backdoor into my phone, why the hell would they even *need* the fingerprint hash data? A fingerprint scanner is a means to an end. As far as the NSA are concerned, they already have the master keys to every US-made / designed phone, so they don't need *our* keys at all!
Either way, there's no reason for Apple to lie about that fingerprint scanner and how it works. The fingerprint hash is of no interest to the NSA: They're _already_ in. Their interest is primarily in your communications, not your biometrics. If they genuinely think you're a threat to national security, they'll send the boys round to get them, whether you want them to or not.
The NSA's activities are an entirely predictable symptom of declaring war on an *emotion*. "Terrorist"-type attacks in most countries tend to be carried out by citizens of said country – the Oklahoma City bombing, the 11-SEP-2001 attacks and the 2005 London bombings were all carried out by people already within the target nation's borders. Given this, it's hardly a big shock that the NSA (and their peers in other countries) were spying on their own citizens as part of their assigned duties.
Re: Refreshing truth.
Strange. I can drop any ePub or PDF file I want into iBooks. Anything that's not directly compatible, I can convert with Calibre. I can also drop many standard music file formats into iTunes – even MP3. It'll offer to convert some formats, while others can be converted by other (free) tools. Same goes for video, which I tend to stream off a QNAP NAS: AVI, MKV, FLV, MPEG-2, MP4 – you name it. (You are aware you can even get VLC for iOS, right?)
The *only* restriction Apple's iDevices have is that there is only the one App Store, and it's the one Apple created. Yes, it's curated, but so is every shop in the high street: nobody can walk into a John Lewis department store and demand that they sell their products without the permission of the Head Buyer. Curation is *normal*. It is not some form of control-freakery.
A "store" that lets anyone come in and set up their own stall is called a "bazaar". Perhaps you're unaware of this, but in the countries I've lived in, bazaars are surprisingly rare. It turns out most people like to know they can bring an item to a store and not find the item's seller has done a runner!
I'm not sure how a focus on good design and usability makes Jobs a "malign influence". Most successful CEOs tend to be abrasive and even a little OCD and Jobs was no exception. Neither, it seems, is Richard Stallman, so he certainly doesn't get to criticise Jobs on that front.
However, it is clear that Stallman is unaware that "freedom" is a two-way street:
Jony Ive and Apple have just as much right and *freedom* to follow their own design philosophy – which they've never made any attempt to hide: http://www.youtube.com/watch?v=VpZmIiIXuZ0 – as Stallman does. Apple aren't forcing you to buy their stuff. You have the freedom to choose one of the many competing products instead. Apple won't mind: they have a very specific target market and they're sticking with it.
Stallman, however, is a hypocrite: he seems hell-bent on *forcing* the entire planet to kowtow to his own, rather peculiar, views and philosophy. This is the exact *opposite* of freedom.
One possible use...
... is to lift entire prefabricated sections of a building to speed up construction.
The Skycrane helicopters don't have anything like the lifting capacity of this new airship, but they can still do this: http://upload.wikimedia.org/wikipedia/commons/f/fd/Sikorsky_Skycrane_carrying_house_bw.jpg – and that was a hell of a long time ago. Now, imagine what an airship capable of hoisting *250 metric tons* could do...
These airships could make a *huge* difference in construction alone. If you watch any recent "How we built [REALLY TALL BUILDING X]"-type documentary, you'll notice that, especially in tight urban areas (e.g. 'Ground Zero' in NYC, during the construction of the new WTC towers), an awful lot of effort is spent simply ensuring all the materials are brought in on time and in the right order. It's very much like a modern factory production line. If the site is in a tight urban location, you're talking about *hundreds* of trucks and other vehicles that need to be marshalled into and out of the site without causing massive disruption.
Now, imagine what you could do with some larger versions of this airship, rated at, say, 750 or so metric tons each: you could literally hoist entire *prefabricated floors* into place, including cladding and the concrete floor for the floor above. As most of the workers would be at ground level, rather than on some windswept steel skeleton hundreds of feet above the ground, they'd be much safer and could work more quickly without the need to keep unhitching and hitching their safety lines. You could build each floor at a more convenient site, along with the cladding and stairs, and keep only a handful of people on the roof of the previous floor to help guide each new floor into place and bolt it into position.
You could easily complete 5 or more floors in a single day, vastly speeding up the construction process. At present, the fastest average is typically just one floor per day. And that's usually just the steelwork and concrete slabs.
Even with the current model, you could still save a lot of time by lifting entire sections of pre-bolted steelwork into position, instead of just one or two beams at a time, as is currently the norm.
(The down-side of these beasts is that they're going to make the "Monster Moves" documentaries rather dull and predictable: just hoist whatever you're planning to move up into the air and fly it to its destination!)
I get the impression they've been testing it on their marketing department as they must have been seriously tired and emotional when they came up with that name: "Birra Spalmabile" = "Spreadable Beer" in Italian.
Re: Shark. Jumped.
I've been reading The Register since around 2000-ish. This is indeed "typical" today, but ten years ago, this site had a lot less blatant childish trolling and cheap link-bait. It used to have some bloody standards and even writers who could write without using childish insults.
Contrary to popular belief, some of us Apple customers are not IT-illiterate newbies. I've been programming computers since the early '80s. I can code in Z80 and 680x0, developed published games in the late '80s and early '90s (including graphics and animations), and know a number of other, higher-level, programming languages, including plenty I've mercifully managed to forget – like Forth and COBOL. I've built and maintained entire Windows-based networks, with dozens of PCs that I bought as components and assembled single-handed. I'm not some ignorant sandal-wearing cult follower. Steve Jobs may have had serious personality issues, but so did Spike Milligan and I don't think any less of his work either.
Tim Cook's stock options situation was known about at the time he took over from his predecessor, so this article is not even remotely "news". It's also perfectly normal in any other company, so why single out Apple? Do you think anyone in Google's top management tier is being paid any less? Do you think Apple is the only company that offers stock options to its company leaders?
"Fanboi", "cult of Apple", "iFans", etc. are just as childish and tiresome as the unfashionable "M$" and "Microsucks". None of these – not even "fandroid" – have any place in the bloody articles.
In the comments, fine, but not in the articles themselves. I don't want to read articles written by people who clearly belong in the YouTube comments section.
So, I'll ask again: is there a decent tech news site on the internet that is aimed at people with an IQ above that of a cabbage?
I've known this moment was coming for a some time now, but...
"Apple boss Tim Cook took a 99 per cent pay cut in 2012 - the year his firm's maps crapp confused iPhone fanbois and rival Android dominated the mobile market."
Seriously? How old is this writer? Six?
The Register used to have standards. Low standards, granted, but standards nevertheless. You used to be better than this, but the site has degenerated increasingly into tiresome link-bait trolling bullshit of no worth. My time is valuable to me and I really don't appreciate having it wasted.
Does anyone know of a decent technology news site that actually hires grown-ups who can write without insulting half their readership, instead of childish YouTube comments posters?
When a business workforce is spending more time fighting the systems than using them...
... it has a serious problem.
Infrastructure is supposed to be invisible. You should never even have to think about it until something goes wrong. If your IT systems are constantly falling over, your IT department is broken. If your users are constantly screwing things up through ignorance, you need to train them. Note that the latter is NOT an option: every successful business invests in its workforce to make them more productive. A user who knows exactly when to use Excel, when to use a database, and when to use a proper DTP application instead of trying to do every f*cking thing in MS Word, is a user who is helping your business run more smoothly.
Managers are supposed to handle the higher-level strategic and tactical aspects, not run around like ragged-arsed chickens constantly fighting fires. They're supposed to be making the business more productive – i.e. more efficient – by finding ways to improve systems and processes.
If the IT infrastructure is constantly getting in everyone else's face, saying "Can't do! Computer says 'No!'" and so on, it is BROKEN. No "ifs", no "buts". The purpose of an IT department is to support the business, not vice-versa. IT is but one of many components. It is the grease that keeps all that corporate machinery running. If there is friction, you're doing it wrong.
I've seen this from both sides – I've been an IT Admin and a manager. (For a while, I was doing both at the same time; it was a very small business.)
Yes, your colleagues (not "users") may be ignorant of how computers work. So what? I doubt many of you understand the finer points of logistics, or tax accountancy either. Everyone is ignorant of something. Your role is not to throw up obstacles and jeer at their ignorance, but to find out how you can HELP them. You can offer to train colleagues in the finer points of using a PC – I think of training and education as "preventative customer support"; it drastically reduces the number of support calls you get for basic issues and leaves you much more time to get on with other tasks.
If your managers see you only as a "cost centre", point out to them that learning to drive costs a shitload of money these days too, but few drivers then go on to whine about how it's made them less productive. Education and training are much cheaper than pissing away valuable time firefighting trivial problems that could have been avoided by eliminating ignorance.
And this training works both ways: IT staff cannot help the Accounts department effectively if none of the IT people have a clue what the accountants there actually do. This is basic Systems Analysis: you need to find out what your colleagues need – which, as others have pointed out, is not necessarily what they want – and find ways to make that happen.
THAT is your job. THAT is what IT administration and support staff are supposed to be doing.
All that said, there is a generational problem going on here too: IT is in a constant state of flux and transition and not everyone can cope with the frequent changes.
If, after all your attempts to train and educate a particularly IT-illiterate colleague, they continue to screw things up through incompetence and an inability (or lack of desire) to learn, then, and only then, do you get to tell HR about it and suggest said colleague is either let go, or moved somewhere where they can do less harm.
I say this because IT is infrastructure, like plumbing and electricity. If an employee is regularly buggering up the plumbing, or plunging an entire department into darkness, they'd be let go immediately. It's 2012, not 1982. There is no excuse for being so totally clueless with a basic tool of the trade. You wouldn't hire a carpenter who has no clue how to use a hammer, so there's no acceptable reason for an HR department to keep people on who have no idea how to use a computer in this day and age.
Both sides are right. The answer is, as is often the case, in the middle ground.
"You might have bought in to the idea that we are morally and ethically obligated to refresh our hardware and software every three years or you may believe that "new" is a reason to change what works."
Oddly enough, no, I haven't bought into that idea myself.
I was merely pointing out that corporations ARE "morally and ethically obligated" to refresh their products. They are legally obliged to provide the best value and returns for their investors and / or shareholders. They don't get to choose not to do so. This typically means giving potential customers a justification or excuse for buying new stuff, rather than sticking with the old stuff.
Given how many people really do seem to be distressingly prone to fads and fashions, though, I can't say I blame them. But no, I'm not big on consumerism myself*. I even stopped owning a TV way back in 1996, long before it became fashionable to do so.
Consider how often you've heard the phrase, "Now washes better than ever!" I've always wondered what the hell those companies were putting in their boxes of washing powder 30-40 years ago. Mud? Dried sewage? How much "better" can such a powder possibly be after so many years?
This endless exhortation to buy more stuff, newer stuff, shinier stuff, vaguely 'better' stuff, is not new. It's been going on for generations. It's not about to end just because some Mayans' laser printer ran out of paper.
* (With the ever-increasing horde of nephews and nieces to "voluntarily" buy presents for each year, it's not as if I have the option anyway.)
Re: How long
"[Apple] buys the name."
I suspect it'd be easier to just buy the company.
Re: @Trevor_Pott, Gil Grissum, Oh4FS, et al.
"It is still Microsoft shitcanning older format support to drive adoption of their newer stuff for no good reason whatsoever. Pay the tithe sir. Use our new interface sir. Buy our training for our new interface sir..."
Of course, nobody else does this at all. Ever. Only Microsoft. Not Apple. Or Samsung. Or HTC. Or Nokia. It's only Microsoft who ever tout new shiny to replace last year's shiny.
I can only assume from your rant that you have never used any Adobe software in anger. They update every bloody year and their upgrade prices are typically higher than the full price for Microsoft's complete Office Professional suite. They've been frequently accused of price gouging and monopolistic practices for some time now – particularly after they swallowed Macromedia whole with a nice Chianti.
I'm sorry, but when people start complaining that a business that makes a product is being so evil by touting new, shinier, versions of said product on a regular basis in order to drum up business, I have to wonder what the hell they teach kids these days. Do fashion houses not do precisely this every damned season? Do TV broadcasters not advertise new episodes of their hit shows on top of shows that are actually being broadcast immediately after every damned ad break, and even at each end of said ad breaks?
I'm no fan of Microsoft myself – I use a Mac – but I'm seriously bored of all these immature "Evil capitalist PIG!" screeds that somehow seem to ignore the fact that every goddamned corporation does exactly the same things.
If you don't like the rules of the game, change the game. But don't blame the players for playing by the rules. They can lobby for changes to said rules, but they don't actually get to make them. That's your job.
As for your "headache": may I suggest you advise upgrading to a more recent version of Microsoft Office? It can still read its old formats, while also supporting the new ones. That would make the transition easier.
Once your clients have been weaned off those old proprietary formats and have archived their old documents properly, you can then start to move them towards the likes of Libre/OpenOffice, but only if your clients don't rely on MS Office's extensibility. (VBA is popular, but Office also has a very powerful API. Specialist software like SDL Trados – the translation world's industry standard – relies heavily on MS Office's components to ingest and export documents, as well as to display previews for MS Word-based projects.)
@AC 20-DEC-2012 19:50 GMT
"I am a professional writer. I have several novels, articles and short story collections in DOC format on my PC. I also have at least five hundred DOC / XLS files in my email concerning contracts, corporate papers, royalty payments, etc."
And you've been completely unaware of Microsoft's move away from their old document formats because...?
I'm a professional technical author and translator myself. I use Scrivener for writing, not Microsoft Word. I use MS Word for translations, and even then, only because that's the format most of my work arrives in. (I actually use SDL Trados, but that relies on MS Office components for some of its functionality.)
This is the IT industry. Proprietary file formats can, and do, become obsolete and unsupported over time. I've lost count of the many TLAs that have gone to that great Winchester drive in the sky.
The Microsoft DOC and XLS file formats are not open standards and therefore cannot be relied upon to remain supported in perpetuity. Hell, there are even differences in how well they're supported in Microsoft's own software; the formats have never been particularly well documented. (If they had been, Libre/OpenOffice might do a better than half-arsed job of working with them.)
There are archival-quality open ISO Standard formats available (e.g. PDF/A) that you should have been migrating to years ago. You could have started the process five years ago, converting a few files a month, and been done with it all ages ago. The only person to blame for leaving all your washing up in the sink for so long is yourself.
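A migration like that starts with knowing the size of the job. Here's a minimal sketch – just Python's standard library, with an illustrative (not exhaustive) list of legacy extensions – that walks a folder tree and counts the old-format files, so you can work out how many "a few a month" actually means:

```python
from pathlib import Path
from collections import Counter

# Legacy, closed formats worth migrating to an archival standard
# such as PDF/A. This extension list is illustrative, not exhaustive.
LEGACY = {".doc", ".xls", ".ppt"}

def inventory(root):
    """Count legacy Office files under `root`, grouped by extension."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in LEGACY:
            counts[path.suffix.lower()] += 1
    return counts

if __name__ == "__main__":
    for ext, n in sorted(inventory(".").items()):
        print(f"{ext}: {n} file(s) to convert")
```

Divide the total by your conversion rate and you have a finishing date; at a few files a month, even a backlog of several hundred documents clears in a handful of years.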