790 posts • joined Friday 8th May 2009 16:41 GMT
Re: Shark. Jumped.
I've been reading The Register since around 2000-ish. This is indeed "typical" of the site today, but ten years ago it had a lot less blatant childish trolling and cheap link-bait. It used to have some bloody standards and even writers who could write without resorting to childish insults.
Contrary to popular belief, some of us Apple customers are not IT-illiterate newbies. I've been programming computers since the early '80s. I can code in Z80 and 680x0, developed published games in the late '80s and early '90s (including graphics and animations), and know a number of other, higher-level, programming languages, including plenty I've mercifully managed to forget – like Forth and COBOL. I've built and maintained entire Windows-based networks, with dozens of PCs that I bought as components and assembled single-handed. I'm not some ignorant sandal-wearing cult follower. Steve Jobs may have had serious personality issues, but so did Spike Milligan and I don't think any less of his work either.
Tim Cook's stock options situation was known about at the time he took over from his predecessor, so this article is not even remotely "news". It's also perfectly normal in any other company, so why single out Apple? Do you think anyone in Google's top management tier is being paid any less? Do you think Apple is the only company that offers stock options to its company leaders?
"Fanboi", "cult of Apple", "iFans", etc. are just as childish and tiresome as the unfashionable "M$" and "Microsucks". None of these – not even "fandroid" – have any place in the bloody articles.
In the comments, fine, but not in the articles themselves. I don't want to read articles written by people who clearly belong in the YouTube comments.
So, I'll ask again: is there a decent tech news site on the internet that is aimed at people with an IQ above that of a cabbage?
I've known this moment was coming for some time now, but...
"Apple boss Tim Cook took a 99 per cent pay cut in 2012 - the year his firm's maps app confused iPhone fanbois and rival Android dominated the mobile market."
Seriously? How old is this writer? Six?
The Register used to have standards. Low standards, granted, but standards nevertheless. You used to be better than this, but the site has increasingly degenerated into tiresome link-bait trolling bullshit of no worth. My time is valuable to me and I really don't appreciate having it wasted.
Does anyone know of a decent technology news site that actually hires grown-ups who can write without insulting half their readership, instead of childish YouTube comment posters?
When a business workforce is spending more time fighting the systems than using them...
... it has a serious problem.
Infrastructure is supposed to be invisible. You should never even have to think about it until something goes wrong. If your IT systems are constantly falling over, your IT department is broken. If your users are constantly screwing things up through ignorance, you need to train them. Note that the latter is NOT an option: every successful business invests in its workforce to make them more productive. A user who knows exactly when to use Excel, when to use a database, and when to use a proper DTP application instead of trying to do every f*cking thing in MS Word, is a user who is helping your business run more smoothly.
Managers are supposed to handle the higher-level strategic or tactical aspects, not run around like a ragged-arsed chicken constantly fighting fires. They're supposed to be making the business more productive – i.e. more efficient – by finding ways to improve systems and processes.
If the IT infrastructure is constantly getting in everyone else's face, saying "Can't do! Computer says 'No!'" and so on, it is BROKEN. No "ifs", no "buts". The purpose of an IT department is to support the business, not vice-versa. IT is but one of many components. It is the grease that keeps all that corporate machinery running. If there is friction, you're doing it wrong.
I've seen this from both sides – I've been an IT Admin and a manager. (For a while, I was doing both at the same time; it was a very small business.)
Yes, your colleagues (not "users") may be ignorant of how computers work. So what? I doubt many of you understand the finer points of logistics, or tax accountancy either. Everyone is ignorant of something. Your role is not to throw up obstacles and jeer at their ignorance, but to find out how you can HELP them. You can offer to train colleagues in the finer points of using a PC – I think of training and education as "preventative customer support"; it drastically reduces the number of support calls you get for basic issues and leaves you much more time to get on with other tasks.
If your managers see you only as a "cost centre", point out to them that learning to drive costs a shitload of money these days too, but few drivers then go on to whine about how it's made them less productive. Education and training is much cheaper than pissing away valuable time firefighting trivial problems that could have been avoided by eliminating ignorance.
And this training works both ways: IT staff cannot help the Accounts department effectively if none of the IT people have a clue what the accountants there actually do. This is basic Systems Analysis: you need to find out what your colleagues need – which, as others have pointed out, is not necessarily what they want – and find ways to make that happen.
THAT is your job. THAT is what IT administration and support staff are supposed to be doing.
All that said, there is a generational problem going on here too: IT is in a constant state of flux and transition and not everyone can cope with the frequent changes.
If, after all your attempts to train and educate a particularly IT-illiterate colleague, they continue to screw things up due to their incompetence and inability (or lack of desire) to learn, then, and only then, do you get to tell HR about it and suggest said colleague is either let go, or moved somewhere where they can do less harm.
I say this because IT is infrastructure, like plumbing and electricity. If an employee is regularly buggering up the plumbing, or plunging an entire department into darkness, they'd be let go immediately. It's 2012, not 1982. There is no excuse for being so totally clueless with a basic tool of the trade. You wouldn't hire a carpenter who has no clue how to use a hammer, so there's no acceptable reason for an HR department to keep people on who have no idea how to use a computer in this day and age.
Both sides are right. The answer is, as is often the case, in the middle ground.
"You might have bought in to the idea that we are morally and ethically obligated to refresh our hardware and software every three years or you may believe that "new" is a reason to change what works."
Oddly enough, no, I haven't bought into that idea myself.
I was merely pointing out that corporations ARE "morally and ethically obligated" to refresh their products. They are legally obliged to provide the best value and returns for their investors and / or shareholders. They don't get to choose not to do so. This typically means giving potential customers a justification or excuse for buying new stuff, rather than sticking with the old stuff.
Given how many people really do seem to be distressingly prone to fads and fashions, I can't say I blame them. But no, I'm not big on consumerism myself*. I even stopped owning a TV way back in 1996, long before it became fashionable to do so.
Consider how often you've heard the phrase, "Now washes better than ever!" I've always wondered what the hell those companies were putting in their boxes of washing powder 30-40 years ago. Mud? Dried sewage? How much "better" can such a powder possibly be after so many years?
This endless exhortation to buy more stuff, newer stuff, shinier stuff, vaguely 'better' stuff, is not new. It's been going on for generations. It's not about to end just because some Mayans' laser printer ran out of paper.
* (With the ever-increasing horde of nephews and nieces to "voluntarily" buy presents for each year, it's not as if I have the option anyway.)
Re: How long
"[Apple] buys the name."
I suspect it'd be easier to just buy the company.
Re: @Trevor_Pott, Gil Grissum, Oh4FS, et al.
"It is still Microsoft shitcanning older format support to drive adoption of their newer stuff for no good reason whatsoever. Pay the tithe sir. Use our new interface sir. Buy our training for our new interface sir..."
Of course, nobody else does this at all. Ever. Only Microsoft. Not Apple. Or Samsung. Or HTC. Or Nokia. It's only Microsoft who ever tout new shiny to replace last year's shiny.
I can only assume from your rant that you have never used any Adobe software in anger. They update every bloody year and their upgrade prices are typically higher than the full price for Microsoft's complete Office Professional suite. They've been frequently accused of price gouging and monopolistic practices for some time now – particularly after they swallowed Macromedia whole with a nice Chianti.
I'm sorry, but when people start complaining that a business that makes a product is being so evil by touting new, shinier, versions of said product on a regular basis in order to drum up business, I have to wonder what the hell they teach kids these days. Do fashion houses not do precisely this every damned season? Do TV broadcasters not advertise new episodes of their hit shows on top of shows that are actually being broadcast immediately after every damned ad break, and even at each end of said ad breaks?
I'm no fan of Microsoft myself – I use a Mac – but I'm seriously bored of all these immature "Evil capitalist PIG!" screeds that somehow seem to ignore the fact that every goddamned corporation does exactly the same things.
If you don't like the rules of the game, change the game. But don't blame the players for playing by the rules. They can lobby for changes to said rules, but they don't actually get to make them. That's your job.
As for your "headache": may I suggest you advise upgrading to a more recent version of Microsoft Office? It can still read its old formats, while also supporting the new ones. That would make the transition easier.
Once your clients have been weaned off those old proprietary formats and have archived their old documents properly, you can then start to move them towards the likes of Libre/OpenOffice, but only if your clients don't rely on MS Office's extensibility. (VBA is popular, but Office also has a very powerful API. Specialist software like SDL Trados – the translation world's industry standard – relies heavily on MS Office's components to ingest and export, as well as to display, document previews for MS Word-based projects.)
@AC 20-DEC-2012 19:50 GMT
"I am a professional writer. I have several novels, articles and short story collections in DOC format on my PC. I also have at least five hundred DOC / XLS files in my email concerning contracts, corporate papers, royalty payments, etc."
And you've been completely unaware of Microsoft's move away from their old document formats because...?
I'm a professional technical author and translator myself. I use Scrivener for writing, not Microsoft Word. I use MS Word for translations, and even then, only because that's the format most of my work arrives in. (I actually use SDL Trados, but that relies on MS Office components for some of its functionality.)
This is the IT industry. Proprietary file formats can, and do, become obsolete and unsupported over time. I've lost count of the many TLAs that have gone to that great Winchester drive in the sky.
The Microsoft DOC and XLS file formats are not open standards and therefore cannot be relied upon to remain supported in perpetuity. Hell, there are even differences in how well they're supported in Microsoft's own software; the formats have never been particularly well documented. (If they had been, Libre/OpenOffice might do better than a half-arsed job of working with them.)
There are archival-quality open ISO Standard formats available (e.g. PDF/A) that you should have been migrating to years ago. You could have started the process five years ago, converting a few files a month, and been done with it all ages ago. The only person to blame for leaving all your washing up in the sink for so long is yourself.
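As a sketch of how such a migration triage might start (the folder layout and function names here are my own invention, not anyone's product): the legacy binary DOC/XLS/PPT files are OLE2 "compound files" with a well-known eight-byte magic header, while the newer OOXML formats are plain ZIP containers, so you can spot the stragglers without even opening them in Office.

```python
from pathlib import Path

# Magic bytes: legacy DOC/XLS/PPT are OLE2 "compound files";
# DOCX/XLSX/PPTX are ordinary ZIP containers.
OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"
ZIP_MAGIC = b"PK\x03\x04"

def classify(path: Path) -> str:
    """Rough triage of an Office document by its file header."""
    header = path.read_bytes()[:8]
    if header.startswith(OLE2_MAGIC):
        return "legacy"   # candidate for conversion/archiving
    if header.startswith(ZIP_MAGIC):
        return "ooxml"    # already in the newer format
    return "unknown"

def find_legacy(root: Path) -> list[Path]:
    """Walk a folder and list files still in the old binary formats."""
    exts = {".doc", ".xls", ".ppt"}
    return [p for p in root.rglob("*")
            if p.suffix.lower() in exts and classify(p) == "legacy"]
```

Feed the resulting list, a few files a month, into whatever converter you fancy (LibreOffice has a headless batch-conversion mode, for instance) and the backlog would have been cleared years ago.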
"You do not BUY it, you pay for a license to USE IT."
Contrary to popular belief, the GPL offers exactly the same deal. The "L" stands for "License".
"It's not YOUR software, you can't reverse engineer it, decompile it.. blah, blah, blah etc..."
Like anyone who isn't a (bored) programmer gives a toss. If I wanted to write my own word processor, I'd have bought Visual Studio, not Microsoft Office. Open Source is an irrelevance; open standards are what matter. And if all your data is still in a proprietary file format after lo these many years, you only have yourself to blame. Seriously, stop showing off your ignorance.
This is the IT industry. File formats are expected to become obsolete – especially proprietary file formats – so not planning for that is just stupid. Microsoft have never made any secret of their desire to move away from their old (and rather poorly documented) formats. They've been pushing DOCX and its siblings for damned near an entire decade.
And I've no idea why you insist on whining about your dislike for Microsoft's GUIs. I happen to quite like Windows 8, but then, I use the keyboard shortcuts, so changes to the pretty pictures make no never-mind. The same shortcuts work in all the versions I've used.
Perhaps you're just doing it wrong?
Re: Dear Microsoft
"Office 2003 formats are standard."
No they're not. They are proprietary de-facto standards. There is a difference, and any computer "expert" worthy of that name would be fully aware of their fleeting lifespan.
I find PDFs work well for archiving purposes. PDF is an ISO standard now and unlikely to disappear any time soon. Macs can print to PDFs as a matter of routine; Windows users can get similar functionality from various PDF applications.
As for those claiming that they'd have to spend "weeks" doing this: what the fuck were you doing when you were working on your Business Continuity plans? Or did it never occur to you that an old, obsolete file format might no longer be supported in the future? Again: whither the plangent cries demanding support for WordStar, WordPerfect, or AmiPro files? Where's the whining about lack of support for VisiCalc?
It's IT, for crying out loud, obsolescence is guaranteed.
"They absolutely rely on being able to "preview" XLS spreadsheets in their outlook."
And they'll still be able to do so just as before. That has not changed. It's only importing and exporting of data from Outlook that has dropped support for DOC and XLS formats. This is no big deal at all.
@Trevor_Pott, Gil Grissum, Oh4FS, et al.
Did you all just stop reading after the first sentence of the article and decide to leap to a very wrong conclusion in a single bound?
I appreciate that Gavin's piece appears to be classic troll bait – and boy did it ever work; there are a lot of very stupid-looking posters in this thread – but even so, I had to check the URL to make sure I wasn't accidentally reading a Daily Mail letters page by mistake. Jesus, but the level of stupid here is astonishing.
The ONLY feature being "dropped" here is support for EXPORTING or IMPORTING certain parts of the Outlook database to these old legacy file formats.
That's it. That's all that's changed. Nothing else. DOC and XLS files will preview just fine.
Seriously, first RTFA, then form your opinion. This is basic "Reading 101" stuff.
Re: I have a different take...
"This is evidenced by the simple fact that they are paying at least 30% more for their computing devices than an equivalent from another manufacturer"
Please, do feel free to show me the 27" iMac equivalent from HP or Dell that isn't laughably crap. Or, indeed, the "equivalent" of the Retina MacBook Pro 15" model.
No? Thought not. Apple kit is expensive because Apple only targets the high-end markets. Because that's where the profits are to be made. The margins for the kind of low-end tat you find from Asus and Eurocom are so razor-thin, it's not worth Apple's time.
"A Windows user on the other hand is a little more technically competent,"
No. Just... no.
GNU / Linux distros, certainly. You need a certain kind of IT-obsessed masochist to get the most out of those.
But Windows users? "Technically competent"? I want some of whatever it is you're smoking please. I'll pay, because it's clearly top-class, hardcore hallucinogenic stuff and well worth the money.
Windows users are simply more interested in buying 'value' goods. They're the people who shop in Tescos instead of Waitrose. People who don't give a gnat's chuff about aftersales customer service until they actually need it.
Which doesn't mean you can't have a decent Windows (or GNU / Linux, or whatever) box that runs like a dream, but the notion that Windows users are inherently more "technically competent" is, like repeatedly ramming your finger up a dog's arse while driving up and down a multi-storey car park: seriously wrong on oh, so many levels.
Re: Mixed Feelings
Jesus, seriously? How old are you? Six?
"How are you supposed to get into the Control Panel"
WIN+R and type 'control', then press Enter.
WIN+R and type 'explorer', then press Enter. Or you can just enable the usual Desktop icons through the Personalize command (in the context menu).
Exactly the same as in every previous bloody version.
HANDY TIP FOR THE IGNORANT:
Do you still rely heavily on the mouse / trackpad for accessing everything? If so, I have some bad news for your "IT Credibility": By definition, you are not an advanced user! WIMP-based GUIs were designed for beginner and early intermediate users, who are supposed to learn the keyboard shortcuts. Keyboard shortcuts are the "advanced UI" for all GUIs based on the 1960s-era WIMP metaphor.
That's the whole POINT of WIMP-based GUIs. It's right there in "GUI Design 101" textbooks, so you really have no excuses for all this incessant whining and moaning about how MS have mucked about with the Start Menu yet again. (The same Start Menu that they've been mucking about with in almost EVERY Windows release since Windows 95.)
NONE of the keyboard shortcuts have changed. ALT+TAB still works as before. The Windows key will usually switch between the launcher and the last-used app. WIN+D will bring up the Desktop.
The Windows key on its own still brings up the Start Menu (or its tiled "ModernUI" incarnation on Windows 8). Again, this behaviour has not changed. Why on earth anyone who considers themselves any kind of expert or advanced user still drags their mouse pointer all the way down to the bottom-left corner in this day and age escapes me: that key's been on every PC keyboard for over 15 years now. After all these years, do you still need a coloured picture to aim at?
Even ALT+F4 will close apps and bring up the usual "shutdown / sleep" dialog in the same contexts as it did before. WIN+R will still pop up a dialog to type app names into. (Many advanced Mac users use Spotlight in a similar way. Others run a launcher called "Alfred", or use some other method.)
Christ, if you think Windows 8 is "hard", you clearly never tried to play around with Intuition on an old Commodore Amiga 1000. (And people said the PC's old CGA graphics palette was hard on the eyes.)
Pipe, slippers, rocking chair, pretending to be deaf, "kids these days", etc.
"Have been an Android user for 3 years, have yet to see Malware - it's really very simple."
Unfortunately, so are most smartphone users.
1) Define "dodgy app" and "dodgy websites". In terms an IT-illiterate will understand.
2) "a developer other than you would expect"? Seriously? How many users do you think remember the names of all these developers? Again: think IT-illiterate.
3) How the hell is an IT-illiterate going to know what is "too much"? The vast majority of IT device users haven't a clue about even basic privacy precautions: how else do you explain Facebook? Even so, these devices are supposed to be simple appliances, not virtual LEGO sets.
Judging by some of the posts here, most developers and IT gadget fans wouldn't know "logic" if it poked them very hard in the eye with a Steinway Grand. Which explains why hardly anyone will even stand by their code and warrant it as fit for purpose.
For the vast majority of people out there – myself included – iPhones and Macs are merely the least worst platform of choice, not the "best". The state of this industry is a sick, sick joke to anyone who doesn't work in it.
You FOSSers are the worst of the lot: A bunch of feckless, bickering schoolchildren, wanking forth over tiresome, irrelevant faux-philosophical issues of "freedom", while refusing to write anything of sufficient quality as to be worth guaranteeing that it'll do what it says on the damned tin.
Get over yourselves. This is by far the most unethical, hypocritical industry I've ever worked in. And I've worked in PR and marketing, as well as the games industry.
It's weird how Apple get all this flak...
... despite their datasets being provided for them by other companies.
But even weirder is how the press have raved about Nokia's maps, despite their "Here" service being utterly f*cking useless here in rural Italy.
Google's data still insists that the main road to Viterbo goes through three small medieval towns (including the one I happen to live in) and requires no fewer than five – count 'em – dangerous hairpin bends, not to mention some godawful junctions. I've seen HGVs trying to squeeze up through the town's very narrow streets (some barely wide enough for a small car) despite the existence of a 12-year-old bypass right there, and clearly signposted.
In fact, the only map service that actually gets my neck of the woods right is, er, Apple. They've even put the correct road labels on the bypass as well. Google's still doesn't.
I guess digital mapping of an entire planet's landmasses is too hard for any of these three to nail it 100%. Gosh! Who knew?
"The man was an unrepentant bigot."
So was my granddad.
Sir Patrick Moore was 89 years old. Like my granddad, he was born and raised at a time when wives really did stay at home. (They didn't have much choice: we still had outdoor toilets and mangles back then; many homes either still had gas lighting, or had to plug their very few appliances into electric light fittings as there was no standard for wall sockets at the time.)
It wasn't until the 1950s that society started to move towards greater equality in the workplace. Even today, there are still issues of pay equality.
Furthermore, Sir Patrick Moore served in the RAF during WW2. He lived through six years of relentless wartime anti-German propaganda, found the love of his life killed by enemy bombing raids, saw his brothers in arms killed in action and was also shot down himself. You do not get to demand he switches instantly from anti-German to pro-German overnight on VE day.
Individual mental injuries take an entire lifetime, and may never heal entirely. Socio-cultural traumas on that scale take generations to heal. (Why do you think insulting the French is such a British tradition?)
VE Day marked the end of the fighting, but not the damage it caused. Wars do not end when the bullets stop flying. For many, the nightmares continue for the rest of their lives. We have a convenient label for this now: PTSD. And every war we've been involved in since WW2 has produced its own PTSD-suffering walking wounded. Their damage is inside, invisible to the naked eye. But it is there nonetheless.
You'd think we'd have learned this as a society by now, but no. We still get ignorant posts like yours.
A great communicator.
He never claimed to be a professional astronomer, but he was exactly what every great teacher and mentor should be: infectiously enthusiastic, and a great communicator.
He gave us memories and learning. As with all the great teachers and communicators, the memories he gave us to keep are unforgettable. That's what I call immortality.
Re: Funny how almost every malware issue ...
Market share does not automatically equate to profits.
The most profitable mobile platform is iOS, and it has been for some time. It's been creaming off about 50% of the entire mobile platforms market's profits, despite its smaller market share. The reason? Apple are only interested in targeting the market of people with money to spend and who are willing to spend it.
Apple are more than happy to leave the rest of the market to others. Why waste resources competing for the custom of people who have little or no money and who are thus forced to place affordability over everything else, including usability?
"Open code can be checked by anyone. "
Indeed, but there's no guarantee that anyone will actually do so. Nor is there any guarantee that the person(s) checking the code are even remotely competent to do so, let alone whether they'll tell everyone about what they find, rather than just keeping their knowledge to themselves and using it to their own nefarious ends.
Any shyster with a copy of "Programming Bullshit for Shysters" can pretend to know what they're talking about. And any sociopath can decide to keep a juicy little bug to themselves.
Open source is no better or worse than closed source when it comes to security. It never was. (And it's certainly nowhere near as good as commercial closed-source development when it comes to design and innovation. When your only means of making money is by charging for support, good UI design suffers.)
Re: As American as...
Indeed, the apple isn't even native to the Americas: it was brought over by European colonists. There are documented recipes for apple pie in England dating back to around 1390, roughly a century before Europeans even reached the Americas. So neither the apple, nor the pie, is even remotely "American".
Ironically, the Chicago-style "deep dish pizza" is an American invention. You will not find such pizzas in Italy. It's the Chicken Tikka Masala of pizzas.
"As American as pizza pie" would therefore be better than the utterly bizarre "...apple pie" version, but I can't help thinking that "As American as corn flakes" is better. (The more generic "breakfast cereal" doesn't work, thanks to the existence of both Scotland and porridge.)
Dear Mrs. May
You ask: "The people who say they’re against this bill need to look victims of serious crime, terrorism and child sex offences in the eye and tell them why they’re not prepared to give the police the powers they need to protect the public."
Allow me to rephrase that in terms that might, possibly, help you understand why spending £1.8bn. on such a moronic abrogation of traditional freedoms hard-won in not one, but two major wars is a terrible idea:
"The people who say they are for this bill need to look the many thousands of victims of road traffic accidents in the eye and explain why their lives are worth so much less than those this bill claims to be protecting."
There are literally thousands of men, women and children being killed or maimed on the UK's roads every single year. Why are their lives not worth such an investment? Why are such crimes as running children over considered less 'important' than saving the far smaller number of lives affected by sex offenders, paedophiles and terrorists*? To paraphrase a WW2 veteran's comment at the time of the 7/7 bombings in London: "F*ck you! We've been bombed by professionals!" This is the same country that stood up to the Germans during the Blitz of WW2. We're better than this.
There were 1,901 deaths and 23,122 serious injuries during 2011 on the UK's roads. [Source.] Why are their deaths and injuries tolerated more easily? Are all lives not equally valuable? Why should the severe maiming of a child by a drunk driver matter less than the injury of the same child at the hands of a paedophile, despite the former being far more likely?
£1.8bn. would save an awful lot more lives if invested in improving pedestrian and road safety – perhaps by setting up a dedicated "Road Patrol" arm of the police force? – than any amount pissed up the wall on dubious ICT-related projects that no British government in living memory has ever managed to implement successfully, on time, or even on budget. Or, frankly, with any understanding of its ramifications.
Madam, you are not qualified to even begin to specify an ICT project of this magnitude as you clearly have no clue how computers and the Internet actually work. All you will achieve by ramming through this Bill is pushing paedophile networks onto VPNs, which are impossible for any ISP to track and trace in any way: all the data is encrypted, including the addresses of websites, the contents (and headers) of emails, etc.
A far better Bill would be one that improved the numbers of police on the ground, increased the numbers working in intelligence, and also helped train as many police as possible in advanced IT skills that go beyond merely understanding how to switch on a PC and use Microsoft Word to write their umpteen reports. (Oh yes: streamline the procedures too if you could. Paperwork really shouldn't be taking up 30% of the policeman's time; it's woefully inefficient.)
The many incompetent civil servants you are charged with managing have become a laughingstock with regard to IT security and privacy thanks to their singular inability to stop leaving laptops and important data lying around on trains and in other public areas due to forgetfulness. (Never mind that such data should NEVER have been downloaded to such devices in the first bloody place.)
So, no, we in the IT community wouldn't trust any current MP or Minister to successfully write a Bill like this. You're doing it wrong. Seriously. Stop. Please. And tell your peers and colleagues to please stop embarrassing themselves – and our nation – by vomiting up so many dumb Bills like this. It'd also help if you stopped listening to clearly biased "consultants" who have no interest in giving the taxpayer value for money, but every interest in giving themselves lucrative slices of any IT pies.
* (What the hell are "terrorists" even doing on this list? The UK has been fighting terrorists since the first member of the IRA chucked a bomb into a pub. What makes Al Qaeda so bloody special that, suddenly, nearly a century of experience is worthless and needs to be 'helped' by yet more pointless and dangerous intrusions into our privacy and freedoms? Your Bill will do absolutely nothing to improve matters.)
Driver support for Linux is still hit or miss. In order to make a particular distro work on a particular piece of hardware, you have to do some tuning. That doesn't necessarily involve writing actual driver code, but it is quite likely to involve messing about with config files and the like.
Also, where Dell are using some custom components, they do need to work on integrating drivers properly. And I wouldn't be surprised if Dell's people were finding bugs too.
The source code is open. ("Open Source" – the clue's in the name.) Anyone can play with it. Why would Dell's skunkworks team be an exception? If it were as easy as just installing an Ubuntu ISO onto a bare laptop, it wouldn't have taken this long, would it?
Re: NEST Thermostat webpage - OMG!
The Nest thermostats can also connect to climate control and air conditioning systems, not just the bog-standard gas boiler Brits are used to. If you look closely, you'll see the word "cooling" on some of the images, with a blue background.
It has nothing to do with insulation, which is pretty common in US construction these days as nobody likes pissing money away.
Re: Interesting tactic
Not quite. The mistake so many – including Dell – make is to assume that there is a single "Linux market", when there's nothing of the sort.
The collective noun for Linux fanatics is "argument".
Re: @AC 16:25GMT - Quit your whining
RTFA yourselves. You clearly missed the part where it points out that Dell spent some of their own money getting the drivers working properly for Ubuntu.
THAT is why there's a (rather small) premium. This means Dell had to do work themselves that they normally would not have to bother with. With Windows 7, the component manufacturers themselves do 99.9% of the heavy lifting when it comes to drivers. All Dell's monkeys have to do is bolt it all together properly, test it actually works, then shovel it all into a plastic case with a shit screen.
For the Ubuntu Edition of their laptop, Dell's own people have had to do some of that work, because there's nobody else to do it for them. They've had to muck about with source code and driver support in Ubuntu to make it talk civilly with their choice of components. Only when all the features listed on the side of the box are actually usable can they then shovel it all into that plastic case with the shit screen.
Last time I checked, programmers capable of working on drivers weren't cheap, regardless of the operating system. Neither are project managers familiar with such GNU / Linux projects. So Dell do have to pay for the skunkworks people's time and effort somehow.
Furthermore: providing customer support is a legal obligation in most territories. It's also expensive. Even more so when you consider that most call-centre operators have been dealing with Windows users rather more often than Ubuntu users. That means training will be needed to ensure suitable operators are available for these Ubuntu-burdened laptops.
What happens if a customer decides to upgrade Ubuntu "Anthropomorphic Axolotl" to its next version, which might even have a radically different GUI (again)? Dell need to consider that support aspect too.
Hence the extra $50.
Sounds pretty reasonable to me.
Re: There is indeed delusion at work here but it doesn't live in Apple stores.
Apple have consistently beaten their competitors in customer satisfaction ratings. This is incontrovertible fact. There are any number of independent sources for this. Feel free to check the internet if you don't believe me.
There was a time when Sony used to do great design too, but they seem to have thrown in the towel of late and have lost their self-confidence. (Sony even have their own branded stores. Apple were not the first.)
Sony used to be a luxury brand, but they've lost a lot of their cachet in recent years and seem to have suffered from a lot of poor management over that time.
Apple look good today only because their competitors have set the bar so low. Greatness is relative.
Gosh, that's a lot of cynicism you lot have.
Apple sell consumer electronics. Hardware. Computer-based appliances. That's all they do.
Everything else they do is marketing. It really is that simple, people. Those Apple Stores are literally just interactive advertisements you can walk right into. That they also have shop-like properties is a side-effect, not the primary intent. Their primary purpose is to make you go "Oooh! Shiny!" and step inside.
The store's employees don't pressure you into buying something: they're much more subtle than that. They'll show you what the magic boxes can do. This is a much more compelling argument for making a purchasing decision than any number of bullet-point lists on the side of a box, or printed on a label next to a demo laptop that hasn't even been plugged in. (Yes, PC World / Comet / etc., I'm looking at you.)
This isn't a 'religious cult', though some of the underlying psychology and cognitive science applications are related to those used (often unintentionally) by many major religions: People like to 'belong'; we're an inherently tribal species. That's why we have Linux fanatics, Android fanatics, Windows fanatics (yes, they do exist), Apple fans, and so on. It's also why there are people who will literally smash chairs over anyone who claims their favourite football team is, in any way, not up to scratch.
There are textbooks explaining all of this stuff. It's not art. It's science. Supermarkets have been applying psychology and cognitive science to their businesses for decades and it's just as subtle as Apple's approach.
But what truly amazes me is that so few people can see this. Apple haven't become such a massive success solely because of what they've done. Their success is also due, in large part, to what their competitors haven't done.
Re: I hear the collective groan of a million fanbois
Did you even read the article? The Nexus 4 lacks the circuitry to do 4G legally. This isn't a feature: it's a bug. It's a serious design cock-up and nothing to be proud of. The phone could be withdrawn from sale as a result of this.
Besides, it runs a version of Android. Having had to support relatives who've bought Android devices because they were too cheap to buy something better suited to their IT skill level, I am not a fan of it. It's Windows all over again. Christ, talk about history repeating itself.
People may be expert in some subjects, but they're guaranteed to be ignorant of others and do dumb things as a result of that. Any public-facing system that doesn't allow for the possibility that its users might, in fact, not have a clue what they're doing is a Bloody Stupid Johnson of a system that should be put out of everyone else's misery.
"liek the old saying, guns don't kill people, people kill people."
But guns do make it a bloody sight easier to kill in the heat of passion, and also make it less dangerous for the killer as they can shoot their victim from afar, instead of having to get up close and personal.
Guns make it much easier to kill. They reduce the obstacles that would otherwise get in the way of an angry scrote and his intended victim. Those obstacles get in your way, slowing you down and giving you more opportunities to reflect and withdraw. That's the problem. Gun control isn't about banning guns entirely (although there really is no excuse for carrying one on your person at all times. Seriously: you do not live in the Wild West any more.)
Technology – and guns are also technology – is all about making things easier to do. Guns are a labour-saving device, like a washing machine. To pretend that guns are utterly harmless and should be made freely available is the height of idiocy. (And, of course, it's unlikely the authors of the US Constitution's Second Amendment had modern weaponry in mind when drafting it. The word "arms" also covered lances, knives and pitchforks. The word used is "arms", not "firearms".)
A good revolver would set you back $17 – around $300 in today's money – which is outside the range of most of the population at the time, for whom $1-3 / day was considered a decent wage. A Winchester rifle was $40 – $700 in today's money; the equivalent of buying an iPad. Even ammunition cost 50 cents, which is a substantial chunk of a day's pay for the poorer workers.
It's clear that, in the context of the time when that Second Amendment was signed into law, firearms were unlikely to have been foremost in the legislators' minds. It was an age of cavalries, cannons and lancers. The hoi polloi were not expected to have guns in their handbags.
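The affordability point above is easy to sanity-check with some rough arithmetic. A minimal sketch, using only the wage and price figures quoted in the comment itself (the midpoint wage is my own assumption; this is illustrative, not rigorous economic history):

```python
# Rough affordability arithmetic for the figures quoted above.
# All inputs are the comment's own estimates, not sourced data.
daily_wage = 2.0    # assumed midpoint of the quoted "decent" $1-3/day wage
revolver = 17.0     # quoted price of a good revolver
winchester = 40.0   # quoted price of a Winchester rifle
cartridge = 0.50    # quoted price of ammunition

print(f"Revolver:   {revolver / daily_wage:.1f} days' wages")
print(f"Winchester: {winchester / daily_wage:.1f} days' wages")
print(f"Ammunition: {cartridge / daily_wage:.0%} of a day's pay")
```

On those assumptions a revolver costs over a week's wages and a rifle nearly a month's, which is the point being made: not handbag money.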
Re: Ummm, ..
No, you're not the only one who had to go back and re-read that line.
There are an awful lot of people out there with Van Gogh's ear for music.
Re: That alternative battery decision, just out of spite...
Right. Because Apple invented the USPTO and singlehandedly created the entire running joke that is US patent legislation.
Oh wait: no they didn't.
If you want to blame somebody, blame the people responsible for the USPTO, which is SUPPOSED to stop "obvious" patents being filed in the first bloody place. (Of course, that would require spending some money on hiring suitably trained and educated people to scrutinise the patent applications.)
Contrary to popular belief, lawyers are rarely trained in engineering and have no idea if the patent(s) they're defending actually make sense from an engineering standpoint. However, patent legislation usually requires that each patent holder actively defends their patent portfolio, regardless of how many valid patents that portfolio might actually contain.
Apple vs. Samsung is a symptom, not a cause.
The concept of legal entity status for a business is actually implicit in the terminology used: "corporation". The root word is (ultimately) from "corpus", the Latin word for "body".
The US even goes so far as to refer to "incorporated" businesses – "to form into one body".
All businesses are just collections of people operating as one unit.
It's a shame etymology is so rarely discussed, let alone taught, in our schools. It would certainly stop all these ignorant assertions that "businesses are not legal entities". They really are. And the reason for it is, in programming terminology, "code reuse": if you can define a business as a legal entity, all the laws that apply to actual people can be applied to businesses too, thus we don't need separate sets of laws to stop businesses going around murdering people with impunity.
Granted, that's not how it ended up: the UK has such a Byzantine mess of tax codes, it's clear that there are serious flaws in the nation's tax system. When a system is that complicated, with such a terrible UI, it's a sure sign that it's very, very broken and needs to be fixed.
Corporate Tax makes no sense in light of the above. Businesses never pay taxes, because they're just avatars operating in "business-space" and have no physical existence in our own universe. They are artificial constructs for accruing profits and making their owners money. Their owners pay the taxes. Said owners react to those taxes by reducing operational costs – e.g. lower wages; fewer stores; less investment in R&D, etc. – to make up for that overhead.
The LibDems used to tout an alternative that was effectively automated: "Everyone pays 10% on their income." Banks would handle it automatically: any income that came into your account would be taxed at 10%. That's it. No tax relief. No complicated bollocks. I think they suggested a minimum income level threshold, but that's just detail. VAT would probably continue too, but, aside from making a lot of accountants redundant, it was effectively a flat percentage system. Very easy to manage. Very easy to understand. The simplest UI possible.
Naturally, given how many of the current generation of politicians have an accountancy background, it was binned at the earliest opportunity.
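The appeal of that scheme really is its simplicity. A minimal sketch of the idea as described above – the 10% rate is from the proposal, but the threshold figure here is my own illustrative assumption, not an actual policy number:

```python
# Hypothetical sketch of the flat-tax idea described above.
# The 10% rate comes from the proposal; the threshold is an
# assumed tax-free allowance for illustration only.
FLAT_RATE = 0.10
THRESHOLD = 10_000  # assumed minimum income level, in pounds per year

def flat_tax(income: float) -> float:
    """Tax due on a year's income: 10% of everything above the threshold."""
    return max(0.0, income - THRESHOLD) * FLAT_RATE

for income in (8_000, 25_000, 100_000):
    print(f"£{income:,} -> £{flat_tax(income):,.0f} tax")
```

That one function is the entire tax code. Compare that with the shelf-feet of HMRC guidance the current system requires.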
IP creators need to extricate themselves from agencies that often fleece them.
As opposed to the freeloaders who don't pay them a penny? At least record labels actually pay something. That's a lot better than sod all, which appears to be the only alternative proffered by many.
It's also depressing how often people conflate the implementation of a concept with the concept itself.
Yes, the media conglomerates can be very heavy-handed: they're old, crusty, conservative and uncomfortable with 'change'. This is normal. It's not nice, granted, but it's perfectly predictable behaviour. Most CEOs tend to be on the old side and, therefore, less interested in disrupting their nice, clean, venerable – and increasingly outdated – business models. These are people who still call in their secretaries to take dictation rather than messing about with those newfangled 'computer' things.
What we're seeing is poor implementation of IP protection. Nobody is seeing this from an engineering or IP creator's perspective: the legislators are seeing it only through the lenses of those who have built empires leveraging IP to make money off it – the middle-men. I have nothing against middle-men per se, but they do need to offer genuine value-add services to justify themselves. Many are starting to come round to this, but a lot more of the old guard are still trying to milk the old business model for all it's worth.
Legislators tend to favour the wealthy with laws that help increase that wealth: they know what side their bread is buttered. Again: this is perfectly normal and utterly predictable. The original article barely counts as "news".
If you don't like this, kick the buggers out. That's what elections are for.
And don't whine that "they're all the same" either. You have just as much right to stand for election as any of them. That "somebody" you insist should "do something" to protect your high-minded principles? Look in a mirror: that somebody is YOU. It's your fight. Nut up and do it yourself. IT-literate politicians and lawmakers aren't going to magically appear out of thin air.
But remember this: if the concept of Intellectual Property is such a bad idea, why do people keep gibbering endlessly on about FOSS on these forums? The various GPLs would be utterly worthless without IP legislation to back them up. And without those licenses, the entire "Software Libre" movement would collapse into irrelevance overnight.
Re: Missing "This post sponsored by Nokia" somewhere?
Same here in Italy (where even Google's mapping data is woeful). The main road here is mislabelled in Nokia's maps data, and the photographic imagery is so poor I can't even see my town, let alone my street!
Say what you like about Apple's mapping data, but it's a hell of a lot better than Nokia's. If Apple's mapping data is "disastrous", what the hell does that make Nokia's?
Re: A modest proposal to solve the increased levels of greenhouse gasses
Then kill everyone else. No witnesses!
Re: That thing looks very thin
"I favor function over form"
Why should we have to choose between the two? I rather like having both at the same time. Apple are pretty good at providing products that meet both criteria.
Function over form is how we got those nasty 1960s-1970s Brutalist buildings. Thanks, but no thanks. If I'm going to be staring at a machine for eight hours a day, I'd like it to be easy on the eye as well.
We've been trying to build "portable" applications for decades, but it never truly works. UNIX, MSX, Java, .NET/Mono – you name it. We've tried it.
Christ, we've had that portable assembly language called "C" since the 1970s. We've invented a veritable Tower of Babel of "portable" programming languages, tools and technologies, but… nada.
If anything, it's only made things worse: the world is now dominated by the almost 40-year-old UNIX and its myriad clones, while the only remaining non-UNIX-derived OS kernel comes from Microsoft. If that's progress, I'm a banana. This is ironic when you consider all the FOSS community's talk of "choice" and "freedom". I had more choice and freedom in operating systems in the 1980s and 1990s.
For applications to be 100% portable over multiple devices, from 27" iMacs and Windows boxes with their (primarily WIMP-derived) mouse-centred GUIs, all the way down to 3.5" smartphones with their multi-touch screens, we need to invent some form of AI-driven GUI rendering technology that can take advantage of platform-specific features automatically. The alternative is the lowest common denominator design approach, and, frankly, that sucks from a consumer perspective.
Standardised platforms are undesirable: it makes it next to impossible to make your own products stand out. What you end up with is a single, monolithic, point of failure. If a flaw is found in the underlying platform, everything that runs on it is vulnerable. Furthermore, good UI design is done from the platform up, not from the top down. iOS apps feel different to their Android counterparts because Android's gestural 'language' has diverged from iOS'. Similarly, Windows (Phone) 8 applications also need a very different look and feel.
What you are advocating, in fact, is the very antithesis of "choice" and "freedom". I can already choose between multiple ecosystems. The ecosystem is the "platform" of today, not the hardware that merely provides a component of it. I can choose Android and Google and GNU / Linux. I can choose Windows 8 and Windows Phone 8. I can choose iOS and OS X. Or I can choose any mixture of the above.
What you propose would be no choice at all.
GUIs are supposed to be for newbies.
The mouse pointer and icons were not – and never have been – intended as the primary interface for all users at every level of proficiency: the intention was that the GUI provided the "first level" interface and was designed for new users:
Beginner users would use the GUI to explore the application, find out what it does, how it works, and so on. The GUI elements are part of the learning / training process.
Once users start getting the hang of the application, they're supposed to start learning the keyboard shortcuts. Windows offers both CUA (ALT+underlined letters) and direct shortcuts (CTRL+letter, shown next to the relevant command in the menus). Apple's OS X only offers the latter style. This is the intermediate level of user.
Advanced users are those who know every keyboard shortcut like the back of their hand. Unless the application is for graphic design or similar, the mouse should be used rarely – if at all – by this point.
This is very basic GUI design stuff. There are textbooks on it and everything, so you don't get to complain about GUIs being designed for beginners rather than favouring advanced users. The latter are NOT the target for GUI design. That's why all the stuff for new users is switched on by default: advanced users should already know how to switch off the bits they don't want.
You can customise damned near everything in Windows, including Windows 8. Apple's OS X is a lot less malleable out of the box. You can minimise the Ribbon, change the "Quick Access" toolbar, move it around, modify the Folder and View Options, and so on. It's all still there. SHIFT+CTRL+N still creates a new folder. F2 still Renames a file. Those shortcuts have been there since at least Windows 95, so there really is no excuse for not knowing them.
If you're still faffing about with a mouse in a traditional WIMP GUI, you're still at the "Beginner" level and don't get to call yourself a pro or advanced user, no matter what you may believe. You sure as hell don't get to vote on how such a GUI is designed.
In the emails, Sinofsky comes across as positively diplomatic. "D" came across as a condescending, arrogant, pig-ignorant twit. As even Sinofsky himself pointed out: if "D" needs to do a lot of file handling, there are industrial-strength file management tools available. It is not Windows' job to provide for every possible use case, it is only required to handle the most common use cases.
You've never actually developed for an iPad then, have you?
Apple actively encourages iPad-specific GUIs for that platform. You can build a single app that will use one GUI for iPhones / iPod Touch, and another for the larger iPad displays. They'll often look very different.
I've used both platforms and very few Android applications actually bother to do anything similar: most literally just show a cheesy stretched-out phone UI with terrible use of space and potential. Only a few of the big name developers are actually putting any effort into this aspect, but the ratio of stretched vs. native tablet apps is still terrible when compared to the Apple ecosystem.
There's really no excuse for this. It's just laziness on the part of developers. If they can't be arsed to put a bit of effort in the user interface design, why bother writing the thing at all? You can add all the functionality you like, but if it's a bastard to use, nobody's going to buy your mousetrap.
Also, while Android has less curation of its app stores (unless you buy an Amazon device, in which case, it's a hell of a lot more 'closed' than Apple's), that just means any old newbie with a compiler can spam it with ill-designed shite and malware. Any good developer must fight through all that noise just to get noticed.
Apple's store curators may occasionally let the odd bit of malware slip through, but it's still rare. Furthermore, it's nothing like on the same scale as Android. And, from a support perspective – i.e. mine – that's a massive advantage: I have enough crap to deal with when relatives ask me to "look at" their PCs, without having to do the same thing for their phones as well.
You can keep your tat bazaar; I'll stick with the department store. The former may offer a greater choice, but the latter offers better quality. Choice for its own sake is pointless.
Yes, we get it, ZFS does some neat stuff. Guess what? Most people (myself included) find it easier to just run a regular (in my case, weekly) backup of important data to an external drive connected to the NAS box via USB. (I also make a weekly clone of my computer's drive on the same day.) Job done.
As for why I bought a ready-built NAS appliance: I did so for the same reason I prefer to live in ready-built homes. My time is worth money. I'm worth £300 / day as a technical author. (And that's cheap. Some charge as much as £700 / day.) I'm not a fan of UNIX in any of its flavours, so setting up even a FreeNAS box isn't something I enjoy. I'd spend hours perusing the Web to find out the best practices, the arcane spells that need to be typed into the shell, and so on. On top of which, I'd also have to order all the parts and wait for them to be delivered.
Why the hell would I waste £600 or more of my time (and days of my life) working on a device I can just buy off the shelf for less than half that, and which would be up and running within minutes of my taking it out of the packaging?
Just because YOU enjoy a bit of DIY in your preferred field of expertise, it does not follow that everyone else does too. My background is in software, not hardware. I know how the latter works, and I've built dozens of PCs over the years – mostly for relatives and friends – but it is not something I find particularly rewarding.
I have no more interest in building my own NAS boxes and laptops than I do in building my own home or car. The time required for the DIY approach is not 'free' unless you actually enjoy doing that sort of thing as a hobby. I don't, so, as far as I'm concerned, it's time wasted on doing something boring and irritating instead of time I could be earning doing something fun and rewarding.
Re: @Mark C Casey - Upgrade? I don't need no stinkin upgrade!
LibreOffice? Bollocks to that. Give me Ulysses or Scrivener any day of the week. I write for a living, not as a hobby. If you want me to use your writing tool, it had better be much better than what I'm already using. A mediocre MS Word clone isn't going to interest me; I already have MS Word, and it's demonstrably the better tool.
That is how you disrupt markets: by producing something so much better than the competition – rewriting the rulebook if necessary – that choosing your product becomes a no-brainer. This is how Apple went from near-bankrupt basket-case in the late '90s, to one of the most successful businesses on the planet, in little over a decade.
Nor is Office going to be killed by some amateurish knock-off that barely beats the late, unlamented Microsoft Works for usability and misses the key selling point of Microsoft's Office suite: easy extensibility. You can bend Office to almost any corporate workflow's whims, automating many processes along the way. Neither of the bickering children that purport to be its "rival" come anywhere near close to offering such features.
Sure, OO and LO are "open source", but you'd have to provide your own support – or buy it in from a third party – to take advantage of that and have programmers messing about with the engine itself, rather than sticking bits onto the car. MS Office already has a massive ecosystem of VBA experts ready and willing to do the customising you need for a fraction of the price of changing the tools themselves.
Re: Apple makes even Microsoft look good.
Er, no. Sorry.
Flash has always been a massive resource hog – especially on Apple's computers. That latter point is the main reason for Jobs' rant: Adobe had years to sort out the major problems with their applications, but they never did. Adobe took years just to get their Creative Suite applications ported to the Intel version of OS X – years after every other f*cking developer, large and small, including Microsoft, managed it!
Why the hell would Jobs trust them to organise this particular piss-up in a brewery when they've never shown any ability to pull it off? Adobe have become lazy, coasting on the fat of their ever more bloated (and staggeringly overpriced) applications and resting on their laurels. Ever since they ate Macromedia, they've had so little competition to speak of that they're not even trying any more. They've lost their hunger and that's not going to end well for them.
Flash has only ever been viable on full-fat desktops. It requires so many resources to run properly, even Adobe themselves have finally admitted defeat and thrown in the towel. There will be no mobile Flash releases for Android either. Flash is dead. It had a good innings, but it is an ex-platform and has no place in today's web developer's toolbox.
Any website that still requires Flash is doing it wrong.
As for Mr. Allaire's argument that HTML5 is the future: Unfortunately, as a certain Dutch brewer's marketing department might say, "Schtop! HTML5 isn't ready yet!"
In the meantime, the underlying medium itself, the fabric we call the Internet, is trundling along just fine and works perfectly well as a means of getting data from a server to a client. So that's what developers are doing: using what works, applying tried, tested techniques, and getting the bloody job done.
And client applications should be tailored for the platforms and devices they run on! That's just basic GUI design. They must also be responsive and that's something HTML5 still isn't ready to deliver, due to high latencies and inconsistent availability of suitable bandwidth around the world. Not everyone on the planet has ready access to broadband – be it fixed-line or even GSM.
This is a transitional stage. The IT industry is full of those – I'd argue it's now a never-ending series of them – so either deal with it, or take up forest husbandry.
Re: "with a touchscreen laptop"
Over three quarters of ALL computers sold over the last five years or so are laptops. With trackpads. They are only occasionally supplied with that god-awful RSI-inducing 1960s-era design that is the plastic rat.
The desktop PC so beloved of the more vocal elements of El Reg's commentariat has been dying since before the release of Windows Vista, for f*ck's sake. Stop banging on about it as if it's some sacred beast: it's dead. It has ceased to be. It is an ex-form-factor. Certainly nobody's selling them in any great numbers today.
Note that OS X's last two incarnations have also favoured the (multi-)touch trackpad, to the extent that Apple are even offering a separate multi-touch trackpad for their iMac, Mac mini and Mac Pro users.
And, yes, there are plenty of whiners from amateurs who only think they're good with computers, but who really aren't anywhere near as expert as they think they are.
Incidentally, professionals who are actually worthy of that name learn the bloody keyboard shortcuts, which – surprise, surprise – tend to stay much the same even across major releases. (Graphic artists are one of the few exceptions.) That mouse pointer, the icons, and so on are only there for the newbies. This is very basic GUI design stuff: You're only supposed use the mouse, pull-down menus, and icons to explore and learn the system. All those keyboard shortcuts in the menus and tool-tips? They're not there for decoration.
Windows 8 is a transitional product. It is most emphatically NOT change for its own sake: Microsoft can't just ditch the old WIMP GUI overnight; they need to allow developers and enterprises time to adapt, and we're likely to see that old desktop stick around for a few release cycles yet. But it will be killed off eventually. Any applications that rely on that environment are on notice.
Re: Easy to earn a lot if you dont pay tax
"After all what do we call A person who habitually relies on or exploits others and gives nothing in return?"
Re: Help my house could be on fire...............maybe?
"SCIENCE needs to step up to the plate..."
"SCIENCE" is not a corporation. It is not an individual. It has no id, no self, no ego. It is a process, not an entity.
Asking for "SCIENCE" to make a stand is like demanding the sky takes you out for lunch.
Cock-up, not conspiracy.
Somebody failed to do their Due Diligence properly, and it wasn't the CEO of Fairstar.
Somebody's going to get either a severe bollocking, or, (depending on their contract and pay grade), the sack over this. That 'somebody' will be a Dockwise employee somewhere in one of the management tiers. And they'll deserve it too. Checking that Fairstar's accounts were solid before buying them up is precisely what Due Diligence is for.
Baby / Bathwater Ejection Situation
"The erosion of right of first sale, the mentality of "it's not yours even though you paid for it", the avaricious principles of pay-per-view and pay-per-listen, the inherent idiocy of DRM, the destruction of the public domain"
There's a big difference between the concept of Intellectual Property, and the implementation of Intellectual Property legislation. Conflating the two is part of the problem, not the solution.
The problem is not that IP exists at all, but that the existing legal frameworks for IP are obsolete and urgently require a serious refactoring and UI clean-up.