Re: Parkinson et al are right
* (Technically a partial re-opening. The original route was mostly rebuilt as part of this project, burying more of the track underground and removing an unpopular railway bridge that blocked a view of St. Paul's.)
952 posts • joined 8 May 2009
As you appear to be taking requests, I'd like you to explain why it costs so bloody much to build anything in the UK.
I ask because the Swiss have built the Gotthard Base Tunnel for about £6bn (including the fiddly bits at each end to connect it to the existing networks), while HS2's first phase (to Birmingham) – roughly 100 miles through relatively easy terrain – is priced at almost *three times* that amount, despite not having a major mountain range to contend with.
I've noticed that road-building is also very poor value for money. The recently opened Hindhead Tunnel (on the A3) cost £371 million to build and was extensively covered in the national news, yet the Swiss and Italians point and laugh at such a piffling little project. Both countries build such tunnels as a matter of routine, even on relatively minor roads, and they cost a bloody sight less than £155K per metre!
One small problem with your argument is that it doesn't fit the available evidence. See those 125 mph diesel trains roaring out of Paddington and St. Pancras stations? They're roughly 40 years old and were British Rail's "Plan B" project, in case "Plan A" (the infamous tilting 'Advanced Passenger Train') didn't pan out. The reasons why Plan A failed could fill, and have filled, a book, so I won't repeat them here.
Like the BBC – another 'nationalised' industry – British Rail handled a lot of R&D in-house back then, and made quite a bit of money from patents and other revenue streams. There were some damned good managers back then. They *had* to be good, or BR would have collapsed into a basket-case and ground to an embarrassing halt long before privatisation finally came. They were operating on a minuscule budget, a fraction of that granted to their continental peers. And yet, this was the same British Rail that opened* the cheap and cheerful Thameslink route, modifying it to push the tracks underground and sell the land occupied by their problematic Ludgate Hill and Holborn Viaduct stations. And for afters, they then started work on Crossrail! This was all long before the Major administration came to power.
Nationalised industries have no monopoly on bad managers, as Nokia and Sinclair Research will attest.
The trick is to have people at the top who have a fucking clue what they're doing. Such people are genuinely rare, and it doesn't help that there's no formal requirement for any manager to have any relevant qualifications. (A related problem is managers who got very lucky once, and who therefore assume they're Codd's gift to their industry and end up repeatedly proving themselves wrong. Usually at other people's expense.)
We also have very skewed definitions of both success and failure these days. Businesses appear to be becoming increasingly disposable, particularly in the IT sector, and that's not a good thing in the long term. The end goal is often to have the company bought out by an existing player. Eventually, all the nimble little fish end up inside one of the bigger fish, effectively *reducing* competition.
Commuter services invariably cost more money than they make, while inter-city rail services often turn a profit.
This is one of those pioneering mistakes I alluded to in my earlier post: conventional rail is optimised for moving lots of stuff in bulk, over long distances, at speed. As long as there was no better alternative, the short-haul commuter services did fine, but they were never anywhere near as profitable as long-haul express services.
Once electric trams and, later, motor cars and buses appeared, the railway lost its monopoly advantage and its commuter services haemorrhaged money. The two world wars provided temporary boosts at best, but left the network utterly knackered, which is why the mainline railways and London's own Underground network were all nationalised shortly after WW2. There was simply no money to repair the damage. The various Underground companies were in financial trouble too. London couldn't afford to lose either the mainline railways or the Underground as they provided essential services, so taxpayers effectively bailed out the shareholders through nationalisation.
TL;DR version: the only part of a rail network that makes any profit is long-haul express passenger services. This is why the Japanese and French didn't build high-speed metros, but high-speed *express* trains. That's where most of the money in rail transport is.
Residual railway property is peanuts by comparison, so if George Osborne is looking to flog off more bits of railway land (and both Railtrack and Network Rail have already thought of this, so the low-hanging fruit has long since gone), he's going to be disappointed by the return. I've read that he expects to make £1bn out of it, but that's barely six months of construction work on Crossrail. There's no scope for a windfall here.
If Osborne wants to cut the burden of commuter railways on the taxpayer, I'd advise him to look into why it costs so f*cking much to build *anything* in the UK these days. HS2 isn't a bad idea in principle, but its price tag is ludicrously high. Phase 1 is just 100 miles of railway through piss-easy rolling countryside. There's absolutely no excuse for it to cost that much. None. Even allowing for HM Treasury's moronic "Optimism Bias" rules, it's taking the piss.
It was John Major's bunch of idiots who decided to privatise a natural monopoly that was also hamstrung by a number of issues that didn't apply in other countries. In fairness, when the plans were drawn up, nobody expected railways to last. At the time, passenger numbers had been falling for years, so the expectation was that the network would not need massive investment in infrastructure.
I hate to play the "special snowflake" argument, but it really does apply to the UK's rail infrastructure. The price of pioneering rail networks is that a number of mistakes were made. Other countries got to learn from and avoid those mistakes, but the UK is stuck with many of them and it makes the network rather more expensive to run compared to its cousins overseas.
The privatisation of British Rail by John Major's administration also managed to completely ignore all the most basic tenets of good (interface) design, but that's another rant. That no other country has even considered copying the UK's model speaks volumes for Major's incompetence.
I can understand why a graphics driver would be included in a monolithic kernel if performance were a major consideration, but a *gamepad* driver? Seriously?
Linux-based OSes are still primarily used in embedded, mobile, and server environments, where I'd argue security is far more important than a slight increase in frames per second while playing Tux Racer.
Apple's OS X uses a hybrid Mach/monolithic architecture that means only those drivers that *need* to run in kernel space do so, while other drivers are kept the hell out of there. I'd argue that this is a better model than an all-eggs-in-one-basket approach.
...is all that crud doing in a bloody kernel?
I seem to recall that microkernels were the in thing during the 1990s, but this predates the recent rise of cheap multi-core CPUs. (There were some high-end specialist CPUs that had multiple cores, but they weren't found in computers most of us could actually afford.) Today, even the Intel Atom x7 range comes with up to four cores as standard.
Could the microkernel architecture be more viable on today's multicore CPUs? Running *multiple* microkernels in parallel might also be an alternative, with each dedicated to a particular set of tasks. Just because UNIX and Windows have followed the traditional path of loading everything onto the shoulders of a single kernel, it doesn't follow that this is the way it *must* be done forevermore.
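The isolation argument can be sketched with a deliberately silly toy (everything here — the 'gamepad' framing, the function names — is invented for illustration, not a real kernel): run the driver in its own process, as a microkernel would, and let it talk to the 'kernel' over a pipe. The driver can then crash without taking the kernel down with it.

```python
# Toy analogy only: a "driver" runs in a separate process, as it would
# under a microkernel, and talks to the "kernel" over a pipe. A driver
# crash kills the driver process, not the "kernel" process.
import multiprocessing


def gamepad_driver(conn):
    """Hypothetical user-space driver: answers requests until told to crash."""
    while True:
        request = conn.recv()
        if request == "crash":
            # In a monolithic kernel this bug would panic the whole machine.
            raise RuntimeError("driver bug!")
        conn.send(("button_state", request))


def run_demo():
    parent_end, child_end = multiprocessing.Pipe()
    driver = multiprocessing.Process(target=gamepad_driver, args=(child_end,))
    driver.start()

    parent_end.send("poll")
    reply = parent_end.recv()      # the driver answers normally

    parent_end.send("crash")       # now the driver dies...
    driver.join()

    # ...but the "kernel" (this process) carries on regardless.
    return reply, driver.exitcode


if __name__ == "__main__":
    print(run_demo())
```

The price of this isolation is the message-passing overhead on every request, which is exactly the performance argument monolithic kernels have always leaned on.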
The above is why I get irritated by the focus on irrelevancies like Open Source and Software Libre. Not that they don't have their uses, but they've distracted the industry and led us down the path of a de-facto architectural duopoly: it's either Windows, or some flavour of UNIX. Windows' current kernel is a relatively recent design, but UNIX is getting on for 40 years old and still has neck-bearded lunatics gibbering on about 'human-readable' config files*.
The monolithic kernel design is inherently compromised as it puts all our eggs in one basket. Break just one of those eggs and the entire basket gets messy fast. Yet there is no mainstream OS architecture that doesn't follow this model. Windows, GNU/Linux, most BSD flavours, Android, iOS, and so on are all tied to this design.
This cannot be healthy for the long-term future of the industry.
Time for my dried frog pills, I think.
* I don't know what makes a text editor so special that it doesn't count as a dedicated file reader given that text is stored as numbers, just like everything else. Nor do I understand the fixation on storing config data in a format that usually *requires* the reader to be fluent in English. What's the bloody *point* of computers when you're going out of your way to make life *harder* for their users? But I digress.
A "formal or informal contribution of either money or equipment" = payment. No matter how you try and frame it, it comes down to offering something in return for a service.
The bugs in question have been sitting in multiple versions of Windows and Adobe's software for well over a *decade*, yet it took Mateusz Jurczyk—a professional hacker—to find them. Are all programmers the world over supposed to be able to match Mr. Jurczyk's abilities as a matter of routine?
Even if I had the entire source code to Windows, OS X, and the latest spin of Gentoo sitting on my hard disk, I wouldn't have the faintest idea where to even *start* looking for backdoors and the like, let alone how to fix them if I came across one. My current expertise lies in writing tutorials about flinging sprites around a screen using C# and Unity, not in untangling the source code to OpenSSL. And I once spent 15 solid months doing nothing *but* debugging other people's code, so thanks, but no thanks; I'll leave that to hardcore masochists.
Yes, it means trusting businesses, but how is that any different to trusting random folk on the Internet? I'm not wealthy enough to have money to throw at random strangers whose CVs could be complete and utter fabrications for all I know. When it comes to programmers with security skills, performing due diligence isn't optional.
So, I can just as easily have Apple, Microsoft, etc. "fix their source code" instead, with no need for me to have access to it. Better still, I actually get free, and (usually) timely patches and updates from both companies without having to lift so much as a finger!
I believe the current fashion among young whippersnappers is to add a "W3wt!" at this point, or something equally vacuous.
Good try, but expecting amateurs to fix industrial strength cryptography code is a bit much. I understand the principles involved, but none of the maths.
Still, ignorance doesn't appear to be a barrier to some coders, or the Heartbleed issue wouldn't have happened.
As opposed to, say, game controller drivers? Or did you not read the article about the latest Linux kernel release?
Odd, I don't recall Spotify and their ilk being particularly generous with the ol' moolah either, and their free ('ad-supported') tier is no mere three-month lure. They're also famous for not paying much (if anything) to the artists; it's the publishers who get the dosh.
Apple still offers their conventional iTunes music "buy to own" music download service. Apple Music merely offers an additional method of getting access to music. And – oh, look! – Ms. Swift's oeuvre is still available in the iTunes Store! Including "1989 (Deluxe)" at a mere €11.99! So, she seems quite happy to take the Apple shilling when it suits her.
As others have pointed out, Apple aren't forcing artists or publishers to take part in the free three-month opening. Ms. Swift's complaint would be more valid if Apple were going to offer her catalogue of music for free *without* her permission.
The iSuppli Bill Of Materials (BOM) estimates – and they really are JUST estimates – completely and utterly ignore the product design and development efforts for both the hardware and software components. iOS is not licensed to any third parties; it's strictly an Apple-only OS and that means Apple need to meet the costs of developing that component as well when selling their iPhones and iPads. iSuppli and their ilk never include this in their estimates.
Nor do they include the costs of other loss-making aspects of producing these products, such as providing developers with up-to-date documentation and support, maintaining iCloud and the synchronisation back-end, and so on. Only iTunes and the App Stores are intended to at least break even, or make a profit, though none of these provides more than a fraction of Apple's revenues. The rest of it is loss-making, but is required to provide features like Continuity. (As Samsung, HTC, and their peers don't make desktop PCs, they're limited in how much they can support such things. Android has no desktop counterpart as such.)
All of these things add up and form part of the real BOM.
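To make that sum concrete, here's a toy version of the arithmetic. Every figure below is invented purely for illustration (these are not Apple's numbers): a teardown-style parts cost looks very different once fixed R&D and support spending is amortised over the units actually sold.

```python
# Illustrative arithmetic only -- all figures are made up to show the
# shape of the calculation, not any company's actual costs.
def true_unit_cost(parts_cost, rd_spend, support_spend, units_sold):
    """Teardown-style parts BOM plus amortised R&D and support per unit."""
    amortised = (rd_spend + support_spend) / units_sold
    return parts_cost + amortised


# A $200 "BOM" gains about $80 per phone once a hypothetical $10bn of
# OS/hardware R&D and $2bn of developer support is spread over 150m units.
cost = true_unit_cost(200, 10e9, 2e9, 150e6)
print(round(cost, 2))  # 280.0
```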
Android handles *all* of that shit for Samsung, HTC, etc., so iSuppli's BOM estimates for those companies' devices are much closer to the mark. They invest next to f*ck all in Android's development. Yet somehow, Samsung still manages to find a way to charge iPhone prices for their "flagship" Android phones.
You'll be waiting a long time then. Nokia won't be designing a thing, "high quality" or otherwise. The report implies they're just going to pimp the brand name and let other companies stick it on their own stuff.
Seems fair enough to me: the "Nokia" brand name still has some value; why not milk it for all it's worth?
Elop leaving Microsoft is no big surprise: Microsoft did not want to buy Nokia's bloated and (obviously) underperforming mobile phone division! Elop pushed it through regardless, giving Nokia's shareholders a decent chunk of money. (Anyone should have seen the axe of job cuts coming a mile away. Seriously, how is that a surprise?)
A business is a machine that makes money for its owners. How that money is made is a mere detail. Nokia have been active in logging, rubber products, and more. Mobile telephony is still a key sector; they just prefer making the infrastructure now, rather than the client devices.
Elop didn't leak that memo himself.
It is worth reading the memo, if only to clarify exactly what it was he wrote. This wasn't an "Osborne" moment; it's clear from the content that Nokia were *already* in serious trouble. (The full text can be found in this article. I assume El Reg also has it somewhere, but I can't find it with their search engine.)
Although it's fair to say the writing was on the wall long before Elop turned up, it was the idiot who leaked that memo who doomed his colleagues and destroyed any future the mobile division might have had up until that point. The memo was posted to an internal intranet, not to the media at large.
Apple pay Foxconn to build their stuff. Last time I checked, Apple did not own Foxconn. These are Foxconn's employees, not Apple's. Apple aren't even Foxconn's biggest client.
Also, the value of wages needs to be considered in context: 8 bucks might barely cover two coffees in the US, but it goes much, much further in China. That's why Foxconn and Quanta have no shortage of willing workers queuing up to work for them.
This is a pattern that has repeated itself a number of times in recent history. Even the UK went through this phase.
You may want to check your facts. Apple dropped their skeuomorphism (along with Scott Forstall, who championed it) with iOS 7. This was released in 2013, not 2014, and was the first version of iOS to go 'flat', though iOS 8 did include a lot of visual refinements. (In any case, Apple aren't about doing something first, but about doing it *right*.)
The main problem with Android is Google.
Why would I pay $600+ for a "premium" phone that runs an OS made by a company that treats *me* as the product, and my own personal information as their corporate property? If I'm going to spend that kind of money, I'd rather spend it on a product that ensures I have full control over my own data. That means Apple or Microsoft.
As long as Samsung and HTC insist on relying on Google for their core software components, they're going to struggle to beat Apple at their own game.
Either there are people running businesses in the US who haven't heard of ATMs, or they're so rich they literally assume all us little people exist to pay for things in cash on their behalf, so they don't have to dirty their hands with actual banknotes.
Or they're just incompetent and incapable of proper planning.
Why would anyone want to work for such people?
Nobody is "required" to give up their privacy. If you don't accept the software's terms and conditions of use, you have the *choice* not to use it. I don't know which part of this Stallman doesn't understand, but it's not neuroscience.
That's why I prefer to use OS X or Windows over anything that comes tied to the GPL. At least with Apple I know they're making the vast bulk of their money from selling actual products, rather than nickel-and-diming me while selling on my personal data.
Despite all Stallman's rhetoric about Free Software being "free, as in speech", it's patently obvious to anyone who isn't a Demis Roussos lookalike that the vast majority of GNU/Linux's success can be attributed primarily to the "free, as in beer" aspect, and not a lot else. Even the companies making those cheap and nasty routers are using it because it saves them having to actually pay for a team of competent programmers. And this is why Heartbleed happened.
Apple owners tend not to be penny-pinching misers and are therefore quite happy to pay for stuff they want. (If anything iOS App Store prices have risen a bit of late.) In fact, Apple are a very conventional company: they make their money selling physical products; the software they make is just one of the many components used to make said products, and is often also used as a kind of loss-leader.
Similarly, Microsoft would have to be utterly batshit insane to mine data from their customers, given that most of said customers are businesses—many of them major multinationals who would give Microsoft a Very Hard Stare if they caught MS doing something that stupid.
I'll leave the situation with Android as an exercise for the reader. Suffice to say that there's a reason why the "Freemium" model has proved so successful on that platform. (Hint: it has the potential to provide much more granular information about the user. I wonder who'd be interested in that?)
...one of Stallman's claims is that FOSS is 'better' because it is less prone to security issues thanks to the (trivially disproved) myth of the 'many eyes' approach. (Stallman's egotism in his Guardian rant really doesn't help his anachronistic* 'cause'. With friends like him, the GNU and FOSS communities really don't need enemies.)
Problem is, you need those 'many eyes' to be bothered to look into all that Open Source code first, and wading through yet another router's firmware is pretty bloody tedious and ranks right up there with writing user guides and translation work in its thanklessness. Nobody in their right mind would do it willingly, unless they were being paid to do so. If Heartbleed taught us anything, it's that it's really, really hard to get volunteers interested in doing the boring stuff.
Note that the article clearly states that the research was done—and paid for—by ESET. This wasn't a FOSS community effort. ESET would happily research and highlight flaws in proprietary software too given their line of work.
* (Open Source and all that "Free as in Speech" stuff made sense back in the 1970s and early '80s, when computers were still mostly the preserve of lab-coated folk who could actually program in rooms filled with Winchester disks and photogenic tape drives covered in blinking lights. It makes no sense today. What's important now is the _data_, not the code. As long as we can access our data, it matters not one whit whether the code that originally created it ceased being run a decade earlier. The code doesn't matter any more. It's become a disposable component of a much greater whole.)
...but, unlike Moses' ocean-splitting escapades, there was no *live footage of the fucking planes hitting the buildings within MINUTES of the events*.
This is 2001 we're talking about. Nobody could have rendered the footage that quickly to match the real-world visuals. Remember, people had camcorders too back then, and they did use them. How easy do you think all that footage would have been to fake as well?
As for "The WTC shouldn't have fallen down": the Twin Towers had an unusual design whereby *all* the structural strength of the buildings was in the outer skin. This allowed the rentable office space to be much, much bigger and less cluttered with structural elements.
Boeing's jet airliners fly at close to the speed of sound. That's a lot of kinetic energy. No building would be able to keep the plane out.
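For a rough sense of scale (my own back-of-envelope assumptions, not figures from any official report): an airliner of around 150 tonnes at roughly 250 m/s carries an impact energy in the region of a tonne of TNT.

```python
# Back-of-envelope only: the mass and speed are rough illustrative
# assumptions, not measured figures from the actual impacts.
def kinetic_energy_joules(mass_kg, speed_ms):
    # Classical kinetic energy: E = (1/2) * m * v^2
    return 0.5 * mass_kg * speed_ms ** 2


ke = kinetic_energy_joules(150_000, 250)   # ~150 tonnes at ~250 m/s
tnt_tonnes = ke / 4.184e9                  # 1 tonne of TNT ~ 4.184e9 J
print(ke, tnt_tonnes)                      # ~4.7e9 J, i.e. just over 1 t TNT
```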
Once a plane had passed through that crispy outer shell, there was only the soft centre in the middle to plough through and destroy. (The poor fireproofing didn't help either. This is why the buildings actually collapsed: the fires fed by the jet fuel softened the floor structures until they gave way.)
That there was so little *obvious* aircraft wreckage is trivially explained by that outer skin, which effectively shredded the planes as they passed through, rather like a cheese grater.
The Twin Towers collapsed because their design had fundamental flaws. Contrary to popular belief, architects and engineers are not infallible. Even Isambard Kingdom Brunel made plenty of mistakes.
Freedom of Speech / Expression does not imply Freedom from Offence: in fact, the two are broadly incompatible. However, it also does not imply Freedom from Consequences.
However, the Internet is 90% privately-owned infrastructure, so those freedoms and rights are not as relevant. You agree to abide by the private owners' rules and regulations when posting on forums and the like, many of which explicitly ban offensive statements and remarks. Because—contrary to popular belief—they're perfectly entitled to do so.
Private entities (including individuals) are entitled to do this for the same reason you're allowed to call the police if some nut-job decides to stand in your front garden and shout obscenities at your family day and night.
The whole "Freedom of Speech" (and "Right to bear arms") stuff is intended to protect us from *bad government*. They're primarily political rights and responsibilities.
"Odd...nobody mention the average quality sound that the iProducts deliver"
Apple's devices tend to be rated quite highly for sound quality, with only the occasional exception over the years.
Besides, nobody's plugging these into a high-end NAD amp and multi-thousand-quid speakers anyway, so who gives a toss? The paying public—the only subset of "the public" that matters to any business interested in making a profit—certainly doesn't.
Elop, for all his flaws (and apparent ignorance of the Osborne Effect), was "parachuted" into a basket-case of a company whose woes date back to the adoption of a platform, Symbian, that was designed for the mobile technologies of the 1990s, not the 2000s, let alone the 2010s.
Symbian shared many design flaws with Microsoft's much-loathed MFC. And for similar reasons: both were built on C++, but that language was still in flux during the 1990s, with the result that many OOP concepts were not particularly well implemented in the language itself and required all sorts of hacky workarounds. The most visible example of this was Symbian's decision to roll its own memory management system, which made code much more difficult to port.
Symbian was also clearly designed by people who genuinely believed that 'correctness' was the most important thing, so the API demanded pages and pages of sub-classing, just to perform the most basic of operations. And they didn't give a damn about making the developer's life easier. No decent IDEs, no design tools, no consistency across the UI-layer APIs. Nada. You had to do it all the hard way. Because, masochism. Think GNU / Linux development circa 1998.
So, Nokia's main platform was a technological dead-end. It was never going to work in an era of mobile devices powerful enough to run proper, hairy-chested, grown-up operating systems like *BSD, Linux, or Windows NT. Its lifespan was therefore inherently limited. But Nokia's management had no consistent Plan B. They bickered endlessly, duplicated efforts, and let petty empires and management politics get in the way of running a damned company.
Nokia's original management team royally f*cked up. This was NOT Elop's fault as he wasn't there during any of this.
By the time Elop was brought on board, the mobile division was already a massive basket-case and clearly in its death throes. Elop looked about desperately to find an alternative to Symbian. In 2010, Android was still very half-baked and no better than any other option. (This was the Android 2.x period.) At this point in time, Windows Phone looked just as viable as any other option, and Elop had worked for Microsoft, so he presumably felt having contacts there would be useful. He didn't know anyone at Google.
Elop did make some mistakes, but the leaking of the "burning platforms" memo is not something he can be directly blamed for. (My money's on it being leaked by a disgruntled member of the outgoing management team.) This resulted in the predictable Osborne Effect, effectively destroying what little chance the mobile division may have had of surviving intact as part of the larger Nokia group. With nobody buying their old tat, Nokia had to find a buyer for the division.
Elop's connections at Microsoft made the latter an obvious choice, but Microsoft weren't actually that interested. That Elop pulled off the purchase regardless is therefore not a "failure" as many seem to believe: had he not achieved this, there would be no Nokia Lumia phones today: the division would most probably have been closed down entirely. Believe it or not, this made Nokia's shareholders happy, not sad. They don't care how their goose makes those golden eggs, as long as it keeps laying them and making them money.
"The problem was that Apple failed to sue Google and the Android Open Source Project for copying the Apple designs..."
Apple started suing Samsung (and others) back when Android was still a babe among mobile operating systems, and looked so much like iOS that nobody was seriously attempting to explain the similarities as anything other than a very sincere form of flattery. Some manufacturers today are still chancing their arm by blatantly ripping-off iOS. If you don't believe me, check this out. Samsung's early Android phones were pretty similar rip-offs, though they've moved a long way away from that approach since then, helped by improvements and changes in later Android releases.
The reason why Apple didn't sue Google directly is simple: Google don't charge for Android licensing, so it's difficult to prove that Google benefit materially from ripping-off iOS given that Android is deliberately designed to be skinnable, and has been pretty much since the beginning. (Otherwise, that rip-off Chinese device linked to above wouldn't be possible.)
Google didn't get where they are today without learning how to cover their corporate arse. Suing the licensees rather than the licensor therefore makes a lot more sense.
Strange how so few reporters (and El Reg readers) understand the basics of good design. I recommend Donald A. Norman's seminal "The Design of Everyday Things". It's surprisingly full of clues, which is clearly something Mr. J. Hamill lacks.
Here's a hint: "innovation" is most emphatically not limited to hardware alone.
I thought it was just me who thought this "bendgate" bollocks was being blown out of proportion.
My eyesight being a bit rubbish means smaller phones are no longer of any earthly use to me as I can barely read them, so big screens it is. So, I own a 6" Nokia Lumia 1320, which means nobody gets to call me a "fanboi". (Yes, I keep it in a nice, chunky flip-cover case. I'm not an idiot: it has to share a trouser pocket with my house keys.)
I did check out a Note 3 and a similarly sized Wiko model, but Nokia's build quality is best described with words like "outhouse" and "brick", and it's been holding up pretty well so far.
No consumer electronics product is designed to take the kind of deliberate abuse shown in the video. If you look closely, you can see the screen starting to separate when he tried to bend the Android product. And let's not even mention Samsung's own woes. (Oh, right: nobody has. Odd, that.)
"I have absolutely no problem with Microsoft collecting info on my beta copy."
WHAT "beta copy"?
Microsoft haven't released a beta version of Windows 10: there is only a Technology Preview, which means it's not even at the Alpha test stage yet. The final release is almost a year away!
Microsoft explicitly warned you to expect "a UI design that might change significantly over time." That's copied verbatim from here. You have no excuse for not reading that section. None.
So the GUI is clearly still a work in progress. Nothing is nailed down. Features are being added. Others might be taken away, tweaked, or replaced wholesale.
In short: this is NOT a "beta" release.
As you appear to be unaware of what a beta release is supposed to be, allow me to explain: a beta release—despite Google's attempts to suggest otherwise—is feature-complete, with development of new features frozen: no new features are allowed to be added to the codebase beyond this point. (Any new features that aren't in the beta are automatically kicked into the next release cycle.)
Beta releases are the last leg of the journey towards release. Right now, Windows 10 has only just left the starting gate.
"Most of the so called innovation in the Iphone 6 (such as bonk to pay) has been in Android phones for years."
Apple's innovation wasn't the inclusion of NFC support in their phone (and watch). Apple's innovation was in the software that works with it, integrating it into the OS. Apple also went the extra mile in sorting out the back-end too. They got MasterCard, Visa and Amex on board.
Did Google bother doing that for Android phones? Did they feck.
While Android manufacturers scream "FRIST!" at every opportunity, they rarely bother with the much harder job of making sure each shiny new feature is actually useful. There's a vast chasm between fitting a component and a basic driver, and the far more difficult task of integrating it properly into the user experience.
El Reg readers really do need to stop viewing software and hardware as if these were two entirely separate domains that should always be treated individually. It's a bloody stupid way of approaching product design and always has been. The two components are of equal importance in producing something that is genuinely usable and meets the needs of the customer.
The reason Apple have been running away with all the cash isn't because they're really good. It's because their competitors have been so spectacularly bad.
And this is despite Apple literally spelling out exactly what their business model and philosophy are in damned near every cheesy corporate identity advert they make. (Including, please note, the "not caring about being first" bit, which they explicitly state in that linked example.)
"What gets me is how tiny the engines are on these huge ships, and how little power they use to move stuff around."
If memory serves, this is because the power you need to push a ship through water scales roughly with its wetted surface area (the square of its dimensions), while its carrying capacity scales with its volume (the cube). Double every dimension and you get eight times the cargo for only about four times the drag. Brunel's Great Eastern was sold to investors on this basis, and it's why there's still a continuing trend for ever larger ships. Bigger ships are more efficient.
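For anyone who likes to see the square-cube law in numbers, here's a back-of-the-envelope sketch (assuming drag goes with the square of a hull's linear dimension and cargo capacity with the cube — a deliberate simplification that ignores wave-making resistance and the rest):

```python
def drag_per_tonne(scale):
    """Relative drag per unit of cargo for a hull scaled up by 'scale'.

    Assumption: drag ~ wetted surface area ~ scale^2,
    cargo capacity ~ enclosed volume ~ scale^3.
    """
    drag = scale ** 2       # resistance grows with surface area
    capacity = scale ** 3   # cargo grows with volume
    return drag / capacity

# Doubling every dimension halves the drag per tonne of cargo.
print(drag_per_tonne(1))  # 1.0
print(drag_per_tonne(2))  # 0.5
print(drag_per_tonne(4))  # 0.25
```

Which is why a monster container ship can amble along behind a surprisingly modest engine.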
Except it's not nasty, toxic "waste": It's granite. (Mr. Worstall explained that the hill he owns is full of tin ore, and that said ore tends to be found in granite.)
Crushed granite, granted, but washing that down into a nearby stream isn't going to harm it any more than washing the topsoil into said stream does. It's a perfectly natural rock. (And as Mr. Worstall patiently explained, the sea does plenty of rock-crushing all by itself. How do you think sandy beaches happen?)
Besides, there are other uses for crushed rock. Construction aggregate, for example, is mostly crushed rock, and you need a lot of it in construction. It's what you mix with cement to make concrete.
No, my only quibble with Mr. Worstall's article is the use of "Mr. Henry". And he's not the only one to make this mistake...
The manufacturer is Numatic—both British-owned and British-made, which is more than can be said for Dyson—while their vacuum cleaners are simply "Henry", "Hetty", "Charles", etc. Their wet and dry models are "Charles" and "George". "Henry" is their low-end model and doesn't do wet vacuuming, so I'd not recommend using one for hoovering an Asian beach, no matter how long the extension cord is, as Henry would not have such a happy face for long.
"Android owners are just the ones all around that dont feel the need to buy status symbols, they don't care what others think."
Then would you mind awfully explaining why you and your peers insist on telling us, over and over and over and over again, at tedious fucking length, how much you "don't care what others think" every bloody time there's a news story about Apple?
"Average users install software everyday, they restore Windows everyday, they are perfectly able to install an OS with a gui interface."
Eh? "Average users" don't have a f*cking clue what an operating system even is, let alone why they should have to piss away 20-40 minutes of their lives waiting for it to install.
When I buy a car, or a washing machine, or any other fucking appliance, I expect it to work out of the box. That's what most people here seem to be failing to understand. 99% of the people out there couldn't give a flying toss what their hammer is powered by, as long as it bangs in the damned nails.
The OS is a component. It is not—and never has been—the alpha and the omega of IT. It's just a program that lets you run other programs. Stop treating it as if it were special.
Besides, if "competition" were truly what most people were after, why are you trying to do down Microsoft when their OS is practically the only non-UNIX-based choice available? Or do you genuinely believe that an OS designed back in the days of flared trousers and The Clangers is the only OS worth keeping in the 21st century? Because, from where I'm sitting, that's a bloody sight more dangerous to our industry's long-term future than anything Microsoft has ever done—and I include all their business shenanigans in that.
As for the court case: this is Italy. It'll have taken years to grind through the system—Italy's legal system makes continental drift look fast—so is probably referring to a purchase made back when Charles Babbage was still alive. I doubt it'll have any lasting repercussions. Especially as Windows hasn't had a "90% share" of the desktop / laptop market for some time now.
So the OS and ecosystem don't matter at all, then? They're completely equal across all platforms? Really?
Strange how the Snowden Files™ have caused such an uproar in the media, yet people are still happy to hand over their every personal detail to a company that has been doing much the same thing since it sold its first online ad.
I wouldn't touch anything tainted by Google if they *paid me* to take one of their malware-infested pieces of overrated crap off their hands. Not only does Google not value my privacy, they've built an entire business empire on invading it. So, yes, I'm willing to pay a little bit more to avoid that. Apple might not be perfect, but they know good UX design, and they're not in the business of selling *me*.
Er, no. Copyright infringement is just a fancy name for counterfeiting.
It's still wrong. It just has nothing to do with stealing paintings by long-dead artists from a publicly-funded gallery. (If nothing else, the quote is a perfect illustration of why politicians should never be allowed to have any real power.)
Counterfeiting is wrong for the exact same reason printing duplicate banknotes is wrong: it devalues the item. The more money you print, the less that money is worth. The more copies you make of a copyright-protected item, the less value each individual copy has. In effect, you're reducing the value of the work the creator made, which in turn reduces their ability to sell it.
The fact that computers make this kind of counterfeiting easy is irrelevant. Most crimes aren't that difficult to commit: It's not that hard to point a gun and shoot people, or grab a knife from a kitchen drawer and stab someone with it. Dropping a few grains of poison into a cup of tea isn't technically all that difficult either.
However, most people tend not to do so. Because most people aren't arseholes.
So, you've not heard of this new-fangled "Bluetooth" technology, then?
Ironic that the only two companies that have actually done more to push the industry forwards of late are the ones constantly accused of not "innovating" any more. As if innovation is a tap you can just turn on and off on demand.
Say what you will about Windows 8.1, but at least Microsoft are *trying* to do something new. They're taking risks and trying new ideas, which is more than any bugger else seems to be doing these days. That ModernUI, with its typography-led approach, is apparently so awful that nobody has been inspired by it in any way at all. Except, that is, for pretty much everyone and their dog. Even Apple's iOS 7 took some cues from it.
It says a lot about the state of this industry that Apple's only real competition in UX design today is Microsoft. Nobody else is even *trying*.
[SPOILER ALERT: If you haven't seen the episode yet, look away now.]
I'm glad I wasn't the only one who wondered about the Godzilla-sized dinosaur. T. Rex – on which the dinosaur is clearly based – only grew to about 40-odd feet *long*, and stood nowhere near that tall: smaller than the Sovereign's Entrance at the foot of the Victoria Tower. The _really_ big dinosaurs were the herbivores, but even those would have been hard to spot if looking across London over the Houses of Parliament.
The poor standard of the foley work didn't help: at no point does the dinosaur ever sound like it's standing in the middle of a major river.
I think the fundamental problem here is that Moffat just isn't a particularly good show-runner. RTD might not have been able to write a genre story worth a damn, (which is why he kept falling back on soap-style melodrama + comedy), but at least he knew how to produce a show. Even if it did include comedy farting aliens.
Others seem to be bothered by Clara's reaction to New Doctor™, but I'm not. Although she'd met the Doctor's previous incarnations, she was clearly only aware of regeneration as an abstract concept. Actually seeing a good friend, whose life you literally just saved (and got "rebooted" into the bargain), change from Mr. Over-actor into Mr. McFurious right in front of your face—when the space-time machine you're both standing in is also about to crash—is bound to shake you up a bit. She's still a human being.
Yes. Windows Phone 8.1 is actually a pretty decent OS and, crucially, available for cheap: Italians tend to buy SIM-free then get a PAYG SIM, rather than signing up for long contracts. You can still get the latter, but Italians are nowhere near as interested in endless credit as other countries I could name.
Price is a big factor too. If you think the UK was a "rip-off", you should see the prices charged over here. I've seen differences of over €70 for the same product in France and Italy, despite both countries allegedly having the same currency. (Yes, Apple, I'm looking at you. As for you, Samsung, you're just as guilty.)
Dual-SIM phones are also popular due to the shite cellphone coverage once you step outside the cities. Apple don't make any of those, so you're left with either a Nokia, or some little-known brand of Android device. (Ever heard of a WIKO BARRY? No, me neither.)
Samsung seem to be trying to chase the same high-end market as Apple here, so they're not interested in the cheap and cheerful sector.
I suspect Italians aren't that big on buying apps either. Relatively few apps tend to be localised into Italian to begin with, so it's not as if there's a vast selection on any platform to choose from.
"In the Apple App Store, apps with in-app purchases are still marked as free. "
And also marked with the phrase Offers In-App Purchases directly under the "Download" button when opened in iTunes, or the iOS App Store. (Note that you can only install the app from inside iTunes or the iOS App Store; a web link merely takes you to the 'shop window' page, from which you can't download anything.)
You'll also note the list of "Top In-App Purchases" in both locations. It's rather hard to miss. Note, too, that the full description is required by Apple to state very clearly that in-app purchases are offered. This is non-negotiable.
Seriously, what more do people want? iOS 7 already makes it bastard hard to 'accidentally' make an in-app purchase. Even in earlier versions, iOS would ask you for your password after 15 minutes or so.
"The problem is we are all living in a post-scarcity era"
Last time I checked, the number of actual authors able to make a bloody living from writing was still relatively low. All that's happened is that the 'scarcity' has shifted to the content creators, rather than the content itself.
Authors aren't "gouging" anyone. Hell, Amazon actually offer a very good deal for authors who publish directly through them, or one of their in-house imprints.
Not only that, but they make it astonishingly easy to do. Compared to some of their rivals, publishing through Amazon directly is an absolute dream. They really do *grok* good design in the same way Apple do. And they don't limit the process solely to their customer-facing components either. Authors get the same attention to detail, and get published *within hours*, whereas a Big Publisher will sit on a manuscript for anywhere up to 18 *months* before publishing it.
I can do my own PR and marketing—Big Publishing only do a big splash if you're already a big name; I can hire a freelance editor myself; I don't need a massive advance that may, or may not, actually be paid...
... and I don't need to decipher some of the world's most creative accounting to work out why my "#1 New York Times Bestseller" has apparently not turned a penny in profit for me, while netting my publisher hundreds of thousands. Amazon, Apple and their peers pay *much* better than any of the traditional publishers. And they won't try and fleece you through onerous contracts that effectively assign your copyright to them for life.
It's Big Publishing who are screwing up on a massive scale. Simon & Shyster, Hatchet, and their ilk, can get stuffed. All of them. They dropped the ball many years ago and they're just flailing about in desperation now.
As for Amazon's dominance: it would *really* help if people stopped treating design like the unwanted ginger stepchild of software development. Anyone who's ever fought their way through Kobo's system will welcome Amazon's far easier system with open arms. Kobo's site is truly, truly awful. I'm sure they mean well, but it's easy to see why they've struggled to make any kind of impact.
It's a shame Steve Jobs' 'smoking gun' email caused Apple so much damage as they're Amazon's only real competition here. And for the same reason: they go out of their way to make extracting cash from customers as simple, easy and painless as possible. Their rivals seem to be hell-bent on making it as hard as possible for anyone to buy their products, or sell products through their marketplaces.
Amazon and Apple got where they are today by offering people what they wanted. Their rivals have had their clear, obvious lessons right in front of their faces for well over a decade now, so they have only themselves to blame for their own failure to learn from them.
"In jail charged with pedophilia, 'cos you're only 13 years old!"
Even William "How should I spell my surname this time?" Shakespeare was at it. Operation Yewtree clearly doesn't go far enough back in time.
You can prove anything with facts!
I find your lack of faith in the competence of national governments to maintain alien conspiracies for generations... well, not really all that disturbing, now that I think of it.
Apple couldn't give a toss about the general smartphone market. They're only interested in the high-margin markets. I.e. the high end. The people with the moolah. The people with dosh to slosh.
That's the only market worth chasing if you want to make a big, fat, profit. It's very much old-school business practice. And there's a good reason for that: Apple weren't born in the 21st century, or even in the 1990s. They were founded in the 1970s, when flared trousers and "Shaft" were both cool.
Apple are as old-school as you can get without being called either Microsoft or IBM. They're interested in making money, not viral videos, social networks, or bad photography.
Business is all about making a profit for the owner(s). That's the whole point of a business.
And yet, the Swiss manage to do precisely what you say is impossible. How? By realising that democracy – in any meaningful sense – doesn't scale well. Why are voter turnouts so low? Because, in a country of well over 60 million people, having one sixty-millionth of a say is tantamount to having no say at all.
In Switzerland, the bulk of the power is in the Cantons, not the central government. The latter is kept deliberately small and has very little power compared to its counterparts in neighbouring countries. Yet they still manage to get things done. It works precisely because people are fundamentally tribal. By focusing the democratic processes on local and regional politics, which is, frankly, what most people actually care about, the national tier can concentrate primarily on harmonising legislation that has to work across Cantonal borders, as well as the occasional bit of foreign diplomacy.
Another point to note is that the Swiss system relies much more heavily on referendums. The St. Gotthard Base Tunnel (and its siblings) wouldn't have happened without one, for example.
(Incidentally, US readers might like to note that the Swiss Constitution was deliberately based very closely on the US one. What you see in Switzerland is arguably much, much closer to what the US Constitution's signatories had in mind.)
At the very least, such an approach makes warmongering rather more difficult to do. It also means that most politics would be local and regional rather than national, so of more interest to those it affects.
Switzerland got it right, and did so back in the 1870s (around the same time as Italy's unification), without the benefit of electronic communications, so it is certainly possible.
The USA does, however, provide us with a crystal clear illustration of how such a system can go very, very wrong.
Can you tell us how a company that is famous for only refreshing each of its (very small) product lines roughly once a year, can be described thus:
"With a product refresh rate that sees it chomp through natural resources like a chubby child demolishing a Mars bar, it's difficult to see Apple as a particularly environmentally friendly organisation."
If Apple are "a chubby child demolishing a Mars bar", what the hell does that make Samsung, Dell, or LG? A massive swarm of locusts, devouring acres of crops in a few hours?
I suggest sir gets a sense of perspective. Or a life. Either would suit me, as long as neither involves writing ever more ridiculous click-bait pieces like this worthless shite. If I wanted to read this kind of ill-informed bollocks, The Motley Fool has no shortage of it.
It's not "censorship". Last time I checked, Apple were not a country with a seat at the UN. Censorship is perfectly legal if you're a private entity. The media does it all the bloody time and nobody bangs on about them 'censoring' whenever they bleep out bits of Frankie Boyle's jokes.
Apple have an image they're trying to present to the public—an almost Disney-like one that tries to portray a 'squeaky clean' company and ethic. Apple is also a business. Businesses are explicitly allowed to refuse to do work for a customer if they don't want to, if they feel it would make their employees uncomfortable, or even if they just feel it would violate their core precepts, such as their brand image.
(Note: You're also free to sue said businesses into oblivion if you genuinely believe they denied service to you on grounds of racism, sexism, or some other legally-covered "-ism". But that's entirely your own decision.)
Chances are, their filtering policy is based on how a nation's media treats such words: If they're routinely bleeped out on TV and radio, it'll probably be added to the filter. "Dick" and "penis" are rarely bleeped in the UK or US, and arguably aren't even considered swear words. "Clit" and "cunt", on the other hand, certainly are.
There's a good reason why Roger's Profanisaurus (PBUV*) has been so popular, and is so filled with inventive euphemisms: rather than fight the media's self-censorship, it's easier—and a hell of a lot more fun—to just work around it.
* "Praise Be Unto Viz". Let's see if El Reg are happy with this.
Do you seriously believe Tim Cook is personally responsible for the website's filter code?
"To be fair I'd suggest this was as much a fail on the MacOS interface design team "
This was done for a very good reason: to stop users ejecting a disk while it's being written to!
As for the "drag to the Trashcan icon": the Trashcan icon changes to an Eject icon when you start dragging. So I call bullshit there. Also, there's a perfectly obvious "Eject" command in the File menu, and an equally obvious keyboard shortcut too: CMD+E. (Or just press the "Eject" button right there on the bloody keyboard!)
It seems there are so-called "experts" commenting right here on these very forums who feel a 40-minute 'group shout' at a hapless user is a perfectly valid way to convince them to eject a floppy disk. And yet it's the user who is considered the clueless idiot in that story? Hypocrites, much?