Re: Moving "users" folder
Windows 10 does a better job of this than v8.x, though I can't get it to move applications as well. (There's a switch in Settings, but it's always greyed out.)
"Java is actually a very useful platform, especially for cross-platform server apps."
Java is a massively bloated kludge. It's "cross-platform" in the way that a train is cross-platform when derailed at speed just before entering a station: the term technically applies, but it's definitely neither pretty nor elegant. Java reinvents every wheel common to every operating system, embeds these new wheels in a gargantuan virtual machine, then dumps that VM on top of the underlying operating system, effectively duplicating most of it. Now, instead of one vastly complicated source of potential security issues, we have two!
Java is not cross-platform: it is its own platform.
How the hell this mess ever managed to get anywhere at all boggles my mind, but then again, this is the same industry that thought nailing object-orientation onto a low-level portable assembly language was a good idea.
Oh dear, I appear to have ranted all over this nice clean thread. Sorry.
"I see the same names cropping up again and again in articles condemning it as a bug ridden pile of hurt."
El Reg doesn't have a staff of hundreds of writers, so you're naturally going to see the "same names cropping up again and again". I'm curious as to why you think this is a surprise.
And Flash is a bug-ridden piece of legacy code that should have been taken outside and put out of our misery a long time ago. I'd also be quite happy to see the same sentence passed on Java; I'm an equal opportunities hater of shit technology.
And individuals have also been known to do the same thing. This isn't a concept limited to fictional persons.
Just ask Jimmy Carr.
Yes. You can buy it an entire container ship full of fruit juice cartons if you like. That's the point: contracts work just fine, and offering to buy the corporation a pint is a perfectly valid contract.
What the corporation would do with said pint is another matter, but I'm sure the CEO will think of something.
From Latin corporatus, past participle of corporare (“to make into a body”). The concept dates back to Roman times.
The whole point of the concept of incorporation is creating a legal 'person' that acts on behalf of its owners. By defining a business as a fictional person, the same laws that govern contracts between two normal people can also apply to corporations, for example, which avoids the need to waste time and effort designing and implementing a parallel legal system just for businesses.
(In IT terms, you can consider a nation's laws as its operating system. These tend to be designed for interactions between people, so defining a corporation as a fictional person means you can use the same API for both. Of course, like many an old, crusty API, there are any number of edge cases, quirks, and outright design errors that cause problems and unintended behaviour, but that's us fallible humans for you.)
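To stretch the analogy a little further, here's a minimal sketch of that "same API for both" idea. All the class and method names below are invented purely for illustration:

```python
# A loose sketch of the 'fictional person' analogy: one contract-law
# "API" serving both natural persons and corporations. Everything here
# is invented for illustration purposes.

class LegalPerson:
    """Anything the legal 'API' is prepared to treat as a person."""
    def __init__(self, name):
        self.name = name

    def sign(self, contract):
        contract.parties.append(self.name)


class NaturalPerson(LegalPerson):
    pass


class Corporation(LegalPerson):
    """A 'fictional person' acting on behalf of its owners."""
    def __init__(self, name, owners):
        super().__init__(name)
        self.owners = owners


class Contract:
    def __init__(self, terms):
        self.terms = terms
        self.parties = []


# The same contract machinery works for either kind of party:
pint = Contract("one pint of ale, to be delivered on demand")
NaturalPerson("John Smith").sign(pint)
Corporation("MegaCorp Ltd", owners=["shareholders"]).sign(pint)
print(pint.parties)  # ['John Smith', 'MegaCorp Ltd']
```

The point being: neither `Contract` nor `sign()` needs to know, or care, which kind of 'person' it's dealing with. That's the whole trick.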
That's a fine rant, but the spat (from Apple's point of view) is actually about design: the combination of both hardware and software. Apple don't treat the two as separate, which is part of the reason the case has been dragging on for so long.
"Trade Dress" is a key issue: the purpose and applications of Trade Dress can be complicated to explain to laypeople. It is absolutely not just about owning the concept of "rounded corners", but about the combination of design factors that associate a product with its manufacturer in the minds of its customers.
For example: Apple used silhouetted dancers wearing white earbuds in adverts for their iPods back in the day. While Apple don't own a patent on "white earbuds" or "silhouetted dancers", they could definitely sue a rival that used such recognisably 'Apple' imagery in its own ads for its own music reproduction products. This is the sort of thing "Trade Dress" covers. The most common example is the iconic Coke bottle and the cursive, flowing Coca-Cola logo itself. Using a name other than "Coca-Cola" for your soft drink, but in the same flowing typeface and colour scheme, is a violation of Trade Dress.
It's also worth noting that this mess has been dragging on for years and refers to hardware produced during the Android v1.x and v2.x eras, with a focus on Samsung's home-grown "Touchwiz" UI layer, not to more recent devices. Back then, Samsung's phones really were blatant rip-offs of Apple's, down to the grid of icons and the 'dock' at the bottom of the screen. Even the number of icons in each row was the same, though different screen sizes meant the number of rows varied.
Android and Touchwiz have both changed a lot since then, so it's not surprising that those unfamiliar with the case don't understand what the big deal is.
"Top Gear" did a piece on the many and varied user interfaces that were tried in early automobiles. (I think it was mostly James May, though Hammond may have been involved too.)
To avoid confusion with the milliJub (or mJub), shouldn't we be referring to µJubs?
(That's ALT+230 in Windows. OPT+m on OS X. What GNU/Linux or the BSDs use is left as an exercise for the reader.)
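If you have a Python prompt handy, you can sidestep the keyboard acrobatics entirely. One wrinkle worth knowing: the micro sign and the Greek letter mu are different codepoints, so grep accordingly:

```python
# The micro sign (what ALT+230 typically produces on Windows) and the
# Greek letter mu look identical in most fonts but are distinct codepoints.
micro = "\u00b5"   # µ MICRO SIGN
mu = "\u03bc"      # μ GREEK SMALL LETTER MU
print(ord(micro), ord(mu))   # 181 956
print(micro + "Jub")         # µJub
```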
Rome? Rome's a piece of cake. I've driven in Rome, but not even my native Italian relatives would willingly drive in Naples.
Perhaps it's for the same reason a certain Mr. L. Torvalds has included support for f*cking *game pads* in his own, little-used kernel? (I forget its name. Minux? Torix? It's got an 'x' in it somewhere.)
But please, do continue with your rant, O Great Sage! I'm sure no other OS has ever included code designed in an era when dial-up Internet access from home was still a novelty even for most IT experts.
You're thinking of the other John Smith.
No, not John "No Middle Initial" Smith from Accounts, or John "John Smith" Smith, the amateur John Smith impersonator from Smitham. You know: Johnny, and his mate Smithy's, brewer friend, John Smith!
Or maybe it was Dave. I often get those two confused.
Some of us *like* poisoning the seas! (Bloody things, think they're so big! I'll show *them* and their wobbly wetness! Serves 'em right!)
The reason FIAT offer such small engines is that the ancillary costs of owning a car in Italy force their hand.
The 500 isn't being sold to older, experienced drivers, but to younger drivers who are still working their way up the no-claims ladder. (It has no fewer than 14 rungs in Italy, not the five you get in the UK, although the way insurance works over here makes it difficult to compare like for like. As with most of continental Europe, in Italy, it's the car that's insured, not the driver.)
Furthermore, even selling a car on costs money: changing the owner of a car requires a fee that can easily be north of €450, depending on the rated power of the engine (in kW). Other taxes also add to the high cost of car ownership, so those tiny engines are a perfect market solution: few people are driving their shiny new 500s on long trips through the mountains, but they do drive them around their cities. Think of the 500 as the 4-wheeled equivalent of the 1960s classic Vespa motor scooter, and the marketing makes a lot more sense.
Finally, there's the small matter of fuel: Petrol is pricey here, so the less of it you need, the better.
"A ridiculous statement?"
Why, yes. Yes it is.
I've lost count of the (original, pre-2003) FIAT Pandas I've seen around here with well over half a million kilometres on the clock, and number plates dating them at well over 20 years old. Truly they are Italy's unsung workhorses. (My own FIAT Punto ESX lasted 17 years before it went to the big motorway in the sky.)
That said, I'm not sure why you think FIAT going for "coolness" is such a big surprise. FIAT Chrysler Automobiles also own Alfa Romeo, Dodge, Ferrari, and Maserati. And they're *Italian*, for f*ck's sake!
As for the iPhone: it pretty much redefined the meaning of the term "smartphone" thanks to its innovations in usability. It's not the physical components alone that make a product great, but the way they are connected to the user through their interfaces. Apparently, only Microsoft have twigged this 'secret', though the more open nature of the Windows platform makes it much harder to execute on this.
I'd better bring my copy of "War and Peace" then.
...but you didn't have to go so far udder your way to prove it.
And the less said about some of the comments, the butter.
If there's one thing a phone-based NFC system can't do, it's get swallowed up by a Windows XP-running ATM that's just decided to have a bad hair day and crash in quite spectacular fashion.
It was fascinating to watch it die, painfully slowly... until I realised I wasn't going to get that peach iced tea and pizza I'd been looking forward to for most of the afternoon. I got the card back the following day, but it has carried the scars ever since. It looks like it got run over by a very small bicycle with particularly filthy tyres.
(Oh, and I've had my card skimmed too, with the consequent sorting-out involving giving a police statement and a 60 km round trip -- in a borrowed car as I don't own one -- to sort out the paperwork. Still, that's Italian bureaucracy for you. At least it got sorted, though I was without any money at all for about three weeks.)
Or you could just plug in an external 'booster' battery.
I really don't get what the big deal is over 'replaceable' batteries. How is popping a fiddly case off, easing out the drained battery, digging around your bag for its replacement, fitting it, then popping the back of the phone back on "better" than just plugging in an external battery?
Either way, you'd be carrying two things around with you instead of one.
The problem the article highlights is that there is software that is widely used in many GNU/Linux distributions that is effectively *unsupported*. There's nobody behind it, accepting bug reports, or actively maintaining the code. This is the exact *opposite* of the Linux kernel developers!
I'm no fan of the GNU/FOSS communities in general, and certainly not its tedious politics, but, for once, Linus Torvalds is not the problem here.
The problem is one we used to call "bit rot". Old code, in common usage, that is a time-bomb waiting to happen. Not just because of potential security risks either, but because an important API that code relies upon might change tomorrow, causing all sorts of major headaches.
Such code thus also acts like a drag on improving such APIs, holding back development and forcing programmers to jump through additional hoops to keep that old code happy. And it's those hoops that cause hackery and kludges to appear in code, opening up wonderful new possibilities for bugs and security issues.
It took them long enough, but Microsoft have finally learned to say, "No!" to continued legacy support in Windows precisely because of problems like these. (Apple never said "Yes!" in the first place; they explicitly state that each major new release of OS X may break old software.)
The UK's statute law system is the exact opposite of the Roman (or Napoleonic) systems used across most of continental Europe.
In the UK, the fundamental principle is that laws define what is illegal.
In most of continental Europe, laws define what is legal.
Harmonising laws across the EU is therefore not a trivial process. British MPs cannot simply copy and paste a translation of the French or German version.
1. Obtain a sufficient quantity of snakes.
2. Kill snakes. Kill them! KILL THEM ALL! MUAHAHAHAAA!
3. Bury inside crust of planet-sized pressure cooker.
4. Simmer for approximately 65 million years.
5. Extract snake oil and serve with fresh, or tinned, war.
6. Sell weapons to all sides.
"École polytechnique fédérale de Lausanne's Center for Space Engineering and Signal Processing 5 Laboratory (LTS 5)"?
What happened to the first four? Did the previous one get sucked back in time? Did they all sink into a swamp? What are the Swiss really up to? Will Penfold ever finish his egg and cress sandwich? Did I leave the gas on? Tune in next week!
"Has the world learned nothing from Ernst Stavro Blofeld?"
Well... YouTube is mostly cats. Does that count?
* (Technically a partial re-opening. The original route was mostly rebuilt as part of this project, burying more of the track underground and removing an unpopular railway bridge that blocked a view of St. Paul's.)
As you appear to be taking requests, I'd like you to explain why it costs so bloody much to build anything in the UK.
I ask because the Swiss have built the Gotthard Base Tunnel for about £6bn (including the fiddly bits at each end to connect it to the existing networks), while HS2's first phase (to Birmingham) – roughly 100 miles through relatively easy terrain – is priced at almost *three times* that amount, despite not having a major mountain range to contend with.
I've noticed that road-building is also very poor value for money. The recently opened Hindhead Tunnel (on the A3) cost £371 million to build and was extensively covered in the national news, yet the Swiss and Italians point and laugh at such a piffling little project. Both countries build such tunnels as a matter of routine, even on relatively minor roads, and they cost a bloody sight less than £155K per metre!
One small problem with your argument is that it doesn't fit the available evidence. See those 125 mph diesel trains roaring out of Paddington and St. Pancras stations? They're roughly 40 years old and were British Rail's "Plan B" project, in case "Plan A" (the infamous tilting 'Advanced Passenger Train') didn't pan out. The reasons why Plan A failed could, and indeed have, filled a book, so I won't repeat them here.
Like the BBC – another 'nationalised' industry – British Rail handled a lot of R&D in-house back then, and made quite a bit of money from patents and other revenue streams. There were some damned good managers back then. They *had* to be good, or BR would have collapsed into a basket-case and ground to an embarrassing halt long before privatisation finally came. They were operating on a minuscule budget that was a fraction of that granted to their continental peers. And yet, this was the same British Rail that opened* the cheap and cheerful Thameslink route, modifying it to push the tracks underground and sell the land occupied by their problematic Ludgate Hill and Holborn Viaduct stations. And for afters, they then started work on Crossrail! This was all long before the Major administration came to power.
Nationalised industries have no monopoly on bad managers, as Nokia and Sinclair Research will attest.
The trick is to have people at the top who have a fucking clue what they're doing. Such people are genuinely rare, and it doesn't help that there's no formal requirement for any manager to have any relevant qualifications. (A related problem is managers who got very lucky once, and who therefore assume they're Codd's gift to their industry and end up repeatedly proving themselves wrong. Usually at other people's expense.)
We also have very skewed definitions of both success and failure these days. Businesses appear to be becoming increasingly disposable, particularly in the IT sector, and that's not a good thing in the long term. The end goal is often to have the company bought out by an existing player. Eventually, all the nimble little fish end up inside one of the bigger fish, effectively *reducing* competition.
Commuter services invariably cost more money than they make, while inter-city rail services often turn a profit.
This is one of those pioneering mistakes I alluded to in my earlier post: conventional rail is optimised for moving lots of stuff in bulk, over long distances, at speed. As long as there was no better alternative, the short-haul commuter services did fine, but they were never anywhere near as profitable as long-haul express services.
Once electric trams and, later, motor cars and buses appeared, the railway lost its monopoly advantage and their commuter services haemorrhaged money. The two world wars provided temporary boosts at best, but left the network utterly knackered and worn out, which is why the mainline railways and London's own Underground network were all nationalised shortly after WW2. There was simply no money to repair the damage. The various Underground companies were in financial trouble too. London couldn't afford to lose either the mainline railways or the Underground as they provided essential services, so taxpayers effectively bailed out the shareholders through nationalisation.
TL;DR version: the only part of a rail network that makes any profit is long-haul express passenger services. This is why the Japanese and French didn't build high-speed metros, but high-speed *express* trains. That's where most of the money in rail transport is.
Residual railway property is peanuts by comparison, so if George Osborne is looking to flog off more bits of railway land (and both Railtrack and Network Rail have already thought of this, so all the low-hanging fruit has already gone), he's going to be disappointed by the return. I've read that he expects to make £1bn out of it, but that's barely six months of construction work on Crossrail. There's no scope for a windfall here.
If Osborne wants to cut the burden of commuter railways on the taxpayer, I'd advise him to look into why it costs so f*cking much to build *anything* in the UK these days. HS2 isn't a bad idea in principle, but its price tag is ludicrously high. Phase 1 is just 100 miles of railway through piss-easy rolling countryside. There's absolutely no excuse for it to cost that much. None. Even allowing for HM Treasury's moronic "Optimism Bias" rules, it's taking the piss.
It was John Major's bunch of idiots who decided to privatise a natural monopoly that was also hamstrung by a number of issues that didn't apply in other countries. In fairness, when the plans were drawn up, nobody expected railways to last. At the time, passenger numbers had been falling for years, so the expectation was that the network would not need massive investment in infrastructure.
I hate to play the "special snowflake" argument, but it really does apply to the UK's rail infrastructure. The price of pioneering rail networks is that a number of mistakes were made. Other countries got to learn from and avoid those mistakes, but the UK is stuck with many of them and it makes the network rather more expensive to run compared to its cousins overseas.
The privatisation of British Rail by John Major's administration also managed to completely ignore all the most basic tenets of good (interface) design, but that's another rant. That no other country has even considered copying the UK's model speaks volumes for Major's incompetence.
I can understand why a graphics driver would be included in a monolithic kernel if performance were a major consideration, but a *gamepad* driver? Seriously?
Linux-based OSes are still primarily used in embedded, mobile, and server environments, where I'd argue security is far more important than a slight increase in frames per second while playing Tux Racer.
Apple's OS X uses a hybrid Mach/monolithic architecture that means only those drivers that *need* to run in kernel space do so, while other drivers are kept the hell out of there. I'd argue that this is a better model than an all-eggs-in-one-basket approach.
...is all that crud doing in a bloody kernel?
I seem to recall that microkernels were the in thing during the 1990s, but this predates the recent rise of cheap multi-core CPUs. (There were some high-end specialist CPUs that had multiple cores, but they weren't found in computers most of us could actually afford.) Today, even the Intel Atom x7 range comes with up to four cores as standard.
Could the microkernel architecture be more viable on today's multicore CPUs? Running *multiple* microkernels in parallel might also be an alternative, with each dedicated to a particular set of tasks. Just because UNIX and Windows have followed the traditional path of loading everything onto the shoulders of a single kernel, it doesn't follow that this is the way it *must* be done forevermore.
The above is why I get irritated by the focus on irrelevancies like Open Source and Software Libre. Not that they don't have their uses, but they've distracted the industry and led us down the path of a de-facto architectural duopoly: it's either Windows, or some flavour of UNIX. Windows' current kernel is a relatively recent design, but UNIX is getting on for 40 years old and still has neck-bearded lunatics gibbering on about 'human-readable' config files*.
The monolithic kernel design is inherently compromised as it puts all our eggs in one basket. Break just one of those eggs and the entire basket gets messy fast. Yet there is no mainstream OS architecture that doesn't follow this model. Windows, GNU/Linux, most BSD flavours, Android, iOS, and so on are all tied to this design.
This cannot be healthy for the long-term future of the industry.
Time for my dried frog pills, I think.
* I don't know what makes a text editor so special that it doesn't count as a dedicated file reader given that text is stored as numbers, just like everything else. Nor do I understand the fixation on storing config data in a format that usually *requires* the reader to be fluent in English. What's the bloody *point* of computers when you're going out of your way to make life *harder* for their users? But I digress.
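Since "text is stored as numbers" is doing the heavy lifting in that footnote, here's the two-line proof. A 'text editor' is just a viewer that interprets a byte sequence as characters:

```python
# Text is numbers wearing a disguise: encode a string and the
# costume comes off; decode the numbers and it goes back on.
raw = "Tux".encode("utf-8")
print(list(raw))                              # [84, 117, 120]
print(bytes([84, 117, 120]).decode("utf-8"))  # Tux
```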
A "formal or informal contribution of either money or equipment" = payment. No matter how you try and frame it, it comes down to offering something in return for a service.
The bugs in question have been sitting in multiple versions of Windows and Adobe's software for well over a *decade*, yet it took Mateusz Jurczyk—a professional hacker—to find them. Are all programmers the world over supposed to be able to match Mr. Jurczyk's abilities as a matter of routine?
Even if I had the entire source code to Windows, OS X, and the latest spin of Gentoo sitting on my hard disk, I wouldn't have the faintest idea where to even *start* looking for backdoors and the like, let alone how to fix them if I came across one. My current expertise lies in writing tutorials about flinging sprites around a screen using C# and Unity, not in untangling the source code to OpenSSL. And I once spent 15 solid months doing nothing *but* debugging other people's code, so thanks, but no thanks; I'll leave that to hardcore masochists.
Yes, it means trusting businesses, but how is that any different to trusting random folk on the Internet? I'm not wealthy enough to have money to throw at random strangers whose CVs could be complete and utter fabrications for all I know. When it comes to programmers with security skills, performing due diligence isn't optional.
So, I can just as easily have Apple, Microsoft, etc. "fix their source code" instead, with no need for me to have access to it. Better still, I actually get free, and (usually) timely patches and updates from both companies without having to lift so much as a finger!
I believe the current fashion among young whippersnappers is to add a "W3wt!" at this point, or something equally vacuous.
Good try, but expecting amateurs to fix industrial-strength cryptography code is a bit much. I understand the principles involved, but none of the maths.
Still, ignorance doesn't appear to be a barrier to some coders, or the Heartbleed issue wouldn't have happened.
As opposed to, say, game controller drivers? Or did you not read the article about the latest Linux kernel release?
Odd, I don't recall Spotify and their ilk being particularly generous with the ol' moolah either, and their free ('ad-supported') tier is no mere three-month lure. They're also famous for not paying much, (if anything), to the artists; it's the publishers who get the dosh.
Apple still offers their conventional "buy to own" iTunes music download service. Apple Music merely offers an additional method of getting access to music. And – oh, look! – Ms. Swift's oeuvre is still available in the iTunes Store! Including "1989 (Deluxe)" at a mere €11.99! So, she seems quite happy to take the Apple shilling when it suits her.
As others have pointed out, Apple aren't forcing artists or publishers to take part in the free three-month opening. Ms. Swift's complaint would be more valid if Apple were going to offer her catalogue of music for free *without* her permission.
The iSuppli Bill Of Materials (BOM) estimates – and they really are JUST estimates – completely and utterly ignore the product design and development efforts for both the hardware and software components. iOS is not licensed to any third parties; it's strictly an Apple-only OS and that means Apple need to meet the costs of developing that component as well when selling their iPhones and iPads. iSuppli and their ilk never include this in their estimates.
Nor do they include the costs of other loss-making aspects of producing these products, such as providing developers with up-to-date documentation and support, maintaining iCloud and the synchronisation back-end, and so on. Only iTunes and the App Stores are intended to at least break even, or make a profit, though none of these provides more than a fraction of Apple's revenues. The rest of it is loss-making, but is required to provide features like Continuity. (As Samsung, HTC, and their peers don't make desktop PCs, they're limited in how much they can support such things. Android has no desktop counterpart as such.)
All of these things add up and form part of the real BOM.
Android handles *all* of that shit for Samsung, HTC, etc., so iSuppli's BOM estimates for those companies' devices are much closer to the mark. They invest next to f*ck all in Android's development. Yet somehow, Samsung still manages to find a way to charge iPhone prices for their "flagship" Android phones.
You'll be waiting a long time then. Nokia won't be designing a thing, "high quality" or otherwise. The report implies they're just going to pimp the brand name and let other companies stick it on their own stuff.
Seems fair enough to me: the "Nokia" brand name still has some value; why not milk it for all it's worth?
Elop leaving Microsoft is no big surprise: Microsoft did not want to buy Nokia's bloated and (obviously) underperforming mobile phone division! Elop pushed it through regardless, giving Nokia's shareholders a decent chunk of money. (Anyone should have seen the axe of job cuts coming a mile away. Seriously, how is that a surprise?)
A business is a machine that makes money for its owners. How that money is made is a mere detail. Nokia have been active in logging, rubber products, and more. Mobile telephony is still a key sector; they just prefer making the infrastructure now, rather than the client devices.
Elop didn't leak that memo himself.
It is worth reading the memo, if only to clarify exactly what it was he wrote. This wasn't an "Osborne" moment; it's clear from the content that Nokia were *already* in serious trouble. (The full text can be found in this article. I assume El Reg also has it somewhere, but I can't find it with their search engine.)
Although it's fair to say the writing was on the wall long before Elop turned up, it was the idiot who leaked that memo who doomed his colleagues and destroyed any future the mobile division might have had up until that point. The memo was posted to an internal intranet, not to the media at large.
Apple pay Foxconn to build their stuff. Last time I checked, Apple did not own Foxconn. These are Foxconn's employees, not Apple's. Apple aren't even Foxconn's biggest client.
Also, the value of wages needs to be considered in context: 8 bucks might barely cover two coffees in the US, but it goes much, much further in China. That's why Foxconn and Quanta have no shortage of willing workers queuing up to work for them.
This is a pattern that has repeated itself a number of times in recent history. Even the UK went through this phase.
You may want to check your facts. Apple dropped their skeuomorphism (along with Scott Forstall, who championed it) with iOS 7. This was released in 2013, not 2014, and was the first version of iOS to go 'flat', though iOS 8 did include a lot of visual refinements. (In any case, Apple aren't about doing something first, but about doing it *right*.)
The main problem with Android is Google.
Why would I pay $600+ for a "premium" phone that runs an OS made by a company that treats *me* as the product, and my own personal information as their corporate property? If I'm going to spend that kind of money, I'd rather spend it on a product that ensures I have full control over my own data. That means Apple or Microsoft.
As long as Samsung and HTC insist on relying on Google for their core software components, they're going to struggle to beat Apple at their own game.
Either there are people running businesses in the US who haven't heard of ATMs, or they're so rich they literally assume all us little people exist to pay for things in cash on their behalf, so they don't have to dirty their hands with actual banknotes.
Or they're just incompetent and incapable of proper planning.
Why would anyone want to work for such people?
Nobody is "required" to give up their privacy. If you don't accept the software's terms and conditions of use, you have the *choice* not to use it. I don't know which part of this Stallman doesn't understand, but it's not neuroscience.
That's why I prefer to use OS X or Windows over anything that comes tied to the GPL. At least with Apple I know they're making the vast bulk of their money from selling actual products, rather than nickel-and-diming me while selling on my personal data.
Despite all Stallman's rhetoric about Free Software being "free, as in speech", it's patently obvious to anyone who isn't a Demis Roussos lookalike that the vast majority of GNU/Linux's success can be attributed primarily to the "free, as in beer" aspect, and not a lot else. Even the companies making those cheap and nasty routers are using it because it saves them having to actually pay for a team of competent programmers. And this is why Heartbleed happened.
Apple owners tend not to be penny-pinching misers and are therefore quite happy to pay for stuff they want. (If anything iOS App Store prices have risen a bit of late.) In fact, Apple are a very conventional company: they make their money selling physical products; the software they make is just one of the many components used to make said products, and is often also used as a kind of loss-leader.
Similarly, Microsoft would have to be utterly batshit insane to mine data from their customers given that most of said customers are businesses—many of them major multinationals who would give Microsoft a Very Hard Stare if they caught MS doing something that stupid.
I'll leave the situation with Android as an exercise for the reader. Suffice to say that there's a reason why the "Freemium" model has proved so successful on that platform. (Hint: it has the potential to provide much more granular information about the user. I wonder who'd be interested in that?)
...one of Stallman's claims is that FOSS is 'better' because it is less prone to security issues thanks to the (trivially disproved) myth of the 'many eyes' approach. (Stallman's egotism in his Guardian rant really doesn't help his anachronistic* 'cause'. With friends like him, the GNU and FOSS communities really don't need enemies.)
Problem is, you need those 'many eyes' to be bothered to look into all that Open Source code first, and wading through yet another router's firmware is pretty bloody tedious and ranks right up there with writing user guides and translation work in its thanklessness. Nobody in their right mind would do it willingly, unless they were being paid to do so. If Heartbleed taught us anything, it's that it's really, really hard to get volunteers interested in doing the boring stuff.
Note that the article clearly states that the research was done—and paid for—by ESET. This wasn't a FOSS community effort. ESET would happily research and highlight flaws in proprietary software too given their line of work.
* (Open Source and all that "Free as in Speech" stuff made sense back in the 1970s and early '80s, when computers were still mostly the preserve of lab-coated folk who could actually program in rooms filled with Winchester disks and photogenic tape drives covered in blinking lights. It makes no sense today. What's important now is the _data_, not the code. As long as we can access our data, it matters not one whit whether the code that originally created it ceased being run a decade earlier. The code doesn't matter any more. It's become a disposable component of a much greater whole.)
...but, unlike Moses' ocean-splitting escapades, there was no *live footage of the fucking planes hitting the buildings within MINUTES of the events*.
This is 2001 we're talking about. Nobody could have rendered the footage that quickly to match the real-world visuals. Remember, people had camcorders too back then, and they did use them. How easy do you think all that footage would have been to fake as well?
As for "The WTC shouldn't have fallen down": the Twin Towers had an unusual design whereby *all* the structural strength of the buildings was in the outer skin. This allowed the rentable office space to be much, much bigger and less cluttered with structural elements.
Boeing's jet airliners fly at close to the speed of sound. That's a lot of kinetic energy. No building would be able to keep the plane out.
Once a plane had passed through that crispy outer shell, there was only the soft centre in the middle to plough through and destroy. (The poor fireproofing didn't help either. This is why the buildings actually collapsed: the explosions and burning fuel softened the floor structures until they gave way.)
That there was so little *obvious* aircraft wreckage is trivially explained by that outer skin, which effectively shredded the planes as they passed through, rather like a cheese grater.
The Twin Towers collapsed because their design had fundamental flaws. Contrary to popular belief, architects and engineers are not infallible. Even Isambard Kingdom Brunel made plenty of mistakes.
Freedom of Speech / Expression does not imply Freedom from Offence: in fact, the two are broadly incompatible. However, it also does not imply Freedom from Consequences.
That said, the Internet is 90% privately-owned infrastructure, so those freedoms and rights are not as relevant. You agree to abide by the private owners' rules and regulations when posting on forums and the like, many of which explicitly ban offensive statements and remarks. Because—contrary to popular belief—they're perfectly entitled to do so.
Private entities (including individuals) are entitled to do this for the same reason people are allowed to call the police if some nut-job decides to stand in their front garden and shout obscenities at their family day and night.
The whole "Freedom of Speech" (and "Right to bear arms") stuff is intended to protect us from *bad government*. They're primarily political rights and responsibilities.
"Odd...nobody mention the average quality sound that the iProducts deliver"
Apple's devices tend to be rated quite highly for sound quality, with only the occasional exception over the years.
Besides, nobody's plugging these into a high-end NAD amp and multi-thousand-quid speakers anyway, so who gives a toss? The paying public—the only subset of "the public" that matters to any business interested in making a profit—certainly doesn't.
Elop, for all his flaws (and apparent ignorance of the Osborne Effect), was "parachuted" into a basket-case of a company whose woes date back to the adoption of a platform, Symbian, that was designed for the mobile technologies of the 1990s, not the 2000s, let alone the 2010s.
Symbian shared many design flaws with Microsoft's much-loathed MFC. And for similar reasons: both were built on C++, but that language was still in flux during the 1990s, with the result that many OOP concepts were not particularly well implemented in the language itself and required all sorts of hacky workarounds. The most visible example of this was Symbian's decision to roll its own memory management system, which made your code much more difficult to port.
Symbian was also clearly designed by people who genuinely believed that 'correctness' was the most important thing, so the API demanded pages and pages of sub-classing, just to perform the most basic of operations. And they didn't give a damn about making the developer's life easier. No decent IDEs, no design tools, no consistency across the UI-layer APIs. Nada. You had to do it all the hard way. Because, masochism. Think GNU / Linux development circa 1998.
So, Nokia's main platform was a technological dead-end. It was never going to work in an era of mobile devices powerful enough to run proper, hairy-chested, grown-up operating systems like *BSD, Linux, or Windows NT. Its lifespan was therefore inherently limited. But Nokia's management had no consistent Plan B. They bickered endlessly, duplicated efforts, and let petty empires and management politics get in the way of running a damned company.
Nokia's original management team royally f*cked up. This was NOT Elop's fault as he wasn't there during any of this.
By the time Elop was brought on board, the mobile division was already a massive basket-case and clearly in its death throes. Elop looked about desperately to find an alternative to Symbian. In 2010, Android was still very half-baked and no better than any other option. (This was the Android 2.x period.) At this point in time, Windows Phone looked just as viable as any other option, and Elop had worked for Microsoft, so he presumably felt having contacts there would be useful. He didn't know anyone at Google.
Elop did make some mistakes, but the leaking of the "burning platforms" memo is not something he can be directly blamed for. (My money's on it being leaked by a disgruntled member of the outgoing management team.) This resulted in the predictable Osborne Effect, effectively destroying what little chance the mobile division may have had of surviving intact as part of the larger Nokia group. With nobody buying their old tat, Nokia had to find a buyer for the division.
Elop's connections at Microsoft made the latter an obvious choice, but Microsoft weren't actually that interested. That Elop pulled off the purchase regardless is therefore not a "failure" as many seem to believe: had he not achieved this, there would be no Nokia Lumia phones today; the division would most probably have been closed down entirely. Believe it or not, this made Nokia's shareholders happy, not sad. They don't care how their goose makes those golden eggs, as long as it keeps laying them and making them money.
"The problem was that Apple failed to sue Google and the Android Open Source Project for copying the Apple designs..."
Apple started suing Samsung (and others) back when Android was still a babe among mobile operating systems, and looked so much like iOS that nobody was seriously attempting to explain the similarities as anything other than a very sincere form of flattery. Some manufacturers today are still chancing their arm by blatantly ripping off iOS. If you don't believe me, check this out. Samsung's early Android phones were pretty similar rip-offs, though they've moved a long way away from that approach since then, helped by improvements and changes in later Android releases.
The reason why Apple didn't sue Google directly is simple: Google don't charge for Android licensing, so it's difficult to prove that Google benefit materially from ripping-off iOS given that Android is deliberately designed to be skinnable, and has been pretty much since the beginning. (Otherwise, that rip-off Chinese device linked to above wouldn't be possible.)
Google didn't get where they are today without learning how to cover their corporate arse. Suing the licensees rather than the licensor therefore makes a lot more sense.