60 posts • joined 16 May 2007
It's a myth that the Brits only preserve their older buildings. We have *loads* of those, so we tend to preserve only those with unique elements or important historical ties. (For example, there's no shortage of Georgian and Victorian-era housing.)
1920s is still "old" for a Brit; like most cultures, we use living memory and lifespan as references for this concept. Plenty of buildings from the inter-war years have gained Listed preservation status. These are mostly one-off buildings of genuine architectural merit.
The difference between the Old and New Worlds is that we don't consider "old" to be remarkable or special in and of itself, any more than Americans gaze upon skyscrapers or air-conditioned homes with awe and wonder.
The rise of the preservation and heritage movements in the UK grew out of the destruction of the original Euston railway station in London and its replacement by the grotty little box which still stands there today (and is now being considered for demolition). This triggered the creation of preservation movements which were directly responsible for the survival of St. Pancras railway station in London, now refurbished and adapted as the terminus for the Eurostar services to mainland Europe.
My personal view is that anything proposed for formal preservation should be bought from its owner(s) for the price they paid for it. If nobody is willing to stump up the cash to do this, it's clear that the building isn't *that* well-loved, so let it go. We have the technology to record any building or relic for posterity should we need to demolish it, so we wouldn't be losing all trace of the structure forever.
With regard to Steve Jobs' problem: I agree with Jobs. It's an old, impractical building with little architectural value. It would be uneconomic to restore as few people would be willing to live in such a building today. (No decent insulation; no modern electrical system; far bigger than it needs to be for the smaller families we have today, and so on.) It's old, but old doesn't automatically imply *worth preserving*. Let it go.
"So you don't get in one of those nasty motor cars which can drive towards each other at a closing speed of 120mph (if neither is speeding) with only a painted white line to separate them?"
Not if there's a viable alternative, no. I used to live near an accident black spot in south London; any attraction cars held for me as a kid evaporated after I saw my first corpse. I was about 12 at the time.
I only learned to drive four years ago. (I'm 38.) I drove to Rome and back a few times without any trouble. My Skoda Octavia went back to the finance company last year and I don't miss it.
Mind you, south London has appalling transport infrastructure no matter what mode you use. Just getting up to 20mph is a feat.
You really, really have no clue, do you?
Seriously. Go do some research into the history of transportation and the evolution of signalling systems. Here's a clue: cutting the power to the electrified track section behind a vehicle is something the London Underground was doing over a *century* ago. (You can still see the mechanical train stops, which strike a passing train's tripcock, near the signals on the sub-surface lines: they're the little pedals which rise and fall. These won't be around much longer though.)
The fact that this incident involved a tram is utterly irrelevant: the SPAD was the first, easily preventable, error here -- the crash was a consequence, not the cause. The technology to stop trains automatically after a SPAD event is decades-old, tried and tested. There is no excuse for still running passenger services without any such technology and I would refuse to travel on any system which didn't have it fitted.
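The trainstop/tripcock principle mentioned above is simple enough to sketch in a few lines. This is purely illustrative: the class and method names are mine, not any real signalling system's interface.

```python
# Minimal sketch of the mechanical trainstop / tripcock principle:
# whenever a signal shows danger, a trackside arm is raised; any train
# that passes it has its brake valve tripped automatically -- no driver
# action required. All names here are illustrative.

from dataclasses import dataclass

@dataclass
class Signal:
    at_danger: bool  # red aspect

    @property
    def trainstop_raised(self) -> bool:
        # The trackside arm is raised whenever the signal shows danger.
        return self.at_danger

@dataclass
class Train:
    brakes_applied: bool = False

    def pass_signal(self, signal: Signal) -> None:
        if signal.trainstop_raised:
            # The raised arm strikes the tripcock on the train,
            # venting the brake pipe: an automatic emergency stop.
            self.brakes_applied = True

train = Train()
train.pass_signal(Signal(at_danger=False))  # clear signal: no effect
train.pass_signal(Signal(at_danger=True))   # SPAD: brakes applied automatically
assert train.brakes_applied
```

The point is that the safety action depends on nothing but the signal's aspect and a bit of trackside hardware; the driver is taken out of the loop entirely.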
As an aside: infrastructure construction, renewal and maintenance costs are usually *higher* when you use human drivers. Computers don't need colour lights on steel poles or floodlit speed-limit signs all along the route. Nor do you need to worry about the logistical nightmare of ensuring all your computer drivers get their scheduled breaks.
In this case, however, there's a valid reason for not using something like the DLR's "SELTRAC" automation: there's currently no equivalent system for street-running trams. (Yet.)
Re. UK TV
"I lost a great deal of respect for the BBC when I discovered that they actually created a clone of American Gladiators!"
"American Gladiators" was cloned by Sky, a News Corp. company owned by a certain Mr. Rupert Murdoch and his family of drones. The same guy also happens to own The Sun (a British tabloid newspaper) and The Times (a well-known broadsheet). Oh yes: and his tribe owns the FOX network in the US too.
In fairness, shows like "American Gladiators" aren't new to the UK: we had "It's A Knockout" way, way back -- part of the once-popular, EU-wide "Jeux Sans Frontières" TV franchise (1965-1999).
Sky also happens to have an (effective) monopoly on what passes for the UK's satellite TV "market". It's about as "British" as kangaroos and Kylie Minogue.
Not patent trolling.
Apple are doing this for the same reason all major corporations do: to protect their work. Whether they are *granted* the patents they apply for is another matter as the AC poster (12:07) above has pointed out.
Corporations are increasingly spamming the patent office with every potential innovation -- regardless of whether it ever makes it into a product -- precisely BECAUSE of patent trolls! If they don't do this, someone else (with no intention of actually producing any tangible product) very probably will. This has been proven repeatedly.
Apple, unlike a patent troll, demonstrably produces products which use their own patents.
(The amount of work that has gone into developing a patentable innovation really ought to be taken into account before that patent is granted. An entity which has spent good money and put a lot of effort into research and development should also be allowed to hold patents on the fruits of their labours. A shell corporation with no investment of its own into R&D should not. This would kill off patent trolls overnight.)
You first, Viviane!
The EU merrily pisses away millions of Euros, and gigawatt-hours of energy, shuffling between Bruxelles and Strasbourg every six months. How about the EU stops doing that first, before demanding efficiency savings from everybody else?
Encarta's World English Dictionary...
...is produced in the UK by Bloomsbury Plc. It's one of the few printed dictionaries I own and it really is rather good. (Excellent layout too.)
Encarta suffered from never having a reputation as good as, say, Britannica's.
Wikipedia has two problems, not one.
The first is its lack of authorial transparency and responsibility. (There are alternatives which do a better job of this. Google's "Knol" is but the most recent example.)
The second is that Wikipedia's very popularity is clearly at the expense of second-sourcing. Nobody, but nobody, seems to bother checking anything *other* than Wikipedia for information. This is wrong and students should be taught from an early age to ALWAYS double-check important information. Find a second, independent, source. Use it to corroborate (or disprove) the first. If your two sources contradict each other, find a third... and so on. This is basic research, not rocket science.
That Wikipedia accounts for so much of the online encyclopaedia audience is a testament to just how few people know how to second-source.
So it's official...
... the first human-alien conversation will consist mainly of human and alien nerds showing off how many significant digits of Pi they can recite, while complaining that "BOFH" isn't what it used to be.
"It should only be considered illegal if people are profiting from free content. As long as one doesn't use this to earn money but for leisure of themselves it should be fine." - AC (03.20)
You know, even back in the days of the travelling harpers of Ireland, composers and entertainers were treated with more respect than you've suggested.
Michelangelo, Leonardo Da Vinci and Turlough O'Carolan were all *recompensed* for their work. And copyright didn't even exist back then.
Explain, please, why you demand people provide you with free entertainment. Go on. I'm listening.
Yes, "Small stuff".
BSG had one major flaw: handheld camerawork. There was absolutely no justification for this. It's not like the characters ever demanded to know why there was a documentary film crew following them around. (A film crew which clearly couldn't afford a Steadicam rig.)
The second flaw -- one common to most 'remake' attempts -- is the notion that it's okay to take a cheesy, kids TV series and dress it up in adult clothing in an attempt to make it feel more grown-up. (Presumably so the fans of the original show, who are now adults, will watch it.)
Contrast with Blakes 7 -- a (low-budget) kids' TV SF show, but one set in a dark, dystopian future where the lead characters were regularly killed off -- which never needed any such dressing-up. Sure, it had to resort to painted footballs for planets, but the writing never dropped below "good". It was often excellent.
Blakes 7 was dark, scary, believable (for the day) and, most importantly, genuinely *entertaining*. It was never a chore to watch. Even at its worst, it was still fun, even if only to watch the actors trying their hardest to be scared of blatantly cardboard aliens. Yet this was a series about a dysfunctional group of terrorists / freedom fighters trying to overthrow a corrupt government. (And they actually *lose*! No cheesy "winning against all the odds" bullshit here.)
(Blakes 7 -- the lack of a possessive apostrophe still irritates me -- was the inspiration for Babylon 5, by JM Straczynski's own admission. B5, unfortunately, suffered from the same problem the current Doctor Who has: "Writer / Producer Syndrome". It's clear that the production workload often resulted in sub-par writing when the same person was responsible for both.)
The Science Fiction and Fantasy genres are among the *oldest* in modern literature. ("Gulliver's Travels" is often cited as one of the first proper novels.) SF which "asks hard questions" is a tiresome cliché, but BSG didn't even go that far. It was stuffed full of Deus Ex Machina elements. The only thing it says to the viewer is "Meh! Shit happens! Deal with it."
Programming Languages Considered Harmful
Programming languages are the CLIs of development tools. They're inherently linear, because *languages* are inherently linear -- we don't know how to read any other way. This was fine as long as computers relied on single-thread, in-step CPUs, but it's increasingly untenable today.
I started programming in the days of the ZX81, when BASIC still had line numbers and assembly languages were real, furry assembly languages designed by people who had bothered to look up "mnemonic" in a dictionary first.
I've seen people hype procedural programming, modular programming, OOP, functional programming and more. Yet those programming 'paradigms' are just attempts at nailing structural and organisational UI features onto a written language without any thought for whether this is the right place for it. It's 2009 and we're still using software development tools designed in the days when hard drives were called "Winchester Disks", punched cards and paper tape were still in use, graphics were monochrome and blocky, and the Internet was still wearing nappies and crying for its DARPA.
Sure, those tools have gained WIMP GUIs to help us place buttons and list-boxes, but look under the hood and you see the same old dumb, flat, text files and archaic, linear programming languages.
Programming languages have had their day. They're not the solution. They are the *problem*.
What's wrong with perfectly good Anglo-Saxon words?
If you have a problem with the swearing, perhaps you shouldn't be reading a website produced in a country where "Oi! WANKER!" is considered a friendly greeting among close friends?
See that ".co.uk" bit in the URL? Guess what it means. (Hint: it's named "English" for a reason. Perhaps Americans should consider renaming their bastard version "English Lite"?)
... should never have been made London's primary airport.
Boris Johnson might be a bit of an upper-class twit, but the notion of relocating Heathrow to somewhere a bit more open -- and far less constrained by urbanisation -- is a sound one. I live in North Kent, where there's an awful lot of flat, open, brown-field land from the area's industrial past. (Grain, for example, is a steaming huge lump of bugger all, perfectly situated for an airport. Granted, there are bound to be some protests against such a move because it's practically a British tradition that nothing ever be allowed to actually *happen*, but a four-runway airport there would fit easily. It's close to two rail routes too -- via Gravesend in Kent, and with the LTS line just across the river in Essex. You could then close Heathrow entirely, and possibly Stansted or Gatwick too.)
(Incidentally, I'd rather see TGV replaced by Maglev technology. Yes, it's new and expensive, but they said that about railways once too. TGV is only a little faster than our ECML and WCML routes, at around 200mph. Maglev has topped 360mph in tests! Sod Paris! Imagine travelling from London to *Rome* in a few hours, by train!)
"I've worked in countless companies where vast teams of analysts, programmers, helpdesk operatives etc. are retained and paid to fix, maintain and otherwise run proprietary systems and applications. This is ADDITIONAL to the enormous sums paid to the vendors in 'maintenance' and 'support' fees. That's also not counting the never-ending effort to train users to work the software. Yes, you still have to train users to use M$ Word, and no they don't 'just get it' any better than if you stuck OpenOffice under their nose instead."
I note you spell "Microsoft" as "M$". Quite what an old BASIC variable has to do with Microsoft escapes me, but I tend to find most people who use that particular variant seem to believe it is (a) funny, and (b) implies Microsoft are only about making money.
Point (a) is demonstrably incorrect. Please stop it; it just makes you look like a tool. (I'd post the traditional link to XKCD's take on it, but I'll assume you can use Google.)
Point (b) is a fair observation, but one which is equally applicable to pretty much every large OSS company like RedHat and Canonical, neither of which offer support for free. ALL businesses are machines for making money. It's not rocket science. Whether you agree with Microsoft's particular business tactics isn't particularly relevant here; last time I checked, Microsoft didn't make databases named "Oracle" or "DB2", nor are they in the same price-gouging league as Adobe.
I'll say it again: ALL businesses are there to make money. RedHat is a business.
Finally, your attitude towards Microsoft-specialist IT engineers is simply bizarre and bordering on insulting. Do you seriously believe that because something is *easier*, it is therefore less valuable or worthwhile than something which is *unnecessarily hard*? Is administering a UNIX server really all that more manly and macho than administering a Microsoft one which does much the same things, only more easily and with far less tedious mucking about with abstruse text files?
Or have you considered that maybe, just maybe, the reason Microsoft-specialist engineers are *cheaper* than their UNIX-skilled counterparts is because there are *more* Microsoft-specialist engineers around? The law of Supply & Demand is a fairly basic one in economics. Maybe you should try doing some actual research on a subject before commenting on it.
It's people like you who make businessmen -- you know: those people who don't sit in front of computers all day and have a *life* -- treat the GNU and FOSS movements like the whackjob pseudo-religious cults they truly are.
(Can we have an "Evil RMS" icon please?)
I handle customer support for a downloadable games website based in the US. Mercifully, I do this in my spare time, mainly on weekends. Want to know what the most common operating system our customers use?
In second place, their operating system is simply "Microsoft".
I kid you not. (And this is despite the fact that only three games run on anything other than Microsoft Windows.)
The reason Linux isn't winning on the desktop is because most computer users are *ignorant*.
(NOTE: Not "stupid". That's insulting and I hope those posters who insist on applying such labels to their users don't do so to their faces. Ignorance is most emphatically *not* equatable with stupidity. Most people are ignorant about the finer details of quantum mechanics. This doesn't make them "stupid". It simply isn't possible to know absolutely everything about everything.)
99% of the computer using public simply cannot distinguish the point where 'software' ends and 'hardware' begins. It's just a machine. An appliance. A tool for getting something done, be it writing a letter, sorting a spreadsheet or playing a game. Microsoft aren't _winning_ the desktop war. They've already _won_ it. (Apple came second.)
Linux's future is in embedded and niche markets because it allows one thing neither Microsoft nor Apple will ever do: rebadging. This is why you see it on the Asus Eee, on servers, in routers and other devices, where the OS is an anonymous, hidden component which stays the hell out of the user's way. Windows won't be going there any time soon. Congratulations! You won something! Here, have a medal.
The GNU Project has been around since the early 1980s. Is a 1970s-throwback of a UNIX clone and some mediocre commercial software rip-offs really all there is to it? Colour me disappointed. It could -- and damned well *should* -- have been so much more.
The "too much choice" argument against Linux is a telling one. I spent years working in the games industry and "meaningful / interesting" choices are key in designing a successful game: sure, you could make an RPG where literally _anything_ is possible, as in the real world, but the real world is, for the most part, quite dull. Most choices we make in our daily lives are so tedious that we don't even remember making them.
There's a damned good reason why Apple deliberately *limit* choices on their products, and now you have an idea why. Why provide fifteen different text editors? How many *ordinary users* ever even USE a text editor? (Seriously! When was the last time you saw your mother or aunt fire up Notepad?) Who CARES about Emacs? Who gives a toss about VI? Only the techies and geeks! These are the people who will cheerfully build a Gentoo system from scratch. If your distro isn't aimed at these people, STOP CATERING TO THEM. They are NOT your audience!
Is it beyond the wit of a Linux distro team to run some polls and _decide_ which apps are best-of-breed -- and I use the term loosely -- in the Linux ecosystem? Is it beyond their ken to just -- oh, I don't know -- get off their high horses and go _look_ at Apple's implementation of their "App Store"? Build something similar for Linux so all those other apps can still be made available to those who want or need them, but don't blind people with meaningless, pointless choices.
No, "apt-get" on its own is not enough and neither are most attempts at GUI-fying it. It needs to be capable of handling dependency issues silently, behind the scenes. Most importantly, it must include _meaningful metadata_ about each app. User ratings, for example. User reviews. Some kind of feedback loop which will make Linux developers actually sit up and take notice of what the ordinary punter really thinks of their software. As long as the only feedback developers see is from fellow developers, they will never, ever, produce anything Joe User wants to use.
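The kind of metadata I'm talking about is trivial to model. A rough sketch, with entirely hypothetical names (no real distro's packaging format is being described here):

```python
# Sketch of per-app metadata a Linux "app store" front end could layer
# on top of the package manager: dependencies resolved silently, plus
# the user ratings and reviews the post argues for. All names are
# hypothetical and purely illustrative.

from dataclasses import dataclass, field
from statistics import mean

@dataclass
class AppEntry:
    name: str
    summary: str
    depends: list            # resolved silently, never shown to the user
    ratings: list = field(default_factory=list)   # 1-5 stars from real users
    reviews: list = field(default_factory=list)

    @property
    def average_rating(self) -> float:
        return mean(self.ratings) if self.ratings else 0.0

def storefront_order(catalogue: list) -> list:
    # Surface the best-loved apps first, instead of an alphabetical wall
    # of fifteen near-identical text editors.
    return sorted(catalogue, key=lambda a: a.average_rating, reverse=True)

catalogue = [
    AppEntry("editor-a", "Plain text editor", ["libfoo"], ratings=[5, 4, 5]),
    AppEntry("editor-b", "Another text editor", ["libbar"], ratings=[2, 3]),
]
assert [a.name for a in storefront_order(catalogue)] == ["editor-a", "editor-b"]
```

The ratings field is the feedback loop: developers see what ordinary punters think, and the front end uses the same data to rank what gets shown first.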
Of course, there's nothing anyone can do about Linux developers who don't _care_ about Joe User. I am also assuming that these developers are in the minority in the Linux scene, which admittedly flies in the face of the evidence.
I demand my browser hijacks, viruses and endless, endless bloody "[INSERT SOFTWARE TITLE HERE] wants your attention! DIDN'T YOU HEAR ME? TALK TO ME NOW, SCUM!" notifications interrupting me while I'm working.
(Guess which of the above actually made me stop using Windows. Hint: it wasn't the malware.)
Steve is not Apple.
I wish Steve all the best, whatever his health problems are. However, I do wish the frequent cults of personality this industry plays host to would cease.
Jonathan Ive -- the industrial and usability design brain behind many of Apple's recent successes (and arguably far more deserving of credit for the iPhone, iMac and iPod) -- was hired way back in the early '90s, long before Jobs' reappearance on the scene. (Ive was already heading Apple's design group when Jobs' NeXT company was bought by Apple at the end of 1996.)
Jobs is responsible for Ive now being the VP of Industrial Design, so Ive is the name to drop when drooling over the specific aesthetics and design choices Apple makes.
Jobs is a salesman who understands technology and how people use it, but Apple won't fall apart as long as there is *someone* there willing to assert that "'Good enough' is NOT good enough!" This industry needs far more people with the same attitude. Good on Jobs.
(And no, I don't own an iPhone. I own a Nokia 2630; the cheapest phone I could get with Bluetooth. I do own an early '08 MacBook Pro though.)
"Where are the Olympics?"
The Olympics run for just two weeks and few people watched every single event. More people watched the opening and closing ceremonies than watched any of the actual sporting events.
Contrary to popular belief, sport is not followed by the majority of TV viewers. Even major football events rarely get more than half the audience of an episode of "Doctor Who".
Quality control & development.
" I mean it should be pretty easy to notice that 30% of your products overheat during normal use, and fix the firmware before they hit the streets."
Apple -- like many others -- perform development and testing on prototypes built in-house or in small batches by their chosen partners. At that stage it's hard to tell whether a problem is a statistical QA issue in the production process or simply someone forgetting to solder a wire properly.
In *theory*, the full production line, once up and running, will be cranking out machines of identical quality, but if one of your own key suppliers -- in this case, NVidia -- isn't being open and honest with you about their own QA, there's not much you can do. Apple aren't the only ones to have been caught up in NVidia's recent shenanigans.
Similarly, Apple tend to add new technologies to their new models every so often. In such cases, there are often teething troubles when the production line is first started up; problems that affect only 30% of the units coming off the line wouldn't necessarily show up in prototyping. (Especially if those problems are caused by bad batches of components getting through the system.)
QA is an *ongoing process*, not something you do once, prior to production. The more complex the product, the more likely it is that problems will be discovered in production. Even the likes of Ford and Volvo have had to recall products. It's not just laptops.
I love the architectural œuvre of Nash and his contemporaries -- you'll see examples all over London. It has wonderful aesthetics. Does this mean I should be allowed to live in any of their houses for free? One person's property is another person's "art". Where do you draw the line?
The whole "Art should be FREE!" nonsense is complete and utter cobblers. The very concept of "art" in the modern sense is an entirely invented one. Michelangelo didn't paint the Sistine Chapel for the price of some pizza and biscotti. He was PAID to create it. And quite handsomely at that. Ditto for all the other major works of art most of us are familiar with. Even Samuel Johnson famously remarked that no man but a blockhead ever wrote, except for money. Creativity is f*cking hard work. (Watch this week's "Screenwipe" if you want to see what writing is really like. Best writers-on-writing programme I've seen in years. It's not all lounging around drinking lattes.)
Art is work. You want access to the work I produce? Fine: Pay up, or fuck off.
You can tell this is a techie site. All this talk about operating systems, Kingston RAM and Intel architectures misses the point: Apple is a DESIGN company, NOT a technology company. The underlying technology is irrelevant!
Apple have never shied away from using their own interfaces and technologies where necessary. The iPhone hasn't been the most successful v1.0 (and now v2.0) phone design in history because of its technology. It's been successful because of how that technology was *exposed to the user*.
We're finally reaching the end of an era: the end of that first, learning-to-walk-and-talk stage of the IT industry. This is the stage where technology finally learns about manners, wearing nice clothes and how to behave when dealing with its peers.
All the talk in this thread has focussed on PCs and how Apple uses the same hardware as everyone else. There are idiots here who have gone so far as to explicitly claim that this hardware is "identical"! And you call yourselves IT experts!? Since when have any two PCs ever been exactly the same, right down to the RAM chips and North Bridge? BULLSHIT! The IA PC is an open _hardware platform_. That means PCs have different graphics cards, sound cards, network chipsets, motherboard chipsets, RAM chips, even CPUs -- from Intel to VIA!
It's precisely because of the PC architecture's openness that end users have had to suffer decades of instability and poor, lowest-common-denominator software and interface design.
Yes, OS X now runs on a specific *subset* of PC hardware -- but then, so was the original Xbox a subset of PC hardware. I don't recall anyone _outside_ the tech-wanking fraternity caring a gnat's fart that Microsoft's box couldn't also run Microsoft Word! Closed, proprietary systems are NOT a problem if -- and only if -- we have open standards for information exchange. We've been this route before with the likes of the Atari ST, Commodore Amiga, Acorn Archimedes -- all proprietary -- and not been the worse for it.
(CPUs aside, all consoles are now around 90% industry-standard hardware components and 10% "special sauce". All three current consoles use PowerPC-derived CPU cores.)
Technology is NOT the end. It is merely the means to an end. Enough already!
Apple is not the villain here.
Apple is NOT a recording label! Those artists are paid by their *labels*, not directly by Apple. Those self-serving labels slurp up *70 percent* of each sale. Of that, less than a third ends up in their artists' hands. How is that *Apple's* fault?
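Run the numbers from that split and the point is obvious. The 70% label share and "less than a third to the artist" are from the figures above; the 79p track price is just an illustration:

```python
# Rough arithmetic behind the split described above, using a 79p track
# as an illustration. The 70% label share and "less than a third to the
# artist" come from the post; the price itself is illustrative.

track_price  = 0.79                       # GBP, illustrative
label_share  = track_price * 0.70         # what the label slurps up
artist_share = label_share / 3            # at *most* a third of the label's cut
store_share  = track_price - label_share  # what Apple/eMusic keep for infrastructure

print(f"label: {label_share:.2f}, artist: at most {artist_share:.2f}, "
      f"store: {store_share:.2f}")
```

So of a 79p sale, roughly 55p goes to the label, at most about 18p reaches the artist, and around 24p is left to pay for the store's servers, bandwidth and staff. Hard to see how the artist's cut is Apple's fault.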
Apple (and eMusic, etc.) invest a lot of money in their infrastructure: servers, bandwidth, graphic designers, copy-writers, online review moderators, etc., all have to be paid too. (And I doubt iTunes itself is cheap to develop.)
Why artists continue to grant online publishing rights to their labels I have no idea; any decent agent should have long-ago started negotiating directly with companies like Apple for online distribution rights. Let the record labels handle old-school media... and nothing else.
"I don't remember too many occasions back home in England where two fast moving trains traveling towards each other were allowed to share a track."
The UK has a far, far higher population density than the US, so single-track sections are rare. They do exist in rural areas, but are usually subject to a "token block" signalling system of some sort (based on systems developed by the Victorians). This system is certainly safe, but dramatically limits the service frequency as only one train is allowed on a stretch of single track at a time. No great loss in the UK as such lines see very little traffic anyway. Branch lines for goods services tend to be single-track, but the UK rail network is mostly passenger-oriented.
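Token block working is just mutual exclusion made physical: one token per single-track section, and a driver may only enter while holding it. A minimal sketch, with illustrative names:

```python
# The token block principle, sketched as mutual exclusion: a single
# physical token exists for each single-track section, and a train may
# only enter the section while its driver holds that token. Names here
# are illustrative, not any real signalling system's terminology.

class SingleTrackSection:
    def __init__(self, name):
        self.name = name
        self.token_holder = None  # train currently holding the token, or None

    def request_token(self, train):
        if self.token_holder is None:
            self.token_holder = train  # token handed over; train may enter
            return True
        return False                   # section occupied; wait at the signal

    def surrender_token(self, train):
        # Only the train that holds the token can give it back.
        if self.token_holder == train:
            self.token_holder = None   # section clear again

branch = SingleTrackSection("rural branch")
assert branch.request_token("train A") is True   # first train takes the token
assert branch.request_token("train B") is False  # second must wait
branch.surrender_token("train A")
assert branch.request_token("train B") is True   # now free to proceed
```

Safe by construction, but you can see from the sketch why frequency suffers: the second train does nothing but wait until the token comes back.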
In the US, with its huge distances, single-track lines are *everywhere*, even in urban and suburban areas with high populations. This is a direct result of the country's history -- when the lines were built, populations were tiny and most of the traffic was freight; a pattern which has held true until quite recently. There is a far greater focus on freight rather than passenger traffic. Instead of running lots of short trains, the single-track sections have encouraged fewer, but much, much longer trains. (Freight trains in the US can be well over a mile long. Some one-off services reached over four miles in length.)
The US is only now reaching the point where rail travel is looking more attractive again, with increasing levels of patronage, but it'll be decades before investment in new infrastructure and better practices makes a noticeable impact. (California's proposed high-speed line is an encouraging sign.)
"You've even worked within the education sector? Well done! Was it higher education or do you mean primary school?"
I worked for a while in Higher Education, private sector. (Mainly admin, but also some teaching.) It dealt heavily with foreign students, so I have plenty of insight into the surreal workings of Lunar House too.
If someone approached me for a job with a qualification from said school, I'd throw them out of the building and send them an invoice for wasting my time. I've met plenty of staff who've worked in other schools -- most of my relatives work in education, including the state sector -- and I've yet to meet _any_ who have a good thing to say about the UK's present education systems.
Frankly, I can't understand why there's even a debate about the dumbing-down of exams: that it's really happening should be so f*cking obvious to anybody that you'd think denial was a communicable disease and not merely a state of mind.
No surprise there...
My own experiences with the education sector have made me extremely cynical of its fitness for purpose -- particularly at the FE and HE levels.
I've even worked within the education sector and am now well aware that the most important criterion for almost every major institution is money. More students = more money.
It may shock US readers to learn that even their own schools, colleges and universities effectively buy entire courses and curricula from companies based in India. (Yes, you read that right: your own country's very *education* is being outsourced!) Companies such as NIIT. (www.niit.com -- check out their (incomplete) list of clients here: http://www.niit.com/Colleges/Colleges_Index.asp?Section=Colleges&L1=Clients).
The UK's own education sector is no exception. The OU is presumably subject to the same pressures that have made a mockery of the UK's university sector of late -- particularly in the area of language skills, where someone who can barely speak English can still walk away at the end of the course with a full Bachelor's degree from a major British institution.
I, for one, welcome our future Indian and Chinese overlords. I suggest everyone else here does so too, unless you actually intend to do something about it instead of whining on some random website's comment system. (Oh, wait...)
Fuck 3. Seriously. They're trying to play the "poor little underdog" card while conveniently forgetting that *all* those "big, evil mobile networks" started out small too! All that infrastructure owned by Vodafone, T-Mobile and O2? They *built* it. Using their own damned money!
3 claims it cannot compete because it's too "small". Really? So they've been borrowing money from their parents to put up a couple of antennas near their head office, have they? They *knew* what they were getting into. If they couldn't afford the investment, they have no right to complain that others *can*. That's the nature of business. You don't get a free lunch.
Termination fees are unnecessary: abolish them entirely and set up something like the old Railway Clearing House (as mentioned in an earlier reply). Mobile phone providers are in the business of selling a service. That service is sold to the customer as the ability to both make *AND RECEIVE* phone calls. If I buy a phone, I expect people to be able to call me. I see no reason why this should be chargeable given the economies of scale involved.
Mobile phone infrastructure isn't mechanical, it's solid-state, so maintenance is a known, fixed cost. It doesn't change because more calls are being made on it. It only increases if the infrastructure itself is increased. And, again, that increase is a one-off capital expense -- building a mast -- plus a continuing, *fixed* maintenance cost. It's no cheaper to maintain a mast that's only handling two calls a day than one which is handling two thousand.
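A toy back-of-the-envelope model makes the point (all figures invented for illustration): spread a fixed daily cost over the call volume and the per-call cost collapses as traffic grows.

```python
# Toy model (invented figures): a mast's running cost is essentially fixed,
# so the *average* cost per call collapses as traffic grows.

def avg_cost_per_call(calls_per_day, daily_fixed_cost=100.0):
    """Spread the fixed daily cost over the day's call volume."""
    return daily_fixed_cost / calls_per_day

# Same mast, same maintenance bill, wildly different per-call cost:
print(avg_cost_per_call(2))      # quiet mast: 50.0 per call
print(avg_cost_per_call(2000))   # busy mast:  0.05 per call
```

Which is exactly why per-call termination fees bear no relation to the underlying cost of carrying the call.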
"There are quite a few companies that buy new cars, tart them up, and resell them at a much higher price. Ford don't care, they don't have to deal with the warranty and they sell more cars."
And there's the catch: Psystar are forcing Apple to support Psystar's hardware! They explicitly mention that they provide plain vanilla Leopard media with their hardware, so that means all bug reports will go to *Apple*, not Psystar. Safari will default to... Apple's homepage. Software Update will download from Apple's servers. (Gee, nice of Psystar to let their customers leech off Apple's bandwidth!)
In fact, ALL the support infrastructure in the OS will point to Apple, not Psystar.
Why the hell should Apple be expected to support third-party hardware? They don't build the boxes. They don't train their staff to know how to support them. Their Apple Store staff will doubtless have been told to turn Psystar owners away.
Customers are going to have lousy support from Psystar, because Psystar will simply palm them off to Apple. Nice job. I'd be pissed if I were Apple.
And no, Apple didn't *choose* to switch to Intel. They had no choice! Nobody was making a comparable CPU suitable for their market any more; Motorola completely dropped the ball. Apple are stuck with Intel's 1970s throwback of a CPU instruction set because Intel have spent many decades quietly kicking all their rivals out of their pram while MS got all the public ire.
I think this illustrates a key philosophical difference between those who believe interface design doesn't end at the pretty pixels, but should be created holistically as a seamless integration between hardware and software; and those like Microsoft who believe that merely having some swishy translucent windows is "good enough for the punters, but please stop whining about your hardware issues as if that's even remotely our problem."
Shamefully, most of the FOSS crowd seem to be on Microsoft's side on this issue.
Nobody complains that they can't install Linux on their Indesit washing machine, so why should they care what their desktop or notebook computer runs?
If Psystar really wanted to make something worthwhile, why didn't they just install a Linux distro instead? It's not as if this approach hasn't worked for Elonex or Asus.
There's a lot of FUD going on here.
The ASA's flagrant misunderstanding of W3C standards notwithstanding, the lack of Adobe Flash -- Apple don't get to produce their own version without being sued -- and Java are hardly showstoppers.
The lack of Java is, frankly, no great loss. It serves little purpose other than to annoy and it has so many "standards" of its own that complaining that a phone doesn't run one of them is hypocritical to say the least: which version of Java would you like it to have? J2ME? J2EE? (The iPhone runs on OS X, not Symbian or WM6.) How about one of the umpteen variants that still exist on embedded platforms? Or perhaps one or two of the older JVMs that are still installed by default on desktops?
Would the real "Java" PLEASE stand up!
As for accessing the whole Internet: yes it can.
Just because you don't get FTP, WebDAV and other Internet apps supplied by default, it doesn't mean they can't be written. There's a full, proper OS in there with a *complete* TCP/IP stack. None of it is limited and there are plenty of third-party apps that let you do whatever you want. The only reason you can't run VOIP apps on it (in certain territories) is because Apple won't put them on its App Store, not because the iPhone can't handle it.
(FYI: I'm the proud owner of a Nokia 2630.)
Have a break...
"You can't multi-task, you can't copy & paste! You can't take video! You can't MMS! You can't be used as a modem with your laptop! You can't do Bluetooth properly! You have negligibly limited web-browsing features!"
[Unwraps Kit-Kat and eats it...]
"You'll go a long way!"
(For the young 'uns, here's the original 1980s ad: http://video.aol.com/video-detail/kitkat-1980-s-retro-advert/1332649792)
Is this the fault of HD?
Expect to see more and more of these remakes and reimaginings.
The BBC's back-catalogue is huge, but it's almost entirely in 4:3-aspect SD format. This is going to be harder and harder to sell to other countries (and niche channels like Dave and UK Gold) as HD and widescreen TVs become the norm.
Building up a stock of HD-format content will give BBC Worldwide -- the BBC's profit-making commercial arm -- more to sell. BBC Worldwide pumps money back into the BBC, topping up income from the TV License, so the more they can shift, the better the BBC can be. (Well, that's the theory anyway.)
The BBC barely got the TV License renewed during the last round of negotiations. I suspect they know its future is uncertain, so it makes sense to take the initiative and stock up on decent content. By the time the next TV License negotiations come around, it's a fair bet that very little of the BBC's SD back-catalogue will be acceptable to export markets.
"Doctor Who" is an obvious example: The DVD release of the Troughton-era Cybermen story "The Invasion" has two entire episodes recreated in animation form and it works surprisingly well. I can see the BBC commissioning remakes of entire seasons of classic episodes in HD-format animation.
"And please dont be patronising and ignorant and use the non-word 'Brit'. I am not a 'Brit', I am English, we invented the English language that you dimwits over the pond have massacred ever since."
British English and US English have diverged mainly because the original founders of what is now the United States of America left the UK some time ago, with a peak during the 17th Century. Many of the differences are therefore due to changes on *both* sides of the Atlantic. "Autumn" is one example:
We English originally used the word "harvest" to refer to the season of autumn. That usage lasted until around the 16th Century; as urban living grew, it gradually gave way to "fall" and, later, "autumn". The latter was initially rare and only gained popularity later; both were in use during the time of peak English emigration to the colonies, with "fall" still the more common of the two.
Many new words were coined independently during the 19th and 20th Centuries, resulting in an increased divergence between the two languages. This coincides with the birth and rise of the Industrial and Information Revolutions, so many neologisms appeared over this period. However, there are also many differences resulting from the US sticking to usages now considered archaic in the UK, with US grammar tending to retain the older rules.
Neither is "right" or "wrong". The English language herself is a natural whore amongst languages and will survive, with or without us pedants!
... was never designed for the use it's seeing today. It was designed and built by nerds *for* nerds. The ignorant, IT-illiterate general public were NEVER meant to use it.
Why is Phishing even *possible*? This is a UI issue, not a user issue. Any interface that can be so abused should be taken out and shot.
Why is spam possible? This is, again, a UI issue, not a user issue. Quit blaming people who have WAY better things to do with their time than read $90, 1000-page books on securing servers. Some of us actually want to *work*, not just fix the damned tool.
Seriously, how about all you f*cktards and willy-wavers shut the hell up and get on with FIXING THE BLOODY PROBLEM, rather than pointing fingers at anyone who hasn't spent as many decades studying IT as you have?
The Internet -- not Safari, not IE, not any other sodding browser -- is the PROBLEM, not the solution. It needs to be made usable by people who have no idea what an IP Address is because they shouldn't HAVE to know.
Most people have no bloody clue what a frame flyback is, or what protocol is used to transmit digital TV. Because they don't NEED to know. It just works. Why the hell can't the internet be like this?
FYI: I use Safari. I've also used every version of IE, back when I used Windows. I have NEVER, in over 20 years, had a virus, a piece of malware, or any other problems. Why? Because I've been in this industry for 25 f*cking years. I am NOT representative of Joe Public and I know it. Strange how few others seem to have that level of awareness.
Grow the f*ck up and start producing decent, quality products that (a) don't crash -- yes, it IS possible, though you'll probably want to stop using 30-year-old tools and paradigms first -- and (b), don't require a 300-page manual.
People are getting tired of shitware. I'm one of them. Enough already. How about all you so-called "experts" stop arsing around inventing Web 3.0 and spend a little time on getting us to Internet 2.0 first? Foundations first. House second. That's how it's done in the building trade. Learn.
CEO Admits Mistakes Shock!
If only politicians could be as humble.
In fairness, I've not been affected by MobileMe -- I don't have an iPhone and, while I'd like one at some point, I'm quite happy to wait until things stabilise -- so Apple's "woes" weren't an issue. Nevertheless, it's good to see CEOs admitting they're only human and that they can make mistakes occasionally.
I'm a Mac user. I like good design and companies that focus on providing quality user interfaces. I'm weird like that.
I don't think I qualify as a Macolyte: I'm quite willing to state that Apple aren't perfect. But then, I've used more operating systems and interfaces over the past 26 years than most people probably knew existed. (Research Machines' "Cassette Operating System" on their 380Z, anyone? CP/M? GEM?) I'm therefore extremely OS-agnostic. Currently, I prefer Apple. Five years ago, I preferred Windows. Tomorrow? Who knows? Right now, they're *all* shit. Some are just a little less shit than others.
Steve Jobs has presided over not only OS X's evolution, but also its predecessor: NeXTSTEP / OpenSTEP. So that's TWO successful, innovative GUIs built on Open Source foundations in the same time that the GNU/Linux community, in their rampaging hordes, have achieved... er... uhm... what, exactly?
Comparing oranges with oranges.
Others have already pointed out that very few punters ever buy Windows "naked". The chances are pretty good that any computer you buy from PC World or Dell will already have some bundled OEM software to enable DVD playback. They usually also include some security software -- usually Norton, but nobody's perfect.
Similarly, very few people build a complete GNU/Linux environment from scratch. They tend to download and install pre-packaged *distros*. These are directly comparable to the OEM software bundles installed on most consumer PCs.
Comparing a distro like Ubuntu or SuSE with a typical OEM-built Windows box is thus perfectly fair. Claiming that "Windows doesn't come with a built-in DVD player out of the box!" is desperate: last time I looked, all consumer Linux distros came bundled with umpteen software packages; it's not unreasonable to assume that some of them will be of some use to the end user.
(Similarly, Apple bundle plenty of applications with their computers. If you buy OS X on its own, you'll have to buy "iLife" separately.)
There's no such thing as a perfect OS: they're all generally shit. All this thread seems to be arguing about is how their OS of choice is differently shit than its competitors.
(FWIW: I own a Macbook Pro, because it runs OS X, Windows and Linux.)
I honestly don't understand all this love for WM-based phones. My MDA Vario II (which is a rebadged TYTN) has been so shit that I have, for the first time in my life, been forced to buy a *replacement phone* before the contract ran out. Oh MDA Vario II, how do I hate you? Let me count the ways: lousy battery life, frequent crashes, soft resets, crap build quality (it's been replaced once, repaired once and reflashed *three times*, most recently with T-Mobile's WM6 image), terrible reception, a truly appalling GUI, and it positively *hates* synchronising, be it with Windows or OS X... the list is endless.
I've owned some real bricks and lemons -- Nokia 9500 Communicator, anyone? "Let's build a phone with WiFi, but make the CPU so damned slow that you might as well be connecting using two tin cans and a piece of string!" -- but this is the first time I have *ever* had to do anything like this.
The replacement phone? A Nokia 2630. This cheap-and-cheerful phone (it was the cheapest PAYG I could find that had Bluetooth) works like a charm, is tiny, synchronises perfectly with my computer and -- get this -- gets the bloody job done without any fuss. It's stable, reliable and easy to use. That, dear readers, should be the #1 design goal of ALL technology. (I'm guessing that's the real reason the iPhone is selling so well: it's the *design*, stupid! Features aren't worth shit without good design.)
I saw an identical car driving around Brockley (London SE4) a couple of weeks ago, so they've been here a while.
I'm surprised it's taken this long for a photo to pop up. (I was driving, so I couldn't take a snapshot.)
RE: CO2 is not the problem.
"Reducing congestion would reduce carbon emissions. But we can't just keep concreting over everything to make space for more cars - after all the world is supposed to be a good place for people, not the cars and the two do not go hand in hand."
Since when was the automobile the one and only means of providing a door-to-door transportation system? There are any number of perfectly viable alternatives. Sure, it'd mean building new infrastructure, but those traffic lights, zebra crossings, motorways and road signs didn't magically appear out of nowhere.
Personally, I favour the idea of building a "physical internet". (The Victorians naturally toyed with the idea: Google "London Pneumatic Despatch Company".) If we could send our shopping home separately, instead of carrying it around with us on the train, we would reduce the need for cars. Construction of something like this may sound expensive, but no more so than constructing all the other door-to-door services we already have in the developed world: water, sewage, electricity, gas, etc. (Building a door-to-door network for freight would also be cheaper than one for humans as the Health & Safety fascists don't care what happens to tinned peas.)
That's just one notion off the top of my head. There's plenty of R&D at universities around the world looking into alternative door-to-door transport systems, from freight trams -- already in use in some parts of the world; through building new roads beneath existing ones to segregate motorised and non-motorised transport; right up to completely new transportation systems based on suspended rail systems.
What is important to the end user is the *interface* -- the "door-to-door" part -- NOT the implementation! I don't care *how* a transport system gets me from my home to my destination.
It's time to stop treating the automobile as the acme of such design, 'cos it bloody well isn't anything of the sort. Had the automobile been invented last year, it's a given that we'd have to build completely segregated road networks for them, instead of having them share the same infrastructure as non-motorised transport such as bicycles and pedestrians. This may yet come to pass. (It'd certainly cut down road deaths. And forcing cars underground would make it feasible to fit current collectors to them, massively reducing the need for energy-storage systems like batteries or hydrogen tanks.)
I'm amazed nobody else has made the same points. Isn't this a *technology* website?
CO2 is not the problem.
As others have pointed out, the main problem faced by Old World cities like London, Manchester, Rome and Naples isn't the CO2 as such: it's the *congestion*.
CO2 emissions and other pollutants can -- and, in general, are -- being solved (albeit slowly) by scientists. However, the problem of providing a door-to-door transportation system is still proving extremely difficult to solve. We've tried feet, but humans cannot carry much. The motive power of the automobile may eventually switch to electricity, hydrogen or unobtainium, but it doesn't solve the congestion problem.
Even horses fail the congestion test. The congestion the humble horse and cart of old managed to create in London during the mid-1800s proved great enough to trigger the construction of the first stretch of what is now London's Underground rail network.
Traditional light rail is a dead-end technology: adding yet another road-using device to the infrastructure isn't going to solve congestion issues on, say, the A205 South Circular or the A20 through New Cross and Lewisham. The key problems are ridiculously narrow roads given their traffic, and the chronic lack of investment in the infrastructure. Squeezing trams onto the same roads will only make the congestion worse: light rail isn't designed for commuting; it's too slow for that. It's intended for _local_ users. (Croydon Tramlink's own metrics are based around reducing car journeys to and from *Croydon*, not cutting through traffic on the A23 between Brighton and London.)
It isn't just the roads either: if the trains in the South-East of England were any bloody good, we might consider using them. Unfortunately -- and ASI members, please note -- the *private* companies that built the original rail network around here during the 1800s were such bitter rivals that they frequently duplicated routes out of sheer spite, while constructing their core network as cheaply as possible. The result is a dearth of grade-separated junctions and the ludicrous situation of a 20MPH speed limit for trains between London Bridge, Cannon Street, Blackfriars (Thameslink), Waterloo East and Charing Cross.
People would be happy to use alternatives, if viable alternatives existed. As it is, London's rail network is already so saturated that it deliberately tries to price people *off* it, to *reduce* demand during peak hours -- exactly the hours when most of the region's population is *contractually obliged* to come into work! (No, my tree-hugging friends, we don't commute out of spite. We do it because the people we work for *require* us to be in that office by 0900 hrs!)
Cycling into the City from Dartford or Gravesend in a howling gale isn't really an option: the advantage of trains is that we can actually get some work done while travelling (in theory; in practice you can barely find space to stand). You can't do that on a bike, so it's an hour or two wasted. In each direction. And you're utterly *knackered* by the time you reach your destination.
One of the lesser-known advantages of a service industry is that it doesn't need to be close to anything other than good communications links and decent transport. London -- especially south-east London -- is headed for a massive wake-up call soon. The BBC is already relocating a number of operations to Manchester. They're not the first to move away, and I doubt they'll be the last.
"The only online music distributor making serious money is iTunes, and that's only because they've got a hardware tie-in. And they're run by Apple, who are by definition Big Business."
Apple isn't run by idiots. If your intention is to make some damned money from your content, then you'd better be prepared to think like a Big Business yourself. If there's one thing Big Business has proven to the world, it's that it knows an awful lot about how to extract money from punters.
*All* businesses started out as an idea. Businesspeople are people too. The reason we have so many issues with corporations today isn't the people running them -- if you're in charge of a corporation owned by shareholders, you _have_ to pander to those shareholders' whims. And damned few shareholders today have any expertise in the fields their chosen companies work in.
This is why new communication technology was always a mixed blessing: originally, you bought shares in a business because you believed in it and wanted to help set it up. The key attraction was the dividend, not the share's actual worth. Today, shares are merely a form of money, traded like currencies by people whose sole interest in your corporation is to keep pumping it up until it bursts -- and to blazes with any long-term strategy.
Offering shares in your company is like getting a loan from the worst loan shark in the world. A loan shark who will never, ever let you off the hook and demand that you keep paying him off forever. Worse still, he will also demand that you keep paying him more and more and also tell you how to do your job and run your company.
A proper loan has a defined end point, but once you've sold that share, you're going to be paying its owner -- and future owners -- _forever_, regardless of how much money you were paid for it. It's like an infinitely long mortgage. THIS is why the corporate world is in trouble right now: the public shareholding model is seriously broken.
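The mortgage analogy can be made concrete with a toy calculation (all figures invented; the standard perpetuity formula values a never-ending annual payment at payment divided by the discount rate):

```python
# Toy comparison (invented figures): a loan has a finite, known total cost;
# a dividend paid in perpetuity never stops costing the issuer.

def loan_total_cost(annual_payment, years):
    """Total repaid over a loan with a defined end point."""
    return annual_payment * years

def perpetuity_value(annual_payment, discount_rate):
    """Present value of a never-ending annual payment (payment / rate)."""
    return annual_payment / discount_rate

print(loan_total_cost(1000, 25))     # 25000 -- and then the debt is gone
print(perpetuity_value(1000, 0.05))  # ~20000 today, but the payments never end
```

Same annual outgoing, but only one of the two obligations ever terminates -- which is the whole complaint above.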
Seems to me...
...that a better solution would have been to require MySpace.co.uk to remove any and all adverts for MySpace.com. It sounds like Nominet are trying to shoehorn the "Passing Off" laws into their domain arbitration process. If TWS were relying on the confusion to make money, they'd be far more likely to sell it.
If not, it'd be tough for Murdoch's bunch, but they wouldn't have a leg to stand on and TWS' case would be proven.
(That said, I'm of the opinion that the Internet in general is fundamentally borked anyway, so... meh.)
The article doesn't say "Open Source" (note the capitalizations). The article uses lower-case references to "open source" throughout.
Contrary to increasingly popular belief, Stallman & Co. did _NOT_ invent the concept of opening up the source code to one's applications. I regret to inform you that "Open Source" is not a f*cking registered trademark.
Er, no. It's not "his baby".
That's kind of the whole _point_ of the GPL. Or such was the impression I was given. (I'm a bit rusty on which particular prophet the GNU / FOSS folks are worshiping these days. Never was big on religion.)
If DTrace were a closed, copyrighted work, Apple would have no right to go farting about with it to begin with. But it isn't. It's Free Open Source Software. With the capital letters. And Apple aren't hiding the source code, so they're entirely in keeping with the GPL.
You don't get to have it both ways: either it's open to *anyone* to mess with as they choose (as long as they stick to the GPL's requirements), or it isn't. Which is it? I'm confused.
It's called a Congestion Charge for a reason.
Making a car electric will do precisely nothing to solve the *congestion* problems of horse-and-cart towns and cities.
That's why it's called a *Congestion* Charge, not a Pollution Charge. The pollution isn't the problem; you cannot physically squeeze enough cars onto London's tiny country-lanes-with-delusions-of-grandeur to create the pollution and smog levels of, say, New York or Rome. Both of those cities have more roads, which are usually wider and predominantly arranged in a grid pattern, offering more alternative routes. For them, electric cars are of more immediate value.
Different countries and societies will require different solutions. There will be no magic bullet, one-size-fits-all solution. Unless someone finally invents a viable flying car -- and all the necessary infrastructure to support it.
HD is pointless for movies.
The human eye is optimised for edge-detection, NOT detail! 90% of what we _think_ we see is just interpolated by our brains. The eye simply cannot process 1920 x 1080p HDTV images at 60Hz+ at anything like the detail the HD-pimps like to suggest it can.
The Hollywood fad for jump-cut, hand-held cameras, whip-pans, crash-zooms, etc. in action flicks is the worst use for HD ever conceived. The vast majority of the public will simply not see much difference in image quality between an SD version and its HD counterpart.
HD works best when the camera moves slowly and lingers on subjects long enough for the eye to pick up on the detail. In other words: some sports, documentaries, slow-paced movies and certain news / factual content formats, like "The Weather Channel".
(And don't forget that 99% of the world's media companies' archived content is going to be in the older SD format. HD is going to be an expensive move and this makes the risk-averse producers less likely to go out on a limb. Don't be surprised if the BBC spends most of the next few years remaking all their 'classic' period dramas.)
"Name one thing that the Apple Air has that the Asus Eee doesn't"
You mean, aside from a decent, backlit keyboard, a decent 13.3" screen, an 80Gb hard disk and a UNIX-derived OS with a GUI worth a damn? (Oh, and iLife '08 if that floats your boat.) Not everybody likes tiny, dinky keyboards and eye-strain-inducing screens.
A MacBook Air will run Parallels or VMWare Fusion, either of which will let you play with Windows, Linux, Solaris or any other supported OS. In a VM. At native speeds. (And even some 3D graphics support.) Good luck doing that on an Eee.
The Eee PC does *nothing* that the Psion netBook didn't do way back in 1999, so I have no idea why so many people are wetting themselves over it. Hell, the Psion even had a better OS: EPOC32. (Better known today as Symbian.)
In short: the Eee PC is a disposable gadget which will end up on eBay within six months. The Macbook Air is a computer you can actually get some work done on, but which won't break your back and looks damned nice too. Sure, it's expensive, but what did you expect from the IT industry's equivalent of Bang & Olufsen? A $100 price-tag?
(And no, I have no intention of buying either. My Psion still works fine.)
I can live with weekly updates. I'm a developer myself, so I know how this industry works.
What used to really p*ss me off on Windows was how my anti-virus app would pop up an update request *every single day*. And then MS' anti-malware app would also want its pound of bandwidth. And so on... and on...
Worse still, many FUDware apps have an insane love of performing complete system scans at *exactly* the moment I'm trying to do something resource-intensive. That running such scans daily is actually a _default_ setting just boggles the mind: if they can't even be sure their oh-so-brilliant software will stop the crap hitting my hard drive, what's the bloody point of it all?
And yes, I'm well aware that many "pro" FUDware apps, upon provision of my bank details and permission to siphon lumps of cash from same, will handle their updates more quietly and politely. So what? I'm not in the habit of buying cars that haven't had decent locks fitted. Nor do I expect my new home to come with deadbolts and window latches that stop working after just 90 days unless I pay someone a small fortune to fit some new ones and maintain them to a decent standard.
For fuck's sake people: the problem isn't the OS. The problem isn't the people. The problem is the *Internet*, which was never designed for the uses it's seeing today. (And neither were UNIX, Windows or any other mainstream OS, no matter what the partisans would have us believe. Even the mighty Linux has its share of vulnerabilities.)
The Internet we see today is a 1970s technology designed by naïve lab researchers who probably had a fit when they realised their sweet, innocent lamb of a technology was going to get royally rogered by corporations and the great unwashed the world over.
At present, merely connecting a computer to the Internet is practically a declaration of war as far as users are concerned. It's not just unsafe; it's _ridiculously_ unsafe. Who in Codd's name thought opening it up to the general public was a good idea?
The Internet is broken. Badly. It is impossible to police properly. It scales poorly. It has no security features whatsoever that weren't merely tacked on as afterthoughts. It's seriously unfriendly and unwieldy. It was designed by idealists rather than pragmatists. In short, it needs replacing wholesale. This probably won't stop people trying to abuse the replacement, but at least it should be easier to set up a "superhighway patrol" to keep it reasonably safe to use. Roll on Internet 2. And it'd better be good.
Education, education, education.
Windows' problem isn't the malware as such. The main reason I switched away from Windows is that, even after it had booted up, I then had to wade through a seemingly endless parade of warnings, dialogs and signature-file downloads, all shrieking messages like: "OMG! Your [INSERT FUDWARE NAME HERE] hasn't been updated for nearly two whole seconds!! You're gonna DIE!!!"
By the time they were dealt with, it was time for me to go shave again.
The FUDware companies are no better than the scum they pretend to keep in check. Their UIs are almost uniformly shite. Their software's instabilities make even Windows itself look like a saint. I would seriously much rather disconnect my computer from the Internet and never see another email or website again than have to install all that dross on my computer.
Anti-virus? Check! Anti-Spyware? Check! Spam killer? Check! The litany goes on and on.
Why the f*ck can't they just squish it all into one blasted app which just downloads all its stuff in the background and gets the hell on with it without trying to make me shit myself in gut-wrenching terror?
ISPs have been begging for a way to add value (and thus improve margins) to their offerings. May I suggest they find a way to run all these security apps at _their_ end instead of relying on their customers to police and maintain this service using outdated, subscription-lapsed copies of Norton Anti-Virus 2003?
(Feel free to charge a bit extra for the service; the rest of the internet-using public will thank you and computers the world over will cost that little bit less to run each day, saving energy, cutting business costs, helping the ecology, and mitigating climate change. Result!)
It's a question of cultural preference.
The French, like the Italians and Spanish, have a slower-paced society. They have no pub culture as we Brits (and our German and Scandinavian cousins) know it. They work fewer hours, because the French don't see "work" as the be-all and end-all of life. They see things differently. Literally. As do people from every other nation.
Most importantly, our Latin cousins have much greater rural, regional identities. Italians are passionate about their regions -- Lazio, Toscana, Calabria, etc. -- as are the French. Most people outside France know of the Champagne and Loire regions. Nobody outside the UK gives a gnat's chuff about Rutland, Humberside or Shropshire. Our regional identities disappeared when the Industrial Revolution effectively rewrote our society from the ground up and stomped all over our (predominantly feudal) agricultural past.
Industry has never touched France to quite the same extent: it is still heavily agricultural and focused on small, family-owned businesses. Italy is an almost 50:50 split, with the northern plains heavily industrialised, while the southern regions have remained predominantly agricultural.
I have no quarrel with the French system or its laws. Sure, not everyone agrees with all of them, but that's why they don't live there! To Brits, the British way of life _is_ the One True Way. To Americans, their home State is usually their cultural keystone. It's the same the world over and this is a Good Thing. If everyone thought, felt, believed and lived the same way, the world would be a much duller place.
Vive la différence!
(And yes: I agree Amazon.fr should be punished. Nobody pointed a gun at their managers' heads and _forced_ them to open up in France.)