1652 posts • joined Wednesday 10th June 2009 14:47 GMT
Re: Who needs it?
> But those two pages are going to be pretty short pages
That's the problem. With my old 23 inch 4:3 CRT I got a vertical height of 13.8 inches (sorry for the archaic units). That was good enough to display an A4 document at full size, or a portrait-formatted web page, given the amount of screen space lost at the top of the page with toolbars, menus etc.
To get the same height with a 16:9 screen, you'd need a stonkin' great 28 inch display - a 23 incher provides a paltry 11.3 inches. That 2½ inch loss is all the more significant because applications' overheads are constant (say an inch for all their clutter, usually more, irrespective of screen size or ratio), so the smaller height directly impacts the stuff you want to see most.
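Those figures drop straight out of Pythagoras - a quick sketch (Python, heights rounded to one decimal):

```python
import math

def visible_height(diagonal, ratio_w, ratio_h):
    """Height of a screen given its diagonal size and aspect ratio."""
    return diagonal * ratio_h / math.hypot(ratio_w, ratio_h)

print(round(visible_height(23, 4, 3), 1))    # 23" 4:3 CRT -> 13.8
print(round(visible_height(23, 16, 9), 1))   # 23" 16:9 panel -> 11.3
print(round(visible_height(28, 16, 9), 1))   # 16:9 size needed to (roughly) match
```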
What's more interesting than peoples' reticence to switch banks is their reluctance to have more than one current account. We know, some through experience and some through sage advice, that it's unwise to only have 1 front-door key or a single kidney. Sure, you can get by with just the one but having a spare is a good move. Come the day you really, really need that fallback, it's already too late to try to get one.
As the article says, changing banks is easy. So is opening a new account. Having access to two sources of money (and maybe two separate credit cards - wallets do get lost, handbags do get stolen) is just as sensible - and it's free.
Sure, you get double the amount of paperwork. But in these days of internet banking it's just another password, or security dongle, to keep track of. The upside is that you don't have all your eggs in the same basket. So a bit of "local difficulty" with one bank's inept IT doesn't turn an inconvenience into a crisis.
Who needs it?
Leaving aside the obvious marketing benefit of "Bigger, Better, Faster, More", let's step back for a second and consider.
There seem to me to be two types of laptop user: those who primarily want to watch videos and everybody else. For the video-watchers, the 16:9 format is ideal but for everyone else it's terrible - especially for "business" users who deal mainly in A4-portrait format documents and people who surf a lot, as most websites are STILL designed for tall-thin, "page" form factor web content.
So we have a whole generation of laptops that are optimised for watching TV and films - oh and playing games maybe, to the detriment of everyone else. Now unless those media consumers are watching their shiny, glossy screens in perfect darkness the quality of what they see is always going to be compromised: by glare and reflected light.
So given all that, you have to ask: can yer average lappy user benefit from sooper-dooper screen technologies and resolutions that need an electron microscope to view adequately? Given that there's been no real drive to improve laptop screens since the early days (my 1996 vintage Olivetti sported a 1024x768 screen - I guess that would be "HD" by today's standards), I can only assume that the current crop of high resolutions is only being marketed on a "because we can" basis as part of the BBFM principle.
There's the problem - right there
> Twenty-year-old models which have suggested serious ice loss in the eastern Antarctic
Now if they'd used proper scientists instead of people who wander around on catwalks, maybe they'd have got some better data.
And to say that an Elephant Seal is better at doing climate surveys makes you wonder why we're spending so much money on obviously under-qualified scientists, too.
> Almost all of the people I know who studied arts and humanities degrees in the past few years are paying back their student loans at higher rate than they have to, much faster than my friends who studied IT or science at university as a matter of fact.
No, they're only doing that because they're not very good at maths.
A student loan is the cheapest source of capital an individual will ever get. The interest charged on it is guaranteed to NEVER exceed the rate of inflation (meaning that over time, its value will decrease naturally). Therefore the best approach is to pay it back as slowly as the system allows and put any "surplus" earnings into a savings account to earn the ex-student a nice little slice of interest.
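On the assumption above - that the loan rate never exceeds inflation - the arithmetic is easy to sketch (the 3% and £10,000 figures are purely illustrative):

```python
def real_balance(principal, loan_rate, inflation, years):
    """Inflation-adjusted value of a debt left to accrue interest."""
    nominal = principal * (1 + loan_rate) ** years
    return nominal / (1 + inflation) ** years

# £10,000 loan; rate capped at inflation (3% assumed here)
print(round(real_balance(10_000, 0.03, 0.03, 10), 2))  # real debt unchanged
print(round(real_balance(10_000, 0.01, 0.03, 10), 2))  # rate below inflation: real debt shrinks
```

Since the real debt never grows, any spare cash earning even a sliver of real interest in savings beats using it for early repayment.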
The hard sell
I doubt that kids from any time in the past 20 years, brought up on a diet of MTV and worse, would be the slightest bit affected by this video. The "problem" only occurs because older people think (wrongly) that it will influence them. It's the same sort of patronising, or merely ignorant, attitude that some people have towards smut: "It doesn't affect ME, but I'm concerned about the effect it will have on others".
If the EU wants to get girls interested in science, they should get Adele to write a song about how sad it makes her feel. Or better yet, stop presenting science on TV (in fiction and in fact) as nerdy, geeky and only appropriate for social misfits
Re: I read that as an expired cat!
> Have you ever seen the damage that mice will do to the wiring under the floor of the server room?
Yeah, the USB ones are the worst.
Re: that's what happens
> of course "lessons will be learnt"
Generally the lesson that is learnt is that the bank in question can futz around for this particular length of time without anything bad happening to its senior staff's employment prospects, the bank's long-term reputation or its standing with shareholders.
No doubt when RBS carry out a post-mortem, they won't actually find the root cause of the problem (it's the network, stoopid!) but will blame some third party: either outsourced, software supplier or infrastructure. They will then issue a suitably ~~smug~~ contrite press release about how they've "taken steps to make sure this never happens again", award themselves large bonuses for the successful cost savings, take the regulator out for a very good lunch and prepare their CVs to move on and stick it to the next financial institution on the list.
> People should be free from the worry of some high-tech Peeping Tom technology
But isn't that exactly what all these american drones (UAVs, not people) do in all the countries they're currently bombing the crap out of? He should be glad that the likes of Apple and Google are only taking photographs.
The 'art of the matter
> arts and social sciences students, according to the Chinese news site, which reported that many felt the "work experience" was irrelevant to their studies.
A "proper job" might be irrelevant to these students' studies, but it will provide invaluable experience for what they'll probably end up doing after they graduate. As for the wages and deductions they get, isn't that just par for the course?
Maybe the UK could ship some arts and SS industrial placement students out to Foxconn for a taste of real-world jobs, too.
A taste of things to come
> a one-off designed to boost Windows 8
Or maybe it's a toe in the water to see how successful a single-sourced combination of hardware, O/S and walled-garden apps can be? MS must have an envious eye on Apple, who have managed to close off all competition to their devices by locking the hardware and O/S together and only allowing apps that pay them a tribute for the privilege of running on their machines.
The trick is to persuade punters that this isn't just a mix of Windows 8 and a tablet - it's a SYSTEM. Integrated, easy to buy (with none of that pesky "installation") and easy to use. Given the margins Apple makes on its "buy everything from us" systems, the only surprise is that MS didn't do this years ago.
If I was a PC maker, or a supplier not on Surface's most-blessed list, I'd be getting a bit worried that my business could simply evaporate if this is a success.
Re: Cause or effect?
> The fact is there's plenty of new stuff but creating new stuff requires risk. It is simply easier to resell your old stuff to people who haven't seen it yet:
Good point. And very long copyright terms reward the endless promotion of non-risky old stuff over going out on a limb and creating something new. If copyright was limited to (say) a single generation - 20 or 30 years - then that would decrease the value of a product, but would incentivise people to create new ideas (or even to pick up the out-of-copyright "classics" in new ways). I reckon it would generate more new content, though the old stuff would still be available if people wanted it.
Still living the dream
> extending copyright terms beyond absurdity,
The reason that vested interests keep pushing (and winning) ever longer copyright terms is that these ancient "properties" are still very successful. A pertinent question would be: why?
Surely in the past 20, 30, 50 even 80 years someone, somewhere - with all the technology, marketing and production techniques at their disposal - would have made Mickey Mouse (c) (tm) and friends obsolete. The sad fact that there is STILL so little material that can compare with its popularity speaks volumes about the lack of originality, imagination and willingness to try new things.
We see popular music reinvent itself every 10-ish years (although the old stuff remains popular with the generations that grew up with it). But for children of today to still get fed the same saccharin-sweet, superficial culture that hasn't changed in 2 or 3 generations of childhood makes me think something is very wrong.
Is it time for the cult of Disney to go through a "punk revolution"? Maybe bring back the original, unexpurgated versions of Grimm's Fairy Tales
The author doesn't need to [outline the process ...], the spokesperson is quoted as saying
We are well aware of the commercial value of the data,
So since they are already well aware of its value, all they had to do was ask for that amount. If Google declined, then it would seem that this value had been set unrealistically high.
I wonder if all the extra-curricular business fondling makes up for the time used during the working day on personal fondling? If so, it's just a time-shifting phenomenon, not extra work.
Re: Why the delay in filing charges? Come on people. Smarten Up!
Maybe the cops over there are weighing up the advantages to themselves of showing justice to be swift and robust against the disadvantage to all mankind of the resulting book: The Kindness of Prison
Cut your coat according to your cloth
If there's a mismatch between the technical skills of an entire continent and the IT goals of a bunch of policymakers, my money would be on the goals being wrong.
If there really will be 700,000 ICT vacancies (a subtle but important distinction from IT vacancies - I'd guess the ICT element includes telesales agents, and I have to say I'm glad there's a shortage of them), the simple laws of supply and demand say the gap can be filled by raising the pay offered until enough people retrain to fill them. What the report probably means is there will be a shortage of ICT staff who are willing to work for the pittance on offer.
Maybe the solution is to get rid of the bean counters who couldn't foresee such a massive shortfall when making their ~~dreams~~ plans, and replace them with a bunch who base their strategy for the future on solid reality. There should be no difficulty in performing this substitution as, sadly, there is never a shortage of administrators.
The simplest answer
Maybe it spent so long in space because nobody could remember the command to bring it back?
What about the TV?
> helping people learn [ American ] English and understand a little more about American culture
I was under the distinct impression that american TV exports had already done that. All my english-as-a-second-language friends and colleagues have a recognisably american "twang" to their spoken english, usually picked up from TV programmes and the teaching material they were exposed to.
Maybe what these Kindles are for is to redress the balance a bit. To correct some possible notions that every american cop will shoot you as soon as look at you, that every crime can be solved within an hour and that their soldiers can drop into any country in the world, gun in hand (and suitcase, and shoulder holster and tucked into belt and another hidden in their sock - just in case) with impunity.
After all, this sort of programme has got to be cheaper than trying to teach their own citizens (or, it must be said: ours, too) another language.
Re: "I bet the lunches at the council offices are better than they serve the kids."
> Just out of curiosity when decade did you go to school? Just trying to figure out when it all went to shit
Well, I was at school in the 60's/70's (not the full 20 years, you understand!). One of the big problems my schools had was that the kitchens didn't keep a lot of reserves. So the food that made up the day's lunch was delivered from the suppliers that morning.
As a consequence the suppliers (esp. for meat) could deliver any old crud, safe in the knowledge that it couldn't be rejected or the little darlings would go hungry.
I do recall many occasions where it appeared the protein (at least that's what it appeared to be) had gone through some sort of vulcanisation process before being served. Whether that was the chemical genius of the school cooks, or the quality of the raw product is difficult to say. Generally the desserts were better as there aren't many ways to mess up Spong [sic] pudding, though the custard sometimes made you wonder ...
What if "the cloud" is just a fad?
Basically a "cloud" is a very similar environment to a mainframe batch operation of years gone by. You submitted your "job", something, somewhere did something with it and produced your results. The person who initiated all this had little or no control (JCL notwithstanding) over the process.
While this sort of set-up provided a solution, like the cloud, it wasn't very flexible and like the cloud, the person who wanted the work done would often want a little more control - or assurance - over the nuts'n'bolts of the process.
As a consequence, it's easy to see that the huge datacentres that house "cloud" service providers these days are analogous to the mainframe operations of yore. It also follows that in the IT world, nothing lasts forever - so what we see as a cloud-based solution today will be seen as a cloud-based problem, tomorrow.
So if we're looking forwards 10 years, sure, there WILL be cloud operations, but there will also be other ways to do things. Ways that haven't yet been invented (just like cloud computing didn't happen in 2002). What they will be is difficult to say, but if the cycle keeps spinning round, I'd guess that the users would be emerging from the remains of cloud-based architectures and wanting their own systems to run their own applications in their own way.
Layers of scorn
So The IT crowd portrays women negatively - maybe, I only watched 1 episode (that was enough - didn't care for it). However, the media in general portrays ALL IT people negatively, too.
As she says herself, lack of women in IT is a worldwide problem, whereas The IT Crowd is purely a local "problem", so while it may not help, it's not a big barrier.
What needs to happen is for the media to depict IT people, in general, in a more sane and balanced way. Although the industry does little to help itself, with "geek speak" and its crappily designed and duff products.
Maybe if we could inject an air of professionalism, discipline and pride into our own industry, then that would make it an attractive proposition to newcomers and equally, help retain them over their whole career.
The tail that wags the (watch)dog
One could assume that our new overlords and masters, the International Olympic Committee, had a clause in the contract (otherwise known as the UK's new constitution) that requires the host nation (otherwise known as The Fiefdom) to have such a rule in place. It sits alongside all the other ones that grant the IOC virtually absolute power in controlling, disrupting and diminishing the lives of the poor ~~sods~~ serfs who live anywhere near an olympic venue.
All in the name of sport - the IOC's, that is: seeing how far they can push a potential host nation into servitude with the sorts of demands that would make any on-tour pop prima-donna blush with embarrassment.
So, not really important at all?
> described data sets as the fourth factor of production
From the website ...
Investopedia explains 'Factors Of Production'
In essence, land, labor, capital and entrepreneurship encompass all of the inputs needed to produce a good or service
So, in fact they're saying that big data is the least important factor of production.
and later ...
more and more management decisions are based on “hard analytic information”, as opposed to just having a hunch
I wonder if the decision (on how to make decisions) was taken in the light of "hard analytic information”, or if it was just a hunch?
The interesting thing is that if all these business successes are the result of a company having a good process, rather than good leadership, it rather shoots down the principle that directors should be highly paid because their leadership is what drives success. It sounds like the success is due to the analysts who trawl through these datasets and come up with insightful conclusions - not the people at the top.
Maybe if good data really is the key to success, these CEOs should be keeping schtum about it and carry on claiming that the success was really down to their skill, vision and talent. Otherwise someone might just ask why they pay themselves so much.
Irrational numbers teaching
> asks what Pi is and where it came from
That's quite a good example of where abstract knowledge has failed. Teaching people about circles and radius and area could be vastly simplified by just saying that the area of a circle is 78.5% of the area of an enclosing square. If we want people to learn stuff, the simplest way to motivate that learning is to provide practical reasons for it.
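That 78.5% is just π/4 - the ratio of a circle's area to that of its enclosing square - which a couple of lines confirm:

```python
import math

side = 2.0                       # any enclosing square will do
circle = math.pi * (side / 2) ** 2   # inscribed circle, radius = side/2
square = side ** 2
print(f"{circle / square:.1%}")      # 78.5%
```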
As for dumping someone in the middle of London, wouldn't they just hop in a taxi, or pull up the TFL app on their phone?
Re: Price is ok...
These days you can build a fanless mini-itx system (see AMD Fusion, dual core 1.6GHz) for less than the cost of this puppy.
Industrial systems have always taken the mick with regards to pricing. Generally because of the lower volumes and tighter QA that hostile environments require. However this Linux Mint beastie looks like it's managed the worst of all worlds: high price, moving parts (disk) and aimed at the domestic market.
The second-hand slab market
... maybe that used tablet isn't such a good idea after all.
The problem is the users
This is something celebrities have had to put up with since the start of newspaper publishing. It's always been the custom for the press to keep files of clippings and "interesting" facts about people in the news. Whether the items of interest, or the unfortunate photograph was 6 months old, or 30 years ago never seemed to matter - it still got dragged out whenever an editor wanted to be petty and spiteful, or had a readership that responded well to hate, bile and jealousy.
The difference now is that the internet views everybody as a celeb, but "ordinary people" haven't yet tumbled to that fact and therefore act as if everything they say and do is somehow private. There are two ways this could work itself out: a form of mutual blackmail where everyone can dig up something about everybody else - so the feeling of superiority cancels out - or a growing sense of maturity among internet users along the lines of "so what?" when presented with some trivial lapse of judgement or taste. If history teaches us one thing, it's that the second option will never happen in Britain (some other countries have a much more relaxed attitude, but not us) - just as people won't stop posting things they'll later regret.
As a consequence, if the only defence of your own bad behaviour is the ability to drag up evidence of everyone else's, then maybe what we need are more and better sources of salacious material. Possibly putting all the country's surveillance cameras to good use and tagging every individual who ever puts a foot wrong, so they can be seen to be just as "human" as the people they criticise.
Another option would be the national adoption of Bob Marley's classic commentary on the situation:
while you point your fingers someone else is judging you.
The theory and the practice
What I really want to know, if I need to see a doctor, is how good they are. Are they likely to prescribe an orchidectomy when the real problem is my underpants are too tight (or if I arrive late)? I.e. can they accurately and quickly diagnose and treat my ailments?
All the website seems to present is whether the front-desk staff are nazis (ans: usually, yes, it's a perk of the job) and whether the practice in general was well organised. Since most surgeries are host to many, many doctors the overall rating tells you little or nothing about the individual quack your "pot luck" will refer you to on any particular appointment (I've never seen the same GP twice).
I suppose if the system did rate individual GPs then in the long run, the reviews would be good, since all the lousy doctors would have killed off their patients before they could complain to the website.
Ask and ye shall receive
It's not just focus groups, sending documents out for review is just as bad.
Possibly the worst aspect of "processes" in business is the number of people who wish to review, approve or be FYI'd on documents that are, essentially, none of their dam' business. Mostly it's just to pad out their days (shades of: "why don't estate agents look out the window in the morning? Then they'd have nothing to do in the afternoon") with the illusion of activity.
However, once these people get a copy of a document, they feel the need to suggest changes - whether they know anything about the subject or not. One boss I had made it his policy to require at least one change to every circuit diagram he reviewed - just to show that he'd examined it. This was a long, long time before Dilbert and PHBs. After all these induhviduals have suggested their changes (none of which are returned until the deadline), there then follows a period of argumentation regarding why you chose to ignore their "input" and the inevitable politicking if you happened to point out an error in one of their documents - expect the favour to be returned in spades.
I now adopt a policy of NOT circulating proposals, papers or designs whenever possible and everyone seems happier for it (though not as busy as they'd like to appear). I reckon focus groups act the same way - if they always said "yup, that's fine" there would be a feeling that their time had been wasted - that they hadn't exercised their "right" to an opinion. Maybe the secret is in the questions they are asked. If instead of open-ended critiques, focus groups or approvers were asked specific, if diversionary, questions about particular aspects: do you prefer X or Y? then it would be easier to obfuscate the responses and come up with exactly what you intended to in the first place.
After all: you can't please everyone, so you've got to please yourself.
Out of character
After having to wait this long, you'd expect 4 to arrive together
Last chance for the big bucks
While spinning storage prices remain high, SSD prices are following the traditional hardware trends downwards. At some point, a lot of domestic users will realise that the 1TB disk that came with their (pre-flood) machine is still 90% unused. Commercial users (who are more driven by TPS rates than GB capacity) will wake up to the fact that a mirrored pair of SSDs can outperform a much costlier disk array. More importantly, vendors and disti's will get higher margins from SSDs and therefore promote them instead of traditional solutions.
So although there is a post-flood "catch-up" as people who deferred purchases earlier are now buying again, it's not guaranteed that this will continue. After those needs have been fulfilled, I expect that either the disk manufacturers will "blink" or their markets will remain in a somewhat shrivelled state (long time immersion in water has that effect) as the SSD makers slowly cruise past them, gesticulating as they go.
102 uses for a... errr
I just popped back to 1981 (well ok: to a box in the loft) and referred to the source material. ISTM Bart Jansen's idea bears a striking resemblance to use #26 - given the technology available at the time.
I also can't help comparing the reception the book got back then (IIRC most people took it as whimsical humour - or outright ROFL, when they saw the pencil sharpener use) with some of the responses the reality generates 30+ years later. Intolerance, fear or hate - three sides of the same coin.
The hardest part ...
> access the site under parental supervision
... Isn't preventing children from accessing the internet (or FB, whichever is more "interesting" to them). The biggest obstacle is overcoming parental indifference. Maybe the easiest way to force parents to take an interest in the doings of their offspring (and maybe cleaning the 'net up as a beneficial side effect) is to somehow require the family credit card to be registered against little Jonny's FB account. That way, even if those responsible for him/her don't feel the inclination to perform their duties, the possibility of all their ~~benefits~~ beer-money draining away might appeal to their venal instincts and lead to the desired effect.
Considering that it's so much more useful than FB, I'm surprised how little it went for.
Tanks for the idea
If I was designing an autonomous vehicle to go motoring around the garden hacking down plants, grass and anything else that got in its way, I'd be drawn to an RC model tank as the basic platform. Apart from the sheer "cool" of a tracked vehicle, I reckon that when you scale up the natural contours of a less than perfectly flat lawn, that's a reasonable match for something designed to drive over battlefield terrain.
Depending on the amount of ground clearance, there should be scope to mount a rotating-wizzy thing under the chassis, safe from prying fingers, nosy cats and slow-moving grannies. There's also the long-term possibility of doing something with the gun turret. I'd suggest using it as a means of delivering systemic weedkiller to dandelions in the path of the all-conquering garden-force.
Kinda opens the door
Given that this was an offensive, pre-emptive operation by one or two states against another "enemy" state, it will be interesting to see if the USA and Israel (if they really were in cahoots to make the attack) can retain any moral authority in the internet-as-a-battlefield stakes. It does seem that none of the perpetrators will be able to go bleating to anyone if some other state (or maybe even non-state) decides to go after them: either in retaliation or just for LOLs - after all, they started it.
It also makes you wonder if this undermines the americans' ability to prosecute cyber-attackers, since they are not above using the same tactics themselves.
Since cyber attacks are a very good match to asymmetrical warfare, by pulling the cork out of the proverbial these guys may well find that any baddies with a botnet or two could wreak much more damage on them than they managed to inflict on their (first?) target.
I just wonder how many moves ahead their strategists were thinking when they decided to start down this particular road?
Should've gone to screensavers
OK, it would be nice to have a screen that could actually display a 1:1 rendition of the multi-megapixel snaps that our hyper-giga-sooper-megabyte cameras and phones (complete with their mess-produced, fixed-focus little plastic lenses) can take. But that's about it. All that will happen then is people will begin to see the Emperor's New Clothes of a 14Mpix camera that is bugger all use if the shot isn't perfectly focused, and taken with a decent lens (read: costs more than the camera) with a noise-free image sensor, and no camera-shake.
So far as looking at internet ~~porn~~ pictures goes, unless they get re-scaled to a suitable DPI, which obviates these extreme resolutions, they'll be about the size of a postage stamp. Text, likewise.
As for movies, even 1920x1080 formats will need to lose the benefits of all those millions of pixels just to fit properly on the screen - unless you're planning on watching 4 movies simultaneously.
Finally, who actually has eyes that could discern such high resolution? Sure, if you have eyes like an eagle and are viewing in a well-lit (but reflection-free) environment then you might possibly get some benefit from a 4Mpix screen on a 13-inch display, but for ordinary people, with or without fully corrected vision, viewing from a sensible distance, this seems like a "we'll do it because we can" strategy - just like the megapixel marketing campaigns are with digital cameras.
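The eyesight point is easy to put numbers on. A 20/20 eye resolves roughly one arcminute, so the finest useful pixel density falls off with viewing distance. A rough sketch (the 2560x1600 panel on a 13.3" diagonal is an illustrative assumption, not any specific product):

```python
import math

def max_useful_ppi(viewing_distance_in):
    """Pixel density beyond which a ~1 arcminute eye can't resolve pixels."""
    one_arcmin = math.radians(1 / 60)
    return 1 / (viewing_distance_in * math.tan(one_arcmin))

# Assumed high-res laptop panel: 2560x1600 on a 13.3" diagonal
panel_ppi = math.hypot(2560, 1600) / 13.3
print(round(panel_ppi))            # ~227 ppi on the panel
print(round(max_useful_ppi(20)))   # ~172 ppi resolvable at a 20-inch distance
```

So at a typical laptop viewing distance such a panel already out-resolves a 20/20 eye, which rather supports the point.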
Sounds like a wise course of action
This is excellent planning by the sounds of it. Just like we're told "never install version 1", it's only sensible to let someone else take the risk, find the problems and iron-out the bugs before committing to a new way of "doing" computing.
Maybe once all the concerns have been sorted out - getting your stuff into a cloud environment, getting it out again if the worst happens (and it will), learning how to deal with cloud suppliers who go bust or outfits that don't have top-rate security or service provision, learning how to recognise all of these pitfalls, and working out where the hell your data actually resides and who controls it - we'll then be in a position to ask the basic question:
"What real, hard, monetary and business benefits do I get from handing over the IT part of my business to some complete strangers?"
If the answers to all these points make it clear there are benefits and manageable risks, then - and only then - would it be worth considering.
> We look at the tablet and we think it's going to fail
which it will do eventually. And once it does fail there doesn't appear to be any alternative but to buy another (except, of course, NOT buying another). That's the genius.
Oh, you meant the business model?
All government agencies have their own preservation as the top priority
Hence, any change provides an excuse to add cost, bureaucracy, oversight, additional management and more "information". Even going back to the old ways adds more supervision, time, people and cost into the department.
Just as 1984 is the de-facto handbook for government surveillance of its enemies - or "citizens" as we used to be known - so Little Dorrit¹ has been the "bible" of every government department for the past 150 years. The only way to break free of the ever-increasing costs, restrictions, required approvals and form-filling is a very long wall and an outsourced firing squad. Sadly, the revolution's been cancelled on Health and Safety grounds - until a full risk assessment is completed.
 The Circumlocution Department, specifically
All you need to read
> ... [people] with higher levels of scientific and mathematical knowledge are more sceptical
and that's all folks!
It's not about climate change, voodoo, astrology, psychology or the latest health fad. It's just a state of mind. Everyone's on the spectrum between iconoclast and true believer. It's just that more people with more rational knowledge will tend to ask "why?" and not be fobbed off with responses that don't stand up to reason.
Re: No surprise
> Cost of manufacture doesn't suddenly go up because one factory goes offline.
Well, it can do. If an industry loses a percentage of its supply AND the other, unaffected, suppliers need to step up their production to meet demand, there IS a cost to doing that.
These days most manufacturing runs with very little slack. If a plant is designed to make 100,000 gizmos a week then it'll be making that many. Asking for 110,000 gizmos won't just be a case of turning the production line speed up to 11. It'll need more investment, more raw materials (or parts: from subcontractors who in turn are working at 100% capacity), more workers to be trained, more factory floor space to be built, more storage, packing, clean-rooms and testing. In fact: more of everything.
If the expectation is that when the "lost" production is restored all that extra investment will be standing idle, you can't really expect the plant's owners to finance that expansion without wanting to recoup their costs.
Don't try this without expert supervision. Getting your tongue stuck in the tread at speed could well be grounds for divorce.
But does it fail safe?
If this "platoon" is dependent on the lead lorry to provide guidance, what happens when LL fails, breaks, or loses its wifi?
I appreciate that this is more of a testbed/demonstrator than a viable option, but the key question isn't so much "can it be made to work?" as "what happens when it fails?" Even requiring each vehicle to have a driver who could take over isn't a complete solution: that driver may be busy doing something else - reading the paper, having lunch, getting "cosy" with the passenger, leaning out of the window trying to lick the tyres, or whatever else bored drivers get up to. If the driver can't get back to a position where they can take over quickly, and the car doesn't do something sensible on its own, then the system can't be usable.
Hopefully this particular implementation won't crash and kill everyone involved each time it goes past a roadside cafe offering free WiFi!
But kilowatt-hours are a useful unit, as they have a direct connection to people's experience and their consumption of electricity. It's easy to see that running a 1kW electric fire for one hour uses exactly one of the units that electricity is billed in - and from that follows the cost of your action.
In the same way, we measure petrol consumption in miles per gallon, km per litre, litres per 100 km or some other variant of distance and volume. We don't feel the need to consider that distance is measured in units of length and the "per" is dividing that length by a volume (i.e. length cubed) unit. That would mean that logically petrol consumption should be stated in "inverse square feet" or some similarly meaningless definition.
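Taken literally, the dimensional analysis does work out: volume per distance is an area, and distance per volume is an inverse area. A quick sketch (the 5 l/100km figure is purely illustrative):

```python
# Fuel consumption really is an area, dimensionally speaking.
LITRE_M3 = 1e-3   # 1 litre in cubic metres
KM_M = 1e3        # 1 km in metres

def l_per_100km_as_area_mm2(l_per_100km):
    """Express volume-per-distance consumption as an area in mm^2."""
    area_m2 = (l_per_100km * LITRE_M3) / (100 * KM_M)
    return area_m2 * 1e6   # m^2 -> mm^2

print(round(l_per_100km_as_area_mm2(5.0), 4))   # a 5 l/100km car "is" 0.05 mm^2
```

So a family hatchback could honestly be badged as a 0.05 mm² car - which is exactly why nobody writes it that way.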