2369 posts • joined 10 Jun 2009
Ad hominem attacks are OK by me when they consist of revealing that someone is a hypocrite. Because a hypocrite is someone who should be attacked for what he is. The adjective "nauseating" is often applied, because any decent person will be sickened by anyone who preaches one thing in public and engages in its opposite in private, regardless of whether they support his public position or oppose it.
Good luck with Windows 7 on an ancient, underpowered netbook. It's the most resource-heavy Windows Microsoft has produced to date. Personally I think 4GB is its practical minimum RAM (and a netbook can't take more than 2GB). Windows 8 is actually far less demanding.
On the other hand, if you have a Lenovo T400, Windows 7 is fine once you have upgraded the RAM to 4GB. If you also replace the HD with an SSD ... who needs a new Windows notebook?
Wish you'd reported the kid's feelings about other Linux desktops (KDE, GNOME 3, Cinnamon), but I guess that's a bit OT.
Re: good, but ...
MRAM or RRAM (HP's memristor tech).
But chip stacking as a technology might be valuable in conjunction with either of the above, to deliver truly awesome bandwidth. Or, stack RAM on the CPU, to reduce interconnect speed-of-light latency. Stacking is almost certainly worth researching.
Re: Just don't get me started on
Caesium for an even more explosive reaction. http://www.youtube.com/watch?v=QSZ-3wScePM (Open University broadcast. I have to wonder how many out-takes it took before they got the Caesium just right :-)
Re: Coming up next on Mythbusters....
Judging by what Coke does to teeth, keyboards and electronics, the answer should be a resounding "Yes and you don't need the Mentos". What's a Mento anyway?
at least this one is under cover.
Actually, I'm sure that's what I'd hate most about it. It's possible that an Amazon warehouse has a transparent roof, or even just skylights, but somehow I doubt it. A day spent with nothing but industrial high-efficiency lighting would really suck. I once threatened to resign if I wasn't relocated somewhere with a window. (They moved me rather than calling my bluff. I meant it.)
I could walk eleven miles in a day and enjoy it (if sunny and outdoors) until I reached my 40s. I probably still could after a week or two toughening myself up. Ask someone in the army what they're expected to be capable of. And that's before hostiles start shooting at you. And they say they enjoy it!!
Re: Comp Sci degrees were sold to many kids looking for a well paid job....
You don't want a CS graduate. You want a Physics graduate who enjoys programming.
Doomed to fail
Unless there's a measurable difference in the isotopic makeup of Australian and African Tantalum, this is doomed to fail for lack of verifiability. Put bluntly, we'll just create a financial incentive for smugglers and corrupt miners. We'll be paying more. The horrible slave economy in Africa will be getting much the same for much the same. The new intermediaries will be pocketing the difference.
A better answer would be to make Tantalum obsolete. It's used for capacitors, isn't it? Can't we work out how to make Graphene capacitors (or something) with four billion dollars spent on research?
Re: 13 billion light years old or 13 billion light years away?
Back to relativity and that difficult concept "now". The light we observe has been travelling for an estimated thirteen billion years. The space it has been travelling through has been expanding. It was a lot less than thirteen billion light years distant when that light set out. By extrapolating, we can work out that light emitted by that object at this time (insert relativistic caveats) won't get here in thirteen billion years. Forty billion might be a better guess. Or not. It's a long time to wait.
If the expansion of the universe really is accelerating, maybe that object will move right outside the boundary of our observable universe at a future date. (In that case it will become more and more red-shifted, until the redshift becomes infinite and it disappears).
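For anyone who wants to check the extrapolation, here's a back-of-envelope sketch in Python. It numerically integrates the comoving distance in a flat Lambda-CDM model; the Hubble constant, the density parameters, and the choice of z = 11 (roughly thirteen billion years of light travel time) are all assumed round numbers for illustration, not figures from the article.

```python
# Back-of-envelope check of the "forty billion light years" hunch:
# comoving distance now to an object whose light has travelled ~13 billion
# years (redshift z ~ 11), in a flat Lambda-CDM universe.
# H0 and the density parameters below are assumed round numbers.

H0_KM_S_MPC = 70.0              # assumed Hubble constant
OMEGA_M, OMEGA_L = 0.3, 0.7     # assumed matter / dark-energy fractions
MPC_TO_GLY = 3.2616e-3          # 1 Mpc = 3.2616 million light years

def E(z):
    """Dimensionless Hubble parameter for a flat matter + Lambda universe."""
    return (OMEGA_M * (1 + z) ** 3 + OMEGA_L) ** 0.5

def comoving_distance_gly(z, steps=100000):
    """Comoving distance in billions of light years (midpoint rule)."""
    c = 299792.458              # speed of light, km/s
    dz = z / steps
    integral = sum(dz / E(i * dz + dz / 2) for i in range(steps))
    return (c / H0_KM_S_MPC) * integral * MPC_TO_GLY

print(round(comoving_distance_gly(11.0), 1))   # low thirties, in Gly
```

So "forty billion" overshoots a little for these assumed parameters, but the shape of the argument holds: the present distance is far more than thirteen billion light years.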
Re: Not as safe as people believe
To say nothing of PCs and other electronics with exploding power supplies. I once watched a switch turn itself into a small flamethrower. It's a good thing I was there to yank its power cable. Another time a PC PSU went BANG and took out the power to about forty rooms.
And don't talk to me about water heaters.
Re: Isn't Nicotine itself harmful?
Well, if the toxic effect on the vaper himself really is no worse than a serious espresso habit, I'm willing to accept that passive vaping is unlikely to be a serious hazard once the vapour is diluted in a large volume of air. Which just leaves the aromas.
During the hayfever season, most strong aromas and scents can send me into a sneezing fit. Vapers, please note. (This occasionally happens on the tube in response to perfumes. Any heavy perfume-wearers reading this, please fix in your memory that I enjoy my sneezing even less than you do, and that it is YOUR FAULT -- although I'm usually too polite to point this out.)
Isn't Nicotine itself harmful?
Cigarettes cause cancer and heart disease. The former can be squarely blamed on the carcinogens in the tar, but the latter? The blame was usually laid with the nicotine ... is this now believed to be incorrect? (Citations rather than opinions sought).
If Nicotine is indeed a cardio-toxin, there is good reason for banning vaping in a public space or workspace. As for smokers turned vapers, they're reducing the harm to themselves, but by how much? Are they reducing their nicotine intake, or increasing it?
Time to re-christen the local pub?
There are lots of pubs called the World's End, because back when Jeremiahs were proclaiming the end of the world on an annual basis, the local reprobates would gather at the pub for one last drink before their eternal damnation. When the end didn't happen, they had a few more drinks and renamed the pub.
Anyway, after that ramble, I'd suggest that this estate's local be renamed the Cock and Balls. With a perfectly clean pub sign, of course. And a Google Maps reference.
Re: Wintel irrelevance == x86 irrelevance
If it doesn't need Windows, it doesn't need Intel.
Hmmm. Where else can I buy a really fast server for crunching numbers or big data using Linux? Say 4 CPUs, 32 cores, and half a terabyte(*) of RAM? Sun's gone. HP IA64 systems don't make financial sense unless you want to run VMS. Much the same for IBM Power-based systems. AMD have sadly fallen behind again; their Opteron glory days are behind them. GPGPU computing has its place, but so far it's a fairly restricted subset of scientific programming for which a GPGPU is the answer. Don't think I've missed anyone.
(*) There's such a machine crunching away a hundred yards or so from where I'm sitting. It's attacking large sparse matrix problems. If you know what that means you'll know why it needs that much RAM.
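For the curious, here's a rough sizing of why large sparse problems soak up that much RAM, using the common compressed-sparse-row (CSR) layout. The matrix dimensions and fill figures are invented for the example, not taken from the real machine.

```python
# Rough sizing for a compressed-sparse-row (CSR) sparse matrix, to show
# why "large sparse matrix problems" need half a terabyte of RAM.
# The row count and nonzeros-per-row are illustrative assumptions only.

def csr_bytes(n_rows, nnz_per_row, value_bytes=8, index_bytes=8):
    """Bytes for CSR storage: values + column indices + row pointers."""
    nnz = n_rows * nnz_per_row
    return nnz * value_bytes + nnz * index_bytes + (n_rows + 1) * index_bytes

# A billion unknowns with ~30 entries per row, 64-bit values and indices:
tb = csr_bytes(10**9, 30) / 1e12
print(f"{tb:.2f} TB")    # about half a terabyte, before any workspace vectors
```

And a real solver needs several dense work vectors and factorisation fill-in on top of that, so half a terabyte disappears quickly.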
Re: Wintel irrelevance == x86 irrelevance
Intel has nothing interesting to offer the ARM community.
Not true. Intel's process technology is second to none. If they process-shrank and fabbed ARM chips, they would be the best ARM chips on the planet, anywhere in the power-performance envelope.
How else do you think it is that Intel can just about compete in the handheld market with that horrible, warty x86 architecture? But I don't think Intel can hold the fort for much longer. Soon, they will be fabbing ARM designs.
Re: lost the plot
Intel lost the plot when they acquired Alpha from the wreckage that once was Digital, and buried it as "not invented here". AMD picked up a bunch of talented engineers with lots of Alpha know-how in their heads, produced the AMD64 architecture, forced Intel to follow instead of lead, and almost knocked Intel off its perch. If it hadn't been for an Intel "skunkworks" project that was keeping the original Pentium-3 alive, when the Pentium-4 architecture hit the speed barrier Intel would have been finished.
This battle, which AMD ultimately lost, is probably why Intel didn't spot the threat that ARM and handheld devices posed until it was too late. (I think AMD did spot the threat, but didn't have the corporate strength to respond sufficiently). In another universe, Alpha could have been stripped back to its origins, producing a low-power chip more than the equal of ARM, and with Intel's process technology behind it ....
I still dream of a world where the dominant 64-bit architecture is Intel Alpha, and where x86-32 is ancient history. Intel took one of those wrong turns on which empires totter and fall. I still have the Alpha Architecture handbook to remind me how a really good CPU might have been.
Microsoft is destroying itself, by failing to understand and nurture its own business niche.
It had a near-monopoly on the business desktop. That was its niche. Even after losing the server-protocol war with the EU, Office should still have been an all-but-unassailable advantage.
So what does it do? Instead of listening to its business customers, it listens to the numerically larger crowd of home users who bought an MS system for lack of an alternative. Then it redesigns all the interfaces to suit its idea of Joe Public (which appears to be even more drooling than Joe actually is). Office has suffered three interface redesigns in ten years, each reducing the productivity of a skilled user of Office 2003. Now they've done the same to Windows.
Joe is unimpressed. There's Apple, offering a superior consumer-orientated product range at a superior price. There's a host of companies selling Android pads into the fastest-growing non-keyboard sector. Joe never had a good reason to buy Microsoft, it's just that to start with there weren't any alternatives.
What hasn't materialized yet is a really good Linux- (or Android- ) based business desktop solution set with a big corporation backing it. IBM? Apple? Oracle? Samsung? China inc.? The next young Nokia (who started off growing trees and making tractors before getting into cellphones almost by accident)? Even (long odds) a fully open community project? Whatever, when that business alternative arrives, Microsoft will be finished, because for the last decade it's given them every last reason and more to regret their dependence on Microsoft. A trickle of defections will become a flood.
It's got nothing to do with which CPU goes inside at all.
why the military were so eager to get involved from the outset ?
Actually the military invented it before civilians knew what a computer network was. It was designed to maintain communications during and after a nuclear war, and probably would.
"Military intelligence" is reputed to be oxymoronic, and certainly isn't what most of the public need to be concerned about. It's the non-military state internal intelligence agencies that concern us, especially when the rules applied to us are not equally applied to our masters / political superiors / rulers / pond-slime / whatever you call them.
Re: On the other hand....
Actually it's quite a neat way of making Mathematica available to kids without cannibalizing the existing market among their parents. Don't cripple the product, just put it on very low-powered hardware. For the sort of uses that a non-genius schoolchild will make of Mathematica, an RPi will be fine. For the sort of uses that a university-graduate parent is likely to want to make of it, an RPi mostly won't be.
As to whether letting a kid use Mathematica before he knows any pure mathematics is wise, I'll reserve judgement. I'm certain that giving schoolkids calculators has rendered them number-blind (e.g. incapable of telling that whatever 3547+2974 might be, it's not 4xxx or 5xxx). There's a danger that Mathematica might blunt any true mathematical insight that they might be trying to develop, rather than the opposite.
Re: Best name evar
A vacuum cleaner. With a Dyson you can even see the gamma-ray axis.
Re: The "interesting" number ...
If the gamma flux is significant, it doesn't matter that only one hemisphere of the Earth is exposed.
The danger is not radiation sickness. It's bulk ionisation of the atmosphere. Break down a significant percentage of the oxygen and nitrogen molecules, and they'll recombine into Nitrogen Oxides by the megatonne. Cue (a) instant destruction of the ozone layer and (b) decades of Nitric Acid rain.
Land life is the most vulnerable to these. The oceans are a very large sink, so life in the deep ocean has the best chance.
It's not very likely, though it fits the fossil pattern of one of the mass extinctions as well as anything else. One thing's for sure: we wouldn't see it coming.
Re: 10-bit multiplexed optical
Optical wavelength demultiplexing ought to be pretty straightforward. You use a prism or a grating. I'm not an engineer but I suspect the demultiplexer is the least of the problems!
The problem is that the same mechanism that separates wavelengths in the prism causes dispersion of signals in the fibre. At best, the various optical carriers will get time-skewed in transit, and that has to be compensated for. That's something that will vary with the temperature of the fibre.
At worst, dispersion in the fibre will scramble the data modulated onto the optical carriers. If that's not a killer, you still have to make modulators, a multiplexer and a coupler. Rather them than me.
It's far easier to transmit 40 packets in parallel down 40 fibres than to try to make a 400Gbps single channel. Which makes me wonder: why is a 400Gbps single channel needed at all? Cf. one 400GHz CPU -- is that even physically possible? -- compared to the 1000 400MHz cores that you might find in a GPGPU.
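To put a rough number on the time-skew worry: standard single-mode fibre has a chromatic dispersion of around 17 ps/(nm·km) at 1550 nm. The channel count, spacing and span length below are assumptions for illustration, not from the article.

```python
# Ballpark for the WDM time-skew problem: chromatic dispersion in standard
# single-mode fibre at 1550 nm is roughly D = 17 ps/(nm*km). The channel
# grid and span length below are assumed for illustration.

def dispersion_skew_ps(d_ps_nm_km, channel_span_nm, length_km):
    """Arrival-time spread between the outermost WDM channels, picoseconds."""
    return d_ps_nm_km * channel_span_nm * length_km

# Ten channels on a 0.8 nm grid (7.2 nm edge to edge) over a 100 km span:
skew = dispersion_skew_ps(17, 7.2, 100)
print(f"{skew / 1000:.1f} ns")   # 12.2 ns -- hundreds of symbol periods at 40 Gbps
```

Hundreds of symbol periods of skew per span is exactly the sort of thing that has to be measured and compensated, and it drifts with temperature.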
Anyone else pondering what might be labelled 802.3BT and 802.3bu ?
Re: No nukes!
1970? Don't you mean pre-1870 levels? Or maybe pre-1070 levels?
Actually, if the world put all the resources it devotes to warmaking ("defense") into building massive solar plants in the Earth's most barren deserts, we could support current populations with completely renewable energy. Covering a mere hundredth of the Sahara desert alone would generate all the electricity that mankind uses today. Cover a few more percent to generate electricity in place of fossil-fuelled transportation and heating. The only thing that we'd have to give up completely is air travel. Long-distance ocean travel would become expensive, or (and?) as impossible to timetable as the winds.
BTW the area that would need to be covered is about the same as the area of the planet that is already covered by roofs.
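A quick sanity check of the "hundredth of the Sahara" arithmetic. All the inputs are assumed round figures: the Sahara's area, a typical desert's round-the-clock average insolation, whole-system efficiency, and world electricity demand.

```python
# Sanity check on the "hundredth of the Sahara" claim.
# Every figure here is an assumed round number for illustration.

SAHARA_KM2 = 9.2e6          # rough area of the Sahara
FRACTION = 0.01             # one hundredth of it
INSOLATION_W_M2 = 250       # rough 24h/365d average for a good desert site
EFFICIENCY = 0.20           # assumed whole-system conversion efficiency

area_m2 = SAHARA_KM2 * FRACTION * 1e6
twh_per_year = area_m2 * INSOLATION_W_M2 * EFFICIENCY * 8760 / 1e12

print(f"{twh_per_year:,.0f} TWh/yr")   # roughly 40,000 TWh/yr
```

Against a world electricity consumption somewhere in the low tens of thousands of TWh per year, the claim comes out about right for these assumptions: around one percent of the Sahara covers it, with headroom.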
Re: New Heads
Not my experience. Mine is still on its original (starter-sized) cyan cartridge, many months, many reams and a lot of black-only printing later. I'd second the comment that there's something wrong with it. Or ... you don't turn it off when you're not using it, do you? At the wall socket, rather than with the off button? That's a really bad idea.
Re: Why is this a surprise?
The Vikings weren't particularly hot on writing down their history. I think it's quite likely that more of them went further in North America than is generally realized. Genes don't record who raped whom. (And propaganda being what it is, if there had been peaceful, occasionally loving, co-existence between native Americans and Vikings, the warmongers on both sides would have called them traitors and erased them from the oral histories as soon as possible.)
BTW the Chinese discovered the West coast of North America before Columbus. In a parallel universe, the emperor didn't order these explorations to cease, didn't burn the fleets to make sure, and it was the Chinese empire not the British empire where the industrial revolution took place.
Re: Why have the OS manage it?
All systems with virtual memory have the O/S managing the physical pages of memory (and backing storage). The hardware does the virtual-to-physical address translation when the data has a current physical address, and throws a "page fault" when the data has to be moved from backing storage to a free physical page. The O/S handles the page faults. With different classes of RAM, the O/S will also manage movement of data between near and far physical pages when necessary, while the virtual address the programmer uses won't change.
Re: Why ?
Chances are it'll be well-hidden down in the O/S's virtual to physical page translation, so it'll look like one big contiguous space if your program isn't the sort to need every speed optimisation it can find. You'll probably have access to an extended malloc with a flag to request near memory, and be able to request that pages be locked in near or in far memory. The paging system will probably have algorithms to move busy pages of far memory into near memory and to move idle pages of near memory outwards.
There's already a sort of near/far distinction on some multi-CPU systems, in that memory is local to a CPU or local to a different CPU. On the Quad CPU AMD system I once looked at in detail, 1/4 of the memory was local, 1/2 was one CPU hop away, and 1/4 was two hops away (so effectively three levels).
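For the curious, here's a toy sketch of the translate / fault / migrate mechanics described above, with a "near" and a "far" tier of physical frames. Purely illustrative: a real O/S does this in hardware-assisted kernel code, and the promotion threshold here is an invented constant.

```python
# Toy model of virtual memory with two tiers of physical frames.
# Unmapped accesses "page fault" into the far tier; busy far pages get
# migrated to near frames. Thresholds and sizes are invented for the demo.

class TwoTierVM:
    def __init__(self, near_frames, far_frames):
        self.free = {"near": list(range(near_frames)),
                     "far": list(range(far_frames))}
        self.page_table = {}     # virtual page number -> (tier, frame)
        self.hits = {}           # virtual page number -> access count

    def access(self, vpage):
        if vpage not in self.page_table:
            # page fault: bring the page in, initially to a far frame
            self.page_table[vpage] = ("far", self.free["far"].pop())
        self.hits[vpage] = self.hits.get(vpage, 0) + 1
        if (self.page_table[vpage][0] == "far"
                and self.hits[vpage] >= 3 and self.free["near"]):
            # busy far page: migrate it to a free near frame;
            # the virtual page number the "programmer" uses never changes
            _, old_frame = self.page_table[vpage]
            self.free["far"].append(old_frame)
            self.page_table[vpage] = ("near", self.free["near"].pop())
        return self.page_table[vpage]

vm = TwoTierVM(near_frames=2, far_frames=8)
for _ in range(3):
    vm.access(7)                 # third access promotes the page
print(vm.page_table[7][0])       # -> near
```

The point of the sketch is the last comment: data moved between tiers, but the virtual page number used to address it stayed the same throughout.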
Re: If you want your private life to remain private
don't post it on the internet
A corollary is that someone else will post it, if you did it in public. As the old adage goes,
"Don't make love by the garden gate // Love is blind but the neighbours ain't"
I think the deeper psychological problem is an expectation of forgetfulness. The internet's memory isn't human, and rarely forgets anything. (Something that can also be said of a certain type of sociopath).
Anonymous currency is a destabilizing problem for any country attempting to hold legitimate elections.
Not for the elections. Provided they are secret, as a voter you take the bribe and then vote however you please, with no fear of retribution. The real problem is corruption of the government or its officials by someone willing to splash a lot of untraceable cash around. It's an even worse problem if the government is not properly re-elected on a regular basis (witness China). Free, fair and secret elections tend to result in the most corrupt politicians being unseated ... eventually, once the electors notice the bad smell.
Inflation can be good
It's not well known that the drawback of hard money (gold, silver, bitcoins) is that if they are adopted as a nation's sole currency, they inevitably drive the vast majority of the population into serfdom. If they are combined with usury (lending money for interest) this happens so much faster.
In the Middle Ages, the gold all resided in rich men's treasure chests, the silver in merchants' safes, the copper in freemen's hidey-holes, and everyone else was a serf living fifty meals away from death by starvation. It took a war (usually civil war) or a plague to (temporarily) jog the system out of otherwise permanent economic stagnation, depression and deflation.
Eventually the Black Death sowed the seeds of a modern economy, by killing half the population, making labour scarce, and redistributing the rich men's coffers (creating inflation in the process). The Spanish then unwittingly kept the ball rolling by plundering all the gold from the natives of what became Latin America ... which imported inflation first to Spain and then, in a more beneficial form, to the rest of Europe. By the time that stimulus dried up, fractional-reserve banking and paper money had evolved to a modern form. Which has its own problems, especially when no longer judged against gold bars, but which (so far?) has not proved quite as disastrous as the alternative.
If bitcoins ever become a global store of value as well as a global medium of exchange, we are in big trouble (deflationary collapse). Ditto, if there's ever one world paper currency and one world central bank (corruption, then inflationary collapse). It's competition between multiple currencies that keeps the present system going despite the periodic crises. There's probably a parallel to be drawn with a natural ecosystem.
Re: Off line
I suspect diamonds (or drug) will stay their currency of choice.
I would stay away from diamonds as investment.
No contradiction. The former refers to a medium of exchange. The latter refers to a store of value. As a medium of exchange, one cares only about short-term stability, for as long as it takes to transfer the currency from one party to another and then use it to buy one's chosen store of value. And with a store of value, one can sacrifice things that are vital for a medium of exchange. Things such as fungibility and liquidity and portability. An old master painting may be a good store of value, but it is fairly hopeless as a medium of exchange. A currency suffering 50% annual inflation is fairly hopeless as a store of value but still perfectly usable as a medium of exchange.
Re: Yes, HSBC Bank.................
You can't fine the $, but world currency markets place a value on it compared to every other currency and commodity. If the powers that manage the dollar do things widely regarded as deleterious to its value, then that value (as measured in other currencies or commodities) will go down.
They are looking for completely sane well-adjusted people willing to take a one-way trip away from almost all of their fellow humans. Volunteers for a certain death much sooner than would be expected if they didn't go. Hmmm.
Worse, they're looking now, not a few months before departure.
Their best hope would be to look for candidates much closer to departure. They might find some suitable ones, but only if they select from people with terminal cancer who will survive long enough to get to Mars and do a bit of exploring. Such people would also have a lot less to fear from solar flares, since the (almost) worst case radiation damage has already happened to them.
Re: Hard to find faults with PS4, isn't it?
Perhaps I should say that I'll buy one when I can run Linux on it, and they guarantee that this ability will never again be taken away on a whim.
A reactor would be a lot safer.
Seriously: a radio-isotope power source is as "hot" as it will ever be at launch. If the launch fails, some fairly short-lived really horribly radioactive crap gets showered into the atmosphere or the ocean.
A reactor could be launched unfuelled and then fuelled in orbit. Fuel is subcritical pieces of enriched Uranium, suitably packaged. If launch of those failed, pure Uranium (enriched or otherwise) is a lot less harmful than the contents of a radioisotope source. U235 has a half-life of over a billion years. That's almost non-radioactive, unless you build a supercritical assembly and add neutrons.
After it's assembled in orbit, take the reactor critical and send the spacecraft out exploring on ion drive. It won't ever come back. (To be absolutely sure, give it enough fuel to reach Jupiter or wherever, and not enough to make a return flight.) Once the reactor has been running for a while it'll be plenty radioactive (and unscreened). But also far enough away to be safe!
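To put rough numbers on the "almost non-radioactive" claim: specific activity scales as ln(2) divided by half-life. The comparison below against the Pu-238 used in radioisotope power sources uses textbook half-lives; it's an order-of-magnitude sketch only, ignoring decay daughters.

```python
# Specific activity (decays per second per gram) of unused U-235 reactor
# fuel versus the Pu-238 in a radioisotope source. Half-lives are textbook
# values; daughters and impurities are ignored -- order of magnitude only.

import math

AVOGADRO = 6.022e23
YEAR_S = 3.156e7            # seconds in a year

def activity_bq_per_g(half_life_years, molar_mass_g):
    """Specific activity A = lambda * N, with lambda = ln(2) / half-life."""
    atoms_per_gram = AVOGADRO / molar_mass_g
    decay_const = math.log(2) / (half_life_years * YEAR_S)
    return decay_const * atoms_per_gram

u235 = activity_bq_per_g(7.04e8, 235)    # half-life ~704 million years
pu238 = activity_bq_per_g(87.7, 238)     # half-life ~87.7 years

print(f"Pu-238 is ~{pu238 / u235:.1e} times more active per gram than U-235")
```

Seven or so orders of magnitude per gram is the difference between "nasty chemical, handle with gloves" and "really horribly radioactive crap", which is the whole case for launching the fuel cold.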
Re: Ooops. Can you say "Tipping point"?
@Mike Richards. You fail to mention the worst Icelandic volcanic eruption in recorded history: Laki, 1783. http://en.wikipedia.org/wiki/Laki
The eruption had serious repercussions in Europe, with an estimated 23,000 deaths in the UK alone. Globally there was climate disruption lasting several years.
The governments of East Germany and the USSR failed ...
... to stop their citizens getting access to "subversive" literature, just as soon as the most rudimentary data-copying technology became available. They just about kept a lid on the typewriter and carbon paper, but once the photocopier reached the Soviet bloc, they'd lost. (The DPRK seems to have learned this lesson: you can keep your masses down by denying them any technology more advanced than mediaeval, and ruling them in a like manner.)
Yes, Google can remove most of the filth from their indexes, but how anyone thinks they can deal with sheets of paper containing lists of URLs distributed by paedo-sneakernet, or peer-to-Tor-to-peer, heaven only knows. The servers will of course be some wholly innocent organisation's PC, running malware.
Re: Not sure this is so impressive, and this is dangerous...
Computers, even massive systems like Google's, don't really have the chance to perform actions that affect the world around them.
I don't think this is correct. One trains a neural network by "rewarding" it (+n) for getting decisions right, and "penalising" it (-n) for getting them wrong. It has a built-in imperative to try to maximise its score. If it has any consciousness at all (I hope not), that consciousness is of a virtual environment of stimuli, chosen responses, and the consequences of those choices. (It would have to be a pretty darned smart virtual critter to start suspecting that it's in a virtual environment embedded in a greater reality. Human-equivalent, I'd hazard.)
A very simple life-form (an earthworm, say) can be trained to associate following certain unnatural signals with food, and others with a mild electrical shock. It'll learn to distinguish the one from the other. Just how is this different? If you attribute self-awareness to an earthworm but not to the neural network model, move down to a less sophisticated organism. It's possible to train an amoeba, even though it altogether lacks a nervous system!
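If anyone doubts that a bare reward signal is enough, here's a minimal table-driven "critter" that learns the food/shock distinction from +1/-1 feedback alone. It's not a real neural network; the update rule, learning rate and exploration constant are invented for illustration.

```python
# Minimal version of the reward/penalty loop described above: a score table
# learns to approach the "food" signal and avoid the "shock" signal purely
# from +1/-1 feedback. Constants are invented for the demo.

import random

random.seed(42)
q = {("food", "approach"): 0.0, ("food", "avoid"): 0.0,
     ("shock", "approach"): 0.0, ("shock", "avoid"): 0.0}

def reward(signal, action):
    """+1 for the 'natural' response, -1 otherwise."""
    if signal == "food":
        return 1 if action == "approach" else -1
    return -1 if action == "approach" else 1

for _ in range(500):
    signal = random.choice(["food", "shock"])
    if random.random() < 0.1:                       # occasional exploration
        action = random.choice(["approach", "avoid"])
    else:                                           # otherwise act greedily
        action = max(["approach", "avoid"], key=lambda a: q[(signal, a)])
    # nudge the score for this (signal, action) pair toward its reward
    q[(signal, action)] += 0.1 * (reward(signal, action) - q[(signal, action)])

print(max(["approach", "avoid"], key=lambda a: q[("food", a)]))   # approach
print(max(["approach", "avoid"], key=lambda a: q[("shock", a)]))  # avoid
```

Nothing here "knows" what food or a shock is; the distinction emerges entirely from the score-maximising imperative, which is the earthworm point in miniature.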
Re: And they expect to get it?
Yes, this patent is a bunch of already-invented hardware strung together in a way that ought to be deemed obvious. So it ought to fail should it ever be challenged. (I haven't read the fine print -- if there's a truly novel concept in the details of the creation of a 'loon network, then my comment does not apply to that.)
One of the ways the patent system is broken is that patent offices issue patents on almost anything, as long as the applicant's cheque is good. Then if someone wants to challenge a patent, he needs very deep pockets to pay his lawyers. So it's a system for the benefit of large companies that works against individuals and small companies.
Memristors have many advantages over flash, but the areal density advantage is not a large one. My money is on nonvolatile RAM (the memristor) over large-block-addressed, limited-rewrite flash, but I also expect multi-terabyte disk drives to be around for the foreseeable future. A lot depends on whether the HDD manufacturers get the bit-patterned media plants and the two-digit-TB drives built before SSD eats their bread-and-butter market. They might decide to stop investing in future bigger HDDs because the HDD business doesn't have a future (which would be a strongly self-fulfilling prophecy).
The price of N terabytes of SSD will always be N times the price of one terabyte (until they can make a 1TB nonvolatile storage chip, if ever). The price of one disk drive will be £50 plus whatever they can get for it being bigger than cheaper ones. If a 10TB or a 50TB drive is ever marketed, it's a fair bet that five years later it will be available for £50 in today's money.
Wafer-scale SSD integration might one day put a terminal spanner in the HD works, but wafer-scale integration is something that's been coming for almost as long as nuclear fusion, and like fusion we still don't have it.
But I don't think they've built (gigabyte state-of-the-art) Flash fabs in China yet. The moment Intel or Samsung or whoever do that, they've given all their know-how to the Chinese. So it's the power cost in whatever country the Flash fab is in that you need to look up.
One can deduce an upper limit on the energy input from the sale price of the chips and current industrial electricity costs.
A 10 Watt disk drive running 24 hours a day for 5 years uses 438 kWh; at 10p/unit that's £43.80. Depending on size, an SSD may not cost a lot more than that, and it's the cost of the chips it contains you need to use for your upper energy-input limit, not the cost of the completed, tested, packaged and warrantied assembly.
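Spelling the arithmetic out, for anyone checking at home:

```python
# Watts x hours gives watt-hours; divide by 1000 for kWh, which is the
# "unit" on a UK electricity bill.

watts = 10
hours = 24 * 365 * 5            # five years of continuous running
kwh = watts * hours / 1000
cost_gbp = kwh * 0.10           # 10p per unit

print(kwh, f"£{cost_gbp:.2f}")  # 438.0 £43.80
```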
Not long-term figures. SSD tech has been developing extremely fast, disk tech less so, but in both cases by the time any device can be pronounced long-term reliable, it's also obsolete. The manufacturers do "accelerated ageing" tests but don't have a TARDIS. They have to substitute heat, humidity, vibration, extreme usage patterns ... and time axis scaling laws of very doubtful validity.
To put it bluntly, the fact that only 1/1000 of your test sample has failed after 12 months of torture testing doesn't mean that 90% of them won't have failed after five real-time years of gentle usage. And in fact we see the unexpectedly-bad-ageing problem every time a manufacturer buys a bad batch of components. Then there's a flood of "WD / Seagate / Hitachi are cr*p" messages, really meaning "model xxxxxxx with serial numbers between xxxxxxxx and xxxxyyyy is likely to fail prematurely because the ZZ corp supplied some bad widgets". It would be nice if the manufacturers put out recall notices like car manufacturers do, but of course for a £50 (or even £150) disk drive they cannot afford to.
Re: all down to $/GB
The base price of a disk drive is about £50 (they can be made cheaper, but at some sacrifice of long-term reliability). It's been that price since the days it bought you just a few gigabytes.
They have technologies that will permit 50TB disk drives working in the labs. I expect they'll be on the market within a decade. I doubt that 50TB of SSD will ever be competitive on price.
One other thing: are we sure that SSD really is more reliable? The technology has not been in use for very long. One thing we do all know: an SSD is likely to fail "just like that", with no advance warning. HDDs frequently (though not always) give warning of pending failure and permit pre-emptive replacement. My instincts say wait a few more years before betting the farm on SSD storage. Also that memristor tech will supplant flash SSDs -- true random access instead of large-block addressing is a huge plus.
Re: "the fast lane"
There are official "crawler lanes" on hills, so that means that the other lanes at that location must be at least faster lanes if not fast lanes.
Re: Cell phone yakking != good driving
A driver, in the fast lane, went from 65mph to 50mph because he got a call on his cell phone. No brake lights, he simply took his foot off the gas and kept driving at 50mph with his cell phone in his hand.
Which was blatantly illegal (in the UK). Use of a hand-held mobile while driving a car should be made illegal in any jurisdiction where it isn't already.
I'd assumed this article was about hands-free mobiles? In which case I can't see the difference between talking on a mobile and talking with a passenger. Also if cars blocked mobiles, they would not be able to automatically call for help after a serious accident (which may have left the driver and passengers unable to make such a call manually).
chances of B16B00B5 in there?
Or even a DEADBABE? (I shouldn't have read the BOFH of the week.)
Re: This wouldn't have happened if they were using Linux!
That is what worries me. What I want to know is if this is someone clever, or someone with a Very Large Budget