Re: Does it really cost that much to bear foreign data?
1p for the data and £10 for the bean-counting?
A suggestion for Angela Merkel. Give Snowden asylum in Germany. Quote the national-interest exception to any extradition treaties if the USA asks for him. That's how to find out every last detail, and also how to send a very strong signal that we really, really are not amused.
Of course, it won't happen.
@AC 15:27. It's not a common philosophy, but one can believe that nothing really exists except one's own thoughts. "Am I a man dreaming I am a butterfly, or a butterfly dreaming it is a man?" If you work in IT and are familiar with the concept of virtualisation, it's actually a less strange idea than it might be otherwise. How can you prove that you and I are not simulations of human brains, in a historical simulation being run by the beings that supplanted humanity during the technological singularity that has already happened out there in the real solar system?
Occam's razor comes in handy with thoughts like this, but that's an axiom not a proof.
What is the difference between belief in God and belief in No_God? Answer: two letters and an underscore.
The problem is belief without evidence. In a universe containing very good evidence for its own creation, is it really wise to deny all possibility of a creator just like that? As soon as you start thinking that's OK, you are on as slippery a slope to logical inconsistency and moral ruin as those who assert that the creation was 6000-odd years ago because this book says so.
I'm a militant agnostic, otherwise known as a scientist. Yes, I do have to start with a few axioms (which I'll drop if anyone ever finds a way to disprove them). One is that there exists an objective universe outside my head. Another is that Occam's razor is a really good idea.
I wonder if we can set up a country with entry requirements based on support for the scientific method: a citizen is free to worship any god or gods they please, so long as they can first devise a proper peer-reviewed experiment proving that said god(s) actually exist.
I believe in the universe, but I'm b*****ed if I can think of an experiment that proves it exists. What would the control be? I'm not even sure that I can prove that I exist.
Early artillery shells had several brass lugs sticking out of their sides, to engage with the rifling of the gun. Doesn't sound like it should work, does it? On top of which, we're talking muzzle-loaded cannons.
I love the way some of these threads drift.
Didn't Buddhism produce some of the finest warriors of their time (the Japanese samurai)?
Samurai beliefs had wandered a long way from the source. If they were still Buddhist at all, they were a sect on the fringes. And on the fringes of the fringe sat the Japanese suicide-assassin sect, the name of which I've forgotten. Devote your life to learning to kill, kill the man your Abbot ordered you to kill, then kill yourself to attain Nirvana if you survive your mission. Truly bizarre.
Buddhism begat many of the martial arts. The forms of unarmed combat that specialize in redirecting an opponent's attack against himself are probably truest to their source. It is not necessary to become a punchbag to believe in non-violence.
I think Buddhists deserve an honorable mention. No large group of people is perfect, but I'd say they're no less respectful of the beliefs of others than atheists are.
What's happening is that the desktop market is both saturated (in the West) and mature. People who need desktops (for business and for serious hobbies) have them already. Why buy a new one? 1. Because the old one has broken down. 2. Is there a two? Windows 8 is a new reason NOT to buy a new one if the old one still works. Ignoring that, desktop sales will go from one every 3-5 years to one every 6-10 years: down to 40% of the boom-time sales, then flat for the foreseeable future. That's still a huge market.
The winners from Microsoft's incompetence will be Apple and (maybe) Linux. Also I can't see Windows 8 making much of a dent in iPad and Android tablet sales.
The tablet market will soon enough be mature and saturated. I suspect Tablets aren't replacing desktops, except for home data consumers who only ever had a desktop because it was all they could get at the time. Tablets are mostly supplementing desktops.
I would be interested to know what other "box" you think the OEMs should have been building.
That's easy. On the desktop, an iMac-alike that runs Windows and Linux, priced at the usual industry margin rather than Apple's huge mark-ups. (Dell are actually selling something like this - for the first time in ages I was somewhat impressed by a bit of Dell hardware). On the laptop, what we have but with higher-resolution screens (again, much like Apple hardware). 1360x768 isn't enough pixels on twelve-inch screens, let alone 17-inchers.
Just don't import Apple's hardware (un)reliability!
I also believe there should have been a lot of mileage in a Tablet with a passive docking station (stand plus bluetooth keyboard and mouse), where the interface switched from touch-optimised to desktop-optimised when you "docked" it. That's what Windows 8 might have been: competition for the iPad when un-docked, Windows 7 desktop when docked. That it isn't is Microsoft's fault.
Actually Apple gets desktop computing right - the UI on an iMac has nothing much wrong with it. So Microsoft obviously wasn't aping Apple, when it decided that the world really wanted a mobile phone interface on a 1920x1080 monitor (or two, or four).
Microsoft could let manufacturers pre-load Windows 7, and let retail customers choose between 7 and 8 on the same hardware at the same price. (Yes, I know one can buy systems loaded with Windows 7 that come with a Windows 8 "upgrade" disk - but they cost more). That Microsoft won't let the market decide, suggests Microsoft is now running on ego, not economics.
One last thought. You can run a diesel engine on any fuel, if the compression ratio is right. Diesel (the inventor) patented one that ran on coal dust, though don't ask me how he got it into the cylinders. In the old days before fuel injection etc. you could run a diesel car or lorry on petrol with nothing short-term worse than loss of power. I'd hope that the army's diesels still can run on anything: diesel, kero, petrol, cooking oil ... In a war your fuel supplies may be adversely affected by the enemy.
So emergency generators should use specialized diesels running on LPG, I think. Of course a bog-standard lorry engine will be cheaper, but at a higher likelihood of failing when an emergency happens.
Powerpoint is a far lesser crime than using Excel as a database. It's a lesser crime even than using Access as a database!
I already knew that bacteria can thrive on petrodiesel as well as biodiesel.
Some think that the bacteria that cause problems kilometers underground for the oil industry are not always introduced into the crude oil by the drilling process. It's possible that life has been surviving down there since the oil was organic-rich sludge at the bottom of a pre-historic sea.
So why do emergency generators run on diesel, not petrol or (best of all?) LPG?
I just fed "creationist biologists" into Google. It said yes. Depressing, isn't it.
You mean, as in fix your problem by running the registry editor and setting bit 13 of HKCU\two lines of gobbledygook\...
It had to be said.
I think it has become almost a sport to take a swipe at Unity and Gnome. Anyone who uses Unity every day will now take an oath to say how good it is.
I have no idea how true I'd believe that to be about Unity, but it's really bad statistics. You are citing a self-selected sample set. In other words, the folks who hate Unity are almost by definition the folks who don't use it. The only exception will be those who are working for an employer who insists that they use it.
As for Gnome, I slagged off the Gnome 3 folks not for producing a UI that I really disliked, but for packaging it in such a way that I couldn't choose to re-install Gnome 2 on a Gnome 3 distribution. They did a Windows 8 "upgrade" on me, which is not the Linux way. Cinnamon and Mate fixed that in two different ways. Mate is the true Gnome-2 fork, but I found that I preferred Cinnamon over Mate and KDE and XFCE (and all these to Gnome-3). Choice is good, especially if one gets it at log-in time.
in passing I think allowing sample self-selection was Microsoft's big mistake with Windows 8. They didn't realize that the folks who contributed during the pre-release by definition liked it, and the folks who hated it had tried it for an hour or two and then just blew it away. And of course, in the MS world one can't just download and use a different UI or three, and very likely one is told what to use by one's employer.
If you like Cinnamon but not so much the rest of the Mint distribution, Cinnamon is also available for Fedora through the standard package installer. In fact downloading Cinnamon is the first thing I do after installing Fedora, because I really dislike Gnome 3 and prefer Cinnamon to KDE or XFCE. (Haven't checked if Mate is available for Fedora, since Cinnamon made me rather lose interest).
Scientific Linux 7 should be coming fairly soon (3 months after RHEL 7 ships?), and will be based on Fedora 17. I hope that means I'll be able to install Cinnamon on Scientific Linux, which may come fairly close to perfection in my eyes.
I don't recommend Fedora in general because it's a bit bleeding-edge for my taste, and forces you onto an upgrade treadmill in that there's no long-term support for Fedora releases. However, it does tend to have support for the latest hardware, so I reach for it when SL lacks the hardware support I need.
Is there a commercially supported stable Linux that does not charge an arm and a leg for private use? The corporate Red Hat prices are a bit steep, I think.
Depends what you mean by "supported".
If you want to pay so there's someone on a phone who will help you with any "issues", there are various organisations out there who will do that for popular distributions. I don't have any experience of who is how good.
If you mean you want the long-term security updates and platform stability that you can get by paying for RHEL, then look at Centos or Scientific Linux. Both are built from the RHEL open-source code. Both are free. For most usages, they are interchangeable. Both, for example, can use all(?) binary packages from the EPEL repository (Extra Packages for Enterprise Linux).
Centos claims 100% bug-for-bug compatibility with RHEL.
Scientific Linux (the most misnamed distribution?) is an almost-clone of RHEL and can be used virtually anywhere that RHEL can. It might also be mis-named "CERN Linux". It's supported at CERN and used extensively by CERN. The not-quite-a-clone bit is that if CERN finds a bug or needs a kernel feature that RHEL don't fix or ship, CERN will fix or build it themselves. In practice I have never found anything that works on RHEL that doesn't work on the same-numbered release of Scientific Linux. It also ships with auto-updating turned on. Centos ships with that off.
There's always a risk that the Centos distribution might go away. There were reports of bust-ups amongst the project leaders around the time of RHEL 6.0, but it's still with us.
I doubt that Scientific Linux could go away any time before CERN goes away. That makes it my choice for a free Enterprise-grade Linux. YMMV.
What sort of tasks produce this sort of results?
The most spectacular one for me was installing Windows XP + apps (i.e., developing images for deployment). Even after I worked out that VMware had optimized formatting a virtual disk into an (almost-?) no-op, and after adding back the time saved at that point, it was still considerably faster installing into a virgin VM than installing onto the hard disk of the same system natively. Some day, out of curiosity, I may repeat the measurements under native Linux KVM or Virtualbox and with Windows 7, but VMware Workstation on Linux was there first and that's when I was doing this stuff. It's someone else's baby now, and he insists on doing it the all-MS way.
Why? My guess is that Linux's caching of the hard disk to RAM is greatly superior to Windows'. Operations like copying a large folder containing a lot of files were also faster within a VM than when Windows was in direct control of the disk. Or maybe it was just that 32-bit XP couldn't use all 4Gbytes of RAM, but 64-bit Linux could. (In fact so can 32-bit Linux, via PAE, though not all in one single process).
The even greater saving of time was from VMware VM snapshotting. Snapshot - sysprep - image - test deploy - find bugs - what then? On the VM, as opposed to a native installation, you just revert to the snapshot, which takes mere seconds (and saves waiting for a boot!), then start fixing the bugs. You can also do much the same at the Linux LVM level, except it's probably a good idea to make sure the VM is shut down at the time you take the snapshot, if the VM is unaware that a snapshot is being taken.
KDE is probably the "heaviest" of the Linux desktops. If you have a powerful desktop system it's OK. As for whether you like the KDE experience ... try it and see. The nice thing about Linux is that you can choose between desktop UIs at log-in time.
But another good thing about Linux is that it can be very usable on elderly hardware, which you can obtain for free or almost-free because it's not man enough to run Windows 7 comfortably. My feeling is that KDE and Windows 7 desktops are similarly demanding. Something lighter is better on such hardware.
Boggle. On any Distro I know, install Chrome or Chromium (or other open browser of your choice) and then forget about Firefox until you are sure you are happy with your chosen replacement. Then use the package manager to un-install Firefox, if you are really keen to reclaim a smallish amount of disk space.
Or you could close your eyes and jump: un-install Firefox first and then install Chrome or whatever. You can always put Firefox back later.
Linux is a perfectly adequate alternative to Windows right now, and has been for years. EXCEPT ...
People often don't want alternatives. They want exact bit-for-bit and feature-for-feature identity. For example, they don't regard OpenOffice as an MS Office alternative, neither on Linux nor on Windows. Ditto Gimp versus Photoshop (much bigger bucks at stake here). Ditto Octave versus Matlab. Some even object to browsers that aren't MSIE.
And then there are existing organisations that have locked themselves into MS proprietary file formats, and now can't afford to escape. Access databases are one obvious lock-in.
But if you are starting an organisation right now, think long and hard before you let Microsoft get a toe in your door. The best time to break free, is never to enslave yourself in the first place.
Personally I've never had much joy with Wine, whereas a Windows VM under Linux works extremely well, sometimes faster than running Windows native on the same hardware. NB you do need enough RAM for the two O/Ses side by side. Don't try this with Windows 7 on a 2Gb Linux system!
If you have enough RAM and if Windows licensing doesn't block you, it's a good next step to make a Windows VM so that you never have to shut down Linux to run Windows.
It's more a matter of conservatism and long planning stages, resulting in Microsoft "lemons" being squeezed out (sorry) before any deployment happens. IT is there to support the business. Disasters happen when the tail is allowed to wag the dog.
Most enterprises avoided Vista altogether because they were happy with XP for the foreseeable future, and their techies were able to say "don't go there" before any serious commitments were made.
Many enterprises are still migrating XP to Windows 7. Many would prefer that XP remained viable forever, but Microsoft have hit the XP kill switch. There's no reason that Microsoft couldn't have incrementally improved XP, including replacing its kernel, without inflicting a painful migration. Or have given us proper migration tools for going from XP to 7. So that was perhaps the first sign of the rot setting in at Microsoft.
The posts above should give you a pretty good idea of what enterprise techies are telling their bosses about Windows 8. Soon, we'll be reading about what happens to the bosses and companies who override their techies because they prefer to believe Ballmer's promises. It'll be interesting. I wonder if it'll take down a bank? (Kidding - I hope). Most companies will wait and watch for now. It's been a good strategy in the past.
IMO "Windows 9" will be the last chance saloon for Microsoft. If they EOL Windows 7 without providing a Windows that enterprises are happy with, the decision will be taken that if there's got to be a really painful migration, then why should it involve Microsoft at all? Until then there is still time for Microsoft to fix things.
Sooner or later, if Microsoft carry on the way they are at present, a big company with deep pockets and a long-term plan will decide to challenge Microsoft with a Linux-based Enterprise Desktop and some heavyweight migrate-from-Microsoft support. A company like IBM or Samsung, or possibly a company from outside IT altogether. (Anyone remember what Nokia did before entering the mobile phone business?)
Years ago I watched as Digital Equipment self-destructed because of managerial greed, incompetence, and hubris. Now I feel that Microsoft is treading the same path to corporate oblivion.
My thought also: WD and HGST will merge, so duplicating research is pointless. Hybrid drives are WD's job.
As for "robust device drivers", I understand what he is saying. A modern CPU is more powerful than the microcontroller in a hard drive, especially when it comes to algorithms that can't be accelerated by custom hardware. So letting an OS combine an SSD-based cache with conventional disk drives may be the best-performing solution. However, it does require someone to write that driver and to support it. (Is it a hint that post-merger, HGST may be getting into the software business?)
The nice thing about a hybrid drive is that you know exactly who to blame if it mangles your data. I feel pretty confident that the caching in hybrid drives will be robust and reliable, even if it's far from optimal for any particular usage pattern.
Actually it [petrol] is not much of a fire hazard and safe storage is easy.
You've got to design a quick-swap connector that can support very high currents and voltages, and associated environmental shielding for the swappable battery. 150kW implies 150A at 1000V, or 600A at 250V, or something like that. It's got to be safe. It's got to work in an automotive environment, where salty water is being sprayed around it at 80mph. It's got to have consistently low resistance, or else the car will go a few miles and cut out with a thermal alert (or just catch fire).
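Those current figures are just power-law arithmetic (P = V × I, so I = P / V). A quick sketch, using the post's illustrative 150 kW number:

```python
# Sanity-check of the swap-connector currents quoted above: P = V * I,
# so I = P / V. The 150 kW figure is the post's illustrative number.
def current_amps(power_watts, volts):
    """Current needed to deliver a given power at a given voltage."""
    return power_watts / volts

POWER = 150_000  # 150 kW charge/discharge rate

for volts in (1000, 250):
    print(f"{POWER // 1000} kW at {volts} V needs {current_amps(POWER, volts):.0f} A")
```

Either way the connector must carry hundreds of amps or insulate against a kilovolt, which is why contact resistance and safety dominate the design.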
The lead-acid battery in your car is connected using a spanner and contact jelly, not a plug and socket, for very good reason. The much lower-current hot-swap connector in a single-kilowatt UPS is a weak point. I've seen what happens when it fails. Not pretty. I suspect Boeing's Dreamliner woes were also a failed design iteration of this same problem.
The trouble is that a battery degrades with age, and that the degradation is a function of how it is used. In contrast a Calor gas container is pretty much binary: OK with 100% capacity, or completely fubar.
The only way I can see to make it work would be for every electric car owner to pay a one-time battery fee (transferable, refundable when the car is scrapped) and for all batteries to be owned by the motor company (or by the national battery bank(*)). That, plus standardized batteries and swap-robotics across all electric cars. That, plus some in-battery circuitry so that its charge capacity is known when it's swapped, and folks not needing full range would get a discount on an age-degraded battery.
It's not impossible, but it maybe points at a need to give up private ownership of the cars themselves. The Streetcar model?
(*) I mean bank. They used to use gold as capital. Now they use lies. Why not use Lithium?
I firmly believe that NSA will (if they haven't already) crack quantum computing long before the private sector. When they do, they won't tell anyone.
I'm quite sure that they haven't already.
You may be right as of some near future, but the consequences of that will be greater and faster and stranger than you imagine. "The Laundry" plays this idea for (very uneasy) laughs, but I expect laughs would be the last thing on our minds. The result would be more like Skynet going active crossed with the Stargate sequence from 2001.
I don't expect the singularity to arrive this way, because I don't believe nature will support quantum computing work for numbers of qubits sufficient to break strong cryptography ... but I don't have any particular hotline to the future and may be proved wrong. In which case, may the Eschaton be merciful. Cracking cryptograms for our amusement will be the last thing on its mind.
Wrong TV show
These ballbots are, of course, prototypes for "Rover". I am not a number, I am a free man!
The dead maniac might have made a very dangerous convert.
Or it could have been reality imitating fiction. In "Iron Sunrise" Charles Stross imagined a place where they studied Hitler, Mao and Pol Pot as examples of leaders who were excessively lenient.
Not ironic at all. A simple case of "that's OUR job not YOUR job", or just "YOU do what WE say".
I'd be far less concerned about Google if I could be assured that governments could never get their hands on Google's data. Because there are no rules at all as to what a government can do. It's the government that MAKES the rules (and usually exempts itself from them).
Titanium. With a big enough magnifying glass, probably rather better than you expected. Rather an oops, requiring immediate retreat, and I hope you are outdoors. Sand won't put it out, and adding water would be like throwing petrol at an ordinary fire. Think Aluminium (thermite) on steroids.
There's a huge difference between 25 years and forever, or even just 250 years. I'm sure back in Faraday's time, the majority view was that electricity would never be of much use to anyone ... and of course, the famous quote that there might be a world market for a dozen or so computers.
A billion years of forever gave us self-replicators that didn't even need printers. (Or sex, for that matter ... sex was new in Life 2.0, or maybe 3.0 or 3.1)
I thought it was the Kiwis that had the giant insects, and the Aussies that had the small but deadly ones?
Which is why the UI on an iMac is not the same as the UI on an iPad, and neither is the same as the UI on an iPhone. THREE classes of device, three different UIs. Not one size fits all, nor two sizes fit all. If Apple is smart, there's a considerable amount of common code underlying them, but the user doesn't care one way or the other. He just cares about having the right UI for the device and the work to hand.
The mark for how badly MS has fscked up, is that I'm starting to sound like an Apple fanboi!
Apologies seem to be completely out of fashion. Especially prompt and sincere ones.
I would have hoped that the guidelines would suggest there's a major difference between someone who apologises once it becomes clear he's caused offense, or who retracts something that's both hurtful and inaccurate, and someone who refuses to and instead reiterates or reinforces his original post.
How does anyone know how many desktops and laptops there are out there running Linux? It's not as if you have to buy Linux from anyone. I burn a distro DVD for anyone who asks. Most of the hardware that is running Linux has got a paid-for Microsoft license key glued to it, so it's being counted as a Windows sale by someone.

Some started being used with Windows, and were then re-purposed. Some dual-boot. Some run a Linux VM in Windows (good way to start a personal migration) and some run a Windows VM in Linux (which can be faster than running Windows natively on the same hardware!). Some were nuked to Linux out of the box, and the Windows sticker represents a tax levied by a monopoly which you cannot avoid short of building your own PC from components. (Technically-minded Linux users do self-build far more often than Windows users, because they avoid paying the Windows tax that way, but it's not an option with a laptop).
I've seen more Linux laptops in our students' hands this year than I can remember in previous years. That's nothing compared to the increase in the number of Macbooks, though. Tablets of any sort less so - in a university, you tend to need a keyboard. Wouldn't be surprised if the better-off students also have a tablet back home. Just like me. And unlike me, they all seem to be toting smartphones, mostly Android or iPhones.
The masses are not buying Mac.
Their old PC running XP or 7 breaks down, and they go down to a box-shifter on the high street and walk away with a new PC. They may see "Windows 8" as a mild positive if they are clueless. They get it home. They hate it. Their money is gone, so they are stuck with it.
When they have enough money to be able to buy again, what are they going to buy? "Fool me once, shame on you. Fool me twice, shame on ME".
No, the OS is NOT irrelevant.
It's true that the PC market is a replacement market, and that people are now only buying new PCs because their last PC has broken down.
But this particular PC user does NOT want Windows 8. To me, its value is NEGATIVE; in other words, I will pay more in order NOT to have to struggle with its numerous deficiencies. If I didn't have access to corporate downgrade rights (i.e. Windows 7) I would buy something from Apple. Or something running Android. Or install Linux onto a PC that came with Windows 8, if that remains possible and if I cannot find any alternative.
Microsoft is blind to this, and every day that goes by without Windows 7 or Windows 9-like-7 returning to Joe Public's marketplace is another day that Apple (in particular) eats further into Microsoft's home-user market. Folks DON'T NEED a new PC immediately, and they DON'T WANT one running Windows 8. Two different facts.
Toshiba may also gain a lot of reliability by using SLC. The other manufacturers are presumably confident that they can detect failing blocks in MLC and retire them without any detriment to the end-user, but SLC is intrinsically thousands of times more reliable as well as faster. And 1000x more write cycles means the firmware can be more aggressive (or responsive) about replacing blocks in the flash cache.
BTW it's possible that thinner actually confers some sort of advantage on the electromechanical parts in a single-platter drive. Perhaps head aerodynamics? Or that the shorter spindle down the middle is more rigid? Or maybe it's just that it'll find a home in new ultra-thin designs where the fatter drives can't venture.
I really wish I could be privy to Intel's boardroom's view of AMD.
My theory is that Intel regards AMD as an asset because Intel fears becoming a monopoly. That would mean (a) government interference and regulation, and (b) nothing to stop it increasing margins by decreasing R&D ... short-term gain ... until it missed something radical from a company that was completely off its radar. At that point it would be unable to respond fast enough, because its R&D would by then be a shadow of its former glory.
So AMD gets to play with exotic CPUs, while Intel gets to say it's not a monopoly because folks can buy from AMD instead. Some do, as proof thereof. Most don't. Intel has an intrinsic advantage from its process technology and scale. It does boring mainstream stuff extremely well. It can follow wherever AMD shows there is a serious new direction to move in (like x86-64, for example). As long as it keeps its process technology advantage, its designs only have to be 2/3 as effective to benchmark as equals, and it can improve on 2/3 x 3/2 on its next design iteration. In short, AMD keeps Intel on its technological toes without making a serious dent in its profits. Intel for its part keeps AMD in financial distress, but not terminally so. It could kill AMD with a price war, but doesn't want to.
Intel could probably also put AMD out of business through the courts for patent violations, but doesn't want to. Intel probably believes that it can get away with copying anything AMD has patented, because in a nuclear war between patent lawyers, the company with the deepest pockets will win even when it loses.
Well, that's my theory. Thoughts?
I'll have a go. (If your question is WHY is quantum reality like (or somewhat like) this, no-one has the faintest idea. It's like asking why gravity exists. At present, just assume that $DEITY thought it was a good idea).
A quantum bit is in both states 0 and 1 at once; when it is observed, there is a 50% probability that it will be seen as a 1, and 50% that it will be seen as a 0.
Entangled bits are in a state where their properties are mathematically correlated, so for example two qubits represent 0, 1, 2 and 3 at the same time. Call an ensemble of entangled quantum bits a quantum register. Start so that its N bits represent all numbers in the range 0 to (2^N)-1, each with probability 1/(2^N), at once.
Now perform arithmetic operations on your N-bit quantum register. For example, if you add a quantum register to itself and then observe it, you are guaranteed an even number: the self-addition operator causes bit 0 to assume a zero probability of being a one. So far, somewhat trivial.
It gets really interesting if they can make a quantum register with a large number of bits, and perform operations on it such that the probability of it representing any number that is not a factor of a specified much larger number is zero. If the specified number is the product of two large prime numbers, when you observe the quantum register you will obtain one factor or the other. The probability of any other number (pattern of observed bits) is zero. Suddenly not nearly so trivial!
You obtain your other factor, and also check that the quantum entanglement had not failed during your quantum computation, by conventional long division on a conventional computer. Entanglement failure during a quantum computation is akin to a hardware failure in a conventional CPU except far more likely and (on the positive side) not irreversible. For the factorisation application (i.e. cracking N-bit PK cryptography) even an unreliable quantum computer is useful, just as long as it occasionally spits out a right answer with a much higher probability than 1/(2^N) (i.e. guesswork).
Last time I read up on it, they'd managed a four-bit quantum computer and were able to factorize 15 (in other words, when observed after the mathematical operators were applied, their 4-bit register almost always said 3 or 5, rarely anything else). Of course that's trivial, but a 2048-bit quantum computer would be anything but. I don't know what the latest is. N qubits? 30-plus would start getting practically very useful, and probably very secret.
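The "add the register to itself" example above can be mimicked classically. This is a sketch of the probability bookkeeping only, not a quantum simulation (a genuinely reversible quantum self-addition would need an extra carry qubit; the doubling below is two-to-one and so not unitary):

```python
# Classical toy of the "add the register to itself" argument.
# Track a probability for each value an N-bit register could hold.
N = 4
size = 2 ** N

# "Uniform superposition": every value 0..2^N-1 equally likely.
probs = [1.0 / size] * size

# Self-addition: value x becomes 2x mod 2^N.
doubled = [0.0] * size
for x, p in enumerate(probs):
    doubled[(2 * x) % size] += p

# Bit 0 now has zero probability of being a one, i.e. every odd
# outcome has probability exactly zero, as the argument claims.
odd_mass = sum(doubled[v] for v in range(1, size, 2))
print(f"probability of observing an odd number: {odd_mass}")
```

A real quantum computer does the same bookkeeping with amplitudes rather than probabilities, which is what makes interference, and hence Shor-style factoring, possible at all.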
My money is on quantum computing breaking down somewhere between four bits and 2048 bits, and that this will prove to be a gateway to some new physics. (Hopefully not of the Laundry variety, involving a takeover of our universe by beings best not even thought about). Philosophically, I'm not prepared to take on board the implications of a working Megabit quantum computer (or Giga - Tera - Zetta- ...!)
BTW if you are doing research in this field and make a sudden breakthrough of huge magnitude, your only chance of staying alive and at liberty is to spam the details to as much of the world as you can as fast as possible. There are also some mathematical theorems (unproven but believed true by almost all mathematicians), for which a disproof would have similarly vast real-world implications.
It's called chaos, or sensitivity to initial conditions. And also, you are comparing chalk and cheese. The black hole model is attempting to model possible behaviour of gas (plasma, surely!) in that environment, and to deduce some average properties such as its temperature. Weather forecasting is an attempt to predict exactly what state the atmosphere will be in tomorrow or next week, given observations of what state it's in today. A black-hole-type model simply tells you that it's possible that somewhere on the planet you will find a square mile where the air temperature is 40C and the humidity is high. And that nowhere will you find such a square mile where the temperature is 80C.
The weather forecast model is chaotic, or sensitive. You can't measure the atmosphere's current state with complete accuracy. The measurement errors grow with simulated time, to a greater or lesser extent, until the model is wholly erroneous when compared to the later reality. (It still represents possible weather, just not the correct realisation thereof). On the bright side, your forecast for a week out is useful maybe two times in three, and that for two weeks out is a waste of computer time. In the worst case, the not-quite-hurricane that is going to devastate northern France in six hours' time does a sudden right-angled turn and devastates Southern England instead. (Michael Fish, you are forgiven).
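The error-growth argument is easy to demonstrate with the standard toy example of chaos, the logistic map. This illustrates sensitivity to initial conditions only; it is in no sense a weather model:

```python
# Two orbits of the logistic map x -> r*x*(1-x) at r = 4 (chaotic),
# started one part in a billion apart: a stand-in for the measurement
# error in the initial conditions of a forecast.
def orbit(x, r=4.0, steps=50):
    xs = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

a = orbit(0.400000000)
b = orbit(0.400000001)  # "measurement error" of about 1e-9

diffs = [abs(p - q) for p, q in zip(a, b)]
print(f"initial difference:        {diffs[0]:.1e}")
print(f"largest difference so far: {max(diffs):.3f}")
```

The gap roughly doubles each step until it saturates at the size of the whole attractor, which is exactly the "useful at one week, useless at two" behaviour of a forecast.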
No, that can't be it. If they believe that a phone is going to be used as a trigger for a bomb, all they have to do is ask (or require) the phone company to disconnect that phone number (or better, for evidence-gathering, to divert the phone number to their evidence-gathering facility). If they can't do this today, it's a relatively uncontroversial bit of legislation to fix it so they can.
On the other hand I can see why they might want to permanently kill every phone within 100m of a demonstration at which police brutality was happening. Other things like that. And of course, it's completely impossible that some hostile foreign government might get its hand on the kill switch for every mobile phone in the USA. Isn't it?
Really FIX them? Or just work out how to steer the lusers around the rocks?
I don't know the details of what exactly the supremes did allow them to patent. As I said, if it's the novel lab or field techniques that allow testing for the BRCA1 gene, that may be fair enough. But if it's the complementary DNA sequence itself, that's really no different to patenting the BRCA1 gene (which the supremes disallowed).
An analogy might be a court saying that you aren't allowed a patent on the 23rd page of "Wuthering Heights", but letting you obtain a patent on the same with all occurrences of the words "The" and "A" deleted. Complementary DNA is the original DNA with sequences known as introns deleted.
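The analogy can be made concrete with a toy "splicer", where the deleted words play the role of introns. This is purely illustrative of the argument, not of any real bioinformatics:

```python
# Toy version of the "Wuthering Heights minus 'The' and 'A'" analogy:
# everything in the output already existed in the input, which is the
# nub of the patentability objection.
def splice(text, introns=("the", "a")):
    """Delete the 'intron' words, keeping everything else in order."""
    return " ".join(w for w in text.split() if w.lower() not in introns)

page = "It is a truth that the reader may judge the derived work"
print(splice(page))  # -> It is truth that reader may judge derived work
```

The spliced text contains nothing the original didn't, just as the cDNA sequence contains nothing the patented-and-disallowed gene didn't.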