Microsoft in hunt for the practical qubit

Redmond says it has joined the search for a practical qubit, in an effort to kick along the development of quantum computers. Microsoft's head of research, Peter Lee, has told the MIT Technology Review digital summit that Redmond will be supporting research in other labs with funding, as well as doing its own work at its …

COMMENTS

  1. Charles Manning

    A quantum computer running Windows?

    MS are no doubt up to the challenge of bloating even a quantum computer to death.

    Might as well just stick with a regular computer running *nix!

    1. Rol

      Re: A quantum computer running Windows?

      Seeing as MS tends to create software that exists in a superposition of working and not working properly, I doubt they have the capacity to write the code necessary to lever anything of use from a quantum computer.

      1. Vic
        Joke

        Re: A quantum computer running Windows?

        MS tends to create software that exists in a superposition of working and not working properly

        The Blue-Cast Screen of Death?

        Vic.

    2. Trevor_Pott Gold badge

      Re: A quantum computer running Windows?

      "MS are no doubt up to the challenge of bloating even a quantum computer to death"

      Uh...what? Windows 7 consumed fewer resources than Vista, Windows 8 fewer than 7, and 8.1 fewer than 8. Say what you will about Microsoft, but they have done a damned fine job of delivering ever more functionality while requiring less and less horsepower to drive it. (Though they still have that "storage footprint" problem.)

      1. Wzrd1 Silver badge

        Re: A quantum computer running Windows?

        "Say what you will about Microsoft, but they have done a damned fine job of delivering ever more functionality while requiring less and less horsepower to drive it."

        And in another century or two, they'll finally catch up with *nix.

  2. Destroy All Monsters Silver badge
    Holmes

    Needs better explanation

    With enough qubits, a quantum computer could therefore represent an awful lot of states simultaneously. If you can then ask the quantum computer the right question, its waveform should collapse into the answer.

    With the answer given having a certain classical probability of being correct.

    The quantum computer therefore – in theory – lets you get a complex answer in a single operation, rather than having to step through lots of iterations, as in a classical computer.

    Only for the following algorithms:

    1) Factorization

    and

    2) Searching in databases

    All the other classical Turing-machine algorithms have to STAY classical.

    MORE HERE
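
    As a minimal illustration of the "certain classical probability" point above (a toy numpy sketch; the three-qubit register and uniform superposition are arbitrary assumptions for demonstration, not anything from the article):

    ```python
    # Toy sketch: measuring a superposition yields one basis state at random,
    # with probability given by the squared amplitude (the Born rule).
    import numpy as np

    n = 3                                   # qubits (arbitrary choice)
    dim = 2 ** n                            # 8 basis states
    state = np.full(dim, 1 / np.sqrt(dim))  # uniform superposition

    probs = np.abs(state) ** 2              # each outcome has p = 1/8
    probs = probs / probs.sum()             # guard against floating-point drift
    outcome = np.random.choice(dim, p=probs)
    print(f"collapsed to basis state |{outcome:03b}>")
    ```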

    1. Trevor_Pott Gold badge

      Re: Needs better explanation

      Uh...what? There's lots of quantum computing problems. Natural language comprehension is one. There's a lot of research going into the idea that mammalian neural processing employs a certain quantum component that allows us to solve "hard" problems faster than we should be able to, given the number of synapses, and our limited storage capacity.

      A quantum computer isn't going to play Crysis...but it might well be an important part of a true AI. Another place it could come in handy is simulating human behavior. Quantum-assisted SimCity anyone?

      1. Michael Wojcik Silver badge

        Re: Needs better explanation

        There's a lot of research going into the idea that mammalian neural processing employs a certain quantum component that allows us to solve "hard" problems faster than we should be able to, given the number of synapses, and our limited storage capacity.

        Is there? There are certainly some well-known proponents of such an idea, Penrose chief among them, but I don't know of anyone doing actual, serious, methodologically sound research on the topic. Can you cite any? (Personally, I think such proposals are woefully unconvincing.)

        There's lots of quantum computing problems.

        There are many problems to which QC can, in theory, be applied, but all the ones I'm aware of are instances of (or isomorphic to) the quantum Fourier transform. That includes factoring (Shor's algorithm) and database searching (Grover's algorithm), both of which are instances of the hidden subgroup problem, which is a subset of the QFT. DAM may have overstated the case a bit, but AFAIK his post was basically correct.
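
        For the database-searching case, a toy state-vector simulation of Grover's algorithm (my sketch, with the register size N = 16 and the marked index chosen arbitrarily; a real device would not need the full 2^n-entry vector a simulation does):

        ```python
        # Toy Grover search over N = 2^n entries for a single marked index.
        # Roughly (pi/4) * sqrt(N) iterations maximise the hit probability.
        import numpy as np

        n, marked = 4, 11                      # 16 entries; the "record" we want
        N = 2 ** n
        state = np.full(N, 1 / np.sqrt(N))     # start in uniform superposition

        for _ in range(int(np.pi / 4 * np.sqrt(N))):   # 3 iterations for N = 16
            state[marked] *= -1                # oracle: flip the marked amplitude
            state = 2 * state.mean() - state   # diffusion: invert about the mean

        print(np.argmax(np.abs(state) ** 2))   # prints 11 with high probability
        ```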

        Natural language comprehension is one.

        I'm not aware of any serious research into applying QC to any problems in Natural Language Processing, except as applications of hidden-subgroup problems. See for example Li & Cunningham 2008. If you know of work that doesn't fit that description, I'd be interested to hear about it.

        Disclaimer: This isn't one of my areas of research.

        1. Trevor_Pott Gold badge

          Re: Needs better explanation

          Quantum effects in mammalian neurology are something that, at the very least, I know a group at the University of Alberta is working on. To listen to them talk about it, they've got folks around the world they're working with on the problem. I know the people involved and they don't do frivolous research. I also happen to know they aren't really anywhere near publishing.

          Next time I sit down for beer with 'em, I'll poke more into the details.

          As for qubits being used for natural language comprehension, I wish I could go into details, but it honestly does fall under "protecting a source". I know that sounds like a cop-out, but I am sworn to secrecy about the whole affair.

  3. Inventor of the Marmite Laser Silver badge

    Practical?

    Microsoft?

    After Windows 8?

    1. Trevor_Pott Gold badge

      Re: Practical?

      Windows 8 was an unmitigated disaster...and not the only one. But Microsoft has a massive R&D department of incredibly smart people that does a truly fantastic job of coming up with great tech. Even practical tech.

      Admittedly, a lot of what they do is take someone else's work and refine it (see: Kinect), but they do have a talent for finding ways to take otherwise PhD-level stuff and beat it into something the average prole can use.

      ...or they did. Today's Microsoft seems to think that PowerShell is the answer to everything, and that we are all able to memorize entire tomes by rote. The company is split by a schism: one side focuses on the end user too much, to the point of pursuing its own vision of what's 'easy' and to hell with everyone else; the other focuses on pragmatic, practical technologies that the average user is never going to be able to use.

      ...but somewhere, deep inside that company...there are a great many people who are damned good at taking any technology you could imagine and making it approachable. They just need to be given the reins.

      1. Wzrd1 Silver badge

        Re: Practical?

        "Admittedly, a lot of what they do is take someone else's work and refine it..."

        You mean butcher it, "giving" out an emasculated version, rather like their first antivirus solution, their stolen first disk-compression solution, terminal services that are chopped-up versions of Citrix, etc.

        Microsoft either steals or buys products, bundling a limited and sometimes broken version with the OS.

        That said, Win 7 beat the crap out of Vista. But then, one recalls Win ME...

        New tech from MS tends to be... Broken.

        Then, next version gets it somewhat right and later versions finally get it right.

        But a *good* OS is one you do not have to reboot weekly.

        Full disclosure:

        I have long been a Windows SA and AD admin, as well as a *nix admin. I'm not an information security professional. I'm comfortable enough with Windows of any version to happily delve into the registry and perform a bit of surgery as needed. Even to the point of manually exterminating malware, just for a bit of fun and to figure out what the crap is actually doing.

        Not to drop too many versions, but I can still edit win.ini and system.ini on Windows 3.1x, and can still administer a Windows NT 3.51 server, NT4 server, 2000 server, etc.

        1. Trevor_Pott Gold badge

          Re: Practical?

          To be fair, you reboot Windows monthly nowadays, and with a core install of Server (or when using Hyper-V) that can be stretched to 3 or even 6 months, on average. Plus, for their servers at least, they have cluster-aware updating.

          I do, however, agree that "a band-aid on top of a band-aid on top of a band-aid" is Microsoft's default approach to this...and that's bad.

          Windows RT should have been the opportunity to do it from scratch, and do it right. New chip architecture, new guts, no legacy cruft. Less application support, sure...but that will get solved if you make the thing not suck (and charge a reasonable price for it).

          Sadly, what Windows RT ended up as was the worst of both worlds: all the badness from the x86 line ported into their ARM line, with none of the app support.

  4. oldcoder

    ...whoever successfully builds ...

    a reliable, mass-producable qubit will have IP of incalculable value...

    Yes... either it will be worth millions... or nothing.

    After all, once the value is measured it collapses into a value... until it is measured it has no value...

    And with Microsoft doing the measuring it will be "incalculable"... and if somebody goes to buy it, they will find it running some virus or other... and of no value.

    1. Deltics

      Re: ...whoever successfully builds ...

      It will be worth millions AND nothing, at the same time.

      Only by assessing its value will the waveform collapse into an actual value, at which point of course the Uncertainty Principle dictates that the value is no longer what you think it is.

  5. Pascal Monett Silver badge

    "whoever successfully builds a reliable, mass-producable qubit"

    Will have Microsoft and everybody else banging on their door with Vuitton bags of shares worth billions.

    Come on, Microsoft will not invent this, it'll just buy out whoever does.

    And if whoever does doesn't sell because they're intelligent enough to not sell out, then Microsoft will finally have to publicly bow before the new king of computing and resort to its usual FUD tactics to try and keep people buying its stuff.

    1. Trevor_Pott Gold badge

      Re: "whoever successfully builds a reliable, mass-producable qubit"

      "Not sell"

      You're a funny, funny man.

      Also, how the hell is a 5-man startup with $0.5M in angel funding going to take a mass-producible qubit to market? That isn't going to run classical tasks. Or classical OSes. You'll need massive R&D just to get the right questions to ask. You'll need to develop OEM partnerships, a channel, a developer community, a support network...and all of it while fighting off massive amounts of FUD from established players.

      Christ man, companies today doing nothing more revolutionary than iterating a technology so that you can defer your purchase of a big shiny by another year or two (thus saving your company a few million) get buried under the power of the establishment. Do you honestly think that an HP or an IBM is going to let Bob's Bit Shack and Hot Dog Emporium come to market with a technology that could render them obsolete, or at least threaten to take away the high-margin portion of the market?

      Really? You honestly believe that? I want whatever you're on.

      If someone who is not a major comes out with a technology this revolutionary the only chance they have of bringing it to market is if one of the founders is an Elon Musk-class osmium-testicled billionaire willing to bet their entire fortune on a roll of the proverbial. And so far as I know, there's only one Elon Musk.

      That means they'll sell. Even if they're a crazy high-end VC that normally rolls the hard six looking for a multi-Instagram payout, they'll sell. They might take it to Nutanix size, but they'll ultimately sell.

      Mark my words. Bookmark this comment. Throw it in my face if I'm wrong...but I promise you, I'm not.

      1. Wzrd1 Silver badge

        Re: "whoever successfully builds a reliable, mass-producable qubit"

        "Do you honestly think that an HP or an IBM is going to let Bob's Bit Shack and Hot Dog Emporium come to market with a technology that could render them obsolete..."

        Erm, you *do* realize that quantum computers won't replace those companies' "bread and butter" product lines, don't you?

        That said, both corporations would likely license any mass-producible quantum computer technology and produce their own high-end products. Especially Big Blue.

        You're not even comparing apples to oranges, you're comparing apples to granite slabs. Not even in the same kingdom, vegetable and mineral, quantum and binary.

        1. Trevor_Pott Gold badge

          Re: "whoever successfully builds a reliable, mass-producable qubit"

          "Erm, you *do* realize that quantum computers won't replace those companies "bread and butter" product lines, don't you?"

          Um...yes, they will. "Bread and butter" for HP, IBM and their ilk does not mean "sells the most volume of". "Bread and butter" means "is the highest margin". For HP/IBM/etc that isn't the commodity x86 servers that they're currently losing their shirts on. It's things like mainframes, custom interconnects, HPC ASICs and other tools of the very, very high-end and specialized trade.

          To keep this short: these are the exact places where quantum computers will go. Your high-end mainframe will end up replaced by a combination of an ultra-resilient x86 cluster and a quantum computer to handle the hard questions/big database/big data problems. You won't need all the high-margin custom gear HP/IBM/etc makes. You can use a quantum computer and a handful of PhDs to achieve the same thing.

          That's the problem. This means the "mundane" portions of the workloads can get off the really high-margin mainframes and the tricky stuff can be farmed out to the pile of qubits in the corner. If those qubits are supplied by a startup then there's a damned good chance that the Big Customers will be able to bully the startup (rather than the other way around) and get a Great Deal.

          That's the end for HP and IBM, at the very least. If someone comes out with this stuff and they don't control it, they're done. And frankly, whichever of those two can manage to get a working proper quantum computer first...wins. The other one will fold, as there won't be a need for their high-margin services.

          Without their high margin services, they can't compete against the likes of Lenovo, ZTE and Quanta. They'll go quietly into that good night and nobody will notice.

          Except the thousands upon thousands that don't have a job, of course.

          "Bread and butter" is what keeps the company going. If you think that's commodity servers, you're mad.

          1. Michael Wojcik Silver badge

            Re: "whoever successfully builds a reliable, mass-producable qubit"

            It's things like mainframes, custom interconnects, HPC ASICs and other tools of the very, very high-end and specialized trade.

            To keep this short: these are the exact places where quantum computers will go. Your high-end mainframe will end up replaced by a combination of an ultra-resilient x86 cluster and a quantum computer to handle the hard questions/big database/big data problems.

            Wow, Trevor. Usually I find your comments fairly reasonable, even if I don't agree with them; but you've really lost the plot on this one.

            QC offers very little for the vast majority of mainframe workloads. They're rarely CPU-bound in the first place. And the major barriers to replacing mainframes with an "ultra-resilient x86 cluster" are perceived risk, decades of strange proprietary add-on software and obscure APIs, and customers' lack of knowledge about what they're actually running. [1]

            Very few businesses are using mainframes for big-data processing. They may have terabyte databases, but they're not dealing with big-data loads.

            And QC doesn't help with many big-data problems anyway. Grover's algorithm is optimal, and it runs in O(√N) time and O(lg N) space. So if a search would have taken an hour on a classical computer, it'd take a little under 8 minutes on a QC, all else being equal - and that's only if you have enough qubits. For large N, even lg N space starts to become a problem if the resource is scarce and you're running a lot of simultaneous queries - and if you're not, why is QC useful for your application?
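
            One way to reproduce that "little under 8 minutes" figure - my arithmetic, treating the hour as 60 one-minute units of classical work, so the quantum search costs √60 units (the real-world ratio depends on N and on per-operation cost):

            ```python
            # sqrt-speedup check: 60 minute-units of classical search under
            # Grover's O(sqrt(N)) scaling instead of the classical O(N).
            import math

            classical_minutes = 60
            quantum_minutes = math.sqrt(classical_minutes)
            print(f"{quantum_minutes:.2f} minutes")  # ~7.75, "a little under 8"
            ```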

            As for what QC is supposed to do for "custom interconnects" I cannot guess.

            Scalable QC at a reasonable price has some applications - primarily in finding better approximate solutions to intractable problems (and in requiring us to double the lengths of our RSA keys, if we're using them to protect anything really valuable). Combined with graph sparsification and similar algorithms it could be used for some interesting stuff. But its impact on the mainframe market will be about as noticeable as what it does to pocket calculators.

            [1] Many of the potential customers in our market can't even start to disentangle the thousands of undocumented programs they have on their mainframes, in order to find a subset suitable for a trial migration. Even with the help of source-code application-suite analysis tools. And that's when they have source.

            1. Trevor_Pott Gold badge

              Re: "whoever successfully builds a reliable, mass-producable qubit"

              "Wow, Trevor. Usually I find your comments fairly reasonable, even if I don't agree with them; but you've really lost the plot on this one."

              Only because you seem to believe that QC is only good for two different algorithms. I'm far less convinced.

              "QC offers very little for the vast majority of mainframe workloads."

              Absolutely agree.

              "They're rarely CPU-bound in the first place."

              Again, agree. That said, however, the few things that are CPU-bound are typically great big huge database work. A huge chunk of that is I/O-bound, but even when you can get enough of the DB into fast enough memory you run into CPU issues. This is not only where I think QC can help, it's also one of the things x86 can't really do well. (Power, Itanic et al having largely evolved to deal with these problems while x86 kept on the general compute path.)

              "And the major barriers to replacing mainframes with an 'ultra-resilient x86 cluster' are perceived risk, decades of strange proprietary add-on software and obscure APIs, and customers' lack of knowledge about what they're actually running."

              Again, agree. That said, a lot of customers are looking to rewrite and move off onto ultra-resilient x86 clusters. While some of that is possible, a major barrier is the ability to move the great big databases off while still retaining the performance.

              "Very few businesses are using mainframes for big-data processing. They may have terabyte databases, but they're not dealing with big-data loads."

              An interesting assertion, and not my understanding at all. I am led to believe that many businesses using mainframes are working with giganamous databases that they have to do a large number of searches against - datasets so large that the searches become a problem for x86. I'd be quite happy to be proven wrong on that.

              "And QC doesn't help with many big-data problems anyway. Grover's algorithm is optimal, and it runs in O(√N) time and O(lg N) space. So if a search would have taken an hour on a classical computer, it'd take a little under 8 minutes on a QC, all else being equal - and that's only if you have enough qubits. For large N, even lg N space starts to become a problem if the resource is scarce and you're running a lot of simultaneous queries - and if you're not, why is QC useful for your application?"

              Where QC helps - and for that matter, mainframes too - is searching a large dataset quickly. Traffic simulation and logistics are both repeatedly cited to me as examples of workloads where, apparently, multi-squillion-dollar mainframes are required and x86 clusters just don't do what is needed.

              "As for what QC is supposed to do for 'custom interconnects' I cannot guess."

              I don't think QC will replace custom interconnects. I think A3Cube and like setups will commoditise high-speed, low-latency interconnects to the point that there's no longer a need for the custom stuff. Thus the margin will evaporate.

              That means that the real money will shift to quantum interconnects as the demand for secure transmission grows. Will that be in-datacenter? Probably not. But in the networking world, I think the margins are going to move away from lashing together servers and towards quantum-secure comms. (Which, apparently, we can now do using mostly regular equipment? I need to investigate that more...)

              "Many of the potential customers in our market can't even start to disentangle the thousands of undocumented programs they have on their mainframes, in order to find a subset suitable for a trial migration. Even with the help of source-code application-suite analysis tools. And that's when they have source."

              And yet they are trying. They are migrating. A trickle here, a trickle there...and this business is evaporating. What happens when the heavy lifting of the DBs (and their associated gobs of RAM) is no longer needed? When your "mainframe" can be stuffed into 2U + a 4U QC to run all that legacy stuff? I doubt you'll be getting the kind of money for it that you were getting when you could sell two whole racks to do the same job...and that's my point.

              QCs on their own are not going to kill the mainframe. They're just one additional wound. Mainframes are dying the death of a thousand papercuts as technology in general makes them no longer relevant.

              I just think that QC's ability to deal with big databases, fast factoring and - if my sources are correct - natural language will take away some of the remaining "you need a mainframe for this" workloads...hence stealing the margin.

              1. Michael Wojcik Silver badge

                Re: "whoever successfully builds a reliable, mass-producable qubit"

                Only because you seem to believe that QC is only good for two different algorithms

                Only if you didn't read my first post in this forum carefully. There are a handful of algorithms in BQP. If you know of a problem that's not isomorphic to one of them that looks likely to be in BQP, I'm sure I'd be interested to hear about it.

                Maybe there are other problems that unexpectedly will turn out to be in BQP. And maybe the unicorns will finally emerge from the forests and let us use their magic pixie-dust computers that can solve any problem in two shakes of a tail. But I'm not going to bet on it. And if I were running IBM's mainframe division, I wouldn't be betting on it.

                That said, a lot of customers are looking to rewrite and move off onto ultra-resilient x86 clusters

                I'm well aware of that, since they pay my salary.

                I am led to believe that many businesses using mainframes are working with giganamous databases that they have to do a large number of searches against

                That's not "big data". That's just a big database, and it's quite typical, and largely a solved problem. You can move TB-sized databases to Windows and UNIX clusters today. Mainframe databases are typically used for OLTP, OLAP, and similar workloads. Clusters with fast network storage can generally handle the data queries pretty well.

                Real big data is stuff like Google, and enormous scientific datasets, and huge amounts of continuously streaming data.

                Where QC helps ... is searching a large dataset quickly

                Searching a large in-memory database quickly. That could be useful for searching a large-for-in-memory-but-small-relative-to-conventional database at a very, very high rate of queries, if indexing isn't an option (generally because the data is changing too fast). That's a pretty small subset of applications.

                In-memory databases are not common on mainframes.

                QCs on their own are not going to kill the mainframe. They're just one additional wound. Mainframes are dying the death of a thousand papercuts as technology in general makes them no longer relevant.

                Yeah, yeah. This death has been reported non-stop since the mid-1980s. I don't expect mainframes, in their current incarnation (say, the direct descendants of the IBM 360 architecture, which is what we have now with System z) to last forever; few technologies do. But I don't expect them to be gone while I'm still in the business, and I'll be quite surprised if they're not still around when I'm on my deathbed.

  6. Anonymous Coward
    Anonymous Coward

    Re: "whoever successfully builds a reliable, mass-producable qubit"

    I posted a few ideas recently, but so far there has been little or no interest.

    I did, however, get a thank-you email back from NASA re. my solar axion ideas, so at least on one front someone has read my hypothesis and done an unbiased analysis.

    If anyone is interested, my cooled microSD-based device is showing definite signs of quantum behavior, but a lot more work is needed, and probably substantial funding, to get it to the same stage as D-Wave's system.

    Is it possible to patent something that already exists, but is used in a novel or inventive way?

This topic is closed for new posts.
