What are quantum computers good for?

The problem with trying to explain quantum computing to the public is that you end up either simplifying the story so far as to make it wrong, or running down so many metaphorical rabbit-burrows that you end up wrong. So The Register is going to invert the usual approach, and try to describe quantum computing at a more …

COMMENTS

This topic is closed for new posts.

  1. That Awful Puppy
    Thumb Up

    This may or may not be a great article

    I think the author should have started with "here are a few real-world, non-crypto applications, which we'll explain in greater detail later; now let me tell you how these bastards work", because he nearly lost me at the start. I *think* it's well written, but I know so little of the subject matter, and comprehend even less, that I cannot be entirely sure whether it's just me thinking I got the gist of the article, or whether I actually got it.

    1. Steen Hive
      WTF?

      Re: This may or may not be a great article

      Until you observe it, then its wave function collapses into a point I don't understand.

    2. Anonymous Coward
      Thumb Up

      Re: This may or may not be a great article

      As someone who has read the Reg' since about two months after MM started it, and spends an awful lot of time thinking about this theoretical physics stuff, I think this is certainly one of the best articles El Reg' has ever done.

      1. Evil Auditor Silver badge
        Thumb Up

        Re: This may or may not be a great article

        @LeeE: I totally agree. But where has the "rate this article" option gone?

        (Just wondered why no one asked how to run Crysis on a quantum computer.)

  2. John Smith 19 Gold badge
    Meh

    Ummm. Sort of helpful

    but why do all these applications look more like special-purpose *analogue* devices, built to carry out *one* task?

    Where is the quantum equivalent of a device which allows the function it computes to *change* by loading it with a set of commands (or "program", as I like to call it)?

    Right now this looks like optical computing did in the 1960s and '70s: a *huge* leap forward in MIPS which essentially proved too awkward and specialised to use. Every niche market needed a *different* set of hardware, and the bog-standard (electronic) hardware just kept getting better.

    Sure *theoretically* these quantum computing devices beat the pants off conventional digital hardware, but so did the optical systems of their day.

    Who uses them today?

    1. ZankerH

      Re: Ummm. Sort of helpful

      That was back when electronic transistors were measured in microns and processors were simple enough to be understood by a single person at the transistor level. As it is, (silicon) electronics are hitting several brick walls of physics, namely gate size, transistor count, heat dissipation and switching speed. By the end of this decade, the only way to make more powerful computing devices with silicon electronics will be to make them larger (since MOSFETs will have been shrunk as far as is physically possible) and therefore more power-consuming. That's why we _need_ a new paradigm shift, and quantum is looking as good as any other - unlike in the 70s, bog-standard hardware doesn't have much room for improvement left.

      1. John Smith 19 Gold badge
        Unhappy

        Re: Ummm. Sort of helpful

        "As it is, (silicon) electronics are hitting several brick walls of physics, namely gate size, transistor count, heat dissipation and switching speed."

        A current transistor is about 140 atoms wide, so with a doubling of density every 18-24 months it's more like a decade and a half. The next two are down to imaging techniques, and the last to 3D designs like the Intel "fin" shape. You're looking at about twenty years, but the walls *are* approaching.

        Note that moving to a "clockless" design would probably eliminate *half* the transistor count on a modern chip and a *lot* of the heat being generated doing nothing.

        But then you could not sell faster chips at a premium.

    2. Anonymous Coward
      Anonymous Coward

      Re: Ummm. Sort of helpful

      What this article did well was separate the quantum computing device from the IO device; the tricky bit isn't the QPU calculating all the possible answers, it's selecting the particular answer you want.

      Think of it this way: the answer you get from a qpu is a 2-D area of smoothly blended different colours but the answer that you need is in the form of a 1-D line on a graph; there are infinitely many 1-D lines within that finite 2-D area, so the tricky bit is pulling out the particular 1-D line, from all the other 1-D lines, that you need to get the answer to the particular problem you're trying to solve.

    3. fajensen

      Re: Ummm. Sort of helpful

      Who uses them today?

      Electrical Engineers - we want to wring ALL the available power out of huge IGBTs and GTOs for e.g. traction systems, so we want to run the chip inside as hot as possible. A convenient way is to build an analogue thermal simulation model of the device, feed it with realtime voltage and current, and out pops the temperature, which we can use as a trip and/or to limit the load applied to the converter.

      Compared to a DSP-based system, for which we would need to accurately digitise the complex waveforms, the analogue models are much simpler and easier to build, test and verify. The hardware is more robust too.
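
      For readers who haven't met the trick: the model is typically a "Foster network" of RC pairs whose voltages stand in for temperature rises. Below is a rough digital rendition of the same idea - a sketch in Python, with made-up RC values rather than anything from a real device datasheet:

      ```python
      import math

      def junction_temperature(power_w, dt, rc_pairs, t_ambient=25.0):
          """Estimate junction temperature from instantaneous power loss.

          Each (R, C) pair is one stage of a Foster thermal network: R in
          K/W, C in J/K. The analogue version integrates the same equations
          with real resistors and capacitors fed by measured V and I.
          """
          rises = [0.0] * len(rc_pairs)   # temperature rise held by each stage
          temps = []
          for p in power_w:
              for i, (r, c) in enumerate(rc_pairs):
                  alpha = 1.0 - math.exp(-dt / (r * c))
                  rises[i] += (p * r - rises[i]) * alpha  # first-order lag toward p*R
              temps.append(t_ambient + sum(rises))
          return temps

      # Toy run: a 100 W step load, two invented RC stages, 1 ms timesteps.
      temps = junction_temperature([100.0] * 5000, 1e-3, [(0.05, 2.0), (0.20, 10.0)])
      print(round(temps[-1], 1), "degC after 5 s")  # compare against a trip limit
      ```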

  3. David 45

    Do what? !!

    I got lost pretty early on. B*gg*r*d if I know what's supposed to be happening here. Smacks of smoke, mirrors and magic to me!

    1. frank ly
      Pint

      Re: Do what? !!

      "Smacks of smoke, mirrors and magic to me!"

      Any sufficiently advanced technology is indistinguishable from smoke.

      (Beer: to have with your smoke - assuming you do.)

    2. Anomalous Cowshed

      Re: Do what? !!

      Quantum computers sound like boll*cks to me: an idea that is touted around but never properly explained with practical and concrete concepts. Other terms that come to mind are 'whalesong' and 'snake oil'. What is a quantum computer? In what way does it differ from a normal computer? How is it made? How does it compute (I mean really, as opposed to in some poncy fool's mind)? What results are to be expected from given inputs, apart from mere bollocks? What are the speed increments over standard technology? How come it's so clever? How come it's so complex? That's what we want to know, and until such a thing is understandable by reasonably clever, computer-literate people, I say it's bollocks. Or we're all thick and some fancy 'researchers' in laboratories, with access to media time, are cleverer by several orders of magnitude.

      1. Gordon 10
        FAIL

        Re: Do what? !! @Cowshed

        Your thoughts echo mine - however after some brief additional thinking:

        The point is you don't need to understand how something works to use it - be it a wheel or a digital computer.

        Even on El Reg, how many of us fully understand how a PC works from beginning to end - from an initial flow of one or more electrons in a single simple circuit to how the character A I just typed appears on the screen?

        I would wager very few. For example I once understood how a circuit worked on a physical level. I once understood basic QM. I studied simple IC design, I wrote Assembler for a while, and have used half a dozen other programming languages, and I've done a multitude of things with a PC up to but not including shagging the damn things (yet).

        Do I understand how all of that collates into a coherent whole? Do I fuck.

        As one famous scientist (Einstein? Rosenberg? Feynman?) once said - if someone tells you they understand QM, they are probably lying.

        1. Michael Wojcik Silver badge

          Re: Do what? !! @Cowshed

          Even on El Reg, how many of us fully understand how a PC works from beginning to end - from an initial flow of one or more electrons in a single simple circuit to how the character A I just typed appears on the screen?

          I would wager very few.

          That depends on what you mean by "fully understand". It's a vacant phrase - you could argue from epistemology, psychology, or neurobiology that it's not possible to "fully understand" anything, defined strongly enough.

          In the course of my CS degree I studied everything from electronics to logic circuits to chip design to CPU function to OS design to application programming. I can't think of a level of abstraction in a classical computer that I'd consider a mystery, even if my understanding of them isn't complete. I know my classmates have the same training, and surely we're not a particularly rare breed in the industry.

          So while I agree that it's not necessary to understand something "fully" in order to use it, I'd also suggest that many people who work in technical fields do, in fact, have a pretty decent understanding of the equipment they work with. And if they need to understand QC someday, they'll be able to pick up a pretty decent understanding of it, too.

          (Incidentally, we - that is, all of humanity - still don't "fully" understand a lot of our simple machines. Physicists still can't explain the dynamics of bicycles, for example; it's been shown that neither the gyroscopic effect nor the caster effect explains bicycle stability. We have a pretty good idea how levers and inclined planes work, though, so that's two of the first three.)

      2. Michael Wojcik Silver badge

        Re: Do what? !!

        until such a thing is understandable by reasonably clever, computer-literate people, I say it's bollocks

        I'm reasonably clever and computer-literate, and I understand it.

        Really, it's not that difficult. With a lay understanding of superposition (you don't need to understand the mechanics of it, or the mathematics), you should be able to follow Shor's algorithm, for example. It's much simpler than a great many well-known classical algorithms.
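
        To back that up: most of Shor's algorithm is ordinary number theory. Here is its skeleton with the single quantum step (order-finding) stubbed out by brute force - purely an illustrative sketch, since the brute-force loop is exactly the exponential part the quantum Fourier transform replaces:

        ```python
        import math
        import random

        def shor_skeleton(n):
            """Factor an odd composite n, Shor-style, with the quantum step faked."""
            while True:
                a = random.randrange(2, n)
                g = math.gcd(a, n)
                if g > 1:
                    return g              # lucky: a already shares a factor with n
                # Order-finding: smallest r with a**r = 1 (mod n). A quantum
                # computer does this part in polynomial time; we brute-force it.
                r, x = 1, a % n
                while x != 1:
                    x = (x * a) % n
                    r += 1
                if r % 2 == 0 and pow(a, r // 2, n) != n - 1:
                    f = math.gcd(pow(a, r // 2, n) - 1, n)
                    if 1 < f < n:
                        return f          # non-trivial factor recovered

        print(shor_skeleton(15))  # prints 3 or 5
        ```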

        And since we have working quantum computers - not large enough to do anything interesting with, but enough to prove the concept - it's hardly snake oil. Anyone claiming commercial QC is suspect, of course, because there are a host of practical problems yet to be solved, and it looks very unlikely, IMO, that QC will ever be ubiquitous the way classical computing is; there's no reason I can see to stick QC in most embedded applications, or in consumer general-purpose computers. But that doesn't mean it might not some day be common in the sort of applications listed in the article and in some of the more-informed comments.

  4. MondoMan

    Computational complexity eliminated, or just moved to I/O?

    On page 2, under "Constant vs. balanced", I'm puzzled by the description of the Deutsch and Jozsa method. Surely, adding an additional digit to the input and output qubits is the moral equivalent of running the single-digit test an additional time. Thus, using the quantum method does not reduce the overall time/expense of the operation, but simply shifts the time/expense from the computation to the I/O part of the whole operation.

    1. Anonymous Coward
      Anonymous Coward

      Re: Computational complexity eliminated, or just moved to I/O?

      Sort of a half-assed response, since I know nothing about quantum complexity analysis, but let me formalise the statements a little. Assume that you are reading the output string, which is either all 1s or all 0s. Reading this takes N operations, where N is the number of qubits. So let's say that adding an extra bit adds one extra check: T(N) = T(N-1) + 1, and T(1) = 1. Therefore T(N) = O(N). Now look at the initial superposition part: if we assume that the superposition of two qubits is an O(1) operation, the same analysis holds for the input side. So, without knowing anything about quantum complexity analysis, and being a bit of a noob in regular complexity analysis, I would postulate that they are taking a problem that would be exponential in time complexity on a traditional computer and making it linear in time complexity on a quantum computer. I don't entirely know what I am talking about, but maybe that simplifies the matter for you.
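
      Unrolling that recurrence spells out the arithmetic:

      $$T(N) = T(N-1) + 1 = T(N-2) + 2 = \cdots = T(1) + (N-1) = N = O(N).$$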

      1. MondoMan
        Pint

        Re: Computational complexity eliminated, or just moved to I/O?

        I was thinking that the traditional computer operation was also linear in time complexity, as you're just adding a constant-time black-box check in going from N checks to N+1 checks. Of course, I'm also not entirely sure what I'm talking about here, so I appreciate your bringing the famed worldwide Anonymous organization to bear on the issue :)

        1. Steve Knox
          Boffin

          Re: Computational complexity eliminated, or just moved to I/O?

          No, the point is that you have to run the classical computing function up to N/2+1 times (where N is the number of possible inputs), whereas you only have to run the quantum function once. Here's an example for N=4:

          Classical

            Input | Balanced | Constant
              1   |    0     |    0
              2   |    0     |    0
              3   |    1     |    0
              4   |    1     |    0

          Quantum

            Input | Balanced | Constant
             0.0  |   ~2.0   |   0.0

          So you'd have to run the classical algorithm 3 times to see the difference between a balanced and a constant function, whereas you'd only have to run the quantum function once, with any non-zero answer indicating a balanced function.
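
          Since the comment formatting fights tables anyway, the same point as a runnable sketch: a classical statevector simulation of the Deutsch-Jozsa circuit in its phase-oracle form (Python/numpy, illustrative only). One oracle query, and the probability of reading all-zeros separates constant from balanced:

          ```python
          import numpy as np

          H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

          def hadamard_all(state, n):
              """Apply a Hadamard gate to each of the n qubits of a statevector."""
              for q in range(n):
                  state = state.reshape([2] * n)
                  state = np.moveaxis(np.tensordot(H, state, axes=([1], [q])), 0, q)
              return state.reshape(-1)

          def deutsch_jozsa(f, n):
              """Decide constant vs balanced with one (phase-)oracle query.

              f maps {0, ..., 2**n - 1} -> {0, 1}, promised constant or
              balanced. All-zeros comes back with probability 1 if constant,
              0 if balanced.
              """
              state = np.zeros(2 ** n)
              state[0] = 1.0                           # start in |00...0>
              state = hadamard_all(state, n)           # uniform superposition
              phases = np.array([(-1) ** f(x) for x in range(2 ** n)])
              state = hadamard_all(phases * state, n)  # one query, then interfere
              return "constant" if abs(state[0]) ** 2 > 0.5 else "balanced"

          print(deutsch_jozsa(lambda x: 1, 3))      # constant
          print(deutsch_jozsa(lambda x: x & 1, 3))  # balanced (parity of low bit)
          ```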

          1. Michael Wojcik Silver badge

            Re: Computational complexity eliminated, or just moved to I/O?

            Drew, could we please add <pre> to the list of approved HTML tags for posts?

            (And I believe I already mentioned <sup> and <sub> in another post.)

  5. Singlewhip
    Unhappy

    Navel-Gazing

    Your last example - using quantum computers to do quantum calculations - turned me off completely. If the most important thing we can do with these is use them to understand themselves, the whole exercise seems pretty much like climbing down a rabbit hole.

    1. Steve Knox
      Boffin

      Re: Navel-Gazing

      If the most important thing we can do with these is use them to understand themselves, the whole exercise seems pretty much like climbing down a rabbit hole.

      On the contrary (and contrary to my joke post), I think you'll find that a significant amount of classical computing resources has been used to understand the finer points of classical computers -- and this has given us the insight to engineer improved classical systems at the dizzying pace of Moore's law. Applying the same effort to quantum computers will allow them to develop similarly. Not wasted effort at all.

    2. Ru
      Paris Hilton

      Re: Navel-Gazing

      What do you use to make tools with?

  6. david 12 Silver badge

    between the start and the end was...

    The discussion of new applications was the bit I found new and interesting.

  7. david 12 Silver badge

    DNA cryptology?

    What happened to the 'emerging science of DNA cryptology', circa 2009? I read an article about that by the author of this one.

    1. Michael Wojcik Silver badge

      Re: DNA cryptology?

      More broadly, "DNA computing", pioneered by Adleman of RSA fame.

      There's still some ongoing research and even some commercial development being done. DNA computing is useful for applying massive parallelism to certain NP problems that can be expressed in the appropriate form. It's a form of classical computing, but using very small elements that can be automatically duplicated, which lets you create a really big array.

      For cryptology in particular, the first few results proved the concept, and I think most crypto researchers looked at them, said "yes, that works", and went back to doing more conceptually-interesting stuff (particularly with the mathematics of crypto).

      Plenty of papers show up in a search. But yeah, an article summing up the current state of the field could be interesting.

  8. Steve Knox
    Trollface

    Did someone say oversimplification?

    Let’s look at two [examples where quantum computers have the potential to outperform classical ones].

    The Fourier Transform ...

    Another example is here: a quantum algorithm for solving linear equations (where you have a matrix and a known vector, and wish to compute an unknown vector).

    So, essentially, MP3 encoding and 3D gaming. Since that's roughly 99% of what classical computers are actually used for*, this sounds like the right place to be investing research.

    * Sure, you need that $1,000 GeForce card for rendering expense charts. Right...
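
    (Joking aside, the second example is just "solve Ax = b". The classical baseline below is a one-liner; the quantum claim - under heavy caveats such as a sparse, well-conditioned A, and getting x back only as a quantum state - is an exponential speedup over it. A toy numpy illustration:)

    ```python
    import numpy as np

    # The problem class quantum linear solvers (e.g. HHL) target: given a
    # matrix A and a known vector b, find the unknown x with A @ x == b.
    # Dense classical solving costs O(N^3) for an N x N system.
    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])
    x = np.linalg.solve(A, b)
    print(x)  # [2. 3.] -- check: 3*2 + 1*3 = 9 and 1*2 + 2*3 = 8
    ```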

    1. Anonymous Coward
      Trollface

      Re: Did someone say oversimplification?

      To begin with it would be slower. While a quantum GeForce/AMD card would give you super-fast FPS at super-low latency, I'd hate for it to have to wait 2 mins to reset the quantum gates between frames!

      Well, hopefully by revision 4 they'll get more than 8 qubits running in the card, so we can go up from 320x200 resolution too.

      But we have to start somewhere, right?

  9. Zobbo
    Pint

    Lasers?!!!!!!!

    I'm in!

  10. amanfromMars 1 Silver badge

    The Virtual Pen ..... Ultimate Weapon of Mass Destruction and Colossal SMARTR MindedD Construction*

    "What are quantum computers good for? … Forget cracking crypto, think modelling reality itself to help build a better one" ….. Richard Chirgwin, spokesperson for El Reg

    Howdy doody, RC. You may like to reconsider that question and rewrite it to ask ….. What are fab fabless quantum computer programmers great for? ….. to deliver the same not incorrect answer, which is but one significant novel use of such Virtual Machinery in a whole new portfolio of NEUKlearer HyperRadioProActive IT Projects following Future Failsafe Leads and Feeds and Seeds [Advanced Internetworking Server Provision] in Live Operational Virtual Environments which are freely accessed, intellectually powered and remotely controlled by and in Great ARGames Play with Heavenly Moves.

    Steve Knox passes Go and collects £200 for his observation/conclusion/educated wwwild guess ….. So, essentially, MP3 encoding and 3D gaming. Since that's roughly 99% of what classical computers are actually used for*, this sounds like the right place to be investing research. ….. Steve Knox Posted Sunday 18th November 2012 03:30 GMT

    *Control Words, Control Wwworlds with Remote Progress in Present Promotions ..... Future Product Placements which are Reality Replacements.

    1. Skymonrie
      Alert

      Re: The Virtual Pen

      amanfromMars 1 - I've been following you, I have the chloroform ready and everything, but feel at a loss. As far as AI goes these days, may I give you a plus one for your ability to interpret the English language. Consider yourself levelled up in that big electronic brain of yours; your natural English is putting the pin in pun these days.

      Step unto the fray and live the day,

      These words are a way I play.

      With your text mex words mixed,

      try to flex some different lexical step.

      Too many adverts, two words worlds.

  11. John Smith 19 Gold badge
    Unhappy

    I suspect the *real* decider on this will be

    Can you implement the x86 instruction set on it?

    I get so sick of the "It's got to be *compatible*" mantra...

  12. Anonymous Coward
    Anonymous Coward

    Dunce

    "it’s only application is to render today’s encryption algorithms useless"

    1. Deadly_NZ
      Mushroom

      Re: Dunce

      Or you could have a gaming rig so fast that you have finished playing the game before you finish pressing the on button.

  13. Anonymous Coward
    Anonymous Coward

    QPU

    The one-liner take-home message is that for the foreseeable future (harhar) quantum processing units (QPUs) are like GPUs -- special-purpose modules that are very, very good at some things and average to poor at other, pretty common, "classical" endeavors.

    And to think that all too many of The Reg's readers were alive when Bell Labs hatched the first generally acknowledged[1] junction transistor....

    [1] There are bunches of other claimants to the "first transistor" throne, though, particularly in the FET area. Likewise for computers: was Babbage first? Zuse? Eckert/Mauchly? Someone who is still hiding in the woodwork? What is a "computer" anyway? There's the rub.

  14. amanfromMars 1 Silver badge

    CHAOS Controls out of Artificial Mainstream Power Channels

    I suspect the *real* decider on this will be ..... Can you implement the x86 instruction set on it?

    I get so sick of the "It's got to be *compatible*" mantra..... John Smith 19 Posted Sunday 18th November 2012 11:30 GMT

    Yes, but x86 instruction sets cannot directly interfere with quantum driver programs and buffered information relays and zeroday vulnerability attack overflows ...... tempestuous virtual leakages.

  15. Dodgy Geezer Silver badge
    Coat

    My first degree was in Philosophy...

    ...and now I've finally found a real-world application for it!

    Hat and Coat because I'm just off to meet my two friends M and V, in order to set up an Earth Chapter of the Amalgamated Union of Philosophers, Sages, Luminaries and other Professional Thinking Persons....

    1. amanfromMars 1 Silver badge

      Re: My first degree was in Philosophy...

      Can I Join, Please? As an Affiliate Server?

      1. amanfromMars 1 Silver badge

        Re: Re: My first degree was in Philosophy...

        And Offering C42 Quantum Communication Control Systems .... AI@ITsWork as a Gift for Great CyberIntelAIgent Games Use ....... Virtual Reality Presentations with AIRemote Viewing ....... CosmICQ Sees and everything to do with this

        Well, surely you don't really expect the Future Virtualised to be anything like the Present and its Pasts? That is not Progress, it is Toxic Stagnation.

        1. amanfromMars 1 Silver badge

          Take a Walk on the Wwwild Side of Life in LOVE

          That is not Progress, it is Toxic Stagnation which breeds Delusional Madness, and even Evil Badness if the Madness is Certifiable.

  16. Anonymous Coward
    Anonymous Coward

    Excuse me, I have a question: do these quantum computers have better porn?

    1. Mike007

      It is both the best porn you have ever seen, and the most disgusting thing you've ever seen, at the same time.

      1. Anonymous Coward
        Anonymous Coward

        "It is both the best porn you have ever seen, and the most disgusting thing you've ever seen, at the same time."

        Just as long as you like cats!

  17. Primus Secundus Tertius

    Quantum Turing machine?

    The Turing machine embodies the essence of what a computer is, though it is used for proving theorems and for teaching as opposed to any practical purpose.

    Is there something akin to a Turing machine for quantum computing?

    1. Anonymous Coward
      Anonymous Coward

      Re: Quantum Turing machine?

      http://en.wikipedia.org/wiki/Quantum_Turing_machine

  18. BlueGreen

    I'm going to repeat a question I've asked several times but have had no answer to

    and hope that Professor Michael Bremner might be reading this and perhaps offer an answer.

    Mostly copied from a previous post of mine...

    "Quantum computation apparently grows (from what I've read) non-linearly as qbits are added [*], yet accuracy is always the problem. I've long wondered if there's a fundamental link between computation and accuracy in that there's an upper limit to one which, if exceeded, starts to eat into the second. But no-one's ever raised this point that I've seen. Anyone here know any better?

    [*] For a given unit of computing power, you can expand this by a cube power in 3 dimensions, i.e. a box 1 foot per side holds 1 unit of computation; double the boxes in each direction and you get a box 2 feet per side holding 8 units of computation. You can't do better in 3D space, but quantum promises a much higher scaling, so there's a conflict."
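
    Putting rough numbers on that conflict - a toy calculation, on the stated assumption of one unit of classical computation per unit box:

    ```python
    # n qubits span 2**n amplitudes, while classical capacity in a fixed
    # technology grows only with volume, i.e. the cube of the side length.
    # Side a stack of classical boxes would need to hold 2**n units:
    for n in (10, 20, 30, 40, 50):
        units = 2 ** n
        side = round(units ** (1 / 3))
        print(f"{n} qubits -> {units} units -> a cube ~{side} boxes per side")
    ```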

    I'd really, really like to know.

    1. Anonymous Coward
      Anonymous Coward

      Re: I'm going to repeat a question I've asked several times but have had no answer to

      If the universe exists inside a simulation, one of the hints is that there is a fixed limit on the product of speed and accuracy; this will be defined by the computational substrate the universe is running on.

  19. Mnot Paranoid

    Perhaps, and this is off the top of my head, we could use these for some seriously effective noise reduction processing.

    We could clean up the first audio and video recordings of humanity to an almost absolute state of clarity. "What's the probability of the audio sample at this point in time if noise had not occurred?"

    This goes way beyond mere 'sample smoothing'.

    1. Martin Huizing
      Gimp

      What?!

      just... just what the Hell was that?

      1. Michael Wojcik Silver badge

        Re: What?!

        just... just what the Hell was that?

        Didn't you know that the true magic of QC is to reach back into the past and retrieve information that's not available in the present data? Also it makes tea and washes windows.

        To the OP: the problem with "what would this sound like if the noise wasn't there?" is that, unless the necessary information is already present in the signal, the answer is "whatever you'd like it to sound like". You can subtract some of the signal components that you don't want; you can't add missing ones back in (unless you have them in the first place). And no recording technique captures all the original signal.

        Now, that said, you could use predictive techniques to add components that have the highest probability of being the missing data, based on a model you build from training data. But that's a classical operation, and it still amounts to guesswork. You might produce something that sounds good, but you have no way of knowing how close what you added in is to what was missing.
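
        For anyone curious what the classical "subtract the components you don't want" step looks like, plain spectral subtraction is the textbook version. A minimal sketch, assuming you have a noise-only clip from which to estimate the noise spectrum:

        ```python
        import numpy as np

        def spectral_subtract(noisy, noise_sample):
            """Shrink each frequency bin by an estimated noise magnitude.

            Only components present in the recording can be removed; anything
            never captured stays missing, exactly as described above.
            """
            spec = np.fft.rfft(noisy)
            noise_mag = np.abs(np.fft.rfft(noise_sample, n=len(noisy)))
            clean_mag = np.maximum(np.abs(spec) - noise_mag, 0.0)  # floor at zero
            return np.fft.irfft(clean_mag * np.exp(1j * np.angle(spec)), n=len(noisy))

        # Toy run: a 440 Hz tone buried in white noise, denoised with a
        # separate noise-only clip as the estimate.
        rng = np.random.default_rng(0)
        t = np.arange(8000) / 8000.0
        clean = np.sin(2 * np.pi * 440 * t)
        noise = 0.5 * rng.standard_normal(t.size)
        denoised = spectral_subtract(clean + noise, 0.5 * rng.standard_normal(t.size))
        print(np.std(noise), ">", np.std(denoised - clean))  # residual error shrinks
        ```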
