Genuine question
Folks in the know are often heard saying that quantum computers are very good for certain tasks. What are these tasks, and how will they make my life and everyone else's life easier?
Canadian boffins have brought quantum computers a step closer to reality, by identifying one feature that will be key to finally building one – "contextuality". Our computers use the binary system of 1s and 0s. Quantum computers use qubits (quantum bits), which can exist in "superposition", meaning that they're simultaneously 0 …
Folks in the know are often heard saying that quantum computers are very good for certain tasks. What are these tasks, and how will they make my life and everyone else's life easier?
You will be able to
1) Factorize numbers for fun and profit
1.1) Probably do quantum crypto stuff on the side to alleviate 1)
2) Do quick searches in a database (faster than using a search tree)
3) And simulate quantum processes without exponential slowdown. THIS IS THE MOST IMPORTANT PART as it means advances in material science and deeper understanding of how molecular interactions actually work.
Also, many algorithms that are probabilistic at the moment should be significantly sped up; that covers many situations in experiments where you need to find an element in a set with certain properties. Much of computational mathematics would be significantly easier with that.
Nice! That means number 2 is worth 100's of billions of dollars! MWUHAHAHA!!!!
1) Factorize numbers for fun and profit
1.1) Probably do quantum crypto stuff on the side to alleviate 1)
Probably not necessary. Shor's algorithm takes O((lg N)^3), so for each additional RSA key bit you get an 8-fold increase in the run time.
Back in 2003, RSA was recommending 2048-bit keys for short-term security (estimated secure through the year 2030) and 3072-bit keys for long-term security. Most software that processes RSA keys can handle 3072-bit keys and they are trivial to generate. In 2003, RSA estimated that by 2030 you'd be able to build a classical computer for about $10M that could factor a 2048-bit key in only 100,000 years - so those recommendations are pretty conservative.
If someone builds a very big, very fast QC with thousands of reliable qubits so they can factor a 2048-bit key in, say, a day, it'd still take them 22 years to factor a (single) 3072-bit RSA key - if their machine is big enough and reliable enough to do it.
It's not something someone's going to do in order to snaffle your credit-card details.
And all the cool kids are moving to ECDHE for perfect forward secrecy anyway.
That means number 2 is worth 100's of billions of dollars!
Number 2 (using Grover's algorithm to search an unsorted database) is worth whatever the workloads that can be profitably applied to it are worth. My guess is there won't be that many of them.
Essentially you're looking for cases where you have a very large unsorted in-memory database, in a QC (so "in memory" means "in qubits"). You have to perform queries against this so quickly that linear search isn't good enough, and for some reason you can't index it, amortizing the cost of indexing over multiple queries.
Those conditions are going to rule out nearly all of the naive potential applications.
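For a sense of what that search speedup looks like, here is a toy state-vector simulation of Grover's algorithm (plain Python/NumPy on a classical machine; the size N = 8 and the marked index are arbitrary choices for illustration):

```python
import numpy as np

def grover_search(n_items, marked):
    """Simulate Grover's algorithm with a classical state vector.

    n_items : size of the unsorted 'database' (a power of 2)
    marked  : index of the item the oracle recognises
    Returns the probability of measuring the marked item.
    """
    # Start in the uniform superposition over all indices.
    state = np.full(n_items, 1.0 / np.sqrt(n_items))

    # Optimal number of iterations is about (pi/4) * sqrt(N).
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n_items)))

    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        state[marked] *= -1
        # Diffusion: reflect every amplitude about the mean.
        state = 2 * state.mean() - state

    return abs(state[marked]) ** 2

prob = grover_search(8, marked=5)
print(f"P(marked) after Grover: {prob:.3f}")   # ~0.945 for N = 8
```

Note the speedup is only quadratic: roughly sqrt(N) oracle calls versus roughly N/2 classical probes on average, which is why the workload has to be enormous, and unindexable, before this wins.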
Shor's algorithm takes O((lg N)^3), so for each additional RSA key bit you get an 8-fold increase in the run time.
Good lord, what was I thinking. Obviously for each additional key bit the time is increased by a factor of ((lg N) / ((lg N) - 1))^3, which is only 8 when N = 4. Duh. But the larger point still stands: the time is polynomial, yet the exponent is large enough, and the defender's cost of adding key bits small enough, that longer keys stay comfortably ahead.
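To put numbers on that corrected factor, a quick back-of-the-envelope check under the same O((lg N)^3) cost model (ignoring constants; plain Python):

```python
# Growth factor for one extra key bit under an O((lg N)^3) cost model:
# going from (b - 1) bits to b bits multiplies the run time by
# (b / (b - 1))^3, which tends to 1 as b grows.
def per_bit_factor(b):
    return (b / (b - 1)) ** 3

for b in (2, 8, 1024, 2048, 3072):
    print(f"{b:5d}-bit key: x{per_bit_factor(b):.5f} per extra bit")
# b = 2 (i.e. N = 4) gives the factor of 8; at 2048 bits the
# per-bit factor is only about 1.0015.
```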
does that mean we now have a way of proving whether the D-Wave computers are true quantum computers or just very expensive snake oil?
I thought that a recent test of whatever it is the D-Wave box does, showed that it was doing it in a way that would be expected if it was doing some sort of quantum computing. Or that somebody had programmed the man behind the curtain to simulate the way a quantum computer would do things. I don't know, I'm not really qualified to comment.
But I do know how the petrol engine in the Chicago high-rise that apparently ran on water worked. (Hint - long vertical runs of water pipe and petrol is less dense than water.)
"Quantum computing" is a loose term. The question is, does the D-Wave
1) Use Quantum data
2) Use it in a way that is helpful for calculations.
Both of those are somewhat trivial for labs to do, so I would assume it's possible to commercialise this. The difficulty is doing so at a cost less than a regular computer for the same task (as said further up, tree searches etc) and whether it can do so on large enough data sets... as a single qubit is not much help...
" at the least, it is a very good fridge..."
AH-HA! I knew this was beer related.
Our computers use the binary system of 1s and 0s. Quantum computers use qubits (quantum bits), which can exist in "superposition", meaning that they can be a 0 or a 1 and everything in between... simultaneously.
NO!
"The state of the qubit is a complex-valued probability distribution over the state space { 0, 1 }"
Isn't that clearer?
I was about to comment on exactly this.
Quantum computers, by definition, store information in quantum properties. Quantum properties, by definition, are eigenvalues - that is, they can only take specific values, so a qubit can be either a 0 or a 1 and nothing in between.
An example of such a property is the 'spin' of an electron (confusingly, it's called spin, but has nothing to do with the electron actually spinning). This can be either 'up', or 'down' (again, this actually has nothing to do with orientation and is just nomenclature).
Other quantum properties can take multiple values, such as the energy level of an electron in an atom, but because these have different energies, they are not stable in isolation, because an electron can collapse from a higher level to a lower one by releasing a photon.
Spin states are normally what is known as 'degenerate' - that is, both states have the same energy level. This can be affected by altering the electromagnetic field around the particle, to make one state more energetic than the other, thus 'aligning' spin (this is how MRI scanners and NMR spectrometers work). This can be used to set, or measure, the spin state of a particle, which is an integral part of making any quantum device work.
I get that the observed value of the qubit has to be 0 or 1, and I get that the superposition of states is a probability distribution, but what does "over the state space" actually mean?
To put it another way, let's suppose I have a probe that can somehow measure the value of the probability distribution. In what, and how, do I move it to create a map of that probability distribution?
It's easy enough to understand for a d orbital, for instance. The probability of finding the electron at any instant varies over "normal" three-dimensional space, and if we take some boundary condition such as p < 0.001, we can create a three-dimensional representation of the d orbital. But how does this work for a qubit? What is a "state space"?
Of course you can never measure "the value of the probability distribution".
The only way to plot is to prepare a large number of identical states, take a measurement, and mark the bin into which the measurement falls.
At the end of the day, you count the crosses in the bin, divide by the number of trials and then finagle a continuous function.
Just as in classical physics.
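That prepare-measure-bin loop is easy to mock up classically. A sketch in plain Python with made-up amplitudes for a single qubit (no real hardware involved; 0.6 and 0.8 are arbitrary values chosen so the squared magnitudes sum to 1):

```python
import random
from collections import Counter

# A hypothetical single-qubit state: amplitudes for |0> and |1>.
# |a0|^2 + |a1|^2 must equal 1 (here P(0) = 0.36, P(1) = 0.64).
a0, a1 = 0.6, 0.8

def measure():
    """One trial: prepare the state, measure, record 0 or 1."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# Repeat on many identically prepared states and bin the outcomes.
trials = 100_000
counts = Counter(measure() for _ in range(trials))

for outcome in (0, 1):
    print(f"P({outcome}) ~ {counts[outcome] / trials:.3f}")
# The empirical frequencies converge on |a0|^2 = 0.36 and |a1|^2 = 0.64.
```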
What is a "state space"?
Ok, let's use "sample space" for the individual classical outcomes. Sample space is just the possible values of the measurement with a funny notation:
For one qubit: |0> , |1>
For two qubits: |00>, |10> , |01> , |11> (the space gets larger quickly - as 2^N)
For three qubits: |000>, |001>, |010>, |011>, |100> etc...
Now assign a complex-valued probability to each element of the sample space, under the constraint that the complex-valued vector of dimension 2^N has unit length (in the two-norm: the sum of the squared magnitudes of the individual components).
This is a straightforward extension of classical probability, where you assign real values to each element of the sample space, under the constraint that the real-valued vector of dimension 2^N has unit length (in the one-norm: the sum of the nonnegative components). It seems that this extension is the only one which makes mathematical AND physical sense.
"State space" of your current quantum state is the (Hilbert) space in which that complex-valued vector of 2^N dimensions "lives". This vector describes the full state of your N-qubit machine (it describes a "pure state"). There is the added finesse that shifting the overall "complex phase" of the state vector (the same phase applied to every component) has no physical significance, so properly the state is a one-dimensional subspace of the state space, also called a "ray".
And that's about it. If you take a discrete space (of qubit arrays or anything else) and take it "to the limit" you get the continuous case, where you need infinite-dimensional state space vectors - these may describe, for example, the position of a particle; each component of the vector gives the complex amplitude (whose squared magnitude is the probability density) of "finding" the particle at the given position.
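The construction above can be made concrete in a few lines. A toy sketch in NumPy (my own illustration, not a quantum library): build a 2^N-dimensional complex vector, normalise it in the two-norm, and sample a measurement outcome from the squared magnitudes:

```python
import numpy as np

N = 3                       # number of qubits
dim = 2 ** N                # size of the sample space: |000> ... |111>

rng = np.random.default_rng(42)

# An arbitrary complex-valued vector over the 2^N basis states...
amplitudes = rng.normal(size=dim) + 1j * rng.normal(size=dim)

# ...normalised to unit length in the two-norm, as required.
amplitudes /= np.linalg.norm(amplitudes)

# Measurement probabilities are the squared magnitudes; they depend
# only on |amplitude|, so a global complex phase is invisible.
probs = np.abs(amplitudes) ** 2

# Sample one measurement outcome and label it as a ket like |010>.
outcome = rng.choice(dim, p=probs)
print(f"measured |{outcome:0{N}b}>")
```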
will this computer provide precise calculations that are simultaneously correct and incorrect ? (Quantum mechanics make my head hurt).
Yes, but you can't actually predict which is which and as soon as you measure the result the other one is lost; i.e. the observed result is not the result prior to the observation, as per quantum mechanics just observing it changed its state.
Thus if the observed answer is correct the original answer was wrong and vice versa. The tricky bit, of course, is knowing which is which: is the unobserved answer correct, and if so what was its value; or if the observed answer is correct, why did it get the wrong answer in the first place, and should it be trusted?
What possible use this would be is beyond me. More tea vicar?
...or does this article not actually say anything?
It seems to just say you need some "magic contextual state stuff" to do quantum computing but doesn't say what the heck that means. As far as I can tell the term never even existed before this paper, so we're just as much in the dark as before we read the article.
You might as well say that you need some pixie dust, really (if they'd used a pentangle instead of a triangle for that diagram, it'd be just as informative).
As far as I can tell the term never even existed before this paper
How hard did you look? Here is a use in this context from 2008.
This presentation may clarify things for some readers. Then again, it may not.^{1}
The presentation associates contextuality (which I take to mean roughly "no non-contextual hidden variables allowed!") with a discrete version of the Wigner probability-distribution function. I just skimmed it and this is very much not my area, but I think I get the gist. The authors are saying: here's a subset of quantum theory that says for QC your system has to have a negative discrete Wigner function, and you can only get that if it "violates a contextuality inequality".
If you still don't care for "contextuality", you'll be happy to learn that it is equivalent to "negative quasi-probability". HTH.
^{1}Exercise: Measure clarification across a large number of readers and compute probability distribution.
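The "negative quasi-probability" point can already be seen for one qubit. Below is a sketch using one common single-qubit convention for the discrete Wigner function (Wootters-style phase-point operators; my own toy illustration, not the paper's exact construction): a stabilizer state such as |0> gives a non-negative distribution, while a "magic" state picks up a negative entry:

```python
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.diag([1.0, -1.0]).astype(complex)

def discrete_wigner(rho):
    """2x2 discrete Wigner function of a single-qubit density matrix
    (one common phase-point-operator convention; entries sum to 1)."""
    W = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            A = 0.5 * (I + (-1)**i * Z + (-1)**j * X + (-1)**(i + j) * Y)
            W[i, j] = 0.5 * np.trace(rho @ A).real
    return W

def ket_to_rho(psi):
    psi = psi / np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

# Stabilizer state |0>: Wigner function is non-negative everywhere.
print(discrete_wigner(ket_to_rho(np.array([1, 0], dtype=complex))))

# "Magic" state (Bloch vector along x+z): one entry goes negative.
magic = np.array([np.cos(np.pi / 8), np.sin(np.pi / 8)], dtype=complex)
print(discrete_wigner(ket_to_rho(magic)))
```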
Quantum anything is basically refined magic, try explaining a semiconductor laser to someone in 1947.
It seems that a lot of work done by Richard Feynman laid the theoretical groundwork for quantum computers, although at the time he probably never imagined how fast it would advance.
Often science advances when people try an "impossible" experiment and it works, then spend decades trying to figure out why it did.
Actually pixie dust is trademarked by IBM, used in the latest hard disks in the form of a thin layer of IIRC vanadium on the platter to focus the domains in order to permit reliable recording.
HAMR is actually a variant of a technique originally used on early military Flash memory which spot heated up the chip to write data faster but this was classified Top Secret until now.
A lot of interesting physics depends on effects we don't fully understand. In fact, Flash memory based on 256-level cells has been known about since 1990, in the form of single-chip voice recorders, before some clever people worked out that you could offset cells and store more data between cells by mathematically subtracting in the analogue domain, compensating for cell wear by using conventional memory to store the frequently written data.
This is now used on all memory chips but it is indeed tricky stuff (tm) as 3b/cell is a lot harder to read back due to marginal states and wear in the oxide layers.
700-900W/cell for 8 b/cell chips max, OK for movies but not so much for reliable data storage, 1TB SD cards are feasible but would cost a lot more than a simple BD3+ or HVD.
I have heard rumours of a 16 b/cell device but that would be insane. 1TB on a microSD :-O