Re: Theory v Practice
I'll have a go. (If your question is WHY is quantum reality like (or somewhat like) this, no-one has the faintest idea. It's like asking why gravity exists. At present, just assume that $DEITY thought it was a good idea).
A quantum bit can be in both states 0 and 1 at once. In the simplest case the superposition is equal, so when it is observed there is a 50% probability of getting a 1 and a 50% probability of getting a 0.
Entangled bits are in a joint state whose properties are mathematically correlated, so for example two qubits can represent 0, 1, 2 and 3 at the same time. Call an ensemble of entangled quantum bits a quantum register. Start it off so that its N bits represent all numbers in the range 0 to (2^N)-1 at once, each with equal probability 1/(2^N).
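A classical sketch of what "all numbers at once" means, just tracking amplitudes in ordinary Python (the register size N here is an arbitrary choice for illustration): putting each of N qubits through a Hadamard gate gives every N-bit value the same amplitude, hence the same 1/(2^N) observation probability.

```python
import math

# Toy amplitude bookkeeping, not a real quantum simulator.
N = 3                      # arbitrary register size for illustration
size = 1 << N              # 2^N basis states

# A Hadamard on each of N qubits, starting from |00...0>, gives every
# basis state 0..2^N-1 the same amplitude 1/sqrt(2^N).
state = [1.0 / math.sqrt(size)] * size

# Observation probabilities are amplitude squared: 1/2^N each.
probs = [a * a for a in state]
print(size)                # 8 numbers represented at once
```

Eight numbers held in three qubits; every extra qubit doubles the count, which is where the exponential scare factor comes from.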
Now perform arithmetic operations on your N-bit quantum register. For example, if you add a quantum register to itself and then observe it, you are guaranteed an even number: the self-addition operator leaves bit 0 with zero probability of being a one. So far, somewhat trivial.
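The self-addition example can be sketched classically too: track the probability of each value, map every x to 2x mod 2^N (self-addition on an N-bit register), and every odd outcome ends up with zero probability. The register size is again an arbitrary choice.

```python
N = 4
size = 1 << N

# Start uniform: every value 0..2^N-1 with probability 1/2^N.
probs = {x: 1.0 / size for x in range(size)}

# "Add the register to itself": each value x becomes 2x mod 2^N.
after = {}
for x, p in probs.items():
    y = (2 * x) % size
    after[y] = after.get(y, 0.0) + p

# Bit 0 now has zero probability of being 1: only even outcomes remain.
odd_prob = sum(p for y, p in after.items() if y & 1)
print(odd_prob)          # 0.0
print(sorted(after))     # [0, 2, 4, 6, 8, 10, 12, 14]
```

Each even outcome now carries twice the original probability, since two inputs map onto it; the total still sums to 1.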
It gets really interesting if they can make a quantum register with a large number of bits and perform operations on it such that the probability of its representing any number that is not a factor of a specified, much larger number is zero. If the specified number is the product of two large primes, then when you observe the quantum register you will obtain one factor or the other; the probability of any other bit pattern is zero. (This is, in rough outline, Shor's factoring algorithm.) Suddenly not nearly so trivial!
You obtain the other factor, and also check that entanglement did not fail during your quantum computation, by conventional long division on a conventional computer. Entanglement failure during a quantum computation is akin to a hardware failure in a conventional CPU, except far more likely and (on the positive side) not irreversible: you can just run the computation again. For the factorisation application (i.e. cracking N-bit public-key cryptography), even an unreliable quantum computer is useful, as long as it occasionally spits out a right answer with a probability much higher than 1/(2^N) (i.e. guesswork).
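For concreteness, here is the classical wrap-up in the textbook version of Shor's algorithm: the quantum register is used to find the period r of a^x mod n, and a couple of lines of ordinary arithmetic then recover the factors. The conventional-division check is the last step. (The function name and the choice of a = 7 are mine, for illustration.)

```python
from math import gcd

def factors_from_period(a, r, n):
    # Given the period r of a^x mod n (the part the quantum hardware
    # supplies), recover the factors classically. Returns None when
    # r is odd or a^(r/2) is -1 mod n; then you retry with another a.
    if r % 2:
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None
    f = gcd(y - 1, n)
    if not 1 < f < n:
        return None
    return f, n // f       # n // f is the conventional-division check

# n = 15, a = 7: 7^x mod 15 cycles 7, 4, 13, 1, so the period r is 4.
print(factors_from_period(7, 4, 15))   # (3, 5)
```

If the quantum step returned a wrong period, the division check fails and you simply rerun, which is why an occasionally-right machine is still devastating to 1/(2^N)-odds cryptography.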
Last time I read up on it, they'd managed a four-bit quantum computer and were able to factorise 15 (in other words, when observed after the mathematical operators were applied, their 4-bit register almost always said 3 or 5, rarely anything else). Of course that's trivial, but a 2048-bit quantum computer would be anything but. I don't know what the latest qubit count is; 30-plus would start getting practically very useful, and probably very secret.
My money is on quantum computing breaking down somewhere between four bits and 2048 bits, and on that proving to be a gateway to some new physics. (Hopefully not of the Laundry variety, involving a takeover of our universe by beings best not even thought about.) Philosophically, I'm not prepared to take on board the implications of a working megabit quantum computer (or giga-, tera-, zetta- ...!)
BTW, if you are doing research in this field and make a sudden breakthrough of huge magnitude, your only chance of staying alive and at liberty is to spam the details to as much of the world as you can as fast as possible. There are also some mathematical conjectures (unproven, but believed true by almost all mathematicians) for which a disproof would have similarly vast real-world implications.