# NASA and Google team up to buy into quantumish computing

A consortium of researchers from Google and NASA is planning to crack the issue of machine learning with a $15m quantum computer that will form the basis of a new Quantum Artificial Intelligence Lab. The new facility, which will be sited at Silicon Valley's NASA Ames Research Center, will host a 10 square meter shielded room …

#### Lofty goals

"Quantum Artificial Intelligence Lab" sounds as interesting as S.H.A.D.O.

It will be interesting to see their results, hopefully soon. IMAO, the claims about opening up NP-complete problems are out there with claims of antigravity and cold fusion, and the paper on the experimental eval (ACM 978-1-4503-2053-5) was not entirely clear about anything.

I recommend:

IT WORKS: Quantum Computation by Adiabatic Evolution

NO IT DOESN'T: Adiabatic quantum optimization fails for random instances of NP-complete problems

YES IT DOES: Adiabatic Quantum Algorithms for the NP-Complete Maximum-Weight Independent Set, Exact Cover and 3SAT Problems

...but one would also like to know exactly how D-Wave's machine maps to this approach.
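For what it's worth, D-Wave's hardware minimises an Ising/QUBO energy function, so "how it maps" amounts to encoding problems like Exact Cover as such an objective. A toy sketch of that encoding (the instance is invented for illustration, and it's brute-forced classically here rather than annealed):

```python
from itertools import product

# Toy Exact Cover instance (made up for illustration):
# universe {1..4}, candidate subsets below. An exact cover picks
# subsets so that every element is covered exactly once.
subsets = [{1, 2}, {3, 4}, {1, 3}, {2, 4}, {4}]
universe = {1, 2, 3, 4}

def energy(x):
    # QUBO-style penalty: for each element, (1 - times covered)^2.
    # Energy is 0 exactly when the chosen subsets form an exact cover.
    return sum((1 - sum(xi for xi, s in zip(x, subsets) if e in s)) ** 2
               for e in universe)

# Classical brute force over all 2^n bit strings; an annealer would
# instead relax toward low-energy configurations of the same objective.
best = min(product((0, 1), repeat=len(subsets)), key=energy)
cover = [s for xi, s in zip(best, subsets) if xi]
print(best, energy(best), cover)
```

The adiabatic-evolution papers above argue about whether that relaxation finds the energy-0 ground state efficiently; the encoding itself is the easy part.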

#### Must be more to this than sharing the hardware.

Google or NASA could afford a $15m computer on their own. Presumably this is a joint research project and the new toy is just a tool that they'll be using.

#### Re: Must be more to this than sharing the hardware.

Google and Ames are chummy. Google gets to park their jets there in exchange for attaching NASA research hardware to the planes.

#### All this cooperation with NASA

is obviously a step in their long term goal of establishing a tax free headquarters on the moon.

#### hmm

The problem with doing private research on a general quantum computer (which Google is obviously sidestepping by teaming up for this type of machine instead) is that once the technology becomes available, the military and government are going to want a monopoly on it for some time, because it could basically make alot of current encryption trivial to solve. That will be a very disruptive technology at first.

#### Re: hmm

Supposedly quantum computing will be as big a speed-up in computing as the move from mechanical to digital was (the only generation that completely blew away Moore's law), but I'm sure the experts on here can point out what a simpleton I am.

#### Re: hmm

IF they can be made to work, they will at least speed up computation of problems in quantum mechanics. That's already a BIG BIG win. They will MOST LIKELY NOT crack NP-complete (and definitely not NP-hard) problems in polynomial time, though.

Further intro at:

http://www.scottaaronson.com/democritus/lec14.html
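The "BIG BIG win" for simulating quantum mechanics follows from sheer state-vector size: a classical simulator has to track 2^n complex amplitudes for n qubits, so memory doubles with every qubit added. A rough estimate (plain arithmetic, not a benchmark):

```python
# An n-qubit pure state needs 2^n complex amplitudes; at 16 bytes per
# complex128 amplitude, the memory cost doubles with every qubit.
def amplitudes(n_qubits):
    return 2 ** n_qubits

for n in (10, 30, 50):
    gib = amplitudes(n) * 16 / 2 ** 30
    print(f"{n} qubits: {gib} GiB")
```

Thirty qubits already needs ~16 GiB; fifty needs ~16 million GiB, which is why even a modest real quantum system beats classical simulation at its own game.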

#### Re: hmm

*it could basically make alot of current encryption trivial to solve*

Care to explain how?

"Alot [sic] of current encryption" is symmetric-key block ciphering with AES and the like, or stream ciphering with RC4. What QC algorithms do you have in mind to break those at significantly better than brute force?

Asymmetric key encryption is often RSA, based on the integer-factoring problem. Shor's algorithm does factor in polynomial time, and it covers discrete-log schemes too, but it needs a large, fault-tolerant quantum computer for realistic key sizes, and nobody has one (you don't). The "square root" speed-up people usually cite is Grover's search, which applies to brute-forcing symmetric keys and is equivalent to cutting the key length in half: make your keys twice as long and there's no QC advantage.

So for both the symmetric and the asymmetric case: what exactly do you think renders any of this "trivial"?
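The "double your key length" point is just arithmetic on the usual 2^n brute-force work estimates (illustrative security-level bookkeeping, not a benchmark):

```python
def effective_bits(key_bits, grover=False):
    """Security level in bits: classical brute force costs ~2^n trials;
    Grover's quantum search cuts that to ~2^(n/2), i.e. halves the bits."""
    return key_bits // 2 if grover else key_bits

# AES-128 under Grover drops to ~64-bit work; AES-256 still leaves ~128,
# i.e. a 256-bit key against Grover matches a 128-bit key classically.
for n in (128, 256):
    print(n, effective_bits(n), effective_bits(n, grover=True))
```

Sixty-four-bit work is uncomfortable but hardly "trivial", and that's before asking where the fault-tolerant qubits come from.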

#### 20 millikelvin?

Talk about keeping your data center cool. I had a joke about overclocking here, but as soon as I knew where it was going, I lost where it was.

#### I *knew* this would happen, like last week!

But I've been trapped in a box with a cat, a vial of some cyanide compound and some sort of radioactive-hammer gadget.

#### Re: I *knew* this would happen, like last week!

"Thousands of times, and I always survived!"

#### "...a compact, low-power recognizer for mobile phones..."

How many test cycles did it have to go through before it would consistently spot that a Galaxy Note is a phone rather than a tablet? Did they have to hard-code that?