Tuesday, June 14, 2016

"Google Hartmut Neven predicts that within 10 years there will only be quantum machine learning and no machine learning on classical computers"

Um...NVIDIA, are you listening?

From Next Big Future:
Google is working on error corrected adiabatic (analog) quantum computer designs. Tehy work less like a conventional computer and are less well understood theoretically. And they would still need a way to deal with errors. But the burden of error correction should be much smaller. As a result, it should be much easier to demonstrate the power of a quantum computer this way.

The team used the analog quantum computing approach to program a superconducting quantum chip to simulate nine atoms interacting magnetically. That was made possible by drawing on some of the error correction techniques developed in earlier work on the harder-to-scale-up digital quantum computing.

The chip used had nine of the basic building blocks of a quantum computer, known as qubits. It would take an analog quantum computer with 40 or more qubits to demonstrate what researchers charmingly call “quantum supremacy”—meaning a system that can conclusively do things impossible for a conventional computer.

Google says it can scale up to that point relatively quickly, and other researchers in the field say it’s credible.

It would likely take scaling up a little further to do useful work with an analog quantum computer. If and when Google or some other company does that, the devices could be used to crack tough chemistry problems in health or energy by simulating atoms to a level of realism impossible today....MORE
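To get a feel for what "simulating nine atoms interacting magnetically" means for a classical machine, here is a rough Python sketch of a nine-spin transverse-field Ising model. The couplings and field strengths are illustrative assumptions, not the parameters of Google's chip; the real point is that even nine spins already live in a 2^9 = 512-dimensional state space that has to be handled exactly.

```python
import numpy as np

n = 9                                            # nine spins / qubits, as in the excerpt
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
sz = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z

def op_on_site(op, site):
    """Embed a single-spin operator at `site` into the full 2**n-dimensional space."""
    mats = [op if k == site else I2 for k in range(n)]
    full = mats[0]
    for m in mats[1:]:
        full = np.kron(full, m)
    return full

# Assumed toy parameters: nearest-neighbour coupling J and transverse field h.
J, h = 1.0, 0.5
H = np.zeros((2**n, 2**n), dtype=complex)
for i in range(n - 1):                           # magnetic (ZZ) couplings along a chain
    H -= J * op_on_site(sz, i) @ op_on_site(sz, i + 1)
for i in range(n):                               # transverse field on every spin
    H -= h * op_on_site(sx, i)

evals = np.linalg.eigvalsh(H)                    # exact diagonalisation, 512 x 512
print("Hilbert-space dimension:", 2**n)
print("Ground-state energy of the toy model:", round(float(evals[0]), 4))
```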
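And on the "40 or more" qubits quoted above: a back-of-the-envelope memory estimate shows why brute-force classical simulation runs out of road around there. The 16 bytes per amplitude assumes double-precision complex storage, and the qubit counts are just sample points I picked, not published thresholds.

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    # An exact simulation stores 2**n complex amplitudes (complex128 assumed).
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (9, 30, 40, 45, 50):
    b = statevector_bytes(n)
    print(f"{n:2d} qubits -> {b:>22,} bytes ({b / 2**30:,.1f} GiB)")
```

At 40 qubits that is already around 16 TiB of state vector, and every additional qubit doubles it, which is the intuition behind the supremacy threshold the excerpt mentions.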
HT: Beyond Search

See also:
What Can Quantum Computers Be Used For?
"Google says it has now proven that D-Wave’s quantum computer really works" (GOOG)

Regarding the quantum, this is not Schrödinger's Cat