Thursday 27 February 2014

D-Wave, disentangled: Google explains the present and future of quantum computing...

The performance and quantum nature of the D-Wave 2 processor continue to be topics of discussion with every new data release, and the performance figures that Google released in late January were no exception. The company has now followed up those figures with a second blog post that describes its own interpretation of the results, what it intends to test next, and where the program is likely to go from here.
The key question at the heart of the D-Wave enigma is whether the system is actually performing quantum annealing. In theory, the D-Wave 2 processor could be an excellent simulation of a quantum computer, possibly the best simulation ever built, yet still, ultimately, an approximation of what the real thing would offer. The only way to determine whether the D-Wave performs true quantum annealing is to find a test case in which the D-Wave 2 outperforms even the best classical (that is, conventional) computers.
Google’s last set of data indicated that while the D-Wave 2 outperformed off-the-shelf classical software by huge margins, hand-tuned classical computer configurations running on Nvidia GPUs were capable of competing with the quantum computer in a number of specific benchmarks. According to Google’s engineers, this close performance is an artifact of the primitive state of current quantum annealers.
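To make that comparison concrete: the benchmark problems are posed in the Ising form the D-Wave hardware natively minimizes, and a classical solver competes by attacking the same objective in software. The sketch below illustrates the general idea with plain simulated annealing. It is an illustrative stand-in, not Google's hand-tuned solver, and the problem instance, sweep count, and cooling schedule are arbitrary assumptions.

    import math
    import random

    def simulated_annealing(h, J, sweeps=2000, t_hot=5.0, t_cold=0.05):
        """Minimize the Ising energy
        E(s) = sum_i h[i]*s[i] + sum_{(i,j)} J[(i,j)]*s[i]*s[j]
        over spins s[i] in {-1, +1}."""
        n = len(h)
        s = [random.choice((-1, 1)) for _ in range(n)]
        # Adjacency list so each flip's energy change is computed locally.
        nbrs = [[] for _ in range(n)]
        for (i, j), c in J.items():
            nbrs[i].append((j, c))
            nbrs[j].append((i, c))
        for sweep in range(sweeps):
            # Geometric cooling from t_hot down to t_cold.
            t = t_hot * (t_cold / t_hot) ** (sweep / (sweeps - 1))
            for i in range(n):
                # Energy change if spin i alone is flipped.
                delta = -2 * s[i] * (h[i] + sum(c * s[j] for j, c in nbrs[i]))
                # Always accept downhill moves; accept uphill moves
                # with Boltzmann probability, which shrinks as t falls.
                if delta <= 0 or random.random() < math.exp(-delta / t):
                    s[i] = -s[i]
        return s

    # A small random problem instance (sizes chosen purely for illustration).
    random.seed(0)
    n = 16
    h = [random.uniform(-1, 1) for _ in range(n)]
    J = {(i, j): random.uniform(-1, 1)
         for i in range(n) for j in range(i + 1, n) if random.random() < 0.3}
    print(simulated_annealing(h, J))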
The D-Wave 2 is limited by what’s called “sparse connectivity,” as shown below.
[Image: qubit connectivity diagram of the D-Wave 2 processor]
Note that while the eight qubits within each cell are densely coupled to one another, the cells themselves connect in far fewer places. This limits the performance of the quantum annealer because it limits the number of states that the quantum computer can test in order to find the ideal solution to the problem. This is a separate issue from the raw number of qubits in the system (509 working qubits out of a possible 512 in this machine); it is a question of how richly those 509 functional qubits are interconnected.
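The numbers behind that sparsity are easy to work out. The D-Wave 2 arranges its qubits in a "Chimera" layout: an 8-by-8 grid of eight-qubit cells, with each qubit coupled to four partners inside its own cell but to at most two qubits in neighboring cells. The sketch below builds that graph and counts couplers; the qubit-numbering convention is ours, chosen for illustration, not D-Wave's.

    def chimera_edges(m=8, n=8, k=4):
        """Couplers of an m x n grid of unit cells, each cell a complete
        bipartite K_{k,k} block of 2k qubits (eight for the D-Wave 2)."""
        def qubit(row, col, side, i):
            # side 0 / side 1 are the two halves of a cell's bipartite block.
            return ((row * n + col) * 2 + side) * k + i

        intra, inter = [], []
        for r in range(m):
            for c in range(n):
                # Dense coupling: each qubit to all four on the other side
                # of its own cell.
                for i in range(k):
                    for j in range(k):
                        intra.append((qubit(r, c, 0, i), qubit(r, c, 1, j)))
                # Sparse coupling: one coupler per qubit to the matching
                # qubit in the adjacent cell.
                for i in range(k):
                    if r + 1 < m:
                        inter.append((qubit(r, c, 0, i), qubit(r + 1, c, 0, i)))
                    if c + 1 < n:
                        inter.append((qubit(r, c, 1, i), qubit(r, c + 1, 1, i)))
        return intra, inter

    intra, inter = chimera_edges()
    print(f"{8 * 8 * 8} qubits, {len(intra)} in-cell couplers, "
          f"{len(inter)} between-cell couplers")
    # Prints: 512 qubits, 1024 in-cell couplers, 448 between-cell couplers.
    # Every qubit touches at most six of the other 511, which is the
    # "sparse connectivity" the text describes.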
According to Google, it’s this sparse connectivity that’s allowing classical computers to keep pace with D-Wave’s quantum system. The company writes: “For each solver, there are problems for which the classical solver wins or at least achieves similar performance. But the inverse is also true. For each classical solver, there are problems for which the hardware does much better.”

Echoes of the past

The current debate over the merits of quantum annealing, and the relative performance of classical computers versus D-Wave’s system, is somewhat similar to the debate over digital vs. analog computing in the mid-20th century. From this end of history, it may look as though digital technology was an unstoppable wave that simply buried older, more primitive methods of computation, but that view glosses over the historical record.
[Image: Popular Science. Doctor Frances Baurer with the analog computer built for Project Cyclone]
Electronic analog computers were initially far faster than their digital counterparts: they operated in parallel, whereas digital systems performed operations sequentially, and they offered better precision (a 0.1% margin of error, compared with roughly 1% for the first digital systems). In the end, digital computers won out over analog, but the two types of systems coexisted at various levels for several decades.
Just as early digital systems were matched or outperformed in many respects by mature analog computers, it’s possible that D-Wave’s first quantum computing efforts can be matched or exceeded by well-tuned classical systems. In fact, given the hundreds of billions of dollars poured into the development of modern computers, it would be astonishing if scientists had, in just a handful of years, invented a new computing technology capable of beating conventional equipment in every respect.
