# Google schedules quantum compute chip for end of year

Google has committed to a showdown with supercomputers, aiming to prove the capabilities of its in-house quantum computing silicon. The new design aims to demonstrate quantum supremacy – showing that the new computing architecture can trounce the most powerful arrays of traditional silicon, and opening up a new field to be embraced by emerging AI and machine-learning (ML) technologies.

So-called because of their use of quantum mechanical properties, the new chips threaten to outperform the most powerful supercomputers, potentially in power packages that could be run from batteries. While they have huge value in scientific research and experiments, quantum computers have picked up a little infamy for their potential to render current methods of digital encryption redundant. Consequently, they have garnered a lot of interest from governments, with cryptography specialists turning their attention to techniques that would stand up to these new attack methods.

The goal is to create a new form of computing that can perform tasks that are economically impossible within von Neumann computing – that is, the type of computing that is the norm today. Such a chip would be able to outperform supercomputers in certain functions, although the likes of x86 and ARM will still have their place – powering the vast majority of devices.

Most of the work in quantum computing has been academic, but the standout companies have been IBM, Google, and D-Wave – the latter of which has been selling its quantum computers to the likes of Lockheed Martin and NASA, while attracting a lot of negative attention and accusations that its machines were not proper quantum computers.

D-Wave has managed to mostly dispel those allegations, and currently ships a 2000-qubit machine – launched in January, with Temporal Defense Systems named as its first customer. However, there are still vocal critics of D-Wave’s approach, and the company has always operated against a background noise of skepticism and outrage. Currently, D-Wave’s system is only able to tackle optimization problems, and not the full range of quantum tasks.

The company sold a 512-qubit unit to NASA in 2013 for $15m, and currently appears to be the leader in the space. Google was involved in that purchase, announcing that it was partnering with NASA to launch the Quantum Artificial Intelligence Lab at the Ames Research Center, using the D-Wave machine.

D-Wave’s 2000Q computer uses quantum processing units (QPUs) to generate and house the qubits behind its quantum annealing approach. At launch, D-Wave claimed that the new system would outperform classical servers by factors of 1,000 to 10,000 in its benchmarks, using its lattice of qubits built from niobium.

But there are still major hurdles to overcome. The quantum scale of the chips means they suffer from quantum decoherence – the same fragility of quantum states under outside interaction that the infamous double-slit experiment illustrates – but far more prominent is the matter of cost, in terms of dollars as well as R&D man-hours.

D-Wave’s machine cools its QPUs to nearly absolute zero (0.015 kelvin) so that the qubits remain superconducting, and shields them from vibration and electromagnetic interference, operating in a vacuum – as any external disturbance would ruin the calculation. Miniaturizing such a computer for use in a mobile device seems close to impossible: the 2000Q currently measures 10 feet by 7 feet by 10 feet. D-Wave does cite the relatively tiny power consumption of its system compared to traditional supercomputers – around 25kW, about 100x less.

Where a traditional chip represents data as bits – the most basic unit of information, expressing its value in binary form as a one or a zero, “bit” being a portmanteau of binary digit – the quantum chip uses qubits, or quantum bits. These qubits, thanks to the quantum mechanical properties of the computer, can occupy both binary positions at the same time, existing not only as one or zero but as superpositions of those two states.

Multiple qubits can be held in a superposition of states, scaling as 2^{n} where *n* is the number of qubits – meaning a pair of qubits can represent 4 superposition states, a trio of qubits 8 such states, and 4 qubits 16 states. Google’s test will use a 49-qubit design.
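That 2^{n} scaling can be sketched with a toy state-vector simulation in plain NumPy (an illustrative model only – not how any vendor’s hardware or software actually works): applying a Hadamard gate to each of *n* qubits puts the register into an equal superposition of all 2^{n} basis states.

```python
import numpy as np

def uniform_superposition(n):
    """Apply a Hadamard gate to each of n qubits, yielding an equal
    superposition over all 2**n basis states."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard
    gate = np.array([[1.0]])
    for _ in range(n):
        gate = np.kron(gate, H)  # build the n-qubit gate H (x) H (x) ... (x) H
    state = np.zeros(2 ** n)
    state[0] = 1.0  # start in the all-zeros basis state |00...0>
    return gate @ state

for n in (2, 3, 4):
    amps = uniform_superposition(n)
    print(n, len(amps), round(float(amps[0]) ** 2, 4))
# 2 4 0.25
# 3 8 0.125
# 4 16 0.0625
```

Each extra qubit doubles the number of amplitudes the register spans, matching the 2-qubit/4-state, 3-qubit/8-state, 4-qubit/16-state progression described above.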

The traditional computing model only allows for one of these states at a time, while the quantum model scales with the increasing number of qubits. The limiting factor for quantum computing is the number of qubits in that 2^{n} relationship, as the calculation’s solution can only be read out as a number of classical bits less than or equal to *n*. Essentially, you need more qubits to ask more complicated questions, and these questions can be extremely complex.
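That read-out limit can be illustrated with the same toy NumPy model (a hypothetical sketch, not a real quantum API): however many of the 2^{n} amplitudes a register holds internally, measuring it collapses the state to a single *n*-bit classical outcome, sampled according to the squared amplitudes (the Born rule).

```python
import numpy as np

def measure(state, seed=0):
    """Collapse an n-qubit state vector to one n-bit classical outcome,
    chosen with probability equal to each amplitude squared (Born rule)."""
    n = int(np.log2(len(state)))
    probs = np.abs(state) ** 2
    outcome = np.random.default_rng(seed).choice(len(state), p=probs)
    return format(outcome, f"0{n}b")  # just n classical bits come out

# A 3-qubit register spans 8 amplitudes, but a single measurement
# yields only 3 classical bits:
state = np.full(8, 1 / np.sqrt(8))
print(measure(state))  # one of '000'..'111', each with probability 1/8
```

This is why extracting an answer from a quantum calculation is itself a design problem: the algorithm must steer probability toward the basis states that encode the solution before measurement.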

This allows the computer to carry out its calculations in a manner the traditional approach cannot match. Because the qubits are held in a superposition of states – both binary values simultaneously – they can be applied to certain extremely complex mathematical problems that the classical von Neumann architecture could only tackle with unfeasible amounts of compute resources.
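Those “unfeasible amounts of compute resources” can be made concrete with back-of-envelope arithmetic (assuming, purely for illustration, 16 bytes per complex amplitude): exactly simulating an *n*-qubit state on classical hardware means storing 2^{n} amplitudes, which for a 49-qubit chip like Google’s is already around 9 petabytes of memory.

```python
def state_vector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit state vector classically,
    at one 16-byte complex amplitude per basis state."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (10, 30, 49):
    print(f"{n} qubits: {state_vector_bytes(n):,} bytes")
# 10 qubits: 16,384 bytes (~16 KB)
# 30 qubits: 17,179,869,184 bytes (~17 GB)
# 49 qubits: 9,007,199,254,740,992 bytes (~9 PB)
```

The doubling per qubit is what makes 49 qubits a plausible supremacy threshold: each added qubit doubles the classical memory bill, while the quantum chip just adds one more physical qubit.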