For IBM, 2019 is the year of the quantum computer – the industry remains bemused

IBM has announced that customers will soon be able to rent time on its quantum computer, served up via IBM’s cloud platform. Notably, neither a launch date nor pricing has been discussed, but much of the computing industry is looking on in bemusement, as there are already companies that can sell you a quantum computer if you really, really need one.

Currently, there aren’t enough use cases to inspire widescale enthusiasm. Quantum computing is best thought of as a subset of supercomputing, and while quantum machines will eventually become useful for enterprises, especially once they can solve very complex optimization problems, for now they are not really relevant. Businesses need x86 cycles to run their boring software workloads, and sometimes GPUs or ASICs. Enterprises can’t yet make use of this bleeding-edge hardware, but that time is definitely coming.

IBM’s Q machine has 20 qubits, the processing units whose quantum mechanical properties enable a pretty radical new way of tackling certain problems. Currently, far more time is spent working out how to ask a quantum computer a particular question than waiting for the answer, as the radically different architecture can solve suitable problems very quickly.

The 20 qubit size is a bit odd, given that IBM said it had a working 50 qubit machine a year ago. Intel announced that it had a 49 qubit chip in January 2018, and Atos announced a contract with the Oak Ridge National Laboratory in November 2017 to supply a 30 qubit machine to power the US lab’s work in Department of Energy (DoE) research projects. Google’s Bristlecone chip has 72 qubits.

However, there are a couple of orders of magnitude between these vendors and D-Wave Systems, a Canadian firm that can sell you a 2000 qubit machine for a mere $15mn. There has been quite a lot of disagreement as to whether D-Wave’s technology actually displays quantum mechanical properties, in the proper sense of the word. The machines do appear to work, but heavy marketing has brought with it a whole heap of skepticism. D-Wave now claims to have a 5640 qubit chip, called the Pegasus, but it seems unfair to compare the D-Wave approach to the rest of the industry at this stage.

Academics seem to think that you need at least a couple more orders of magnitude before there are enough qubits on hand to power the sorts of calculations needed. Over a million seems to be something of a consensus, and we are a long way from that. However, the quantum approach could enable much smaller supercomputer arrays, which would also be able to provide much faster answers, thanks to the fundamentally different computing approach. No longer would you need racks and racks of CPUs – you might need just a couple of cabinets of these quantum units, provided you can keep them cool enough.

Returning to IBM, which has been saying it would introduce a cloud-delivered 20 qubit service for some time now, it does seem that the company has carried out a lot of work to make its chip more robust. Currently, the typical quantum computer is a tiny chip attached to a massive cooling array, needed to bring the qubits to a low enough temperature to maintain those quantum properties – as close to absolute zero as possible.

The qubits will lose their quantum properties – and with them the answer to your question – within a fraction of a second, so the machine needs to be able to process and read the results very quickly. Similarly, the qubits will also interfere with each other, and keeping them balanced is pretty tricky. Vibrations are also problematic, and while IBM isn’t giving away much detail about the Q, it sounds like a lot of work has been put into making a machine that can be used repeatedly, without worrying about errors or breakdowns.

IBM has also put a lot of effort into creating an ecosystem around its Q projects. It has an open source SDK (Qiskit), and has been providing a simulated 5 qubit machine option to developers and researchers for free, which has apparently led to over 100 papers being published, via millions of experiments.
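For a flavour of what that SDK workflow looks like, here is a minimal sketch that entangles two simulated qubits and samples the readout. It assumes the Qiskit API of roughly this era, with the bundled local simulator, rather than anything IBM has detailed about the Q service itself.

```python
# Minimal Qiskit sketch: entangle two qubits on a local simulator and
# sample the measurement outcomes (assumes the Qiskit API of this era).
from qiskit import QuantumCircuit, Aer, execute

circuit = QuantumCircuit(2, 2)   # two qubits, two classical bits for readout
circuit.h(0)                     # Hadamard: put qubit 0 into a superposition
circuit.cx(0, 1)                 # CNOT: entangle qubit 1 with qubit 0
circuit.measure([0, 1], [0, 1])  # collapse both qubits into classical bits

backend = Aer.get_backend('qasm_simulator')
counts = execute(circuit, backend, shots=1024).result().get_counts()
print(counts)  # roughly half '00' and half '11', never '01' or '10'
```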

A quantum chip represents data using qubits – quantum bits – rather than classical bits, the most basic unit of information, which takes a single binary value of one or zero (‘bit’ being a portmanteau of ‘binary digit’). These qubits, thanks to the quantum mechanical properties of the computer, can occupy a superposition of the two binary states, effectively being both one and zero at the same time – meaning that one qubit can potentially hold two complete bits of data.
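As a rough illustration of that idea, the sketch below (plain Python with NumPy, not anything vendor-specific) represents a single qubit as a pair of complex amplitudes, applies a Hadamard gate to put it into an equal superposition, and then samples measurements – each of which collapses the qubit back to a single classical bit.

```python
import numpy as np

# A qubit is a pair of complex amplitudes over the basis states |0> and |1>.
state = np.array([1.0, 0.0], dtype=complex)    # start in |0>

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state                              # amplitudes: [0.707, 0.707]

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2                     # [0.5, 0.5]

# Each measurement collapses the superposition to one classical bit.
samples = np.random.choice([0, 1], size=10, p=probs)
print(probs, samples)
```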

Multiple qubits can be held in a superposition of states, to a factor of 2^n, where n is the number of qubits – meaning that a pair of qubits can represent 4 superposition states, a trio of qubits can represent 8 such states, and 4 qubits can represent 16 states.
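In code, that scaling is simply a power of two; the short snippet below tabulates how many basis states an n-qubit register can hold in superposition, matching the figures above.

```python
# Number of basis states an n-qubit register can hold in superposition: 2**n
for n in range(1, 5):
    print(f"{n} qubit(s) -> {2 ** n} superposition states")
# 1 -> 2, 2 -> 4, 3 -> 8, 4 -> 16
```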

The traditional computing model only allows for one of these states at a time, while the quantum model scales with the increasing number of qubits. The limiting factor for quantum computing is the number of qubits in that 2^n relationship, as the calculation’s solution can only be represented by a number of classical bits that is less than or equal to n. Essentially, you need more qubits to ask more complicated questions, and these questions can be extremely complex.
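One way to see why the classical model struggles is to count what it would cost to track that full state on a conventional machine: a classical simulator has to store all 2^n complex amplitudes explicitly, which becomes unmanageable long before n reaches the qubit counts discussed above. The figures below assume 16 bytes per double-precision complex amplitude, purely for illustration.

```python
# Memory needed for a classical statevector simulation of n qubits,
# assuming 16 bytes (one double-precision complex number) per amplitude.
BYTES_PER_AMPLITUDE = 16

for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits: {amplitudes:,} amplitudes, ~{gib:,.2f} GiB")
# 50 qubits already needs on the order of 16 PiB of memory.
```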

This allows the computer to carry out its calculations in a manner that the traditional approach cannot match. Because the qubits are held in a superposition of states, where they are both binary values simultaneously, they can be used to tackle certain extremely complex mathematical problems that the classical von Neumann architecture could only handle with unfeasible amounts of compute resources.
