D-Wave has announced that D-Wave Hybrid, its open source workflow platform for building hybrid applications that span both classical computing architectures and new quantum computing designs, is now available to download from GitHub. Unsurprisingly, the ulterior motive is to drive demand for D-Wave's computer systems, as these quantum processors will have to be integrated properly into existing business applications.

Solving that headache should make it easier for D-Wave to sell into the enterprise and cloud markets. Making the D-Wave Hybrid workflow open source should help there, although open source does seem to be a prerequisite for most projects in the AI and ML sector anyway. Just remember, Microsoft now owns GitHub, but here's the repository, should you want to delve deeper into the documentation.

Currently, quantum computing finds itself in a position similar to many IoT initiatives – it seems like a solution looking for a problem. Consequently, it has to demonstrate a use, rather than simply appeal to government spooks looking to crack the encryption applied to all the data they have collected and intercepted. Tin-foil hats aside, D-Wave's approach is sensible, but we are still years out from these sorts of appliances being added to data centers for general availability.

D-Wave Hybrid is mostly based on Python, and includes example projects that should help developers get up to speed quickly. Access to D-Wave learning tools and technical forums is also available, which should help developers leverage the modular framework – essentially a mechanism for splitting a large computing problem into parts that can be run across classical and quantum computers. Quantum computers are not general-purpose machines, and so need to be supported by conventional compute.
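To make that decompose-solve-recombine pattern concrete, here is a purely illustrative toy in plain Python. Every name in it is invented for this sketch – the actual dwave-hybrid package supplies its own composable components for decomposition, sampling and recombination – but the shape of the workflow is roughly this:

```python
import random

random.seed(0)

# Toy sketch of the hybrid pattern: split a large problem, hand the
# "hard" sub-part to a (simulated) quantum solver, solve the rest
# classically, then recombine the partial answers. All function names
# here are hypothetical, for illustration only.

def decompose(problem, size):
    """Split a list of variables into a hard subproblem and the rest."""
    return problem[:size], problem[size:]

def quantum_solve(subproblem):
    """Stand-in for a QPU call: assign each variable randomly."""
    return {v: random.choice([0, 1]) for v in subproblem}

def classical_solve(subproblem):
    """Stand-in for a classical heuristic: assign zeros."""
    return {v: 0 for v in subproblem}

problem = [f"x{i}" for i in range(6)]
hard, easy = decompose(problem, size=2)
solution = {**quantum_solve(hard), **classical_solve(easy)}
print(solution)  # one assignment covering every variable
```

In the real framework the two solvers run as composable branches over a shared problem state, and the decomposition is driven by the problem's structure rather than a fixed split.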

“Prior to the general availability of D-Wave Hybrid, building hybrid algorithms was much more ‘from scratch’ by each user,” Murray Thom, D-Wave’s VP of software and cloud services, told *VentureBeat*. “With today’s Hybrid availability, we’ve provided an on-board for developers to more quickly get started prototyping their hybrid applications.”

Thom also published a blog post about the advent of quantum computing, in which he stresses: "don't let anyone tell you otherwise; quantum applications will always and only be hybrid." The second main point Thom makes is that customers value results, and it is these results that prove the value of the technology to D-Wave customers. "It's our job to build useful quantum technologies to help benefit your business, not to mire you in the mechanics of it all."

The final main point is that quantum computers will power new frontiers, "meeting the demand of entrepreneurs with wild new ideas and businesses in competitive markets." This would be why D-Wave is pushing the Hybrid framework, as people can't exactly buy a developer kit for these things just to play around with on their desks, as you would with a Raspberry Pi or Arduino environment. These machines are incredibly expensive, with the D-Wave 2000Q costing around $15mn, and Thom's blog has more detail about the intricacies of the computers themselves and the framework.

D-Wave is somewhat separate from the rest of the quantum computing pack. The likes of IBM and Google are still down in the double-digit qubit range, while D-Wave is a couple of orders of magnitude higher. However, there has been quite a lot of disagreement as to whether D-Wave's technology is actually displaying quantum mechanical properties, in the proper sense of the word. The machines do appear to work, but heavy marketing has brought with it a whole heap of skepticism, and that conflict seems to be rumbling on. Currently, D-Wave claims to have a 5,640-qubit chip, called the Pegasus, but it seems unfair to compare the D-Wave approach to the rest of the industry for now.

Academics seem to think that you need at least a couple more orders of magnitude above D-Wave before there are enough qubits on hand to power the sorts of calculations needed. Over a million seems to be something of a consensus, and we seem a long way from that. However, the quantum approach could enable much smaller supercomputer arrays, which would also be able to provide much faster answers, thanks to the fundamentally different computing approach. No longer would you need racks upon racks of CPUs – you might need just a couple of cabinets of these quantum units, provided you can keep them cool enough, paired with classical hardware in the sort of hybrid approach that D-Wave is pushing.

A quantum chip represents data using qubits – quantum bits – rather than the classical bit, the most basic unit of information, which expresses its value in binary form as a one or a zero ("bit" being a portmanteau of binary digit). Thanks to the quantum mechanical properties of the computer, a qubit can occupy a superposition of both binary positions at the same time, rather than being locked to a one or a zero – meaning a single qubit carries more information during a computation than a classical bit, even though measuring it still yields only a one or a zero.
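As a rough mental model, a single qubit's state can be written down classically as a pair of amplitudes for the two basis states, with the squared amplitudes giving the measurement probabilities. A minimal sketch, assuming the textbook amplitude picture (this is a toy classical simulation, not how a physical qubit is accessed):

```python
import math

def probabilities(alpha, beta):
    """Probability of measuring 0 or 1 for the state alpha|0> + beta|1>.

    The amplitudes must satisfy |alpha|^2 + |beta|^2 = 1.
    """
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition of |0> and |1> (the effect of a Hadamard
# gate on |0>): both measurement outcomes become equally likely.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)
p0, p1 = probabilities(alpha, beta)
print(p0, p1)  # each outcome has probability ~0.5
```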

Multiple qubits can be held in a superposition of states, to a factor of 2^n, where n is the number of qubits – meaning that a pair of qubits can represent 4 superposition states, a trio of qubits can represent 8 such states, and 4 qubits can represent 16.

The traditional computing model only allows for one of these states at a time, while the quantum model scales exponentially with the number of qubits. The limiting factor for quantum computing is the number of qubits in that 2^n relationship, as the calculation's solution can only be read out as a number of classical bits less than or equal to n. Essentially, you need more qubits to ask more complicated questions, and these questions can be extremely complex.

This allows the computer to carry out its calculations in a manner that the traditional approach cannot match. As the qubits are held in a superposition of states, where they represent both binary values simultaneously, they can be used in certain extremely complex mathematical problems that the classical von Neumann architecture could only tackle with infeasible amounts of compute resources.

Currently, the typical quantum computer is a tiny chip and a massive cooling array, needed to bring the qubits to a low enough temperature – as close to absolute zero as possible – for those quantum properties to emerge. The qubits will lose those properties, and with them the answer to the question posed, within a fraction of a second, so the machine needs to be able to process and read the results very quickly. Similarly, the qubits will interfere with each other, and balancing them is pretty tricky. Vibrations are also problematic.