
AI accelerator spearheads Qualcomm’s second crack at data center

Despite last week’s settlement with Apple, Qualcomm’s smartphone business remains under pressure, and its expansion beyond mobile devices continues.

Having found limited demand so far in IoT, it is now chasing that other shiny new market – AI processing workloads in the data center. With its new Cloud AI 100 accelerator, a 7nm chip that claims 10 times the performance per watt of its rivals, Qualcomm is hoping to steal some thunder from Intel, Google and Nvidia.

Intel has been scrambling to make up for the shortcomings of its x86 CPU architecture in the new workloads demanded by AI and ML tasks – tasks that favor the parallel computing capabilities of GPUs (graphics processing units) and FPGAs (field programmable gate arrays), which serial CPUs cannot match. Nvidia, meanwhile, is trying to adjust to life after the cryptocurrency crash, which gutted the demand for its GPUs that had driven its share price skyward; to this end, it has chosen the data center and automotive as key markets. Intel too is chasing cars, with its huge $15.3 billion Mobileye purchase all the evidence needed there.
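To make the parallelism point concrete, here is a minimal illustration (generic NumPy, nothing to do with any vendor’s design): a single dense neural network layer is one large matrix multiply in which every output element can be computed independently – exactly the shape of work that GPUs, FPGAs and dedicated accelerators exploit, and that a serial CPU core must grind through a few operations at a time.

```python
# Illustrative only: why inference favors parallel silicon.
import numpy as np

batch, n_in, n_out = 32, 1024, 1024
x = np.random.rand(batch, n_in).astype(np.float32)   # input activations
w = np.random.rand(n_in, n_out).astype(np.float32)   # layer weights

# One layer's forward pass is batch * n_in * n_out multiply-accumulates,
# all independent of each other. A serial CPU works through them a few
# SIMD lanes at a time; a GPU, FPGA or ASIC runs thousands in parallel.
y = np.maximum(x @ w, 0.0)  # matrix multiply followed by a ReLU
print(y.shape)              # (32, 1024)
```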

Google, meanwhile, has its Tensor Processing Unit (TPU) silicon, but seems to be planning to use it internally only, inside its Google Cloud Platform (GCP). And AWS has been exploring its own ARM-based CPUs, in another threat to Intel x86’s dominance of the data center. Amazon’s Inferentia chip is, as the name suggests, designed for inferencing tasks, and was built by the firm’s Annapurna Labs subsidiary.

Huawei’s Kunpeng 920 is another ARM-based server chip design, and Facebook’s open source activities continued when it released Kings Canyon, an ASIC likewise designed for inferencing.

Now Qualcomm is joining the fray, but it comes from far outside the data center silicon incumbency. Indeed, its ARM-based server platform has reportedly been placed on the back burner, although its new AI activities may signal a return to that project.

It is attacking first by focusing on accelerators rather than CPUs – a tactic that served Nvidia well. Rather than going head-to-head with Intel, the idea is to surround the x86 CPU with PCIe cards that can be plugged into conventional server motherboards, allowing intensive workloads like AI to be offloaded without replacing the entire hardware platform.
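A minimal sketch of that offload pattern is below. The `cloudai` module and all of its calls are hypothetical stand-ins, not Qualcomm’s actual SDK – the point is simply that the host x86 CPU keeps serving the application while only the inference-heavy step crosses the PCIe bus to the card.

```python
import numpy as np
import cloudai  # hypothetical accelerator runtime, NOT a real SDK

# Load a pre-compiled model onto the PCIe accelerator card at start-up.
device = cloudai.Device(0)                 # hypothetical call
model = device.load_model("resnet50.bin")  # hypothetical call

def handle_request(image: np.ndarray) -> int:
    # Light pre-processing stays on the host x86 CPU...
    batch = np.expand_dims(image.astype(np.float32) / 255.0, axis=0)
    # ...while the compute-heavy forward pass is dispatched over PCIe,
    # leaving the CPU free for networking, storage and other requests.
    logits = model.run(batch)              # hypothetical call
    return int(np.argmax(logits))
```

This is also why accelerator cards slot into existing fleets so easily: the server, the operating system and most of the application code barely change.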

Qualcomm is keen to stress that the Cloud AI 100 is not simply a repurposed mobile processor; the design is an entirely new signal processor built specifically for AI inferencing workloads. The PCIe form factor allows for some very beefy cooling options, and Qualcomm claims peak performance of up to three times that of the Snapdragon 855 system-on-chip, and about 50 times that of the older Snapdragon 820. The ’10 times’ claim over rivals, specifically, is measured against FPGA approaches to inferencing.

The company is being coy about actual performance benchmarks. It told VentureBeat that its new chip could provide “far greater” than 100 TOPS (trillion operations per second), but that doesn’t quite square with the claim of delivering three times the performance of the Snapdragon 855, as that SoC can support 7 TOPS – which would put the Cloud AI 100 in the 21 TOPS range.
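The arithmetic behind that discrepancy is worth spelling out, taking the two public claims at face value:

```python
# Cross-checking Qualcomm's public claims (simple arithmetic only).
snapdragon_855_tops = 7   # the 855's stated AI throughput
claimed_multiple = 3      # "three times the Snapdragon 855"

implied_tops = snapdragon_855_tops * claimed_multiple
print(implied_tops)       # 21 -- well short of the ">100 TOPS" quote
```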

With the chip slated for release in 2020, much will depend on the specific details, on what rivals launch in the meantime, and on pricing. The offering will need to be cheap enough to displace the other approaches, or data center managers will simply deploy larger quantities of cheaper, less efficient designs.

So, a bloodbath may be looming in the AI inference processing sub-sector. Intel is a giant, but one that has been flailing somewhat in response to the new challengers. Nvidia is out in front; Google has its own designs, as does AWS; while Microsoft seems happy to play the field. Qualcomm is a very capable company, and it could certainly pull this off, but this is a market that is going to see competing vendors kicking lumps and tearing strips out of each other. It’s still in flux.
