1 February 2019

Lattice thinks it has the low-power FPGA market cornered

The marketplace for the semiconductors used in IoT applications is as turbulent as ever, and Lattice is striking out towards the network edge in the AI realm with its low-power FPGAs, looking to complement its cloud and communications offerings. The move seems wise, given the rough time its FPGA brethren are having in the data center, and we spoke to Lattice’s Deepak Boppana, Senior Director of Segment and Solutions Marketing, about the current environment.

We asked Boppana about the state of the FPGA market for AI applications, which he sees as split into two camps – the cloud, and the edge. Lattice is more focused on edge devices, whereas Xilinx and Intel’s Altera are currently trying to fend off the likes of Nvidia and Google in the data center, which are encroaching on the FPGA big-boys with their GPUs and custom ASICs, respectively.

To that end, Boppana has noticed that Lattice’s competitors are hardening parts of their FPGA accelerator designs, effectively removing some of the reprogrammability in order to make them faster at a given function. They are trying to match the speeds that ASICs can achieve, and to this end, Intel is positioning its Nervana ASICs as an option for these workloads, which could impact its Altera sales. The challenge is compounded by the small number of buyers in the data center realm, and by the fact that the likes of Google (TPU) and AWS (Graviton) are pursuing their own silicon designs.

The network edge, where machines ship with on-device AI features, is where Lattice is focused, according to Boppana, with a plethora of applications that could integrate FPGA-powered AI functions. These sorts of customers require the flexibility that FPGA silicon provides, because it is still early days in AI and the architectures are still evolving. Similarly, the types of data collected by sensors at the edge are still settling, and with the range of different parameters presented by voice, video, audio, and telematic data from machinery, adaptability is key. In time, dedicated ASIC designs might emerge, but for now, the FPGA seems like a much more enticing proposition.

Lattice is playing in the milliwatt-to-watt (1 mW to 1 W) power range – orders of magnitude below the likes of Xilinx and Altera, and still a lot less power-hungry than the low-power FPGAs from Microsemi, which Boppana says are used mostly in the aerospace and defense markets. We asked whether Boppana was worried that these rivals might encroach on Lattice’s turf, but he said that the data center vendors have their hands full fighting off the ASIC incursion, and that the Microsemi technology is not as low-power as Lattice’s designs. He stressed that optimizing for low-power or embedded designs is a fundamentally different problem from designing for the effectively unlimited power supply of data centers.

To that end, Boppana says Lattice doesn’t really have direct rivals for the milliwatt-scale FPGAs, and that the startups venturing into this area are up against the fact that Lattice is a well-connected incumbent supplier, with a wealth of customer and partner relationships. Of course, he notes, you can’t rule out anyone, but he believes the scale of the customer base is a very significant head start.

As for time-frames, Boppana says there is a two-to-five-year adoption horizon, although this depends on the application. In the two-year frame, smart home functions like voice and video analysis are the strongest candidates, and in the three-year realm are smart city applications, such as parking monitoring, automated toll collection, smart ATMs, and vending machines. Towards the five-year end are industrial and automotive, industries that move much more slowly, where FPGAs can provide predictive analytics, maintenance functions, or sensor-data processing.

We asked about the difference in pricing across these sorts of applications, given that in the IoT, low power often equates to low value. These FPGA-powered devices aren’t typically the battery-powered sensors that generate data for cloud applications further up the stack, so the comparison doesn’t quite hold. Lattice is very aware that the consumer side of things is much more price-sensitive, whereas the mission-critical applications in industrial and automotive have much better margins, as those customers are willing to pay for the integrations.

Further, we asked about the overlaps between applications, in terms of chip requirements. Boppana said that there is overlap in the underlying functionality of the silicon, but that the major difference is the training data that feeds the machine-learning model, which is then pushed to the end-devices powered by these FPGAs. As such, it is quite a fragmented space, and that is one of the reasons Lattice is so excited by it.

But Boppana notes that it is still early days for the AI sector, and that there haven’t yet been enough deployments to gauge how often customers will need the in-field upgrades that the FPGA architecture facilitates. We questioned whether many customers actually needed the reprogrammable functions, and Boppana emphasized that the ability to carry out in-field, on-device training makes them an obvious selling point: in that scenario, a user needs to be able to tweak the silicon that powers the models, and so a technology like the FPGA is required.

Over time, of course, as an application becomes more settled, its model could be ported to a specialist ASIC, but as we are still in the early phases of the market, that is not a great concern for the likes of Lattice. In time, sure, they will need a compelling answer for why an FPGA makes more sense than a single-use design, but there’s hardly a shortage of applications for the FPGA vendors to pursue. Broadly, Boppana notes, the consumer side of things uses in-field upgrades far less frequently than the likes of communications infrastructure.

We pushed Boppana, asking whether AI absolutely needed these in-field upgrade options, whether they were inextricable from the value proposition. He said it was hard to answer, but noted that because Lattice is playing at the milliwatt scale, it seems unlikely that users are going to see major cost benefits in moving from the FPGAs to a dedicated ASIC design.

He added that one can’t discount custom ASICs and ASIPs (Application-Specific Instruction-set Processors), but that the nice thing about FPGAs is that they aren’t necessarily an all-or-nothing proposition: they can complement these other dedicated chips inside a device. A good analogy is ARM’s big.LITTLE architecture in smartphone SoCs, where smaller, less power-hungry cores run the background processes until the larger cores need to be woken to perform the heavy lifting. As such, you might find a low-power FPGA in a camera tasked with scanning for possible humans, before an ASIC is fired up to actually analyze a person’s face. Boppana said this is a bit like how the human body has reflexes that precede the brain’s thinking processes.
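The "reflex" pattern Boppana describes can be sketched in a few lines of pseudo-logic: an always-on, milliwatt-scale detector gates a power-hungry recognition stage, so the expensive model only wakes when the cheap one sees something. This is a hypothetical illustration, not Lattice's actual API; the function names, the toy brightness heuristic, and the wake threshold are all invented for the example.

```python
# Hypothetical sketch of a cascaded wake-up pipeline: a low-power always-on
# detector (the FPGA's role) gates a heavier analysis stage (the ASIC's role).
# All names and thresholds here are illustrative, not any vendor's real API.

def cheap_presence_detector(frame):
    """Stand-in for a milliwatt-scale model: a rough 'is someone there?'
    score, here just the fraction of bright pixels in the frame."""
    bright = sum(1 for px in frame if px > 128)
    return bright / len(frame)

def heavy_face_analysis(frame):
    """Stand-in for the power-hungry stage that is only woken on demand."""
    return {"faces_analyzed": 1, "frame_mean": sum(frame) / len(frame)}

def process(frame, wake_threshold=0.3):
    score = cheap_presence_detector(frame)   # always running, low power
    if score < wake_threshold:
        return None                          # heavy stage stays asleep
    return heavy_face_analysis(frame)        # heavy stage fired up

quiet_frame = [10] * 100                 # mostly dark: detector stays quiet
busy_frame = [200] * 40 + [10] * 60      # 40% bright pixels: detector fires
```

The design point is that `cheap_presence_detector` runs on every frame while `heavy_face_analysis` runs only occasionally, which is where the overall power saving comes from.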

Our conversation moved to the developer community, and whether there is a problem ahead for developers looking to work with FPGA and AI technologies in their applications. Boppana outlined how a lot of work in the AI world is being done in frameworks such as Caffe, which are not part of the FPGA stacks. To this end, Lattice has developed neural network compilers, as part of its sensAI portfolio, which translate the model from the framework onto the FPGA, so that a user does not need to be an expert in FPGA design.
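Lattice's actual sensAI compiler is proprietary, but one step any such tool has to perform when mapping a framework-trained model onto constrained fabric is quantization: squeezing floating-point weights into the small fixed-point formats the hardware can handle. The toy sketch below illustrates that general idea only; the function names and the simple max-abs scaling scheme are assumptions for the example, not sensAI's real method.

```python
# Illustrative only: a toy version of the weight-quantization step a
# neural-network compiler performs when targeting low-power hardware.
# This is NOT Lattice's sensAI toolchain, just a sketch of the concept.

def quantize_weights(weights, bits=8):
    """Map float weights to signed integers of the given bit width,
    returning the integers plus the scale needed to recover them."""
    qmax = 2 ** (bits - 1) - 1                 # e.g. 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax
    quantized = [round(w / scale) for w in weights]
    return quantized, scale

def dequantize(quantized, scale):
    """Recover approximate float weights, e.g. to check accuracy loss."""
    return [q * scale for q in quantized]

weights = [0.5, -1.27, 0.003, 1.0]
q, scale = quantize_weights(weights)
recovered = dequantize(q, scale)
```

Each recovered weight lands within half a quantization step of the original, which is the trade-off such compilers manage: smaller, faster fixed-point arithmetic on the FPGA in exchange for a bounded loss of precision.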

Boppana noted that some knowledge is still needed to bridge the FPGA-enabled model with the rest of the components inside the device, in order to actually make use of the inferencing functions the model provides. Consequently, Lattice offers help to these sorts of customers as part of its design services offering, as well as fostering a pretty healthy partner community.

As for the risks to Lattice’s strategy, Boppana said it’s really all about execution. He believes the strategy is solid, and sees good evidence of customer traction in the demand for custom design services. Because of this new era of AI processing, some customers require a complete solution, and Lattice is therefore quite excited at the opportunity to deliver this sort of value-add to its customers, in conjunction with its ecosystem partners.