FPGA-based AI Edge Accelerator Chip from Efinix

August 2 · Issue #9
Artificial Intelligence Technology Update
Usage of Field-Programmable Gate Arrays (FPGAs) in Deep Learning inference workloads is not a new trend. Microsoft has a long and successful track record of using FPGAs in its “Brainwave” project, and much of the inference behind Bing and Cortana relies on FPGAs. Aside from Xilinx and Intel, a number of fabless semiconductor companies are leveraging their FPGA technology to offer reprogrammable Deep Learning accelerator chips. Flex Logix is one of the first companies to leverage its embedded-FPGA technology to build AI inference chips. Efinix (www.efinixinc.com) is another company pursuing a similar path.
Efinix
Efinix is a Santa Clara-based FPGA company that has redirected its core FPGA technology to serve AI edge inference use cases. They believe that computer vision will be the dominant AI edge application and are working with a number of security camera vendors to help them bring AI into their traditional surveillance products. Their strategy is to offer a lower-power, lower-cost alternative to customers that already use Xilinx or Intel FPGAs in their products and want to reduce cost and power dissipation. The variation of their core FPGA technology that makes them better suited for Deep Learning is called “Quantum”. The basic building block of the Quantum technology is the XLR cell (eXchangeable Logic and Routing cell), which can be configured as either a garden-variety LUT function or a routing structure. This flexibility makes die utilization far better than in traditional FPGAs. Additionally, their process technology of choice requires only 7 metal layers, compared to the 10+ layers commonly needed in traditional FPGAs, which can lead to a sizable reduction in wafer pricing. The company has raised $36M in VC and debt financing.
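For readers less familiar with FPGA internals, here is a minimal Python sketch of what a generic lookup table (LUT) does. It is purely illustrative: it is not Efinix’s XLR cell or any vendor implementation, and all names are made up.

# Minimal, illustrative model of a k-input lookup table (LUT), the generic
# FPGA building block mentioned above. NOT Efinix's XLR cell or vendor code;
# names and structure are hypothetical.

class LUT:
    def __init__(self, num_inputs, truth_table):
        # truth_table: list of 2**num_inputs output bits (0/1), indexed by
        # the integer formed from the input bits.
        assert len(truth_table) == 2 ** num_inputs
        self.num_inputs = num_inputs
        self.truth_table = truth_table

    def evaluate(self, *inputs):
        # Pack the input bits into an index (input 0 is the least significant
        # bit) and return the stored output bit.
        index = sum(bit << i for i, bit in enumerate(inputs))
        return self.truth_table[index]

# Example: a 2-input LUT programmed as an XOR gate.
xor_lut = LUT(2, [0, 1, 1, 0])
print(xor_lut.evaluate(1, 0))  # -> 1
print(xor_lut.evaluate(1, 1))  # -> 0

The point of the XLR cell, as described above, is that the same physical cell can either implement such a logic function or be repurposed as routing, which is where the utilization gain comes from.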
My Take:
In the absence of performance numbers, it is hard to predict the market reception for Efinix’s AI accelerators. One of their main strengths is that customers can start with Efinix’s traditional FPGAs to test things out; once the design is stable, they can seamlessly migrate to the AI-optimized FPGAs. This is a very big advantage and a great selling point.
One thing is for certain: the end market addressed by Efinix (inference at the edge) is being pursued by many others. The general advice given to upstarts is to focus their attention on innovation and R&D (and the rest will come). This is sound advice in many circumstances, but not always. My advice for the likes of Efinix is to dedicate significant resources to building a customer footprint (a land grab) as quickly as possible. Time is of the essence.

Learning Corner
Bayes' Theorem - The Forecasting Pillar of Data Science - DataFlair
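As a quick refresher on the topic of the linked article, Bayes’ theorem states that P(A|B) = P(B|A) · P(A) / P(B). The short Python sketch below applies it to a made-up diagnostic-test scenario; the numbers are illustrative only.

# Worked example of Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# The scenario and numbers below are made up purely for illustration.

p_disease = 0.01             # prior: 1% of the population has the condition
p_pos_given_disease = 0.95   # test sensitivity
p_pos_given_healthy = 0.05   # false-positive rate

# Total probability of a positive test (law of total probability).
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of having the condition given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ~0.161

Even with a fairly accurate test, the low prior keeps the posterior modest, which is the counterintuitive point the theorem captures.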
=========================================================
Hope you have benefited from this issue. Please forward to others if you find value in this content. I always welcome feedback.
Al Gharakhanian
info@cogneefy.com | www | Linkedin | blog | Twitter