has a smart (go figure) article about Intel’s future
in a world where artificial intelligence is all the rage—something upon which rival Nvidia is already capitalizing. While the article is correct to paint this mainly as a battle of CPU vs. GPU at the moment, it also wisely points out that Intel made a big investment in the FPGA space when it bought Altera, as well as a smaller investment in the area of AI-specific processors when it acquired Nervana Systems.
The biggest factor in Intel's AI future (there's still a lot of money to be made in CPU-based enterprise workloads) is where all those machine learning workloads will run. If they run primarily in the cloud, the outlook for Intel is not good.
The huge web companies leading the way in AI at the moment (Google, Facebook, Amazon, Microsoft, etc.) seem to have settled on GPUs as the right architecture for training their machine learning models, a situation that’s pretty good for Nvidia. It’s an even better situation if these companies opt to also run their production AI models on GPUs, which I believe they do in many cases.
Facing GPUs and other custom-built technologies inside these companies, Intel has an uphill battle to achieve mass adoption of FPGAs and data-center-scale AI processors.
However, we’re also seeing a lot of work already to make AI frameworks run on CPUs, which are still the default option in nearly every enterprise server and consumer desktop. The idea is that if developers and enterprises actually do want to train models themselves, many will want to use chip architectures with which they’re already familiar and in which they’ve already invested a lot of money.
And then there are the specialized chips designed with the Internet of Things in mind, which I’ve covered many times already in this newsletter. Intel may have missed the smartphone wave (and all the AI algorithms that will run on those devices going forward), but the market for AI-optimized IoT chips is still wide open, and Intel has a deep pocketbook to buy up innovative technologies. In fact, in September, Intel acquired a startup called Movidius that builds low-power hardware and software targeting computer vision.
Connected devices might actually be the killer app for brain-based chips in the shortish term, because small devices will need their smarts delivered in a small, low-power package. The consumer world might never again care whether their devices have “Intel Inside,” but they will care that they work.
It’s always fashionable to write off Intel, and AI represents another good opportunity to do it, but there’s still room for Intel to capitalize on AI as the market matures.
Want to sponsor this newsletter, or the ArchiTECHt Show podcast? Email email@example.com for more information.