
AWS considers Kubernetes; Intel, DeepMind, China crank up their AI engines
By ARCHITECHT • Issue #116
I told you this one would be late, and I delivered on my promise. The links to individual items are sans analysis and include the original headlines, but they’re actually pretty clear about what you’ll be reading—and there are a lot of good ones. Hopefully I made up for the dearth of analysis there with a nice chunk of it up top.
There was some pretty big news on Thursday led, I think, by a report that Amazon Web Services is considering building a new Kubernetes-based container-orchestration service, or at least making its existing Elastic Container Service more Kubernetes-friendly. Nothing is confirmed at this point, but the idea makes perfect sense.
Basically, AWS is in a common position for vendors and cloud providers in a world where open source infrastructure technologies—especially those, like Kubernetes, with huge, active communities—reign supreme. You can either embrace them and offer services around them, or you can stubbornly buck against them and try to convince users that your proprietary (or might-as-well-be-proprietary) service is the better choice. We’ve seen this happen time and time again over the past decade; even where providers still offer their own proprietary options, they know they need a story around certain open source technologies. Hadoop, Spark, MySQL and Postgres all come to mind.
Embracing Kubernetes seems like a no-brainer for AWS, which already hosts the majority of Kubernetes workloads as it is. Providing a managed service would just let AWS convert some of those Kubernetes-on-EC2 deployments into higher-margin engagements. It would also help AWS capture some users eyeing Google and Microsoft, both of which offer managed Kubernetes services and are actively engaged in its development.
However, one big gotcha for AWS to watch out for is making sure it doesn’t just pay lip service to Kubernetes, but actually engages in the community and builds something that users will want to use. This isn’t an ancillary big data service where good enough might do the trick. A lot of people expect Kubernetes to play a big role in application architectures going forward, which means users and competitors alike will be quick to identify and call out shortcomings.
This seems like a good opportunity to call out a bunch of podcast episodes and other stuff I’ve written about Kubernetes and the cloud so far this year. So, here goes (in somewhat chronological order):
There was also some big AI news on Thursday, highlighted for me by the Movidius/Intel Neural Compute Stick. As the name might suggest, it’s a USB stick that lets computers or other devices run deep learning models (specifically, convolutional neural networks) without requiring a connection to cloud resources.
Obviously, the idea of embedded AI processors is not new (that’s what Movidius, which Intel acquired last September, has always done), but what makes the NCS so interesting is that it’s not an embedded device. Nor is it a full-powered GPU.
Rather, it’s an inexpensive piece of plug-and-play hardware that can add a little intelligence, and computing horsepower, to all sorts of computers—laptops, desktops, Raspberry Pis, or anything with a USB port and the right specs. The NCS isn’t going to threaten Nvidia in the data center, but it could help Intel take a bite out of GPU sales to hobbyists and developers, ideally (for Intel) acting as a stepping stone to bring them into the Intel camp. Should they need fully embedded AI chips for production devices or—assuming Intel expands its range—more horsepower inside PCs or servers, they’ll now have a non-Nvidia choice.
For more on the importance of processors in AI, check out the latest ARCHITECHT AI and Robot Show, featuring an interview with Baidu researcher Sharan Narang about the company’s hardware benchmarks and where the field is headed.
The other really big piece of AI news was new details of the Chinese government’s plan to make China the world’s AI powerhouse by 2030. As I’ve pointed out before, I don’t think we’ll see real winners and losers when it comes to commercial AI—China, the United States and other regions can all thrive in their own ways and benefit from one another. But looking at things through a geopolitical lens is a different story. And, there, Chinese spending on AI for things like the military, surveillance and core scientific research should make other countries think about whether they’re doing enough to remain competitive on that very large stage.
Finally, Thursday also brought a spate of items from Google’s research arm DeepMind, which is stepping up its efforts to build AI systems that can function more like human brains by incorporating imagination and other human-only traits. DeepMind is the company that pioneered video-game-playing systems and brought us AlphaGo, so it serves as a pretty good harbinger of what’s coming in AI. Here’s a collection of DeepMind news items and research papers from Thursday:

Sponsor: Bonsai
Listen to the latest ARCHITECHT Show podcast
Artificial intelligence
Sponsor: DigitalOcean
Cloud and infrastructure
All things data
New ARCHITECHT Show every Thursday; new AI & Robot Show every Friday!

ARCHITECHT delivers the most interesting news and information about the business impacts of cloud computing, artificial intelligence, and other trends reshaping enterprise IT. Curated by Derrick Harris.

Check out the Architecht site at

Powered by Revue