I told you this one would be late, and I delivered on my promise. The links to individual items are sans analysis and include the original headlines, but they’re actually pretty clear about what you’ll be reading—and there are a lot of good ones. Hopefully I made up for the dearth of analysis there with a nice chunk of it up top.
Basically, AWS is in a common position for vendors and cloud providers in a world where open source infrastructure technologies—especially those, like Kubernetes, with huge, active communities—reign supreme. You can either embrace them and offer services around them, or you can stubbornly buck against them and try to convince users that your proprietary (or might-as-well-be-proprietary) service is the better choice. We’ve seen this happen time and time again over the past decade; even where providers still offer their own proprietary options, they know they need a story around certain open source technologies. Hadoop, Spark, MySQL and Postgres come to mind off the top of my head.
Embracing Kubernetes seems like a no-brainer for AWS, which, as is, already hosts the majority of Kubernetes workloads. Providing a managed service would just let AWS convert some of those Kubernetes-on-EC2 deployments into higher-margin engagements. It would also help AWS capture some users looking at Google and Microsoft, both of which offer managed Kubernetes services and are actively engaged in its development.
However, one big gotcha for AWS to watch out for is making sure it doesn’t just pay lip service to Kubernetes, but actually engages in the community and builds something that users will want to use. This isn’t an ancillary big data service where good enough might do the trick. A lot of people expect Kubernetes to play a big role in application architectures going forward, which means users and competitors alike will be quick to identify and call out shortcomings.
This seems like a good opportunity to call out a bunch of podcast episodes and other stuff I’ve written about Kubernetes and the cloud so far this year. So, here goes (in somewhat chronological order):
AWS can no longer take its cloud dominance for granted. Here are 4 reasons why not.
Buoyant CEO on why cloud-native, why now, and why doing it at Twitter was very hard
Kubernetes 201 with CNCF director Dan Kohn and Microsoft’s Gabe Monroy
CoreOS CEO on Kubernetes, containers and coopetition with cloud providers
Google VP Eric Brewer on open source, innovation and the economics of cloud
Kubernetes creator Brendan Burns on joining Microsoft and where we’re really at with containers
Documenting the rise of Docker, and the multi-cloud, with Datadog’s Olivier Pomel
Why the future of enterprise IT includes containers, clouds and, yes, IBM
Box co-founder on moving from monolith to microservices—and the promise of Kubernetes to set developers free
Microsoft joins Cloud Foundry, and Cloud Foundry embraces Kubernetes
Kubernetes is a big deal, but an overlooked one
The container consolidation begins
Is the container war even winnable?
Are containers Microsoft’s ace in the hole in the cloud?
There was also some big AI news on Thursday, highlighted for me by the Movidius/Intel Neural Compute Stick. As the name might suggest, it’s a USB stick that lets computers or other devices run deep learning models (specifically, convolutional neural networks) without requiring a connection to cloud resources.
Obviously, the idea of embedded AI processors is not new (that’s what Movidius, which Intel acquired last September, has always done), but what makes the NCS so interesting is that it’s not an embedded device. Nor is it a full-powered GPU.
Rather, it’s an inexpensive piece of plug-and-play hardware that can add a little intelligence, and computing horsepower, to all sorts of computers—laptops, desktops, Raspberry Pis, or anything with a USB port and the right specs. The NCS isn’t going to threaten Nvidia in the data center, but it could help Intel take a bite out of GPU sales to hobbyists and developers, ideally (for Intel) acting as a stepping stone to bring them into the Intel camp. Should they need fully embedded AI chips for production devices or—assuming Intel expands its range—more horsepower inside PCs or servers, they’ll now have a non-Nvidia choice.
The other really big piece of AI news was new details from the Chinese government’s plan to be the world’s AI powerhouse by 2030.
As I’ve pointed out before, I don’t think we’ll see real winners and losers when it comes to commercial AI—China, the United States and other regions can all thrive in their own ways and benefit from one another. But looking at things through a geopolitical lens is a different story. And, there, Chinese spending on AI for things like military, surveillance and core scientific research should make other countries think about whether they’re doing enough to remain competitive on that very large stage.
Finally, Thursday also brought a spate of items from Google research arm DeepMind, which is stepping up its efforts to build AI systems that can function more like human brains by incorporating imagination and other human-only traits. DeepMind is the company that pioneered video-game-playing systems and brought us AlphaGo, so it serves as a pretty good harbinger of what’s coming in AI. Here’s a collection of DeepMind news items and research papers from Thursday: