I never use Pinterest. I barely follow TechCrunch Disrupt. And yet, I found myself nodding along while reading this Disrupt interview with Pinterest president Tim Kendall, in which he describes how Pinterest is now showing ads for products visually similar to other things a user has searched for or pinned. Pinterest isn’t trying to cure cancer or power autonomous cars, but—damn!—has it found a great application for deep learning and computer vision.
Commercially speaking, Pinterest might be the perfect application of these technologies. Its business, essentially, is pictures of things and strong positive signals in the form of “pins.” Deep learning makes it possible to take what you like and show you more stuff like that. If Pinterest can show ads—so someone can actually buy something they like, or at least know where to buy something—it’s giving advertisers (and possibly consumers) a truly unique and relevant experience.
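As an illustration (not a description of Pinterest’s actual system), the “take what you like and show you more stuff like that” idea usually comes down to embedding images as vectors with a neural network and ranking candidate items by similarity to the things a user has pinned. A minimal sketch, using random vectors as stand-ins for the embeddings a trained model would produce:

```python
# Illustrative sketch only: rank catalog items by visual similarity to a
# pinned item, using cosine similarity over (hypothetical) image embeddings.
# In a real system the embeddings would come from a CNN; here they are random.
import numpy as np

def most_similar(query_vec, catalog_vecs, k=3):
    """Return indices of the k catalog items most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    c = catalog_vecs / np.linalg.norm(catalog_vecs, axis=1, keepdims=True)
    sims = c @ q                      # cosine similarity per catalog item
    return np.argsort(-sims)[:k]      # highest-similarity items first

rng = np.random.default_rng(0)
catalog = rng.normal(size=(50, 8))    # 50 items, 8-dim toy embeddings
pinned = catalog[17] + 0.01 * rng.normal(size=8)  # nearly identical to item 17

top = most_similar(pinned, catalog)
print(top[0])  # item 17 ranks first, since its embedding is closest
```

The hard part, as the rest of this piece argues, isn’t the similarity math; it’s having embeddings trained on data with strong intent signals behind it.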
As I’ve written before, the Pinterest use case is both quite mundane and a testament to how far deep learning has come in the past few years. But what makes Pinterest noteworthy is not that it’s using deep learning—seemingly everyone is today—but that it found a killer application for it.
Venture capitalists like to talk about competitive moats, and some (many?) now view data as a potentially very large moat. (Here’s Jake Flomenberg from Accel talking about data moats; here’s Jerry Chen from Greylock writing about them.) Using deep learning is not a moat, in part because everybody now has access to the tools for doing it, and in part because it can’t magically make bad data valuable. You build a moat by gathering data that’s unique, provides strong signals, and is relevant to whatever your monetization strategy is—and then applying the right machine learning approach to it.
Pinterest has done some good work molding deep learning libraries to its use case, but that effort could have been largely wasted if it hadn’t nailed the data part first.
P.S. In more deep learning news, Google also announced on Wednesday that cloud users can now build on top of its Tensor Processing Units. More on that tomorrow, but in the meantime, chew on what that might mean for Nvidia.
P.P.S. I realize the publication time of this newsletter is slipping later, mostly as a function of the other duties I find myself doing more now (e.g., sales, conference planning, reconnecting with sources). I am determined to figure out a schedule that will allow me to publish by 8 a.m. PT every day. Bear with me.