ARCHITECHT
ARCHITECHT Daily: Pinterest might be the killer app for computer vision—because data
By ARCHITECHT • Issue #77
I never use Pinterest. I barely follow TechCrunch Disrupt. And yet, I found myself nodding along while reading this Disrupt interview with Pinterest president Tim Kendall, in which he describes how Pinterest is now showing ads for products visually similar to other things a user has searched for or pinned. Pinterest isn’t trying to cure cancer or power autonomous cars, but—damn!—has it found a great application for deep learning and computer vision. 
Commercially speaking, Pinterest might be the perfect application of these technologies. Its business, essentially, is pictures of things and strong positive signals in the form of “pins.” Deep learning makes it possible to take what you like and show you more stuff like that. If Pinterest can show ads—so someone can actually buy something they like, or at least know where to buy something—it’s giving advertisers (and possibly consumers) a truly unique and relevant experience.
As I’ve written before, the Pinterest use case is both quite mundane and also a testament to how far deep learning has come in the past few years. But what makes Pinterest noteworthy is not that it’s using deep learning—seemingly everyone is today—but that it found a killer application for it.
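For the curious, the underlying pattern is easy to sketch: embed images with a pretrained network, then recommend whatever sits nearby in embedding space. Below is a minimal illustration in Python. To be clear, it's my sketch, not Pinterest's actual system; the choice of ResNet-50, cosine similarity and the tiny in-memory catalog are all assumptions for the sake of the example.

```python
# Minimal visual-similarity sketch (illustrative only; not Pinterest's
# production system). Assumes PyTorch and torchvision are installed.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# A pretrained CNN with its classifier head removed becomes a feature extractor.
backbone = models.resnet50(pretrained=True)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path):
    """Map an image file to a 2048-dimensional embedding vector."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0)

def most_similar(query_path, catalog, k=5):
    """Rank catalog items (name -> embedding) by cosine similarity to the query."""
    q = embed(query_path)
    scores = {
        name: torch.nn.functional.cosine_similarity(q, emb, dim=0).item()
        for name, emb in catalog.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

At Pinterest's scale the dictionary scan would be replaced by an approximate nearest-neighbor index, but the shape of the problem is the same: good embeddings plus a fast similarity lookup.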
Venture capitalists like to talk about competitive moats, and some (many?) now view data as a potentially very large moat. (Here’s Jake Flomenberg from Accel talking about data moats; here’s Jerry Chen from Greylock writing about them.) Using deep learning is not a moat, in part because everybody now has access to the tools for doing it, and in part because it can’t magically make bad data valuable. You build a moat by gathering data that’s unique, provides strong signals and is relevant to whatever your monetization strategy is—and then applying the right machine learning approach to it.
Pinterest has done some good work molding deep learning libraries to its use case, but that could have been a lot of wasted effort if it hadn’t nailed the data part first.
P.S. In more deep learning news, Google also announced on Wednesday that cloud users can now build on top of Google Tensor Processing Units. More on that tomorrow, but in the meantime, chew on what that might mean for Nvidia.
P.P.S. I realize the publication time of this newsletter is slipping later, mostly as a function of the other duties I find myself doing more now (e.g., sales, conference planning, reconnecting with sources, etc.). I am determined to figure out a schedule that will allow me to publish by 8 a.m. PT every day. Bear with me.

Sponsor: Cloudera
Artificial intelligence
The money isn’t all guaranteed up front: $20 million frees up only if D-Wave achieves certain goals. I think the pressure has to be on for D-Wave, though, as more quantum startups are hitting the scene; large companies like IBM and Google are stepping up their efforts; and research is booming.
Speaking of quantum computing, the next decade(s) of computing are going to be fascinating as AI and quantum computing hit their strides and cyberwar becomes an even bigger threat.
Its full name is the Partnership on AI to Benefit People and Society, and new members include eBay, Intel, Salesforce and Sony (along with several more). You have to hope the non-profits have some voice in this; companies aren’t always great at identifying what benefits society.
Nick Carr argues that while automation is certainly affecting jobs, there’s little proof so far to suggest it’s eliminating them. A counter might be that we haven’t previously had AI like what’s coming down the pike.
Well, this is pretty awesome. A robot can watch a VR demonstration just once, and then correctly repeat the same task. Could be very valuable if it scales well.
This research is somewhat related to the above experiment (and even includes an OpenAI member). It’s a proposed method for making sure that robots don’t apply too much torque (or potentially anything else) when transferred from training into the real world.
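The core safety idea is simple enough to sketch in a few lines: whatever the learned policy outputs, clamp it to known-safe bounds before it ever reaches the hardware. The snippet below is my own toy illustration of that principle, not the method the paper actually proposes; the torque limit and policy interface are made up for the example.

```python
import numpy as np

# Toy illustration of bounding a learned policy's output before it reaches
# real hardware; not the method proposed in the paper itself.
MAX_TORQUE = 2.5  # N·m, a hypothetical per-joint safety limit

def safe_action(policy, observation):
    """Run the trained policy, then clamp each joint torque to safe bounds."""
    raw = policy(observation)                     # raw torques from the network
    return np.clip(raw, -MAX_TORQUE, MAX_TORQUE)  # never exceed the limit
```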
Cloud and infrastructure
Google made a couple of cloud computing announcements before its big I/O conference this week.
If I’m looking at the cloud market based solely on what I’ve seen in 2017 so far, I might be inclined to think that Google and Microsoft are positioning themselves very well to win new workloads over Amazon Web Services. This is especially true on the database and IoT fronts, where the former two companies have launched some very compelling products. On the other hand, AWS still dominates “serverless” discussions with Lambda and seems to be getting solid traction with its Alexa-based machine learning tools. Let’s give it a couple years to play out …

Watching cloud providers fall over themselves to prove they’re the best place to run SAP is an amazing turn of events. But now that they’ve all but killed the “enterprise cloud,” it’s time to focus on making that money.
And they are rumored to be for running SAP HANA.
There are reasons to be excited by HPE’s mission to jam as much memory as possible into a single computer, but I think an exabyte by 2022 might be overkill even for HANA. 
At the very least, Nvidia is selling a lot more of them now. As with all things, the cloud will influence how long this trend continues and how big it gets. Nvidia, obviously, suspects it will be long and huge.
This topic won’t get old for quite a while. And the fact that we’re having this discussion while it’s still in its infancy can only be a good thing.
At least, that’s my takeaway from the third installment of Kubernetes co-creator Craig McLuckie’s series on multi-cloud. Doing active multi-cloud right will be really tough, but cloud providers are all very good at providing availability.
Developers can manage their Git processes from the Atom text editor, as well as from the new GitHub Desktop, which is in beta.
These researchers present a system for offloading computing to the cloud that they say will let robots do more. As with most workloads, though, cloud offloading seems better suited to batch tasks, while low-latency stuff happens locally.
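A rough sketch of that split, my illustration rather than the researchers’ system, might route tasks by latency sensitivity:

```python
# Toy sketch of splitting robot workloads between local compute and the
# cloud by latency sensitivity; not the researchers' actual system.
LATENCY_SENSITIVE = {"motor_control", "obstacle_avoidance", "balance"}

def dispatch(task, payload, run_local, submit_to_cloud):
    """Run latency-critical tasks on the robot; offload batch work to the cloud."""
    if task in LATENCY_SENSITIVE:
        return run_local(task, payload)       # e.g., a tight control loop
    return submit_to_cloud(task, payload)     # e.g., heavy map refinement
```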
Media partner: GeekWire
All things data
In the continuing quest for some order around comments sections, more data—especially well-annotated data—is our friend. Note: not all comments are bad; some are even useful.
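As a toy illustration of why those annotations matter, this is roughly what a comment classifier looks like once you have labeled examples. It’s a scikit-learn sketch with placeholder data; a real moderation system needs far more labeled comments and far more care.

```python
# Toy comment classifier trained on (hypothetical) human-annotated data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "Great point, thanks for the source link.",
    "You are an idiot and so is everyone here.",
    "Has anyone benchmarked this on larger clusters?",
    "Worst. Article. Ever. Unsubscribing.",
]
labels = [0, 1, 0, 1]  # 0 = constructive, 1 = abusive (the human annotations)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["Thanks, this was genuinely useful."]))  # likely [0]
```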
These researchers are trying to solve the problem of ranking content (e.g., search results or “you might also like”) when companies are trying to achieve multi-faceted goals beyond, say, just high clickthrough rates.
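The crux is collapsing several objectives into one ranking score. The naive baseline, sketched below purely for illustration (it’s not the researchers’ method), is a hand-tuned weighted sum, presumably the kind of blunt instrument this research aims to improve on.

```python
# Naive multi-objective ranking: score each item as a hand-tuned weighted
# sum of business objectives, then sort. Illustrative baseline only.
WEIGHTS = {"click_prob": 0.5, "purchase_prob": 0.3, "diversity": 0.2}

def score(item):
    return sum(WEIGHTS[obj] * item[obj] for obj in WEIGHTS)

items = [
    {"id": "a", "click_prob": 0.9, "purchase_prob": 0.1, "diversity": 0.2},
    {"id": "b", "click_prob": 0.6, "purchase_prob": 0.5, "diversity": 0.7},
]
ranked = sorted(items, key=score, reverse=True)
print([i["id"] for i in ranked])  # -> ['b', 'a']
```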
I’m not sure if the world needs another graph-computation engine, but that’s what GraphH is. However, its developers claim it’s much faster than existing options and can run on smaller clusters.
Listen to the ARCHITECHT Show podcast. New episodes every Thursday!