ARCHITECHT Daily: Dell is saying all the right things. Can it deliver?

By ARCHITECHT • Issue #72
I never thought I’d care too much about a Dell EMC conference again, but this year’s is a little different. The company opened up about its $100 million annual venture arm (including investments in machine-learning-chip startup Graphcore and DNA-sequencing-chip startup Edico Genome), and Michael Dell is on stage talking about how “large numbers” of customers are backpedaling from the public cloud, as well as about the company’s startup-like mentality around innovation.

Throw in a recent deal to power Salesforce data centers with Dell gear, and today’s news about a dedicated health care cloud by Virtustream, Dell’s enterprise cloud computing division, and you have something to be genuinely excited about. Maybe.
Dell’s biggest problem, I would argue, is still the cloud and the buying and architecture patterns it has inspired. In a couple of years’ time, Amazon Web Services will likely be a larger business than Dell servers, Dell storage, and VMware combined. The cloud businesses at Microsoft and Google are growing like mad, too.
According to IDC, Dell is holding its own or leading in market share for servers, storage and cloud infrastructure. That’s good news, but revenue is also holding relatively flat or even decreasing across those areas. And in cloud infrastructure, IDC estimates that ODMs account for more market share and revenue than any individual vendor, while the “other” category of vendors is both the largest and fastest-growing.
Machine learning and private cloud (yes, it’s true!) are both growing areas, but Dell will have to work harder to capture them in any meaningful way. Acquisitions of Dell’s venture portfolio companies (e.g., Graphcore) might provide a short-term revenue boost, but could lessen the importance of Dell servers in the long run. Dell is the majority owner of Pivotal, which is doing well for itself in the private cloud space, but I’m not certain there’s a real technological incentive to running Cloud Foundry or any of Pivotal’s software on Dell gear.
I don’t claim to have the answer for how Dell should capitalize on the trends it clearly has identified, but it seems like more active investments in some of them might be one way. Rather than relying on venture investments, partnerships and even majority ownership, Dell could actually try to stake its claim in a new area and own it. 
Amazon used to sell books. Google used to be just a search engine. Dell used to sell PCs, servers, storage and networking gear. What’s its bold bet on the future?

Listen to the latest ARCHITECHT Show episode
In the latest episode of the ARCHITECHT Show podcast, Honeycomb CEO Charity Majors tackles issues ranging from distributed systems to Second Life, and from open source to the importance of IT. Here are some highlights from a fun interview.
Sponsor: Cloudera
Artificial intelligence
From an infrastructure perspective, there’s a lot to like about Nvidia’s Metropolis vision (in that it takes advantage of lots of Nvidia tech). But I think civil liberties groups might take issue stateside.
Raquel Urtasun is stepping out of the University of Toronto and into the fire. Here’s a good take on what she brings to the table, and the stakes at play for both Urtasun and Uber.
The headline links to a story about the Data Science Bowl, which focused on lung cancer prediction and was won by a Chinese team. Here’s a story about a Chinese startup called Infervision aiming to do the same thing.
This post is a classic case of fear-mongering about AI without offering any corresponding cause for excitement. I get the risk of a superintelligent system being breached, but why do we want one in the first place? Is the risk worth the reward?
… here’s a new method that tries to prevent adversarial attacks (where changes imperceptible to the human eye can fool deep learning models) by first compressing the images.
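The paper’s own pipeline is surely more involved, but here’s a minimal sketch of the core intuition, assuming a standard Python imaging stack: round-trip a suspect input through lossy JPEG compression before classifying it, on the theory that compression washes out the tiny, carefully placed perturbations adversarial attacks depend on. The `model` at the end is a hypothetical stand-in for whatever classifier you’re defending.

```python
import io

import numpy as np
from PIL import Image

def jpeg_squeeze(image: np.ndarray, quality: int = 75) -> np.ndarray:
    """Round-trip a uint8 RGB image (H, W, 3) through JPEG compression.

    Lossy compression tends to destroy imperceptible pixel-level
    perturbations while leaving the visible content intact.
    """
    buf = io.BytesIO()
    Image.fromarray(image).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return np.asarray(Image.open(buf))

# Classify the compressed image instead of the raw input, so any
# adversarial noise has (hopefully) been squeezed out first.
# `model` is hypothetical; plug in your own classifier.
# prediction = model.predict(jpeg_squeeze(suspect_image))
```

No preprocessing defense is bulletproof (attackers adapt), but this kind of thing is cheap and doesn’t require retraining the model.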
The idea of a “right to explanation,” which will go live (to some degree) in the EU in 2018, is really interesting. However, how well it works will depend a lot on the algorithms involved and how that info is delivered.
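To make the “depends a lot on the algorithms involved” point concrete, here’s a toy sketch: with a linear model, a per-feature explanation of a single decision falls straight out of the math, while a deep network offers no comparably direct readout. The features and data below are entirely made up for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical loan-approval data: each row is (income, debt_ratio,
# years_at_job); labels are 1 = approved, 0 = denied.
features = ["income", "debt_ratio", "years_at_job"]
X = np.array([[60, 0.4, 5], [25, 0.9, 1], [80, 0.2, 10], [30, 0.8, 2]])
y = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# For a linear model, each feature's contribution to a decision is
# just coefficient * value, which maps neatly onto "explanation."
applicant = np.array([40, 0.7, 3])
for name, coef, value in zip(features, model.coef_[0], applicant):
    print(f"{name}: contributes {coef * value:+.2f} to the score")
```

A deep neural network making the same call would need an interpretability layer bolted on after the fact, which is exactly why the model class shapes how meaningful a delivered explanation can be.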
We still have a lot of work to do before we can really trust facial recognition for anything beyond social media, IMHO. This gets at some of the reason why (and is connected, I’d argue, to concerns over predictive policing, etc.).
A Microsoft researcher weighs in on some promising work to ensure we can still have cybersecurity once quantum computing reaches commercial viability.
Oh, yeah, and we’re getting closer every day to viable quantum computers. This research is on a method to keep qubits cold, so systems can compute faster and with less initialization time.
Listen to the ARCHITECHT Show podcast. New episodes every Thursday!
Cloud and infrastructure
I linked to a story about SiFive yesterday, but this one does a much better job explaining its promise. It’s easy to envision a small but large-scale customer base for this; the long-tail business could be big, but costly to scale.
This post poses some fair questions about the state of Kubernetes on OpenStack (or vice versa), which I also noted yesterday. Maybe OpenStack is headed toward the CNCF.
In case you couldn’t tell, the OpenStack Summit is happening this week. Here’s a dose of nostalgia that highlights OpenStack startups that were acquired or shut down.
Maybe I’m missing something, but don’t most algorithms eventually need to connect with existing systems? And aren’t cloud resources easily available already? I like this idea, but it might be solved …
This is good advice, which Honeycomb’s Charity Majors touches on in the latest ARCHITECHT Show, as well. Build so that you’re flexible early, not locked into webscale technologies you might never grow into.
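A hedged sketch of what that early flexibility can look like in practice: put a thin interface of your own in front of infrastructure choices (a queue, in this example), so you can start simple and swap in heavier machinery only if you actually grow into it. All names below are illustrative, not from the post.

```python
from abc import ABC, abstractmethod
from collections import deque
from typing import Optional

class EventQueue(ABC):
    """The app codes against this, not against any vendor's client."""

    @abstractmethod
    def publish(self, event: dict) -> None: ...

    @abstractmethod
    def consume(self) -> Optional[dict]: ...

class InMemoryQueue(EventQueue):
    """Plenty for day one; replace it when (if!) you outgrow it."""

    def __init__(self) -> None:
        self._events: deque = deque()

    def publish(self, event: dict) -> None:
        self._events.append(event)

    def consume(self) -> Optional[dict]:
        # FIFO; returns None when nothing is waiting.
        return self._events.popleft() if self._events else None

# Later, a KafkaQueue(EventQueue) could drop in behind the same
# interface; the rest of the codebase never knows the difference.
queue: EventQueue = InMemoryQueue()
queue.publish({"type": "signup", "user": "alice"})
print(queue.consume())  # {'type': 'signup', 'user': 'alice'}
```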
I’ve been posting a lot of MapD news lately … It might be because machine learning has been a boon for GPUs, and bigger GPU footprints open up lots of possibilities.
This is good work by Google, which has fixed bugs in some pretty widely used technologies. It’s paying more for projects to get involved, too.
Media partner: GeekWire
All things data
It’s funny what a couple of good quarters can do for confidence in the Hadoop market. The big question now is how big it can realistically grow, and how much the cloud helps/hampers that.
This brief note from the 451 Group about HSBC helps explain why analysts take cautiously optimistic views like the one above. The big data market is big, but also full of choices.
A collection of machine learning, database and GPU companies has formed an alliance around data science and related workloads on GPUs. Details are vague but, as mentioned above, it’s part of a broader push to popularize GPU usage.
This idea makes so much sense in theory, but previous attempts have proven remarkably difficult to implement. It might require a forced shift in the power dynamic, and even then, companies already have so much data.