ArchiTECHt Daily: Why Cisco paid $3.7B for AppDynamics

By ARCHITECHT • Issue #3
In case you missed it yesterday afternoon, Cisco acquired application performance management vendor AppDynamics for $3.7 billion, just two days before the company’s slated IPO. Why did Cisco pay so much for the company? The price is more than double the amount AppDynamics expected to raise in its IPO, more than double the market cap of its closest competitor, New Relic, and roughly 24x AppDynamics’ 2016 revenue.
In one word: microservices. In three words: microservices within enterprises. 
Cisco’s press release talks a lot about “digital transformation,” which is just a fancy way of saying that companies need to build better applications. Consumers and employees alike demand them. Microservices architectures are the key to delivering them, primarily because microservices make it possible to manage, scale and iterate on the various components independently.
However, while this is great for speed and flexibility, having to manage all those services can be a recipe for complexity. Hence yesterday’s acquisition. Cisco wants a piece of the pie as enterprises start deploying microservices at a large scale, and AppDynamics sells software for monitoring and managing them. It also has a pretty large existing enterprise customer base.
If you’re wondering why Cisco didn’t make a play for New Relic, which is only available as SaaS, a line from this Cisco blog post should provide an answer: “The acquisition of AppDynamics also supports Cisco’s strategic transition toward software-centric solutions that deliver predictable recurring revenue.” Cisco is not much of a cloud service provider and, at least for the time being, many enterprises still buy software. 
$3.7 billion is a lot of money, but at least it appears to be an investment in the future.

Around the web: Cloud and infrastructure
As much as LinkedIn the product can be a pain, LinkedIn’s engineering department has been doing great stuff the past several years, including creating Apache Kafka. Some of that DNA could be good for Microsoft.
Both companies are talking about integrating Cloud Cruiser with HPE’s private cloud-y Flexible Capacity solution. But, really, Cloud Cruiser’s chops at monitoring public cloud usage are the longer-term benefit.
Ignore that this link is to a sponsored post from Alibaba. Focus on the fact that it’s now trying to sell Alibaba Cloud to an international market, including the United States. 
This one comes from a company called Platform9. What’s certain is that Kubernetes is here to stay; what’s less certain is who’ll actually make money from it. With Google, Microsoft, Red Hat and Heptio all in the picture, life won’t be easy for the little guys.
You have to wonder if even Facebook could have predicted how big an effect OCP would have on the hardware industry. It’s good news for smaller network vendors willing to accept lower margins, but bad news for the Ciscos and Junipers of the world. Also, none of Facebook, Google or Amazon is buying any of this stuff any time soon.
Yesterday, I linked to a story about a Microsoft exec claiming hardware tariffs might affect its data center plans. The official Microsoft stance is that Brexit changes nothing.
Around the web: All things data
Called BigDAWG, it will initially support Postgres, Accumulo and SciDB. This could be a big deal in a world where applications must connect to multiple databases for their wide range of workloads.
INGESTBASE is built by researchers at Microsoft and MIT, and seems kind of like a relic of the past decade. Yes, there is a lot of HDFS out there, but the world is also moving on from it in a lot of cases.
Google is doing the lord’s work here in trying to lay the groundwork for making it easier to find, verify and connect open datasets. I don’t imagine the Trump administration’s attitude toward research will make this any easier.
Around the web: Artificial intelligence
The quantum computing company is now selling a 2,000-qubit system, double the qubit count of its previous incarnation. But is it possible that deep learning will sufficiently push the state of the art in machine learning and put commercial quantum computing on the back burner for a while?
Filtering out the noise will be critical for real-world and real-time applications of computer vision, where data streaming off of sensors is rarely pristine.
I have no idea what this means, but it’s from the Google Brain team, and Jeff Dean and Geoff Hinton are listed among the authors. If you know anything about AI, you know that counts for something.
The technology created by Recursion Pharmaceuticals sounds promising and potentially game-changing for medicine. On a related note, computer vision will have a much greater effect on that industry than will text analysis of medical literature.
This is the second paper I’ve read recently assessing how bad guys might be able to surreptitiously mess with features in machine learning models to influence the results. It’s a strong case against black box AI.
The most interesting news, analysis, blog posts and research in cloud computing, artificial intelligence and software engineering. Delivered daily to your inbox. Curated by Derrick Harris. Check out the Architecht site at