Can telcos succeed on the edge where they failed in the cloud?

By ARCHITECHT • Issue #117
One of the biggest myths in the early days of cloud computing was that telcos like Verizon, AT&T and even CenturyLink were destined to win the “enterprise cloud” market. The rationale was, more or less, that they understood how to deal with large enterprises, owned the networks and had existing data center footprints they could leverage. 
Telcos, of course, did not win the enterprise cloud market. To the extent such a thing even exists, AWS, Microsoft and Google pretty much own that, too. 
So, what to make of AT&T’s plan to build out an edge computing infrastructure (based around 5G and software-defined networks) to own the Internet of Things? I would argue that, this time around, all of that distributed infrastructure actually could work in telcos’ favor. By converting central offices and cellular infrastructure into edge computing nodes, not to mention its roles in developing wireless protocols and owning networks, AT&T does appear to have an advantage when it comes to delivering low-latency data and computing services.
The link above is to a news item about AT&T’s edge ambitions, and here’s an ambitious excerpt from an AT&T blog post on what it has planned:
So we’re shrinking the distance. Instead of sending commands hundreds of miles to a handful of data centers scattered around the country, we’ll send them to the tens of thousands of central offices, macro towers, and small cells usually never farther than a few miles from our customers.
If the data centers are the “core” of the cloud, these towers, central offices, and small cells are at the “edge” of the cloud. Intelligence is no longer confined to the core. The cloud comes to you.
We’ll outfit those facilities with high-end graphics processing chips and other general purpose computers. We’ll coordinate and manage those systems with our virtualized and software-defined network.
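To put that “shrinking the distance” pitch in rough numbers, here’s a back-of-the-envelope sketch. The distances and the ~200,000 km/s figure for light in fiber are my own illustrative assumptions, not AT&T’s numbers, and it ignores queuing and processing time entirely:

```python
# Rough propagation-delay comparison: nearby edge node vs. distant data center.
# All figures are illustrative assumptions, not AT&T's.
FIBER_KM_PER_MS = 200.0  # light in fiber travels roughly 200,000 km/s, i.e. 200 km per ms

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds, ignoring queuing and processing."""
    return 2 * distance_km / FIBER_KM_PER_MS

for label, km in [("edge node (~5 km away)", 5), ("regional data center (~800 km away)", 800)]:
    print(f"{label}: ~{round_trip_ms(km):.2f} ms round trip")
```

Even in this crude model the edge node wins by a couple of orders of magnitude on propagation delay alone, which is the whole argument for putting compute in central offices and at cell sites.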
And, hey, AT&T has even gotten some open source religion between then and now. It’s clearly not Facebook yet, but open sourcing its SDN management software does suggest AT&T is moving in the right direction.
However, cloud providers still present a big challenge for AT&T and all companies looking to cash in on our connected devices’ need for speed:
The latter bullet point applies to companies like Apple, as well. There are also companies like Facebook that prioritize performance and data efficiency, and so are constantly optimizing their applications to consume as little data, and thus bandwidth, as possible. Essentially, where they can’t own the distance between data centers and users, the companies currently driving consumer and enterprise compute consumption are working to make that distance matter a lot less.
Obviously, the network still matters a lot, and there’s still a ton of opportunity for companies like AT&T to capitalize on their ownership of it—as well as their edge computing footprint all along it. But if they’ve learned anything from the past decade of launching and shuttering IaaS offerings, telcos won’t underestimate AWS et al on the edge like they did in the cloud. 

Highlights from the ARCHITECHT AI Show
Highlights from a recent ARCHITECHT AI Show, in which Skymind CTO Adam Gibson explains how his company is succeeding in enterprise AI, thanks in part to a decision to start selling heavily in China and Japan.
Artificial intelligence
I referenced this up top, but here’s the actual blog post on Microsoft’s new AI co-processor. The user advantages of running deep learning models on-device are obvious, but Microsoft’s decision is also part of a broader industry trend that’s working against Intel. I always point out Intel’s opportunity to capture on-device AI processing, but that window narrows every time a big company like Microsoft starts shipping devices with its own chips.
This is obviously related to the Microsoft news above, although focused mostly on data center computing (where Microsoft uses FPGAs). Survey-takers were especially big on GPUs.
CB Insights never tires of providing data on AI startup funding and acquisitions, and here’s the latest installment. Google and Apple are particularly busy.
Not to beat a dead horse, but over-promising, over-branding and a lack of transparency are fueling a lot of speculation about the fate of IBM’s Watson business. That being said, the podcast summary above (and the entire podcast) speaks to the early nature of enterprise AI adoption and, indirectly, to the wisdom of IBM’s business strategy for the time being.
This is a good article explaining the various hardware architectures certain companies are looking at for AI workloads, as well as why. For example, there’s value for Google in owning the stack from bottom to top.
Some recent advances in speech and video recognition have people concerned that hackers will be able to create passable facsimiles of people without their permission. I love this quote from an AI researcher: “I believe that researchers should take ethical concerns into account and public discussion and fear should be informed by technical research.”
It’s honestly getting difficult to keep up with what’s coming out of DeepMind, much less the whole AI research community. This latest paper shows how a system can learn separate outcomes for actions rather than just averaging outcomes into a single prediction.
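To make “separate outcomes” concrete, here’s a toy sketch of my own (not DeepMind’s code): an action whose reward is usually 0 but occasionally 10 has an average value of about 1, a number that never actually occurs. Keeping the distribution of returns, rather than just the mean, preserves that structure:

```python
import random
from collections import Counter

# Toy example: an action that usually pays 0 but pays 10 about 10% of the time.
def sample_reward() -> int:
    return 10 if random.random() < 0.1 else 0

samples = [sample_reward() for _ in range(10_000)]

mean_estimate = sum(samples) / len(samples)  # the classic single-number value estimate
return_distribution = Counter(samples)       # the "distributional" view of the same data

print(f"Average return: {mean_estimate:.2f}")  # roughly 1.0, an outcome that never happens
for outcome, count in sorted(return_distribution.items()):
    print(f"  return {outcome}: {count / len(samples):.1%} of samples")
```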
The thing with Disney Research is that you have to assume its work is meant to be applied. I can see how this would be valuable to companies like Disney, but at what point are consumers reduced to little more than data sources?
The folks at Microsoft Research want to address the AI skills gap by making it easier for people to teach models, even if they can’t build them. It’s a smart idea, which reminds me of what Bonsai is doing with its enterprise AI software. On a related note, here is a collection of videos from Microsoft Research’s recent Faculty Summit.
This is fair advice, but I wouldn’t hold my breath on seeing it practiced too often, mostly because deadlines, clicks and the desire to engage with mainstream audiences all conspire against technical accuracy, and sometimes even against reality.
Sponsor: DigitalOcean
Cloud and infrastructure
Like me, you probably hadn’t heard of Applied Optoelectronics before this. However, the company sells components for data center networking gear, and its stock is on a tear, with Amazon and Microsoft accounting for nearly 75 percent of its revenue.
More on the speculation that AWS is building a service around Kubernetes, not just to have the best container service, but to prevent competition at the platform level. Here are two more good recent pieces on Kubernetes:
This isn’t about cloud computing, specifically, but it reminded me of the rise of Alibaba and Baidu as cloud providers. There’s no real reason U.S. cloud providers must conquer China, and lots of reasons why they might not.
Another RedMonk analyst weighs in on this topic (here’s the first). Open source, open APIs and the general culture around development have a lot to do with it.
Pretty much the same story as all those exposed Hadoop clusters and MongoDB deployments a while back. So if you’re reading this and running memcached, make sure you’re patched and that it isn’t exposed to the public internet.
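If you’re not sure what version you’re running, memcached’s plain-text protocol will tell you. A minimal sketch (the host is an assumption on my part; 11211 is memcached’s default port):

```python
import socket

HOST, PORT = "127.0.0.1", 11211  # point HOST at your own memcached instance

# "version" is a standard command in memcached's plain-text protocol.
with socket.create_connection((HOST, PORT), timeout=3) as sock:
    sock.sendall(b"version\r\n")
    print(sock.recv(1024).decode().strip())  # e.g. "VERSION 1.4.39"
```

And if that port answers from outside your network, that’s the bigger problem, exactly like those Hadoop and MongoDB deployments.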
Sponsor: Bonsai
All things data
This is a good example of a company figuring out how to capture a new data source and thinking about how to monetize it. But I suspect privacy concerns (and possibly a lack of utility) will prevent it from taking off.
Yael Garten has been part of the data science scene at LinkedIn since the term first caught on. Here’s some good advice from her on how to keep your data scientists happy via everything from giving them the right tools to viewing them “as partners, and not as service providers.”
But here’s a good set of recommendations for going about figuring it out for yourself. If all else fails, of course, err on the side of more data.
This is the real headline of a real article. Did anyone think this was going to happen? If so, it seems like the education campaign about what data scientists do needs to pick up again. 
New ARCHITECHT Show every Thursday; new AI & Robot Show every Friday!
The most interesting news, analysis, blog posts and research in cloud computing, artificial intelligence and software engineering. Delivered daily to your inbox. Curated by Derrick Harris. Check out the Architecht site at https://architecht.io