First things first
I'll assume you're getting tired of reading about Microsoft buying GitHub for $7.5 billion, and just offer a few thoughts of my own on that:
- Microsoft was probably the best place for GitHub to land, if you're looking for a win-win situation. Microsoft needs to attract more developers, and GitHub should help with that. Also, GitHub needs to scale its business (likely by developing and selling more of its enterprise product), and Microsoft can help with that. I don't think Google, AWS, IBM, Oracle or anybody else would have been as promising a home, IMHO.
- AWS and Google will almost certainly step up their efforts to offer alternatives to GitHub. Everybody loves a de facto standard like GitHub, until a blood rival owns it. Perhaps they'll even pull some of their own projects off GitHub, although that could backfire because so many developers will still use GitHub over whatever alternatives they might offer.
- If you haven't heard of GitLab or any other GitHub alternatives, get ready to start hearing a lot more about them.
And now, here are a bunch of links to insightful commentary on the acquisition that you may or may not have already seen. You can also get my thoughts in the ARCHITECHT Show podcast episode linked below:
- Microsoft + GitHub = Empowering Developers (Microsoft)
- So pigs do fly: Microsoft acquires GitHub (Redmonk)
- The cost of developers (Stratechery)
- Everyone complaining about Microsoft buying GitHub needs to offer a better solution (Ars Technica)
- Developers and cloud rivals will be watching Microsoft’s plans for Azure and GitHub very closely (GeekWire)
- Chris Aniszczyk on Twitter: "good news for @gitlab though, we should support them to keep Microsoft honest, even if they turn out to be a great steward of the acquisition"
- GitLab sees huge spike in project imports (Hacker News)
- GitLab’s high-end plans are now free for open source projects and schools (TechCrunch)
In defense of boring AI
I didn't watch any of Apple's WWDC this week, but I couldn't help but see a lot of the news coming out of it. And the one thing that struck me was how little AI hype Apple seemed to be spouting this time around. Yeah, it announced some improvements to its developer tools for building intelligent apps:
- Apple's Core ML 2 is 30% faster, cuts AI model sizes by up to 75% (VentureBeat)
- Apple's CreateML is a nice feature with an unclear purpose (TechCrunch)
but we were spared the usual smartphone-conference spiel about borderline-useful new features that are amazing because they use a machine learning algorithm.
I don't know if this is because of issues with facial recognition on the iPhone X (which Apple made a huge deal about when announcing it) or simply because Apple is great at taking the temperature of the room. Right now, after Cambridge Analytica, Alexa always listening and GDPR, privacy is top of mind for a lot of people. Maybe Apple decided that its best bet is improving the AI features it already has and making them a top-notch experience, rather than trying to build new stuff just because it can. Whatever the reason, this pleased me.
We need to slow down, especially when it comes to consumer AI. I think AI can be really useful for a lot of things, but the breakneck pace of trying to jam it into every experience is going to cause problems. Between unforeseen security or privacy issues and over- or underwhelmed consumers, I don't see how it can end well. People (and I mean the majority of the world) still need to understand what these new capabilities are and get accustomed to them before being flung into hyperdrive. And it would be best for everyone involved if those experiences were as close to flawless as possible, rather than feeling like certain key aspects were afterthoughts.
We'll get to the smart home and smart everything eventually, but it doesn't need to be tomorrow. And like so many parents tell their kids, it's better to do it right the first time than to rush in the name of doing it faster.
On a related note, this story from Axios also pleased me: AI researchers are halting work on human-like machines. The quoted researcher from Carnegie Mellon makes my point exactly: we've made some real advances over the past several years, and there is still plenty of room to improve upon them, including simply by creating lower-power chips that consume less energy and don't, as he noted, increase the temperature of a smart car by 10 degrees. Not everyone needs to rush toward artificial general intelligence and leave the current, useful technologies half-baked as a result.