First things first
A funny thing happened this week, as news first broke that Google's head of AI and search, John Giannandrea, had stepped down as part of a reorganization (one that left engineer extraordinaire Jeff Dean in charge of AI, and Ben Gomes in charge of search). A day later, we learned that Giannandrea is now Apple's head of machine learning and AI strategy.
Most everyone writing about this is reporting it as a great hire; I agree, but for different reasons than I've seen noted elsewhere. Like most things today when it comes to applied AI, this is all about the data.
At the risk of insulting somebody who's clearly much smarter than me, I think Google will be just fine without Giannandrea. Between the Google Brain team that Dean already leads and DeepMind, the company is still one of -- if not the -- leading AI research organizations in the world. Unless Giannandrea is an absolute whiz at productizing AI research, the Google train should keep chugging down the tracks without any real slowdowns. It just has so. much. experience.
Hiring Jeff Dean or Demis Hassabis of DeepMind might have catapulted Apple into the discussion of being Google's peer in AI overall. I don't see the hiring of Giannandrea as accomplishing that.
So what makes this hire so potentially great for Apple, I would argue, is Giannandrea's experience as CTO of Metaweb, the company behind Freebase. Metaweb described Freebase as "an open shared database of the world's knowledge," and "a massive, collaboratively edited database of cross-linked data." And, indeed, Freebase became a foundation for Google's now-ubiquitous Knowledge Graph after the company acquired Metaweb in 2010 (this was 2 years before the famous cat video that introduced the world to deep learning). Part of the reason Google's consumer AI products are so good is because they have such a large body of knowledge from which to draw answers/results.
What Apple needs is not to make Siri better at understanding speech -- that technology is known well enough -- but to make Siri smarter. And Apple "wants" to do this while maintaining user privacy, which I take to mean making Siri smarter without always listening and otherwise using data its users assumed was private. This is exactly what something like Freebase can help it do. It's why Facebook last year acquired a startup called Ozlo to help power its Messenger service. (A startup I profiled 5 months earlier, FWIW.)
Sometimes -- I would argue most of the time -- you just want to know the answer to a question. Personalization has a place, but it's overrated. The knowledge is everything.
Google's efforts in AI research deserve all the praise they've received, but it was search -- and the data and data architectures that underpin it -- that made Google's AI products so damn useful. If Apple and Amazon can figure out a way to compete with Google there, the next phase of consumer AI is going to be very, very interesting.
AWS adds a cheaper S3 tier and puts machine learning on laptops and IoT devices
The other big news this week was the AWS Summit in San Francisco, at which the cloud giant announced a slew of new services. Honestly, there's too much to cover, so if you're really interested, just wander over to the AWS blog and start reading.
However, there were 3 items I found particularly interesting:
- Amazon S3 update: New storage class and general availability of S3 Select: The S3 One Zone-IA storage class is cheaper storage (20 percent cheaper than Standard-IA), designed for infrequently accessed data that's stored in only a single availability zone. There's definitely a need for different tiers of storage at different price points, but you have to wonder (A) how many is too many and (B) at what point cloud providers are just selling less for less (as opposed to less for more) because they can.
- New – machine learning inference at the edge using AWS Greengrass: This is a potential game-changer for AWS users, as well as for industry in general (as you'll see from the SWIM.ai launch linked to below). The more intelligence we can put into devices themselves, the more we can actually solve real-world problems in agriculture, civic planning and other challenging fields.
- Amazon SageMaker now supports additional instance types, local mode, open sourced containers, MXNet and TensorFlow updates: The big deal here is the containerized local mode. Being able to develop locally and deploy to the cloud is what helped make Docker so popular in the first place. Nailing that experience for AI developers could help AWS stave off the AI competition from Google.
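To put the "20 percent cheaper" S3 One Zone-IA claim in context, here's a quick back-of-the-envelope comparison. The per-GB prices below are the us-east-1 rates as of the announcement and are assumptions for illustration; check the AWS pricing page for current numbers.

```python
# Back-of-the-envelope S3 storage cost comparison.
# Prices assumed: us-east-1 rates at launch ($/GB-month); they may change.
STANDARD_IA = 0.0125  # S3 Standard-Infrequent Access
ONE_ZONE_IA = 0.0100  # S3 One Zone-IA

def monthly_cost(gb, price_per_gb):
    """Storage-only cost; excludes request and retrieval fees."""
    return gb * price_per_gb

gb = 10 * 1024  # 10 TB expressed in GB
std = monthly_cost(gb, STANDARD_IA)
ozi = monthly_cost(gb, ONE_ZONE_IA)
savings = 1 - ozi / std
print(f"Standard-IA: ${std:.2f}/mo, One Zone-IA: ${ozi:.2f}/mo "
      f"({savings:.0%} cheaper)")
```

The trade-off, of course, is that One Zone-IA keeps data in a single availability zone, so you're paying less for less durability against zone loss -- which is exactly the "less for less" question raised above.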
Microsoft is also doing a lot in AI, IoT and cloud storage
So check out its news from this week:
- Microsoft reinvents Massive Arrays of Idle Disks for Azure, 'cos IBM tape ain't enough (The Register)
- Microsoft commits to spending $5B over four years on IoT (ZDNet)
- Azure launches availability zones in GA, catching up to rivals (Data Center Knowledge)
- Microsoft launches new online training courses for aspiring AI engineers (GeekWire)
- Smooth talker: Microsoft speech technology claims superior conversational abilities (GeekWire)
- Can Microsoft get smarter? Inside the tech giant’s massive bet on AI (GeekWire)