THIS SECTION IS A BIT TECHNICAL/NERDY FOR THE GENERALIST BUT WORTH READING.
Does Apple understand artificial intelligence? Well… it depends. There were announcements at Apple’s developer event this week that suggest Apple will continue investing in improving UX through technologies in the AI stack.
Three things stood out.
1. Apple announced support for basic neural networks in iOS. In the short term, this means Apple can offer accelerated inferencing (prediction) on iOS devices, including access to the phone’s GPU, saving a round-trip to the cloud or allowing certain models to run when there isn’t internet access. Both of these will improve the user experience. It’s unlikely at this stage to be meaningfully useful for local training. And the memory and GPU footprint of the device will limit the kind of inferencing that can be done - don’t expect a full automated speech recognition library to fit on a phone just yet. Of course, Google is pursuing a similar route to move TensorFlow onto Android devices, and that team has at least one Apple alumnus who previously worked on image-processing applications using early GPUs.
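To see why on-device inference saves a round-trip, note that prediction is just a forward pass through weights that ship with the app. Here is a minimal sketch in plain Python - a tiny fully connected network with made-up weights, purely illustrative; Apple’s actual offering exposes accelerated GPU primitives rather than hand-rolled arithmetic like this:

```python
import math

def relu(v):
    # Rectified linear unit: zero out negative activations.
    return [max(0.0, x) for x in v]

def dense(x, weights, bias):
    # One fully connected layer: y = Wx + b
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

# Hypothetical pre-trained weights bundled with the app;
# inference below runs entirely locally, no cloud call needed.
W1 = [[0.5, -0.2], [0.1, 0.8]]
b1 = [0.0, 0.1]
W2 = [[1.0, -1.0]]
b2 = [0.2]

def predict(x):
    hidden = relu(dense(x, W1, b1))
    out = dense(hidden, W2, b2)
    # Sigmoid squashes the raw score into (0, 1).
    return 1.0 / (1.0 + math.exp(-out[0]))

score = predict([1.0, 2.0])
```

The memory point in the paragraph above follows directly: a real model is this, scaled up to millions of weights, which is why a full speech-recognition system still strains a phone.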
2. The firm continued to use privacy as a unique selling point. It introduced ‘differential privacy’, a mathematically rigorous technique for learning from user data in aggregate without compromising individual privacy. It “looks like Apple is honestly trying to do something to improve user privacy,” says cryptographer Matthew Green. GOOD READ.
This is an important move - and let’s see whether consumers can understand it well enough for it to be a product differentiator versus Google, Facebook or Amazon, or to pressure higher standards of personal privacy elsewhere on the Internet. (Tom Simonite on Apple & differential privacy is also accessible.)
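The intuition behind differential privacy can be shown with the classic randomized-response trick - a simplified sketch, not Apple’s actual implementation: each user randomizes their answer before reporting it, so any individual response is deniable, yet the aggregate statistic can still be recovered.

```python
import random

def randomized_response(truth: bool) -> bool:
    # With probability 1/2, report the truth; otherwise report
    # a coin flip. No single response reveals the user's answer.
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses):
    # If p is the true "yes" rate, E[reported yes] = 0.5*p + 0.25,
    # so invert: p ≈ 2 * (observed - 0.25).
    observed = sum(responses) / len(responses)
    return 2 * (observed - 0.25)

random.seed(42)
# Simulate 100,000 users, 30% of whom truthfully answer "yes".
true_answers = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(a) for a in true_answers]
estimate = estimate_true_rate(reports)  # close to 0.3
```

Apple can thus learn population-level patterns (which emoji are trending, which words users type) while each individual’s data stays noisy and deniable.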
3. Siri is open to third-party developers, allowing them to plug into the voice interface. This is similar to the approach taken by Viv and Amazon’s Alexa. It seems this will be how we get seamless voice interfaces that give us access to a broad, general range of underlying resources.
None of this will do much to kill the meme that “Apple doesn’t get AI”. But Apple is a firm that cares about UX almost to distraction. And artificial general intelligence and artificial super intelligence are not at the point where they improve the UX, so it’s unsurprising that Apple isn’t announcing (or even investing in) systems that beat Go.
🔮 If you haven’t watched it already, look at the Knowledge Navigator, Apple’s 1987 vision of the future. INCREDIBLY FARSIGHTED.
Apple also had a much-storied R&D group, the Advanced Technology Group, which created HyperCard and QuickTime. It was also responsible for breakthroughs in natural-language interfaces, including speech recognition & synthesis, V-Twin (a search index) and the handwriting recognition in the Newton. Jobs shuttered ATG when he returned as CEO in 1997 because he had to stabilise the firm.
More practically, Apple has been buying companies that deliver the AI stack for the past few years. These include Polar Rose (machine vision), Chomp (search), Novauris (speech recognition), Cue (virtual assistant), Topsy (stream processing), Acunu (big data analytics), FoundationDB (databases), Metaio (augmented reality), Perceptio (machine vision) & Emotient (machine vision).
Expect more AI in Apple’s products. But I would be surprised to see large-scale open source efforts, of the kind we have seen from Google or Facebook. Open source has rarely been Apple’s bag.