I’ve known Siri, Apple’s voice assistant, for almost a dozen years, and yet I still can’t recall a single meaningful conversation we’ve had. By contrast, ChatGPT and I have known each other for only six months, and yet we’ve talked about everything from the meaning of life to planning a romantic dinner for two, and we’ve even collaborated on programming and film projects. I mean, we have a relationship.
Siri’s limitations mean it still can’t carry on a conversation or engage in a long, project-oriented back-and-forth. For better or worse, the Siri we use today on our iPhones, iPads, MacBooks, Apple Watches and Apple TVs isn’t much different from the one we first encountered in 2011 on the iPhone 4s.
Six years ago, I wrote about Siri’s first brain transplant, the moment Apple started using machine learning to train Siri and improve its ability to respond to conversational questions. The introduction of machine learning and, soon after, an on-device neural engine in the form of Apple’s A11 Bionic chip in the iPhone 8 marked what I thought was a turning point for what was arguably the first consumer digital assistant.
This programming and silicon helped Siri understand a question and its context, allowing it to move beyond rote answers to intelligent responses to more naturally phrased questions.
Early Siri was no ‘Her’
Not being able to have a full conversation with Siri didn’t seem like a big deal, even though we’d already seen the movie Her and understood what we might one day expect from our chatbots.
It wasn’t until that once-distant future snapped into the present with OpenAI’s GPT-3 and ChatGPT that Siri’s deficits were thrown into sharp relief.
Despite Apple’s best efforts, Siri has stalled in learning mode. Perhaps this is because Siri is still primarily built on Machine Learning and not Generative AI. It’s the difference between learning and creating.
All the generative AI chatbots and image tools we use today create something entirely new from prompts and, soon, from art and images. They aren’t answerbots, they’re builders.
I doubt any of this is lost on Apple. The question is what Apple will do, and can do, about it. I think we’ll need to look no further than its upcoming Worldwide Developers Conference (WWDC 2023). We’re all fixated on the potential $3,000 mixed reality headset Apple could show off in June, but the company’s most important announcements are sure to revolve around AI.
“Apple must be under incredible pressure now that Google and Microsoft have launched their natural language solutions,” Moor Insights CEO and Chief Analyst Patrick Moorhead told me via Twitter DM.
A chattier Siri
As reported by 9to5Mac, Apple may already be – finally – working on its own language generation update for Siri (Bobcat). Note that this is not the same as generative AI. I think it means Siri will get a little better at casual banter. I wouldn’t expect much more than that, either.
Unfortunately, Apple’s own ethos may keep it from ever catching up to GPT-3, let alone GPT-4. Industry watchers aren’t exactly expecting a breakthrough moment.
“I think what they’re doing in AI won’t necessarily be a leap as much as a calculated and more ethically driven approach to AI in Siri. Apple loves, lives and dies by its privacy commitments, and I expect no less in how they deliver a more AI-driven Siri,” Creative Strategies CEO and Principal Analyst Tim Bajarin wrote to me by email.
Privacy before anything else
Apple’s steadfast adherence to user privacy may leave it hamstrung when it comes to true generative AI. Unlike Google and Microsoft Bing, it doesn’t have a massive search engine data store to draw on. Nor does it train its AI on the vast ocean of internet data. Apple does its machine learning on-device. The iPhone and Siri know what they know about you based on what’s on your phone, not on what Apple can learn from you and its 1.5 billion global iPhone users. Sure, developers can use Apple’s ML tools to build and integrate new AI models into their apps, but they can’t simply harvest your data to help Apple deliver a better Siri AI.
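To make that on-device point concrete, here is a minimal sketch of what the developer side looks like with Core ML, Apple’s on-device inference framework. The SentimentClassifier class and its text input are hypothetical stand-ins for whatever Xcode generates from a bundled .mlmodel file; the takeaway is that inference runs entirely on the phone’s CPU, GPU, or Neural Engine, with nothing sent back to Apple.

```swift
import CoreML

// Minimal sketch: run a bundled Core ML model entirely on-device.
// "SentimentClassifier" and its "text" input are hypothetical names;
// the real class is generated by Xcode from the .mlmodel you ship.
func classify(_ text: String) {
    do {
        let config = MLModelConfiguration()
        config.computeUnits = .all  // let Core ML choose CPU, GPU, or the Neural Engine

        let model = try SentimentClassifier(configuration: config)
        let result = try model.prediction(text: text)
        print("Predicted label: \(result.label)")
    } catch {
        print("Model failed to load or predict: \(error)")
    }
}
```

The model ships inside the app and the user’s text never leaves the device, which is exactly the trade-off described above: strong privacy, but no pooled data from which to learn.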
As I wrote in 2016: “It’s also interesting to think about how Apple is deliberately hindering its own AI efforts. Your shopping-habits data in iTunes, for example, isn’t shared with any other Apple systems or services.”
Apple’s local approach could hinder any generative AI effort it attempts. As Moorhead told me, “I see most of the action on device and in the cloud. Apple is strong on the device but weak in the cloud, and this is where I think the company will struggle.”
The way I see it, Apple has a choice to make. Give up some user privacy to finally turn Siri into the voice assistant we’ve always wanted, or stay the course with incremental AI updates that improve Siri but never let it rival ChatGPT.