I’ve known Siri, Apple’s voice assistant, for almost a dozen years, and yet I can’t remember a single meaningful conversation we’ve had. ChatGPT and I, by contrast, have known each other for six months, and we’ve already talked about everything from the meaning of life to planning a romantic dinner for two, and even collaborated on programming and film projects. I mean, we have a relationship.
Siri’s limitations mean it still can’t carry on a conversation or engage in a long, project-focused back-and-forth. For better or worse, the Siri we use today on our iPhones, iPads, MacBooks, Apple Watches, and Apple TVs isn’t much different from the one we first encountered in 2011 on the iPhone 4s.
Six years ago I wrote about Siri’s first brain transplant, when Apple started using machine learning to train Siri and improve its ability to respond to conversational queries. The introduction of machine learning and, soon after, an embedded Neural Engine in Apple’s A11 Bionic chip on the iPhone 8 marked what I thought was a turning point for, arguably, the first consumer digital assistant.
That programming and silicon helped Siri understand a question and its context, allowing it to move beyond rote replies and respond intelligently to more naturally phrased questions.
The start of Siri wasn’t Her
Not being able to fully converse with Siri didn’t seem like a big deal, even though we’d all seen the movie Her and understood what we could one day expect from our chatbots.
It wasn’t until GPT-3 and OpenAI’s ChatGPT dragged that distant future into the present, however, that Siri’s shortcomings were thrown into stark relief.
Despite Apple’s best efforts, Siri sat idle in learning mode. That may be because Siri is still primarily based on machine learning and not generative AI. This is the difference between learning and creating.
All the generative AI chatbots and image tools we use today create something new out of text prompts and, soon, out of art and images. They aren’t response robots; they’re builder robots.
I doubt any of that is lost on Apple. The question is: what will, and can, Apple do about it? I think we won’t have to look any further than its upcoming Worldwide Developers Conference (WWDC 2023). We’re all obsessed with the possible $3,000 mixed-reality headset Apple might introduce in June, but the company’s biggest announcements are sure to revolve around AI.
“Apple must be under incredible pressure now that Google and Microsoft have rolled out their natural language solutions,” Patrick Moorhead, CEO and chief analyst at Moor Insights, told me in a Twitter DM.
A more talkative Siri
As reported by 9to5Mac, Apple may already be working – finally – on its own language-generation update for Siri (codenamed Bobcat). Note that this is not the same as generative AI. I take it to mean Siri will get a little better at the occasional banter, but I don’t expect much more than that.
Unfortunately, Apple’s own philosophy may prevent it from catching up to GPT-3, let alone GPT-4. Industry watchers aren’t exactly expecting a breakthrough moment.
“I think what they’re doing in AI won’t necessarily be a leap forward as much as a calculated, more ethical approach to AI in Siri. Apple loves, lives, and dies by its privacy commitments, and I expect no less from the way they deliver a more AI-driven Siri,” Tim Bajarin, CEO and principal analyst at Creative Strategies, wrote to me in an email.
Apple’s unwavering adherence to user privacy could cripple it when it comes to true generative AI. Unlike Google and Microsoft’s Bing, it has no huge search-engine-driven data store, nor does it train its AI on the vast ocean of internet data. Apple does its machine learning on the device: an iPhone and Siri know what they know about you based on what’s on your phone, not on what Apple could learn from you and its 1.5 billion iPhone users worldwide. Of course, developers can use Apple’s ML tools to build and embed new AI models in their apps, but they can’t simply collect your data to learn more about you and help Apple deliver a better AI Siri.
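That on-device posture is visible in Apple’s Core ML framework itself. A minimal sketch of what embedding a model in an app looks like, assuming a hypothetical Xcode-generated model class called SentimentClassifier (the MLModelConfiguration API is real; the model name and its prediction signature are placeholders for illustration):

```swift
import CoreML

// Everything below runs entirely on the device; no user data is sent
// to a server, which is exactly the constraint discussed above.

do {
    let config = MLModelConfiguration()
    // Let Core ML pick among the Neural Engine, GPU, and CPU.
    config.computeUnits = .all

    // "SentimentClassifier" stands in for a class Xcode would generate
    // from a bundled .mlmodel file; it is not a built-in Apple API.
    let model = try SentimentClassifier(configuration: config)
    let output = try model.prediction(text: "I loved this film")
    print(output.label)
} catch {
    print("Model failed to load or predict: \(error)")
}
```

The design trade-off is the one the article describes: the model ships inside the app and learns nothing beyond what is on that one phone.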
As I wrote in 2016: “It’s also interesting to consider how Apple intentionally handicaps its own AI efforts. Your shopping habits data in iTunes, for example, is not shared with any of the other systems and services from Apple.”
Apple’s local-first approach could also handicap its future generative AI efforts. As Moorhead told me, “I see most of the action on the device and in the cloud. Apple is strong on the device but weak in the cloud, and that’s where I think the company will have trouble.”
In my opinion, Apple has a choice to make. Give up some user privacy to finally turn Siri into the voice assistant we’ve always wanted, or stay the course with incremental AI updates that make Siri better but never let it compete with ChatGPT.