After the AI gala that was Samsung's Galaxy S24 series phones, Apple could be the next to tap into the magic of deep learning and large language models (LLMs), the technology that powers tools such as ChatGPT and Google Bard. According to a report by the Financial Times, Apple has been on a streak of acquisitions, team reorganizations, and fresh hiring to build AI capabilities for the iPhone.
At the center of these AI efforts could be Siri, the on-device virtual assistant that has lost considerable competitive ground to the notably smarter Google Assistant. And it looks like Apple will follow in Google's footsteps in supercharging its digital assistant.
Google has already baked the generative AI smarts of Bard into Google Assistant, and the revamped experience will soon roll out to both Android and iOS devices. So, what exactly is Bard going to change about Google Assistant?
Google Assistant predicting Siri’s path?
Thanks to the multimodal capabilities of the underlying PaLM 2 large language model, Assistant with Bard will soon accept text, voice, and image-based inputs. Think of it as an evolution of the multisearch facility powered by Google Lens, which is also part of the Circle to Search feature coming to the Pixel 8 and Galaxy S24 series phones.
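To make the idea of multimodal input concrete, here is a minimal sketch of how an assistant might bundle several input types into a single model request. Everything here is invented for illustration; the class and function names do not correspond to any real Google or Apple API.

```python
# Hypothetical sketch of a multimodal assistant request. All names and
# paths are assumed for illustration; this is not Google's actual
# Assistant-with-Bard API.
from dataclasses import dataclass


@dataclass
class MultimodalQuery:
    text: str | None = None        # typed or transcribed prompt
    image_path: str | None = None  # e.g. a screenshot or camera photo
    audio_path: str | None = None  # e.g. a recorded voice note


def handle_query(query: MultimodalQuery) -> str:
    """Collect whichever modalities were supplied into one request."""
    parts = []
    if query.text:
        parts.append(("text", query.text))
    if query.image_path:
        parts.append(("image", query.image_path))
    if query.audio_path:
        parts.append(("audio", query.audio_path))
    # A real implementation would encode each part and pass the combined
    # payload to a multimodal LLM; here we only show the request shape.
    return f"model input assembled from {len(parts)} modalities"


# e.g. asking about a photo: text prompt plus the image itself
print(handle_query(MultimodalQuery(text="What landmark is this?",
                                   image_path="photo.jpg")))
```

The point is that one query can mix a typed question with a photo or a voice note, and the assistant treats them as a single request rather than separate interactions.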
In addition, Assistant with Bard will be integrated with popular Google services such as Gmail and Docs. The upgraded assistant will be aware of on-screen content and will execute contextually aware tasks, such as writing a fitting social media post based on the photo currently on the screen.
The Financial Times report also mentions that Siri will soon be powered by an LLM developed in-house by Apple, rather than a licensed product such as Meta's Llama, OpenAI's GPT, or Google's PaLM. Notably, and quietly, Apple already released a multimodal large language model called Ferret late last year in partnership with researchers at Columbia University.
On-device is the flavor of this AI season
Another focus for Apple seems to be running LLM-based tasks entirely on-device, similar to how the Pixel 8 Pro and Galaxy S24 run Google's Gemini Nano model. The benefit is that AI operations no longer need an internet connection to reach the cloud, which dramatically speeds them up and adds privacy, since user data never leaves the device.
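For a sense of what on-device inference looks like in practice, here is a minimal sketch using the open-source llama-cpp-python library as a stand-in. This is not Apple's or Google's actual stack, and the model file path is assumed; it simply shows that once a model is stored locally, prompts run without any network round trip.

```python
# A minimal sketch of the general on-device approach, using the
# open-source llama-cpp-python library as a stand-in. This is NOT
# Apple's or Google's actual stack; the model path is assumed.
from llama_cpp import Llama

# Load a quantized model from local storage. No network access is
# needed from this point on, which is the latency and privacy win
# described above.
llm = Llama(model_path="./models/assistant-7b-q4.gguf", n_ctx=2048)

# Run a prompt entirely on the local CPU/GPU.
response = llm(
    "Summarize this note: Pick up groceries after the 3pm meeting.",
    max_tokens=64,
)
print(response["choices"][0]["text"])
```

Because the weights live on the phone's storage and inference runs on local silicon, the same request works in airplane mode, and nothing about the user's note is sent to a server.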
A Bloomberg report last year talked about Apple working on something called "Apple GPT," based on the company's own language model, but it was limited to internal testing. Apple's AI efforts could finally bear fruit in 2024. Another report from the same outlet notes that an AI-powered version of Siri could arrive this year, likely with iOS 18.
Aside from making Siri smarter and more responsive, Apple also aims to bring its generative AI chops to more apps, such as Messages. Samsung and Google have already given us a glimpse of how this can work, with snazzy features such as Magic Compose, style suggestions, real-time offline language translation for chats, and more.
So far, Apple hasn’t revealed when and how exactly it aims to implement AI across its products — especially the iPhone. But if the competition is any indication, it won’t be surprising to see Apple giving us a glimpse at its next WWDC developer conference later this year.
Interestingly, Apple has talked in glowing terms about the AI chops of its latest silicon, including the A17 Pro powering the iPhone 15 Pro duo. Apple just might lay the foundations of on-device AI and a smarter Siri — finally — starting with its current-gen flagship phones.