Every year, Apple announces a slew of new features for its iPhone software, but as has increasingly become the norm, many of those capabilities don’t arrive with the initial release in the fall. Instead, they trickle out in smaller updates throughout the year.
iOS 18 is no different, especially with regard to the big new capabilities Apple announced under the banner of Apple Intelligence. The first set of those features, including Writing Tools and notification summaries, arrived last fall; those were followed by Image Playground, Genmoji, and ChatGPT integration just before the end of last year.
But that’s not the end of this cavalcade of features: one last set is scheduled to arrive in iOS 18.4, which was just launched as a developer beta, with a full release coming this spring.
Of the Apple Intelligence features promised at WWDC 2024, three major ones remain unreleased: Personal Context for Siri, in-app actions for Siri, and Priority Notifications. What will these do, and how important are they to Apple’s overall AI strategy?
Context is king
Among the yet-to-be-released Apple Intelligence features, Personal Context for Siri is probably the most significant, as well as perhaps the most eagerly anticipated. Apple has [already advertised] some of the capabilities it will unlock, but ultimately the feature is supposed to let the company’s voice assistant understand more about your data.
To do so, Apple Intelligence constructs a “semantic index” of your data, including things like photos, files, calendar appointments, notes, links people have sent you, and more. When you make a request, it can identify items in that index that are relevant to what you’re asking for.
So, for example, you’d be able to ask it to find the message where your mom sent you her flight details, whether it arrived in a text or an email. Or you could ask whether you’ll have time to drive between two appointments, and it can check your calendar and the relevant locations, then calculate the travel time between them. It can even pull information out of your photos: if you’ve snapped a picture of your ID, say, it can help you enter those details into a form on a website.
In addition to all of that context, Siri will also be able to understand what’s on your screen. If a friend sends you details about a party, for example, you could tell Siri to add the event to your calendar.
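To give a sense of the underlying idea, here’s a purely illustrative sketch, not Apple’s implementation: every type, function, and value below is hypothetical. A semantic index boils down to storing each item alongside an embedding vector (produced by some on-device model) and matching a request by similarity.

```swift
// Illustrative only: a toy "semantic index" where each item carries an embedding
// vector and requests are matched by cosine similarity. Apple has not described
// how its actual index works.
struct IndexedItem {
    let summary: String       // e.g. "Email from Mom with her flight details"
    let embedding: [Float]    // vector representation of the item's content
}

func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).reduce(Float(0)) { $0 + $1.0 * $1.1 }
    let magnitudeA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let magnitudeB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    guard magnitudeA > 0, magnitudeB > 0 else { return 0 }
    return dot / (magnitudeA * magnitudeB)
}

// Return the items most similar to an embedded request such as
// "find the message where Mom sent her flight details".
func search(query: [Float], in index: [IndexedItem], topK: Int = 3) -> [IndexedItem] {
    Array(
        index
            .sorted { cosineSimilarity($0.embedding, query) > cosineSimilarity($1.embedding, query) }
            .prefix(topK)
    )
}
```

The point of an approach like this is that the matching happens on meaning rather than exact keywords, which is what would let “Mom’s flight info” surface an email that never uses those words.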
Lights, camera, in-app actions
On top of Siri understanding your personal data better, iOS 18.4 stands to unlock another powerful piece of functionality: the ability for the personal assistant to take actions for you in and across apps.
This is powered by an existing framework called App Intents, which also integrates with platform features like Spotlight and Shortcuts; it allows third-party apps to tell the system about actions they can perform. For example, a camera app could advertise its ability to take a picture. Or a messaging app could offer the power to send a message. Or a mapping app could provide a way to kick off transit directions. Those actions can then be linked together: pull information from a note in the Notes app and have it sent via Messages, for example.
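As a rough illustration of what that looks like on the developer side, here is a minimal sketch using Apple’s App Intents framework; the intent name and parameters are made up for the example.

```swift
import AppIntents

// Hypothetical example: a messaging app declaring a "send message" action that
// the system (Shortcuts, Spotlight, and eventually Siri) can discover and invoke.
struct SendMessageIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Message"

    @Parameter(title: "Recipient")
    var recipient: String

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult {
        // In a real app, this would hand the message off to the app's messaging
        // layer; the point is that the action itself is described to the system.
        return .result()
    }
}
```

Because actions are described declaratively like this, the system can chain them together, which is what allows a single request to touch both Notes and Messages.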
Siri has been able to do this to some degree in the past, though it largely required manually creating a shortcut in advance and then triggering that shortcut with the voice assistant.
What this new ability promises, however, is for Siri to understand all the in-app actions available to it right off the bat, freeing users from having to build shortcuts in advance for whatever they want to do. Instead, you should be able to tell Siri “send the note with my partner’s flight times to them via Messages” and Siri just, well, does it.
Get your priorities straight
Apple introduced priority messages in Mail with iOS 18.1: the feature is supposed to float messages that actively require your attention to the top of your inbox, using machine learning to identify what’s actually important. (Admittedly, results have been mixed.) With iOS 18.4, Apple is likely to deliver a similar, already-promised feature that applies to all of your notifications.
Priority Notifications aims to use the same sort of machine learning to identify push notifications that may require your attention (a note from a colleague about a project deadline, say, or a missed call from your kid’s school, or a delivery that’s about to arrive) and move them to the top of your lock screen’s stack of notifications, separating them from the deluge of alerts we receive all day, every day.
Apple has tried its hand at features like this before, including “time-sensitive notifications,” which could be set to break through Do Not Disturb back in iOS 15. But the decision about whether a notification was, in fact, time-sensitive was left up to app developers rather than to users.
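That developer-side mechanism is a simple flag on the notification itself. Here’s a minimal sketch using the UserNotifications framework; the notification content is invented for illustration.

```swift
import Foundation
import UserNotifications

// The app, not the user, decides that this alert is time-sensitive, letting it
// break through Focus/Do Not Disturb if the user has allowed that for the app.
let content = UNMutableNotificationContent()
content.title = "School pickup"
content.body = "Pickup moved to 2:30 today."
content.interruptionLevel = .timeSensitive  // developer's judgment call (iOS 15+)

let request = UNNotificationRequest(
    identifier: UUID().uuidString,
    content: content,
    trigger: UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: false)
)
UNUserNotificationCenter.current().add(request)
```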
Priority Notifications takes this idea a step further by analyzing the actual content and context of those notifications to figure out what’s really important. A messaging app might decide all of its notifications are time-sensitive, but it may be more important for you to see a message from your partner about school pick-up than one from a colleague asking what’s for lunch.
Intelligence report
This set of Apple Intelligence features is the last to arrive before Apple takes the wraps off iOS 19, and with good reason: these are both the most ambitious and the most difficult to implement. Personal Context will require synthesizing a tremendous amount of your data, and sifting through all of that information will no doubt push Apple’s machine learning models to their limits. As for in-app actions, while they’re a powerful way of marshaling the capabilities of Apple’s own built-in apps, they will require buy-in from third-party developers to go from tech demo to a feature that everybody will want to use.
But the combination of those features aims to deliver on a promise first made at Siri’s introduction more than a decade ago: a true virtual assistant that can take care of complex and onerous tasks for users. To date, people have often had to adapt themselves to Siri, figuring out how to phrase queries to get the right response, or even what’s worth asking it to do at all, rather than Siri adapting itself to them.
If these improvements work as Apple intends (and that’s a big if, given the track record of previous Apple Intelligence features), they could usher in a new era for the virtual assistant and justify Apple’s AI investment by delivering features that truly benefit people’s lives.