Apple has published details for app developers on a new Siri function that will enable the voice assistant to read iPhone screen content. The so-called Onscreen Awareness is expected as part of Apple Intelligence in an upcoming iOS 18 update (possibly by spring 2025). It complements, among other things, the new context function, which aims to make Siri significantly smarter by letting it draw on content already present on the device.
Testable from iOS 18.2 – for developers
As Apple lays out in new developer documentation, there is a defined process on the developer side: it describes how apps should be designed to prepare the content that appears on the screen for Siri and Apple Intelligence. The API features are already implemented in iOS 18.2, so they can be tried out.
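What such preparation might look like in practice is sketched below: a minimal App Intents entity that exposes a currently displayed note as transferable plain text. The `NoteEntity` and `NoteQuery` types and their properties are hypothetical examples, not taken from Apple's documentation; the `Transferable` conformance is the general App Intents mechanism for handing content to the system.

```swift
import AppIntents
import CoreTransferable
import UniformTypeIdentifiers

// Hypothetical entity representing a note that is currently on screen.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String
    var body: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Transferable conformance lets the system read the note's content
// when the user asks about what is on screen.
extension NoteEntity: Transferable {
    static var transferRepresentation: some TransferRepresentation {
        DataRepresentation(exportedContentType: .plainText) { note in
            Data(note.body.utf8)
        }
    }
}

// A minimal query so the system can resolve entities by identifier.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [NoteEntity.ID]) async throws -> [NoteEntity] {
        // In a real app this would look up notes in the data store.
        []
    }

    func suggestedEntities() async throws -> [NoteEntity] { [] }
}
```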
Apple clearly wants to make Onscreen Awareness as privacy-friendly as possible. Only when a user asks a question about something on the screen, or wants to take an action based on it, can Siri and Apple Intelligence see the content and act on it. At the moment it is unclear whether this happens only locally on the device (which is what Apple is actually aiming for) or whether Apple's Private Cloud Compute is also used. Information is forwarded to third-party services only at the express request of the user. As an example, Apple cites displaying a website and then having Siri present a summary of it. This is already possible with Apple Intelligence, but it requires a tap.
Forwarding to ChatGPT on request
As part of the ChatGPT integration, it should also be possible to send photos or documents shown on the screen to OpenAI. This applies, for example, to generating image descriptions or analyzing PDFs. Here too, the user is always asked before any data flows. Siri offers ChatGPT, among other things, when users request tasks that the voice assistant cannot currently complete itself. Apple will soon provide image descriptions of its own as part of its "Visual Intelligence" function.
In its developer documentation, Apple further states that "current and future personal intelligence features" require the explicit sharing of screen content through the App Intents framework. It is therefore conceivable that older applications will not support Onscreen Awareness, since relatively few iPhone apps currently use App Intents. That could quickly become frustrating.
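For apps that have not adopted the framework at all, the entry point is small: a single `AppIntent` type is enough to register an app with the system. The sketch below is a hypothetical, minimal example; the intent name and behavior are illustrative and not taken from Apple's documentation.

```swift
import AppIntents

// Hypothetical minimal intent: roughly the smallest unit an app
// needs to participate in the App Intents ecosystem at all.
struct ShowCurrentNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Show Current Note"
    static var description = IntentDescription("Opens the note that was last viewed.")

    // Ask the system to bring the app to the foreground when the intent runs.
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // In a real app, this would navigate to the note in the UI.
        return .result()
    }
}
```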
(bsc)