Testers of the developer betas of iOS 18.1, iPadOS 18.1 and macOS 15.1 have reported improvements to Apple's Siri. Once Apple Intelligence is activated – which in the EU unfortunately only works on the Mac – the voice assistant understands requests considerably better than before. The system can now respond in the context of previous questions and answers, has access to a database of support information for Apple products, copes when a request is rephrased on the fly or briefly interrupted, and is overall more useful. Previously, it was often necessary to know the exact wording of a command to avoid being rejected or redirected to a web search. However, not all Apple Intelligence features have been implemented for Siri yet. Some of the most exciting new capabilities will likely take months to arrive.
Personal context still missing
As before, Siri can currently control apps for some tasks – setting an alarm or making a phone call, for example. However, the new "App Intents", which open Siri to many third-party apps, will come later. This is partly because developers must first implement them. Personal context has also not been released yet; Siri is supposed to derive it from the contents of the device.
For example, you can ask when a loved one is landing at the airport, with the answer extracted from existing emails or iMessage conversations. All of this happens on the device, without Apple seeing any of the data. The ability for Siri to read on-screen content and draw conclusions from it is also not yet implemented; it is meant to let you ask specific questions about whatever is currently shown on the screen. Finally, the beta still lacks the option to use ChatGPT as an alternative language model.
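For developers, opening an app to Siri via App Intents means declaring the app's actions in code so the assistant can invoke them. A minimal sketch in Swift of what such a declaration looks like – the intent name, parameter and order lookup here are purely illustrative, not taken from any real app:

```swift
import AppIntents

// Hypothetical intent exposing an order-status lookup to Siri.
// Name, parameter and lookup logic are illustrative only.
struct CheckOrderStatusIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Order Status"

    @Parameter(title: "Order Number")
    var orderNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would query its own data store here.
        let status = "shipped"
        return .result(dialog: "Order \(orderNumber) is currently \(status).")
    }
}
```

Once an app ships such intents, Siri can discover and trigger them – which is why this part of the rollout depends on third-party developers as much as on Apple.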
Apparently before the end of the year
It looks like Apple is planning to release several of the missing Siri features before the end of the year. According to a report from the Wall Street Journal, this applies to the ChatGPT integration, for example, but also to the aforementioned personal context. In-app functions, however, are not expected until 2025, nor are language versions beyond US English.
Still, Apple is making better progress with Apple Intelligence than ever. Although the first beta only arrived at the end of July, many features, such as the Writing Tools, already work quite well – respectable for a completely new language model. Other announced features, however, such as the image generator ("Image Playground") or the generation of AI emojis, are still missing from the beta.
(bsc)