These AI functions in iOS 18 & Co. will have to wait until 2025

An important part of the Apple Intelligence features in iOS 18, iPadOS 18 and macOS 15 will only arrive in 2025, including the biggest improvements to the voice assistant Siri and support for languages other than US English. The financial news agency Bloomberg reported this on Sunday. ChatGPT support is also expected to be missing from the first release of the new operating systems and is instead planned for later in 2024.



Among other things, Apple had announced that Siri would be able to handle natural language better in the future. This function could become available in the US as early as this year. It should also allow users to stumble over their words without having to repeat the entire request. An improved “Type to Siri” for keyboard input could be rolled out right away as well, along with a knowledge base of information and instructions for Apple products.

More context for Siri, including the use of information already on the device from appointments, emails or chats, similar to the planned “semantic index” for all on-device content, may not arrive until 2025. Control of apps and device functions also does not appear to be planned before 2025, for example editing a photo by voice and sending it directly in a message. The so-called onscreen awareness, which gives Siri access to what is shown on the screen, apparently still needs more time at Apple as well.

Apple already made clear during its keynote that it would roll out the Apple Intelligence features gradually. The lack of internationalization is likely to be a particular problem initially. Among other things, Apple wants to avoid overloading its own engineers, Bloomberg reports. “Trying to bring too many new things to market at once has hurt Apple in the past. This gives developers more time to support new features in their applications,” the report says. Starting with US English also buys time to train the models in other languages.


Since Apple uses its own language models, all of this costs time and money. The same applies to expanding Apple’s cloud infrastructure, which the company wants to run on its own servers with M-series chips. Too many users at once could mean the technology no longer keeps up, as seen in the early days of ChatGPT. Apple also fears damage to its image if generative AI functions produce hallucinated output. With a smaller user base, such problems can be identified and resolved more quickly.

Apple is also looking for additional partners for further chatbot integrations. Talks are said to be continuing with Alphabet, Anthropic and Chinese companies such as Baidu and Alibaba.

Bloomberg also writes that other announced features may not appear until late 2024. These include the new category management in Apple Mail and the Swift Assist coding assistant for Xcode. On the upside: Apple Intelligence is expected to smartly prioritize notifications from the start, based on their content as well as on content from emails, notes and websites. New generative tools for writing, drawing and creating emojis (“Genmoji”) will also be available.



