For example, Apple’s AirPods already contain a number of sensors that detect when an earbud is removed from the ear. Future generations of the devices could capture much more of their surroundings: as Taiwan-based analyst Ming-Chi Kuo of the securities firm TF International Securities writes in a new investor report, Apple is considering integrating IR cameras into the earbuds.
Better for spatial computing
The corresponding modules could appear in devices as early as 2026. The idea is a better spatial experience when using the Vision Pro. The built-in IR camera is said to resemble the one Apple currently uses in the iPhone for Face ID facial recognition (including an IR projector). The modules are to be produced by Apple contract manufacturer Foxconn, reportedly with a planned capacity of 18 to 20 million units per year, enough for around 10 million AirPods, since each pair requires two cameras.
“The new AirPods are intended to be used with the Vision Pro and future Apple headsets to improve the spatial audio experience and strengthen the spatial computing ecosystem,” Kuo writes. If a user watches a video with the Vision Pro while wearing the new AirPods, for example, the sound source can be placed accurately in the room. However, Apple already does something similar with the Vision Pro’s built-in speakers, the so-called audio pods.
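Spatial audio of this kind depends on continuously tracking the listener’s head orientation. Developers can already read that data from supported AirPods today via CoreMotion’s CMHeadphoneMotionManager; the following minimal Swift sketch (hypothetical sample code, not from Kuo’s report) shows how an app receives head-attitude updates that a renderer could use to reposition a sound source:

import CoreMotion

// Reads head-tracking data from connected AirPods (Pro, Max, 3rd gen).
// Requires an NSMotionUsageDescription entry in the app's Info.plist.
final class HeadTracker {
    private let manager = CMHeadphoneMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("No head-tracking-capable headphones connected")
            return
        }
        // Delivers CMDeviceMotion updates whenever the head moves.
        manager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion else {
                if let error = error { print("Motion error: \(error)") }
                return
            }
            // Attitude angles in radians; a spatial-audio renderer would
            // feed these into its listener orientation on each update.
            let a = motion.attitude
            print(String(format: "yaw %.2f pitch %.2f roll %.2f",
                         a.yaw, a.pitch, a.roll))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}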
Gestures in front of the earbuds
Apple already supports the AirPods Pro on the Vision Pro. For this, the company recommends setting up Personalized Spatial Audio, which promises a special experience with native apps and videos. Apparently, however, that is not good enough for Apple.
According to Kuo, the IR cameras planned for the AirPods Pro will allow the earbuds to detect changes in their environment for the first time. This would also make gesture control with the earbuds possible; Apple has already filed a corresponding patent application. The company recently implemented head gestures for accepting (nodding) or declining (shaking your head) a call; they will arrive in the fall with iOS 18 and Apple’s other new operating systems.
(bsc)