From Monday: Apple gives companies direct access to the Vision Pro camera

Apple's first mixed reality headset, the Vision Pro, sees a great deal of its surroundings: its numerous cameras record not only the space around the wearer, but also hand movements close to the body and, internally, the eyes, in order to represent the user's actions. For privacy reasons, most of this data is processed directly on the device. Developers receive only abstracted cues that are sufficient to render their apps, never the raw data from the cameras. With visionOS 2, which is expected to be released on Monday evening, this is now changing – at least for a special group of developers.



Using special "managed entitlements" that must be requested from Apple – along with a license file tied to an Apple ID – corporate developers can use the new Enterprise APIs to build apps that sit closer to the hardware. The idea is to enable more research and development, as well as internal applications that can do more than normal apps are allowed to. For privacy reasons, however, such programs are not supposed to reach users "in the wild."
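In practice, the managed entitlements are declared in the app's entitlements file alongside the Apple-issued license file. A minimal sketch, assuming the entitlement keys Apple presented at WWDC 2024 – the exact key names should be verified against Apple's current documentation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Enterprise API entitlements: granted by Apple on request and
         paired with a license file tied to the company's Apple ID. -->
    <key>com.apple.developer.arkit.main-camera-access.allow</key>
    <true/>
    <key>com.apple.developer.arkit.barcode-detection.allow</key>
    <true/>
    <key>com.apple.developer.screen-capture.include-passthrough</key>
    <true/>
</dict>
</plist>
```

Apps built with these entitlements cannot be submitted to the App Store and are distributed internally instead.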

The Enterprise API features for visionOS 2, first shown during a session at the WWDC 2024 developer conference in the summer, include access to screen captures of what the Vision Pro sees in its surroundings via the passthrough. There is also direct access to the main camera, which the operating system currently blocks for apps. Enterprise developers are also given more leeway with the performance limits of the M2 SoC ("increased performance headroom") – something normal apps are not permitted at all.
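Main-camera access is exposed through ARKit on visionOS 2 via a camera frame provider. A minimal sketch, assuming the `CameraFrameProvider` API surface shown at WWDC 2024 – names and signatures should be checked against Apple's documentation, and the code only runs on a Vision Pro with the corresponding entitlement and license file in place:

```swift
import ARKit  // visionOS 2 ARKit, including the Enterprise API CameraFrameProvider

// Requires the com.apple.developer.arkit.main-camera-access.allow entitlement.
func streamMainCamera() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // Pick a video format for the left main camera (assumed API).
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first else { return }

    try await session.run([provider])

    // Stream raw camera frames – data that regular visionOS apps never see.
    guard let frames = provider.cameraFrameUpdates(for: format) else { return }
    for await frame in frames {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer holds the raw camera image.
            print("Got frame:", sample.pixelBuffer)
        }
    }
}
```

Because the raw frames leave Apple's usual on-device abstraction layer, apps using this API are restricted to internal distribution via Apple Business Manager.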


Apple emphasizes that the Enterprise APIs may be used "only in a business context." Such apps naturally cannot be sold in the App Store; they must be distributed via Apple Business Manager. The new interfaces also unlock smaller features on the headset that are not yet available to regular App Store developers for the Vision Pro. These include scanning QR codes and so-called spatial barcodes to determine positions in space.

Finally, Apple also provides more direct access to the Apple Neural Engine (ANE) for machine learning tasks; it is supposed to work the same way it currently does on iOS. Object tracking has also been made easier for corporate developers: known objects can be identified and tracked more quickly using "configurable parameters." Enterprise developers who want to use the Enterprise APIs for the Vision Pro must submit a request from their company, which can also cover "development only" use.


(B.Sc.)
