Developers can now download the software development kit required to create apps for Apple's forthcoming Vision Pro mixed reality headset, the company announced. Besides making the SDK available, Apple also unveiled a program to bring physical devices to selected labs around the world, along with further initiatives for developers to test their apps.
Dubbed a "spatial computer", the Vision Pro aims to define a new paradigm for interacting with a computing system to carry out a variety of tasks, including productivity, design, gaming, and more.
Starting today, Apple’s global community of developers will be able to create an entirely new class of spatial computing apps that take full advantage of the infinite canvas in Vision Pro and seamlessly blend digital content with the physical world to enable extraordinary new experiences.
The SDK is included in the latest Xcode 15 beta (beta 2) and can be downloaded by any developer with a developer account. It provides a simulator that allows developers to run visionOS apps without access to a physical device. The Simulator provides a monoscopic view of the app immersed in the surrounding space. Being monoscopic, the view presents exactly the same image to both eyes, whereas in a stereoscopic view each eye would see the image from its own point of view, as happens with objects in the real world.
The simulator also supports ways to mimic the distinct gestures you can use on a Vision Pro, which are mostly based on using your eyes and hands. For example, to tap a button, you look at it to bring it into focus, then tap your finger to your thumb. In the Simulator, a single mouse click produces a tap; dragging with the mouse simulates a drag gesture; dragging while pressing Shift simulates dragging towards the viewer; and holding the Option key lets you simulate gestures that require two hands. Additionally, a set of controls in the Simulator window makes it possible to navigate the space, changing the viewpoint, viewing angle, and so on.
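Since the look-and-pinch tap maps onto the standard SwiftUI interaction model, existing controls respond to it without extra code. As a rough illustration (a hypothetical minimal view, not from Apple's documentation), a plain SwiftUI `Button` is tappable in the Simulator with a single mouse click, and on device by looking at it and pinching:

```swift
import SwiftUI

// Hypothetical sketch: a standard SwiftUI button in a visionOS window.
// On device, the user focuses it by looking at it and "taps" by pinching
// index finger to thumb; in the Simulator, a mouse click does the same.
struct CounterView: View {
    @State private var tapCount = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Tapped \(tapCount) times")
            Button("Tap me") {
                tapCount += 1  // triggered by the look-and-pinch gesture
            }
        }
        .padding()
    }
}
```

The point of the sketch is that eye-and-hand input arrives through the same gesture system as touch on iOS, so ordinary SwiftUI code carries over largely unchanged.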
The visionOS SDK includes support for a number of well-known frameworks already in use for iOS and macOS programming, including SwiftUI, ARKit, RealityKit, and so on. In addition, Apple is making a new tool available, Reality Composer Pro, aimed at creative artists and not just programmers. It can be used to compose, edit, and preview RealityKit content organized in scenes. Each scene is a hierarchy of entities, such as virtual objects and 3D models, that can be rearranged without any knowledge of programming.
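The same entity hierarchy that Reality Composer Pro exposes visually can also be built in code with RealityKit. The following is a minimal sketch (hypothetical names, assuming the RealityKit API as it exists on iOS) of composing a small scene as a parent entity with a child model:

```swift
import RealityKit

// Sketch: a RealityKit scene is a tree of entities.
// Reality Composer Pro lets artists assemble this same kind of
// hierarchy visually; here we build an equivalent one in code.
let root = Entity()

// A child entity: a small blue box with a simple material.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)
box.position = [0, 0.05, 0]  // offset relative to its parent

root.addChild(box)
// Entities can later be reparented or repositioned without
// affecting the rest of the hierarchy.
```

Content authored in Reality Composer Pro ships as a bundle that an app can load at runtime and attach to its scene in the same way, so artists and programmers work on the same entity tree from different ends.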
To make it easier for developers to test their apps on physical devices, Apple has additionally announced a number of initiatives. First and foremost, Apple will open in-person labs where developers can bring their apps. At the moment, only six locations are planned: Cupertino, London, Munich, Shanghai, Singapore, and Tokyo. This aims to expand the number of developers who can get some form of access to physical devices, since access to the developer kit program will be limited. Furthermore, Apple will offer remote compatibility evaluations as part of App Review, providing a report on how apps look and behave on a physical device.