WWDC 2023: Will Apple’s Vision Pro be a VR game changer?
Over the past few years, countless millions of words have been written about augmented reality (AR) and virtual reality (VR). For a time, it seemed the technologies were about to change the world, with Mark Zuckerberg going all in: renaming his company Meta after the so-called metaverse and launching a mixed-reality headset aimed at businesses.
More recently, it has felt more like a VR winter as Silicon Valley abandoned its grand plans. Microsoft, which launched its own pioneering HoloLens headset, has reportedly laid off several of the teams working on it, and, in March, Google finally killed Google Glass, the company’s early attempt at mixed reality. Even Meta is thought to be quietly backing away from the metaverse, directing its more speculative investments towards machine learning and artificial intelligence (AI) instead.
But everything could be about to change again with Apple’s ‘spatial computing’ headset, the Vision Pro.
What is in the Vision Pro headset?
First off, the most impressive thing about the Vision Pro is the decoupling of the peripheral from the iPhone. The headset works independently and runs its own operating system: a three-dimensional interface called visionOS. To enable navigation and interaction with spatial content, Apple Vision Pro introduces a new input system controlled by a person’s eyes, hands, and voice. Users can browse apps simply by looking at them, tapping their fingers to select, flicking their wrist to scroll, or using their voice.
Vision Pro also features Apple’s first three-dimensional camera, with a link to iCloud. Panoramas shot on iPhone can expand and wrap around the user on the Vision Pro, creating the sensation of standing right where the photo was taken.
Apple Vision Pro includes EyeSight, an extraordinary innovation that helps users stay connected with those around them. When a person approaches someone wearing Vision Pro, the device feels transparent – letting the user see them while also displaying the user’s eyes. When a user is immersed in an environment or using an app, EyeSight gives visual cues to others about what the user is focused on.
Identity management is handled by Optic ID, which authenticates users by scanning their iris; the biometric data is kept on-device.
Under the hood, Vision Pro features Apple silicon: the familiar M2 processor, plus a new R1 chip that processes input from 12 cameras, five sensors, and six microphones to ensure content feels like it is appearing right in front of the user’s eyes, in real time. R1 streams new images to the displays within 12 milliseconds, eight times faster than the blink of an eye. Speaking of eyes, the Vision Pro’s pricing is likely to bring a tear to yours, coming in at $3,499.
Apple has a developer base that will be excited by the Vision Pro, but as the headset is unlikely to be a big seller in the consumer space, the challenge is finding the killer app that will make it essential somewhere else. This invites an easy comparison with the aforementioned HoloLens, which has found a home in manufacturing and medicine at a similar price point.
TechCentral Reporters with additional reporting by Future Publishing