By Carrie Morales
A few weeks ago, I had the opportunity to watch Apple’s Worldwide Developers Conference (WWDC) and witness the unveiling of their latest innovation, the Vision Pro. As someone deeply invested in technology and accessibility, I couldn’t help but wonder: will this mixed reality headset truly be accessible? How will it cater to the needs of individuals like me? While I don’t have all the answers just yet, I’d like to share my thoughts and insights on Vision Pro.
Design and Specifications
Let’s start by exploring the design and specifications of this remarkable device. The Vision Pro resembles a pair of oversized ski goggles, complete with a wide gray adjustable strap that wraps around the back of your head. The front of the headset, made of glass, acts as a captivating screen. Its metallic-colored frame adds a touch of elegance to the overall design.
Equipped with five sensors, six microphones, and 12 cameras strategically placed to capture different angles, the Vision Pro promises to provide an immersive mixed reality experience. Some cameras are positioned outward, allowing you to view the world in front of you, while others capture your hands and face. Additionally, the device includes a button for capturing photos and videos, as well as a knob known as the Digital Crown, a larger version of the one found on the Apple Watch.
Inside the headset, two high-resolution micro-OLED displays, one for each eye, provide a vivid representation of the world around you based on camera input. These displays overlay apps, giving them the appearance of floating right in front of you. For those who wear glasses, custom Zeiss lenses can be magnetically attached to the Vision Pro.
With spatial audio delivered through two speakers, one on each side, the device truly immerses you in a multisensory experience. The Vision Pro runs on the M2 chip and the new R1 chip, which work together to process data from the sensors, ensuring minimal latency. This sophisticated technology enables users to interact with visionOS, the headset’s operating system, using a combination of eye and hand movements. Apple placed particular emphasis on eye tracking during the presentation and introduced Optic ID, a feature that scans your iris for authentication purposes.
Release Date and Price
Apple will be releasing the Vision Pro early next year, 2024, and along with it, they intend to offer most iOS and iPad apps in the Vision Pro App Store. As for the price, the Vision Pro will retail for $3,499—a significant investment for anyone interested in exploring this cutting-edge technology.
The Vision Pro’s “EyeSight”
One intriguing feature is the front screen I mentioned earlier. It not only displays shifting lights while you’re viewing apps but also serves as a means for others to “see” you. The outside screen can project a digital copy of your eyes and the part of your face hidden behind the headset: interior cameras track that hidden part of your face, and the headset streams a digital rendition of it to the outside screen for others to see, a feature Apple amusingly calls EyeSight. This made me wonder: when a blind person uses the Vision Pro, can they claim to have eyesight?
Accessibility Questions
Naturally, my mind is racing with questions about the accessibility of this device. For example, how will individuals with conditions like nystagmus, or those who rely on prosthetics, control the operating system? How will users who can’t control their eye movements operate the Vision Pro when it’s so dependent on eye tracking? As someone with aniridia, the absence of a fully formed iris, I wonder if I’ll be able to use Optic ID. What alternatives will be available to users like me?
Additionally, I’m curious about the level of zoom and the ability to adjust how close apps appear. The presentation showcased AR app screens spread across a room, yet many people who are visually impaired can only see things close up. Apple did share a presentation on creating accessible experiences for the Vision Pro, but it mainly targeted developers rather than end users like me. While the video provided insights into various accessibility features, it lacked details about the firsthand experience of assistive technology users.
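Still, to give a sense of what that developer-focused guidance looks like in practice, here is a minimal SwiftUI sketch of my own (the WeatherCard view is a made-up example, not Apple’s sample code) showing the kind of labeling and semantic styling that lets VoiceOver describe controls and Dynamic Type scale text:

```swift
import SwiftUI

// A made-up example view, not Apple's sample code: the point is the
// accessibility markup, which works the same way across Apple platforms.
struct WeatherCard: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            // Semantic text styles scale automatically with Dynamic Type.
            Text("Today")
                .font(.headline)
            Text("72° and sunny")
                .font(.body)

            Button {
                // A refresh action would go here.
            } label: {
                Image(systemName: "arrow.clockwise")
            }
            // Without a label, VoiceOver would only announce "button".
            .accessibilityLabel("Refresh forecast")
            .accessibilityHint("Reloads today's weather")
        }
        .padding()
    }
}
```

If developers consistently do this kind of labeling, assistive technologies have something meaningful to read aloud; if they don’t, no amount of hardware will make an app usable for me.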
More Features to Think About
Nevertheless, I can’t wait to get my hands on the Vision Pro and put it to the test. I’m eager to explore the capabilities of VoiceOver, Zoom, Dynamic Type, and Voice Control within this new realm of mixed reality. Apple has an impressive track record of integrating accessibility into its devices, and I can only imagine the amount of time and effort they’ve devoted to ensuring the Vision Pro meets the needs of users with disabilities.
As mentioned in my recent A&AT news video podcast, someone discovered a fascinating feature called Visual Search on the Vision Pro. This feature allows users to obtain information about objects, interact with text in the real world, copy and paste printed text into apps, and even translate text in 17 languages. The potential for enhanced accessibility through these features fills me with hope.
Imagine being able to use the Point and Speak feature, entirely hands-free, as you navigate your surroundings. Just by pointing at something while wearing the headset, you could have it described aloud to you, an incredible advancement. Accessible apps like Seeing AI and Be My Eyes could become even more powerful with the Vision Pro’s ability to process live video and provide detailed descriptions. Tasks that were once challenging, like aiming a phone camera to follow directions or identify objects, could become significantly easier with this headset.
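To make that a little more concrete, here is a rough sketch, my own simplification and not how Apple, Seeing AI, or Be My Eyes actually implement anything, of the basic building blocks such a feature rests on: recognizing printed text in a camera frame with Apple’s Vision framework and reading it aloud with speech synthesis. The speakPrintedText function and the single-frame input are assumptions for illustration.

```swift
import Vision
import AVFoundation
import CoreGraphics

// Keep the synthesizer alive so speech isn't cut off mid-sentence.
let synthesizer = AVSpeechSynthesizer()

// Recognize printed text in one camera frame and speak it aloud.
// Real apps would run this on live video and handle errors properly.
func speakPrintedText(in frame: CGImage) {
    // Ask the Vision framework to find and transcribe text in the frame.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Keep the best candidate string from each detected region of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        // Hand the recognized text to the system speech synthesizer.
        let utterance = AVSpeechUtterance(string: lines.joined(separator: ". "))
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

The exciting part is that this kind of recognition already works well on the iPhone; a headset that sees what I see could simply take the phone out of my hands.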
What About Privacy and Isolation?
While privacy concerns exist for some individuals, I’m personally less worried about that aspect. After all, Apple already has access to a substantial amount of my information.
However, what concerns me more is the potential isolation caused by the device. While the Vision Pro can display a 3D image of the user on the outside screen, it cannot replace the genuine social connections we crave. As social creatures, human interaction remains essential. The thought of a world where people walk around with their faces concealed by headsets feels somewhat eerie and disconnected, despite the benefits it may offer introverted individuals like myself. We must remember the value of real, human connection.
How Will the Vision Pro Transform Accessibility?
I must confess, I’m excited about this cutting-edge technology! It comes with its pros and cons, fears and excitement—all bundled together. The Vision Pro represents the future, and it’s a future we should embrace.
As technology advances, I hope to see the headset become smaller, lighter, less conspicuous, and more affordable. Real wireless charging would be a game-changer. Ultimately, I envision a world where the digital and physical seamlessly blend, making the entire world more accessible.
The Vision Pro holds incredible potential to transform accessibility, especially when combined with the advancements in artificial intelligence. While I still have many unanswered questions, I believe this headset can serve as a significant step towards a more accessible future. I eagerly await the opportunity to experience it firsthand. If anyone knows how I can get my hands on the Vision Pro for a test run, please reach out and let me know.
Now I turn to you. What are your thoughts on the Vision Pro? Do you share my concerns, or are you more optimistic about its accessibility? Will you be among the first to try this groundbreaking device? I’m genuinely interested in hearing your perspectives and engaging in a conversation about this exciting development.
Links:
A&AT News: More Accessibility AI, Google and Adobe Make PDFs More Accessible?
Apple’s Presentation on Designing for Accessibility on the Vision Pro