Why Apple’s Vision Pro won’t make XR headsets mainstream

Major announcements around the Vision Pro headset are expected next week and may even confirm tantalizing rumors that a lighter model of the high-end XR device is coming next year. Whatever Apple reveals at WWDC, we can expect another round of analyst predictions that this time, mass adoption of head-mounted displays is finally happening.
As Apple, Meta, and other major players continue to struggle with taking the technology mainstream, it’s worth addressing the virtual elephant in the room:
There’s reason to suspect that XR headsets’ barrier to success isn’t just about cost, headset weight, lack of content, eye strain, or other frequently cited issues. Neuroscience suggests that the sensory experience of XR can’t compensate for the value of sharing a physical environment with our screens—especially when it comes to how we form knowledge and memory.
Experiments with XR for Full-Time Computing
About 10 years ago, I started spending long sessions in VR (4–8 hours daily over several weeks) to see how it might affect my sense of time and attention. I began with the HTC Vive, which was comfortable and fun. Later, I tested the Apple Vision Pro with a full virtual workstation of multiple displays. I was curious whether XR could improve focus and research. Using VR all day felt very different from a brief demo in a museum or Apple showroom.
Early on, I had to stand and move constantly throughout the day. It was exhausting, though it made for a great workout (SUPERHOT was one of my favorites). But this kind of motion-based interaction didn’t translate well to daily work tasks. Tossing data around in 3D like in Minority Report might make for compelling cinema, but it’s far less practical over time.
When the VisionPro arrived, I was excited to use it while seated. A friend set me up with a massive virtual workspace, and I genuinely appreciated the focus it provided when I was digging into something new—say, reading a dense Wikipedia article.
Still, when I set up my home office last year, I chose two large physical monitors at eye level and a sit/stand desk. As enjoyable as VR was, setting up and adjusting the virtual space took more time and energy than docking a laptop. And even though I was initially thrilled to be in a virtual workspace, I found myself feeling more distant from the real world. I wanted to go outside more, and didn’t feel like looking at any screen at all by day’s end.
I also noticed the difference in eye strain from screens so close to my face. I prefer to work near a window where I can shift my gaze into the distance. That rhythm of looking away and refocusing helps me process thoughts—especially after encountering something new.
The Trouble With Losing Touch
I asked my colleague Dr. David Sisson, a neurophysiologist, why XR adoption hasn’t matched media hype. He reminded me that audio and visual inputs—the primary senses XR taps into—aren’t the full picture.
Then there’s the matter of touch in XR.
“Without touch, there’s no ‘intimacy.’ You’re not really interacting with what’s going on,” David told me. “You can hit a ball—and hear the ‘crack’ in VR—but you’re not feeling anything other than a little jerk in the controller that makes you feel like there’s some inertia happening.”
In short, headsets deprive us of the tactile and physical context that supports memory formation. Contrast that with playing a console game: you’re anchored in a physical space, hands on colorful controllers, with the screen at a distance. If you’re like me, you vividly remember not just the game, but the exact couch you sat on and the friends or family who played with you.
The Neuroscience of Virtual Experience
This sense of “placeness” goes beyond just touch. As David explained, “[C]hemical senses are not a part of [the VR experience] . . . There’s a well-considered idea, a linkage between olfaction—smell—and memory, [and] you’re leaving that out of the [VR] picture entirely.”
Attempts at incorporating virtual smells into XR exist, but it’s unclear whether they’re effective—or even desirable. Recent research supports what French literature has long told us: smell powerfully evokes memory. Studies show scent not only enhances recall but may also support learning across other sensory areas.
While XR devices like Vision Pro do re-create home and office setups and allow for vast screen real estate, they lack a true sense of location. Evolution shaped our brains to operate differently depending on whether we’re traveling or at home. Researchers call this the encoding specificity principle—our memories link closely with the environment where they were first formed.
With a headset on, our minds don’t fully orient to a place, and so we never quite settle in. Apple offers a vast virtual workspace you can take on the go, but that benefit comes at the cost of the sensory richness and physical grounding of a real-world setup. Neuroscience, not just practicality, suggests that working in a physical space—with monitors, windows, textures, smells, and distance—offers deeper engagement and memory retention.
Putting the “Pro” in Vision Pro
To be clear, XR headsets excel in specific contexts like rehabilitation or short bursts of fully embodied interaction, where body motion tracking is vital. Some content creators might find immense value in a distraction-free, multiscreen virtual studio.
But that’s not a mass-market audience.
XR evangelists may continue promoting Vision Pro as the breakout device, but we—and Apple, for that matter—should remember that “Pro” isn’t just branding. It reflects the narrow set of advanced use cases that justify immersion.
For most of us, computing still works best in a physical world that engages all five senses. And that’s not something XR can replicate—at least, not yet.