I filed a feedback about the erratic eye tracking on my #VisionPro and got some useful information:
“Unfortunately, the artificial lenses that are implanted during a Cataract surgery can sometimes cause interference with Apple Vision Pro’s eye tracking. This can manifest in unstable, or erratic tracking and highlighting.”
@steveriggins Sounds frustrating, but at least they know of it and know exactly what the problem is. So this could potentially be fixed in a hardware revision, or maybe even in software.
Thoughts after Apple's iPad event, with implications for #VisionPro: Today, Apple positioned the iPad and VisionPro for professional use, including movie production and sound editing (e.g., Final Cut Pro & Logic Pro on the iPad) and training (VisionPro). They also updated the Apple Pencil. Here's an exciting idea:
For some, an issue with VisionPro has been the lack of strong hand-controller integration, especially compared to more gaming-centric headsets. For serious use of VisionPro's initial major pro app, Excel, I think it helps to use a physical keyboard and trackpad, which it does support. But that's not rich enough for many more advanced uses.
I think in the not-too-distant future we'll see the iPad integrated with VisionPro the way the Mac has started to be, if not more so. You'd use an iPad, perhaps with a Magic Keyboard, and the new Apple Pencil Pro for professional-level control. Having a pencil, with squeeze, barrel roll, haptic feedback, hover, etc., along with the current full-motion hand and arm tracking in 3D space, gives you the start of a very rich and precise way of interacting with spatial computing. Moving a pencil on the hard iPad surface could be quite superior to waving something in the air or using a joystick. The Mac is not built for a pen, but the iPad is. I'm thinking long-term, not just the current headset. The videos they showed of their pro apps on iPad, and the VisionPro update that touted a film director using it to oversee the editing and visual effects for an upcoming film, hinted at this convergence to me. I wonder if it's true.
Reading some comments, I think I wasn't clear enough about the role of the iPad. I was assuming the pen would normally manipulate things you see elsewhere, not under it, like a mouse does. The iPad could add context, but its main role would be to supply the electronics and computing power that interact with the pen, plus an appropriate surface on which to move it.
I really like my #VisionPro and still use it all the time, but I think Apple missed the boat a bit. The Meta Ray-Bans are a more practical face computer.
If Apple had created sunglasses that had a decent camera, Apple Music/Podcasts, and Siri built in for $300, it probably would have been a massive success.
VisionPro seems targeted at developers and app users. It's cool and all, but it's overpriced and doesn't really solve a problem that a desktop or laptop couldn't. The main problem is that developers are lazy and just want to bring their existing apps to VisionPro rather than break out of the 2D box. And Apple hasn't really given them a good reason to do so.