[visionlist] Current Advice for VR Headsets
Sean Noah
seannoah at berkeley.edu
Thu Jul 10 17:10:12 -05 2025
Hi all,
I’m newly on this listserv, so I apologize if this is a topic that has already been extensively covered here.
The lab I’m in is exploring using a VR headset to run visual psychophysics experiments. I’d appreciate it if anyone could offer advice on headset models, programming visual experiments for VR, and exporting data.
In particular, I’ve been comparing the Apple Vision Pro with the Meta Quest Pro. The Apple Vision Pro seems to offer better stimulus quality, but with more limited eyetracking API exposure. I’m under the impression that gaze vectors are not accessible via API for standard developers. Is anyone conducting visual experiments with a Vision Pro who has found a way around this limitation? Does anyone know whether Apple has any research partnership programs that might provide special access to this data?
The Meta Quest Pro seems to offer better access to eyetracking, but seemingly at some cost in stimulus quality. Has anyone successfully run visual experiments on this headset?
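For reference, my understanding is that the Quest Pro exposes per-eye gaze through the OpenXR XR_FB_eye_tracking_social extension. Below is a rough sketch of how I think the polling would work; it assumes an already-initialized XrInstance/XrSession with that extension enabled, the on-device eye-tracking permission granted, and a reference XrSpace plus a frame time supplied by the caller. I haven't verified this end to end, so corrections are welcome.

/*
 * Hedged sketch: polling per-eye gaze poses via XR_FB_eye_tracking_social.
 * Assumes the extension was enabled at instance creation and that the
 * caller passes a valid session, reference space, and frame time
 * (e.g. predictedDisplayTime from xrWaitFrame).
 */
#include <stdio.h>
#include <openxr/openxr.h>

static XrResult poll_eye_gazes(XrInstance instance, XrSession session,
                               XrSpace baseSpace, XrTime time)
{
    /* Extension functions are not exported by the loader; fetch them. */
    PFN_xrCreateEyeTrackerFB xrCreateEyeTrackerFB = NULL;
    PFN_xrGetEyeGazesFB xrGetEyeGazesFB = NULL;
    PFN_xrDestroyEyeTrackerFB xrDestroyEyeTrackerFB = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateEyeTrackerFB",
                          (PFN_xrVoidFunction*)&xrCreateEyeTrackerFB);
    xrGetInstanceProcAddr(instance, "xrGetEyeGazesFB",
                          (PFN_xrVoidFunction*)&xrGetEyeGazesFB);
    xrGetInstanceProcAddr(instance, "xrDestroyEyeTrackerFB",
                          (PFN_xrVoidFunction*)&xrDestroyEyeTrackerFB);
    if (!xrCreateEyeTrackerFB || !xrGetEyeGazesFB || !xrDestroyEyeTrackerFB)
        return XR_ERROR_FUNCTION_UNSUPPORTED;

    /* Create the eye tracker handle (in a real app, create once and reuse). */
    XrEyeTrackerCreateInfoFB createInfo = { XR_TYPE_EYE_TRACKER_CREATE_INFO_FB };
    XrEyeTrackerFB eyeTracker = XR_NULL_HANDLE;
    XrResult res = xrCreateEyeTrackerFB(session, &createInfo, &eyeTracker);
    if (XR_FAILED(res))
        return res;

    /* Request both eyes' gaze poses expressed in baseSpace at `time`. */
    XrEyeGazesInfoFB gazeInfo = { XR_TYPE_EYE_GAZES_INFO_FB };
    gazeInfo.baseSpace = baseSpace;
    gazeInfo.time = time;

    XrEyeGazesFB gazes = { XR_TYPE_EYE_GAZES_FB };
    res = xrGetEyeGazesFB(eyeTracker, &gazeInfo, &gazes);
    if (XR_SUCCEEDED(res)) {
        for (int eye = 0; eye < XR_EYE_POSITION_COUNT_FB; ++eye) {
            if (!gazes.gaze[eye].isValid)
                continue;
            /* gazePose gives the eye origin and gaze orientation. */
            XrPosef p = gazes.gaze[eye].gazePose;
            printf("eye %d: origin (%.3f %.3f %.3f), confidence %.2f\n",
                   eye, p.position.x, p.position.y, p.position.z,
                   gazes.gaze[eye].gazeConfidence);
        }
    }

    xrDestroyEyeTrackerFB(eyeTracker);
    return res;
}

If anyone has used this path (or the Unity/Meta SDK wrappers around it) for psychophysics-grade gaze logging, I'd love to hear how well the data quality held up.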
Thank you for your time.
Thanks,
Sean