VR Pointing Experiment technical breakdown
PaulvanderLaanMPI edited this page Apr 2, 2026
This page describes various technical details of, and experiences from, the VR Pointing Experiment.

- Meta Quest 3 (120 Hz mode) with a Meta Quest Link cable
- Unity 2022.3.62f2
- OpenXR
- XR Interaction Toolkit
- For lip sync and facial animation we use the SALSA plug-in
- For eye and head gaze we use Animator.SetLookAtPosition (a Mecanim IK feature): it aims the humanoid head and eyes at a target through the animation rig, so it blends with the playing animation and needs no manual bone rotation
- Reallusion Character Creator 4 is used for the character
- Pointing gestures are recorded in our Vicon Mocap Lab
- The 3D environment is based on the Stylized Town asset
- Audio playback is offset to compensate for audio and visual delays
- Initial connection issues for real-time gaze streaming (TCP vs. UDP)
- Wired Ethernet setup with a USB Ethernet adapter and a second network card
- No cloud connection needed; everything runs locally
- Eye calibration (mount, phone, in-experiment)
- The eye gaze vector, hit position, and object hit are saved each frame into a CSV file
- BBTK latency measurements
- GoPro latency measurements
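For the real-time gaze streaming mentioned above, UDP avoids the head-of-line blocking that made the initial TCP attempts troublesome: a dropped gaze sample is simply superseded by the next frame's sample. The sketch below is a minimal, hypothetical illustration of sending one gaze sample over loopback UDP; the payload format, port, and function names are assumptions, not the experiment's actual protocol.

```python
import socket

HOST, PORT = "127.0.0.1", 9999  # hypothetical local endpoint


def encode_gaze(frame, direction):
    """Pack a frame number and a gaze direction vector into a small text payload."""
    x, y, z = direction
    return f"{frame},{x:.4f},{y:.4f},{z:.4f}".encode("ascii")


def decode_gaze(payload):
    """Parse a payload back into a frame number and a direction tuple."""
    frame, x, y, z = payload.decode("ascii").split(",")
    return int(frame), (float(x), float(y), float(z))


def send_and_receive_once():
    """Round-trip a single gaze sample over loopback UDP."""
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind((HOST, PORT))
    receiver.settimeout(1.0)

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sender.sendto(encode_gaze(42, (0.0, 0.1, 0.99)), (HOST, PORT))

    payload, _addr = receiver.recvfrom(1024)
    sender.close()
    receiver.close()
    return decode_gaze(payload)
```

Because each datagram is self-contained, a lost packet costs one stale frame rather than stalling the whole stream, which matters at 120 Hz.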
Example CSV output file (16.3 MB): P13_2026-03-05_15-08-06.csv
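A per-frame log like the one above can be analysed with plain CSV tooling. The snippet below is a sketch only: the column names (`frame`, `gaze_x`..`gaze_z`, `hit_x`..`hit_z`, `hit_object`) and sample values are hypothetical and may not match the real file's header.

```python
import csv
import io

# Hypothetical header and rows; the real file's columns may differ.
SAMPLE = """frame,gaze_x,gaze_y,gaze_z,hit_x,hit_y,hit_z,hit_object
0,0.01,-0.02,0.99,1.20,1.50,3.40,Avatar_Head
1,0.02,-0.02,0.99,1.21,1.50,3.41,Avatar_Head
2,0.10,0.00,0.98,2.00,0.90,3.00,Table
"""


def frames_looking_at(csv_text, object_name):
    """Return the frame numbers in which the gaze ray hit the given object."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [int(row["frame"]) for row in reader if row["hit_object"] == object_name]
```

For example, `frames_looking_at(SAMPLE, "Avatar_Head")` returns `[0, 1]` on the sample data, which is the basic building block for computing dwell times on the virtual character.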