A dataset of paired head and eye movements during visual tasks in virtual environments

Sci Data. 2024 Dec 5;11(1):1328. doi: 10.1038/s41597-024-04184-1.

Abstract

We describe a multimodal dataset of paired head and eye movements acquired in controlled virtual reality environments. The dataset includes head and eye movement data from n = 25 participants who interacted with four virtual reality environments requiring coordinated head and eye behaviors. Data collection comprised two visual tracking tasks and two visual search tasks. Each participant performed each task three times, yielding approximately 1,080 seconds of paired head and eye movement and 129,611 paired head and eye rotation samples per participant. This dataset enables research into predictive models of intended head movement conditioned on gaze for augmented and virtual reality experiences, as well as for assistive devices such as powered exoskeletons for individuals with head-neck mobility limitations. It also supports biobehavioral and mechanistic studies of the variability in head and eye movement across participants and tasks. The virtual environment developed for this data collection is open source, so others can use it for their own data collection or modify it.
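The reported totals imply a sampling rate of roughly 120 Hz (129,611 samples over ~1,080 s per participant). As a minimal sketch of how such paired recordings might be loaded and inspected, assuming a hypothetical per-run CSV file with a timestamp column and head/eye yaw angles (the file name and column names below are illustrative assumptions, not the dataset's actual schema):

```python
import numpy as np
import pandas as pd

# Hypothetical layout: one CSV per participant/task run with a timestamp
# column (seconds) and head/eye rotations in degrees. The file and column
# names are assumptions for illustration; consult the dataset's
# documentation for the real schema.
df = pd.read_csv("participant_01_tracking_run1.csv")

# Estimate the sampling rate from consecutive timestamps; the abstract's
# totals (129,611 samples / ~1,080 s) suggest roughly 120 Hz.
dt = np.diff(df["timestamp"].to_numpy())
print(f"median sampling rate: {1.0 / np.median(dt):.1f} Hz")

# For small horizontal movements, gaze-in-world yaw is approximately the
# head-in-world yaw plus the eye-in-head yaw.
df["gaze_yaw"] = df["head_yaw"] + df["eye_yaw"]
print(df[["timestamp", "head_yaw", "eye_yaw", "gaze_yaw"]].head())
```

With full 3-D rotations, head-in-world and eye-in-head orientations would instead be composed as quaternions or rotation matrices rather than summed per axis.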

Publication types

  • Dataset

MeSH terms

  • Adult
  • Eye Movements*
  • Female
  • Head Movements*
  • Humans
  • Male
  • Virtual Reality*