Visual contributions to human self-motion perception during horizontal body rotation

Arch Ital Biol. 2000 Apr;138(2):139-66.

Abstract

It is still an enigma how human subjects combine visual and vestibular inputs for their self-motion perception. Visual cues have the benefit of high spatial resolution but entail the danger of self-motion illusions. We performed psychophysical experiments (verbal estimates as well as pointer indications of perceived self-motion in space) in normal subjects (Ns) and patients with loss of vestibular function (Ps). Subjects were presented with horizontal sinusoidal rotations of an optokinetic pattern (OKP) alone (visual stimulus; 0.025-3.2 Hz; displacement amplitude, 8 degrees) or in combination with rotations of a Bárány chair (vestibular stimulus; 0.025-0.4 Hz; +/- 8 degrees). We found that specific instructions to the subjects created different perceptual states in which their self-motion perception essentially reflected three processing steps during pure visual stimulation: i) When Ns were primed by a procedure based on induced motion and then estimated perceived self-rotation upon pure optokinetic stimulation (circular vection, CV), CV had a gain close to unity up to frequencies of almost 0.8 Hz, followed by a sharp decrease at higher frequencies (i.e., characteristics resembling those of the optokinetic reflex, OKR, and of smooth pursuit, SP). ii) When Ns were instructed to "stare through" the optokinetic pattern, CV was absent at high frequencies but increasingly developed as frequency was decreased below 0.1 Hz. iii) When Ns "looked at" the optokinetic pattern (accurately tracked it with their eyes), CV was usually absent, even at low frequencies. CV in Ps showed dynamics similar to those of Ns in condition i), independent of the instruction. During vestibular stimulation, self-motion perception in Ns fell from a maximum at 0.4 Hz to zero at 0.025 Hz. When vestibular stimulation was combined with visual stimulation while Ns "stared through" the OKP, perception at low frequencies became modulated in magnitude. When Ns "looked at" the OKP, this modulation was reduced, apart from the synergistic stimulus combination (OKP stationary), where the magnitude was similar to that during "staring". The obtained gain and phase curves of perception were incompatible with linear-systems predictions. We therefore describe the present findings with a non-linear dynamic model in which the visual input is processed in three steps: i) it shows dynamics similar to those of the OKR and SP; ii) it is shaped to complement the vestibular dynamics and is fused with the vestibular signal by linear summation; and iii) it can be suppressed by a visual-vestibular conflict mechanism when the visual scene is moving in space. Finally, an important element of the model is a velocity threshold of about 1.2 degrees/s, which is instrumental in maintaining perceptual stability and in explaining the observed dynamics of perception. We conclude from the experimental and theoretical evidence that self-motion perception normally takes the visual scene as a reference, while the vestibular input is used to check the kinematic state of the scene; if the scene appears to move, the visual signal is suppressed and perception is based on the vestibular cue. A minimal code sketch of the fusion logic named here (linear summation, conflict-driven suppression, velocity threshold) follows below.
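The abstract names three concrete ingredients of the proposed model: linear summation of the visual and vestibular velocity signals, suppression of the visual signal when a visual-vestibular conflict indicates the scene itself is moving, and a perceptual velocity threshold of about 1.2 degrees/s. The Python sketch below illustrates only that stated logic under explicit assumptions; it is not the authors' published model. The function name perceived_self_velocity, its parameters, and the example inputs are hypothetical, and the frequency-dependent shaping of the visual channel (steps i and ii) is omitted.

```python
# Minimal illustrative sketch of the fusion scheme described in the abstract.
# NOT the authors' published model: the function name, parameters, and example
# values are hypothetical. It shows only the three stated ingredients: linear
# summation of the two channels, conflict-driven suppression of the visual
# signal, and a perceptual velocity threshold of about 1.2 deg/s.

VELOCITY_THRESHOLD = 1.2  # deg/s, value quoted in the abstract


def perceived_self_velocity(visual_estimate, vestibular_estimate, scene_moving):
    """Return a perceived self-rotation velocity (deg/s) from two cues.

    visual_estimate     -- self-motion velocity implied by the optokinetic pattern
    vestibular_estimate -- velocity signalled by the semicircular canals
    scene_moving        -- True if the conflict mechanism judges the visual
                           scene itself to be moving in space
    """
    # Conflict mechanism (step iii): if the scene appears to move, the visual
    # contribution is suppressed and perception rests on the vestibular cue.
    visual_term = 0.0 if scene_moving else visual_estimate

    # Fusion (step ii): visual and vestibular signals combined by linear summation.
    fused = visual_term + vestibular_estimate

    # Velocity threshold: sub-threshold estimates are treated as "no self-motion",
    # which the abstract credits with maintaining perceptual stability.
    return fused if abs(fused) >= VELOCITY_THRESHOLD else 0.0


if __name__ == "__main__":
    # Synergistic combination (OKP stationary in space): both cues add up.
    print(perceived_self_velocity(2.0, 0.5, scene_moving=False))  # -> 2.5
    # Scene judged to be moving: visual cue suppressed; the vestibular cue
    # alone falls below threshold, so no self-motion is perceived.
    print(perceived_self_velocity(2.0, 0.5, scene_moving=True))   # -> 0.0
```

The example values are chosen only to show the two regimes the abstract describes: cues summing when the scene is taken as a stable reference, and perception collapsing onto the (sub-threshold) vestibular cue when the scene appears to move.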

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Eye Movements / physiology
  • Female
  • Head Movements / physiology
  • Humans
  • Male
  • Models, Neurological*
  • Motion Perception / physiology*
  • Perceptual Masking / physiology
  • Photic Stimulation
  • Reflex, Vestibulo-Ocular / physiology*
  • Rotation
  • Space Perception / physiology*
  • Vestibule, Labyrinth / physiology