Modulation of LIP activity by predictive auditory and visual cues

Cereb Cortex. 2004 Dec;14(12):1287-301. doi: 10.1093/cercor/bhh090. Epub 2004 May 27.

Abstract

The lateral intraparietal area (area LIP) contains a multimodal representation of extra-personal space. To examine this representation further, we trained rhesus monkeys on a predictive-cueing task. During this task, monkeys shifted their gaze to a visual target whose location was predicted by the location of an auditory or visual cue. We found that, when the sensory cue was at the same location as the visual target, the monkeys' mean saccadic latency was shorter than when the sensory cue and the visual target were at different locations. This difference in mean saccadic latency was the same for auditory and visual cues. Although the monkeys used auditory and visual cues in a similar fashion, LIP neurons responded more to visual cues than to auditory cues. This modality-dependent activity was also seen during auditory and visual memory-guided saccades, but to a significantly greater extent than during the predictive-cueing task. Additionally, we found that the firing rate of LIP neurons was inversely correlated with saccadic latency. This study further indicates that modality-dependent differences in LIP activity do not simply reflect differences in sensory processing but also reflect the cognitive and behavioral requirements of a task.
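
To make the final statistical claim concrete, the following is a minimal sketch of how an inverse correlation between per-trial firing rate and saccadic latency might be quantified. The abstract does not specify the authors' actual analysis, so this is an assumed Pearson-correlation approach, and the per-trial numbers here are synthetic, purely illustrative values.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)

    # Hypothetical per-trial measurements: saccadic latency (ms) and the
    # corresponding LIP firing rate (spikes/s), with an inverse trend
    # built in for illustration only.
    latency = rng.normal(180, 20, size=200)
    firing_rate = 60.0 - 0.1 * latency + rng.normal(0, 3, size=200)

    # A negative Pearson r indicates that higher firing rates accompany
    # shorter saccadic latencies, i.e., an inverse correlation.
    r, p = pearsonr(firing_rate, latency)
    print(f"Pearson r = {r:.2f}, p = {p:.3g}")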

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Acoustic Stimulation / methods*
  • Action Potentials / physiology*
  • Animals
  • Female
  • Macaca mulatta
  • Parietal Lobe / physiology*
  • Photic Stimulation / methods*
  • Psychomotor Performance / physiology*