Combining sensory information: mandatory fusion within, but not between, senses

Science. 2002 Nov 22;298(5598):1627-30. doi: 10.1126/science.1075396.

Abstract

Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, and other visual cues. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
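The standard quantitative account behind results like this is minimum-variance (maximum-likelihood) cue combination: each cue yields a noisy estimate of the property, and the fused estimate weights each cue in inverse proportion to its variance, producing a combined variance lower than that of either cue alone. The sketch below is a minimal illustration of that model, not the authors' code; the estimate and variance values are hypothetical, chosen only to show the arithmetic.

    # Minimal sketch of minimum-variance (maximum-likelihood) cue combination.
    # Illustrative only: the cue estimates and variances are hypothetical,
    # not data from the study.

    def combine_cues(estimates, variances):
        """Fuse noisy cue estimates, weighting each inversely by its variance."""
        inv_vars = [1.0 / v for v in variances]
        total = sum(inv_vars)
        weights = [w / total for w in inv_vars]
        fused_estimate = sum(w * e for w, e in zip(weights, estimates))
        fused_variance = 1.0 / total  # lower than any single cue's variance
        return fused_estimate, fused_variance

    # Hypothetical example: disparity and texture estimates of surface slant (deg).
    slant, var = combine_cues(estimates=[30.0, 34.0], variances=[4.0, 9.0])
    print(slant, var)  # ~31.2 deg, variance ~2.8 (vs. 4.0 and 9.0 single-cue)

On this model, fusion improves precision; the question the paper addresses is whether the individual single-cue estimates remain accessible after fusion, within and between the senses.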

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Cues*
  • Form Perception
  • Humans
  • Mathematics
  • Stereognosis
  • Touch*
  • Vision Disparity
  • Visual Perception*