When language meets action: the neural integration of gesture and speech

Cereb Cortex. 2007 Oct;17(10):2322-33. doi: 10.1093/cercor/bhl141. Epub 2006 Dec 11.

Abstract

Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely the speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we identified the neural networks involved in the integration of semantic information from speech and gestures. Verbal and/or gestural content could be integrated easily or less easily with the content of the preceding part of speech. Premotor areas involved in action observation (Brodmann area [BA] 6) were found to be specifically modulated by action information that "mismatched" the language context. Importantly, an increase in the load of integrating either verbal or gestural information into the prior speech context activated Broca's area and adjacent cortex (BA 45/47). Broca's area, a classical language area, is thus recruited not only for language-internal processing but also when action observation is integrated with speech. These findings provide direct evidence that action and language processing share a high-level neural integration system.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Brain Mapping
  • Cerebral Cortex / anatomy & histology
  • Cerebral Cortex / physiology*
  • Female
  • Gestures*
  • Humans
  • Language*
  • Magnetic Resonance Imaging
  • Male
  • Motor Activity / physiology
  • Semantics
  • Speech / physiology*