Early decreases in alpha and gamma band power distinguish linguistic from visual information during spoken sentence comprehension

Brain Res. 2008 Jul 11;1219:78-90. doi: 10.1016/j.brainres.2008.04.065. Epub 2008 May 1.

Abstract

Language is often perceived together with visual information. This raises the question of how the brain integrates information conveyed in visual and/or linguistic format during spoken language comprehension. In this study we investigated the dynamics of semantic integration of visual and linguistic information by means of time-frequency analysis of the EEG signal. We employed a modified version of the N400 paradigm in which either a word or a picture of an object was semantically incongruous with the preceding sentence context. Event-Related Potential (ERP) analysis showed qualitatively similar N400 effects for the integration of either words or pictures. Time-frequency analysis revealed early, specific decreases in alpha and gamma band power for linguistic and visual information, respectively. We argue that these decreases reflect a rapid, context-based analysis of acoustic (word) or visual (picture) form information. We conclude that although full semantic integration of linguistic and visual information occurs through a common mechanism, early differences in oscillations in specific frequency bands reflect the format of the incoming information and, importantly, an early, context-based detection of its congruity with the preceding language context.
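As a rough illustration of the kind of band-power measure referred to in the abstract, the sketch below estimates baseline-relative alpha- and gamma-band power around stimulus onset for a single EEG channel. It is not the authors' analysis pipeline; the band limits, sampling rate, filter settings, and the synthetic signal are illustrative assumptions, and the band-pass-plus-Hilbert approach stands in for whatever time-frequency decomposition the study actually used.

```python
# Minimal sketch (assumed parameters, not the authors' pipeline): alpha- and
# gamma-band power changes around stimulus onset, via band-pass + Hilbert envelope.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                           # sampling rate in Hz (assumed)
t = np.arange(-0.5, 1.0, 1 / fs)     # epoch from -500 ms to +1000 ms
rng = np.random.default_rng(0)
eeg = rng.standard_normal(t.size)    # stand-in for one epoched EEG channel

def band_power(signal, low, high, fs, order=4):
    """Instantaneous power in [low, high] Hz via band-pass filter + Hilbert envelope."""
    b, a = butter(order, [low, high], btype="band", fs=fs)
    analytic = hilbert(filtfilt(b, a, signal))
    return np.abs(analytic) ** 2

def baseline_relative(power, t, baseline=(-0.5, 0.0)):
    """Express power as percent change relative to the pre-stimulus baseline window."""
    mask = (t >= baseline[0]) & (t < baseline[1])
    ref = power[mask].mean()
    return 100.0 * (power - ref) / ref

alpha = baseline_relative(band_power(eeg, 8, 12, fs), t)    # alpha band (8-12 Hz, assumed)
gamma = baseline_relative(band_power(eeg, 30, 70, fs), t)   # gamma band (30-70 Hz, assumed)

# A post-onset power decrease appears as a negative percent change, analogous to
# the early alpha/gamma decreases described in the abstract.
print("mean post-stimulus alpha change: %.1f%%" % alpha[t >= 0].mean())
print("mean post-stimulus gamma change: %.1f%%" % gamma[t >= 0].mean())
```

In practice one would average such baseline-normalized power over trials and electrodes before comparing conditions; the single synthetic epoch here only shows the shape of the computation.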

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adolescent
  • Adult
  • Brain Mapping
  • Comprehension / physiology*
  • Electroencephalography*
  • Female
  • Humans
  • Male
  • Photic Stimulation / methods
  • Psycholinguistics*
  • Reaction Time / physiology
  • Semantics*
  • Spectrum Analysis
  • Speech Perception / physiology*
  • Time Factors
  • Vision, Ocular / physiology*