Dynamic EEG analysis during language comprehension reveals interactive cascades between perceptual processing and sentential expectations

Brain Lang. 2020 Dec;211:104875. doi: 10.1016/j.bandl.2020.104875. Epub 2020 Oct 18.

Abstract

Understanding spoken language requires analysis of the rapidly unfolding speech signal at multiple levels: acoustic, phonological, and semantic. However, there is not yet a comprehensive picture of how these levels relate. We recorded electroencephalography (EEG) while listeners (N = 31) heard sentences in which we manipulated acoustic ambiguity (e.g., a bees/peas continuum) and sentential expectations (e.g., Honey is made by bees). EEG was analyzed with a mixed effects model over time to quantify how language processing cascades proceed on a millisecond-by-millisecond basis. Our results indicate: (1) perceptual processing and memory for fine-grained acoustics are preserved in brain activity for up to 900 msec; (2) contextual analysis begins early and is graded with respect to the acoustic signal; and (3) top-down predictions influence perceptual processing in some cases; however, these predictions are available simultaneously with the veridical signal. These mechanistic insights provide a basis for a better understanding of the cortical language network.
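The time-resolved regression approach described in the abstract can be illustrated with a simplified sketch. This is not the authors' analysis: it uses simulated single-channel data and an ordinary least-squares fit at each timepoint, whereas the study fit mixed effects models to real multi-subject EEG. The variable names and the Gaussian-shaped effect are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 100 trials x 50 timepoints of single-channel "EEG",
# with a hypothetical acoustic predictor (e.g., step along a bees/peas
# voicing continuum), standardized to roughly [-1, 1].
n_trials, n_times = 100, 50
continuum_step = rng.uniform(-1, 1, n_trials)

# Assume the predictor's true effect on the signal rises and falls over
# the epoch (Gaussian bump peaking at timepoint 20) plus trial noise.
true_beta = np.exp(-((np.arange(n_times) - 20) ** 2) / 50.0)
eeg = (continuum_step[:, None] * true_beta[None, :]
       + rng.normal(0.0, 1.0, (n_trials, n_times)))

# Fit an ordinary least-squares regression independently at every
# timepoint, yielding a sample-by-sample time course of the acoustic
# predictor's effect on the brain response.
X = np.column_stack([np.ones(n_trials), continuum_step])  # intercept + slope
betas, *_ = np.linalg.lstsq(X, eeg, rcond=None)           # shape (2, n_times)
effect_timecourse = betas[1]                              # slope per timepoint

print(effect_timecourse.shape)
```

Running a separate model at each timepoint is what lets this style of analysis say *when* an effect (e.g., sensitivity to fine-grained acoustics) emerges and how long it persists; the mixed effects version additionally models subject- and item-level variability.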

Keywords: Electroencephalography; N100; N400; Predictive coding; Semantic integration; Speech perception; Top-down effects.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation / methods*
  • Adult
  • Auditory Perception / physiology
  • Comprehension / physiology*
  • Electroencephalography / methods*
  • Female
  • Humans
  • Language*
  • Male
  • Motivation / physiology*
  • Semantics
  • Speech Perception / physiology*