Processing of speech and nonspeech sounds occurs bilaterally within primary auditory cortex and surrounding regions of the superior temporal gyrus; however, the manner in which these regions interact during speech and nonspeech processing is not well understood. Here, we investigate the underlying neuronal architecture of the auditory system using magnetoencephalography and a mismatch paradigm. We used a spoken word as a repeating "standard" and periodically introduced 3 "oddball" stimuli that differed in the frequency spectrum of the word's vowel. The closest deviant was perceived as the same vowel as the standard, whereas the other 2 deviants were perceived as belonging to different vowel categories. The neuronal responses to these vowel stimuli were compared with responses elicited by perceptually matched tone stimuli under the same paradigm. For both speech and tones, deviant stimuli induced coupling changes within the same bilateral temporal lobe system. However, vowel oddball effects increased coupling within the left posterior superior temporal gyrus, whereas perceptually equivalent nonspeech oddball effects increased coupling within the right primary auditory cortex. Thus, we show a dissociation in neuronal interactions, occurring both at different hierarchical levels of the auditory system (superior temporal versus primary auditory cortex) and in different hemispheres (left versus right). This hierarchical specificity depends on whether auditory stimuli are embedded in a perceptual context (i.e., a word). Furthermore, our lateralization results suggest left hemisphere specificity for the processing of phonological stimuli, regardless of their elemental (i.e., spectrotemporal) characteristics.