This study investigated the connectivity architecture of neural structures involved in the processing of emotional speech melody (prosody). Twenty-four subjects underwent event-related functional magnetic resonance imaging (fMRI) while rating the emotional valence of either the prosody or the semantics of binaurally presented adjectives. Conventional analysis of the fMRI data revealed activation within the right posterior middle temporal gyrus and bilateral inferior frontal cortex during evaluation of affective prosody, and within the left temporal pole, orbitofrontal cortex, and medial superior frontal cortex during judgment of affective semantics. Dynamic causal modeling (DCM) in combination with Bayes factors was used to compare competing neurophysiological models with different intrinsic connectivity structures and input regions within the network of brain regions underlying comprehension of affective prosody. Comparison at the group level revealed the superiority of a model in which the right temporal cortex serves as the input region over models in which one of the frontal areas is assumed to receive the external input. Moreover, models with parallel information transfer from the right temporal cortex to the frontal regions were superior to models in which the two frontal lobes accomplish serial processing steps. In conclusion, the connectivity analysis supports the view that evaluation of affective prosody requires prior analysis of acoustic features within the temporal cortex, and that the transfer of information from the temporal cortex to the frontal lobes occurs via parallel pathways.
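The Bayes-factor model comparison described above can be illustrated with a minimal sketch. A Bayes factor is the ratio of two models' evidences, approximated in DCM as the exponential of the difference of their log-evidences; at the group level, per-subject Bayes factors multiply, i.e., log-evidence differences sum. The function names and log-evidence values below are hypothetical illustrations, not data or procedures from this study:

```python
import math

def bayes_factor(log_ev_a, log_ev_b):
    """Bayes factor comparing model A to model B from their
    (approximated) log model evidences: BF = exp(lnE_A - lnE_B)."""
    return math.exp(log_ev_a - log_ev_b)

def group_bayes_factor(log_evs_a, log_evs_b):
    """Group Bayes factor: the product of per-subject Bayes factors,
    computed as exp of the summed log-evidence differences."""
    return math.exp(sum(a - b for a, b in zip(log_evs_a, log_evs_b)))

# Hypothetical per-subject log-evidences for two competing DCMs
# (a temporal-input model vs. a frontal-input model); values are
# illustrative only and do not come from the study.
temporal_input = [-310.2, -305.7, -298.4]
frontal_input = [-312.9, -307.1, -301.0]

gbf = group_bayes_factor(temporal_input, frontal_input)
print(f"Group Bayes factor (temporal vs. frontal input): {gbf:.1f}")
```

A group Bayes factor well above 1 favors the first model; by a common convention, values above 150 are taken as very strong evidence.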