Coding of vocalizations by single neurons in ventrolateral prefrontal cortex

Hear Res. 2013 Nov;305:135-43. doi: 10.1016/j.heares.2013.07.011. Epub 2013 Jul 26.

Abstract

The activity of single prefrontal neurons has been correlated with behavioral responses, rules, task variables and stimulus features. In the non-human primate, neurons recorded in ventrolateral prefrontal cortex (VLPFC) respond to species-specific vocalizations, and previous studies have identified multisensory neurons in this region that respond to simultaneously presented faces and vocalizations. Behavioral data suggest that face and vocal information are inextricably linked in animals and humans and may therefore also be tightly linked in the coding of communication calls by prefrontal neurons. In this study we examined the role of VLPFC in encoding vocalization call type. Specifically, we analyzed previously recorded single-unit responses from the VLPFC of awake, behaving rhesus macaques to three types of species-specific vocalizations made by three individual callers. Analysis of responses by call type and caller identity showed that ∼19% of cells had a main effect of call type, with fewer cells encoding caller identity. Classification performance of VLPFC neurons averaged ∼42% across the population. When assessed in discrete time bins, classification performance reached 70% for coos in the first 300 ms and remained above chance for the duration of the response period, although performance was lower for the other call types. In light of the sub-optimal classification performance of most VLPFC neurons when only vocal information is present, and recent evidence that most VLPFC neurons are multisensory, we discuss the potential enhancement of classification by accompanying face information and recommend additional studies. Behavioral and neuronal evidence has shown a considerable benefit in recognition and memory performance when faces and voices are presented simultaneously. In the natural environment, facial and vocal information are present simultaneously, and neural systems no doubt evolved to integrate multisensory stimuli during recognition. This article is part of a Special Issue entitled "Communication Sounds and the Brain: New Directions and Perspectives".
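To make the decoding analysis described above concrete, the sketch below illustrates how call type can be classified from a single neuron's binned spike counts. The abstract does not specify the classifier, bin width, trial counts, or firing rates used in the paper; the Gaussian naive Bayes decoder, leave-one-out cross-validation, 300 ms bins, and simulated Poisson data here are all assumptions for demonstration only, not the authors' method.

```python
# Minimal illustrative sketch: decoding vocalization call type from a single
# neuron's binned spike counts. All parameters and data below are hypothetical.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

call_types = ["coo", "grunt", "scream"]   # three call types, as in the study
n_trials_per_type = 20                    # hypothetical trial count
n_bins = 4                                # e.g., four 300 ms bins after stimulus onset

# Simulated mean spike counts per bin for each call type (assumed values).
rates = {"coo": [8, 6, 4, 3], "grunt": [4, 4, 4, 4], "scream": [2, 5, 7, 6]}

X, y = [], []
for label in call_types:
    # Poisson spike counts per trial and time bin for this call type.
    counts = rng.poisson(rates[label], size=(n_trials_per_type, n_bins))
    X.append(counts)
    y += [label] * n_trials_per_type
X = np.vstack(X)
y = np.array(y)

# Leave-one-out cross-validated classification of call type from binned counts;
# chance level is 1/3 for three call types.
scores = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut())
print(f"Decoding accuracy: {scores.mean():.2f} (chance = {1/len(call_types):.2f})")
```

Restricting the feature vector to a single bin (one column of X) and repeating the fit per bin gives the kind of time-resolved accuracy curve described in the abstract, with accuracy compared against the 1/3 chance level at each bin.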

Publication types

  • Research Support, N.I.H., Extramural
  • Review

MeSH terms

  • Acoustic Stimulation
  • Animals
  • Auditory Pathways / physiology*
  • Auditory Perception*
  • Macaca mulatta / physiology*
  • Neurons / physiology*
  • Pattern Recognition, Physiological*
  • Prefrontal Cortex / physiology*
  • Recognition, Psychology
  • Social Behavior
  • Time Factors
  • Vocalization, Animal*