The medial entorhinal cortex encodes multisensory spatial information

Cell Rep. 2024 Oct 22;43(10):114813. doi: 10.1016/j.celrep.2024.114813. Epub 2024 Oct 11.

Abstract

Animals employ spatial information from multiple sensory modalities to navigate their natural environments. However, it is unclear whether the brain encodes such information in separate cognitive maps or integrates it into a single, universal map. We address this question in the microcircuit of the medial entorhinal cortex (MEC), a cognitive map of space. Using cellular-resolution calcium imaging, we examine the MEC of mice navigating virtual reality tracks, where visual and auditory cues provide comparable spatial information. We uncover two cell types: "unimodality cells" and "multimodality cells." The unimodality cells specifically represent either auditory or visual spatial information. They are anatomically intermingled and maintain their sensory preferences across multiple tracks and behavioral states. The multimodality cells respond to both sensory modalities, with responses shaped differentially by auditory and visual information. Thus, the MEC enables accurate spatial encoding during multisensory navigation by computing spatial information from different sensory modalities and generating distinct maps.

Keywords: CP: Neuroscience; auditory; cognitive map; medial entorhinal cortex; multisensory; spatial navigation; two-photon imaging; unisensory; virtual reality; visual.

MeSH terms

  • Animals
  • Auditory Perception / physiology
  • Entorhinal Cortex* / physiology
  • Male
  • Mice
  • Mice, Inbred C57BL
  • Neurons / physiology
  • Space Perception / physiology
  • Visual Perception / physiology