Animals use spatial information from multiple sensory modalities to navigate their natural environments. However, it is unclear whether the brain encodes such information in separate cognitive maps or integrates it into a single, universal map. We address this question in the microcircuit of the medial entorhinal cortex (MEC), a brain region that supports a cognitive map of space. Using cellular-resolution calcium imaging, we examine the MEC of mice navigating virtual reality tracks in which visual and auditory cues provide comparable spatial information. We uncover two cell types: "unimodality cells" and "multimodality cells." Unimodality cells specifically represent either auditory or visual spatial information; they are anatomically intermingled and maintain their sensory preferences across multiple tracks and behavioral states. Multimodality cells respond to both sensory modalities, with their responses shaped differentially by auditory or visual information. Thus, the MEC enables accurate spatial encoding during multisensory navigation by computing spatial information from different sensory modalities and generating distinct maps.
Keywords: CP: Neuroscience; auditory; cognitive map; medial entorhinal cortex; multisensory; spatial navigation; two-photon imaging; unisensory; virtual reality; visual.