Vision has traditionally been studied in stationary subjects observing stimuli, and rarely during navigation. Recent research using virtual reality environments for mice has revealed that responses even in the primary visual cortex are modulated by spatial context: identical scenes presented at different positions in a room can elicit different responses. Here, we review these results and discuss how information from visual areas can reach navigational areas of the brain. Based on the observation that mouse higher visual areas cover different parts of the visual field, we propose that spatial signals are processed along two streams defined by visual field coverage. Specifically, this hypothesis suggests that landmark-related signals are processed by areas biased toward the central visual field, and self-motion-related signals are processed by areas biased toward the peripheral visual field.
Copyright © 2020 Elsevier Ltd. All rights reserved.