This study examines the neural computations by which auditory neurons become selective for the direction and velocity of signals sweeping upward or downward in frequency, termed spectral motion. We show that neurons in the auditory midbrain of Mexican free-tailed bats encode multiple spectrotemporal features of natural communication sounds. The features to which each neuron is tuned are combined nonlinearly to produce selectivity for the spectral motion cues present in conspecific calls, such as sweep direction and velocity. We find that the neural computations underlying this selectivity are analogous to models of motion selectivity studied in vision. Our analysis reveals that neurons in the inferior colliculus (IC) avoid spectrotemporal modulations that are redundant across different bat communication signals and are instead tuned to modulations that distinguish calls from one another by their frequency-modulation direction and velocity, suggesting that spectral motion detection is the computation through which IC neurons encode specific features of conspecific vocalizations.
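The analogy to visual motion models can be illustrated with a minimal sketch. The code below applies an Adelson–Bergen-style motion-energy computation to a toy spectrogram: a quadrature pair of oriented filters in the frequency–time plane is squared and summed, yielding a phase-invariant response that is larger for FM sweeps in the preferred direction. The filter shape, parameters (`k`, `sigma`), and the synthetic stimulus are illustrative assumptions, not the fitted models from this study.

```python
import numpy as np

def sweep_filter(n, slope, phase, k=0.8, sigma=6.0):
    # Gabor-like filter in the frequency-time plane whose excitatory
    # band is aligned with FM sweeps of the given slope (+1 = upward,
    # -1 = downward). Coordinates are centered on the filter.
    f, t = np.mgrid[0:n, 0:n] - n / 2
    carrier = np.cos(k * (f - slope * t) + phase)
    envelope = np.exp(-(f**2 + t**2) / (2 * sigma**2))
    return carrier * envelope

def motion_energy(spec, slope):
    # Quadrature pair (phases 0 and pi/2): summing the squared linear
    # responses gives a phase-invariant "energy", the nonlinear
    # combination step of the motion-energy model.
    n = spec.shape[0]
    return sum(np.sum(sweep_filter(n, slope, p) * spec) ** 2
               for p in (0.0, np.pi / 2))

# Toy spectrogram (rows = frequency, columns = time) containing a
# single upward FM sweep: frequency rises linearly with time.
n = 32
spec = np.zeros((n, n))
for t in range(n):
    spec[t, t] = 1.0

e_up = motion_energy(spec, +1.0)
e_down = motion_energy(spec, -1.0)
dsi = (e_up - e_down) / (e_up + e_down)  # direction selectivity index
print(e_up > e_down, round(dsi, 3))
```

For the upward sweep, the upward-oriented filter responds strongly while the downward-oriented one largely cancels along the ridge, so the direction index is strongly positive; swapping the stimulus slope reverses the sign.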