Sensory loss is associated with behavioural changes, but how behavioural dynamics change when a sensory modality is impaired remains unclear. Here, by recording, under a purpose-designed standardized scenario, the behavioural phenotypes of 4,196 infants with varying degrees of visual loss yet high behavioural plasticity, we show that behaviours occurring significantly more often in visually impaired infants can be identified, and that correlations between the frequency of specific behavioural patterns and the severity of visual impairment, as well as variations in behavioural dynamics with age, can be quantified. We also show that a deep-learning algorithm (a temporal segment network) trained on the full-length videos can discriminate, in an independent dataset of 400 infants, mild visual impairment from healthy behaviour (area under the curve (AUC) of 85.2%), severe visual impairment from mild impairment (AUC of 81.9%) and various ophthalmological conditions from healthy vision (AUCs of 81.6-93.0%). The video dataset of behavioural phenotypes in response to visual loss and the trained machine-learning algorithm should aid the study of visual function and behavioural plasticity in infants.
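The abstract names a temporal segment network (TSN) as the video classifier. As a hedged illustration of that architecture only, the following is a minimal PyTorch sketch of TSN-style inference: a full-length video is divided into K equal segments, one snippet is sparsely sampled from each, a shared 2D CNN scores every snippet, and the segment scores are averaged into a video-level prediction. The ResNet-18 backbone, segment count and binary head are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of temporal segment network (TSN)-style video classification,
# assuming PyTorch/torchvision; the backbone, segment count and head size are
# illustrative choices, not the configuration reported in the paper.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class TSN(nn.Module):
    """Score a video by averaging a shared 2D CNN over K sampled snippets."""

    def __init__(self, num_classes: int = 2, num_segments: int = 8):
        super().__init__()
        self.num_segments = num_segments
        backbone = resnet18(weights=None)  # shared backbone for all snippets
        backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
        self.backbone = backbone

    def forward(self, snippets: torch.Tensor) -> torch.Tensor:
        # snippets: (batch, num_segments, 3, H, W), one frame per segment
        b, k, c, h, w = snippets.shape
        logits = self.backbone(snippets.reshape(b * k, c, h, w))
        # Segmental consensus: average the per-segment class scores
        return logits.reshape(b, k, -1).mean(dim=1)


def sample_segment_indices(num_frames: int, num_segments: int) -> torch.Tensor:
    """Sample one random frame index from each equal-length segment, so a
    full-length video is covered sparsely at a fixed computational cost."""
    seg_len = max(num_frames // num_segments, 1)
    offsets = torch.randint(0, seg_len, (num_segments,))
    return (torch.arange(num_segments) * seg_len + offsets).clamp_max(num_frames - 1)


# Example: video-level scores for a small batch of dummy clips.
model = TSN(num_classes=2, num_segments=8).eval()
clips = torch.randn(4, 8, 3, 224, 224)  # 4 videos, 8 snippets each
with torch.no_grad():
    impaired_prob = torch.softmax(model(clips), dim=1)[:, 1]
```

On a held-out test set, per-video probabilities of this kind would be compared against clinical labels with a standard ROC analysis (for example, scikit-learn's roc_auc_score) to obtain AUCs such as those reported above; the sparse segment sampling is what makes training on full-length videos tractable.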