Discrimination of the behavioural dynamics of visually impaired infants via deep learning

Nat Biomed Eng. 2019 Nov;3(11):860-869. doi: 10.1038/s41551-019-0461-9. Epub 2019 Oct 21.

Abstract

Sensory loss is associated with behavioural changes, but how behavioural dynamics change when a sensory modality is impaired remains unclear. Here, by recording, under a designed standardized scenario, the behavioural phenotypes of 4,196 infants who experienced varying degrees of visual loss but retained high behavioural plasticity, we show that behaviours with significantly higher occurrence in visually impaired infants can be identified, and that correlations between the frequency of specific behavioural patterns and visual-impairment severity, as well as variations in behavioural dynamics with age, can be quantified. We also show that a deep-learning algorithm (a temporal segment network) trained with the full-length videos can discriminate, for an independent dataset from 400 infants, mild visual impairment from healthy behaviour (area under the curve (AUC) of 85.2%), severe visual impairment from mild impairment (AUC of 81.9%), and various ophthalmological conditions from healthy vision (with AUCs ranging from 81.6% to 93.0%). The video dataset of behavioural phenotypes in response to visual loss and the trained machine-learning algorithm should help the study of visual function and behavioural plasticity in infants.
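To make the classification approach named in the abstract concrete, below is a minimal sketch of a temporal segment network (TSN)-style video classifier with an AUC evaluation. This is not the authors' released pipeline: the backbone (ResNet-18), the number of segments (8), the input resolution and the binary labels (healthy vs. visually impaired) are all illustrative assumptions. The core TSN idea it does reproduce is sampling snippets uniformly across the full-length video, scoring each with a shared 2D CNN, and averaging the per-snippet scores (segmental consensus) into one video-level prediction.

```python
# Sketch of a TSN-style video classifier, assuming a ResNet-18 backbone,
# 8 uniformly sampled segments and binary labels; none of these choices
# are taken from the paper itself.
import torch
import torch.nn as nn
from torchvision.models import resnet18
from sklearn.metrics import roc_auc_score


class TSN(nn.Module):
    def __init__(self, num_classes: int = 2, num_segments: int = 8):
        super().__init__()
        self.num_segments = num_segments
        backbone = resnet18(weights=None)  # shared 2D CNN applied to every snippet
        backbone.fc = nn.Linear(backbone.fc.in_features, num_classes)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_segments, 3, H, W) -- one RGB snippet per segment,
        # sampled uniformly across the full-length video.
        b, k, c, h, w = x.shape
        snippet_logits = self.backbone(x.view(b * k, c, h, w))
        # Segmental consensus: average per-snippet class scores over segments.
        return snippet_logits.view(b, k, -1).mean(dim=1)


if __name__ == "__main__":
    model = TSN().eval()
    # Stand-in batch: 4 videos, 8 segments each, 224x224 RGB snippets.
    videos = torch.randn(4, 8, 3, 224, 224)
    labels = torch.tensor([0, 1, 0, 1])  # 0 = healthy, 1 = visually impaired
    with torch.no_grad():
        probs = torch.softmax(model(videos), dim=1)[:, 1]
    # Video-level AUC, the metric reported in the abstract (e.g. 85.2%
    # for mild impairment vs. healthy behaviour).
    print("AUC:", roc_auc_score(labels.numpy(), probs.numpy()))
```

The averaging in `forward` is what lets a fixed-size network reason over videos of arbitrary length: each segment contributes one score, so longer recordings simply yield more snippets per segment to sample from, without changing the model.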

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Behavior*
  • Deep Learning*
  • Female
  • Humans
  • Infant
  • Machine Learning
  • Male
  • Neuronal Plasticity
  • Phenotype
  • Sensation
  • Videotape Recording
  • Vision Disorders / psychology*
  • Visually Impaired Persons / psychology*