Assessing the utility of a virtual environment for enhancing facial affect recognition in adolescents with autism

J Autism Dev Disord. 2014 Jul;44(7):1641-50. doi: 10.1007/s10803-014-2035-8.

Abstract

Teenagers with autism spectrum disorder (ASD) and age-matched controls participated in a dynamic facial affect recognition task within a virtual reality (VR) environment. Participants identified the emotion of a facial expression displayed at varied levels of intensity by a computer-generated avatar. The system assessed performance (i.e., accuracy, confidence ratings, response latency, and stimulus discrimination) and, using an eye tracker, how participants deployed their gaze to process facial information. Participants in both groups were similarly accurate at basic facial affect recognition across the varied levels of intensity. Despite these similar performance characteristics, participants with ASD reported lower confidence in their responses and showed substantial variation in gaze patterns, in the absence of perceptual discrimination deficits. These results support the hypothesis that deficits in emotion and face recognition in individuals with ASD are related to fundamental differences in information processing. We discuss the implications of this finding in a VR environment with regard to potential future applications and paradigms targeting not just enhanced performance, but enhanced social information processing, within intelligent systems capable of adapting to individual processing differences.
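To make the outcome measures named in the abstract concrete, the following is a minimal sketch, not the authors' system, of how trial-level data from such a task might be summarized per participant. All field names, the summary function, and the sample values are hypothetical illustrations of the measures listed above: accuracy, confidence ratings, response latency, and gaze allocation from eye tracking.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class Trial:
    # Hypothetical trial record; not the authors' data format.
    emotion_shown: str         # emotion displayed by the avatar
    intensity: float           # expression intensity (e.g., 0.0-1.0)
    emotion_chosen: str        # participant's response
    confidence: int            # self-rated confidence (assumed ordinal scale)
    latency_ms: float          # response latency in milliseconds
    face_fixation_prop: float  # proportion of gaze time on the face region


def summarize(trials: list[Trial]) -> dict[str, float]:
    """Aggregate the performance and gaze measures described in the abstract."""
    return {
        "accuracy": mean(t.emotion_chosen == t.emotion_shown for t in trials),
        "mean_confidence": mean(t.confidence for t in trials),
        "mean_latency_ms": mean(t.latency_ms for t in trials),
        "mean_face_fixation": mean(t.face_fixation_prop for t in trials),
    }


# Example with two hypothetical trials
trials = [
    Trial("happy", 0.50, "happy", 6, 1450.0, 0.82),
    Trial("fear", 0.25, "surprise", 3, 2300.0, 0.61),
]
print(summarize(trials))
```

Such per-participant summaries would then feed group comparisons (ASD vs. controls) on each measure, which is the level at which the abstract reports its findings.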

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adolescent
  • Child Development Disorders, Pervasive / psychology*
  • Face*
  • Facial Expression
  • Female
  • Humans
  • Male
  • Recognition, Psychology / physiology*