Quantification of facial expressions using high-dimensional shape transformations

J Neurosci Methods. 2005 Jan 30;141(1):61-73. doi: 10.1016/j.jneumeth.2004.05.016.

Abstract

We present a novel methodology for quantitative analysis of changes in facial display as the intensity of an emotion evolves from neutral to peak expression. The face is modeled as a combination of regions and their boundaries. An expression change in a face is characterized and quantified through a combination of non-rigid (elastic) deformations, i.e., expansions and contractions of these facial regions. After elastic interpolation, this yields a geometry-based, high-dimensional 2D shape transformation, which is used to register regions defined on subjects (faces displaying an expression) to those defined on the reference template face (a neutral face). This shape transformation produces a vector-valued deformation field and is used to define a scalar-valued regional volumetric difference (RVD) function, which characterizes and quantifies the facial expression. The approach is applied to a standardized database consisting of single images of professional actors expressing emotions at predefined intensities. We perform a detailed analysis of the deformations generated and the regional volumetric differences computed for each expression. We were able to quantify subtle changes in expression that distinguish the intended emotions. A model of the average expression for specific emotions was also constructed using the RVD maps. This method can be applied in basic and clinical investigations of facial affect and its neural substrates.
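To make the pipeline described above concrete, the sketch below shows one plausible way a scalar RVD-style measure could be derived from a dense 2D deformation field: the local area change of the shape transformation is estimated from its Jacobian determinant and then averaged within labeled facial regions. This is not the authors' implementation; the field layout, region labels, sign convention, and all function names are illustrative assumptions.

```python
import numpy as np


def jacobian_determinant(def_field):
    """Local area-change factor of a dense 2D transformation.

    def_field : (H, W, 2) array giving, at each pixel of the subject image,
                the mapped (row, col) coordinates in the template.
    Returns an (H, W) array of Jacobian determinants; values > 1 indicate
    local expansion, values < 1 indicate contraction.
    """
    dy_dr, dy_dc = np.gradient(def_field[..., 0])  # derivatives of mapped row
    dx_dr, dx_dc = np.gradient(def_field[..., 1])  # derivatives of mapped col
    return dy_dr * dx_dc - dy_dc * dx_dr


def regional_volumetric_difference(def_field, region_labels):
    """Average area change per labeled facial region (0 = no change)."""
    jac = jacobian_determinant(def_field)
    rvd = {}
    for label in np.unique(region_labels):
        if label == 0:                      # 0 = background / unlabeled pixels
            continue
        mask = region_labels == label
        rvd[int(label)] = float(jac[mask].mean() - 1.0)
    return rvd


if __name__ == "__main__":
    # Toy example: identity mapping plus a small expansion around a
    # hypothetical "cheek" region.
    h, w = 64, 64
    rows, cols = np.mgrid[0:h, 0:w].astype(float)
    field = np.stack([rows, cols], axis=-1)
    cy, cx = 40.0, 32.0
    field[..., 0] += 0.05 * (rows - cy)     # mild expansion about (cy, cx)
    field[..., 1] += 0.05 * (cols - cx)

    labels = np.zeros((h, w), dtype=int)
    labels[30:50, 20:44] = 1                # hypothetical "cheek" region
    print(regional_volumetric_difference(field, labels))
```

Under these assumptions, the toy expansion yields a positive RVD for the labeled region (roughly +0.10, since the local area scale is about 1.05 in each direction); an elastic registration toward a neutral template could equally well adopt the opposite sign convention.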

MeSH terms

  • Adolescent
  • Adult
  • Aged
  • Algorithms*
  • Emotions / physiology
  • Facial Expression*
  • Female
  • Humans
  • Image Processing, Computer-Assisted / instrumentation
  • Image Processing, Computer-Assisted / methods*
  • Male
  • Middle Aged
  • Photography / methods*
  • Social Behavior