Generative adversarial networks to create synthetic motion capture datasets including subject and gait characteristics

J Biomech. 2024 Dec;177:112358. doi: 10.1016/j.jbiomech.2024.112358. Epub 2024 Oct 4.

Abstract

Resource-intensive motion capture (mocap) systems pose a challenge for predictive deep learning applications, which require large and diverse datasets. We tackled this by modifying generative adversarial networks (GANs) into conditional GANs (cGANs) that can generate diverse mocap data, including 15 marker trajectories, lower-limb joint angles, and 3D ground reaction forces (GRFs), based on specified subject and gait characteristics. The cGAN comprised 1) an encoder compressing mocap data to a latent vector, 2) a decoder reconstructing the mocap data from the latent vector under specified conditions, and 3) a discriminator distinguishing random vectors paired with conditions from encoded latent vectors paired with conditions. Single-conditional models were trained separately for age, sex, leg length, mass, and walking speed, while an additional model (Multi-cGAN) combined all conditions simultaneously to generate synthetic data. All models closely replicated the training dataset (experimental and synthetic kinematics and GRFs differed over <8.1 % of the gait cycle), while a subset with narrow condition ranges was best replicated by the Multi-cGAN, which produced similar kinematics (<1°) and GRFs (<0.02 body weight) when averaged by walking speed. The Multi-cGAN also generated synthetic datasets and results for three previous studies using the reported means and standard deviations of subject and gait characteristics. Additionally, unseen test data were best predicted by the walking-speed-conditional model, showcasing the diversity of the synthetic data. The same model also matched the dynamical consistency of the experimental data (32 % average difference throughout the gait cycle), meaning that transforming the gait cycle data back to the original time domain yielded accurate derivative calculations. Importantly, synthetic data poses no privacy concerns, potentially facilitating data sharing.
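The three-part architecture described above (encoder, condition-aware decoder, and a discriminator acting on latent vectors paired with conditions) resembles a conditional adversarial autoencoder. The sketch below is a minimal illustration of that structure, assuming a PyTorch implementation; the paper does not specify the framework, and the dimensions N_FEATURES, N_COND, and LATENT are illustrative placeholders, not the authors' values. After training, sampling random latent vectors and passing them through the decoder with desired subject and gait conditions would yield synthetic trials.

```python
# Hypothetical sketch of the encoder/decoder/discriminator layout described in the abstract.
# All layer sizes and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn

N_FEATURES = 100 * 66   # assumed: 100 gait-cycle points x 66 channels (markers, angles, GRFs)
N_COND = 5              # conditions: age, sex, leg length, mass, walking speed
LATENT = 32             # assumed latent-vector size

class Encoder(nn.Module):
    """Compresses a flattened mocap trial to a latent vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(N_FEATURES, 256), nn.ReLU(),
                                 nn.Linear(256, LATENT))
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs mocap data from a latent vector concatenated with conditions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT + N_COND, 256), nn.ReLU(),
                                 nn.Linear(256, N_FEATURES))
    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=1))

class Discriminator(nn.Module):
    """Distinguishes random (prior) latent vectors from encoded ones, both paired with conditions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT + N_COND, 128), nn.ReLU(),
                                 nn.Linear(128, 1), nn.Sigmoid())
    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=1))

# Generating synthetic gait data for requested conditions (post-training usage):
enc, dec, disc = Encoder(), Decoder(), Discriminator()
z = torch.randn(8, LATENT)      # random latent vectors
c = torch.rand(8, N_COND)       # normalized condition values (e.g., walking speed, mass)
synthetic_trials = dec(z, c)    # shape: (8, N_FEATURES)
```

In this reading, the discriminator shapes the latent space so that encoded trials are indistinguishable from the random prior for any given condition vector, which is what lets the decoder generate plausible data for user-specified subject and gait characteristics.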

Keywords: Conditional Generative Adversarial Networks; Gait; Synthetic Mocap Dataset.

MeSH terms

  • Adult
  • Biomechanical Phenomena
  • Deep Learning
  • Female
  • Gait* / physiology
  • Humans
  • Male
  • Middle Aged
  • Motion Capture
  • Neural Networks, Computer
  • Walking / physiology