Efficient Hessian computation using sparse matrix derivatives in RAM notation

Behav Res Methods. 2014 Jun;46(2):385-95. doi: 10.3758/s13428-013-0384-4.

Abstract

This article proposes a new, more efficient method to compute the minus two log likelihood, its gradient, and the Hessian for structural equation models (SEMs) in reticular action model (RAM) notation. The method exploits the fact that the matrix derivatives arising in RAM notation are sparse. For an SEM with K variables, P parameters, and P' entries of the symmetric or asymmetric RAM matrices filled with parameters, the asymptotic run time of the algorithm is O(P'K² + P²K² + K³). The naive implementation and numerical implementations are both O(P²K³), so that for typical applications of SEM, the proposed algorithm is asymptotically K times faster than the best previously known algorithm. A simulation comparison with a numerical algorithm shows that the asymptotic efficiency translates into a practical computational advantage that is crucial for maximum likelihood estimation, even in small SEMs, but especially in moderate or large ones.
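
As a rough illustration of the kind of computation the abstract describes, the sketch below evaluates the ML fit function and its gradient for a RAM-specified SEM in Python/NumPy, exploiting the fact that each parameter's derivative of the asymmetric (A) or symmetric (S) matrix has only one or two nonzero entries. This is not the authors' implementation: all names (fit_and_gradient, A_entries, S_entries, and the toy one-factor example) are illustrative assumptions, and the article's main contribution, the analogous sparse treatment of the Hessian, is not shown here.

```python
# Illustrative sketch only -- not the article's algorithm or code.
import numpy as np

def fit_and_gradient(theta, A_entries, S_entries, F, C):
    """ML fit function ln|Sigma| + tr(Sigma^{-1} C) and its gradient for a
    RAM-specified SEM with implied covariance
        Sigma = F (I - A)^{-1} S (I - A)^{-T} F^T.
    Up to constants and the mean structure, -2 log likelihood is
    proportional to this fit value.

    A_entries[p] / S_entries[p]: (row, col) positions in A or S occupied by
    parameter theta[p].  Because each parameter fills only one or two cells,
    the matrix derivatives dA/dtheta_p and dS/dtheta_p are extremely sparse,
    which is what keeps the per-parameter work cheap.
    """
    K = F.shape[1]                       # total number of variables (K)
    P = len(theta)                       # number of parameters (P)
    A = np.zeros((K, K))
    S = np.zeros((K, K))
    for p in range(P):
        for (i, j) in A_entries[p]:
            A[i, j] = theta[p]
        for (i, j) in S_entries[p]:
            S[i, j] = theta[p]
            S[j, i] = theta[p]           # S is symmetric

    B = np.linalg.inv(np.eye(K) - A)     # (I - A)^{-1}, one O(K^3) inverse
    Sigma = F @ B @ S @ B.T @ F.T
    Sigma_inv = np.linalg.inv(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    fit = logdet + np.trace(Sigma_inv @ C)

    # Directional derivative of the fit is tr(W dSigma) with this weight matrix.
    W = Sigma_inv - Sigma_inv @ C @ Sigma_inv

    # Precompute two "bracket" matrices once (O(K^3)); afterwards every
    # single-entry derivative of A or S costs only a lookup.
    FB = F @ B
    M = (B @ S @ B.T) @ F.T @ W @ FB     # handles entries of A
    Q = FB.T @ W @ FB                    # handles entries of S

    grad = np.zeros(P)
    for p in range(P):
        g = 0.0
        for (i, j) in A_entries[p]:
            g += 2.0 * M[j, i]           # A-term plus its transpose
        for (i, j) in S_entries[p]:
            g += Q[i, j] + (Q[j, i] if i != j else 0.0)
        grad[p] = g
    return fit, grad

# Toy usage: one-factor model, three manifest variables (0-2), one latent (3).
F = np.hstack([np.eye(3), np.zeros((3, 1))])                        # filter matrix
A_entries = [[(0, 3)], [(1, 3)], [(2, 3)], [], [], [], []]          # loadings
S_entries = [[], [], [], [(0, 0)], [(1, 1)], [(2, 2)], [(3, 3)]]    # variances
theta = np.array([1.0, 0.8, 0.6, 0.5, 0.5, 0.5, 1.0])
C = np.cov(np.random.default_rng(0).normal(size=(200, 3)), rowvar=False)
print(fit_and_gradient(theta, A_entries, S_entries, F, C))
```

In this sketch the O(K³) work (inverting I - A and forming the bracket matrices) is done once per function evaluation, after which each parameter contributes only a constant number of lookups, which mirrors the kind of savings the abstract attributes to sparse RAM matrix derivatives.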

Publication types

  • Validation Study

MeSH terms

  • Algorithms*
  • Computer Simulation
  • Humans
  • Likelihood Functions*
  • Linear Models
  • Models, Psychological*
  • Models, Statistical*
  • Probability