Background: The atherogenic index of plasma (AIP), defined as the base-10 logarithm of the ratio of plasma triglyceride (TG) to high-density lipoprotein cholesterol (HDL-C), has been employed as a predictor of cardiovascular risk. We seek to quantify the analytical precision of the AIP in terms of the coefficients of variation (CVs) of the TG and HDL-C assays.
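As a brief illustration of the definition, the sketch below computes the AIP from hypothetical TG and HDL-C concentrations; the numerical values are assumptions for illustration only, and the ratio is dimensionless provided both analytes are expressed in the same units.

```python
import math

def aip(tg_mmol_l: float, hdl_c_mmol_l: float) -> float:
    """Atherogenic index of plasma: base-10 logarithm of the TG/HDL-C ratio.

    Both concentrations must be in the same units (e.g., mmol/L) so that
    the ratio, and hence the logarithm, is dimensionless.
    """
    return math.log10(tg_mmol_l / hdl_c_mmol_l)

# Hypothetical example values, for illustration only.
print(aip(1.7, 1.0))  # TG 1.7 mmol/L, HDL-C 1.0 mmol/L -> AIP ≈ 0.23
```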
Methods: Error propagation methods are employed to develop a simple formula for the standard deviation of the random analytical error in the AIP, assuming that the errors in the TG and HDL-C assays are normally distributed. An alternative derivation, which assumes log-normally distributed errors, gives nearly identical results while avoiding subtle technical problems.
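A sketch of the first-order error-propagation (delta-method) step, under the stated assumption of small, independent, normally distributed assay errors (the notation here is ours):

```latex
% First-order (delta-method) propagation of assay error into the AIP
\begin{align*}
\mathrm{AIP} &= \log_{10}\!\frac{\mathrm{TG}}{\mathrm{HDL\text{-}C}}
             = \frac{1}{\ln 10}\left(\ln \mathrm{TG} - \ln \mathrm{HDL\text{-}C}\right),\\
\operatorname{Var}(\ln X) &\approx \frac{\operatorname{Var}(X)}{X^{2}} = \mathrm{CV}_{X}^{2}
  \quad\text{(first-order Taylor expansion of } \ln X \text{ for small CV)},\\
\sigma_{\mathrm{AIP}} &\approx \frac{1}{\ln 10}
  \sqrt{\mathrm{CV}_{\mathrm{TG}}^{2} + \mathrm{CV}_{\mathrm{HDL\text{-}C}}^{2}}.
\end{align*}
```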
Results: The SD of the AIP is given by $\sigma_{\mathrm{AIP}} \approx \frac{1}{\ln 10}\sqrt{\mathrm{CV}_{\mathrm{TG}}^{2} + \mathrm{CV}_{\mathrm{HDL\text{-}C}}^{2}}$. Compared with Monte Carlo simulation, this formula yields SD results accurate to within 0.4% when the CVs of the TG and HDL-C assays are less than 5%. We also explain that the concept of a CV cannot be applied to the AIP itself, because the AIP is a logarithm.
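A minimal Monte Carlo sketch of the kind of comparison described above; the CV values, mean concentrations, and sample size are illustrative assumptions, not the actual simulation settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigma_aip_formula(cv_tg: float, cv_hdl: float) -> float:
    """Propagated SD of the AIP from the assay CVs (as fractions, e.g. 0.03 = 3%)."""
    return np.sqrt(cv_tg**2 + cv_hdl**2) / np.log(10)

def sigma_aip_monte_carlo(cv_tg, cv_hdl, tg_mean=1.7, hdl_mean=1.0, n=1_000_000):
    """Empirical SD of the AIP when assay errors are normally distributed."""
    tg = rng.normal(tg_mean, cv_tg * tg_mean, n)
    hdl = rng.normal(hdl_mean, cv_hdl * hdl_mean, n)
    return np.log10(tg / hdl).std()

cv_tg, cv_hdl = 0.03, 0.04                    # 3% and 4% assay CVs (illustrative)
print(sigma_aip_formula(cv_tg, cv_hdl))       # ≈ 0.0217
print(sigma_aip_monte_carlo(cv_tg, cv_hdl))   # should agree closely with the formula
```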
Conclusions: The formula provides a simple means of quantifying the precision of the AIP from the precision data available for the TG and HDL-C assays.