Background: Whether including additional environmental risk factors improves cardiovascular disease (CVD) prediction is unclear. We attempted to improve CVD mortality prediction beyond traditional CVD risk factors by additionally incorporating metals measured in urine and blood and by applying statistical machine learning methods.
Methods: Our sample included 7,085 U.S. adults aged 40 years or older from the National Health and Nutrition Examination Survey 2003-2004 through 2015-2016, linked with the National Death Index through December 31, 2019. Data were randomly split 50/50 into a training dataset used to construct CVD mortality prediction models (n = 3,542) and a testing dataset used to validate prediction performance (n = 3,543). We compared models containing only the traditional risk factors (age, sex, race/ethnicity, smoking status, systolic blood pressure, total and high-density lipoprotein cholesterol, hypertension, and diabetes) with models that additionally included 17 blood and urinary metal concentrations. To build the prediction models, we used Cox proportional hazards, elastic-net (ENET) penalized Cox, and random survival forest methods.
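The following is a minimal sketch of this type of modeling pipeline using the scikit-survival package, not the authors' code: the synthetic predictors, event indicator, follow-up times, and tuning values (l1_ratio, n_estimators) are illustrative assumptions only.

import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sksurv.util import Surv
from sksurv.linear_model import CoxPHSurvivalAnalysis, CoxnetSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored

# Synthetic stand-ins for the real predictors and outcomes (assumption, not NHANES data)
rng = np.random.default_rng(0)
n = 7085
X = pd.DataFrame(rng.normal(size=(n, 10)), columns=[f"x{i}" for i in range(10)])
time = rng.exponential(scale=10, size=n)      # follow-up time in years
event = rng.binomial(1, 0.06, size=n)         # 1 = CVD death, 0 = censored
y = Surv.from_arrays(event=event.astype(bool), time=time)

# 50/50 split: training set to construct models, testing set to validate performance
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0, stratify=event
)

models = {
    "cox": CoxPHSurvivalAnalysis(),                                   # Cox proportional hazards
    "enet_cox": CoxnetSurvivalAnalysis(l1_ratio=0.5),                 # elastic-net penalized Cox
    "rsf": RandomSurvivalForest(n_estimators=100, random_state=0),    # random survival forest
}

for name, model in models.items():
    model.fit(X_train, y_train)
    risk = model.predict(X_test)  # higher value = higher predicted risk
    cindex = concordance_index_censored(y_test["event"], y_test["time"], risk)[0]
    print(f"{name}: C-index = {cindex:.3f}")

In practice, the base model would use only the traditional risk factors and the comparison models would add the metal concentrations, with the C-index computed on the held-out testing set for each.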
Results: Over a mean follow-up of 8.8 years, 420 participants died from CVD. Blood lead, cadmium, and mercury were associated with CVD mortality (p < 0.005). Adding these blood metals to a Cox model containing only the traditional risk factors raised the C-index from 0.845 to 0.847. Additionally, the Net Reclassification Index showed that 23% of participants received a more accurate risk prediction. Further inclusion of urinary metals improved risk reclassification but not risk discrimination.
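To illustrate the reclassification metric, a categorical Net Reclassification Index can be computed from the predicted risks of the two nested models. This sketch uses assumed risk-category cutoffs and synthetic inputs (not the paper's values) and ignores censoring, unlike a proper time-to-event NRI.

import numpy as np

def categorical_nri(risk_old, risk_new, event, cutoffs=(0.05, 0.10, 0.20)):
    """Categorical NRI for nested models with fixed risk-category cutoffs.

    risk_old, risk_new: predicted risks without / with the added metals
    event: 1 if the participant died from CVD during follow-up, else 0
    cutoffs: illustrative category boundaries (assumption, not from the paper)
    """
    cat_old = np.digitize(risk_old, cutoffs)
    cat_new = np.digitize(risk_new, cutoffs)
    up, down = cat_new > cat_old, cat_new < cat_old
    ev, ne = event == 1, event == 0
    nri_events = up[ev].mean() - down[ev].mean()       # events should move to higher categories
    nri_nonevents = down[ne].mean() - up[ne].mean()    # non-events should move to lower categories
    return nri_events + nri_nonevents

# Illustrative usage with synthetic predicted risks
rng = np.random.default_rng(1)
event = rng.binomial(1, 0.06, size=3543)
risk_old = rng.uniform(0, 0.4, size=3543)
risk_new = np.clip(risk_old + rng.normal(0, 0.02, size=3543), 0, 1)
print(f"NRI = {categorical_nri(risk_old, risk_new, event):.3f}")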
Conclusions: Incorporating blood metals slightly improved CVD mortality risk discrimination, while blood and urinary metals enhanced risk reclassification, highlighting their potential utility in improving cardiovascular risk assessments.
Keywords: Cardiovascular Disease; Environmental exposures; Machine learning; Metal mixtures; Mortality; NHANES.