Objectives: This study aimed to evaluate the ability of the subendocardial viability ratio (SEVR) to predict the degree of coronary artery stenosis and to examine the relationship between SEVR and the incidence of short-term cardiovascular endpoint events.
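For context, SEVR is conventionally obtained from pulse wave analysis as the Buckberg ratio of the diastolic to the systolic pressure-time integral; the abstract does not restate this definition, so the formula below is given only as the standard background definition, not as a result of this study.

```latex
\mathrm{SEVR} \;=\; \frac{\mathrm{DPTI}}{\mathrm{SPTI}}
            \;=\; \frac{\int_{\text{diastole}} P(t)\,dt}{\int_{\text{systole}} P(t)\,dt}
```

where P(t) is the central (aortic) pressure waveform, DPTI reflects myocardial oxygen supply, and SPTI (the tension-time index) reflects myocardial oxygen demand.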
Method: Indices from 243 patients with chest pain were collected. Binary logistic regression analyses were performed using the dichotomous outcome of high versus non-high SYNTAX scores. Receiver operating characteristic (ROC) curves were employed to compare the diagnostic efficiencies of the indices and models. Kaplan-Meier survival analysis combined with Cox regression was performed to assess the relationship between SEVR and the incidence of cardiovascular events within 1 year in patients with coronary heart disease (CHD).
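A minimal sketch of this analysis pipeline is shown below, assuming a tidy patient-level table. The column names (sevr, age, abi, aix_hr75, syntax_high, chd_confirmed, event, time_to_event_days), the file name, and the use of statsmodels, scikit-learn, and lifelines are illustrative assumptions and are not taken from the study.

```python
# Hedged sketch of the reported statistical workflow; not the authors' code.
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score, roc_curve
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.read_csv("chest_pain_cohort.csv")  # hypothetical data file

# 1) Binary logistic regression on high vs. non-high SYNTAX score.
X = sm.add_constant(df[["sevr", "age", "abi", "aix_hr75"]])
logit = sm.Logit(df["syntax_high"], X).fit()
print(logit.summary())

# 2) ROC analysis: AUC per index; SEVR and ABI are oriented so that lower
#    values indicate higher risk (an assumption about direction of effect).
for col in ["sevr", "age", "abi", "aix_hr75"]:
    score = -df[col] if col in ("sevr", "abi") else df[col]
    print(col, "AUC =", round(roc_auc_score(df["syntax_high"], score), 3))

# Youden-index cutoff for SEVR, mapped back to the original scale.
fpr, tpr, thr = roc_curve(df["syntax_high"], -df["sevr"])
cutoff = -thr[(tpr - fpr).argmax()]
print("SEVR cutoff:", round(cutoff, 3))

# 3) One-year follow-up in confirmed CHD: Kaplan-Meier curves stratified by
#    the cutoff, plus a Cox model with SEVR as a continuous covariate.
chd = df[df["chd_confirmed"] == 1]
km = KaplanMeierFitter()
for above, grp in chd.groupby(chd["sevr"] >= cutoff):
    km.fit(grp["time_to_event_days"], grp["event"], label=f"SEVR>=cutoff: {above}")
    print(km.survival_function_.tail(1))  # survival estimate at end of follow-up

cph = CoxPHFitter()
cph.fit(chd[["sevr", "time_to_event_days", "event"]],
        duration_col="time_to_event_days", event_col="event")
cph.print_summary()  # hazard ratio per unit SEVR (HR < 1: higher SEVR, lower risk)
```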
Results: SEVR was significantly lower (P < 0.05) in the high-stenosis group than in the control and low-stenosis groups. The diagnostic efficacy of SEVR [area under the curve (AUC) = 0.861] was better than that of age (AUC = 0.745), the ankle-brachial index (ABI; AUC = 0.739), and the augmentation index normalized to a heart rate of 75 beats/min (AIx@HR75; AUC = 0.659). The optimal SEVR cutoff was 1.105. Among patients with confirmed CHD followed for 1 year after hospital discharge, only SEVR was significantly associated with survival outcomes (hazard ratio = 0.010; 95% confidence interval: 0.001-0.418; P = 0.016).
Conclusion: A significant decrease in SEVR predicted severe coronary artery stenosis, with a cutoff value of 1.105 and an AUC of 0.861. In patients with CHD, the lower the SEVR, the higher the rate of cardiovascular events at 1 year after hospital discharge.