Background: Clinical guidelines on implantable cardioverter defibrillator (ICD) therapy have changed significantly over the last decades, with potential effects on therapy efficacy. We aimed to study therapy rates over time and the association between therapies and mortality.
Methods: All patients receiving an ICD for primary or secondary prevention were included in a single-center retrospective registry. Information on first appropriate and inappropriate therapies was documented. Implant dates were divided into three periods: P1: 1996-2001, P2: 2002-2008, and P3: 2009-2014.
Results: A total of 727 patients were included (84.9% male, 66.4% ischemic cardiomyopathy (ICM), 56% primary prevention, mean follow-up 5.2 ± 4.1 years). There was a shift from secondary to primary prevention indications, from ischemic to non-ICM, and from single-chamber to cardiac resynchronization therapy defibrillator devices. The annualized 1- and 3-year appropriate shock (AS) rates declined from 29.4% and 15.1% in P1, to 13.3% and 9.2% in P2, and to 7.8% and 5.7% in P3 (log-rank P < 0.001), while inappropriate shock (IAS) rates remained unchanged (log-rank P = 0.635). In multivariate regression analysis, higher age at implant, lower left ventricular ejection fraction, history of stroke, diabetes mellitus, use of loop diuretics or digitalis, higher creatinine, and longer QTc were independent predictors of mortality.
Conclusion: These changes in clinical practice, with a shift to primary prevention and a rise in non-ICM implants, caused a significant decrease in AS incidence, while IAS incidence remained stable. Receiving AS or IAS was not an independent predictor of mortality in our real-life cohort.
Keywords: appropriate therapies; implantable cardioverter defibrillator; inappropriate therapies; mortality.
© 2016 Wiley Periodicals, Inc.