Background: Although cardiac resynchronization therapy (CRT) is beneficial in heart failure patients with left bundle branch block, 30% of these patients do not respond to the therapy. Identifying likely non-responders before device implantation is one of the current challenges in clinical cardiology.
Methods: We verified the diagnostic contribution of an optimized computerized approach to measuring ventricular electrical activation delay (VED) from body-surface 12-lead ECGs. We applied the method to ECGs acquired before device implantation (baseline) in the MADIT-CRT trial (Multicenter Automatic Defibrillator Implantation-Cardiac Resynchronization Therapy). VED values were dichotomized using their quartiles, and we tested the association of VED with the MADIT-CRT primary end point of heart failure or death. Multivariate Cox proportional hazards models were used to estimate the risk of study end points. In addition, the association between VED values and hemodynamic changes after CRT-D implantation was examined using 1-year follow-up echocardiograms.
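The quartile-based dichotomization described above can be sketched as follows. This is a minimal illustration, not the study's analysis code: the VED values are hypothetical, and the abstract does not state which quartile served as the cut point, so the lower quartile is assumed here purely for illustration.

```python
import statistics

# Hypothetical baseline VED measurements in milliseconds
# (illustrative only; not MADIT-CRT data).
ved_ms = [20.5, 25.3, 28.0, 31.2, 33.3, 35.0, 40.1, 45.2]

# Quartile cut points (Q1, median, Q3) of the VED distribution.
q1, median, q3 = statistics.quantiles(ved_ms, n=4)

# Dichotomize at the assumed lower-quartile cut point, mirroring the
# low-VED vs. high-VED grouping (the published cutoff was 31.2 ms).
low_ved = [v for v in ved_ms if v < q1]
high_ved = [v for v in ved_ms if v >= q1]
```

In the study itself, the resulting low-/high-VED groups were then compared with multivariate Cox proportional hazards models against the primary end point.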
Results: Left bundle branch block patients with baseline VED <31.2 ms had a 35% risk of reaching the MADIT-CRT primary end point, whereas patients with VED ≥31.2 ms had a 14% risk (P<0.001). The hazard ratio for the primary end point in patients with low VED was 2.34 (95% confidence interval, 1.53-3.57; P<0.001). Higher VED values were also associated with beneficial hemodynamic changes. These strong VED associations were not found in the right bundle branch block and intraventricular conduction delay cohorts of the MADIT-CRT trial.
Conclusions: Left bundle branch block patients with a high baseline VED value benefited most from CRT, whereas those with low VED did not show CRT benefit.
Keywords: bundle-branch block; cardiac resynchronization therapy; electrocardiography; heart failure; prognosis.
© 2018 The Authors.