Background: A study was undertaken to analyze the survival of patients with chronic lymphocytic leukemia (CLL) relative to that of age-matched individuals in the general population and to determine the age-stratified utility of prognostic testing.
Methods: All 2487 patients diagnosed with CLL between January 1995 and June 2008 and cared for in the Mayo Clinic Division of Hematology were categorized by age at diagnosis and evaluated for differences in clinical characteristics, time to first treatment, and overall survival (OS).
Results: Among Rai stage 0 patients, survival was shorter than that of the age-matched general population for patients aged <55 years (P < .001), 55 to 64 years (P < .001), and 65 to 74 years (P < .001), but not for those aged ≥75 years at diagnosis (P = not significant). CD38, IGHV mutation status, and ZAP-70 each predicted time to first treatment independent of stage for all age groups (all P < .04), but had less value for predicting OS, particularly as age increased. IGHV and fluorescence in situ hybridization (FISH) each predicted OS independent of stage for patients aged <55 years (P ≤ .001), 55 to 64 years (P ≤ .004), and 65 to 74 years (P ≤ .001), but not for those aged ≥75 years. CD38 and ZAP-70 each predicted OS independent of stage for only 2 of 4 age categories. Among Rai stage 0 patients aged <75 years, survival was shorter than that of the age-matched population only for patients with unmutated IGHV (P < .001) or unfavorable FISH results (P < .001).
Conclusions: Survival of CLL patients aged <75 years is shorter than that of the age-matched general population regardless of disease stage. Among patients aged <75 years, the simple combination of stage and IGHV or of stage and FISH identifies those with excess risk of death relative to the age-matched population. Although useful for predicting time to first treatment independent of stage in patients of all ages, prognostic testing had little utility for predicting OS independent of stage among patients aged ≥75 years.