Estimates of radiation absorbed dose to the red marrow (RM) would be valuable in treatment planning for radioimmunotherapy if they could show a correlation with clinical toxicity. In this study, a correlation analysis was performed to determine whether estimates of radiation absorbed dose to the bone marrow could accurately predict marrow toxicity in patients who had received 186Re-labeled monoclonal antibody.
Methods: White blood cell and platelet count data from 25 patients who received 186Re-NR-LU-10 during Phase I radioimmunotherapy trials were analyzed. Three indicators of marrow toxicity were used: the toxicity grade, the fraction of the baseline count at the nadir (percentage baseline), and the actual nadir count. Each indicator was correlated with several potential predictors of toxicity: the absorbed dose to RM, the absorbed dose to the whole body (WB), and the total radioactivity administered.
Results: Percentage baseline and toxicity grade, for both white blood cells and platelets, showed a moderate correlation with absorbed dose and with administered radioactivity (normalized for body size). Of the toxicity indicators, the percentage baseline platelet count correlated most strongly with the predictors (r = 0.73-0.79). The estimated RM absorbed dose was not a better predictor of toxicity than either the WB dose or the total radioactivity administered. Blood count responses varied substantially among patients who received similar radioactivity doses and had similar absorbed dose estimates.
Conclusion: In this analysis of patients receiving 186Re-labeled monoclonal antibody, toxicity showed a moderately good correlation with dose, but patient-to-patient variability in response to internally administered radioactivity limited the usefulness of marrow dose estimates for predicting toxicity in individual patients.