Maintaining the balance between over- and under-immunosuppression plays a critical role in successful immunosuppressive therapy after renal transplantation. We studied the predictive value of our functional immune assay, based on adenosine triphosphate (ATP) levels, in determining the risk of infection and rejection among renal transplant recipients (RTRs). A total of 65 RTRs with less than 1 month (RTRL1) and 48 RTRs with more than 6 months (RTRM6) of post-transplant time, along with 56 healthy individuals, were included. After lymphocyte activation with phytohemagglutinin (PHA), CD4+ T cells were isolated using magnetic beads (Dynabeads), and intracellular ATP (iATP) concentrations were measured by the luciferin-luciferase reaction and compared within and between groups. iATP production by activated CD4+ cells correlated directly with post-transplant time (r = 0.32, P = 0.011). iATP levels were significantly lower in both the RTRL1 and RTRM6 groups than in controls (P < 0.001), and in the RTRL1 group than in the RTRM6 group (P < 0.05). iATP concentrations were significantly lower in patients who developed infection than in RTRs with stable graft function (SGF), whereas they were higher in those with an allograft rejection episode (ARE). Our optimization experiments showed that the best iATP cutoffs were 472.5 and 572.5 ng/ml for predicting risk of ARE, and 218.5 and 300.5 ng/ml for predicting risk of infection, in RTRL1 and RTRM6 patients, respectively. iATP levels measured by this immune function assay may be a promising predictive tool for identifying RTRs at risk of developing infection or allograft rejection.
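The cutoff-based risk stratification described above can be sketched as a simple decision rule. This is a minimal illustration only, not part of the study: the function and dictionary names are hypothetical, and the directionality (low iATP associated with infection, high iATP with rejection) follows the results reported in the abstract.

```python
# Hypothetical sketch of the cutoff-based risk stratification.
# Cutoffs (ng/ml) are those reported for the RTRL1 (<1 month) and
# RTRM6 (>6 months post-transplant) groups; names are illustrative.

CUTOFFS = {
    "RTRL1": {"infection": 218.5, "rejection": 472.5},
    "RTRM6": {"infection": 300.5, "rejection": 572.5},
}

def classify_iatp(iatp_ng_ml: float, group: str) -> str:
    """Flag a recipient's risk category from the CD4+ iATP level.

    Lower iATP was associated with infection and higher iATP with
    rejection in the study, so levels below the infection cutoff
    suggest infection risk and levels above the rejection cutoff
    suggest rejection risk.
    """
    cutoffs = CUTOFFS[group]
    if iatp_ng_ml < cutoffs["infection"]:
        return "at risk of infection"
    if iatp_ng_ml > cutoffs["rejection"]:
        return "at risk of rejection"
    return "consistent with stable graft function"

print(classify_iatp(200.0, "RTRL1"))  # at risk of infection
print(classify_iatp(600.0, "RTRM6"))  # at risk of rejection
print(classify_iatp(400.0, "RTRM6"))  # consistent with stable graft function
```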
Keywords: Immune function; Infection; Rejection; Renal transplant; Stable graft function.
Copyright © 2018. Published by Elsevier Masson SAS.