We evaluated the precision and accuracy of a procedure for detecting recent human immunodeficiency virus (HIV) infections, specifically, the avidity index (AI) calculated using a method based on an automated AxSYM HIV 1/2 gO assay (Abbott). To evaluate precision, we performed multiple replicates on eight HIV-positive serum samples. To evaluate the accuracy in identifying recent infections (i.e., within 6 months of seroconversion), we used 216 serum samples from 47 persons whose dates of seroconversion were known. To evaluate the sensitivity and specificity of the procedure for different AI cutoff values, we performed receiver operating characteristic (ROC) analysis. To determine the effects of antiretroviral treatment, advanced stage of the disease (i.e., low CD4-cell count), and low HIV viral load on the AI, we analyzed 15 serum samples from 15 persons whose dates of seroconversion were unknown. The precision study showed that the procedure was robust (i.e., the total variance of the AI was lower than 10%). Regarding accuracy, the mean AI was significantly lower for samples collected within 6 months of seroconversion than for those collected afterwards (0.68 ± 0.16 versus 0.99 ± 0.10; P < 0.0001), with no overlap of the 95% confidence intervals. The ROC analysis revealed that an AI lower than 0.6 had a sensitivity of 33.3% and a specificity of 98.4%, compared to 87.9% and 86.3%, respectively, for an AI lower than 0.9. Antiretroviral treatment, low CD4-cell count, and low viral load had no apparent effect on the AI. In conclusion, this procedure is reproducible and accurate in identifying recent infections; it is automated, inexpensive, and easy to perform, and it provides a quantitative result with different levels of sensitivity and specificity depending on the selected cutoff.
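The cutoff trade-off reported above (a low cutoff favoring specificity, a higher cutoff favoring sensitivity) follows directly from the standard definitions of sensitivity and specificity for a threshold classifier. The sketch below illustrates the computation with hypothetical AI values (not the study's data); samples with AI below the cutoff are classified as recent infections:

```python
# Sketch of sensitivity/specificity at an avidity-index (AI) cutoff:
# a sample is flagged "recent infection" when its AI is below the cutoff.
# All AI values here are hypothetical illustrations, not the study's data.

def sensitivity_specificity(recent_ais, established_ais, cutoff):
    """Sensitivity = fraction of truly recent samples flagged recent;
    specificity = fraction of established samples NOT flagged recent."""
    sensitivity = sum(ai < cutoff for ai in recent_ais) / len(recent_ais)
    specificity = 1 - sum(ai < cutoff for ai in established_ais) / len(established_ais)
    return sensitivity, specificity

# Hypothetical samples: recent infections tend to have lower antibody avidity.
recent = [0.45, 0.55, 0.62, 0.70, 0.85, 0.88]
established = [0.80, 0.92, 0.95, 0.98, 1.00, 1.02]

for cutoff in (0.6, 0.9):
    sens, spec = sensitivity_specificity(recent, established, cutoff)
    print(f"cutoff {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

Sweeping the cutoff over its range and plotting sensitivity against (1 − specificity) yields the ROC curve from which operating points such as 0.6 and 0.9 can be chosen.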