Objective: We evaluated the performance of a Natural Language Processing (NLP) application designed to extract follow-up provider information from free-text discharge summaries at two hospitals.
Evaluation: We compared the performance of the NLP application, the Regenstrief EXtraction tool (REX), with that of three physician reviewers at extracting follow-up provider names, phone/fax numbers, and location information. Precision, recall, and F-measures are reported, with 95% confidence intervals (CIs) for pairwise comparisons.
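For reference, the reported metrics follow their standard definitions over true positives (TP), false positives (FP), and false negatives (FN); the balanced F1 form of the F-measure is assumed here, since the abstract does not state a weighting:

\[
\mathrm{Precision} = \frac{TP}{TP + FP}, \qquad
\mathrm{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}
\]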
Results: Across 556 summaries containing follow-up information, REX achieved the following precision, recall, and F-measure, respectively: provider name 0.96, 0.92, 0.94; phone/fax 0.99, 0.92, 0.96; location 0.83, 0.82, 0.82. REX matched all three physician reviewers in identifying follow-up provider names and phone/fax numbers, and was slightly inferior to two physicians in identifying location information. REX took about four seconds to extract follow-up information, versus 3-5 minutes for the physician reviewers.
Conclusion: An NLP program achieved physician-level performance at extracting follow-up provider information from discharge summaries.