Background: Clinically significant prostate cancer (PCa) diagnosis at MRI requires accurate and efficient radiologic interpretation. Although artificial intelligence may assist in this task, a lack of transparency has limited clinical translation.

Purpose: To develop an explainable artificial intelligence (XAI) model for clinically significant PCa diagnosis at biparametric MRI, using Prostate Imaging Reporting and Data System (PI-RADS) features to justify its classifications.

Materials and Methods: This retrospective study included consecutive patients with histopathologic analysis-proven prostatic lesions who underwent biparametric MRI and biopsy between January 2012 and December 2017. After image annotation by two radiologists, a deep learning model was trained to detect the index lesion; classify PCa, clinically significant PCa (Gleason score ≥ 7), and benign lesions (eg, prostatitis); and justify its classifications using PI-RADS features. Lesion- and patient-based performance were assessed with fivefold cross validation and areas under the receiver operating characteristic curve (AUCs). Clinical feasibility was tested in a multireader study and on the external PROSTATEx data set. Statistical evaluation of the multireader study used the Mann-Whitney U test and the exact Fisher-Yates test.

Results: Overall, 1224 men (median age, 67 years; IQR, 62-73 years) had 3260 prostatic lesions (372 with a Gleason score of 6; 743 with a Gleason score of ≥ 7; 2145 benign). XAI reliably detected clinically significant PCa in the internal (AUC, 0.89) and external (AUC, 0.87) test sets, with a sensitivity of 93% (95% CI: 87, 98) and an average of one false-positive finding per patient. The accuracy of the visual and textual explanations of XAI classifications was 80% (1080 of 1352), as confirmed by experts. XAI-assisted readings improved the confidence of nonexperts in assessing PI-RADS 3 lesions (4.1 vs 3.4 on a five-point Likert scale; P = .007) and reduced their reading time by 58 seconds (P = .009).

Conclusion: The explainable AI model reliably detected and classified clinically significant prostate cancer and improved the confidence and reading time of nonexperts while providing visual and textual explanations based on well-established imaging features.

© RSNA, 2023. Supplemental material is available for this article. See also the editorial by Chapiro in this issue.
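The evaluation scheme named in Materials and Methods (fivefold cross validation with AUC as the lesion-level metric) can be sketched generically as below. This is a minimal illustration only: the classifier, features, and synthetic labels are stand-ins, and nothing here reproduces the paper's deep learning model or data.

```python
# Minimal sketch of fivefold cross-validated AUC for a binary
# "clinically significant PCa vs. other" task. Synthetic data and a
# logistic regression stand in for the study's model and MRI features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold

# Class imbalance loosely mirrors the abstract's lesion counts
# (743 clinically significant of 3260 total); purely illustrative.
X, y = make_classification(n_samples=500, n_features=10,
                           weights=[0.77, 0.23], random_state=0)

aucs = []
for train_idx, test_idx in StratifiedKFold(
        n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]  # P(significant PCa)
    aucs.append(roc_auc_score(y[test_idx], scores))

mean_auc = float(np.mean(aucs))
```

Stratified folds keep the positive-class fraction stable across splits, which matters when, as here, clinically significant lesions are a minority class.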