This work describes a simulation method for quantifying the upper limits of lesion detectability as a function of lesion size, lesion contrast, pixel size, and x-ray exposure in digital x-ray imaging systems. The method entails random lesion placement followed by simulated imaging on idealized x-ray detectors with no additive noise and 100% detective quantum efficiency. Lesions of different sizes and thicknesses were simulated. Mean (expectation) lesion signal-to-noise ratios (LSNRs) were calculated, and receiver operating characteristic (ROC) curves were constructed from LSNR ensembles. Mean (expectation) areas under the ROC curves were then calculated for lesions of varying size, detectors of varying pixel size, and exposure levels representative of various areas of radiography. As expected, lesion detectability increased with lesion size, contrast, pixel size, and exposure. The model further suggests that, for lesions on the order of the pixel size, detectability depends strongly on the relative alignment (phase) of the lesion with the pixel matrix.
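The sketch below illustrates the kind of Monte-Carlo pipeline the abstract describes: a lesion is placed at a random sub-pixel phase on an ideal quantum-noise-limited detector, a matched-filter LSNR and test statistic are computed, and the expected ROC area is estimated from the ensembles of lesion-present and lesion-absent scores. It is a minimal sketch under stated assumptions, not the paper's implementation: the disk lesion model, the prewhitened matched-filter observer, and all parameter values (`n_pix`, `pix_size`, `lesion_diam`, `contrast`, `exposure`) are illustrative choices, since the abstract does not specify them.

```python
# Minimal sketch of the simulation pipeline described in the abstract.
# Assumptions (not from the source): disk lesion, Poisson quantum noise only,
# prewhitened matched-filter observer, illustrative parameter values.
import numpy as np

rng = np.random.default_rng(0)

def expected_image(n_pix, pix_size, lesion_diam, contrast, exposure, phase):
    """Expected per-pixel photon counts for an ideal detector (no additive
    noise, 100% DQE). A disk lesion of diameter lesion_diam (same units as
    pix_size) attenuates the beam by `contrast`; its centre is offset from
    the grid centre by `phase` (fractions of a pixel)."""
    ss = 8                                    # sub-pixel supersampling factor
    n = n_pix * ss
    coords = (np.arange(n) + 0.5) / ss        # sub-sample centres, pixel units
    x, y = np.meshgrid(coords, coords, indexing="ij")
    cx, cy = n_pix / 2 + phase[0], n_pix / 2 + phase[1]
    r = lesion_diam / (2 * pix_size)          # lesion radius in pixel units
    inside = (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2
    img = np.full((n, n), exposure * pix_size ** 2 / ss ** 2)
    img[inside] *= 1.0 - contrast
    # Bin the sub-samples back into detector pixels.
    return img.reshape(n_pix, ss, n_pix, ss).sum(axis=(1, 3))

def lsnr(mean_pos, mean_neg):
    """Prewhitened matched-filter lesion SNR for Poisson noise
    (variance = mean in every pixel)."""
    return np.sqrt(np.sum((mean_pos - mean_neg) ** 2 / mean_neg))

def mean_auc(n_pix=16, pix_size=0.1, lesion_diam=0.15, contrast=0.05,
             exposure=1e4, n_trials=500):
    """Monte-Carlo estimate of the expected ROC area, averaged over random
    lesion phases, using a matched-filter test statistic."""
    mean_neg = np.full((n_pix, n_pix), exposure * pix_size ** 2)
    t_pos, t_neg = [], []
    for _ in range(n_trials):
        phase = rng.uniform(-0.5, 0.5, size=2)   # random sub-pixel alignment
        mean_pos = expected_image(n_pix, pix_size, lesion_diam,
                                  contrast, exposure, phase)
        template = (mean_pos - mean_neg) / mean_neg  # prewhitened template
        t_pos.append(np.sum(template * rng.poisson(mean_pos)))
        t_neg.append(np.sum(template * rng.poisson(mean_neg)))
    t_pos, t_neg = np.array(t_pos), np.array(t_neg)
    # AUC as the Mann-Whitney statistic over the two score ensembles.
    return np.mean(t_pos[:, None] > t_neg[None, :])

# Example: a lesion 1.5x the pixel pitch, 5% contrast, ~100 photons/pixel.
print("LSNR (centred lesion):",
      lsnr(expected_image(16, 0.1, 0.15, 0.05, 1e4, (0.0, 0.0)),
           np.full((16, 16), 1e4 * 0.1 ** 2)))
print("mean AUC:", mean_auc())
```

Because each trial draws a fresh `phase`, the score ensembles mix alignments in which the lesion falls mostly within one pixel with alignments in which it straddles several; for lesions on the order of the pixel pitch, this spread is what drives the phase dependence of detectability noted above.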