A new type of radiographic film, EDR (extended dose range) film, has recently become available for film dosimetry. It is particularly attractive for composite isodose verification of intensity modulated radiation therapy because of its low sensitivity relative to the more common Kodak XV film. For XV film, the relationship between optical density and dose, commonly known as the sensitometric curve, depends linearly on dose at low densities. Unlike XV film, the sensitometric curve of EDR film irradiated by megavoltage x rays is not linearly dependent on dose at low densities. In this work, to understand the mechanisms governing the shape of the sensitometric curves, EDR film was studied with kilovoltage x rays, 60Co gamma rays, megavoltage x rays, and electron beams. For comparison, XV film was also studied with the same beams. The model originally developed by Silberstein [J. Opt. Soc. Am. 35, 93-107 (1945)] is used to fit the experimental data. It is found that the single-hit model predicts the sensitometric curve for XV film irradiated by all beams used in this work and for EDR film exposed to kilovoltage x rays. For EDR film irradiated by 60Co gamma rays, megavoltage x rays, and electron beams, the double-hit model is used to fit the sensitometric curves. For doses less than 100 cGy, a systematic difference between the measured densities and those predicted by the double-hit model is observed. Possible causes of the observed differences are discussed. The results of this work provide a theoretical explanation of the sensitometric behavior of EDR film.
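The contrast between the two models can be illustrated with their standard functional forms: in the single-hit picture a grain develops after one hit, giving a saturating exponential that is linear at low dose, while in the double-hit picture a grain needs two hits, producing a low-dose shoulder. The sketch below uses these textbook expressions with illustrative parameter values (OD_max and alpha are assumptions for demonstration, not fitted values from this work):

```python
import math

# Hedged sketch of the standard single-hit and double-hit saturation
# models (after Silberstein). od_max and alpha are illustrative
# parameters, not values measured in this study.

def single_hit_od(dose_cgy, od_max=3.0, alpha=0.01):
    """Single-hit model: OD = OD_max * (1 - exp(-alpha*D)).
    Rises linearly at low dose, then saturates."""
    return od_max * (1.0 - math.exp(-alpha * dose_cgy))

def double_hit_od(dose_cgy, od_max=3.0, alpha=0.01):
    """Double-hit model: OD = OD_max * (1 - (1 + alpha*D) * exp(-alpha*D)).
    A grain requires two hits, so the curve starts with zero slope
    (a low-dose shoulder) before saturating."""
    x = alpha * dose_cgy
    return od_max * (1.0 - (1.0 + x) * math.exp(-x))
```

Expanding the exponentials shows the qualitative difference: at small dose the single-hit curve behaves as OD_max*alpha*D (linear), whereas the double-hit curve behaves as OD_max*(alpha*D)^2/2 (quadratic), consistent with the nonlinear low-dose response observed for EDR film with megavoltage beams.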