The CIEMAT/NIST and TDCR methods in liquid scintillation counting, initially developed for the activity standardization of pure-beta radionuclides, have been extended to the standardization of electron-capture and beta-gamma radionuclides. Both methods require the calculation of the energy spectrum absorbed by the liquid scintillator. For radionuclides emitting X-rays or gamma rays with energies above a few tens of keV, Compton interactions become significant and absorption is incomplete. In this case, the spectrum absorbed by the scintillator must be calculated using analytical or stochastic models. The standardization of 54Mn, a radionuclide decaying by electron capture, illustrates this problem: its gamma transition, which is only very weakly converted, leads to the emission of an 835 keV photon. Calculating the detection efficiency of this radionuclide therefore requires the energy spectrum transferred to the scintillator by the gamma ray, together with the associated absorption probability, and the validity of the method thus depends on the correct calculation of the energy transferred to the scintillator. To compare the results obtained with various computational tools, and to provide the metrology community with guidance on the choice of such tools, the LS working group of the ICRM organised a comparison of the calculated absorbed spectra for the 835 keV photon of 54Mn. The quantity of interest is the spectrum of the energy absorbed by the scintillator per emitted 835 keV gamma ray. The exercise was defined for a standard 20 ml LS glass vial with LS cocktail volumes of 10 and 15 ml, and for two different cocktails: toluene and a widely used commercial cocktail, Ultima Gold. The paper describes the results obtained by nine participants using a total of 12 calculation codes.
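To make the stochastic approach concrete, the following is a minimal Monte Carlo sketch, in Python, of the kind of calculation the exercise requires: it transports 835 keV photons emitted uniformly in the cocktail and tallies the energy deposited per emitted photon. It is not one of the 12 codes compared in the paper; everything beyond the 835 keV emission energy is an illustrative assumption, including the vial inner radius (R_VIAL), the 10 ml toluene fill, the low-energy cutoff E_CUT, the neglect of the glass wall, and the restriction to Compton scattering on free electrons (Klein-Nishina).

```python
"""Toy Monte Carlo sketch: energy absorbed in an LS cocktail per emitted
835 keV photon. Illustrative only -- not one of the codes compared here.
Assumed simplifications: bare cylindrical cocktail volume (no glass wall),
Klein-Nishina Compton scattering as the only interaction above E_CUT,
full local absorption below E_CUT, toluene (C7H8) parameters."""
import math
import random

ME_C2 = 511.0                 # electron rest energy (keV)
R_E = 2.8179403262e-13        # classical electron radius (cm)
N_E = 0.867 * 6.02214076e23 * 50.0 / 92.14  # electron density of toluene (cm^-3)
E0 = 835.0                    # 54Mn gamma-ray energy (keV)
E_CUT = 30.0                  # below this, absorption assumed local (keV)
R_VIAL = 1.35                 # assumed inner radius of a 20 ml vial (cm)
HEIGHT = 10.0 / (math.pi * R_VIAL**2)       # fill height for 10 ml (cm)

def sigma_kn(e):
    """Total Klein-Nishina cross section per electron (cm^2) at energy e (keV)."""
    a = e / ME_C2
    t1 = (1 + a) / a**2 * (2 * (1 + a) / (1 + 2 * a) - math.log(1 + 2 * a) / a)
    t2 = math.log(1 + 2 * a) / (2 * a) - (1 + 3 * a) / (1 + 2 * a)**2
    return 2 * math.pi * R_E**2 * (t1 + t2)

def sample_compton(e):
    """Rejection-sample cos(theta) from Klein-Nishina; the unnormalized density
    f = eps^2 (eps + 1/eps - sin^2 theta) is bounded by 2, so a uniform proposal
    with envelope 2 is valid. Returns (scattered energy, cos theta)."""
    a = e / ME_C2
    while True:
        ct = random.uniform(-1.0, 1.0)
        eps = 1.0 / (1.0 + a * (1.0 - ct))          # E'/E
        f = eps * eps * (eps + 1.0 / eps - (1.0 - ct * ct))
        if 2.0 * random.random() <= f:
            return e * eps, ct

def dist_to_boundary(p, u):
    """Distance from interior point p along unit direction u to the cylinder."""
    ts = []
    a = u[0]**2 + u[1]**2
    if a > 0.0:                                     # intersection with the side
        b = p[0] * u[0] + p[1] * u[1]
        c = p[0]**2 + p[1]**2 - R_VIAL**2
        ts.append((-b + math.sqrt(b * b - a * c)) / a)
    if u[2] > 0.0:                                  # top cap
        ts.append((HEIGHT - p[2]) / u[2])
    elif u[2] < 0.0:                                # bottom cap
        ts.append(-p[2] / u[2])
    return min(ts)

def rotate(u, ct):
    """New unit direction after scattering by the angle with cosine ct."""
    st = math.sqrt(max(0.0, 1.0 - ct * ct))
    phi = random.uniform(0.0, 2.0 * math.pi)
    cp, sp = math.cos(phi), math.sin(phi)
    ux, uy, uz = u
    if abs(uz) < 0.99999:
        r = math.sqrt(1.0 - uz * uz)
        return [ux * ct + st * (ux * uz * cp - uy * sp) / r,
                uy * ct + st * (uy * uz * cp + ux * sp) / r,
                uz * ct - st * cp * r]
    return [st * cp, st * sp, math.copysign(ct, uz)]  # direction along the axis

def track_one_photon():
    """Energy (keV) deposited in the cocktail by one photon, uniform source."""
    while True:                                     # uniform point in the cylinder
        x = random.uniform(-R_VIAL, R_VIAL)
        y = random.uniform(-R_VIAL, R_VIAL)
        if x * x + y * y < R_VIAL**2:
            break
    p = [x, y, random.uniform(0.0, HEIGHT)]
    u = rotate([0.0, 0.0, 1.0], random.uniform(-1.0, 1.0))  # isotropic emission
    e, dep = E0, 0.0
    while True:
        path = -math.log(random.random()) / (N_E * sigma_kn(e))
        if path >= dist_to_boundary(p, u):
            return dep                              # photon escapes the cocktail
        p = [p[i] + path * u[i] for i in range(3)]
        e_new, ct = sample_compton(e)
        dep += e - e_new            # Compton electron assumed to stop locally
        e, u = e_new, rotate(u, ct)
        if e <= E_CUT:
            return dep + e          # treat the low-energy photon as absorbed

if __name__ == "__main__":
    random.seed(1)
    n = 200_000
    deposits = [track_one_photon() for _ in range(n)]
    p_int = sum(1 for d in deposits if d > 0.0) / n
    print(f"interaction probability per emitted photon: {p_int:.4f}")
    print(f"mean absorbed energy per emitted photon: {sum(deposits)/n:.1f} keV")
```

Under these assumptions the Compton mean free path at 835 keV is roughly 15 cm, much larger than the vial, so most photons escape without interacting and the partial-absorption spectrum is dominated by a Compton continuum. A production calculation would instead use tabulated attenuation coefficients, the full vial and wall geometry, and all interaction channels (photoelectric, Compton, Rayleigh), and the histogram of the nonzero deposits would be binned to give the absorbed spectrum per emission that the exercise compares.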