Five-year-old early maturing peach trees (Prunus persica (L.) Batsch cv. Flordastar grafted on GF-677 peach rootstock) were subjected to three irrigation treatments from March 18 to November 10, 2006. Control plants (T0 treatment), which received irrigation in excess of their crop water requirements (1089.7 mm), were compared with plants watered according to sap flow (SF; T1 treatment) or maximum daily trunk shrinkage (MDS; T2 treatment) measurements, so as to maintain the SF and MDS signal intensities (control SF/T1 SF and T2 MDS/control MDS, respectively) close to unity. When the SF or MDS signal intensity was at or below unity on at least two of three consecutive days, irrigation was reduced by 10%; when the MDS signal intensity exceeded unity on at least two of three consecutive days, irrigation was increased by 10%. During the experiment, estimated crop evapotranspiration was 704.9 mm, and the cumulative amounts of water applied in the T1 and T2 treatments were 463.2 and 654.5 mm, respectively. Irrigation scheduling driven by the MDS signal intensity was more suitable than scheduling driven by the SF signal intensity because MDS was more sensitive and reliable in detecting changes in plant water status, preventing the development of detectable plant water stress; moreover, it had no effect on fruit size. We conclude that peach tree irrigation scheduling can be based on MDS measurements alone. Changes to the irrigation protocol assayed here are proposed to reduce MDS signal intensity deviations above unity, for example by increasing the irrigation scheduling frequency, the amount of water applied, or both. Irrigation schedules based on maintaining MDS signal intensities close to unity could be applied when local crop factor values are unavailable.
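
The two-of-three-days adjustment rule described above can be summarized as a short decision procedure. The following Python sketch is illustrative only: the function names, the dose variable, and the example values are hypothetical, and the 10% increase rule is applied here to either signal for simplicity, whereas the abstract states the increase explicitly for the MDS signal.

def mds_signal_intensity(mds_treatment, mds_control):
    # MDS signal intensity: treatment MDS divided by well-watered control MDS
    return mds_treatment / mds_control

def sf_signal_intensity(sf_control, sf_treatment):
    # SF signal intensity: well-watered control SF divided by treatment SF
    return sf_control / sf_treatment

def adjust_irrigation(current_dose, last_three_signals):
    # Two-of-three-days rule: cut the dose by 10% when the signal intensity is
    # at or below unity on at least two of the last three days, and raise it by
    # 10% when it exceeds unity on at least two of those days.
    at_or_below = sum(1 for s in last_three_signals if s <= 1.0)
    above = sum(1 for s in last_three_signals if s > 1.0)
    if at_or_below >= 2:
        return current_dose * 0.9  # plants show no water deficit: reduce irrigation
    if above >= 2:
        return current_dose * 1.1  # deficit developing: increase irrigation
    return current_dose  # defensive default; unreachable with exactly three daily values

# Example: MDS signal intensities of 1.05, 1.12 and 0.98 over three consecutive
# days exceed unity on two days, so a 10 mm dose would be raised to 11 mm.
print(adjust_irrigation(10.0, [1.05, 1.12, 0.98]))  # 11.0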