There are numerous reports of cardiorespiratory patterns in infants on home monitors, but no data to determine whether "experts" agree on the description of these patterns. We therefore evaluated agreement among four experienced investigators and five trained technicians who independently assessed the same sample of physiologic waveforms recorded from infants enrolled in a multicenter study. The monitor used respiratory inductance plethysmography and recorded waveforms for apnea ≥16 s or a heart rate <80 beats/min for ≥5 s. The investigators and technicians initially assessed 88 waveforms; after additional training, the technicians assessed 113 more. In categorizing waveforms as apnea present or absent, agreement among technicians improved considerably with additional training (kappa 0.65 to 0.85). The trends were the same for categorizing bradycardia as present or absent. Agreement in measurement of apnea duration also improved considerably with additional training (intraclass correlation 0.33 to 0.83), whereas agreement in measurement of bradycardia duration was consistently excellent (intraclass correlation 0.86 to 0.99). With additional training, the technicians achieved total agreement in measuring the lowest heart rate during a bradycardia. When classifying apneas as including ≥1, ≥2, ≥3, or ≥4 out-of-phase breaths, agreement was initially low but improved after additional training, especially in categorization of apneas with ≥3 or ≥4 out-of-phase breaths (kappa 0.67 and 0.94, respectively). Although researchers and clinicians commonly describe events based on cardiorespiratory recordings, agreement among experienced individuals may be poor, which can confound interpretation. With clear guidelines and sufficient training, raters can attain a high level of agreement in describing cardiorespiratory events.
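The kappa values reported above are chance-corrected agreement coefficients (Cohen's kappa) for categorical ratings. A minimal sketch of how such a coefficient is computed for two raters, using invented rater data (none of the values below come from the study itself):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on the same items."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    labels = sorted(set(ratings_a) | set(ratings_b))
    # Observed agreement: fraction of items both raters labeled identically.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement, from each rater's marginal label frequencies.
    p_expected = sum(
        (ratings_a.count(lbl) / n) * (ratings_b.count(lbl) / n)
        for lbl in labels
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Two hypothetical technicians rating ten waveforms as apnea present/absent:
tech1 = ["apnea", "apnea", "absent", "apnea", "absent",
         "absent", "apnea", "absent", "apnea", "absent"]
tech2 = ["apnea", "apnea", "absent", "absent", "absent",
         "absent", "apnea", "absent", "apnea", "apnea"]
print(round(cohens_kappa(tech1, tech2), 2))  # → 0.6
```

A kappa of 0 indicates agreement no better than chance; values above roughly 0.8, such as those the technicians reached after additional training, are conventionally interpreted as excellent agreement.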