Purpose: Trend analysis techniques for detecting glaucomatous progression typically assume a constant rate of change. This study uses data from the Ocular Hypertension Treatment Study to assess whether this assumption, by giving weight to earlier periods of stability, reduces sensitivity to subsequent changes in the rate of progression.
Methods: Series of visual fields (mean 24 per eye), completed at 6-month intervals by participants initially randomized to observation, were split into subseries before and after the initiation of treatment (the "split-point"). The rate of change of mean deviation (MDR) was derived using each entire subseries, and using only the W tests nearest the split-point, for a range of window lengths W. A generalized estimating equation (GEE) model was used to detect changes in MDR occurring at the split-point.
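To make the windowing and slope-change test concrete, the following is a minimal Python sketch, not the authors' code: it generates synthetic MD values, takes the W tests nearest the split-point on each side, estimates MDR as an ordinary least-squares slope, and fits an illustrative GEE (statsmodels, exchangeable working correlation) whose coefficient on a post-treatment indicator estimates the change in MDR. All data, variable names, and the exact model specification are assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from scipy.stats import linregress

    def mdr(times_y, md_db):
        """Rate of change of MD in dB/y, with its p-value, via OLS."""
        fit = linregress(times_y, md_db)
        return fit.slope, fit.pvalue

    rng = np.random.default_rng(0)
    W = 7  # window length: the W tests nearest the split-point
    rows = []
    for eye in range(40):
        # Synthetic eye: 24 tests at 6-month intervals, split-point at
        # test 12, loss slowing from -0.6 dB/y before to -0.2 dB/y after.
        t = np.arange(24) * 0.5
        md = np.where(t < 6.0, -0.6 * t, -3.6 - 0.2 * (t - 6.0))
        md = md + rng.normal(0.0, 0.5, md.size)
        pre_t, pre_md = t[:12], md[:12]
        post_t, post_md = t[12:], md[12:]
        # Shortened subseries: last W tests before, first W tests after.
        rows.append({"eye": eye, "post": 0,
                     "mdr": mdr(pre_t[-W:], pre_md[-W:])[0]})
        rows.append({"eye": eye, "post": 1,
                     "mdr": mdr(post_t[:W], post_md[:W])[0]})

    df = pd.DataFrame(rows)
    model = smf.gee("mdr ~ post", groups="eye", data=df,
                    cov_struct=sm.cov_struct.Exchangeable())
    result = model.fit()
    # Coefficient on `post`: estimated change in MDR at the split-point.
    print(result.params["post"], result.pvalues["post"])

On this synthetic data, the coefficient on the post-treatment indicator should recover, approximately, the built-in slowing of 0.4 dB/y.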
Results: Using shortened subseries with W = 7 tests, the MDR slowed by 0.142 dB/y upon initiation of treatment (P < 0.001), and the proportion of eyes showing "rapid deterioration" (MDR < -0.5 dB/y with P < 0.05) decreased from 11.8% to 6.5% (P < 0.001). Using the entire subseries, no significant change in MDR was detected (P = 0.796), and there was no significant change in the proportion of eyes progressing rapidly (P = 0.084). Window lengths of 6 ≤ W ≤ 9 tests produced similar benefits.
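The "rapid deterioration" label reported above reduces to a two-part test on the fitted slope; the sketch below shows one way to encode it, with the cutoffs taken from this abstract and everything else illustrative.

    import numpy as np
    from scipy.stats import linregress

    def rapid_deterioration(times_y, md_db, cutoff=-0.5, alpha=0.05):
        """True if the MD slope is below `cutoff` dB/y and significant."""
        fit = linregress(times_y, md_db)
        return fit.slope < cutoff and fit.pvalue < alpha

    t = np.arange(7) * 0.5                    # 7 tests at 6-month intervals
    print(rapid_deterioration(t, -0.8 * t))   # True: steep, significant loss
    print(rapid_deterioration(t, -0.1 * t))   # False: slope above the cutoff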
Conclusions: Event analysis revealed a beneficial treatment effect in this dataset. This effect was not detected by linear trend analysis applied to the entire series, but was detected using shorter subseries of six to nine fields. Applying linear trend analysis to the entire field sequence may therefore not be optimal for detecting and monitoring progression, and nonlinear analyses may be needed for long series of fields. (ClinicalTrials.gov number, NCT00000125.)