The use of meta-analysis to combine the results of several trials continues to increase in the medical field. The validity of a meta-analysis may be affected by various sources of bias (for example, publication bias or language bias); an analysis of bias should therefore be an integral part of any systematic review. Statistical tests and graphical methods have been developed for this purpose. In this paper, two statistical tests for the detection of bias in meta-analysis were investigated in a simulation study. Binary outcome data, which are very common in medical applications, were considered, and relative effect measures (odds ratios, relative risks) were used for pooling. Sample sizes were generated according to the findings of a survey of eight German medical journals. The simulation results indicate an inflation of the type I error rate for both tests when the data are sparse; the inflation worsens with increasing treatment effect and with the number of trials combined. Valid statistical tests for the detection of bias in meta-analyses with sparse data still need to be developed.
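The abstract does not name the two tests or specify the exact simulation design, but the following Python sketch illustrates how such a type I error study can be set up: meta-analyses of binary-outcome trials are simulated under the null hypothesis of no bias, summarized on the log odds ratio scale, and an Egger-style regression asymmetry test (used here purely as an illustrative stand-in, not necessarily one of the tests studied in the paper) is applied repeatedly. All parameter values (control event probability, odds ratio, per-arm sample size range, number of trials) are hypothetical and are not taken from the journal survey mentioned above.

import numpy as np
from scipy import stats

rng = np.random.default_rng(12345)

def simulate_meta(k, p_control, odds_ratio, n_range=(20, 200)):
    # One meta-analysis of k two-arm trials with binary outcomes,
    # simulated under the null hypothesis of no bias.
    odds_c = p_control / (1.0 - p_control)
    p_treat = odds_ratio * odds_c / (1.0 + odds_ratio * odds_c)
    log_or, se = [], []
    for _ in range(k):
        n = int(rng.integers(n_range[0], n_range[1] + 1))  # per-arm sample size
        a = rng.binomial(n, p_treat)                        # events, treatment arm
        c = rng.binomial(n, p_control)                      # events, control arm
        # 0.5 continuity correction keeps sparse (zero-cell) tables usable
        a, b = a + 0.5, n - a + 0.5
        c, d = c + 0.5, n - c + 0.5
        log_or.append(np.log(a * d / (b * c)))
        se.append(np.sqrt(1/a + 1/b + 1/c + 1/d))
    return np.array(log_or), np.array(se)

def egger_test(log_or, se):
    # Egger-style asymmetry test: regress the standardized effect on
    # precision and test whether the intercept differs from zero.
    res = stats.linregress(1.0 / se, log_or / se)
    t_stat = res.intercept / res.intercept_stderr
    return 2.0 * stats.t.sf(abs(t_stat), df=len(log_or) - 2)

def type_one_error(n_sims=2000, k=10, p_control=0.1, odds_ratio=2.0, alpha=0.05):
    # Empirical rejection rate under the null of no bias; values clearly
    # above alpha indicate an inflated type I error rate.
    rejections = sum(
        egger_test(*simulate_meta(k, p_control, odds_ratio)) < alpha
        for _ in range(n_sims)
    )
    return rejections / n_sims

print(type_one_error())

Rerunning the sketch with a smaller p_control (sparser data), a larger odds_ratio, or a larger k mimics the conditions under which the abstract reports the inflation to worsen.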