Boosted mixture of experts: an ensemble learning scheme

Neural Comput. 1999 Feb 15;11(2):483-97. doi: 10.1162/089976699300016737.

Abstract

We present a new supervised learning procedure for ensemble machines, in which outputs of predictors, trained on different distributions, are combined by a dynamic classifier combination model. This procedure may be viewed as either a version of mixture of experts (Jacobs, Jordan, Nowlan, & Hinton, 1991), applied to classification, or a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set among the experts in a boost-like manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and a digit recognition task from the NIST database and compared with classical ensemble approaches.
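The sketch below illustrates the two ideas the abstract combines, not the authors' exact procedure: experts are trained on boost-like reweighted data (each expert emphasizes examples earlier experts got wrong), and their outputs are then mixed by an input-dependent softmax gate rather than a fixed vote. The logistic experts, the linear softmax gate, the exponential reweighting rule, and the XOR-like toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, w, epochs=200, lr=0.5):
    """Logistic-regression expert trained with per-example weights w."""
    theta = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ theta)
        theta += lr * X.T @ (w * (y - p)) / len(y)
    return theta

# Toy binary problem: XOR-like labels, unsolvable by a single linear expert.
X = rng.normal(size=(500, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float)
X = np.hstack([X, np.ones((len(X), 1))])      # bias column

# Boost-like initialization of the data partition: each new expert is
# trained with weights that emphasize previously misclassified examples.
n_experts, experts = 3, []
w = np.ones(len(y))
for _ in range(n_experts):
    theta = fit_logistic(X, y, w / w.sum() * len(y))
    experts.append(theta)
    err = np.abs(sigmoid(X @ theta) - y)
    w = w * np.exp(err)                        # up-weight hard examples

# Dynamic combination: a softmax gate g(x) mixes expert outputs per input,
# trained by gradient ascent on the mixture log-likelihood (as in standard
# mixture-of-experts fitting).
P = np.column_stack([sigmoid(X @ t) for t in experts])  # expert outputs
V = np.zeros((X.shape[1], n_experts))                   # gate parameters
for _ in range(500):
    G = np.exp(X @ V); G /= G.sum(1, keepdims=True)     # gating probabilities
    lik = P * y[:, None] + (1 - P) * (1 - y[:, None])   # per-expert likelihoods
    h = G * lik; h /= h.sum(1, keepdims=True)           # posterior over experts
    V += 0.5 * X.T @ (h - G) / len(y)                   # gate gradient step

G = np.exp(X @ V); G /= G.sum(1, keepdims=True)
mix = (G * P).sum(1)                                    # dynamically mixed output
print("training accuracy:", ((mix > 0.5) == y).mean())
```

The design point the abstract emphasizes is visible in the last block: unlike plain boosting, where combination weights are fixed after training, the gate recomputes the mixture for every input, so different experts dominate in different regions of the input space.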

Publication types

  • Comparative Study

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Databases as Topic
  • Handwriting
  • Humans
  • Models, Statistical
  • Neural Networks, Computer
  • Pattern Recognition, Automated*
  • Pattern Recognition, Visual