Schedae Informaticae

On the Consistency of Multithreshold Entropy Linear Classifier

Publication date: 11.04.2016

Schedae Informaticae, 2015, Volume 24, pp. 123 - 132

https://doi.org/10.4467/20838476SI.15.012.3034

Authors

Wojciech Marian Czarnecki
Department of Mathematics, Faculty of Mathematics and Computer Science, Jagiellonian University, ul. Łojasiewicza 6, 30-348 Kraków, Poland

Abstract

Multithreshold Entropy Linear Classifier (MELC) is a recently proposed classifier which employs information-theoretic concepts in order to build a multithreshold maximum-margin model. In this paper we analyze its consistency over multithreshold linear models and show that its objective function upper bounds the number of misclassified points, in a manner analogous to the hinge loss in support vector machines. For further confirmation we also conduct numerical experiments on five datasets.
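
As a point of reference for the analogy drawn in the abstract, the following is a minimal sketch of the standard bound used in support vector machines, not a result reproduced from the paper; the corresponding bound for the MELC objective is the subject of the paper itself. The symbols y (binary label in {-1, +1}), f (real-valued decision function), x (input) and n (sample size) are introduced here only for illustration. The hinge loss never falls below the 0-1 misclassification loss, so the empirical hinge risk upper bounds the number of misclassified points:

\mathbf{1}\bigl[\, y\, f(x) \le 0 \,\bigr] \;\le\; \max\bigl(0,\; 1 - y\, f(x)\bigr),
\qquad\text{and hence}\qquad
\sum_{i=1}^{n} \mathbf{1}\bigl[\, y_i f(x_i) \le 0 \,\bigr] \;\le\; \sum_{i=1}^{n} \max\bigl(0,\; 1 - y_i f(x_i)\bigr).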

Information

Information: Schedae Informaticae, 2015, pp. 123 - 132

Article type: Original article

Published on: 11.04.2016

Article status: Open

Licence: None

Percentage share of authors:

Wojciech Marian Czarnecki (Author) - 100%

Article corrections:

-

Publication languages:

English

View count: 2249

Number of downloads: 1143