Cascading classifiers
Kybernetika (1998)
- Volume: 34, Issue: 4, page [369]-374
- ISSN: 0023-5954
Abstract
We propose a multistage recognition method built as a cascade of a linear parametric model and a $k$-nearest neighbor ($k$-NN) nonparametric classifier. The linear model learns a “rule” and the $k$-NN learns the “exceptions” rejected by the “rule.” Because the rule-learner handles a large percentage of the examples using a simple and general rule, only a small subset of the training set is stored as exceptions during training. Similarly, during testing, most patterns are handled by the rule-learner and few are handled by the exception-learner, thus causing only a small increase in memory and computation. A multistage method like cascading is a better approach than a multiexpert method like voting, where all learners are used for all cases; the extra computation and memory for the second learner is unnecessary if we are sufficiently certain that the first one’s response is correct. We discuss how such a system can be trained using cross validation. This method is tested on the real-world application of handwritten digit recognition.
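To make the cascade concrete, the following is a minimal sketch of the idea described in the abstract, not the authors' implementation: it assumes scikit-learn, uses logistic regression as the linear "rule" stage and a $k$-NN as the "exception" stage, and uses a hypothetical confidence threshold (0.9 here) to decide which training examples are stored as exceptions and which test patterns fall through to the second stage. The paper's actual linear model, rejection criterion, and tuning procedure may differ.

# A minimal sketch of cascading, assuming scikit-learn; the class name,
# the 0.9 confidence threshold, and the choice of logistic regression for
# the linear stage are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

class CascadeClassifier:
    def __init__(self, threshold=0.9, k=3):
        self.threshold = threshold   # certainty required to accept the rule's answer
        self.k = k
        self.rule = LogisticRegression(max_iter=1000)  # stage 1: linear "rule" learner
        self.exceptions = None                         # stage 2: k-NN "exception" learner

    def fit(self, X, y):
        # Train the rule on all examples, then store only those it rejects
        # (uncertain or misclassified) as the k-NN's exception set.
        X, y = np.asarray(X), np.asarray(y)
        self.rule.fit(X, y)
        certainty = self.rule.predict_proba(X).max(axis=1)
        accepted = (certainty >= self.threshold) & (self.rule.predict(X) == y)
        rejected = ~accepted
        if rejected.any():
            n_neighbors = min(self.k, int(rejected.sum()))
            self.exceptions = KNeighborsClassifier(n_neighbors=n_neighbors)
            self.exceptions.fit(X[rejected], y[rejected])
        return self

    def predict(self, X):
        # The rule answers when it is sufficiently certain; the few rejected
        # patterns fall through to the exception learner.
        X = np.asarray(X)
        pred = self.rule.predict(X)
        if self.exceptions is None:
            return pred
        uncertain = self.rule.predict_proba(X).max(axis=1) < self.threshold
        if uncertain.any():
            pred[uncertain] = self.exceptions.predict(X[uncertain])
        return pred

Usage would be, for example, CascadeClassifier(threshold=0.9, k=3).fit(X_train, y_train).predict(X_test). In this sketch the threshold controls how many patterns reach the second stage and hence the memory stored as exceptions; the abstract notes that such a system can be tuned using cross validation.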
How to cite
Alpaydin, Ethem, and Kaynak, Cenk. "Cascading classifiers." Kybernetika 34.4 (1998): [369]-374. <http://eudml.org/doc/33363>.
@article{Alpaydin1998,
abstract = {We propose a multistage recognition method built as a cascade of a linear parametric model and a $k$-nearest neighbor ($k$-NN) nonparametric classifier. The linear model learns a “rule” and the $k$-NN learns the “exceptions” rejected by the “rule.” Because the rule-learner handles a large percentage of the examples using a simple and general rule, only a small subset of the training set is stored as exceptions during training. Similarly, during testing, most patterns are handled by the rule-learner and few are handled by the exception-learner, thus causing only a small increase in memory and computation. A multistage method like cascading is a better approach than a multiexpert method like voting, where all learners are used for all cases; the extra computation and memory for the second learner is unnecessary if we are sufficiently certain that the first one’s response is correct. We discuss how such a system can be trained using cross validation. This method is tested on the real-world application of handwritten digit recognition.},
author = {Alpaydin, Ethem and Kaynak, Cenk},
journal = {Kybernetika},
keywords = {multistage recognition method; linear parametric model; cascading},
language = {eng},
number = {4},
pages = {[369]-374},
publisher = {Institute of Information Theory and Automation AS CR},
title = {Cascading classifiers},
url = {http://eudml.org/doc/33363},
volume = {34},
year = {1998},
}
TY - JOUR
AU - Alpaydin, Ethem
AU - Kaynak, Cenk
TI - Cascading classifiers
JO - Kybernetika
PY - 1998
PB - Institute of Information Theory and Automation AS CR
VL - 34
IS - 4
SP - [369]
EP - 374
AB - We propose a multistage recognition method built as a cascade of a linear parametric model and a $k$-nearest neighbor ($k$-NN) nonparametric classifier. The linear model learns a “rule” and the $k$-NN learns the “exceptions” rejected by the “rule.” Because the rule-learner handles a large percentage of the examples using a simple and general rule, only a small subset of the training set is stored as exceptions during training. Similarly, during testing, most patterns are handled by the rule-learner and few are handled by the exception-learner, thus causing only a small increase in memory and computation. A multistage method like cascading is a better approach than a multiexpert method like voting, where all learners are used for all cases; the extra computation and memory for the second learner is unnecessary if we are sufficiently certain that the first one’s response is correct. We discuss how such a system can be trained using cross validation. This method is tested on the real-world application of handwritten digit recognition.
LA - eng
KW - multistage recognition method
KW - linear parametric model
KW - cascading
UR - http://eudml.org/doc/33363
ER -
References
- Alpaydın E.: REx: Learning a Rule and Exceptions. Technical Report TR-97-040, International Computer Science Institute, Berkeley 1997
- Alpaydın E., Gürgen F.: Neural Computing Appl. 3 (1995), 38–49. DOI 10.1007/BF01414175
- Bishop C. M.: Neural Networks for Pattern Recognition. Oxford University Press, Oxford 1995. Zbl 0868.68096, MR 1385195
- Garris M. D., Blue J. L., Candela G. T., Dimmick D. L., Geist J., Grother P. J., Janet S. A., Wilson C. L.: NIST Form-Based Handprint Recognition System. NISTIR 5469, 199
- Pudil P., Novovičová J., Bláha S., Kittler J.: Multistage pattern recognition with reject option. In: 11th IAPR International Conference on Pattern Recognition, 1992, vol. II, pp. 92–95
- Xu L., Krzyżak A., Suen C. Y.: IEEE Trans. Systems Man Cybernet. 22 (1992), 418–435. DOI 10.1109/21.155943