SELF-ORGANIZING NEURAL GROVE: EFFICIENT NEURAL NETWORK ENSEMBLES USING PRUNED SELF-GENERATING NEURAL TREES
DOI:
https://doi.org/10.47839/ijc.12.3.601

Keywords:
Neural network ensembles, self-organization, improving generalization capability, bagging, boosting.

Abstract
Recently, multiple classifier systems have been used in practical applications to improve classification accuracy. Self-generating neural networks are among the most suitable base classifiers for multiple classifier systems because of their simple parameter settings and fast learning ability. However, the computation cost of a multiple classifier system based on self-generating neural networks increases in proportion to the number of self-generating neural networks. In this paper, we propose a novel pruning method for efficient classification, and we call the resulting model a self-organizing neural grove. Experiments have been conducted to compare the self-organizing neural grove with bagging, the self-organizing neural grove with boosting, and a support vector machine. The results show that the self-organizing neural grove improves classification accuracy while reducing the computation cost.
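For readers who want a concrete picture of the accuracy-versus-cost trade-off the abstract describes, the following is a minimal sketch in Python. It is not the paper's algorithm: scikit-learn decision trees stand in for self-generating neural trees, and the pruning rule (greedily drop any ensemble member whose removal does not lower held-out accuracy) is an illustrative assumption, not the SONG pruning criterion.

```python
# Sketch: a bagged ensemble pruned by greedy backward elimination.
# DecisionTreeClassifier is a stand-in for self-generating neural trees;
# the pruning criterion below is an assumption for illustration only.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def majority_vote(members, X):
    """Predict by majority vote over the ensemble members."""
    votes = np.stack([m.predict(X) for m in members])  # shape (k, n)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Bagging: each member is trained on a bootstrap resample of the training set.
rng = np.random.default_rng(0)
members = []
for i in range(25):
    idx = rng.integers(0, len(X_tr), len(X_tr))
    members.append(DecisionTreeClassifier(random_state=i).fit(X_tr[idx], y_tr[idx]))

def acc(ms):
    """Validation accuracy of the majority-vote ensemble."""
    return (majority_vote(ms, X_val) == y_val).mean()

# Greedy backward pruning: repeatedly drop a member whenever doing so does
# not lower validation accuracy, shrinking the ensemble and its cost.
pruned = list(members)
improved = True
while improved and len(pruned) > 1:
    improved = False
    base = acc(pruned)
    for i in range(len(pruned)):
        trial = pruned[:i] + pruned[i + 1:]
        if acc(trial) >= base:
            pruned = trial
            improved = True
            break

print(f"full ensemble: {len(members)} members, acc={acc(members):.3f}")
print(f"pruned:        {len(pruned)} members, acc={acc(pruned):.3f}")
```

Greedy backward elimination is one of the simplest pruning strategies; the trade-off it illustrates, fewer members at comparable vote accuracy, is the same one the self-organizing neural grove targets.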
References
J. Han and M. Kamber, Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, San Francisco, CA, 2000.
L. Breiman, Bagging predictors, Machine Learning, (24) 2 (1996), pp. 123-140.
R. E. Schapire, The strength of weak learnability, Machine Learning, (5) 2 (1990), pp. 197-227.
J. R. Quinlan, Bagging, Boosting, and C4.5, the Thirteenth National Conference on Artificial Intelligence, Portland, OR, (August 4-8, 1996), pp. 725-730.
G. Rätsch, T. Onoda, K.-R. Müller, Soft margins for AdaBoost, Machine Learning, (42) 3 (2001), pp. 287-320.
S. Haykin, Neural Networks: A Comprehensive Foundation, second ed., Prentice-Hall, Upper Saddle River, NJ, 1999.
R. O. Duda, P. E. Hart, D. G. Stork, Pattern Classification, second ed., John Wiley & Sons Inc., New York, 2000.
W. X. Wen, A. Jennings, H. Liu, Learning a neural tree, the International Joint Conference on Neural Networks, Beijing, China, (November 3-6, 1992), Vol. 2, pp. 751-756.
T. Kohonen, Self-Organizing Maps, Springer, Berlin, 1995.
H. Inoue, H. Narihisa, Improving generalization ability of self-generating neural networks through ensemble averaging, in: T. Terano, H. Liu, A.L.P. Chen (Eds.), PAKDD 2000, Lecture Notes in Computer Science, Vol. 1805, Springer, Heidelberg, 2000, pp. 177-180.
H. Inoue, H. Narihisa, Optimizing a multiple classifier system, in: M. Ishizuka, A. Sattar (Eds.), PRICAI 2002, Lecture Notes in Computer Science (Lecture Notes in Artificial Intelligence), Vol. 2417, Springer, Heidelberg, 2002, pp. 285-294.
H. Inoue, K. Sugiyama, Self-organizing neural grove, the 7th International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications, Berlin, Germany, (September 12-14, 2013), pp. 319-323.
M. Stone, Cross-validation: A review, Mathematische Operationsforschung und Statistik, Series Statistics, (9) 1 (1978), pp. 127-139.
R. E. Schapire and Y. Freund, Boosting: Foundations and Algorithms, MIT Press, Cambridge, MA, 2012.
C.-C. Chang, C.-J. Lin, LIBSVM: A library for support vector machines, ACM Transactions on Intelligent Systems and Technology, (2) 3 (2011), pp. 27:1-27:27, software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
C. Blake, C. Merz, UCI repository of machine learning databases, 1998. [Online]. Available: http://www.ics.uci.edu/~mlearn/MLRepository.html