LEARNING AND UNDERSTANDING BASED ON NEURAL NETWORK TREES
DOI:
https://doi.org/10.47839/ijc.3.1.257
Keywords:
Neural networks, decision trees, neural network trees, pattern recognition, machine learning and understanding, neural network interpretation, incremental learning
Abstract
Models for machine learning can be categorized roughly into two groups: symbolic and non-symbolic. Generally speaking, symbolic-model-based learning can provide understandable results, but cannot adapt to changing environments efficiently. On the other hand, non-symbolic-model-based learning can adapt to changing environments, but the results are usually "black boxes". In our study, we introduced a hybrid model called the neural network tree (NNTree). An NNTree is a decision tree (DT) in which each non-terminal node contains an expert neural network (ENN). Results obtained so far show that an NNTree can be re-trained incrementally using new data. In addition, an NNTree can be interpreted easily if we restrict the number of inputs to each ENN. Thus, it is possible to perform recognition, learning, and understanding with the NNTree model alone.
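To make the structure concrete, the following is a minimal Python sketch of the NNTree idea as described in the abstract: a binary tree whose non-terminal nodes each hold a small expert neural network restricted to a few input features, with recognition performed by descending from the root and letting each node's ENN route the sample to a child. All names here (NNTreeNode, enn_output, predict) and the untrained random weights are illustrative assumptions, not the authors' implementation; the training and re-training procedures from the paper are omitted.

import numpy as np

class NNTreeNode:
    """One node of a binary NNTree. Non-terminal nodes carry a tiny
    one-hidden-layer ENN; leaves carry a class label instead."""
    def __init__(self, feature_idx, hidden=3, rng=None):
        # Restricting each ENN to a small feature subset is what the
        # abstract identifies as the key to easy interpretation.
        rng = rng or np.random.default_rng(0)
        self.feature_idx = list(feature_idx)
        n_in = len(self.feature_idx)
        self.W1 = rng.normal(scale=0.5, size=(n_in, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(scale=0.5, size=(hidden, 1))
        self.b2 = np.zeros(1)
        self.left = None    # child taken when the ENN outputs < 0.5
        self.right = None   # child taken otherwise
        self.label = None   # set only on leaf nodes

    def enn_output(self, x):
        # Small MLP over the restricted feature subset, squashed to (0, 1).
        z = np.tanh(x[self.feature_idx] @ self.W1 + self.b1)
        return (1.0 / (1.0 + np.exp(-(z @ self.W2 + self.b2))))[0]

def predict(node, x):
    # Recognition: descend until a labelled leaf is reached, letting each
    # node's ENN decide which child receives the sample.
    while node.label is None:
        node = node.left if node.enn_output(x) < 0.5 else node.right
    return node.label

# Usage: a depth-1 tree with one routing ENN and two labelled leaves.
root = NNTreeNode(feature_idx=[0, 1])
leaf0 = NNTreeNode(feature_idx=[0]); leaf0.label = 0
leaf1 = NNTreeNode(feature_idx=[0]); leaf1.label = 1
root.left, root.right = leaf0, leaf1
print(predict(root, np.array([0.2, 0.7])))  # 0 or 1, per the ENN's routing

Because each ENN sees only a handful of inputs, every internal node can in principle be read off as a simple rule over those features, which is the interpretability property the abstract emphasizes.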