NEUROCOMPUTER BASED COMPLEXITY ESTIMATOR OPTIMIZING A HYBRID MULTI NEURAL NETWORK STRUCTURE
DOI: https://doi.org/10.47839/ijc.7.3.533

Keywords: ZISC© IBM® Neurocomputer, T-DTS, Hybrid Multiple Neural Networks, Self-Organizing Map – Linear Support Vector Machine Decision Tree, RBF Algorithm

Abstract
This paper presents a ZISC© IBM® neurocomputer based approach for estimating task complexity within the T-DTS framework. T-DTS (Tree-like Divide To Simplify) is a Hybrid Multiple Neural Networks software platform which constructs a neural tree structure for a complex problem following the "divide and conquer" paradigm. Complexity estimator modules are the core of this framework; one of them is the ZISC© IBM® complexity estimator, which has recently been integrated into T-DTS. The overall aim of this research work is to increase T-DTS performance in terms of generalization and learning abilities. In this paper we demonstrate the effect of the ZISC© IBM® neurocomputer based complexity estimator on database decomposition and on the search for an optimal adjustment of the T-DTS complexity threshold.
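To make the "divide and conquer" loop described above concrete, the short Python sketch below shows how a complexity threshold can drive a recursive decomposition of a classification database into a tree of simpler local models. All names and components here are hypothetical stand-ins (a nearest-neighbour overlap proxy instead of the ZISC© IBM® hardware estimator, k-means instead of the SOM decomposition unit, a linear SVM as the local processing model); it is an illustration of the paradigm, not the authors' T-DTS implementation.

    import numpy as np
    from sklearn.cluster import KMeans   # stand-in for the SOM decomposition unit
    from sklearn.svm import LinearSVC    # stand-in for a local processing model

    def estimate_complexity(X, y):
        """Crude class-overlap proxy: fraction of points whose nearest
        neighbour carries a different class label. In T-DTS this role is
        played by a dedicated module such as the ZISC(R) based estimator."""
        d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)               # ignore self-distances
        return float(np.mean(y[np.argmin(d, axis=1)] != y))

    def build_tree(X, y, threshold, depth=0, max_depth=4):
        """Recursively split the data while its estimated complexity exceeds
        the threshold; otherwise fit a simple local model at the leaf."""
        single_class = len(np.unique(y)) < 2
        if depth >= max_depth or single_class or estimate_complexity(X, y) <= threshold:
            model = None if single_class else LinearSVC().fit(X, y)
            return {"leaf": True, "model": model,
                    "label": y[0] if single_class else None}
        clusters = KMeans(n_clusters=2, n_init=10).fit_predict(X)   # "divide" step
        return {"leaf": False,
                "children": [build_tree(X[clusters == k], y[clusters == k],
                                        threshold, depth + 1, max_depth)
                             for k in range(2) if np.any(clusters == k)]}

In such a scheme, lowering the threshold forces deeper decomposition into smaller, simpler subproblems, while raising it keeps larger subsets under a single local model; finding a suitable threshold value is the adjustment problem the abstract refers to.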