ESTIMATING COMPLEXITY OF CLASSIFICATION TASKS USING NEUROCOMPUTERS TECHNOLOGY

Authors

  • Ivan Budnyk
  • Abdennasser Chebira
  • Kurosh Madani

DOI:

https://doi.org/10.47839/ijc.8.1.655

Keywords:

IBM® Zero Instruction Set Computer (ZISC-036®) neurocomputer, neural tree modular architecture, T-DTS, DNA (deoxyribonucleic acid), RNA (ribonucleic acid), exon, intron, splice junctions problem, tic-tac-toe endgame problem.

Abstract

This paper presents an alternative approach for estimating the complexity of classification tasks. Constructing a self-organizing neural tree structure that follows the "divide and rule" paradigm requires knowledge of task complexity. Our aim is to determine a complexity indicator function and to characterize its main properties. The new approach uses the IBM® Zero Instruction Set Computer (ZISC-036®) and is applied to a range of different classification tasks.
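The abstract describes building a self-organizing neural tree by recursively splitting a task whenever its estimated complexity is too high. The sketch below illustrates that control flow only; the complexity proxy (fraction of samples whose nearest neighbour carries a different class label) and the median split are illustrative assumptions, not the paper's ZISC-036-based indicator or the actual T-DTS decomposition.

```python
import random

def complexity(samples):
    """Crude complexity proxy (assumption, not the paper's ZISC-based
    indicator): fraction of samples whose nearest neighbour in feature
    space carries a different class label."""
    if len(samples) < 2:
        return 0.0
    disagreements = 0
    for i, (x, y) in enumerate(samples):
        nearest = min(
            (s for j, s in enumerate(samples) if j != i),
            key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], x)),
        )
        if nearest[1] != y:
            disagreements += 1
    return disagreements / len(samples)

def build_tree(samples, threshold=0.1, depth=0, max_depth=4):
    """Recursively decompose the task ("divide and rule") until each
    subtask's estimated complexity falls below the threshold."""
    c = complexity(samples)
    if c <= threshold or depth >= max_depth or len(samples) < 4:
        return {"leaf": True, "size": len(samples), "complexity": c}
    # Split on the median of the first feature -- a simple stand-in for
    # the self-organizing decomposition used in T-DTS.
    samples = sorted(samples, key=lambda s: s[0][0])
    mid = len(samples) // 2
    return {
        "leaf": False,
        "complexity": c,
        "left": build_tree(samples[:mid], threshold, depth + 1, max_depth),
        "right": build_tree(samples[mid:], threshold, depth + 1, max_depth),
    }

random.seed(0)
# Two noisy Gaussian classes separated along the first feature axis.
data = [((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(30)]
data += [((random.gauss(3, 1), random.gauss(0, 1)), 1) for _ in range(30)]
tree = build_tree(data)
print(tree["complexity"])
```

The key property the paper attributes to a complexity indicator is reflected here: the harder the subtask (more nearest-neighbour label disagreement), the deeper the tree decomposes it, while simple subtasks terminate as leaves.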


Published

2014-08-01

How to Cite

Budnyk, I., Chebira, A., & Madani, K. (2014). ESTIMATING COMPLEXITY OF CLASSIFICATION TASKS USING NEUROCOMPUTERS TECHNOLOGY. International Journal of Computing, 8(1), 43-52. https://doi.org/10.47839/ijc.8.1.655

Issue

Section

Articles