THE INSTANCE SELECTION METHOD FOR NEURO-FUZZY MODEL SYNTHESIS
DOI:
https://doi.org/10.47839/ijc.13.3.630

Keywords:
instance, neuro-fuzzy network, sample selection, data dimensionality reduction.

Abstract
The problem of automating the synthesis of neuro-fuzzy models from an instance set is addressed. A method of instance selection for neuro-fuzzy model synthesis is proposed. It reduces the sample size and thereby lowers the computational resource requirements. The method also transforms the original multi-dimensional coordinate set onto a one-dimensional axis, which is then discretized to improve the generalization properties of the data. Software implementing the proposed method has been developed, and experiments were conducted to study the method on a real-world problem. The experimental results allow the proposed method to be recommended for practical use.
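To make the general idea of the abstract concrete, the following is a minimal illustrative sketch in Python: multi-dimensional instances are mapped onto a one-dimensional axis, that axis is discretized into intervals, and one representative instance is kept per interval. The projection used here (mean of min-max normalized features), the equal-width binning, and the function name select_instances are assumptions made for illustration only; they are not the transformation or selection rule defined in the paper.

```python
import numpy as np


def select_instances(X, y, n_bins=20):
    """Illustrative instance selection: project instances onto a single
    axis, discretize the axis, and keep one representative per interval.

    The projection (mean of min-max normalized features) is a stand-in
    assumption, not the transformation proposed in the paper.
    """
    X = np.asarray(X, dtype=float)

    # Min-max normalize each feature to [0, 1].
    mins, maxs = X.min(axis=0), X.max(axis=0)
    ranges = np.where(maxs > mins, maxs - mins, 1.0)
    Xn = (X - mins) / ranges

    # Map each multi-dimensional instance to a one-dimensional coordinate.
    axis = Xn.mean(axis=1)

    # Discretize the one-dimensional axis into equal-width intervals.
    edges = np.linspace(axis.min(), axis.max(), n_bins + 1)
    bins = np.clip(np.digitize(axis, edges[1:-1]), 0, n_bins - 1)

    # Keep the instance closest to each interval's center as its representative.
    selected = []
    for b in range(n_bins):
        idx = np.where(bins == b)[0]
        if idx.size == 0:
            continue
        center = 0.5 * (edges[b] + edges[b + 1])
        selected.append(idx[np.argmin(np.abs(axis[idx] - center))])

    selected = np.array(selected)
    return X[selected], np.asarray(y)[selected], selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))          # synthetic data for demonstration
    y = rng.integers(0, 2, size=500)
    X_sel, y_sel, idx = select_instances(X, y, n_bins=25)
    print(f"reduced {len(X)} instances to {len(X_sel)}")
```

In this sketch the reduced subset would then be used to train the neuro-fuzzy model in place of the full sample; the choice of the number of intervals controls the trade-off between sample reduction and how finely the one-dimensional axis is covered.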