

Yuri Kornienko, Arkady Borisov


The paper reports experiments with standard Machine Learning algorithms (ID3, C4.5, Bagged-C4.5, Boosted-C4.5 and Naive Bayes) and with a new algorithm that combines genetic algorithms (GA) with ID3. The new algorithm is implemented as an extension of Stanford University's MLC++ library. Its behaviour is tested on 24 databases, including databases with a large number of attributes. It is shown that, by avoiding the local optima inherent in the greedy "hill-climbing" search of standard tree induction, the GA-based algorithm produces classifiers with significantly better characteristics. The behaviour of the algorithm is also examined when constructing pruned classifiers, and ways to improve standard Machine Learning algorithms are suggested.
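The combination described above, a GA searching over inputs to greedy ID3 induction, can be sketched in outline. The abstract does not specify the chromosome encoding used in the paper, so this sketch assumes, purely for illustration, that each chromosome is a bit-mask selecting the attribute subset handed to ID3 and that fitness is the accuracy of the resulting tree; all function names (`id3`, `ga_id3`, and the helpers) are hypothetical.

```python
import random
from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy of a list of class labels
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def id3(rows, labels, attrs):
    # Greedy ID3: return a class label at a pure/exhausted node,
    # otherwise split on the attribute with maximal information gain.
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    base = entropy(labels)
    def gain(a):
        g = base
        for v in set(r[a] for r in rows):
            sub = [l for r, l in zip(rows, labels) if r[a] == v]
            g -= len(sub) / len(labels) * entropy(sub)
        return g
    best = max(attrs, key=gain)
    branches = {}
    for v in set(r[best] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[best] == v]
        branches[v] = id3([rows[i] for i in idx], [labels[i] for i in idx],
                          [a for a in attrs if a != best])
    # (split attribute, branches, majority default for unseen values)
    return (best, branches, Counter(labels).most_common(1)[0][0])

def predict(tree, row):
    while isinstance(tree, tuple):
        a, branches, default = tree
        tree = branches.get(row[a], default)
    return tree

def accuracy(tree, rows, labels):
    return sum(predict(tree, r) == l for r, l in zip(rows, labels)) / len(labels)

def ga_id3(rows, labels, n_attrs, pop=12, gens=15, seed=0):
    # GA over attribute bit-masks; fitness = accuracy of the ID3 tree
    # built from the selected attributes (an illustrative encoding only).
    rng = random.Random(seed)
    def fitness(mask):
        attrs = [i for i in range(n_attrs) if mask[i]]
        return accuracy(id3(rows, labels, attrs), rows, labels) if attrs else 0.0
    population = [[rng.randint(0, 1) for _ in range(n_attrs)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        next_gen = population[:2]                    # elitism: keep the two best
        while len(next_gen) < pop:
            p1, p2 = rng.sample(population[:6], 2)   # select among the fittest
            cut = rng.randrange(1, n_attrs)
            child = p1[:cut] + p2[cut:]              # one-point crossover
            if rng.random() < 0.2:                   # point mutation
                i = rng.randrange(n_attrs)
                child[i] ^= 1
            next_gen.append(child)
        population = next_gen
    best = max(population, key=fitness)
    return [i for i in range(n_attrs) if best[i]], fitness(best)
```

Because the GA evaluates whole attribute subsets rather than committing to one greedy split at a time, it can escape the local optima that pure hill-climbing induction gets stuck in; this is the property the abstract attributes to the combined algorithm.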


Keywords: genetic algorithm; ID3; tree generation; "hill-climbing"; ensembles of classifiers





