IJCTE 2010 Vol.2(6): 931-936 ISSN: 1793-8201
DOI: 10.7763/IJCTE.2010.V2.266

Comprehensive Evolution and Evaluation of Boosting

Amit P Ganatra*, Yogesh P Kosta**

Abstract—Ensembles of classifiers are an active research topic. An ensemble is obtained by generating base classifiers with other machine learning methods and combining them, with the goal of achieving higher predictive accuracy than any of the base classifiers alone. One of the most popular methods for creating ensembles is boosting, a family of methods of which AdaBoost is the most prominent member. Boosting is a well-established general approach in the machine learning community for improving the performance of any learning algorithm: it produces a very accurate prediction rule by combining the rough, moderately inaccurate rules-of-thumb produced by a weak learner. Like bagging and other committee-based methods, boosting combines many weak classifiers into a powerful committee; the weak classifiers are applied sequentially to modified versions of the data, and their predictions are combined to produce the final classifier. This paper presents a comprehensive account of the evolution of boosting and an evaluation of boosting against bagging on various criteria (parameters).
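The scheme the abstract describes — sequentially fitting weak classifiers to re-weighted versions of the data and combining their predictions by a weighted vote — can be illustrated with a minimal AdaBoost sketch. The decision-stump weak learner, the toy 1-D dataset, and all names below are illustrative assumptions, not taken from the paper:

```python
import math

def stump_predict(x, thr, polarity):
    """Decision stump: predict `polarity` when x < thr, else -polarity."""
    return polarity if x < thr else -polarity

def train_adaboost(xs, ys, rounds=5):
    """AdaBoost with 1-D decision stumps on labels in {-1, +1}."""
    n = len(xs)
    w = [1.0 / n] * n                     # start with uniform example weights
    ensemble = []                         # list of (alpha, thr, polarity)
    thresholds = [x + 0.5 for x in xs]    # candidate split points
    for _ in range(rounds):
        # pick the stump with the lowest weighted training error
        best = None
        for thr in thresholds:
            for polarity in (1, -1):
                err = sum(wi for wi, x, y in zip(w, xs, ys)
                          if stump_predict(x, thr, polarity) != y)
                if best is None or err < best[0]:
                    best = (err, thr, polarity)
        err, thr, polarity = best
        err = min(max(err, 1e-10), 1 - 1e-10)    # avoid log(0) at the extremes
        alpha = 0.5 * math.log((1 - err) / err)  # vote weight of this stump
        ensemble.append((alpha, thr, polarity))
        # re-weight: examples this stump got wrong gain weight for the next round
        w = [wi * math.exp(-alpha * y * stump_predict(x, thr, polarity))
             for wi, x, y in zip(w, xs, ys)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of all stumps in the committee."""
    score = sum(a * stump_predict(x, thr, p) for a, thr, p in ensemble)
    return 1 if score >= 0 else -1

xs = [1, 2, 3, 4, 5, 6]
ys = [1, 1, 1, -1, -1, -1]
model = train_adaboost(xs, ys)
print([predict(model, x) for x in xs])
```

The re-weighting step is the essential difference from bagging: bagging trains each committee member on an independent bootstrap sample, whereas boosting concentrates each successive weak learner on the examples the previous ones misclassified.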

Index Terms—Ensemble, machine learning, predictive accuracy, classifiers, boosting

* Associate Professor & Head, CE-IT, CITC (amitganu@yahoo.com)
** Dean, Faculty of Technology & Engineering (ypkosta@yahoo.com), IEEE Member, SCPM (Stanford University), Charotar University of Science and Technology (CHARUSAT), Education Campus, Changa – 388421, Ta. Petlad, Dist. Anand, Gujarat, India


Cite: Amit P Ganatra and Yogesh P Kosta, "Comprehensive Evolution and Evaluation of Boosting," International Journal of Computer Theory and Engineering, vol. 2, no. 6, pp. 931-936, 2010.

Copyright © 2008-2015. International Journal of Computer Theory and Engineering. All rights reserved.
E-mail: ijcte@vip.163.com