General Information
    • ISSN: 1793-8201 (Print), 2972-4511 (Online)
    • Abbreviated Title: Int. J. Comput. Theory Eng.
    • Frequency: Quarterly
    • DOI: 10.7763/IJCTE
    • Editor-in-Chief: Prof. Mehmet Sahinoglu
    • Associate Editor-in-Chief: Assoc. Prof. Alberto Arteta, Assoc. Prof. Engin Maşazade
    • Managing Editor: Ms. Mia Hu
    • Abstracting/Indexing: Scopus (since 2022), INSPEC (IET), CNKI, Google Scholar, EBSCO, etc.
    • Average Days from Submission to Acceptance: 192 days
    • E-mail: ijcte@iacsitp.com

Editor-in-Chief
Prof. Mehmet Sahinoglu
Computer Science Department, Troy University, USA
I am happy to take on the position of Editor-in-Chief of IJCTE. We encourage authors to submit papers concerning any branch of computer theory and engineering.

IJCTE 2010 Vol. 2(6): 931-936 ISSN: 1793-8201
DOI: 10.7763/IJCTE.2010.V2.266

Comprehensive Evolution and Evaluation of Boosting

Amit P Ganatra* and Yogesh P Kosta**

Abstract—Ensembles of classifiers are currently an active research topic. An ensemble is obtained by generating base classifiers with other machine learning methods and combining their outputs, with the goal of achieving higher predictive accuracy than any of the base classifiers alone. One of the most popular methods for creating ensembles is boosting, a family of methods of which AdaBoost is the most prominent member. Boosting is a general, well-established approach in the machine learning community for improving the performance of any learning algorithm: it combines the weak classifiers produced by a weak learner into a single strong classifier, producing a very accurate prediction rule from rough, moderately inaccurate rules of thumb. Like Bagging and other committee-based methods, boosting combines many weak classifiers into a powerful committee; the weak classifiers are applied sequentially to reweighted versions of the data, and their predictions are then combined into a powerful classifier. This paper traces the evolution of boosting in detail and evaluates boosting against Bagging on various criteria (parameters).

Index Terms—Ensemble, machine learning, predictive accuracy, classifiers, boosting

*Associate Professor & Head, CE-IT, CITC (amitganu@yahoo.com)
**Dean, Faculty of Technology & Engineering, IEEE Member, SCPM (Stanford University), Charotar University of Science and Technology (CHARUSAT), Education Campus, Changa – 388421, Ta – Petlad, Dist – Anand, Gujarat, India (ypkosta@yahoo.com)


Cite: Amit P Ganatra and Yogesh P Kosta, "Comprehensive Evolution and Evaluation of Boosting," International Journal of Computer Theory and Engineering, vol. 2, no. 6, pp. 931-936, 2010.


Copyright © 2008-2024. International Association of Computer Science and Information Technology. All rights reserved.