Impact of Gradient Ascent and Boosting Algorithm in Classification
Basha S.M., Vandhan V.
Published in The Intelligent Networks and Systems Society
2018
Volume: 11
Issue: 1
Pages: 41-49
Abstract
Boosting is a method for improving the accuracy of a learning algorithm, which often suffers from overfitting caused by inappropriate coefficients assigned to the data points. The objective of our research is to train on the data so that the weighted error of the linear classifier goes to zero and sentiments are classified accurately. In this paper, a gradient ascent approach is used to minimize the weighted error of the sentiment classifier by estimating proper coefficients for the data points in the training dataset. Compared to previous studies on designing a strong classifier, our research is novel in the following areas: maximum-likelihood estimation for logistic regression using gradient ascent, and using the metric weights to understand the behavior of the AdaBoost algorithm in classifying sentiments. In our findings, the first decision stump has a training error of 30.44%. After a thousand iterations, we observed a smooth transition in which the classification error falls to 8.22% and then stays at that value. We conclude that the boosting algorithm outperforms Random Forests, with a lower mean squared test error.
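The two ingredients the abstract combines can be sketched briefly. The following is a minimal, illustrative Python sketch (not the paper's implementation): gradient ascent on the logistic-regression log-likelihood over a tiny made-up two-feature dataset, plus the standard AdaBoost coefficient computed from a weak learner's weighted error. All data and names here are assumptions for illustration.

```python
import math

# Toy, linearly separable training set: (x1, x2) features, labels in {-1, +1}.
# Purely illustrative; not the sentiment data used in the paper.
X = [(1.0, 2.0), (2.0, 1.0), (-1.0, -1.5), (-2.0, -0.5)]
y = [1, 1, -1, -1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradient_ascent(X, y, lr=0.1, iters=1000):
    """Maximise the logistic log-likelihood by gradient ascent.

    Labels are mapped from {-1,+1} to {0,1}; w[0] is a bias term.
    The gradient of the log-likelihood w.r.t. w_j is sum_i (t_i - p_i) x_ij.
    """
    w = [0.0, 0.0, 0.0]
    y01 = [(yi + 1) // 2 for yi in y]            # {-1,+1} -> {0,1}
    for _ in range(iters):
        grads = [0.0, 0.0, 0.0]
        for (x1, x2), t in zip(X, y01):
            p = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
            for j, xj in enumerate((1.0, x1, x2)):  # bias feature prepended
                grads[j] += (t - p) * xj
        for j in range(3):
            w[j] += lr * grads[j]                # ascend, since we maximise
    return w

def stump_alpha(weighted_error):
    """AdaBoost weight for a weak learner with the given weighted error:
    alpha = 0.5 * ln((1 - err) / err). A stump with error 0.5 gets alpha 0."""
    return 0.5 * math.log((1.0 - weighted_error) / weighted_error)

w = gradient_ascent(X, y)
preds = [1 if sigmoid(w[0] + w[1] * x1 + w[2] * x2) >= 0.5 else -1
         for (x1, x2) in X]
```

On this separable toy set the learned classifier recovers the training labels exactly; `stump_alpha` shows how a decision stump's weighted error (e.g. the 30.44% reported for the first stump) translates into its vote in the boosted ensemble.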
About the journal
Journal: International Journal of Intelligent Engineering and Systems
Publisher: The Intelligent Networks and Systems Society
ISSN: 2185-310X
Open Access: No