
Adaboost

Posted: 2017-05-20 14:26:06


Boosting is a very powerful ensembling technique. It achieves its outstanding performance by combining some or many weak classifiers into a strong one. Like bagging, it judges a sample's category by voting, but there is a significant difference: the base classifiers that make up the strong classifier usually have different 'voting rights'. The most widely used form of boosting is adaptive boosting (AdaBoost), which is the one we will talk about in detail.

There is a key difference between bagging and AdaBoost in training: the base classifiers must be trained in sequence, because the performance of the previous classifiers determines both the sample weights seen by the following classifier and that classifier's 'voting right'.

 

Points that are misclassified are assigned larger weights and correctly classified points smaller weights, so that in the following base classifier the 'error' points attract more 'consideration'. This process is repeated; once all the base classifiers have been trained, the final prediction combines all the classifiers' votes and assigns each point to the category with the largest total weight.
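As a toy illustration of this reweighting step (the numbers are invented, and the specific multiplicative update on misclassified points is one common convention, matching the procedure given below, not something fixed by the text above):

```python
import numpy as np

# Five training points, uniform initial weights.
w = np.full(5, 0.2)

# Suppose the current base classifier misclassifies points 1 and 4.
miss = np.array([False, True, False, False, True])

# Weighted error rate and the classifier's 'voting right' alpha.
eps = w[miss].sum() / w.sum()          # 0.4
alpha = np.log((1 - eps) / eps)        # ln(1.5), about 0.405

# Misclassified points gain weight; correctly classified points keep theirs.
w[miss] *= np.exp(alpha)               # weights become [0.2, 0.3, 0.2, 0.2, 0.3]
```

Note that a classifier with a lower weighted error rate gets a larger \(\alpha\), i.e. a bigger 'voting right', and its mistakes are amplified more strongly for the next round.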

Consider a two-class classification problem, in which the training data comprise the vectors \(x_1, x_2, ..., x_N\) along with corresponding binary target variables \(t_1, t_2, ..., t_N\), where \(t_n \in \{-1,1\}\). We have a procedure available for training a base classifier on weighted data to give a function \(y(x) \in \{-1,1\}\). Each data point is given an associated weighting parameter \(w_n\), which is initially set to \(1/N\) for all data points.

AdaBoost Process:

1. Initialize the data weighting coefficients \(\{w_n\}\) by setting \(w_n^{(1)} = 1/N\) for \(n = 1,2,...,N\).

2. For m = 1,2,...,M:

  (a): Fit a classifier \(y_m(x)\) to the training data by minimizing the weighted error function

      \(J_m = \sum_{n=1}^{N} w_n^{(m)} I(y_m(x_n) \neq t_n)\)

      where \(I(\cdot)\) is the indicator function, equal to 1 when its argument is true and 0 otherwise.

  (b): Evaluate the weighted error rate and the classifier's 'voting right':

      \(\epsilon_m = \frac{\sum_{n=1}^{N} w_n^{(m)} I(y_m(x_n) \neq t_n)}{\sum_{n=1}^{N} w_n^{(m)}}, \qquad \alpha_m = \ln\frac{1-\epsilon_m}{\epsilon_m}\)

  (c): Update the data weighting coefficients:

      \(w_n^{(m+1)} = w_n^{(m)} \exp\{\alpha_m I(y_m(x_n) \neq t_n)\}\)

3. Make predictions using the final combined model:

      \(Y_M(x) = \text{sign}\left(\sum_{m=1}^{M} \alpha_m y_m(x)\right)\)
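The whole procedure can be sketched end-to-end in Python. This is a minimal from-scratch version that assumes decision stumps (one-feature threshold classifiers) as the base learners; the text above does not fix a particular base classifier, so that choice is an illustrative assumption.

```python
import numpy as np

def stump_fit(X, t, w):
    """Base learner: exhaustively pick the (feature, threshold, sign)
    decision stump with the smallest weighted training error."""
    best = (np.inf, 0, 0.0, 1)
    for d in range(X.shape[1]):
        for thr in np.unique(X[:, d]):
            for sign in (1, -1):
                pred = np.where(X[:, d] <= thr, sign, -sign)
                err = w[pred != t].sum()
                if err < best[0]:
                    best = (err, d, thr, sign)
    return best[1:]                                # (feature, threshold, sign)

def stump_predict(stump, X):
    d, thr, sign = stump
    return np.where(X[:, d] <= thr, sign, -sign)

def adaboost_train(X, t, M):
    N = len(t)
    w = np.full(N, 1.0 / N)                        # step 1: uniform weights
    ensemble = []
    for _ in range(M):                             # step 2: m = 1,...,M
        stump = stump_fit(X, t, w)                 # (a) fit weighted base classifier
        miss = stump_predict(stump, X) != t
        eps = w[miss].sum() / w.sum()              # (b) weighted error rate
        alpha = np.log((1 - eps) / max(eps, 1e-12))  # 'voting right' alpha_m
        w = w * np.exp(alpha * miss)               # (c) upweight misclassified points
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    # step 3: sign of the alpha-weighted vote of all base classifiers
    score = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.sign(score)
```

On a classic 1-D toy set such as \(x = 0,...,9\) with targets \(1,1,1,-1,-1,-1,1,1,1,-1\), no single stump is sufficient, but three boosted stumps reach zero training error, illustrating how weak learners combine into a strong one.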



Original article: http://www.cnblogs.com/vpegasus/p/6882012.html
