
Machine Learning Techniques -6-Support Vector Regression

Posted: 2015-08-27 12:37:46


6-Support Vector Regression

For regression with squared error, we first discuss kernel ridge regression.

With the kernel trick in hand, can we find an analytic solution for kernel ridge regression?

Since we want the best β_n, we set the gradient of the regularized error to zero. Because K is positive semi-definite and λ > 0, the matrix (λI + K) is invertible, which yields the closed-form solution β = (λI + K)^{-1} y.

 

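To make the closed form concrete, here is a minimal NumPy sketch, assuming a Gaussian (RBF) kernel; the function names and the toy values of λ and γ are illustrative, not taken from the lecture.

```python
import numpy as np

def gaussian_kernel(X1, X2, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=0.1, gamma=1.0):
    # Analytic solution: beta = (lambda * I + K)^{-1} y
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(lam * np.eye(len(X)) + K, y)

def kernel_ridge_predict(X_train, beta, X_test, gamma=1.0):
    # g(x) = sum_n beta_n * K(x_n, x): every beta_n participates in every prediction
    return gaussian_kernel(X_test, X_train, gamma) @ beta

# Toy usage
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 4.0])
beta = kernel_ridge_fit(X, y, lam=0.1)
pred = kernel_ridge_predict(X, beta, X)
```

Note that beta is generally dense: every training point contributes to every prediction, which is exactly the cost discussed next.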

However, compared with the linear case, this formulation suffers when the number of data points is large: solving for β requires inverting an N×N matrix, and the resulting β_n is generally dense.

Compared with soft-margin Gaussian SVM, where most α_n are zero, kernel ridge regression pays for a dense β_n across all N examples.


A dense β means, in effect, many more support vectors, which slows down prediction; a sparse β_n is what we want.


Thus we add a tube around the regression target: with the familiar max function, points with small |s − y| (inside the tube) contribute no error. This gives the tube (ε-insensitive) loss err(y, s) = max(0, |s − y| − ε).

The max function is not differentiable everywhere, so we need some further manipulation as well.
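The tube loss itself is easy to state in code; a minimal sketch, with an illustrative ε:

```python
import numpy as np

def tube_loss(y, s, epsilon=0.5):
    # epsilon-insensitive (tube) error: zero inside the tube, linear outside
    return np.maximum(0.0, np.abs(s - y) - epsilon)

y = np.array([1.0, 2.0, 3.0])
s = np.array([1.2, 2.0, 4.0])   # predictions
print(tube_loss(y, s))           # prints [0.  0.  0.5]
```

The first two points sit inside the tube (|s − y| ≤ ε) and incur no error; only the third, which leaves the tube by 0.5, is penalized. This flat region is what ultimately makes β sparse.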


These manipulations change the appearance of the problem to resemble standard SVM, so that we can apply the tool of QP.

As in SVM, the bias is separated out as a constant: we write w^T z_n + b, where b plays the role of the former w_0.


We add slack variables to describe violations of the tube, using an upper slack ξ_n^∧ and a lower slack ξ_n^∨ to keep the constraints linear.
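Written out in the lecture's notation (C trades off regularization against tube violations, ε is the tube half-width), the SVR primal is:

```latex
\begin{aligned}
\min_{b,\,\mathbf{w},\,\xi^{\vee},\,\xi^{\wedge}} \quad
  & \tfrac{1}{2}\,\mathbf{w}^{T}\mathbf{w}
    + C \sum_{n=1}^{N} \bigl(\xi_{n}^{\vee} + \xi_{n}^{\wedge}\bigr) \\
\text{s.t.} \quad
  & -\epsilon - \xi_{n}^{\vee} \;\le\; y_{n} - \mathbf{w}^{T}\mathbf{z}_{n} - b \;\le\; \epsilon + \xi_{n}^{\wedge}, \\
  & \xi_{n}^{\vee} \ge 0, \quad \xi_{n}^{\wedge} \ge 0, \qquad n = 1,\dots,N .
\end{aligned}
```

A point inside the tube has both slacks zero; a point above the tube uses ξ_n^∧, and a point below uses ξ_n^∨, so the objective stays linear in the slacks and the whole problem is a QP.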


Our next task: SVR primal -> dual.

 


Original article: http://www.cnblogs.com/windniu/p/4762749.html
