
Gradient Boosting

Posted: 2017-11-06 21:24:12


References:

1. GBDT (MART): an introductory tutorial on iterative decision trees

2. Wikipedia: Gradient boosting

 


Generic Gradient Boosting:

Input: a training set $\{(x_{i}, y_{i})\}_{i=1}^{n}$, a differentiable loss function $L(y, F(x))$, and the number of iterations $M$.

Algorithm:

1. Initialize the model with a constant value:

\[F_{0}(x)=\arg\min_{\gamma} \sum_{i=1}^{n} L(y_{i}, \gamma)\]
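For squared-error loss $L(y,\gamma)=\tfrac{1}{2}(y-\gamma)^{2}$, for instance, setting the derivative with respect to $\gamma$ to zero shows that the minimizing constant is simply the sample mean:

\[
F_{0}(x)=\arg\min_{\gamma}\sum_{i=1}^{n}\tfrac{1}{2}(y_{i}-\gamma)^{2}=\frac{1}{n}\sum_{i=1}^{n}y_{i}
\]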

2. For $m=1$ to $M$:

  1) Compute the pseudo-residuals:

\[
r_{im}=-\left[\frac{\partial L(y_{i}, F(x_{i}))}{\partial F(x_{i})}\right]_{F(x)=F_{m-1}(x)}, \quad i=1, \ldots, n
\]

  2) Fit a base learner $h_{m}(x)$ to the pseudo-residuals, i.e. train it on the set $\{(x_{i}, r_{im})\}_{i=1}^{n}$.

  3) Compute the multiplier $\gamma_{m}$ by solving the one-dimensional optimization problem

\[
\gamma_{m}=\arg\min_{\gamma} \sum_{i=1}^{n} L(y_{i}, F_{m-1}(x_{i})+\gamma h_{m}(x_{i}))
\]

  4) Update the model:

\[
F_{m}(x)=F_{m-1}(x)+\gamma_{m} h_{m}(x)
\]

3. Output $F_{M}(x)$.
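As a concrete illustration, the procedure above can be sketched in plain NumPy for squared-error loss, using one-dimensional decision stumps as the base learner. All function names below are illustrative, not from any library, and a fixed learning rate stands in for the line search over $\gamma$:

```python
import numpy as np

def fit_stump(x, r):
    """Find the best single-threshold regression stump on a 1-D feature.

    Returns (threshold, left_value, right_value) minimizing squared error.
    """
    best_err, best = np.inf, (x[0], r.mean(), r.mean())
    for s in np.unique(x)[:-1]:  # last value would leave the right side empty
        left, right = r[x <= s], r[x > s]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err, best = err, (s, left.mean(), right.mean())
    return best

def stump_predict(x, stump):
    s, left_value, right_value = stump
    return np.where(x <= s, left_value, right_value)

def gradient_boost(x, y, M=100, lr=0.1):
    """Gradient boosting for L(y, F) = (y - F)^2 / 2 with stump base learners."""
    f0 = y.mean()                 # initialize with the constant minimizing squared loss
    pred = np.full_like(y, f0, dtype=float)
    stumps = []
    for _ in range(M):
        r = y - pred              # pseudo-residuals: for squared loss, simply y - F(x)
        stump = fit_stump(x, r)   # fit the base learner to the pseudo-residuals
        pred += lr * stump_predict(x, stump)  # update the model with shrinkage lr
        stumps.append(stump)
    return f0, stumps

def gb_predict(x, f0, stumps, lr=0.1):
    pred = np.full(x.shape, f0, dtype=float)
    for stump in stumps:
        pred += lr * stump_predict(x, stump)
    return pred
```

Because the fixed learning rate replaces the optimal $\gamma_{m}$, each round only partially corrects the residuals; this matches the common practical variant where shrinkage is used to regularize the ensemble.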

\[ \sum_{k=1}^n k = \frac{1}{2} n (n+1).\]

\[ \frac{\partial u}{\partial t}
= h^2 \left( \frac{\partial^2 u}{\partial x^2}
+ \frac{\partial^2 u}{\partial y^2}
+ \frac{\partial^2 u}{\partial z^2}\right)\]

Newton's second law is F=ma.

Newton's second law is $F=ma$.

Newton's second law is
$$F=ma$$

Newton's second law is
\[F=ma\]

Greek Letters $\eta$ and $\mu$

Fraction $\frac{a}{b}$

Power $a^b$

Subscript $a_b$

Partial derivative $\frac{\partial y}{\partial t}$

Vector $\vec{n}$

Bold $\mathbf{n}$

Time derivative $\dot{F}$

Matrix (lcr here sets left, center, or right alignment for each column)
\[
\left[
\begin{array}{lcr}
a1 & b22 & c333 \\
d444 & e555555 & f6
\end{array}
\right]
\]

Aligned equations (here & marks the alignment point in each row)
\begin{align}
a+b&=c\\
d&=e+f+g
\end{align}

\[
\left\{
\begin{aligned}
&a+b=c\\
&d=e+f+g
\end{aligned}
\right.
\]



Original post: http://www.cnblogs.com/RRRRecord/p/7794777.html
