
A small lesson learned from Least Angle Regression (LARS) (very short)



 

[Please credit the source when reposting] http://www.cnblogs.com/mashiqi

 

Suppose we have a set of vectors $\left\{ x_i \right\}_{i=1}^n$ and a vector $u$ to be projected onto them. Suppose also that the inner product of $u$ with every $x_i$ is positive, i.e., the angle between $u$ and every $x_i$ is less than 90 degrees. Then, when we project $u$ onto $\left\{ x_i \right\}_{i=1}^n$, it seems only natural that every coefficient $\beta_i$ should likewise be nonnegative: $$u = x_1\beta_1+\cdots+x_n\beta_n,\quad\beta_i\geq0.$$ I don't know how good your spatial intuition is, but that is what I naively believed at first. Only after reading Efron's "Least Angle Regression" did I realize it is not so; I was too young. Sometimes the coefficients turn out negative. A small MATLAB script is attached below; run it and see for yourself!
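(As a reminder of the algebra involved, not spelled out in the original post: writing $X = [x_1, \cdots, x_n]$ with linearly independent columns, the projection coefficients are $$\beta = (X^\top X)^{-1}X^\top u,$$ and nothing in this formula forces the entries of $\beta$ to be nonnegative, even when every entry of $X^\top u$ is positive.)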

%{
This small MATLAB program shows an unexpected result in high-dimensional
geometry: for any set of n-dimensional vectors {x_1,...,x_n}, if these
vectors are linearly independent, then you can always find an equiangular
vector u such that the inner product (x_i,u) = 1 for all i. BUT if we
project u onto {x_i}, some of the coefficients may be negative!
%}

%%
clear;
close all;
%% You can change the values of the following two variables.
dimension = 3;
vectors = 3;

%% To avoid unexpected problems, don't change the following code.
if vectors > dimension
    error('Please set vectors <= dimension.');
end
X = randn(dimension,vectors);                 % every column is a vector
X = X./repmat(sqrt(sum(X.^2,1)),dimension,1); % normalize each column
if rank(X) ~= size(X,2)
    error('These vectors are not independent. Run again.');
end
w = (X'*X)\ones(vectors,1); % coefficients solving X'*(X*w) = ones
u = X*w;
w    % the coefficients (some may be negative!)
u    % the equiangular vector
X'*u % the inner products (all equal to 1)
if dimension == 3 % we can only draw in up to 3 dimensions
    quiver3(0,0,0,X(1,1),X(2,1),X(3,1),'r'), hold on;
    quiver3(0,0,0,X(1,2),X(2,2),X(3,2),'r'), hold on;
    quiver3(0,0,0,X(1,3),X(2,3),X(3,3),'r'), hold on;
    quiver3(0,0,0,u(1),u(2),u(3),'b'), hold off;
end
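To make this concrete, here is a small deterministic example of my own (not from Efron's paper, and not part of the original script): all three inner products $(x_i,u)$ are positive, yet two of the three coefficients come out negative.

% A fixed example: positive inner products, but negative coefficients.
x1 = [1; 0; 0];
x2 = [0; 1; 0];
x3 = [1; 1; 0.1] / sqrt(2.01); % unit vector leaning slightly out of the x1-x2 plane
u  = [1; 1; 1];
X  = [x1, x2, x3];
X' * u       % all three inner products are positive (about [1; 1; 1.48])
beta = X \ u % coefficients are about [-9; -9; 14.18]: two are negative

The intuition: $x_3$ lies almost in the span of $x_1$ and $x_2$, so reaching the third component of $u$ requires a huge coefficient on $x_3$, which then has to be cancelled by negative coefficients on $x_1$ and $x_2$.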

 

To sum up in one sentence: high-dimensional space is dangerous; don't linger there. = =||

PS: they said a total of 30 words to me today.


Original post: http://www.cnblogs.com/mashiqi/p/4226692.html
