
【DeepLearning】Exercise:PCA in 2D


Exercise:PCA in 2D

Link to the exercise: Exercise:PCA in 2D

 

pca_2d.m

close all

%%================================================================
%% Step 0: Load data
%  We have provided the code to load data from pcaData.txt into x.
%  x is a 2 * 45 matrix, where the kth column x(:,k) corresponds to
%  the kth data point. You do not need to change the code below.

x = load('pcaData.txt', '-ascii');
figure(1);
scatter(x(1, :), x(2, :));
title('Raw data');


%%================================================================
%% Step 1a: Implement PCA to obtain U
%  Implement PCA to obtain the rotation matrix U, which is the
%  eigenbasis of sigma, the covariance matrix of x.

% -------------------- YOUR CODE HERE -------------------- 
%u = zeros(size(x, 1));  %You need to compute this
sigma = (x * x') ./ size(x, 2);    % covariance matrix
[u,s,v] = svd(sigma);
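% Optional sanity check (my addition, not part of the original exercise):
% the columns of u form an orthonormal basis, so u'*u should equal the
% identity up to floating-point error.
assert(norm(u' * u - eye(size(u, 1))) < 1e-8);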

% -------------------------------------------------------- 
hold on
plot([0 u(1,1)], [0 u(2,1)]);
plot([0 u(1,2)], [0 u(2,2)]);
scatter(x(1, :), x(2, :));
hold off

%%================================================================
%% Step 1b: Compute xRot, the projection on to the eigenbasis
%  Now, compute xRot by projecting the data on to the basis defined
%  by U. Visualize the points by performing a scatter plot.

% -------------------- YOUR CODE HERE -------------------- 
%xRot = zeros(size(x)); % You need to compute this
xRot = u' * x;
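% Optional check (my addition): in the rotated basis the covariance
% becomes diagonal, with the variances given by diag(s).
covRot = (xRot * xRot') ./ size(xRot, 2);  % should be approx diag(diag(s))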

% -------------------------------------------------------- 

% Visualise the rotated data with a scatter plot. The points should now
% spread along the coordinate axes rather than along a diagonal.
figure(2);
scatter(xRot(1, :), xRot(2, :));
title('xRot');

%%================================================================
%% Step 2: Reduce the number of dimensions from 2 to 1.
%  Compute xRot again (this time projecting to 1 dimension).
%  Then, compute xHat by projecting xRot back onto the original axes
%  to see the effect of dimensionality reduction.

% -------------------- YOUR CODE HERE -------------------- 
k = 1; % Use k = 1 and project the data onto the first eigenbasis
%xHat = zeros(size(x)); % You need to compute this
% Recover an approximation of the data: zero out all but the first k
% components of xRot, then map back to the original axes with u.
xRot(k+1:size(x,1), :) = 0;
xHat = u*xRot;
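% The same reconstruction can be written in one line (a sketch, not from
% the original code): xHat = u(:, 1:k) * (u(:, 1:k)' * x);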


% -------------------------------------------------------- 
figure(3);
scatter(xHat(1, :), xHat(2, :));
title('xHat');


%%================================================================
%% Step 3: PCA Whitening
%  Compute xPCAWhite and plot the results.

epsilon = 1e-5;
% -------------------- YOUR CODE HERE -------------------- 
%xPCAWhite = zeros(size(x)); % You need to compute this
xPCAWhite = diag(1 ./ sqrt(diag(s) + epsilon)) * u' * x;
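% Optional check (my addition): after PCA whitening each component has
% approximately unit variance, so the covariance is close to eye(2).
covPCAWhite = (xPCAWhite * xPCAWhite') ./ size(xPCAWhite, 2);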



% -------------------------------------------------------- 
figure(4);
scatter(xPCAWhite(1, :), xPCAWhite(2, :));
title('xPCAWhite');

%%================================================================
%% Step 4: ZCA Whitening
%  Compute xZCAWhite and plot the results.

% -------------------- YOUR CODE HERE -------------------- 
%xZCAWhite = zeros(size(x)); % You need to compute this
xZCAWhite = u * xPCAWhite;
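% ZCA whitening rotates the PCA-whitened data back with u, which keeps
% the result close to the original x while preserving the (approximately)
% identity covariance. Optional check (my addition):
covZCAWhite = (xZCAWhite * xZCAWhite') ./ size(xZCAWhite, 2);  % approx eye(2)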

% -------------------------------------------------------- 
figure(5);
scatter(xZCAWhite(1, :), xZCAWhite(2, :));
title('xZCAWhite');

%% Congratulations! When you have reached this point, you are done!
%  You can now move onto the next PCA exercise. :)

 



Original post: http://www.cnblogs.com/ganganloveu/p/4202337.html
