
【DeepLearning】Exercise:Softmax Regression

Posted: 2015-01-08 21:23:36

Link to the exercise: Exercise:Softmax Regression

 
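For reference, the quantities computed in softmaxCost.m below are the softmax cost with weight decay and its gradient, in the UFLDL notation (m examples, k classes):

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \sum_{j=1}^{k}
  1\{y^{(i)} = j\}
  \log \frac{e^{\theta_j^\top x^{(i)}}}{\sum_{l=1}^{k} e^{\theta_l^\top x^{(i)}}}
  + \frac{\lambda}{2} \sum_{i=1}^{k} \sum_{j=1}^{n} \theta_{ij}^2

\nabla_{\theta_j} J(\theta) = -\frac{1}{m} \sum_{i=1}^{m}
  \left[ x^{(i)} \left( 1\{y^{(i)} = j\} - p(y^{(i)} = j \mid x^{(i)}; \theta) \right) \right]
  + \lambda \theta_j
```

The indicator term 1{y = j} is what the groundTruth matrix encodes, and the bracketed difference is the diff matrix computed in the code.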

softmaxCost.m

function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)

% numClasses - the number of classes 
% inputSize - the size N of the input vector
% lambda - weight decay parameter
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single example
% labels - an M x 1 matrix containing the labels corresponding for the input data
%

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);

numCases = size(data, 2);

groundTruth = full(sparse(labels, 1:numCases, 1));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute the cost and gradient for softmax regression.
%                You need to compute thetagrad and cost.
%                The groundTruth matrix might come in handy.


weightDecay = (1/2) * lambda * sum(sum(theta.*theta));

% M1(r, c) = theta(r,:) * data(:,c)
M1 = theta * data;
% subtract each column's max to prevent overflow in exp; softmax is invariant to this shift
M1 = bsxfun(@minus, M1, max(M1, [], 1));
M2 = exp(M1);
% normalize each column, so M2(:,c) holds the predicted class probabilities for example c
M2 = bsxfun(@rdivide, M2, sum(M2));
% the indicator 1{y(i)=j} in groundTruth keeps only the log-probability of each example's true class
M = groundTruth .* log(M2);

cost = -(1/numCases) * sum(sum(M)) + weightDecay;

% thetagrad
thetagrad = zeros(numClasses, inputSize);
% difference between the ground-truth indicators and the predicted probabilities
diff = groundTruth - M2;

% loop over classes; equivalently vectorized: thetagrad = -(1/numCases) * diff * data' + lambda * theta;
for i=1:numClasses
    % sum(..., 2) yields a column, so transpose it before adding the row lambda * theta(i,:)
    thetagrad(i,:) = -(1/numCases) * sum((data .* repmat(diff(i,:), inputSize, 1)), 2)' + lambda * theta(i,:);
end

% ------------------------------------------------------------------
% Unroll the gradient matrices into a vector for minFunc
grad = thetagrad(:);
end
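The max-subtraction trick used above is worth demonstrating on its own: without it, exp of a large score overflows. A minimal pure-Python sketch of the same idea (an illustrative translation, not part of the exercise code; the function name `softmax` is mine):

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores.

    Shifting every score by the max leaves the result unchanged,
    since exp(s - m) / sum(exp(s - m)) == exp(s) / sum(exp(s)),
    but keeps the arguments to exp small enough not to overflow.
    """
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# A naive math.exp(1000.0) raises OverflowError; the shifted version is fine.
probs = softmax([1000.0, 1000.0, 990.0])
print(probs)
```

The first two classes tie at probability just under 0.5 each, and the third is negligible, exactly what the unstable formula would have given had it not overflowed.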

 

softmaxPredict.m

function [pred] = softmaxPredict(softmaxModel, data)

% softmaxModel - model trained using softmaxTrain
% data - the N x M input matrix, where each column data(:, i) corresponds to
%        a single example
%
% Your code should produce the prediction matrix 
% pred, where pred(i) is argmax_c P(y(c) | x(i)).
 
% Unroll the parameters from theta
theta = softmaxModel.optTheta;  % this provides a numClasses x inputSize matrix
%pred = zeros(1, size(data, 2));

%% ---------- YOUR CODE HERE --------------------------------------
%  Instructions: Compute pred using theta assuming that the labels start 
%                from 1.

result = theta * data;
% the row index of the largest score in each column is the predicted class
[~, pred] = max(result, [], 1);
% ---------------------------------------------------------------------

end
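Prediction is just a column-wise argmax of the class scores theta * x: normalizing into probabilities is unnecessary because softmax is monotone in the scores. A pure-Python sketch of the same computation (illustrative only; the name `predict` and the toy numbers are mine, not from the exercise):

```python
def predict(theta, data):
    """theta: k x n list of lists, one row of weights per class.
    data:  n x m list of lists, one column per example.
    Returns 1-based predicted labels, matching MATLAB's indexing."""
    k = len(theta)       # number of classes
    n = len(theta[0])    # input dimension
    m = len(data[0])     # number of examples
    pred = []
    for c in range(m):
        # score of class j for example c: dot(theta[j], data[:, c])
        scores = [sum(theta[j][i] * data[i][c] for i in range(n)) for j in range(k)]
        best = max(range(k), key=lambda j: scores[j])
        pred.append(best + 1)  # labels start at 1, as in the MATLAB code
    return pred

# Two classes, two 2-D examples given as columns [1, 0] and [0, 1]
theta = [[2.0, 0.0], [0.0, 3.0]]
data = [[1.0, 0.0], [0.0, 1.0]]
print(predict(theta, data))  # -> [1, 2]
```

The first column scores (2, 0) and picks class 1; the second scores (0, 3) and picks class 2.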

 


Original post: http://www.cnblogs.com/ganganloveu/p/4211799.html
