Tags: machine learning, deep learning, UFLDL
function [f,g] = linear_regression(theta, X, y)
%
% Arguments:
%   theta - A vector containing the parameter values to optimize.
%   X - The examples stored in a matrix.
%       X(i,j) is the i'th coordinate of the j'th example.
%   y - The target value for each example. y(j) is the target for example j.
%
m = size(X,2);
n = size(X,1);

f = 0;
g = zeros(size(theta));

%
% TODO: Compute the linear regression objective by looping over the examples in X.
%       Store the objective function value in 'f'.
%
% TODO: Compute the gradient of the objective with respect to theta by looping over
%       the examples in X and adding up the gradient for each example. Store the
%       computed gradient in 'g'.
%
%%% YOUR CODE HERE %%%

% Step 1: Compute the cost function f
for i = 1:m
    f = f + (theta' * X(:,i) - y(i))^2;
end
f = 1/2 * f;

% Step 2: Compute the gradient g
for j = 1:n
    for i = 1:m
        g(j) = g(j) + X(j,i) * (theta' * X(:,i) - y(i));
    end
end
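The loop-based objective and gradient above can be cross-checked in NumPy; this is a sketch on toy data (the variable names and random data are assumptions, not part of the exercise), keeping the same features-by-examples layout for X:

```python
import numpy as np

# Toy data in the same layout as the MATLAB code:
# X[i, j] is the i'th coordinate of the j'th example (features x examples).
rng = np.random.default_rng(0)
n, m = 3, 5                       # n features, m examples
X = rng.standard_normal((n, m))
y = rng.standard_normal(m)
theta = rng.standard_normal(n)

# Loop-based objective: f = 1/2 * sum_i (theta' * x_i - y_i)^2
f = 0.0
for i in range(m):
    f += (theta @ X[:, i] - y[i]) ** 2
f *= 0.5

# Loop-based gradient: g_j = sum_i X(j,i) * (theta' * x_i - y_i)
g = np.zeros(n)
for j in range(n):
    for i in range(m):
        g[j] += X[j, i] * (theta @ X[:, i] - y[i])
```

The double loop over (j, i) mirrors the MATLAB version exactly; the next exercise replaces it with matrix operations.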
function [f,g] = linear_regression_vec(theta, X, y)
%
% Arguments:
%   theta - A vector containing the parameter values to optimize.
%   X - The examples stored in a matrix.
%       X(i,j) is the i'th coordinate of the j'th example.
%   y - The target value for each example. y(j) is the target for example j.
%
m = size(X,2);

% initialize objective value and gradient.
f = 0;
g = zeros(size(theta));

%
% TODO: Compute the linear regression objective function and gradient
%       using vectorized code. (It will be just a few lines of code!)
%       Store the objective function value in 'f', and the gradient in 'g'.
%
%%% YOUR CODE HERE %%%
f = 1/2 * sum((theta'*X - y).^2);
g = X * (theta'*X - y)';
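The vectorized one-liners can be verified against the explicit loops; a NumPy sketch (the toy data is an assumption) showing both give identical results:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 6
X = rng.standard_normal((n, m))
y = rng.standard_normal(m)
theta = rng.standard_normal(n)

# Vectorized objective and gradient, mirroring the MATLAB one-liners:
#   f = 1/2*sum((theta'*X - y).^2);   g = X*(theta'*X - y)';
residual = theta @ X - y           # shape (m,): one residual per example
f = 0.5 * np.sum(residual ** 2)
g = X @ residual                   # shape (n,): gradient w.r.t. theta

# Cross-check against the explicit loops from the previous exercise:
f_loop = 0.5 * sum((theta @ X[:, i] - y[i]) ** 2 for i in range(m))
g_loop = np.zeros(n)
for j in range(n):
    for i in range(m):
        g_loop[j] += X[j, i] * (theta @ X[:, i] - y[i])
```

The key identity is that summing X(j,i) * residual(i) over i is exactly row j of the matrix-vector product X * residual.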
function [f,g] = logistic_regression_vec(theta, X, y)
%
% Arguments:
%   theta - A column vector containing the parameter values to optimize.
%   X - The examples stored in a matrix.
%       X(i,j) is the i'th coordinate of the j'th example.
%   y - The label for each example. y(j) is the j'th example's label.
%
m = size(X,2);

% initialize objective value and gradient.
f = 0;
g = zeros(size(theta));

%
% TODO: Compute the logistic regression objective function and gradient
%       using vectorized code. (It will be just a few lines of code!)
%       Store the objective function value in 'f', and the gradient in 'g'.
%
%%% YOUR CODE HERE %%%
f = -sum(y.*log(sigmoid(theta'*X)) + (1-y).*log(1 - sigmoid(theta'*X)));
g = X * (sigmoid(theta'*X) - y)';
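The logistic regression gradient has the same X-times-residual shape as linear regression, just with sigmoid predictions in place of linear ones. A NumPy sketch (toy binary labels and the finite-difference check are assumptions) that validates the analytic gradient numerically:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n, m = 3, 8
X = rng.standard_normal((n, m))
y = rng.integers(0, 2, m).astype(float)   # binary labels in {0, 1}
theta = 0.1 * rng.standard_normal(n)

# Negative log-likelihood and its gradient, mirroring the MATLAB code:
h = sigmoid(theta @ X)                    # predictions, shape (m,)
f = -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))
g = X @ (h - y)

# Sanity check: numerical gradient via central finite differences.
eps = 1e-6
g_num = np.zeros(n)
for j in range(n):
    tp, tm = theta.copy(), theta.copy()
    tp[j] += eps
    tm[j] -= eps
    hp, hm = sigmoid(tp @ X), sigmoid(tm @ X)
    fp = -np.sum(y * np.log(hp) + (1 - y) * np.log(1 - hp))
    fm = -np.sum(y * np.log(hm) + (1 - y) * np.log(1 - hm))
    g_num[j] = (fp - fm) / (2 * eps)
```

The simple form of g comes from the derivative of the log-loss through the sigmoid collapsing to (h - y).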
Deep Learning UFLDL Latest Tutorial Study Notes 3: Vectorization
Original article: http://blog.csdn.net/songrotek/article/details/41287417