
【4】One-vs-all



%% Machine Learning Online Class - Exercise 3 | Part 1: One-vs-all

%% Initialization
clear ; close all; clc

%% Setup the parameters you will use for this part of the exercise
input_layer_size  = 400;  % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10   
                          % (note that we have mapped "0" to label 10)

%% =========== Part 1: Loading and Visualizing Data =============
% Load Training Data
fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat'); % training data stored in arrays X, y
m = size(X, 1);       % m = 5000 (X is a 5000x400 matrix)

% Randomly select 100 data points to display
rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);     % 100x400: 100 randomly selected examples

displayData(sel);
fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============ Part 2: Vectorize Logistic Regression ============
fprintf('\nTraining One-vs-All Logistic Regression...\n')

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Predict for One-Vs-All ================
% After ...
pred = predictOneVsAll(all_theta, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

 

function [all_theta] = oneVsAll(X, y, num_labels, lambda)
%ONEVSALL trains multiple logistic regression classifiers and returns all
%the classifiers in a matrix all_theta, where the i-th row of all_theta 
%corresponds to the classifier for label i
%   [all_theta] = ONEVSALL(X, y, num_labels, lambda) trains num_labels
%   logistic regression classifiers and returns each of these classifiers
%   in a matrix all_theta, where the i-th row of all_theta corresponds 
%   to the classifier for label i

% Some useful variables
m = size(X, 1);
n = size(X, 2);

% You need to return the following variables correctly 
all_theta = zeros(num_labels, n + 1);

% Add ones to the X data matrix
X = [ones(m, 1) X];

% ====================== YOUR CODE HERE ======================
% Instructions: You should complete the following code to train num_labels
%               logistic regression classifiers with regularization
%               parameter lambda. 
%
% Hint: theta(:) will return a column vector.
%
% Hint: You can use y == c to obtain a vector of 1s and 0s that tell you
%       whether the ground truth is true/false for this class.
%
% Note: For this assignment, we recommend using fmincg to optimize the cost
%       function. It is okay to use a for-loop (for c = 1:num_labels) to
%       loop over the different classes.
%
%       fmincg works similarly to fminunc, but is more efficient when we
%       are dealing with a large number of parameters.
%
% Example Code for fmincg:
%
%     % Set Initial theta
%     initial_theta = zeros(n + 1, 1);
%     
%     % Set options for fmincg
%     options = optimset('GradObj', 'on', 'MaxIter', 50);
% 
%     % Run fmincg to obtain the optimal theta
%     % This function will return theta and the cost 
%     [theta] = ...
%         fmincg (@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
%                 initial_theta, options);
%
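% --------- A possible solution sketch (not part of the original post) ---------
% Assuming lrCostFunction returns the regularized cost and gradient, train
% one binary classifier per label with fmincg and store its parameters as
% the c-th row of all_theta.
initial_theta = zeros(n + 1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);

for c = 1:num_labels
    % Binary targets for class c: 1 where y == c, 0 otherwise
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), ...
                     initial_theta, options);
    all_theta(c, :) = theta';   % learned parameters for label c
end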












% =========================================================================


end
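
The hint above relies on lrCostFunction, which the exercise keeps in a separate file and which is not reproduced in the post. A minimal vectorized sketch of the regularized cost and gradient it is expected to compute (the course-provided version may differ in details):

function [J, grad] = lrCostFunction(theta, X, y, lambda)
%LRCOSTFUNCTION Regularized logistic regression cost and gradient (vectorized).
m = length(y);                        % number of training examples
h = 1 ./ (1 + exp(-X * theta));       % sigmoid hypothesis
J = (1 / m) * (-y' * log(h) - (1 - y)' * log(1 - h)) ...
    + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % theta(1) is not regularized
grad = (1 / m) * (X' * (h - y));
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end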

 


Original post: http://www.cnblogs.com/nice-day/p/5450812.html
