
Introduction to LDA (Linear Discriminant Analysis)

Date: 2014-10-27

Tags: image processing


Linear Discriminant Analysis


Practicing my English writing a bit : ) Just reading without ever writing doesn't feel like enough~


First of all, let's try to solve a problem and use it to lead into LDA :)


Here is the question: there are two different sets of points in this picture. What mathematical evidence would you use to separate them into two classes?



[figure: scatter plot of the two sets of points]


After the classification, try feeding in some new points and classify them using the mathematical evidence you have found.


--------------------------------------------------------------------------------------------------------------------------------------


Method One:  the distance between projected means

Just compute the mean of each class's points. Then compare the distance from a new input point to each class mean: if the point is closer to the mean of the RED class, we label it red, and vice versa.


The green point in this picture is the mean location of the blue class.

The yellow point in this picture is the mean location of the red class.


A point belongs to the class whose mean it is closest to.

[figure: the class means; green = mean of the blue class, yellow = mean of the red class]


However, the distance between projected means is not a good measure, since it does not account for the standard deviation within each class.
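A minimal Python/NumPy sketch of this nearest-mean rule (my own re-implementation, not from the original post; the two point sets are the exercise data used later in this post, where Class_1 is plotted red and Class_2 blue):

```python
import numpy as np

# Exercise data from later in this post: Class_1 is plotted red, Class_2 blue.
class_1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
class_2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

mean_1 = class_1.mean(axis=0)  # (3.0, 3.6)
mean_2 = class_2.mean(axis=0)  # (8.4, 7.6)

def nearest_mean_class(point):
    """Label a point by the closer class mean: 1 (red) or 2 (blue)."""
    d1 = np.linalg.norm(point - mean_1)
    d2 = np.linalg.norm(point - mean_2)
    return 1 if d1 < d2 else 2

print(nearest_mean_class(np.array([3.0, 3.0])))  # -> 1 (closer to the red mean)
```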



Method Two: Fisher's linear discriminant


[figure: projecting the two classes onto a line]

This is a fantastic discriminant method  \(^o^)/~


What is our objective if we want to discriminate between data from different classes?


Fisher suggested maximizing the difference between the means, normalized by a measure of the within-class scatter.


Attention! Here y is a vector, not an ordinary one-dimensional variable!

[figure: definition of the projection y]

So one of our goals is to make the within-class scatter as small as possible.


On the other hand, we should also consider the relationship between the two classes of data.


[figure: the two classes and their projected means]

If we can find a weight vector w such that multiplying its transpose w^T by the vector x maps x to the scalar y = w^T x, we will have finished half of our work.


The function J(w) is very helpful: it describes the objective of our discriminant. The larger J(w) is, the better our discriminant.


[figure: the Fisher criterion J(w) = |m_1 - m_2|^2 / (s_1^2 + s_2^2) in the projected space]


Within-class scatter


The matrix S_i describes the amount of scatter within class i.

[figure: S_i = sum over x in class i of (x - m_i)(x - m_i)^T]


Below is a description of the within-class scatter in terms of the projected scalar y.


To get the matrix S_w, we simply sum all of the matrices S_i.

[figure: S_w = S_1 + S_2]
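As a sanity check, here is a short NumPy sketch (Python, not part of the original post) that builds S_1, S_2 and S_w for the exercise data used later in this post; the result matches the S_w printed in the exercise.

```python
import numpy as np

class_1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
class_2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

def scatter_matrix(cls):
    """S_i = sum over x in class i of (x - m_i)(x - m_i)^T."""
    dev = cls - cls.mean(axis=0)
    return dev.T @ dev

S_1 = scatter_matrix(class_1)
S_2 = scatter_matrix(class_2)
S_w = S_1 + S_2  # within-class scatter
```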

Between-class scatter

This matrix describes the amount of scatter between the different classes.


[figure: S_B = (m_1 - m_2)(m_1 - m_2)^T]
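The between-class scatter is just an outer product of the mean difference with itself. A small NumPy sketch (mine, using the class means computed in the exercise later in this post):

```python
import numpy as np

mean_1 = np.array([3.0, 3.6])  # mean of Class_1 in the exercise below
mean_2 = np.array([8.4, 7.6])  # mean of Class_2

diff = mean_1 - mean_2
S_b = np.outer(diff, diff)  # (m_1 - m_2)(m_1 - m_2)^T: symmetric, rank 1
```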

Everything is becoming clearer and clearer...


[figure: J(w) = (w^T S_B w) / (w^T S_w w)]


At this point, we should recall how differentiation works on matrices.

[figure: matrix differentiation identities]

After this, you will be able to understand the proof below.

Attention! S_B is a symmetric matrix: S_B^T == S_B.

[figure: derivation of the optimal direction w* = S_w^(-1) (m_1 - m_2)]
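The proof ends with the well-known closed form: w* is proportional to S_w^(-1) (m_1 - m_2). Here is a NumPy check of mine (with the S_w and means from the exercise below) confirming that the top eigenvector of inv(S_w) @ S_b points in that same direction:

```python
import numpy as np

S_w = np.array([[13.2, -2.2], [-2.2, 26.4]])  # from the exercise below
mean_1 = np.array([3.0, 3.6])
mean_2 = np.array([8.4, 7.6])

diff = mean_1 - mean_2
S_b = np.outer(diff, diff)

# Top eigenvector of inv(S_w) @ S_b ...
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_w) @ S_b)
w_eig = eigvecs[:, np.argmax(eigvals.real)]

# ... is parallel to the closed-form optimum w* = inv(S_w) @ (m_1 - m_2)
w_closed = np.linalg.solve(S_w, diff)
```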


Let's do an exercise :)

Go back and consider the data points in the picture below.


[figure: the exercise data points]


First of all, we should set up the data.

Class_1 = [4,1;
           2,4;
           2,3;
           3,6;
           4,4];

Class_2 = [9,10;
           6,8;
           9,5;
           8,7;
           10,8];

And then compute the mean location of each class's data.


mean_Class_1 = [mean(Class_1(:,1)),mean(Class_1(:,2))];
mean_Class_2 = [mean(Class_2(:,1)),mean(Class_2(:,2))];

mean_Class_1 =

    3.0000    3.6000

mean_Class_2 =

    8.4000    7.6000


And then compute S_1 and S_2. From them we get S_w and S_b:

S_w = S_1 + S_2;

S_b = (mean_Class_1 - mean_Class_2)' * (mean_Class_1 - mean_Class_2);

S_w =
   13.2000   -2.2000
   -2.2000   26.4000

S_b =
   29.1600   21.6000
   21.6000   16.0000


Look! The red line is what we want! Just project every point onto the red line in the picture.


[figure: the data, the discriminant line (red), and the projected points]


(y - y_original)/(x - x_original) = -1/slope;

new_location_x = ((1/slope)*x_original + y_original)/(slope + 1/slope);

new_location_y = slope*new_location_x;
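These two formulas can be checked directly. A tiny Python sketch (the helper name is my own) that drops a perpendicular from a point onto the line y = slope * x and verifies it:

```python
def project_onto_line(x0, y0, slope):
    """Foot of the perpendicular from (x0, y0) onto the line y = slope * x."""
    x = ((1.0 / slope) * x0 + y0) / (slope + 1.0 / slope)
    return x, slope * x

fx, fy = project_onto_line(4.0, 1.0, 0.5)
# The foot lies on the line, and (point - foot) is perpendicular to (1, slope).
```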



Now I will give my MATLAB code, which draws that picture.

clear all
close all
clc

Class_1 = [ 4,1;
            2,4;
            2,3;
            3,6;
            4,4];
        
Class_2 = [9,10;
    6,8;
    9,5;
    8,7;
    10,8];

figure(1);
hold on;
scatter(Class_1(:,1),Class_1(:,2),'fill','r');
scatter(Class_2(:,1),Class_2(:,2),'fill','b');


mean_Class_1 = [mean(Class_1(:,1)),mean(Class_1(:,2))];
mean_Class_2 = [mean(Class_2(:,1)),mean(Class_2(:,2))];

scatter(mean_Class_1(1,1),mean_Class_1(1,2),'fill','y');
scatter(mean_Class_2(1,1),mean_Class_2(1,2),'fill','g');

value = Class_1;

S_1 = [0 0;0 0];

for temp = 1:size(Class_1,1)
    value(temp,1) = Class_1(temp,1) - mean_Class_1(1,1);
    value(temp,2) = Class_1(temp,2) - mean_Class_1(1,2);
    
    S_1 = S_1 + value(temp,:)'*value(temp,:);
end

S_2 = [0 0;0 0];

for temp = 1:size(Class_2,1)
    value(temp,1) = Class_2(temp,1) - mean_Class_2(1,1);
    value(temp,2) = Class_2(temp,2) - mean_Class_2(1,2);
    
    S_2 = S_2 + value(temp,:)'*value(temp,:);
end

S_w = S_1 + S_2;

S_b = (mean_Class_1 - mean_Class_2)' * (mean_Class_1 - mean_Class_2);

Temp_matrix = inv(S_w)*S_b;

%% compute the eigenvalue diagonal matrix with eig()
[V,D] = eig(Temp_matrix);

% keep the largest eigenvalue of inv(S_w)*S_b
eig_value = max(D(:));

% solve (Temp_matrix - eig_value*I)*w = 0: from its first row, the
% eigenvector direction has slope w2/w1 = -T(1,1)/T(1,2)
Temp_matrix(1,1) = Temp_matrix(1,1) - eig_value;
Temp_matrix(2,2) = Temp_matrix(2,2) - eig_value;

slope = -Temp_matrix(1,1)./Temp_matrix(1,2);

x = [1:16];
y = slope*x;

plot(x,y);

Projection_Class_1(:,1) = ...
(Class_1(:,1).*(1/slope) + Class_1(:,2))./(slope + (1/slope));

Projection_Class_1(:,2) = Projection_Class_1(:,1).*slope;

scatter(Projection_Class_1(:,1),Projection_Class_1(:,2),'r');

Projection_Class_2(:,1) = ...
(Class_2(:,1).*(1/slope) + Class_2(:,2))./(slope + (1/slope));

Projection_Class_2(:,2) = Projection_Class_2(:,1).*slope;

scatter(Projection_Class_2(:,1),Projection_Class_2(:,2),'b');
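For readers without MATLAB, here is my NumPy re-implementation sketch of the script above (plotting omitted). Instead of the explicit eigendecomposition it uses the equivalent closed form w = S_w^(-1) (m_1 - m_2), which gives the same line direction:

```python
import numpy as np

class_1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
class_2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

m1, m2 = class_1.mean(axis=0), class_2.mean(axis=0)

def scatter_matrix(cls):
    dev = cls - cls.mean(axis=0)
    return dev.T @ dev

S_w = scatter_matrix(class_1) + scatter_matrix(class_2)

# Fisher direction: w is proportional to inv(S_w) @ (m1 - m2)
w = np.linalg.solve(S_w, m1 - m2)
slope = w[1] / w[0]  # slope of the projection line y = slope * x

def project(points):
    """Perpendicular projection of each point onto y = slope * x."""
    x = ((1.0 / slope) * points[:, 0] + points[:, 1]) / (slope + 1.0 / slope)
    return np.column_stack([x, slope * x])

proj_1, proj_2 = project(class_1), project(class_2)
```

Note that on this data the two projected classes do not overlap at all along the line, which is exactly what Fisher's criterion was maximizing.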






Original article: http://blog.csdn.net/cinmyheart/article/details/40476845
