
Libsvm Custom Kernel Functions [repost]



1. When using the libsvm toolbox, you can select one of its built-in kernel functions with the -t option; the main types are listed below (a short call sketch follows the list):

-t kernel_type : set type of kernel function (default 2)

  • 0 -- linear: u'*v
  • 1 -- polynomial: (gamma*u'*v + coef0)^degree
  • 2 -- radial basis function: exp(-gamma*|u-v|^2)
  • 3 -- sigmoid: tanh(gamma*u'*v + coef0)
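
In the MATLAB interface these options are passed to svmtrain as a single string. A minimal sketch of training and testing with the built-in RBF kernel on the heart_scale data (the -g and -c values are only illustrative, not tuned):

load heart_scale.mat;
% -t 2 selects the built-in RBF kernel; gamma (-g) and C (-c) are illustrative values
model_rbf = svmtrain(heart_scale_label, heart_scale_inst, '-t 2 -g 0.07 -c 1');
[pred_rbf, acc_rbf, dec_rbf] = svmpredict(heart_scale_label, heart_scale_inst, model_rbf);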

2. Sometimes we want to use our own kernel function; this is done with the -t 4 option:

-t kernel_type : set type of kernel function (default 2)
4 -- precomputed kernel (kernel values in training_instance_matrix)

When you use -t 4, once a kernel function has been chosen you must supply the kernel matrix yourself. For the theory of kernel functions and how to construct them, please refer to the relevant textbooks; it is not covered in depth here.

For example, take the linear kernel K(x, x') = x * x'. Suppose the training set train_data has 150 samples and the test set test_data has 120 samples.
Then the kernel matrix of the training set is ktrain1 = train_data * train_data',
and the kernel matrix of the test set is ktest1 = test_data * train_data'.
To use -t 4 you also have to put the sample serial numbers in front of the kernel matrix as its first column, forming a new matrix; then build the SVM with svmtrain and predict with svmpredict. The form differs slightly from that of the other -t options, as shown below:

ktrain1 = train_data * train_data';
Ktrain1 = [(1:150)', ktrain1];

model_precomputed1 = svmtrain(train_label, Ktrain1, '-t 4');  % note the input Ktrain1 here

ktest1 = test_data * train_data';
Ktest1 = [(1:120)', ktest1];

[predict_label_P1, accuracy_P1, dec_values_P1] = svmpredict(test_label, Ktest1, model_precomputed1); % note the input Ktest1 here
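
The same pattern can be wrapped in a small helper that builds both precomputed matrices from an arbitrary kernel. The helper below (build_precomputed and its kfun handle) is a hypothetical sketch for illustration, not part of libsvm; it would be saved as build_precomputed.m:

function [Ktrain, Ktest] = build_precomputed(kfun, train_data, test_data)
% build_precomputed.m -- hypothetical helper, not part of libsvm.
% kfun maps two instance matrices to a kernel matrix,
% e.g. kfun = @(A, B) A * B' for the linear kernel.
n = size(train_data, 1);
m = size(test_data, 1);
Ktrain = [(1:n)', kfun(train_data, train_data)];   % n x (n+1), first column = sample index
Ktest  = [(1:m)', kfun(test_data,  train_data)];   % m x (n+1), first column = sample index
end

With such a helper, the linear example above reduces to [Ktrain1, Ktest1] = build_precomputed(@(A, B) A * B', train_data, test_data), followed by the same svmtrain and svmpredict calls.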

Below is a small end-to-end example:

%% Use_precomputed_kernelForLibsvm_example
% faruto
% last modified on 2011.04.20
%%
tic;
clear;
clc;
close all;
format compact;
%%
load heart_scale.mat;
% Split Data
train_data = heart_scale_inst(1:150,:);
train_label = heart_scale_label(1:150,:);
test_data = heart_scale_inst(151:270,:);
test_label = heart_scale_label(151:270,:);
 
%% Linear Kernel
model_linear = svmtrain(train_label, train_data, '-t 0');
[predict_label_L, accuracy_L, dec_values_L] = svmpredict(test_label, test_data, model_linear);
 
%% Precomputed Kernel One
% Kernel used: K(x, x') = x * x'
% Kernel matrix
ktrain1 = train_data * train_data';
Ktrain1 = [(1:150)', ktrain1];
model_precomputed1 = svmtrain(train_label, Ktrain1, '-t 4');
ktest1 = test_data * train_data';
Ktest1 = [(1:120)', ktest1];
[predict_label_P1, accuracy_P1, dec_values_P1] = svmpredict(test_label, Ktest1, model_precomputed1);
 
%% Precomputed Kernel Two
% Kernel used: K(x, x') = ||x|| * ||x'||
% Kernel matrix
ktrain2 = ones(150,150);
for i = 1:150
    for j = 1:150
        ktrain2(i,j) = sum(train_data(i,:).^2)^0.5 * sum(train_data(j,:).^2)^0.5;
    end
end
Ktrain2 = [(1:150)', ktrain2];
model_precomputed2 = svmtrain(train_label, Ktrain2, '-t 4');
 
ktest2 = ones(120,150);
for i = 1:120
    for j = 1:150
        ktest2(i,j) = sum(test_data(i,:).^2)^0.5 * sum(train_data(j,:).^2)^0.5;
    end
end
Ktest2 = [(1:120)', ktest2];
[predict_label_P2, accuracy_P2, dec_values_P2] = svmpredict(test_label, Ktest2, model_precomputed2);
%% Precomputed Kernel Three
% Kernel used: K(x, x') = (x * x') / (||x|| * ||x'||)
% Kernel matrix
ktrain3 = ones(150,150);
for i = 1:150
    for j = 1:150
        ktrain3(i,j) = ...
            train_data(i,:)*train_data(j,:)'/(sum(train_data(i,:).^2)^0.5 * sum(train_data(j,:).^2)^0.5);
    end
end
Ktrain3 = [(1:150)', ktrain3];
model_precomputed3 = svmtrain(train_label, Ktrain3, '-t 4');
 
ktest3 = ones(120,150);
for i = 1:120
    for j = 1:150
        ktest3(i,j) = ...
            test_data(i,:)*train_data(j,:)'/(sum(test_data(i,:).^2)^0.5 * sum(train_data(j,:).^2)^0.5);
    end
end
Ktest3 = [(1:120)', ktest3];
[predict_label_P3, accuracy_P3, dec_values_P3] = svmpredict(test_label, Ktest3, model_precomputed3);
 
 
%% Display the accuracy
accuracyL = accuracy_L(1) % Display the accuracy using linear kernel
accuracyP1 = accuracy_P1(1) % Display the accuracy using precomputed kernel One
accuracyP2 = accuracy_P2(1) % Display the accuracy using precomputed kernel Two
accuracyP3 = accuracy_P3(1) % Display the accuracy using precomputed kernel Three
%%
toc;

Output:

Accuracy = 85% (102/120) (classification)
Accuracy = 85% (102/120) (classification)
Accuracy = 67.5% (81/120) (classification)
Accuracy = 84.1667% (101/120) (classification)
accuracyL =
 85
accuracyP1 =
 85
accuracyP2 =
 67.5000
accuracyP3 =
 84.1667
Elapsed time is 1.424549 seconds.
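
As a sanity check, a precomputed kernel should reproduce the corresponding built-in kernel. A minimal sketch for the RBF kernel, using the same train_data/test_data split as above (the gamma value is arbitrary; the two accuracies should agree up to floating-point round-off):

gamma = 0.07;                                   % illustrative value
sq_tr = sum(train_data.^2, 2);                  % 150 x 1 squared norms
sq_te = sum(test_data.^2, 2);                   % 120 x 1 squared norms
% squared Euclidean distances between all pairs of samples
D2train = bsxfun(@plus, sq_tr, sq_tr') - 2 * (train_data * train_data');
D2test  = bsxfun(@plus, sq_te, sq_tr') - 2 * (test_data  * train_data');
Ktrain_rbf = [(1:150)', exp(-gamma * D2train)];
Ktest_rbf  = [(1:120)', exp(-gamma * D2test)];
model_rbf_pre = svmtrain(train_label, Ktrain_rbf, '-t 4');
model_rbf_in  = svmtrain(train_label, train_data, sprintf('-t 2 -g %g', gamma));
[pl_pre, acc_pre, dv_pre] = svmpredict(test_label, Ktest_rbf, model_rbf_pre);
[pl_in,  acc_in,  dv_in ] = svmpredict(test_label, test_data, model_rbf_in);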

 


Original post: http://www.cnblogs.com/tec-vegetables/p/4506959.html
