Basic Uses of LIBSVM in MATLAB

System: Ubuntu with gcc installed

LIBSVM Installation

cd libsvm-3.2/matlab
mex -setup
make
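If the build succeeds, the compiled MEX interfaces should be present in the matlab directory; a quick sanity check (file extensions vary by platform, e.g. .mexa64 on 64-bit Linux):

```matlab
% List the compiled MEX files and confirm MATLAB resolves the interfaces
dir('*.mex*')
which svmtrain
which svmpredict
```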

A Quick Test

Load data

cd ../
load heart_scale

Result:

Error using load
Number of columns on line 3 of ASCII file heart_scale must be the same as previous lines.

Load data with libsvmread and train a model

The error occurs because heart_scale is stored in LIBSVM's sparse text format, which MATLAB's load cannot parse; libsvmread reads that format directly.

[heart_scale_label, heart_scale_inst] = libsvmread('heart_scale');
cd matlab;
model = svmtrain(heart_scale_label, heart_scale_inst)

Result:

*
optimization finished, #iter = 162
nu = 0.431029
obj = -100.877288, rho = 0.424462
nSV = 132, nBSV = 107
Total nSV = 132

model = 

struct with fields:

    Parameters: [5×1 double]
      nr_class: 2
       totalSV: 132
           rho: 0.4245
         Label: [2×1 double]
    sv_indices: [132×1 double]
         ProbA: []
         ProbB: []
           nSV: [2×1 double]
       sv_coef: [132×1 double]
           SVs: [132×13 double]

Predict labels and show the accuracy

[predict_label, accuracy, decision_values] = svmpredict(heart_scale_label, heart_scale_inst, model);

Result:

Accuracy = 86.6667% (234/270) (classification)
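As a sanity check (not in the original post), the reported accuracy can be recomputed directly from the predicted labels:

```matlab
% Fraction of predictions matching the true labels, as a percentage
acc = 100 * mean(predict_label == heart_scale_label);
fprintf('Accuracy = %.4f%%\n', acc);
```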

Using Training Parameters

Load and split data

cd ../
[heart_scale_label, heart_scale_inst] = libsvmread('heart_scale');
cd matlab
save heart_scale.mat;    % keep a MAT-file copy for later sessions
load heart_scale.mat;
% First 150 samples for training, remaining 120 for testing
train_data = heart_scale_inst(1:150,:);
train_label = heart_scale_label(1:150,:);
test_data = heart_scale_inst(151:270,:);
test_label = heart_scale_label(151:270,:);

Linear Kernel

model_linear = svmtrain(train_label, train_data, '-t 0');
[predict_label_L, accuracy_L, dec_values_L] = svmpredict(test_label, test_data, model_linear);

Result:

.....*...*
optimization finished, #iter = 1296
nu = 0.343180
obj = -48.403858, rho = -1.184737
nSV = 58, nBSV = 44
Total nSV = 58
Accuracy = 85% (102/120) (classification)

Test the -wi Parameter

The -wi option multiplies the penalty C for class i, which is useful for unbalanced data: for example, -w1 2 -w-1 0.5 makes errors on class 1 four times as costly as errors on class -1.

data = heart_scale_inst;
label = heart_scale_label;

Without the -wi parameter (ClassResult is a helper script that prints per-class accuracy; it is not part of stock LIBSVM):

model = svmtrain(label,data,'-c 1');
CR = ClassResult(label, data, model);

Result:

*
optimization finished, #iter = 162
nu = 0.431029
obj = -100.877288, rho = 0.424462
nSV = 132, nBSV = 107
Total nSV = 132
Accuracy = 86.6667% (234/270) (classification)
===some info of the data set===
#class is 2
Lable: 1 -1 
Support vectors number: 132, is 48.8889% of train dataset (132/270)
===Different Classification Accuracy===
Whole = 86.6667% (234/270)
The 1 classification accuracy = 80.8333% (97/120)
The -1 classification accuracy = 91.3333% (137/150)

With the -wi parameter:

modelwi = svmtrain(label,data,'-c 1 -w1 2 -w-1 0.5');
CR = ClassResult(label, data, modelwi);

Result:

*
optimization finished, #iter = 200
obj = -98.828297, rho = -0.581853
nSV = 156, nBSV = 132
Total nSV = 156
Accuracy = 80.7407% (218/270) (classification)
===some info of the data set===
#class is 2
Lable: 1 -1 
Support vectors number: 156, is 57.7778% of train dataset (156/270)
===Different Classification Accuracy===
Whole = 80.7407% (218/270)
The 1 classification accuracy = 95.8333% (115/120)
The -1 classification accuracy = 68.6667% (103/150)
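The effect of the weights can also be explored without ClassResult; a minimal sketch that sweeps the class-1 weight and reports per-class accuracy (the sweep values are illustrative, not from the original post):

```matlab
% Sweep the penalty weight for class 1 and report per-class accuracy.
% -q suppresses svmtrain's training output.
for w = [0.5 1 2 4]
    m = svmtrain(label, data, sprintf('-c 1 -w1 %g -q', w));
    p = svmpredict(label, data, m);
    acc_pos = 100 * mean(p(label ==  1) ==  1);   % accuracy on class  1
    acc_neg = 100 * mean(p(label == -1) == -1);   % accuracy on class -1
    fprintf('w1 = %g: class 1 = %.2f%%, class -1 = %.2f%%\n', w, acc_pos, acc_neg);
end
```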

Cross Validation

With -v n, svmtrain runs n-fold cross validation and returns a single number (the cross-validation accuracy for classification, or the mean squared error for regression) instead of a model struct.

Classification:

accuracy = svmtrain(heart_scale_label,heart_scale_inst,'-s 0 -v 5')

Result:

*
optimization finished, #iter = 125
nu = 0.461423
obj = -85.196874, rho = 0.455347
nSV = 112, nBSV = 87
Total nSV = 112
*
optimization finished, #iter = 151
nu = 0.409704
obj = -72.962828, rho = 0.195437
nSV = 106, nBSV = 77
Total nSV = 106
*
optimization finished, #iter = 132
nu = 0.422268
obj = -77.976883, rho = 0.253928
nSV = 103, nBSV = 77
Total nSV = 103
*
optimization finished, #iter = 118
nu = 0.441832
obj = -81.506660, rho = 0.486918
nSV = 107, nBSV = 86
Total nSV = 107
*
optimization finished, #iter = 132
nu = 0.478520
obj = -89.280327, rho = 0.145904
nSV = 113, nBSV = 90
Total nSV = 113
Cross Validation Accuracy = 82.963%

accuracy =

   82.9630
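A common use of the -v option is a coarse grid search over C and gamma; a sketch (the search ranges are assumptions, not from the original post):

```matlab
% Coarse log2 grid search over C and gamma using 5-fold cross validation
best_acc = 0;
for log2c = -1:4
    for log2g = -4:1
        opts = sprintf('-c %g -g %g -v 5 -q', 2^log2c, 2^log2g);
        acc = svmtrain(heart_scale_label, heart_scale_inst, opts);
        if acc > best_acc
            best_acc = acc; best_c = 2^log2c; best_g = 2^log2g;
        end
    end
end
fprintf('best C = %g, gamma = %g, CV accuracy = %g%%\n', best_c, best_g, best_acc);
```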

Regression:

mse = svmtrain(heart_scale_label,heart_scale_inst,'-s 3 -v 5')

Result:

*
optimization finished, #iter = 216
nu = 0.642478
obj = -79.888791, rho = 0.188775
nSV = 160, nBSV = 121
*
optimization finished, #iter = 231
nu = 0.619544
obj = -76.549482, rho = 0.189642
nSV = 151, nBSV = 117
*
optimization finished, #iter = 232
nu = 0.634874
obj = -79.619778, rho = 0.276149
nSV = 158, nBSV = 121
*
optimization finished, #iter = 256
nu = 0.640413
obj = -77.053667, rho = 0.107534
nSV = 163, nBSV = 121
*
optimization finished, #iter = 282
nu = 0.587022
obj = -70.675772, rho = 0.184189
nSV = 145, nBSV = 108
Cross Validation Mean squared error = 0.544248
Cross Validation Squared correlation coefficient = 0.46728

mse =

  0.5442

Simple Classification

data = ...
[176 70;
180 80;
161 45;
163 47]

label = [1;1;-1;-1]

model = svmtrain(label,data,'-s 0 -t 2 -c 1 -g 0.1');   % C-SVC, RBF kernel, C = 1, gamma = 0.1

testdata = [190 85;162 50]
testdatalabel = [1;1]

[predictlabel, accuracy, decision_values] = svmpredict(testdatalabel,testdata,model);
predictlabel

type = 1;
CR = ClassResult(label, data, model, type)


plabel = zeros(2,1);
for i = 1:2
x = testdata(i,:);
plabel(i,1) = DecisionFunction(x,model);
end
plabel
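DecisionFunction is not part of stock LIBSVM; one possible implementation, assuming the RBF kernel the model above was trained with, is:

```matlab
function f = DecisionFunction(x, model)
% Signed decision value f(x) = sum_i alpha_i * K(sv_i, x) - rho;
% the predicted class is model.Label(1) if f > 0, model.Label(2) otherwise.
gamma = model.Parameters(4);                  % -g value stored by svmtrain
SVs   = full(model.SVs);                      % support vectors, one per row
K     = exp(-gamma * sum(bsxfun(@minus, SVs, x).^2, 2));   % RBF kernel values
f     = model.sv_coef' * K - model.rho;
end
```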

Simple Regression

x = (-1:0.1:1)';
y = -x.^2;

model = svmtrain(y, x,'-s 3 -t 2 -c 2.2 -g 2.8 -p 0.01');   % epsilon-SVR, RBF kernel, C = 2.2, gamma = 2.8, epsilon = 0.01

[py,mse, dec1] = svmpredict(y,x,model);
figure;
plot(x,y,'o');
hold on;
plot(x,py,'r*');
legend('Original data','Regression data');
grid on;

testx = [1.1;1.2;1.3];
display('Real data')
testy = -testx.^2

[ptesty, tmse, dec2] = svmpredict(testy,testx,model);
display('Prediction data');
ptesty

save RegressModel.mat model
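The saved model can be reloaded in a later session and used for prediction; a sketch (the zero labels here are dummies, passed only so svmpredict can print an error figure):

```matlab
S = load('RegressModel.mat');                 % restores the variable 'model'
newx = (0:0.1:0.5)';
[pred, ~, ~] = svmpredict(zeros(size(newx)), newx, S.model);
```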