System: Ubuntu with gcc installed
LIBSVM Installation
cd libsvm-3.2/matlab
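The rest of the installation step is not shown; a minimal sketch of the usual build step, run inside MATLAB from the libsvm matlab directory (assumes a configured MEX compiler, here gcc):
make   % builds the libsvmread, libsvmwrite, svmtrain and svmpredict MEX files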
A Quick Test
Load data
cd ../
load heart_scale     % try MATLAB's built-in load on the data file
Result:
Error using load
Number of columns on line 3 of ASCII file heart_scale must be the same as previous lines.
The heart_scale file is in LIBSVM's sparse text format, which MATLAB's built-in load cannot parse, so read it with libsvmread instead. Load the data again and train a model:
[heart_scale_label, heart_scale_inst] = libsvmread('heart_scale');
model = svmtrain(heart_scale_label, heart_scale_inst)
Result:
*
optimization finished, #iter = 162
nu = 0.431029
obj = -100.877288, rho = 0.424462
nSV = 132, nBSV = 107
Total nSV = 132
model =
struct with fields:
Parameters: [5×1 double]
nr_class: 2
totalSV: 132
rho: 0.4245
Label: [2×1 double]
sv_indices: [132×1 double]
ProbA: []
ProbB: []
nSV: [2×1 double]
sv_coef: [132×1 double]
SVs: [132×13 double]
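For reference, a few of the fields above that are used most often (descriptions follow the LIBSVM MATLAB README):
model.Label      % the class labels, here [1; -1]
model.nSV        % number of support vectors in each class
model.sv_coef    % coefficients of the support vectors in the decision function
model.SVs        % the support vectors themselves
model.rho        % -b of the decision function w'*x + b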
Predict on the same data and show the accuracy
[predict_label, accuracy, decision_values] = svmpredict(heart_scale_label, heart_scale_inst, model);
Result:
Accuracy = 86.6667% (234/270) (classification)
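Note that the second output, accuracy, is a 3×1 vector: classification accuracy, mean squared error, and squared correlation coefficient (the last two matter for regression). A small usage sketch:
acc = accuracy(1);       % classification accuracy in percent
dec = decision_values;   % one decision value per instance for a two-class model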
Using Parameters
Load and split data
cd ../
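The rest of the loading/splitting code is missing; one possible split is sketched below. The actual split is not shown in the post; this one is hypothetical and chosen only so that the test set has 120 samples, matching the result further down.
[heart_scale_label, heart_scale_inst] = libsvmread('heart_scale');
train_label = heart_scale_label(1:150);     % hypothetical split: first 150 samples for training
train_data  = heart_scale_inst(1:150, :);
test_label  = heart_scale_label(151:270);   % remaining 120 samples for testing
test_data   = heart_scale_inst(151:270, :);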
Linear Kernel
model_linear = svmtrain(train_label, train_data, '-t 0');
[predict_label, accuracy] = svmpredict(test_label, test_data, model_linear);   % test-set variable names assumed from the split above
Result:
.....*...*
optimization finished, #iter = 1296
nu = 0.343180
obj = -48.403858, rho = -1.184737
nSV = 58, nBSV = 44
Total nSV = 58
Accuracy = 85% (102/120) (classification)
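For comparison, -t selects the kernel: 0 linear, 1 polynomial, 2 RBF (the default), 3 sigmoid. A sketch of the same experiment with the default RBF kernel, using the variable names assumed above:
model_rbf = svmtrain(train_label, train_data, '-t 2 -c 1');
[~, accuracy_rbf] = svmpredict(test_label, test_data, model_rbf);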
Testing the -wi Parameter
data  = heart_scale_inst;
label = heart_scale_label;
Without the -wi parameter:
model = svmtrain(label, data, '-c 1');
Result:
*
optimization finished, #iter = 162
nu = 0.431029
obj = -100.877288, rho = 0.424462
nSV = 132, nBSV = 107
Total nSV = 132
Accuracy = 86.6667% (234/270) (classification)
===some info of the data set===
#class is 2
Lable: 1 -1
Support vectors number: 132, is 48.8889% of train dataset (132/270)
===Different Classification Accuracy===
Whole = 86.6667% (234/270)
The 1 classification accuracy = 80.8333% (97/120)
The -1 classification accuracy = 91.3333% (137/150)
With the -wi parameter (which sets the penalty C of class i to weight×C; here class 1 gets C×2 and class -1 gets C×0.5):
modelwi = svmtrain(label, data, '-c 1 -w1 2 -w-1 0.5');
Result:
*
optimization finished, #iter = 200
obj = -98.828297, rho = -0.581853
nSV = 156, nBSV = 132
Total nSV = 156
Accuracy = 80.7407% (218/270) (classification)
===some info of the data set===
#class is 2
Lable: 1 -1
Support vectors number: 156, is 57.7778% of train dataset (156/270)
===Different Classification Accuracy===
Whole = 80.7407% (218/270)
The 1 classification accuracy = 95.8333% (115/120)
The -1 classification accuracy = 68.6667% (103/150)
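The per-class accuracies printed above come from the author's own summary script, which is not shown; a minimal sketch of how they can be computed, assuming predict_label holds the predictions of modelwi on the same data:
[predict_label, accuracy] = svmpredict(label, data, modelwi);
for c = [1 -1]
    idx = (label == c);
    hit = sum(predict_label(idx) == label(idx));
    fprintf('The %d classification accuracy = %g%% (%d/%d)\n', c, 100*hit/sum(idx), hit, sum(idx));
end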
Cross Validation
Classification:
accuracy = svmtrain(heart_scale_label, heart_scale_inst, '-s 0 -v 5')
Result:
*
optimization finished, #iter = 125
nu = 0.461423
obj = -85.196874, rho = 0.455347
nSV = 112, nBSV = 87
Total nSV = 112
*
optimization finished, #iter = 151
nu = 0.409704
obj = -72.962828, rho = 0.195437
nSV = 106, nBSV = 77
Total nSV = 106
*
optimization finished, #iter = 132
nu = 0.422268
obj = -77.976883, rho = 0.253928
nSV = 103, nBSV = 77
Total nSV = 103
*
optimization finished, #iter = 118
nu = 0.441832
obj = -81.506660, rho = 0.486918
nSV = 107, nBSV = 86
Total nSV = 107
*
optimization finished, #iter = 132
nu = 0.478520
obj = -89.280327, rho = 0.145904
nSV = 113, nBSV = 90
Total nSV = 113
Cross Validation Accuracy = 82.963%
accuracy =
82.9630
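With -v, svmtrain only returns the cross-validation accuracy instead of a model, so it is typically used inside a small parameter search. A sketch of a coarse grid search over c and g (-q is LIBSVM's quiet flag, suppressing the per-fold solver output):
best_acc = 0;
for log2c = -1:3
    for log2g = -4:0
        opts = sprintf('-c %g -g %g -v 5 -q', 2^log2c, 2^log2g);
        acc  = svmtrain(heart_scale_label, heart_scale_inst, opts);
        if acc > best_acc
            best_acc = acc; best_c = 2^log2c; best_g = 2^log2g;
        end
    end
end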
Regression:
mse = svmtrain(heart_scale_label, heart_scale_inst, '-s 3 -v 5')
Result:
*
optimization finished, #iter = 216
nu = 0.642478
obj = -79.888791, rho = 0.188775
nSV = 160, nBSV = 121
*
optimization finished, #iter = 231
nu = 0.619544
obj = -76.549482, rho = 0.189642
nSV = 151, nBSV = 117
*
optimization finished, #iter = 232
nu = 0.634874
obj = -79.619778, rho = 0.276149
nSV = 158, nBSV = 121
*
optimization finished, #iter = 256
nu = 0.640413
obj = -77.053667, rho = 0.107534
nSV = 163, nBSV = 121
*
optimization finished, #iter = 282
nu = 0.587022
obj = -70.675772, rho = 0.184189
nSV = 145, nBSV = 108
Cross Validation Mean squared error = 0.544248
Cross Validation Squared correlation coefficient = 0.46728
mse =
0.5442
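-s selects the SVM type: 0 C-SVC, 1 nu-SVC, 2 one-class SVM, 3 epsilon-SVR, 4 nu-SVR; with -v, the regression types return the cross-validation mean squared error rather than an accuracy. For example, the nu-SVR variant can be cross-validated the same way:
mse_nu = svmtrain(heart_scale_label, heart_scale_inst, '-s 4 -v 5');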
Simple Classification
data = ...
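The original example is cut off after the first line; a minimal self-contained sketch with made-up 2-D data (the actual data used in the post is unknown):
data  = [randn(50,2) - 1; randn(50,2) + 1];   % hypothetical two Gaussian clusters
label = [-ones(50,1); ones(50,1)];
model = svmtrain(label, sparse(data), '-t 0 -c 1');
[predict_label, accuracy] = svmpredict(label, sparse(data), model);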
Simple Regression
x = (-1:0.1:1)';
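The regression example is also truncated; a minimal sketch continuing from the x above, assuming a simple quadratic target and epsilon-SVR with an RBF kernel:
y = x.^2 + 0.05*randn(size(x));               % hypothetical noisy target
model_r = svmtrain(y, sparse(x), '-s 3 -t 2 -c 1 -p 0.01');
[y_pred, mse_r] = svmpredict(y, sparse(x), model_r);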