SVM Classifier Demo
Several ways to use the SVM classifier
Contents

Load dataset
Basic Usage
Instances Weights
Passing arguments and probability estimates
Displaying The Classifier
Save And Load Classifier To/From A File
Using kfold with SVM
Load dataset
% Name of the MATLAB dataset
% Available names : [cancer, wine, iris, crab, glass, simpleclass, thyroid]
datasetName = 'iris';

% This is just an example of loading MATLAB's built-in datasets.
% You can load datasets from different sources in different ways,
% as long as you provide x, the training instances, a matrix of
% size (Number of Instances, Number of Features), and y, the label
% matrix, of size (Number of Instances, 1).
load(strcat(datasetName, '_dataset'));
eval(sprintf('x = %sInputs;', datasetName));
eval(sprintf('y = %sTargets;', datasetName));
x = x';
y = y';
numClasses = size(y, 2);
[~, y] = max(y, [], 2);
numFeatures = size(x, 2);
numInstances = size(x, 1);

% display dataset info
disp(['Dataset Name ' datasetName]);
disp(['Number of Classes ' num2str(numClasses)]);
disp(['Number of Instances ' num2str(numInstances)]);
disp(['Number of Features ' num2str(numFeatures)]);
Dataset Name iris
Number of Classes 3
Number of Instances 150
Number of Features 4
Basic Usage
fprintf('===========\n');
fprintf('Basic Usage\n');
fprintf('===========\n');

% create svm classifier
svmcl = SVMClassifier(numClasses);

% train svm classifier
disp('Training Classifier');
[svmcl, learnErr] = learn(svmcl, x, y);
fprintf('Learning Error %f\n', learnErr);

disp('Testing Classifier');
outs = computeOutputs(svmcl, x);
% another way to calculate the error, on the learning dataset or on any other dataset
err = sum(outs ~= y) / numInstances;
fprintf('Learning Error %f\n', err);

% compare predicted and target outputs for the first 5 instances
disp('[predicted outputs : correct outputs]');
disp([outs(1:5, :) y(1:5, :)]);
===========
Basic Usage
===========
Training Classifier
Learning Error 0.026667
Testing Classifier
Learning Error 0.026667
[predicted outputs : correct outputs]
     1     1
     1     1
     1     1
     1     1
     1     1
Instances Weights
fprintf('=================\n');
fprintf('Instances Weights\n');
fprintf('=================\n');

svmcl = SVMClassifier(numClasses);
wts = ones(numInstances, 1) / numInstances;
% train svm with the given instance weights
svmcl = learn(svmcl, x, y, wts);
outs = computeOutputs(svmcl, x);
err = sum(outs ~= y) / numInstances;
fprintf('Error %f\n', err);
=================
Instances Weights
=================
Error 0.026667
Passing arguments and probability estimates
fprintf('====================================\n');
fprintf('Passing arguments and prob estimates\n');
fprintf('====================================\n');

% -b 1 is the argument used to get probability estimates (the probability
% that each instance belongs to the predicted class)
% for a complete list of arguments, see http://www.csie.ntu.edu.tw/~cjlin/libsvm/
svmcl = SVMClassifier(numClasses, '-c 10 -g 1 -b 1', '-b 1');
svmcl = learn(svmcl, x, y);
[outs, prob] = computeOutputs(svmcl, x);
disp('[predicted output : correct output : class probabilities]');
disp([outs(1:5, :) y(1:5, :) prob(1:5, :)]);
====================================
Passing arguments and prob estimates
====================================
[predicted output : correct output : class probabilities]
    1.0000    1.0000    0.9802    0.0100    0.0098
    1.0000    1.0000    0.9746    0.0127    0.0127
    1.0000    1.0000    0.9798    0.0102    0.0100
    1.0000    1.0000    0.9767    0.0117    0.0116
    1.0000    1.0000    0.9814    0.0094    0.0092
Displaying The Classifier
fprintf('=========================\n');
fprintf('Displaying The Classifier\n');
fprintf('=========================\n');

svmcl = SVMClassifier(numClasses);
disp('Display before training');
display(svmcl);
disp('-----------------------');
svmcl = learn(svmcl, x, y);
disp('Display after training');
display(svmcl);
=========================
Displaying The Classifier
=========================
Display before training
SVM Classifier
classifier is not trained
Libsvm training options: -q
Libsvm predict options: -q
-----------------------
Display after training
SVM Classifier
classifier is trained
    Parameters: [5x1 double]
      nr_class: 3
       totalSV: 30
           rho: [3x1 double]
         Label: [3x1 double]
    sv_indices: [30x1 double]
         ProbA: []
         ProbB: []
           nSV: [3x1 double]
       sv_coef: [30x2 double]
           SVs: [30x4 double]
Libsvm training options: -q
Libsvm predict options: -q
Save And Load Classifier To/From A File
fprintf('=======================================\n');
fprintf('Save And Load Classifier To/From A File\n');
fprintf('=======================================\n');

svmcl = SVMClassifier(numClasses);
svmcl = learn(svmcl, x, y, wts);
% save classifier to file
saveToFile(svmcl, 'test.bin');
% load classifier from file
svmcl2 = loadFromFile(SVMClassifier, 'test.bin');
outs1 = computeOutputs(svmcl, x);
err1 = sum(outs1 ~= y) / numInstances;
outs2 = computeOutputs(svmcl2, x);
err2 = sum(outs2 ~= y) / numInstances;
fprintf('Error Before Save %f, Error After Save %f\n', err1, err2);
=======================================
Save And Load Classifier To/From A File
=======================================
Error Before Save 0.026667, Error After Save 0.026667
Using kfold with SVM
fprintf('=========================\n');
fprintf('Using Kfold with SVM\n');
fprintf('=========================\n');

svmcl = SVMClassifier(numClasses);
cp = kfold(x, y, 10, svmcl);
fprintf('Accuracy of 10 fold-cross validation %f\n', cp.CorrectRate * 100);
=========================
Using Kfold with SVM
=========================
FOLD 1 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
FOLD 2 Learn Size 135 Test Size 15
FOLD RESULT = 14 / 15 = 93.333333
FOLD 3 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
FOLD 4 Learn Size 135 Test Size 15
FOLD RESULT = 14 / 15 = 93.333333
FOLD 5 Learn Size 135 Test Size 15
FOLD RESULT = 14 / 15 = 93.333333
FOLD 6 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
FOLD 7 Learn Size 135 Test Size 15
FOLD RESULT = 14 / 15 = 93.333333
FOLD 8 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
FOLD 9 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
FOLD 10 Learn Size 135 Test Size 15
FOLD RESULT = 15 / 15 = 100.000000
Accuracy of 10 fold-cross validation 97.333333