AdaBoost Classifier Demo
Several ways to use the AdaBoost algorithm classifier
Contents
Load dataset
% Name of matlab dataset
% Available names : [cancer, wine, iris, crab, glass, simpleclass, thyroid]
datasetName = 'iris';
% This is just an example of loading matlab datasets;
% you can load datasets from different sources in different ways,
% as long as you provide x, the training instance matrix of
% size (Number of Instances, Number of Features), and y, the
% label matrix of size (Number of Instances, 1)
load(strcat(datasetName, '_dataset'));
eval(sprintf('x = %sInputs;', datasetName));
eval(sprintf('y = %sTargets;', datasetName));
x = x';
y = y';
numClasses = size(y, 2);
[~, y] = max(y, [], 2);
numFeatures = size(x, 2);
numInstances = size(x, 1);
% display dataset info
disp(['Dataset Name ' datasetName]);
disp(['Number of Classes ' num2str(numClasses)]);
disp(['Number of Instances ' num2str(numInstances)]);
disp(['Number of Features ' num2str(numFeatures)]);
Dataset Name iris
Number of Classes 3
Number of Instances 150
Number of Features 4
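To measure generalization rather than training error, a holdout split can be carved out of x and y before training. This is a sketch using only standard MATLAB functions (randperm, rng); the 20% split ratio and the seed are arbitrary choices, not part of the demo:

```matlab
% hold out 20% of the instances for testing (sketch; fix the seed with
% rng so the random split is reproducible between runs)
rng(1);
idx = randperm(numInstances);
nTest = round(0.2 * numInstances);
testIdx  = idx(1:nTest);
trainIdx = idx(nTest+1:end);
xTrain = x(trainIdx, :); yTrain = y(trainIdx, :);
xTest  = x(testIdx, :);  yTest  = y(testIdx, :);
```

The classifiers below can then be trained on xTrain/yTrain and evaluated on xTest/yTest instead of the full dataset.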
Basic Usage
fprintf('===========\n');
fprintf('Basic Usage\n');
fprintf('===========\n');
% create an AdaBoost classifier using SVM as the weak classifier
adacl = AdaBooster(SVMClassifier(numClasses));
% train the AdaBoost classifier
fprintf('\tTraining Classifier for 3 iterations\n');
[adacl, learnErr] = learn(adacl, x, y, 3);
fprintf('\tLearning Error %f\n', learnErr);
fprintf('\t-----\n');
fprintf('\tTesting Classifier\n');
outs = computeOutputs(adacl, x);
% another way to calculate the error, either on the training set or on another dataset
err = sum(outs ~= y) / numInstances;
fprintf('\tLearning Error %f\n', err);
% compare the first 5 predicted outputs against the target outputs;
% fprintf consumes its arguments column-wise, so pair them up explicitly
fprintf('\t[predicted outputs : correct outputs]\n');
fprintf('\t\t%d\t\t%d\n', [outs(1:5), y(1:5)]');
===========
Basic Usage
===========
	Training Classifier for 3 iterations
	Learning Error 0.020000
	-----
	Testing Classifier
	Learning Error 0.020000
	[predicted outputs : correct outputs]
		1		1
		1		1
		1		1
		1		1
		1		1
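The scalar error rate hides which classes get confused with which. A per-class confusion matrix can be built from the outs and y computed above using only the standard MATLAB function accumarray:

```matlab
% rows = true class, columns = predicted class;
% accumarray counts how often each (true, predicted) pair occurs
confMat = accumarray([y, outs], 1, [numClasses, numClasses]);
disp(confMat);
% per-class accuracy: correct predictions sit on the diagonal
classAcc = diag(confMat) ./ sum(confMat, 2);
disp(classAcc);
```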
Stage Details
fprintf('=============\n');
fprintf('Stage Details\n');
fprintf('=============\n');
adacl = AdaBooster(SVMClassifier(numClasses));
% passing true as the fifth argument enables verbose per-stage output
[adacl, learnErr] = learn(adacl, x, y, 3, true);
fprintf('\tLearning Error %f\n', learnErr);
=============
Stage Details
=============
AdaBoost
========
==================
Boosting Stage # 1
==================
Weak Classifier has been trained, err = 0.026667
==================
Boosting Stage # 2
==================
Weak Classifier has been trained, err = 0.031963
==================
Boosting Stage # 3
==================
Weak Classifier has been trained, err = 0.172170
AdaBoost Training is Done with err = 0.020000
	Learning Error 0.020000
Display Classifier
fprintf('==================\n');
fprintf('Display Classifier\n');
fprintf('==================\n');
fprintf('\tDisplay Classifier Before Learning\n\t===>\n');
adacl = AdaBooster(SVMClassifier(numClasses));
display(adacl);
fprintf('\t<===\n');
fprintf('\tDisplay Classifier After Learning\n\t===>\n');
adacl = learn(adacl, x, y, 2);
display(adacl);
fprintf('\t<===\n');
==================
Display Classifier
==================
	Display Classifier Before Learning
	===>
AdaBooster Classifier
classifier is not trained
	<===
	Display Classifier After Learning
	===>
AdaBooster Classifier
classifier is trained, the learnt parameters are:
No target detection rate was specified
Threshold = 0.000000
Number of stages = 2
* Stage # 1
------------
+ Weight = 4.290459
SVM Classifier
classifier is trained
    Parameters: [5x1 double]
      nr_class: 3
       totalSV: 30
           rho: [3x1 double]
         Label: [3x1 double]
    sv_indices: [30x1 double]
         ProbA: []
         ProbB: []
           nSV: [3x1 double]
       sv_coef: [30x2 double]
           SVs: [30x4 double]
Libsvm training options: -q
Libsvm predict options: -q
* Stage # 2
------------
+ Weight = 4.103823
SVM Classifier
classifier is trained
    Parameters: [5x1 double]
      nr_class: 3
       totalSV: 44
           rho: [3x1 double]
         Label: [3x1 double]
    sv_indices: [44x1 double]
         ProbA: []
         ProbB: []
           nSV: [3x1 double]
       sv_coef: [44x2 double]
           SVs: [44x4 double]
Libsvm training options: -q
Libsvm predict options: -q
	<===
Iterations Errors
fprintf('================\n');
fprintf('Iterations Error\n');
fprintf('================\n');
adacl = AdaBooster(SVMClassifier(numClasses));
fprintf('\tTraining Classifier for 10 iterations\n');
[adacl, learnErr, iterationsErrors] = learn(adacl, x, y, 10);
fprintf('\tLearning Error %f\n', learnErr);
fprintf('\t------------\n');
fprintf('\tPlotting Iteration Errors\n');
plot(1:length(iterationsErrors), iterationsErrors);
ylim([0 1]);
================
Iterations Error
================
	Training Classifier for 10 iterations
	Learning Error 0.020000
	------------
	Plotting Iteration Errors
[Figure: boosted classifier's training error per iteration]
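Besides plotting, the per-iteration error vector returned by learn can be inspected directly, for example to find the iteration at which the training error bottomed out (min is a standard MATLAB function):

```matlab
% locate the boosting iteration with the lowest training error
[bestErr, bestIter] = min(iterationsErrors);
fprintf('\tLowest error %f reached at iteration %d\n', bestErr, bestIter);
```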
Add More Boosting Stages
fprintf('========================\n');
fprintf('Add More Boosting Stages\n');
fprintf('========================\n');
adacl = AdaBooster(SVMClassifier(numClasses));
fprintf('\tTraining Classifier for 5 iterations\n');
adacl = learn(adacl, x, y, 5);
fprintf('\t------------\n');
fprintf('\tTraining Classifier for 5 more iterations\n');
% the iteration count is cumulative: asking a 5-stage booster for 10
% iterations adds 5 more stages on top of the existing ones
[adacl, learnErr] = learn(adacl, x, y, 10);
fprintf('\tLearning Error %f\n', learnErr);
========================
Add More Boosting Stages
========================
	Training Classifier for 5 iterations
	------------
	Training Classifier for 5 more iterations
	Learning Error 0.020000
Using Different Weak Classifier
fprintf('===============================\n');
fprintf('Using Different Weak Classifier\n');
fprintf('===============================\n');
% use a decision tree instead of an SVM as the weak classifier
adacl = AdaBooster(DecisionTreeClassifier(numClasses));
% train the AdaBoost classifier
fprintf('\tTraining Classifier for 3 iterations\n');
[adacl, learnErr] = learn(adacl, x, y, 3);
fprintf('\tLearning Error %f\n', learnErr);
===============================
Using Different Weak Classifier
===============================
	Training Classifier for 3 iterations
	Learning Error 0.000000
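The two weak learners shown so far can also be compared side by side in a loop. This is a sketch that assumes both classifier constructors and learn behave exactly as in the calls above; the name labels are only for display:

```matlab
% train one booster per weak-learner type and report its training error
weakLearners = {SVMClassifier(numClasses), DecisionTreeClassifier(numClasses)};
names = {'SVM', 'DecisionTree'};
for i = 1:numel(weakLearners)
    [~, e] = learn(AdaBooster(weakLearners{i}), x, y, 3);
    fprintf('\t%s weak learner: training error %f\n', names{i}, e);
end
```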
Add Boost Stages till reaching a required Err Bound
fprintf('===================================================\n');
fprintf('Add Boost Stages till reaching a required Err Bound\n');
fprintf('===================================================\n');
adacl = AdaBooster(SVMClassifier(numClasses));
% with Inf iterations and a target error bound of 0.02 (last argument),
% boosting stages keep being added until the error drops to the bound
[adacl, learnErr] = learn(adacl, x, y, Inf, true, '', '', NaN, 0.02);
fprintf('\tLearning Error %f\n', learnErr);
===================================================
Add Boost Stages till reaching a required Err Bound
===================================================
AdaBoost
========
==================
Boosting Stage # 1
==================
Weak Classifier has been trained, err = 0.026667
Currently, boosted classifier's error = 0.026667
==================
Boosting Stage # 2
==================
Weak Classifier has been trained, err = 0.031963
Currently, boosted classifier's error = 0.026667
==================
Boosting Stage # 3
==================
Weak Classifier has been trained, err = 0.172170
Currently, boosted classifier's error = 0.020000
AdaBoost Training is Done with err = 0.020000
	Learning Error 0.020000