RealAda Classifier Demo
Several ways to use the RealAda boosting algorithm classifier
Contents
Load dataset
% Name of matlab dataset
% Available names : [cancer, wine, iris, crab, glass, simpleclass, thyroid]
datasetName = 'iris';

% This is just an example to load matlab datasets.
% You can load datasets from other sources in any way you like,
% as long as you provide x, the training instances, which is a matrix
% of size (Number of Instances, Number of Features), and y, the
% label matrix of size (Number of Instances, 1).
load(strcat(datasetName, '_dataset'));
eval(sprintf('x = %sInputs;', datasetName));
eval(sprintf('y = %sTargets;', datasetName));
x = x';
y = y';
numClasses = size(y, 2);
[~, y] = max(y, [], 2);
numFeatures = size(x, 2);
numInstances = size(x, 1);

% display dataset info
disp(['Dataset Name ' datasetName]);
disp(['Number of Classes ' num2str(numClasses)]);
disp(['Number of Instances ' num2str(numInstances)]);
disp(['Number of Features ' num2str(numFeatures)]);
Dataset Name iris
Number of Classes 3
Number of Instances 150
Number of Features 4
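As the comments above note, the built-in MATLAB datasets are just one option: any source works as long as it yields `x` of size (Number of Instances, Number of Features) and `y` of size (Number of Instances, 1). A minimal sketch for a numeric CSV file (the filename and column layout here are hypothetical):

```
% Hypothetical CSV where the last column holds integer class labels
data = csvread('mydata.csv');   % numeric matrix, one row per instance
x = data(:, 1:end-1);           % instances-by-features matrix
y = data(:, end);               % instances-by-1 label vector
numClasses = numel(unique(y));
```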
Basic Usage
fprintf('===========\n');
fprintf('Basic Usage\n');
fprintf('===========\n');
% Create a RealAdaBoost classifier using SVM as the weak classifier.
% NOTE : the weak classifier must support probability estimates;
% SVMClassifier enables probability estimates by passing the -b 1 argument.
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));

% train the RealAda classifier
fprintf('\tTraining Classifier for 3 iterations\n');
[radacl, learnErr] = learn(radacl, x, y, 3);
fprintf('\tLearning Error %f\n', learnErr);
fprintf('\t-----\n');

fprintf('\tTesting Classifier\n');
outs = computeOutputs(radacl, x);
% another way to calculate the error, on the training set or any other dataset
err = sum(outs ~= y) / numInstances;
fprintf('\tLearning Error %f\n', err);

% compare the first 5 predicted outputs against the target outputs;
% transposing the concatenated matrix makes fprintf print them in pairs
fprintf('\t[predicted outputs : correct outputs]\n');
fprintf('\t\t%d\t\t%d\t\n', [outs(1:5), y(1:5)]');
===========
Basic Usage
===========
	Training Classifier for 3 iterations
	Learning Error 0.020000
	-----
	Testing Classifier
	Learning Error 0.020000
	[predicted outputs : correct outputs]
		1		1
		1		1
		1		1
		1		1
		1		1
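Since `computeOutputs` can score any dataset, the same error computation works on held-out data, which gives a more honest estimate than the training error reported above. A sketch using a random split (the 70/30 ratio is an arbitrary choice, and no stratification is done):

```
% Random 70/30 train/test split (illustrative only)
idx = randperm(numInstances);
nTrain = round(0.7 * numInstances);
trainIdx = idx(1:nTrain);
testIdx = idx(nTrain+1:end);

radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
radacl = learn(radacl, x(trainIdx, :), y(trainIdx), 3);
testOuts = computeOutputs(radacl, x(testIdx, :));
testErr = sum(testOuts ~= y(testIdx)) / numel(testIdx);
fprintf('Test Error %f\n', testErr);
```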
Stage Details
fprintf('=============\n');
fprintf('Stage Details\n');
fprintf('=============\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
% pass true as the fifth argument to print per-stage details during training
[radacl, learnErr] = learn(radacl, x, y, 3, true);
fprintf('\tLearning Error %f\n', learnErr);
=============
Stage Details
=============
RealAdaBoost
============
RealAda boost Classifier wasnt trained before
==========================
RealAda Boosting Stage # 1
==========================
Weak Classifier has been trained, err = 0.026667
==========================
RealAda Boosting Stage # 2
==========================
Weak Classifier has been trained, err = 0.158547
==========================
RealAda Boosting Stage # 3
==========================
Weak Classifier has been trained, err = 0.201575
RealAdaBoost Training is Done with err = 0.020000
	Learning Error 0.020000
Display Classifier
fprintf('==================\n');
fprintf('Display Classifier\n');
fprintf('==================\n');
fprintf('\tDisplay Classifier Before Learning\n\t===>\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
display(radacl);
fprintf('\t<===\n');

fprintf('\tDisplay Classifier After Learning\n===>\n');
radacl = learn(radacl, x, y, 2);
display(radacl);
fprintf('\t<===\n');
==================
Display Classifier
==================
	Display Classifier Before Learning
	===>
Real AdaBooster Classifier
classifier is not trained
	<===
	Display Classifier After Learning
===>
Real AdaBooster Classifier
classifier is trained
Number of stages = 2
* Stage # 1
------------
SVM Classifier
classifier is trained
    Parameters: [5x1 double]
      nr_class: 3
       totalSV: 30
           rho: [3x1 double]
         Label: [3x1 double]
    sv_indices: [30x1 double]
         ProbA: [3x1 double]
         ProbB: [3x1 double]
           nSV: [3x1 double]
       sv_coef: [30x2 double]
           SVs: [30x4 double]
Libsvm training options: -b 1 -q
Libsvm predict options: -b 1 -q
* Stage # 2
------------
SVM Classifier
classifier is trained
    Parameters: [5x1 double]
      nr_class: 3
       totalSV: 40
           rho: [3x1 double]
         Label: [3x1 double]
    sv_indices: [40x1 double]
         ProbA: [3x1 double]
         ProbB: [3x1 double]
           nSV: [3x1 double]
       sv_coef: [40x2 double]
           SVs: [40x4 double]
Libsvm training options: -b 1 -q
Libsvm predict options: -b 1 -q
	<===
Iterations Errors
fprintf('================\n');
fprintf('Iterations Error\n');
fprintf('================\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
fprintf('\tTraining Classifier for 10 iterations\n');
[radacl, learnErr, iterationsErrors] = learn(radacl, x, y, 10);
fprintf('\tLearning Error %f\n', learnErr);
fprintf('\t------------\n');

fprintf('\tPlotting Iteration Errors\n');
plot(1:length(iterationsErrors), iterationsErrors);
ylim([0 1]);
================
Iterations Error
================
	Training Classifier for 10 iterations
	Learning Error 0.020000
	------------
	Plotting Iteration Errors
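Besides plotting, the per-iteration errors can be used programmatically, e.g. to find the smallest number of boosting stages that reaches the minimum training error (a simple heuristic on the demo's `iterationsErrors` output, not part of its API):

```
% first boosting stage at which the minimum training error is reached;
% min returns the index of the first occurrence of the minimum
[minErr, bestStage] = min(iterationsErrors);
fprintf('Minimum error %f first reached at stage %d\n', minErr, bestStage);
```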

Add More Boosting Stages
fprintf('========================\n');
fprintf('Add More Boosting Stages\n');
fprintf('========================\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
fprintf('\tTraining Classifier for 5 iterations\n');
radacl = learn(radacl, x, y, 5);
fprintf('\t------------\n');

% calling learn again with a higher iteration count adds stages to the
% already-trained classifier until it has 10 stages in total
fprintf('\tTraining Classifier for 5 more iterations\n');
[radacl, learnErr] = learn(radacl, x, y, 10);
fprintf('\tLearning Error %f\n', learnErr);
========================
Add More Boosting Stages
========================
	Training Classifier for 5 iterations
	------------
	Training Classifier for 5 more iterations
	Learning Error 0.020000
Using Different Weak Classifier
fprintf('===============================\n');
fprintf('Using Different Weak Classifier\n');
fprintf('===============================\n');
% any weak classifier that supports probability estimates can be used,
% e.g. a decision tree instead of an SVM
radacl = RealAdaBooster(DecisionTreeClassifier(numClasses));

% train the RealAda classifier
fprintf('\tTraining Classifier for 3 iterations\n');
[radacl, learnErr] = learn(radacl, x, y, 3);
fprintf('\tLearning Error %f\n', learnErr);
===============================
Using Different Weak Classifier
===============================
	Training Classifier for 3 iterations
	Learning Error 0.013333
Saving and Loading the Classifier to and from a File
fprintf('=======================================\n');
fprintf('Save and Load Classifier to/from a file\n');
fprintf('=======================================\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
[radacl, learnErr] = learn(radacl, x, y, 1);
fprintf('\tLearning Error %f\n', learnErr);

% save the trained classifier, then load it into a new object
saveToFile(radacl, 'test2.bin');
radacl2 = loadFromFile(RealAdaBooster, 'test2.bin');

% verify that the loaded classifier behaves identically
outs1 = computeOutputs(radacl, x);
err1 = sum(outs1 ~= y) / numInstances;
outs2 = computeOutputs(radacl2, x);
err2 = sum(outs2 ~= y) / numInstances;
fprintf('\tError Before Save %f, Error After Save %f\n', err1, err2);
=======================================
Save and Load Classifier to/from a file
=======================================
	Learning Error 0.026667
	Error Before Save 0.026667, Error After Save 0.026667
Add Boost Stages till reaching a required Err Bound
fprintf('===================================================\n');
fprintf('Add Boost Stages till reaching a required Err Bound\n');
fprintf('===================================================\n');
radacl = RealAdaBooster(SVMClassifier(numClasses, '-b 1', '-b 1'));
% with Inf iterations and an error bound of 0.02, stages are added until
% the boosted classifier's training error drops to 0.02 or below
[radacl, learnErr] = learn(radacl, x, y, Inf, true, '', '', NaN, 0.02);
fprintf('\tLearning Error %f\n', learnErr);
===================================================
Add Boost Stages till reaching a required Err Bound
===================================================
RealAdaBoost
============
RealAda boost Classifier wasnt trained before
==========================
RealAda Boosting Stage # 1
==========================
Weak Classifier has been trained, err = 0.026667
Currently, boosted classifier's error = 0.026667
==========================
RealAda Boosting Stage # 2
==========================
Weak Classifier has been trained, err = 0.158547
Currently, boosted classifier's error = 0.020000
RealAdaBoost Training is Done with err = 0.020000
	Learning Error 0.020000