
📄 backsel_main.m

📁 Feature Selection using MATLAB
💻 MATLAB
📖 Page 1 of 2
%==================================================================
%--- Backward Feature Selection by Correct Classification Rate ---
%---------------------- Main Function ----------------------------
%==================================================================
% function [ResultMat] = BackSel_main
% Matlab function of the two backward selection algorithms:
% a) Sequential Backward Selection           (SBS)
% b) Sequential Floating Backward Selection  (SFBS)
% The methods are improved by a t-test and an Information Loss
% evaluation. The criterion in feature selection is the correct
% classification rate achieved by the Bayes classifier when each
% probability density function is modeled as a single Gaussian.
% Main function:
% - The feature selection method and our improvement on sequential
%   selection algorithms.
% Secondary function: BayesClassMVGaussPDFs
% - Bayes classifier with Gaussian-modeled PDFs,
%   using cross-validation or resubstitution methods.
%   Copyright 2003-2009 Dimitrios Ververidis, AIIA Lab.
%   $Revision: 5.1.4 $  $Date: 27/01/2009 $
% REFERENCES:
% [1] D. Ververidis and C. Kotropoulos, "Fast and accurate feature
%     subset selection applied into speech emotion recognition,"
%     Els. Signal Process., vol. 88, issue 12, pp. 2956-2970, 2008.
% [2] D. Ververidis and C. Kotropoulos, "Optimum feature subset
%     selection with respect to the information loss of the
%     Mahalanobis distance in high dimensions," under preparation,
%     2009.

function [ResultMat, ConfMatFinal, Tlapse, OptimumFeatureSet, ...
          OptimumError] = BackSel_main(DatasetToUse, ErrorEstMethod, ...
                              MahalInfoLossMethod, FSMethod, handles)
format short g
% INPUT
% DatasetToUse:   STRING ('finalvecXXX' where XXX = your dbname)
% ErrorEstMethod: Error estimation method (STRING)
%                 Values: 'Standard'   stands for cross-validation
%                         'ProposedA'  stands for cross-validation
%                         'ProposedAB' stands for cross-validation
%                                      (see paper [1])
%                         'Resubstitution' train and test sets are
%                                      the whole data set
% MahalInfoLossMethod: To view limits of CCR wrt dimensionality,
%                      found with Mahalanobis Info Loss (see [2]).
% FSMethod:       Feature selection method ('SFBS', 'SBS')
% NFeatToSelect:  Number of features to select (INTEGER, DEFAULT = 10)
% PercTest:       1/PercTest of the data is used for testing; the rest
%                 is used for training the classifier
%                 (DEFAULT = 10, RANGE [5,...,50]).
% NRepThres:      Number of cross-validation repetitions when the
%                 'Standard' ErrorEstMethod is used
%                 (INTEGER > 10, DEFAULT = 50).
% GammaParam:     The accuracy of the feature selection when the t-test
%                 method is used (DEFAULT = 0.015)
% OUTPUT
% ResultMat: Contains the features selected, the correct
%      classification rate (CCR) achieved, and its limits. Format:
%      # Features | CCR | Feature Index | DownLimit | UpperLimit

%=============== Load The Patterns ================================
% NPatterns: The number of patterns
% KFeatures: The number of features
% CClasses : The number of classes
% Patterns and Targets are stored in a single matrix of
% NPatterns x (KFeatures + 1) dimensionality.
% The additional column holds the Targets.
% Patterns: FLOAT numbers in [0,1]
% Targets:  INTEGER in {1,2,...,C}, where C is the number of classes.
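% EXAMPLE (a minimal sketch of a call; the dataset name
% 'finalvecBERLIN' is a hypothetical placeholder for a 'finalvecXXX'
% file, MahalInfoLossMethod = 0 assumes the info-loss view is off,
% and handles = [] assumes the GUI is bypassed):
%   [ResultMat, ConfMatFinal] = BackSel_main('finalvecBERLIN', ...
%                                   'ProposedAB', 0, 'SFBS', []);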
[Patterns, Targets] = DataLoadAndPreprocess(DatasetToUse);
[NPatterns, KFeatures] = size(Patterns);
CClasses = max(Targets);
%=============== End Loading Data =================================

%============  Feature Selection Settings =========================
global HISTORYtable     % For reporting.
global StopByUser       % Forced stop of the algorithm by the user (GUI).
ConfMatSwitch = 0;      % View confusion matrix switch: 0 off, 1 on.
PercTest      = 10;     % 1/PercTest of the data is used for testing.
%NFeatToSelect = 4;     % Options: > 3
TotalNStepsThres = 250; % Threshold # for inclusion-exclusion steps.
                        % Options: > NFeatToSelect
GammaParam = 0.025;     % Confidence interval to control the number of
                        % repetitions. Options: 0 < GammaParam < 1.
                        % The lower, the better. GammaParam is used to
                        % skip unnecessary repetitions via a
                        % statistical test.
if strcmp(ErrorEstMethod,'ProposedAB')
    NRepThres = [];     % Estimated automatically (see [1]).
    NDc = floor((1-1/PercTest)*NPatterns/CClasses);
elseif strcmp(ErrorEstMethod,'ProposedA') || ...
       strcmp(ErrorEstMethod,'Standard')
    NRepThres = 50;     % Options: > 10. The greater, the better.
    GammaParam = [];
    NDc = floor((1-1/PercTest)*NPatterns/CClasses);
elseif strcmp(ErrorEstMethod,'Resubstitution')
    NRepThres = 1;      % Train and test with all patterns.
    NDc = floor(NPatterns/CClasses);
end
%=============   End Feature Selection Settings ===================

TimeStampStart = clock;

%============  Log Report Formats =================================
StrLine = ['------------------------------------------' ...
           '-------------------------'];
StrInclusionStep = ['\n\t Conditional Inclusion Step\n' StrLine ...
 '\nFeature | Corr. Classif.\t | Crossval. Repet.| Best Current\n'];
StrExclusionStep = ['\n\t\t Exclusion Step \t\t\t\n' StrLine ...
    '\n Feature | Corr. Classification |  Crossval. Repet.' ...
    '|  Best Current\n'];
StrFeatSelected = ['\n\t\t\t\t Features Discarded \n' StrLine '\n' ...
    '#Feat| Correct | Features | LowCL  | UpCL  | LowCL\n' ...
    'Remov| Classif.| Discarded| Hyper  | Hyper | Mahal\n'];
FormatToPrintInterStep = ['%3d \t|\t\t %1.3f \t\t\t|\t' ...
                          '%3d \t\t|       %1.3f\n'];
FormatToPrintExterStep = [' %3d | \t%4.3f |' ...
                          '\t %3d\t | %4.3f  | %4.3f |  %4.3f\n'];
%==================================================================

%=============== Feature Selection Initialize =====================
%NTestSet  = NPatterns/PercTest; % # of samples in the test set.
CritvalOpt = zeros(1,KFeatures); % Maximum criterion value for sets
                                 % of all sizes.
SelectedFeatPool = 1:KFeatures;  % Pool of selected feature indices.
PoolRemainFeat   = [];           % Pool of remaining feature indices.
ResultMat        = [];           % Result matrix with selection history.
NSteps           = 0;            % No inclusion or exclusion step yet.
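% NOTE: a worked example of the NDc bound computed above, under the
% hypothetical values NPatterns = 500, CClasses = 5, PercTest = 10:
%   NDc = floor((1 - 1/10) * 500 / 5) = floor(90) = 90
% training patterns per class. Since a class covariance matrix
% estimated from fewer patterns than features is singular, the
% "Handle too many features" block below randomly keeps at most
% NDc - 1 = 89 features in the initial pool.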
NSelectedFeat    = KFeatures;    % # of selected features.
ContinueFlag     = 1;            % Stop criterion not met yet.
StopByUser       = 0;            % Not stopped by the user.
FlagAll          = 0;
LogViewOfIntStep = 1;

%================== Handle too many features ======================
if NDc < KFeatures + 1
    RandSelectFeatIndices = randperm(KFeatures);
    SelectedFeatPool = RandSelectFeatIndices(1:(NDc-1));
    SelectedFeatPool = sort(SelectedFeatPool);
    PoolRemainFeat = setxor(1:KFeatures, SelectedFeatPool);
    NSelectedFeat = NDc-1;
    disp('Selecting some features randomly')
end
InitialSelectedFeatPool = SelectedFeatPool;

%=================== Plot Feature Lines ===========================
if ~isempty(handles)
    if NPatterns > KFeatures
        xLim = NPatterns;  yLim = KFeatures;
    else
        yLim = NPatterns;  xLim = KFeatures;
    end
    axes(handles.YelLinesAxes);
    axis([0 xLim 0 yLim]); axis manual
    hold on
    for IndexFeatures = 1:length(SelectedFeatPool)
        if (NPatterns > KFeatures)
            HYelLines(SelectedFeatPool(IndexFeatures)) = ...
                plot([0 NPatterns+2], ...
                (SelectedFeatPool(IndexFeatures)-.5)*ones(1,2),'y');
        else
            HYelLines(SelectedFeatPool(IndexFeatures)) = ...
                plot((SelectedFeatPool(IndexFeatures)-.5)* ...
                ones(1,2), [0 NPatterns+2],'y');
        end
    end
    set(gca,'Visible','off');
    drawnow
    set(findobj(gcf,'Tag','ListSelFeats'), 'String', ...
        num2str(SelectedFeatPool));
end
%================== End of Initialization =========================

while ContinueFlag == 1 && StopByUser == 0 %== Begin Steps ========
    if NSteps >= TotalNStepsThres
        ContinueFlag = 0;
    end
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    %---------- Exclusion -----------
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    fprintf(1, [StrExclusionStep StrLine '\n']);
    % First internal of first external step: use all features.
    if NSelectedFeat == KFeatures
        HISTORYtable(end+1,1) = 0;
        [Critval, ConfMatOptBack, lowCLOptBack, upCLOptBack] = ...
            BayesClassMVGaussPDFs( ...
            Patterns, Targets, PercTest, ErrorEstMethod, ...
            NRepThres, GammaParam, 0, ConfMatSwitch);
        HISTORYtable(end,2) = Critval;
        HISTORYtable(end,4) = Critval;
        fprintf(1, FormatToPrintInterStep, ...
            HISTORYtable(end,1), ...
            HISTORYtable(end,2), HISTORYtable(end,3), Critval);
        CritvalMaxCurrBack = Critval;
        FlagAll = 1;
    end

    Critval            = zeros(1,length(PoolRemainFeat));
    CritvalMaxCurrBack = 0;
    for FeatSerialIndx = 1:NSelectedFeat
        CandidateFeatSet = SelectedFeatPool;
        % Remove one feature from the selected ones.
        CandidateFeatSet(FeatSerialIndx) = [];
        HISTORYtable(end+1,1) = -SelectedFeatPool(FeatSerialIndx);
        [Critval(FeatSerialIndx), ConfMat, lowCL, upCL] = ...
            BayesClassMVGaussPDFs(Patterns(:,CandidateFeatSet), ...
            Targets, PercTest, ErrorEstMethod, NRepThres, ...
            GammaParam, CritvalMaxCurrBack, ConfMatSwitch);
        HISTORYtable(end,2) = Critval(FeatSerialIndx);
        % If removing this feature gives the best result so far
        % (or the same result using fewer features), and we have
        % not yet removed all NFeatToSelect features, store it.
        if strcmp(ErrorEstMethod,'ProposedAB')
            CritvalThres = CritvalMaxCurrBack + GammaParam;
        else
            CritvalThres = CritvalMaxCurrBack;
        end
        if Critval(FeatSerialIndx) >= CritvalThres
            CritvalMaxCurrBack           = Critval(FeatSerialIndx);
            ConfMatOptBack               = ConfMat;
            lowCLOptBack                 = lowCL;
            upCLOptBack                  = upCL;
            FeatSerialIndxCritvalMaxCurr = FeatSerialIndx;
            HISTORYtable(end,4)          = Critval(FeatSerialIndx);
            FlagAll = 0;
        end
        fprintf(1, FormatToPrintInterStep, ...
            HISTORYtable(end,1), ...
            HISTORYtable(end,2), HISTORYtable(end,3), ...
            CritvalMaxCurrBack);
    end % end check of all selected features (end internal step)

    ConfMatFinal        = ConfMatOptBack;
    lowCLFinal          = lowCLOptBack;
    upCLFinal           = upCLOptBack;
    NSteps              = NSteps + 1;
    HISTORYtable(end,4) = CritvalMaxCurrBack;
    CritvalMaxCurrForw  = CritvalMaxCurrBack;

    if FlagAll
        ResultMat(end+1,1:5) = [KFeatures-NSelectedFeat ...
            CritvalMaxCurrBack 0 lowCLOptBack upCLOptBack];
    else
        CritvalOpt(NSelectedFeat) = CritvalMaxCurrBack;
        NSelectedFeat             = NSelectedFeat - 1;
        PoolRemainFeat            = [PoolRemainFeat ...
            SelectedFeatPool(FeatSerialIndxCritvalMaxCurr)];
        PoolRemainFeat            = sort(PoolRemainFeat);
        ResultMat(end+1,1:5) = [KFeatures-NSelectedFeat ...
            CritvalMaxCurrBack ...
            -SelectedFeatPool(FeatSerialIndxCritvalMaxCurr) ...
            lowCLOptBack upCLOptBack];
        SelectedFeatPool(FeatSerialIndxCritvalMaxCurr) = [];
    end
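The listing is truncated here; the remainder of the exclusion loop and the conditional inclusion step (SFBS) continue on page 2. The secondary function BayesClassMVGaussPDFs, which computes the CCR criterion, is also not shown on this page. For orientation only, here is a minimal sketch of that kind of criterion: a Bayes classifier with one Gaussian per class, scored by resubstitution. The function name, signature, and the covariance ridge are assumptions for illustration, not the author's code.

function CCR = BayesGaussCCRSketch(Patterns, Targets)
% Minimal sketch (NOT the original BayesClassMVGaussPDFs): correct
% classification rate of a Bayes classifier in which each class pdf
% is a single multivariate Gaussian, evaluated by resubstitution.
CClasses  = max(Targets);
NPatterns = size(Patterns,1);
LogPost   = zeros(NPatterns, CClasses);
for c = 1:CClasses
    X     = Patterns(Targets == c, :);      % patterns of class c
    Prior = size(X,1) / NPatterns;          % class prior P(c)
    Mu    = mean(X,1);                      % class mean vector
    Sigma = cov(X) + 1e-6*eye(size(X,2));   % covariance + small ridge
                                            % (assumption, to avoid
                                            % singular matrices)
    Xc    = bsxfun(@minus, Patterns, Mu);   % center all patterns
    % Log-posterior up to a common constant:
    % log P(c) - 0.5*log|Sigma| - 0.5*(x-Mu)'*inv(Sigma)*(x-Mu)
    LogPost(:,c) = log(Prior) - 0.5*log(det(Sigma)) ...
                   - 0.5*sum((Xc/Sigma).*Xc, 2);
end
[~, Predicted] = max(LogPost, [], 2);       % Bayes decision rule
CCR = mean(Predicted == Targets(:));        % resubstitution CCR
end

With the Patterns and Targets loaded above, CCR = BayesGaussCCRSketch(Patterns(:,SelectedFeatPool), Targets) would score a candidate feature subset the way the exclusion loop does, minus the cross-validation repetitions and the t-test machinery of [1].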
