⭐ 虫虫下载站

📄 forwsel_main.m

📁 Feature Selection using matlab
💻 MATLAB
📖 Page 1 of 2
%==================================================================
%--- Forward Feature Selection by Correct Classification Rate  ---
%----------------------- Main Function ---------------------------
%==================================================================
% function [ResultMat] = ForwSel_main
% Matlab function of the two forward selection algorithms:
% a) Sequential Forward Selection              (SFS)
% b) Sequential Floating Forward Selection     (SFFS)
% Methods are improved by a t-test and an Information Loss
% evaluation. The criterion in feature selection is the correct
% classification rate achieved by the Bayes classifier when each
% probability density function is modeled as a single Gaussian.
% Main function:
% - The feature selection method and our improvement on sequential
%   selection algorithms
% Secondary function: BayesClassMVGaussPDFs
% - Bayes classifier with Gaussian-modeled PDFs,
%   using Crossvalidation or Resubstitution methods.
%   Copyright 2003-2009 Dimitrios Ververidis, AIIA Lab.
%   $Revision: 5.1.3 $  $Date: 09/01/2009 $
% REFERENCES:
% [1] D. Ververidis and C. Kotropoulos, "Fast and accurate feature
%     subset selection applied into speech emotion recognition,"
%     Els. Signal Process., vol. 88, issue 12, pp. 2956-2970, 2008.
% [2] D. Ververidis and C. Kotropoulos, "Optimum feature subset
%     selection with respect to the information loss of the
%     Mahalanobis distance in high dimensions," under preparation,
%     2009.

function [ResultMat, ConfMatOpt, Tlapse, OptimumFeatureSet, ...
          OptimumCCR] = ForwSel_main(DatasetToUse, ErrorEstMethod, ...
                          MahalInfoLossMethod, FSMethod, handles)

format short g
% INPUT
% DatasetToUse:   STRING ('finalvecXXX' where XXX = your db name)
% ErrorEstMethod: Error estimation method (STRING)
%                 Values: 'Standard'   stands for cross-validation
%                         'ProposedA'  stands for cross-validation
%                         'ProposedAB' stands for cross-validation
%                                      (see paper [1])
%                         'Resubstitution' Train and Test sets are
%                                      the whole data set
% MahalInfoLossMethod: To view limits of CCR wrt dimensionality
%                      found with Mahalanobis Info Loss (see [2]).
% FSMethod           : Feature selection method ('SFFS','SFS')
% NFeatToSelect: Number of features to select (INTEGER, DEFAULT=10)
% PercTest: 1/PercTest of the data is used for testing; the rest is
%      used for training the classifier (DEFAULT=10, RANGE [5,...,50]).
% NRepThres: Number of cross-validation repetitions when 'Standard'
%            ErrorEstMethod is used (INTEGER>10, DEFAULT=50).
% GammaParam: The accuracy of the feature selection when the t-test
%             method is used (DEFAULT = 0.015)
% OUTPUT
% ResultMat: Contains the features selected, the correct
%      classification rate (CCR) achieved, and its limits. Format:
% # Features | CCR | Feature Index | DownLimit | UpperLimit
%=============== Load The Patterns ================================
% NPatterns: The number of patterns
% KFeatures: The number of features
% CClasses : The number of classes
% Patterns, features and Targets in a single matrix of
% NPatterns X (KFeatures + 1) dimensionality.
% The additional feature column is the Targets.
% Patterns: FLOAT numbers in [0,1]
% Targets : INTEGER in {1,2,...,C}, where C is the number of classes.
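% EXAMPLE (illustrative sketch only -- 'finalvecSES' is a
% hypothetical dataset name; substitute your own 'finalvecXXX'
% file from DataLoadAndPreprocess):
%
%   [ResultMat, ConfMatOpt] = ForwSel_main('finalvecSES', ...
%                                 'ProposedA', 'on', 'SFFS', []);
%
%   % Each ResultMat row holds
%   %   [#Feat CCR FeatIndex LowCL UpCL LowMahal];
%   % passing [] for handles skips the GUI plot module.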
[Patterns, Targets] = DataLoadAndPreprocess(DatasetToUse);
[NPatterns, KFeatures] = size(Patterns);
CClasses = max(Targets);
%=============== End Loading Data ================================

%============  Feature Selection Settings ========================
global HISTORYtable     % For reporting
global StopByUser       % Forced stop of algorithm by user (GUI)
ConfMatSwitch = 0;      % View confusion matrix switch: 0 off, 1 on
PercTest      = 10;     % 1/PercTest of data is used for testing.
%NFeatToSelect = 4;     % Options: >3
TotalNStepsThres = 250; % Threshold # for inclusion-exclusion steps.
                        % Options: > NFeatToSelect
GammaParam = 0.025;     % Confidence interval to control number of
                        % repetitions. Options: 0 < GammaParam < 1.
                        % The lower, the better. GammaParam
                        % will be used to override unnecessary
                        % repetition with a statistical test.
if strcmp(ErrorEstMethod,'ProposedAB')
    NRepThres = [];     % It is estimated automatically (see [1]).
    NDc = floor((1-1/PercTest)*NPatterns/CClasses);
elseif strcmp(ErrorEstMethod,'ProposedA') || ...
       strcmp(ErrorEstMethod,'Standard')
    NRepThres = 50;     % Options: >10, the greater, the better.
    GammaParam = [];
    NDc = floor((1-1/PercTest)*NPatterns/CClasses);
elseif strcmp(ErrorEstMethod,'Resubstitution')
    NRepThres = 1;      % Train and test with all patterns.
    NDc = floor(NPatterns/CClasses);
end
%=============   End Feature Selection Settings ==================
TimeStampStart = clock;
%============  Log Report Formats ================================
StrLine = ['------------------------------------------' ...
           '-------------------------'];
StrInclusionStep = ['\t\t Inclusion Step\n' StrLine '\n' ...
    'Feature | Corr. Classif. | Crossval. Repet.| Best Current\n'];
StrExclusionStep = ['\n\t\t  Conditional Exclusion Step \t\t\t\n' ...
    StrLine '\n Feature | Corr. Classification |  Crossval. Repet.' ...
    '|  Best Current\n'];
StrFeatSelected = ['\n\t\t Features Selected \n' StrLine '\n' ...
    '#Feat| Correct | Features | LowCL  | UpCL  | LowCL\n' ...
    'Selec| Classif.| Selected | Hyper  | Hyper | Mahal\n'];
FormatToPrintInterStep = ['%3d \t|\t %1.3f \t|\t' ...
                          '%3d \t|  %1.3f\n'];
FormatToPrintExterStep = [' %3d | \t%4.3f |' ...
    '\t %3d\t | %4.3f  | %4.3f |  %4.3f  \n'];
%==================================================================
%=============== Feature Selection Initialize =====================
%NTestSet  = NPatterns/PercTest;  % # of samples in test set
CritvalOpt = zeros(1,KFeatures);  % Maximum criterion value for sets
                                  % of all sizes.
PoolRemainFeat   = 1:KFeatures; % Pool of remaining feat indices.
SelectedFeatPool = [];      % Pool of selected feature indices.
ResultMat        = [];      % Result matrix with selection history.
NSteps           = 0;       % No inclusion or exclusion step yet.
NSelectedFeat    = 0;       % # of selected features.
ContinueFlag     = 1;       % Stop criterion not met yet.
LogViewOfIntStep = 1;
OptimumFeatureSet = [];
%==================End of Initialization ==========================
while ContinueFlag == 1 && StopByUser == 0 %== Begin Steps ========
    if NSteps >= TotalNStepsThres
        ContinueFlag = 0;
    end
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    %---------- Inclusion -----------
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    Critval = zeros(1,length(PoolRemainFeat));
    if LogViewOfIntStep == 1
        fprintf(1,[StrInclusionStep StrLine '\n']);
    end
    CritvalMaxCurrForw = 0;
    % Begin of internal step of inclusion
    for FeatSerialIndx = 1:length(PoolRemainFeat)
        % Add one feature to the already selected ones.
        CandidateFeatSet = [SelectedFeatPool, ...
            PoolRemainFeat(FeatSerialIndx)];
        HISTORYtable(end+1,1) = PoolRemainFeat(FeatSerialIndx);
        [Critval(FeatSerialIndx), ConfMat, lowCL, upCL] = ...
            BayesClassMVGaussPDFs(Patterns(:,CandidateFeatSet), ...
                Targets, PercTest, ErrorEstMethod, ...
                NRepThres, GammaParam, ...
                CritvalMaxCurrForw, ConfMatSwitch);
        HISTORYtable(end,2) = Critval(FeatSerialIndx);
        % If this feature is the best so far and we have not yet
        % selected NFeatToSelect features, store it.
        if strcmp(ErrorEstMethod,'ProposedAB')
            CritvalThres = CritvalMaxCurrForw + GammaParam;
        else
            CritvalThres = CritvalMaxCurrForw;
        end
        if Critval(FeatSerialIndx) >= CritvalThres
            CritvalMaxCurrForw           = Critval(FeatSerialIndx);
            ConfMatOpt                   = ConfMat;
            lowCLopt                     = lowCL;
            upCLopt                      = upCL;
            FeatSerialIndxCritvalMaxCurr = FeatSerialIndx;
            HISTORYtable(end,4)          = Critval(FeatSerialIndx);
        end
        if LogViewOfIntStep == 1
            fprintf(1, FormatToPrintInterStep, HISTORYtable(end,1), ...
                HISTORYtable(end,2), HISTORYtable(end,3), ...
                CritvalMaxCurrForw);
        end
    end  % End of internal step of inclusion
    % Add the best feature to the set of selected features
    FeatureToInclude = PoolRemainFeat(FeatSerialIndxCritvalMaxCurr);
    SelectedFeatPool = [SelectedFeatPool FeatureToInclude];
    % and remove it from the pool.
    PoolRemainFeat(FeatSerialIndxCritvalMaxCurr) = [];
    NSelectedFeat             = NSelectedFeat + 1;
    CritvalOpt(NSelectedFeat) = CritvalMaxCurrForw;
    ResultMat(end+1,1:5)      = [NSelectedFeat ...
        CritvalMaxCurrForw SelectedFeatPool(end) lowCLopt upCLopt];
    NSteps = size(ResultMat,1);
    %------ Curse-of-dim limits For Inclusion ---------------------
    if strcmp(MahalInfoLossMethod, 'on')
        LowLimitMahalInfLoss(NSteps) = ...
            MahalaInfoLoss(NSelectedFeat, ResultMat(end,2), ...
                NDc, CClasses, ErrorEstMethod);
        ResultMat(end, 6) = LowLimitMahalInfLoss(NSteps);
    else
        LowLimitMahalInfLoss(NSteps) = 0;
        ResultMat(end, 6) = LowLimitMahalInfLoss(NSteps);
    end
    %--------------------------------------------------------------
    %----------------------- Plot Module --------------------------
    if ~isempty(handles)
        axes(handles.YelLinesAxes);
        axis([0 NPatterns 0 KFeatures]); axis manual
        hold on
        if (NPatterns > KFeatures)
