
📄 hdda_find_seuil.m

📁 High Dimensional Discriminant Analysis program — a useful tool for data mining and image analysis. MATLAB version.
💻 MATLAB
function prms = hdda_find_seuil(Xl,varargin);
% High Dimensionality Discriminant Analysis - This function allows to learn
% the dimensionality parameter from the learning dataset.
%
% Usage: prms = hdda_find_seuil(Xl,'model','AiBiQiDi','n_it',10);
%
% Author: C. Bouveyron <charles.bouveyron@inrialpes.fr> - 2004-2006
%
% Reference: C. Bouveyron, S. Girard and C. Schmid, "High Dimensional Discriminant Analysis",
%            Communications in Statistics, Theory and methods, in press, 2007.

% Global parameters
model = 'AiBiQiDi'; display = 0;
[N,p] = size(Xl.data);
n_it = 25;
n = floor(9 * size(Xl.data,1) / 10);
seuils = [1e-3,5e-3,1e-2:1e-2:19e-2,2e-1:5e-2:4e-1];
dim_max = p-1;
common_d = 0;

% PARAMETERS MANAGEMENT
varrem = {};
for i=1:2:length(varargin)
    if ~ischar(varargin{i}) || ~exist(varargin{i},'var')
        varrem = varargin(i:end);
    end
    eval([varargin{i} '= varargin{i+1};']);
end
if ~isempty(strmatch(model,strvcat('AijBiQiD', 'AijBQiD', 'AiBiQiD', 'ABiQiD', 'AiBQiD', 'ABQiD', 'AjBQD', 'ABQD'),'exact'))
    common_d = 1;
    %fprintf('--> the dimensions will be common between classes!\n')
end

% Finding the optimal "seuil" !
fprintf('--> Learning: please wait ');
if ~common_d
    for j=1:length(seuils)
        fprintf('.');
        s = seuils(j);
        for i=1:n_it
            [Xtrn.data,Xtrn.cls,Xtst.data,Xtst.cls] = halfsampling(Xl.data,Xl.cls,n);
            prms = hdda_learn(Xtrn,'model',model,'seuil',s,'display',0);
            res = hdda_classif(prms,Xtst.data);
            taux(j,i) = sum(res == Xtst.cls) / size(Xtst.cls,1);
        end
    end
    fprintf('\n');

    % Display results
    tx = mean(taux'); st = std(taux');
    [val,ind] = max(tx);
    s_opt = seuils(ind);
    fprintf('--> Optimal threshold: %g\n',s_opt);

    % Draw results
    if display,
        figure, plot(seuils,tx,'*'), hold on,
        plot(seuils(ind),tx(ind),'ro'),
        axis([0 seuils(end) min(tx)-(1-max(tx)) 1])
        for i=1:length(seuils), plot([seuils(i),seuils(i)],[tx(i)-st(i)/2,tx(i)+st(i)/2],':+'), end
    end

    % Learn the classifier
    prms = hdda_learn(Xl,'model',model,'seuil',s_opt,'display',0)

% Finding the optimal dimension !
else
    dims = [1:dim_max];
    for d=1:dim_max
        fprintf('.');
        for i=1:n_it
            [Xtrn.data,Xtrn.cls,Xtst.data,Xtst.cls] = halfsampling(Xl.data,Xl.cls,n);
            prms = hdda_learn(Xtrn,'model',model,'dim',d,'display',0);
            res = hdda_classif(prms,Xtst.data);
            taux(d,i) = sum(res == Xtst.cls) / size(Xtst.cls,1);
        end
    end
    fprintf('\n');

    % Display results
    tx = mean(taux'); st = std(taux');
    [val,ind] = max(tx);
    d_opt = dims(ind);
    fprintf('--> Optimal dimension: %g\n',d_opt);

    % Draw results
    if display,
        figure,
        plot(tx,'-*'), hold on, plot(dims(ind),tx(ind),'ro'),
        axis([0 dim_max+1 min(tx)-(1-max(tx)) 1])
        for i=1:dim_max, plot([i,i],[tx(i)-st(i)/2,tx(i)+st(i)/2],':+'), end
    end

    % Learn the classifier
    prms = hdda_learn(Xl,'model',model,'dim',d_opt,'display',0)
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
function [L,cls_l,T,cls_t] = halfsampling(X,cls,n);
% [L,cls_l,T,cls_t] = halfsampling(X,cls,n);
[N,p] = size(X);
ind = randperm(N);
L = X(ind(1:n),:); cls_l = cls(ind(1:n));
T = X(ind(n+1:end),:); cls_t = cls(ind(n+1:end));
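For orientation, the usage line in the header expands to something like the sketch below. The function reads Xl.data and Xl.cls, so the learning set must be packed into a struct with a data field (N x p matrix of observations) and a cls field (N x 1 vector of class labels). The Fisher iris data is only a hypothetical placeholder (it assumes the Statistics Toolbox), and hdda_learn and hdda_classif from the same HDDA package are assumed to be on the MATLAB path.

% Minimal usage sketch (assumptions: hdda_learn, hdda_classif and this file
% are on the path; fisheriris is placeholder data, not part of the package).
load fisheriris                        % meas: 150x4 measurements, species: class names
Xl.data = meas;                        % N x p matrix of observations
[lbl,ia,Xl.cls] = unique(species);     % N x 1 vector of numeric class labels (1..3)

% Cross-validate the threshold ("seuil") by repeated half-sampling,
% then keep the parameters of the classifier trained with the best value.
prms = hdda_find_seuil(Xl,'model','AiBiQiDi','n_it',10);

Since the model name 'AiBiQiDi' has class-specific dimensions, this call tunes the eigenvalue threshold; with a common-dimension model such as 'AiBiQiD', the same call would instead search the intrinsic dimension directly.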
