% fs_entropy.asv: data reduction with fuzzy rough sets or fuzzy mutual information
%% Compute a reduct from numerical data, categorical data, and their mixtures with fuzzy information entropy.
%% Please refer to the following papers:
%% Qinghua Hu, Daren Yu, Zongxia Xie. Information-preserving hybrid data reduction based on fuzzy-rough techniques. Pattern Recognition Letters. 2006, 27 (5): 414-423
%% Qinghua Hu, Daren Yu, Zongxia Xie, Jinfu Liu. Fuzzy probabilistic approximation spaces and their information measures. IEEE Transactions on Fuzzy Systems. 2006, 14 (2): 191-201
%% Qinghua Hu, Daren Yu. Entropies of fuzzy indiscernibility relation and its operations. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems. 2004, 12 (5): 575-589
%% We compute a reduct with fuzzy information entropy if there are numerical attributes;
%% otherwise, we search for a reduct with Shannon's entropy. In fact, Shannon's entropy
%% and fuzzy entropy are unified in the same form in this model.
%% The entropy-based dependency is employed as the heuristic rule.
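% As we read the papers above, the entropy of a relation matrix R over n
% samples used below is (stated here for orientation; verify against the
% accompanying entropy.m):
%
%   H(R) = -(1/n) * sum_{j=1..n} log2( |[x_j]_R| / n ),
%   where |[x_j]_R| = sum_{m=1..n} R(j,m),
%
% so the significance computed in the selection loop,
%
%   sig(i) = H(D) + H(B with i) - H(B with i, D),
%
% is the (fuzzy) mutual information between the candidate attribute subset
% and the decision attribute D.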
function select_feature = fs_entropy(data, if_fuzzy, neighbor)

[row, column] = size(data);

%% Compute one similarity relation matrix per attribute.
% ssr{i}(j,m) is the similarity between samples j and m on attribute i;
% kersim (fuzzy) and kersim_crisp (crisp) are the kernel functions
% supplied with this package.
ssr = cell(1, column);
for i = 1:column
    x = data(:, i);
    r = zeros(row);
    for j = 1:row
        a = data(j, i);
        for m = 1:row
            if if_fuzzy == 0
                r(j, m) = kersim_crisp(a, x(m), neighbor);
            else
                r(j, m) = kersim(a, x(m), neighbor);
            end
        end
    end
    ssr{i} = r;
end

%% Greedy forward selection based on entropy.
% The last column of data is the decision attribute; base is the relation
% induced by the currently selected condition attributes.
n = [];
sig = zeros(1, column - 1);
x = 0;                      % history of best significance values
base = ones(row);
r = ssr{column};            % relation of the decision attribute
entropyd = entropy(r);
attrinu = column - 1;
for j = attrinu:-1:1
    for i = 1:attrinu
        r1 = ssr{i};
        sig(i) = entropyd + entropy(min(r1, base)) - entropy(min(min(r1, r), base));
    end
    [x1, n1] = max(sig);
    x = [x; x1];
    len = length(x);
    if abs(x(len) - x(len - 1)) > 0.001
        % the best candidate still adds information: accept it
        base = min(base, ssr{n1});
        n = [n; n1];
    else
        break
    end
end
select_feature = n;
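
% Usage sketch (illustrative only: it assumes kersim, kersim_crisp, and the
% relation-matrix entropy function from this package are on the MATLAB path,
% and the data values below are made up):
%
%   data = [0.20 0.71 1;
%           0.31 0.64 1;
%           0.88 0.12 2;
%           0.79 0.23 2];             % last column = decision attribute
%   red = fs_entropy(data, 1, 0.25);  % fuzzy relations, kernel parameter 0.25
%   % red holds the indices of the selected condition attributes, in the
%   % order they were added to the reduct.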
