
📄 runlogitboostalgorithm_all.m

📁 LogitBoost is an improved boosting algorithm
function start
% Select a dataset (1-7); each case loads data and sets train and train_label
switch 3
    case 1
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\sonar1.data');
        train_label=sonar1(:,61);
        train=sonar1(:,1:60);
        
    case 2
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\glass.data');
        train=glass(:,2:10);
        train_label=glass(:,11)>2;
        
    case 3
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\ionosphere.data');
        train=ionosphere(:,2:34);
        train_label=ionosphere(:,1);
        
    case 4
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\waveform.data');
        train=waveform(:,1:21);
        train_label=waveform(:,22);
        
    case 5
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\vowel.data');
        train=vowel(:,4:13);
        train_label=vowel(:,1);
        
    case 6
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\usps.mat');
        train=test;
        train_label=test_label;
        
    case 7
        load ('C:\University\PhD\AdaBoost\LogitBoost\Matlab\newAlgorithm.mat');
        train=test;
        train_label=test_label;
        
end
%sonar3_label=sonar3_label-min(sonar3_label);mean(sonar3_label==0);
%sonar2_label=sonar2(:,30);sonar2_label=sonar2_label-min(sonar2_label);mean(sonar2_label==0);
runLogitBoostAlgorithm(train,train_label)

%%****************************************************************************************
%% LogitBoost (2 classes):
%% An adaptive Newton algorithm for fitting an additive multiple logistic regression model.
%% 1. Start with weights wi = 1/N, i = 1..N, F(x) = 0 and p(xi) = 1/2.
%% 2. Repeat for m = 1,2,...,M:
%%   (a) Compute the working responses and weights
%%
%%                yi* - p(xi)
%%         zi = ----------------
%%              p(xi)(1 - p(xi))
%%
%%         wi = p(xi)(1 - p(xi))
%%
%%   (b) Fit the function fm(x) by a weighted least-squares regression of zi to xi with weights wi.
%%   (c) Update F(x) <-- F(x) + (1/2)*fm(x)  and  p(x) <-- exp(F(x)) / [exp(F(x)) + exp(-F(x))].
%% 3. Output the classifier sign[F(x)] = sign[sum_m fm(x)].
%%****************************************************************************************
function runLogitBoostAlgorithm(train,train_label)
disp('run LogitBoost algorithm');
% Maximum number of boosting iterations
Cycle=1000;
% Number of candidate thresholds per feature for the stump weak learners
k=10;
% Size of train data: N samples, J features
[N,J]=size(train);
% Initial weights and additive model F (step 1)
Wi=ones(N,1)/N;
F=zeros(N,1);
% Per-feature ranges used to place the stump thresholds
a1=max(train);
a2=min(train);
H=zeros(N,k*J);
% Weak learners: one threshold stump per feature and threshold level
for i=1:k
    H(:,(i-1)*J+1:i*J) = train > ones(N,1)*(a2 + (a1 - a2).*(i/k));
end

% Preallocate the per-stump scores
d = zeros(1,k*J);
d1 = zeros(1,k*J);
d2 = zeros(1,k*J);
iter=0;
Fold=1e9;
while max(abs(F-Fold))>1e-5 && iter < Cycle
    iter = iter + 1;
    Fold=F;
    % 0/1 working targets y* from the labels
    y=(train_label>0)-(train_label<=0);
    yy=(y+1)/2;
    % p(x) = exp(F)/(exp(F)+exp(-F)), with small constants for numerical stability
    a = exp(F);
    a_minus = exp(-F);
    b = a + a_minus;
    p = a./(b+1e-6);
    % Working responses and weights (step 2a)
    Zi=(yy-p)./(p.*(1-p)+1e-5);
    Wi=p.*(1-p);
    % Center the weak-learner outputs (a no-op after the first iteration)
    H = H - (ones(N,1) * mean(H));
    % Weighted least-squares score of each stump: d = d1.^2 ./ d2
    d1 = (Wi.*Zi)' * H;
    d2 = Wi' * (H.^2);
    d = (d1.^2 ./ (d2+1e-20));
    % Pick the best stump (first index on ties) and take a half Newton step (step 2c)
    [~,im] = max(d);
    F = F + 0.5 * ((d1(im) / d2(im)) * H(:,im));
    % Clip F to avoid overflow in exp
    F=max(-15,min(15,F));
    % Training error of the classifier sign(F)
    output=mean(((F>0).*(1-yy))+((F<=0).*yy));
    x(iter)=output;
    v(iter)=iter;
    plot(v,x);
    grid on;
    drawnow
end
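The loop above can also be sketched outside MATLAB. The following is a minimal NumPy re-implementation of the same steps (threshold-stump dictionary, working responses and weights, weighted least-squares stump selection, half Newton step with clipping). The function name, parameters, and the synthetic dataset are illustrative assumptions, not part of the original code, which instead loads the UCI `.data` files referenced in `start`.

```python
import numpy as np

def logitboost(X, y01, n_rounds=50, n_thresh=10):
    """Two-class LogitBoost with threshold-stump weak learners.

    X: (N, J) feature matrix; y01: (N,) labels in {0, 1}.
    Returns the additive score F; classify with F > 0.
    """
    N, J = X.shape
    # Stump dictionary, as in the MATLAB code: for each feature and each
    # of n_thresh levels, H[:, t] = 1 if the feature exceeds the threshold.
    lo, hi = X.min(axis=0), X.max(axis=0)
    cols = [(X > lo + (hi - lo) * (i / n_thresh)).astype(float)
            for i in range(1, n_thresh + 1)]
    H = np.hstack(cols)
    H = H - H.mean(axis=0)  # center the weak-learner outputs
    F = np.zeros(N)
    for _ in range(n_rounds):
        p = np.exp(F) / (np.exp(F) + np.exp(-F))   # p(x)
        z = (y01 - p) / (p * (1 - p) + 1e-5)       # working responses
        w = p * (1 - p)                            # working weights
        d1 = (w * z) @ H                           # weighted correlations
        d2 = w @ (H ** 2)
        score = d1 ** 2 / (d2 + 1e-20)             # least-squares fit quality
        im = int(np.argmax(score))                 # best stump (first on ties)
        F = F + 0.5 * (d1[im] / d2[im]) * H[:, im] # half Newton step
        F = np.clip(F, -15, 15)                    # avoid overflow in exp
    return F

# Tiny synthetic check: one feature fully determines the class.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(float)
F = logitboost(X, y)
err = np.mean((F > 0) != (y > 0))
```

On separable data like this, the training error should fall well below chance within a few dozen rounds; the stabilizing constants (`1e-5`, `1e-20`) and the clip at ±15 mirror the MATLAB source.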
