⭐ 虫虫下载站

📄 k_means_subspaces.m~

📁 A source program using the EM algorithm
💻 M~
function sampleLabels = K_means_subspaces(nullSpaces, subspaceDimensions)
% K_means_subspaces groups data vectors based on the subspace dimension
% assumption "subspaceDimensions" and the null space information
% "nullSpaces".
%
% The difficulty lies in the fact that the dimension of the null space of
% the data matrix given by the characteristic dimension is usually
% substantially larger than the real codimensions of the subspaces.

% Step 1: recursively select a global energy threshold such that the
% cut-off of the singular values matches the subspaceDimensions assumption
% best.
[ambientDimension, charDimension, sampleNumber] = size(nullSpaces);
normalizedNullSpaces = cell(1, sampleNumber);
sampleSitsInDimension = zeros(1, sampleNumber);

% Step 1.1: If the subspaceDimensions are all the same, then we do not need
% an energy cut.
mixedDimension = sum(subspaceDimensions ~= subspaceDimensions(1));
if mixedDimension == 0
    % Cut null spaces directly to their correct dimension.
    dimensionClassNumber = 1;
    dimensionClasses = subspaceDimensions(1);
    for sampleIndex = 1:sampleNumber
        sampleSitsInDimension(sampleIndex) = dimensionClasses;
        [U, ~, ~] = svds(nullSpaces(:,:,sampleIndex), dimensionClasses);
        normalizedNullSpaces{sampleIndex} = U;
    end
else
    % Step 1.2: Vote for an optimal global energy threshold.
    dimensionClassNumber = 0;
    for index = 1:ambientDimension-1
        if sum(subspaceDimensions == index) > 0
            dimensionClassNumber = dimensionClassNumber + 1;
            dimensionClasses(dimensionClassNumber) = index; %#ok<AGROW>
        end
    end

    energyVote = zeros(1, 99);
    U = zeros(ambientDimension, min(ambientDimension, charDimension), sampleNumber);
    singularValues = zeros(min(ambientDimension, charDimension), sampleNumber);
    totalEnergy = zeros(1, sampleNumber);
    for sampleIndex = 1:sampleNumber
        % Economy-size SVD so that U fits the preallocated array even when
        % charDimension < ambientDimension.
        [U(:,:,sampleIndex), S, ~] = svd(nullSpaces(:,:,sampleIndex), 'econ');
        % Compute the energy of the singular values.
        singularValues(:, sampleIndex) = diag(S);
        totalEnergy(sampleIndex) = sum(abs(singularValues(:, sampleIndex)).^2);
        for dimensionClassIndex = 1:dimensionClassNumber
            partialEnergy = sum(abs(singularValues(1:dimensionClasses(dimensionClassIndex), sampleIndex)).^2);
            ratio = round(partialEnergy / totalEnergy(sampleIndex) * 100);
            % Spread each vote over a small window to tolerate noise.
            for voteIndex = max(1, ratio-2):min(99, ratio+2)
                energyVote(voteIndex) = energyVote(voteIndex) + 1;
            end
        end
    end

    % The energyVote histogram must contain multiple votes, and the highest
    % vote must be the last peak, since any low-dimensional subspace can be
    % fitted with a higher-dimensional model.
    % Step 1.3: Detect local maxima.
    MAXIMUM_DETECTION_METHOD = 2;
    smoothSpan = 10;
    if MAXIMUM_DETECTION_METHOD == 1
        % Moving-average smoothing.
        smoothedVote = smooth(energyVote, smoothSpan, 'moving');
    else
        % Local-regression (lowess) smoothing.
        smoothedVote = smooth(energyVote.', smoothSpan, 'lowess');
    end
    maximumIndices = find(diff(sign(diff([0; smoothedVote(:); 0]))) < 0);

    % Step 1.4: Choose an optimal maximum point by comparing the variation
    % of the sample counts over all classes.
    optVariation = inf;
    sampleDimensions = zeros(1, sampleNumber);
    sampleClassCount = zeros(1, dimensionClassNumber);
    optSampleDimensions = sampleDimensions;
    for index = 1:length(maximumIndices)
        % Cut dimensions based on the current threshold candidate.
        ratio = maximumIndices(index) / 100;
        for sampleIndex = 1:sampleNumber
            dimensionAssigned = false;
            for dimensionClassIndex = 1:dimensionClassNumber-1
                partialEnergy = sum(abs(singularValues(1:dimensionClasses(dimensionClassIndex), sampleIndex)).^2);
                if partialEnergy / totalEnergy(sampleIndex) >= ratio
                    sampleDimensions(sampleIndex) = dimensionClasses(dimensionClassIndex);
                    dimensionAssigned = true;
                    break;
                end
            end
            if ~dimensionAssigned
                % Fall back to the largest class dimension.
                sampleDimensions(sampleIndex) = dimensionClasses(end);
            end
        end

        % Compute the variation of the per-class sample counts; skip
        % threshold candidates that leave some dimension class empty.
        emptyClass = false;
        for dimensionClassIndex = 1:dimensionClassNumber
            sampleClassCount(dimensionClassIndex) = sum(sampleDimensions == dimensionClasses(dimensionClassIndex));
            if sampleClassCount(dimensionClassIndex) == 0
                emptyClass = true;
                break;
            end
        end
        if emptyClass
            continue;
        end
        variation = var(sampleClassCount);
        if variation < optVariation
            optVariation = variation;
            optSampleDimensions = sampleDimensions;
        end
    end

    % Step 1.5: Finally, cut the basis matrices U(:,:,sampleIndex) based on
    % optSampleDimensions.
    sampleSitsInDimension = optSampleDimensions;
    for sampleIndex = 1:sampleNumber
        normalizedNullSpaces{sampleIndex} = U(:, 1:optSampleDimensions(sampleIndex), sampleIndex);
    end
end

% Step 2: cluster groups within each class with the same dimension.
% NOTE: this step is left unimplemented in this version of the file, so the
% output sampleLabels is never assigned.
for dimensionClassIndex = 1:dimensionClassNumber
    % The number of subspaces with the same dimension in this class.
    clusterNumber = sum(subspaceDimensions == dimensionClasses(dimensionClassIndex));

    % Regroup samples in the same class.
end
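When every subspace has the same dimension (Step 1.1), each null-space estimate is simply truncated to its dominant singular directions. A minimal Python/NumPy sketch of that truncation (the function name and shapes are illustrative, not part of the original file):

```python
import numpy as np

def truncate_null_space(null_space, dim):
    """Keep the `dim` dominant left singular vectors as an orthonormal
    basis, mirroring the svds(...) cut in the MATLAB code."""
    U, _, _ = np.linalg.svd(null_space, full_matrices=False)
    return U[:, :dim]
```

The returned columns are orthonormal, so they can be compared across samples without further normalization.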
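In the mixed-dimension branch (Steps 1.2 and 1.3), each sample casts one fuzzy vote per candidate dimension into a 99-bin histogram of cumulative-energy percentages, which is then smoothed and scanned for local maxima. A hedged NumPy sketch of the same voting and peak-detection logic (function names and the moving-average smoother are my own simplifications):

```python
import numpy as np

def energy_votes(singular_values_list, dimension_classes, n_bins=99):
    """Histogram of cumulative-energy ratios; each vote is spread over a
    5-bin window, as in the MATLAB loop."""
    votes = np.zeros(n_bins)
    for s in singular_values_list:
        total = np.sum(s**2)
        for d in dimension_classes:
            ratio = int(round(np.sum(s[:d]**2) / total * 100))
            lo, hi = max(1, ratio - 2), min(n_bins, ratio + 2)
            votes[lo - 1:hi] += 1
    return votes

def local_maxima(votes, span=10):
    """Moving-average smoothing, then local maxima via sign changes of the
    first difference (the diff(sign(diff(...))) trick from the MATLAB code)."""
    kernel = np.ones(span) / span
    smoothed = np.convolve(votes, kernel, mode='same')
    padded = np.concatenate(([0.0], smoothed, [0.0]))
    return np.where(np.diff(np.sign(np.diff(padded))) < 0)[0]
```

The 5-bin vote window makes the histogram robust to small noise in the energy ratios, so nearby samples reinforce the same peak.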
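Step 1.4 then scores each candidate peak by how evenly it splits the samples across the dimension classes, keeping the threshold whose per-class sample counts have the lowest variance. An illustrative Python sketch under a simplified interface (here `energy_ratio[i][k]` is assumed to be the cumulative energy ratio of sample `i` at dimension `dimension_classes[k]`):

```python
import numpy as np

def pick_threshold(energy_ratio, dimension_classes, candidate_ratios):
    """For each candidate threshold, assign each sample the smallest class
    dimension whose cumulative energy ratio clears the threshold (falling
    back to the largest class), then keep the assignment with the lowest
    variance of per-class sample counts."""
    best_var, best_dims = np.inf, None
    for r in candidate_ratios:
        dims = []
        for ratios in energy_ratio:
            assigned = dimension_classes[-1]  # fall back to the largest class
            for d, ratio in zip(dimension_classes[:-1], ratios[:-1]):
                if ratio >= r:
                    assigned = d
                    break
            dims.append(assigned)
        counts = [dims.count(d) for d in dimension_classes]
        v = np.var(counts)
        if v < best_var:
            best_var, best_dims = v, dims
    return best_dims
```

Minimizing the count variance favors thresholds that populate every class, matching the heuristic that no assumed subspace should end up empty.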
