gmm_classify.m
function [Y_compute, Y_prob] = GMM_classify(para, X_train, Y_train, X_test, Y_test, num_class)
% GMM_classify: train one Gaussian mixture model (Netlab) per class on the
% training data, then assign each test sample to the class whose model
% gives it the highest likelihood.
Y_compute = zeros(size(Y_test));
Y_prob = zeros(size(Y_test));
if (isempty(X_train))
    fprintf('Error: The training set is empty!\n');
    return;
end
[class_set, num_class_train] = GetClassSet(Y_train);
if (nargin <= 5)
    % Default to the number of classes found in the training labels
    num_class = num_class_train;
end
% Parse the number of mixture components from the parameter string
p = str2num(char(ParseParameter(para, {'-NumMix'}, {'1'})));
num_mix = p(1);
num_feature = size(X_train, 2);
% Fix seeds for reproducible results
% (the 'state' syntax is deprecated; newer MATLAB releases use rng(42))
randn('state', 42);
rand('state', 42);
% Pre-allocate the per-class likelihood matrix
Y_prob_matrix = zeros(size(X_test, 1), num_class);
for i = 1:num_class
    % Select the training samples belonging to the i-th class
    data = X_train(Y_train == class_set(i), :);
    % Create a GMM with diagonal covariance matrices (Netlab)
    mix = gmm(num_feature, num_mix, 'diag');
    options = foptions;
    options(14) = 5;  % Just use 5 iterations of k-means in initialisation
    % Initialise the model parameters from the data
    mix = gmminit(mix, data, options);
    % Set up vector of options for EM trainer
    options = zeros(1, 18);
    options(1) = 0;   % Do not print error values during training
    options(14) = 20; % Maximum number of EM iterations
    [mix, options, errlog] = gmmem(mix, data, options);
    % Likelihood of each test sample under the i-th class model
    Y_prob_matrix(:, i) = gmmprob(mix, X_test);
end
% Assign each test sample to the class with the highest likelihood
[Y_prob, Index] = max(Y_prob_matrix, [], 2);
Y_compute = class_set(Index);
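A minimal usage sketch, assuming Netlab is on the MATLAB path along with this package's `ParseParameter` and `GetClassSet` helpers; the data below is purely illustrative (two well-separated Gaussian classes in 2-D):

```matlab
% Hypothetical data: 100 training and 20 test samples, 2 features
X_train = [randn(50, 2); randn(50, 2) + 3];
Y_train = [ones(50, 1); 2 * ones(50, 1)];   % class labels 1 and 2
X_test  = [randn(10, 2); randn(10, 2) + 3];
Y_test  = [ones(10, 1); 2 * ones(10, 1)];

% Train one single-component GMM per class and classify the test set;
% num_class is omitted, so it is derived from the training labels
[Y_compute, Y_prob] = GMM_classify('-NumMix 1', X_train, Y_train, X_test, Y_test);
```

`Y_compute` holds the predicted class label for each test sample, and `Y_prob` the likelihood of that sample under the winning class model.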