📄 calculate_prediction_accuracy.m
function accuracy = calculate_prediction_accuracy(sample1, features)
% CALCULATE_PREDICTION_ACCURACY  Estimate prediction accuracy for a feature subset.
%
% This is a sample/template: adapt it to each problem while keeping the same
% inputs and outputs (for example, replace logistic regression with a neural
% network, apply PCA if necessary, etc.).
%
%   sample1  - data matrix, one example per row, class label (1 or 2) in the last column
%   features - indices of the feature columns to use
%   accuracy - mean test-set accuracy over the random train/test splits
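%
% Usage sketch (the file name, variable name and feature indices below are
% hypothetical placeholders, not taken from the original source):
%   S   = load('mydata.mat');                         % S.X: examples x (features + target)
%   acc = calculate_prediction_accuracy(S.X, [1 3 5 7]);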
% Select the chosen feature columns and append the last column (the target variable).
sample = sample1(:, [features size(sample1, 2)]);
nc = size(sample, 2);   % index of the target column
% Transform the target representation: relabel class 2 as 0 so the classes are coded 0/1.
i = find(sample(:, nc) == 2);
sample(i, nc) = 0;
% Estimate the predictor's performance over n_iter random train/test splits.
% n_iter = 2 here to keep the sample fast; increase it (e.g. to 100) for a more stable estimate.
n_iter = 2;
mse = zeros(1, n_iter); R2 = zeros(1, n_iter); accuracy = zeros(1, n_iter);   % preallocate
for i = 1 : n_iter
    test = [];
    % Find the indices of the class-0 and class-1 examples.
    q0 = find(sample(:, nc) == 0);
    q1 = find(sample(:, nc) == 1);
    % Randomly set aside 10% of each class for the test set (stratified split).
    q0 = q0(randperm(length(q0)));
    q1 = q1(randperm(length(q1)));
    n0 = floor(0.1 * length(q0));
    test = [test; sample(q0(1 : n0), :)];
    n1 = floor(0.1 * length(q1));
    test = [test; sample(q1(1 : n1), :)];
    % Use the remaining examples of each class as the training set.
    train = sample(q1(n1 + 1 : length(q1)), :);
    train = [train; sample(q0(n0 + 1 : length(q0)), :)];
    % Record the training-set dimensions.
    [n_examples, n_cols] = size(train);
    n_features = n_cols - 1;
    % Normalize train and test separately; the test set is scaled with the
    % statistics returned from the training set (normalize is a project-specific
    % helper, not the MATLAB built-in; see the sketch after this function).
    [meanv, stdv, train(:, 1 : n_features)] = normalize(train(:, 1 : n_features), [], []);
    [meanv, stdv, test(:, 1 : n_features)]  = normalize(test(:, 1 : n_features), meanv, stdv);
    % Train and evaluate the predictor (classifierANN is a project-specific ANN
    % wrapper, not shown here; it is assumed to return MSE, R^2 and accuracy).
    data = [train; test];
    [mse(i), R2(i), accuracy(i)] = classifierANN(data);
end
accuracy = mean(accuracy);
return
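
% --------------------------------------------------------------------------
% Hedged sketch only: the original listing does not include `normalize`; the
% local function below is one plausible implementation inferred from the call
% sites above (empty meanv/stdv -> compute column means/stds and z-score X;
% otherwise reuse the supplied statistics). The real helper may differ.
function [meanv, stdv, Xn] = normalize(X, meanv, stdv)
if isempty(meanv) || isempty(stdv)
    meanv = mean(X, 1);            % per-feature mean of this data
    stdv  = std(X, 0, 1);          % per-feature standard deviation
end
stdv(stdv == 0) = 1;               % avoid division by zero for constant features
Xn = (X - repmat(meanv, size(X, 1), 1)) ./ repmat(stdv, size(X, 1), 1);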