lvq_dd.m
% W = lvq_dd(x,fracrej,N,learningrate,epochs)
%
% Try to detect outliers using LVQ. When labx is given, the LVQ tries
% more or less to avoid the objects with labx == -1.
%
% Problem is that in the new Matlab 6 neural network toolbox, the
% performance on the data is checked first. And if all data is from
% one class, we immediately classify everything correctly, and nothing
% is optimized further.
function W = lvq_dd(x,fracrej,N,learningrate,epochs)

% default parameters
if (nargin<5), epochs = 1000; end
if (nargin<4), learningrate = 0.05; end
if (nargin<3), N = 5; end

% testing phase: x holds the test objects and N the trained prototypes
if ((nargin==2) && (size(x,2)==size(N,2)))
   w = x;
   Dx = min(dist(N,w'),[],2);
   return;
end

% training phase
[nrx,dim] = size(x);
% NOTE: labx (labels +1 for target, -1 for outlier objects) is used below
% but is not among the inputs; it is presumably extracted from the dataset
% x in the original dd_tools code.
pc = [length(find(labx==1)) length(find(labx==-1))]/length(labx);
minmax = [min(x)' max(x)'];
if (pc(2)==0)
   % only target objects: a single class
   pc = 1.0;
   labx = ones(1,size(x,1));
else
   % map labels {+1,-1} to class indices {1,2} and one-hot encode them
   labx = ind2vec((3*ones(size(labx)) - labx)/2);
end
net = newlvq(minmax,N,pc);
net.trainParam.epochs = epochs;
net.trainParam.lr = learningrate;
net.trainParam.show = inf;
net = train(net,x',labx);

% extract the trained prototypes and their class assignments
w = net.IW{1,1};
clI = vec2ind(net.LW{2,1});
% remove prototypes that never moved away from the feature-space center
dw = (w - ones(size(w,1),1)*mean(minmax')).^2;
I = find(sum(dw,2)<0.001);
w(I,:) = [];
clI(I) = [];
% keep only target-class prototypes and score by nearest-prototype distance
w = w(clI==1,:);
Dx = min(dist(x,w'),[],2);
return
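The core idea of the testing phase above is simple: score each object by its Euclidean distance to the nearest target-class prototype, then reject the objects whose distance exceeds a threshold chosen so that a fraction `fracrej` of the training data would be rejected. A minimal sketch of that scoring step in Python (not the dd_tools implementation; function names and the quantile-based threshold are illustrative assumptions):

```python
import numpy as np

def prototype_distances(x, w):
    """Distance from each row of x to its nearest prototype in w,
    mirroring MATLAB's min(dist(x, w'), [], 2)."""
    # pairwise Euclidean distances, shape (n_objects, n_prototypes)
    d = np.linalg.norm(x[:, None, :] - w[None, :, :], axis=2)
    return d.min(axis=1)

def fit_threshold(train_dist, fracrej):
    """Pick a threshold so roughly a fraction fracrej of the training
    objects would be rejected (hypothetical stand-in for dd_tools'
    threshold-setting step)."""
    return np.quantile(train_dist, 1.0 - fracrej)

# usage: score synthetic target data against two fixed prototypes
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))            # target objects
w = np.array([[0.0, 0.0], [1.0, 1.0]])   # "trained" prototypes
d = prototype_distances(x, w)
thr = fit_threshold(d, fracrej=0.05)
accept = d <= thr                        # True = accepted as target
```

New objects are then accepted or rejected by comparing their nearest-prototype distance against the same fixed threshold.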