
📄 svr.m

📁 SVR program; it can be used directly and runs without errors. Some quantities, such as the MSE, are not computed by this function and must be coded separately; see the usage sketch after the listing below.
💻 M
function [nsv, beta, bias] = svr(X,Y,ker,kerOptions,C,loss,e)
%SVR Support Vector Regression
%
%  Usage: [nsv beta bias] = svr(X,Y,ker,kerOptions,C,loss,e)
%
%  Parameters: X          - Training inputs
%              Y          - Training targets
%              ker        - kernel function
%              kerOptions - kernel function parameters
%              C          - upper bound (non-separable case)
%              loss       - loss function
%              e          - insensitivity
%              nsv        - number of support vectors
%              beta       - Difference of Lagrange Multipliers
%              bias       - bias term
%
%  Author: Steve Gunn (srg@ecs.soton.ac.uk)

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% check correct number of arguments (X, Y, ker and kerOptions are required)
if (nargin < 4 | nargin > 7)
    help svr
    return
end

% fprintf('Support Vector Regressing ....\n')
% fprintf('______________________________\n')
n = size(X,1);

% defaults for the optional arguments of the 7-argument signature
if (nargin < 7)
    e = 0.0;
end
if (nargin < 6)
    loss = 'eInsensitive';
end
if (nargin < 5)
    C = Inf;
end

% tolerance for support vector detection
epsilon = svtol(C);

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% construct the kernel matrix
% fprintf('Constructing ...\n');
H = zeros(n,n);
for i=1:n
    for j=1:n
        H(i,j) = svkernel(ker,kerOptions,X(i,:),X(j,:));
    end
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% set up the parameters for the optimization problem
switch lower(loss)
    case 'einsensitive',
        Hb = [H -H; -H H];
        c  = [ (e*ones(n,1) - Y) ;
               (e*ones(n,1) + Y) ];
        vlb = zeros(2*n,1);  % set the bounds: alphas >= 0
        vub = C*ones(2*n,1); %                 alphas <= C
        x0  = zeros(2*n,1);  % the starting point is [0 0 0 ... 0]

        neqcstr = nobias(ker); % set the number of equality constraints (1 or 0)
        if neqcstr
            A = [ones(1,n) -ones(1,n)];
            b = 0; % set the constraint Ax = b
        else
            A = [];
            b = [];
        end

    case 'quadratic',
        Hb = H + eye(n)/(2*C);
        c  = -Y;
        vlb = -1e30*ones(n,1);
        vub =  1e30*ones(n,1);
        x0  = zeros(n,1); % the starting point is [0 0 0 ... 0]

        neqcstr = nobias(ker); % set the number of equality constraints (1 or 0)
        if neqcstr
            A = ones(1,n);
            b = 0; % set the constraint Ax = b
        else
            A = [];
            b = [];
        end

    otherwise
        fprintf('Error: Unknown Loss Function\n');
end

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% Add a small amount of zero order regularisation to avoid problems when the
% Hessian is badly conditioned. Rank is always less than or equal to n.
% Note that adding too much regularisation will perturb the solution.
Hb = Hb + 1e-10*eye(size(Hb));

% solve the optimisation problem
% fprintf('Optimizing ...\n');
st = cputime;
[alpha lambda how] = gunnqp(Hb, c, A, b, vlb, vub, x0, neqcstr);
% fprintf('Execution time : %4.1f seconds\n',cputime - st);
% fprintf('Status : %s\n',how);

switch lower(loss)
    case 'einsensitive',
        beta = alpha(1:n) - alpha(n+1:2*n);
    case 'quadratic',
        beta = alpha;
end
% fprintf('|w0|^2    : %f\n',beta'*H*beta);
% fprintf('Sum beta : %f\n',sum(beta));

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

% compute the number of support vectors
svi = find( abs(beta) > epsilon );
nsv = length( svi );
% fprintf('Support Vectors : %d (%3.1f%%)\n',nsv,100*nsv/n);

% implicit bias, b0
bias = 0;

% explicit bias, b0
if nobias(ker) ~= 0
    switch lower(loss)
        case 'einsensitive',
            % find bias from average of support vectors with interpolation
            % error e. SVs with interpolation error e have alphas: 0 < alpha < C
            svii = find( abs(beta) > epsilon & abs(beta) < (C - epsilon));
            if length(svii) > 0
                bias = (1/length(svii))*sum(Y(svii) - e*sign(beta(svii)) - H(svii,svi)*beta(svi));
            else
                fprintf('No support vectors with interpolation error e - cannot compute bias.\n');
                bias = (max(Y)+min(Y))/2;
            end
        case 'quadratic',
            bias = mean(Y - H*beta);
    end
end
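
A minimal usage sketch follows, showing how the MSE mentioned in the description could be computed from the function's outputs. It assumes training data trnX/trnY and test data tstX/tstY already exist in the workspace; the svr and svkernel calls follow the signatures used in svr.m, but the kernel name 'rbf' and the kerOptions value 1.0 are placeholders and should be replaced with whatever this package's svkernel actually supports.

% --- usage sketch (not part of svr.m) ---
C = 10;                 % upper bound on the Lagrange multipliers
e = 0.1;                % width of the epsilon-insensitive zone
ker = 'rbf';            % assumed kernel name; check svkernel for supported kernels
kerOptions = 1.0;       % assumed kernel parameter

% train the regressor
[nsv, beta, bias] = svr(trnX, trnY, ker, kerOptions, C, 'eInsensitive', e);

% predict on the test set: f(x) = sum_j beta_j * K(x, x_j) + bias,
% mirroring the svkernel call used inside svr.m
m = size(tstX,1);
n = size(trnX,1);
tstYpred = zeros(m,1);
for i = 1:m
    s = 0;
    for j = 1:n
        s = s + beta(j) * svkernel(ker, kerOptions, tstX(i,:), trnX(j,:));
    end
    tstYpred(i) = s + bias;
end

% mean squared error on the test targets (the MSE mentioned in the description)
MSE = mean((tstY - tstYpred).^2);

Summing over all training points is the simplest form; restricting the sum to the support vectors returned by svr (those with abs(beta) above the svtol(C) tolerance) gives essentially the same prediction and is faster when nsv is small.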
