ls_svm2.asv

A toolbox written in MATLAB, used for support vector machine learning.
% Least-squares support vector machine (LS-SVM) -- can be used for both classification and regression
% function [ output_args ] = SvmLs( input_args )
clc
clear
close all

m = 4;         % dimension of each sample
TrainSample = [1+rand(m,5), 2+rand(m,5), 3+rand(m,5), 4+rand(m,5)];
TrainTarget = [ones(1,5), 2*ones(1,5), 3*ones(1,5), 4*ones(1,5)];
TestSample  = [1+rand(m,5), 2+rand(m,5), 3+rand(m,5), 4+rand(m,5)];
TestTarget  = [ones(1,5), 2*ones(1,5), 3*ones(1,5), 4*ones(1,5)];

% Here each row of X is one sample and Y is a column vector of labels
X  = TrainSample';
Y  = TrainTarget';
Xt = TestSample';
Yt = TestTarget';

%===========================================================================
% Parameter initialization
type       = 'c';
gam        = 1;
sig2       = 1;
kernel     = 'RBF_kernel';
preprocess = 'preprocess';
codefct    = 'code_MOC';   % coding scheme that converts the multi-class problem into binary ones

%===========================================================================
[Yc, codebook, old_codebook] = code(Y, codefct)
% Encode and decode a multi-class classification task into multiple binary classifiers
%
%     1. For encoding:
%
% >> [Yc, codebook, old_codebook] = code(Y, codefct)
% >> [Yc, codebook, old_codebook] = code(Y, codefct, codefct_args)
% >> Yc = code(Y, given_codebook)
%
%       Outputs
%         Yc               : N x nbits encoded output classifier
%         codebook(*)      : nbits*nc matrix representing the used encoding
%         old_codebook(*)  : d*nc matrix representing the original encoding
%       Inputs
%         Y                : N x d matrix representing the original classifier
%         codefct(*)       : Function to generate a new codebook (e.g. code_MOC)
%         codefct_args(*)  : Extra arguments for codefct
%         given_codebook(*): nbits*nc matrix representing the encoding to use
%
% Different encoding schemes are available:
%
%     1. Minimum Output Coding (code_MOC)
%     2. Error Correcting Output Code (code_ECOC)
%        This coding scheme uses redundant bits.
%     3. One versus All Coding (code_OneVsAll)
%     4. One Versus One Coding (code_OneVsOne)
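% Illustrative sketch (the variables nClasses/nBits below are only for
% inspection and are not part of the original script): Minimum Output Coding
% represents nc classes with ceil(log2(nc)) binary outputs, so the four
% classes above should be encoded into two columns of +/-1 values.
nClasses = numel(unique(Y));      % 4 classes in this example
nBits    = ceil(log2(nClasses)); % number of bits used by code_MOC, here 2
size(Yc)                          % expected: 20 x nBits
codebook                          % expected: nBits x 4 matrix of class codes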
%===========================================================================
% Search for the optimal hyperparameters
%[gam, sig2, cost] = tunelssvm({X,Y,type,gam,sig2,kernel,preprocess})
%[model, cost] = tunelssvm(model)
% Tune the hyperparameters of the model with respect to the given performance measure
%
% >> [gam, sig2, cost] = tunelssvm({X,Y,type,igam,isig2,kernel,preprocess})
% >> [gam, sig2, cost] = tunelssvm({X,Y,type,igam,isig2,kernel,preprocess}, StartingValues)
% >> [gam, sig2, cost] = tunelssvm({X,Y,type,igam,isig2,kernel,preprocess},...
%                                          StartingValues, optfun, optargs)
% >> [gam, sig2, cost] = tunelssvm({X,Y,type,igam,isig2,kernel,preprocess},...
%                                          StartingValues, optfun, optargs, costfun, costargs)
%
%      Outputs
%        gam     : Optimal regularization parameter
%        sig2    : Optimal kernel parameter(s)
%        cost(*) : Estimated cost of the optimal hyperparameters
%      Inputs
%        X       : N x d matrix with the inputs of the training data
%        Y       : N x 1 vector with the outputs of the training data
%        type    : 'function estimation' ('f') or 'classifier' ('c')
%        igam    : Starting value of the regularization parameter
%        isig2   : Starting value of the kernel parameter(s) (bandwidth in the case of the 'RBF_kernel')
%        kernel(*)         : Kernel type (by default 'RBF_kernel')
%        preprocess(*)     : 'preprocess'(*) or 'original'
%        StartingValues(*) : Starting values of the optimization routine (or '[]')
%        optfun(*)         : Optimization function (by default 'gridsearch')
%        optargs(*)        : Cell with extra optimization function arguments
%        costfun(*)        : Function estimating the cost-criterion (by default 'crossvalidate')
%        costargs(*)       : Cell with extra cost function arguments

%===========================================================================
% Training
[alpha, b] = trainlssvm({X,Yc,type,gam,sig2,kernel,preprocess})
% Train the support values and the bias term of an LS-SVM for classification or function approximation
%
% >> model = trainlssvm(model)
% >> model = trainlssvm(model, X, Y)
%
%       Outputs
%         model          : Trained object oriented representation of the LS-SVM model
%       Inputs
%         model          : Object oriented representation of the LS-SVM model
%         X(*)           : N x d matrix with the inputs of the training data
%         Y(*)           : N x 1 vector with the outputs of the training data

%===========================================================================
% Prediction on the test set
Yd0 = simlssvm({X,Yc,type,gam,sig2,kernel}, {alpha,b}, Xt)
% Evaluate the LS-SVM at the given points
%
% >> [Yt, Zt, model] = simlssvm(model, Xt)
%
%       Outputs
%         Yt       : Nt x m matrix with predicted output of test data
%         Zt(*)    : Nt x m matrix with predicted latent variables of a classifier
%         model(*) : Object oriented representation of the LS-SVM model
%       Inputs
%         model    : Object oriented representation of the LS-SVM model
%         Xt       : Nt x d matrix with the inputs of the test data
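% Illustrative check (sketch; the size* variables are not part of the
% original script): the raw simlssvm output is still in the encoded +/-1
% format with one column per code bit, so it cannot be compared with the
% original labels in Yt until it has been decoded in the next step.
sizeEncoded = size(Yd0)   % expected: Nt x nbits, i.e. 20 x 2 for code_MOC here
sizeTargets = size(Yt)    % 20 x 1 vector of the original labels 1..4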
%===========================================================================
% Decode the predicted outputs back into the original label format
Yd = code(Yd0, old_codebook, [], codebook)
%     2. For decoding:
%
% >> Yd = code(Yc, old_codebook, [], codebook)
% >> Yd = code(Yc, old_codebook, [], codebook, codedist_fct)
% >> Yd = code(Yc, old_codebook, [], codebook, codedist_fct, codedist_args)
%
%       Outputs
%         Yd               : N x nc decoded output classifier
%       Inputs
%         Y                : N x d matrix representing the original classifier
%         codebook         : d*nc matrix representing the original encoding
%         old_codebook     : bits*nc matrix representing the encoding of the given classifier
%         codedist_fct     : Function to calculate the distance between two encoded classifiers (e.g. codedist_hamming)
%         codedist_args(*) : Extra arguments of codedist_fct

% Recognition rate on the test set (percentage of correctly decoded labels)
Percent = sum(Yd == Yt) / length(Yt) * 100
%===========================================================================
%===========================================================================
% Note: the following two ways of writing this are equivalent
% ============== 1 ==============
% [Yc,codebook,old_codebook] = code(Y, codefct)
% [alpha, b] = trainlssvm({X,Yc,type,gam,sig2,kernel,preprocess})
% Yd0 = simlssvm({X,Yc,type,gam,sig2,kernel}, {alpha,b}, Xt)
% Yd = code(Yd0,old_codebook,[],codebook)
% ============== 2 ==============
% model = initlssvm(X,Y,type,gam,sig2,kernel,preprocess)
% model = changelssvm(model,'codetype',codefct)
% model = trainlssvm(model)
% Yd = simlssvm(model, Xt)
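% Minimal sketch of variant 2 above, run on the same data (the call sequence
% is taken verbatim from the note; modelOO and Yd2 are illustrative names):
modelOO = initlssvm(X, Y, type, gam, sig2, kernel, preprocess);
modelOO = changelssvm(modelOO, 'codetype', codefct);
modelOO = trainlssvm(modelOO);
Yd2     = simlssvm(modelOO, Xt);   % decoded class labels, comparable with Yd above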
