
📄 flda.html

📁 A program for data clustering and pattern recognition; it should be applicable in both biochemistry and chemistry. Hope it is useful to everyone, and thanks for your support.
flda

PURPOSE
LDA Linear discriminant analysis

SYNOPSIS
function [newSampleIn, discrimVec] = lda(sampleIn, sampleOut, discrimVecNum)

DESCRIPTION
LDA Linear discriminant analysis
    Usage:
    [NEWSAMPLE, DISCRIM_VEC] = lda(SAMPLE_IN, SAMPLE_OUT, discrimVecNum)
    SAMPLE_IN: Sample data (each row of SAMPLE_IN is a sample point)
    SAMPLE_OUT: Class-membership matrix (each row holds a sample's
        membership grades over the classes, e.g. as returned by initfknn)
    discrimVecNum: No. of discriminant vectors
    NEWSAMPLE: new sample after projection

    Reference:
    J. Duchene and S. Leclercq, "An Optimal Transformation for
    Discriminant Principal Component Analysis," IEEE Trans. on
    Pattern Analysis and Machine Intelligence, Vol. 10, No. 6, November 1988

    Type "flda" for a self-demo.

CROSS-REFERENCE INFORMATION
This function calls:
    initfknn    Initialize fuzzy membership grades of sample output for fuzzy KNN.
This function is called by:
    fldainsel    LDA for input selection

SUBFUNCTIONS
    selfdemo

SOURCE CODE
function [newSampleIn, discrimVec] = lda(sampleIn, sampleOut, discrimVecNum)
%LDA Linear discriminant analysis
%    Usage:
%    [NEWSAMPLE, DISCRIM_VEC] = lda(SAMPLE_IN, SAMPLE_OUT, discrimVecNum)
%    SAMPLE_IN: Sample data (each row of SAMPLE_IN is a sample point)
%    SAMPLE_OUT: Class-membership matrix (each row holds a sample's
%        membership grades over the classes, e.g. as returned by initfknn)
%    discrimVecNum: No. of discriminant vectors
%    NEWSAMPLE: new sample after projection
%
%    Reference:
%    J. Duchene and S. Leclercq, "An Optimal Transformation for
%    Discriminant Principal Component Analysis," IEEE Trans. on
%    Pattern Analysis and Machine Intelligence,
%    Vol. 10, No. 6, November 1988
%
%    Type "flda" for a self-demo.

%    Roger Jang, 990829

if nargin<1, selfdemo; return; end
if nargin<3, discrimVecNum = size(sampleIn, 2); end

% ====== Initialization
n = size(sampleIn, 1);
m = size(sampleIn, 2);
A = sampleIn;
mu = mean(A);

% ====== Compute B and W
% ====== B: between-class scatter matrix
% ====== W: within-class scatter matrix
% MMM = \sum_k m_k*mu_k*mu_k^T
U = sampleOut';
count = sum(U, 2);    % Cardinality of each class
% Each row of MU is the mean of a class
MU = U*A./(count*ones(1, m));
MMM = MU'*diag(count)*MU;
W = A'*A - MMM;
B = MMM - n*mu'*mu;

% ====== Find the best discriminant vectors
invW = inv(W);
Q = invW*B;
D = [];
for i = 1:discrimVecNum,
    [eigVec, eigVal] = eig(Q);
    [maxEigVal, index] = max(abs(diag(eigVal)));
    D = [D, eigVec(:, index)];    % Each column of D is an eigenvector
    Q = (eye(m)-invW*D*inv(D'*invW*D)*D')*invW*B;
end
newSampleIn = A*D(:,1:discrimVecNum);
discrimVec = D;

%---------------------------------------------------
function selfdemo
% Self demo using the IRIS dataset
load iris.dat
sampleIn = iris(:, 1:end-1);
sampleOut = iris(:, end);
sampleFuzzyOut = initfknn(iris, 3);
newSampleIn = feval(mfilename, sampleIn, sampleFuzzyOut);
data = newSampleIn;
index1 = find(iris(:,5)==1);
index2 = find(iris(:,5)==2);
index3 = find(iris(:,5)==3);
figure;
plot(data(index1, 1), data(index1, 2), '*', ...
     data(index2, 1), data(index2, 2), 'o', ...
     data(index3, 1), data(index3, 2), 'x');
legend('Class 1', 'Class 2', 'Class 3');
title('LDA projection of IRIS data onto the first 2 discriminant vectors');
loo_error = knnrloo([data(:, 1:2) iris(:, end)]);
xlabel(['Leave-one-out misclassification count = ', int2str(loo_error)]);
axis equal; axis tight;

figure;
plot(data(index1, 3), data(index1, 4), '*', ...
     data(index2, 3), data(index2, 4), 'o', ...
     data(index3, 3), data(index3, 4), 'x');
legend('Class 1', 'Class 2', 'Class 3');
title('LDA projection of IRIS data onto the last 2 discriminant vectors');
loo_error = knnrloo([data(:, 3:4) iris(:, end)]);
xlabel(['Leave-one-out misclassification count = ', int2str(loo_error)]);
axis equal; axis tight;
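
USAGE EXAMPLE
Beyond the built-in self-demo, the routine can be called directly. The lines below are a minimal usage sketch, not part of the original file: they assume flda.m, initfknn.m and iris.dat are on the MATLAB path (as in selfdemo above), and the names featIn, fuzzOut, proj and vecs are illustrative. Although the function inside the file is declared as lda, MATLAB invokes it by its file name, flda (which is also why selfdemo calls feval(mfilename, ...)).

% Minimal usage sketch (illustrative names; assumes the toolbox files are on the path)
load iris.dat                                % 150-by-5 matrix: four features plus a class label
featIn  = iris(:, 1:end-1);                  % feature columns only
fuzzOut = initfknn(iris, 3);                 % fuzzy class-membership matrix, one row per sample
[proj, vecs] = flda(featIn, fuzzOut, 2);     % project onto the first two discriminant vectors
plot(proj(iris(:,end)==1, 1), proj(iris(:,end)==1, 2), '*', ...
     proj(iris(:,end)==2, 1), proj(iris(:,end)==2, 2), 'o', ...
     proj(iris(:,end)==3, 1), proj(iris(:,end)==3, 2), 'x');
legend('Class 1', 'Class 2', 'Class 3');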
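
NOTE ON THE SCATTER MATRICES
The vectorized lines W = A'*A - MMM and B = MMM - n*mu'*mu are a compact form of the usual within-class and between-class scatter matrices. The loop below is a sketch for checking that equivalence in the crisp-label case; it is not part of flda.m, and the names A, label, c and dev are illustrative.

% Loop-form check of the scatter matrices for crisp labels (assumes iris.dat is on the path)
load iris.dat
A = iris(:, 1:end-1);  label = iris(:, end);
m = size(A, 2);
c = max(label);                              % number of classes
mu = mean(A);                                % overall mean (1-by-m row vector)
W = zeros(m);  B = zeros(m);
for k = 1:c
    Ak  = A(label==k, :);                    % samples of class k
    muk = mean(Ak);                          % class mean
    dev = Ak - ones(size(Ak,1), 1)*muk;      % deviations from the class mean
    W = W + dev'*dev;                        % within-class scatter contribution
    B = B + size(Ak,1)*(muk - mu)'*(muk - mu);   % between-class contribution
end
% With a crisp 0/1 membership matrix passed as sampleOut, these match the
% vectorized W and B computed inside flda.m.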
