n_nvecs_doc.html
Generating the leading mode-n vectors

The leading mode-n vectors are the vectors that span the subspace of the mode-n fibers; in other words, they are the left singular vectors of the n-mode matricization of X.

Contents
- Using nvecs to calculate the leading mode-n vectors
- Using nvecs for the HOSVD

Using nvecs to calculate the leading mode-n vectors

The nvecs command efficiently computes the leading mode-n vectors.
rand('state',0);
X = sptenrand([4,3,2],6) %<-- A sparse tensor

X is a sparse tensor of size 4 x 3 x 2 with 6 nonzeros
    (1,2,1)    0.8385
    (2,3,1)    0.5681
    (3,2,1)    0.3704
    (3,3,1)    0.7027
    (4,2,2)    0.5466
    (4,3,2)    0.4449

nvecs(X,1,2) %<-- The 2 leading mode-1 vectors

ans =
    0.5810    0.7687
    0.3761   -0.5451
    0.7219   -0.3347
    0.0000    0.0000
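
As a cross-check of the definition above, the same vectors can be recovered (up to the sign of each column) from the SVD of the mode-1 matricization. A minimal sketch, assuming the Tensor Toolbox tenmat unfolding matches the one used by nvecs; Usvd, Ssvd, and X1 are illustrative names:

X1 = double(tenmat(full(X),1));   % mode-1 matricization as an ordinary matrix
[Usvd,Ssvd] = svd(X1);            % left singular vectors of X_(1)
Usvd(:,1:2)                       % compare with nvecs(X,1,2); columns may differ by a sign flip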

nvecs(X,1,3) %<-- The 3 leading mode-1 vectors

ans =
    0.5810    0.7687    0.0000
    0.3761   -0.5451   -0.0000
    0.7219   -0.3347   -0.0000
    0.0000   -0.0000    1.0000

nvecs(full(X),1,3) %<-- The same thing for a dense tensor

ans =
    0.5810    0.7687    0.0000
    0.3761   -0.5451    0.0000
    0.7219   -0.3347    0.0000
   -0.0000    0.0000    1.0000

The sparse and dense results agree up to the sign of each column, which is expected: singular vectors are only determined up to a sign flip.
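
A quick, sign-insensitive way to confirm the agreement numerically (a sketch; it assumes the columns come back in the same order in both calls):

norm(abs(nvecs(X,1,3)) - abs(nvecs(full(X),1,3)))   % expect a value near zero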

X = ktensor({rand(3,2),rand(3,2),rand(2,2)}) %<-- A random ktensor

X is a ktensor of size 3 x 3 x 2
    X.lambda = [ 1  1 ]
    X.U{1} =
        0.1365    0.1991
        0.0118    0.2987
        0.8939    0.6614
    X.U{2} =
        0.2844    0.9883
        0.4692    0.5828
        0.0648    0.4235
    X.U{3} =
        0.5155    0.4329
        0.3340    0.2259

nvecs(X,2,1) %<-- The 1 leading mode-2 vector

ans =
    0.7147
    0.6480
    0.2633

nvecs(full(X),2,1) %<-- Same thing for a dense tensor

ans =
    0.7147
    0.6480
    0.2633
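
One reason to call nvecs directly on the ktensor rather than on full(X) is that the factored form can be much cheaper to work with on larger problems. A sketch only, assuming nvecs exploits the ktensor structure; Xbig, Ubig, and the sizes are made up for illustration:

Xbig = ktensor({rand(500,5),rand(400,5),rand(300,5)});   % hypothetical larger ktensor
Ubig = nvecs(Xbig,1,3);   % 3 leading mode-1 vectors without forming full(Xbig)
size(Ubig)                % 500 x 3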

X = ttensor(tenrand([2,2,2,2]),{rand(3,2),rand(3,2),rand(2,2),rand(2,2)}); %<-- A random ttensor

nvecs(X,4,2) %<-- The 2 leading mode-4 vectors

ans =
    0.6725    0.7401
    0.7401   -0.6725

nvecs(full(X),4,2) %<-- Same thing for a dense tensor

ans =
    0.6725    0.7401
    0.7401   -0.6725
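
The vectors returned by nvecs are orthonormal (they are eigenvectors of a symmetric matrix), which is easy to check. A small sketch; U4 is an illustrative name:

U4 = nvecs(full(X),4,2);   % two leading mode-4 vectors of the dense version
U4'*U4                     % should be numerically close to eye(2)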

Using nvecs for the HOSVD

X = tenrand([4 3 2]) %<-- Generate data

X is a tensor of size 4 x 3 x 2
    X(:,:,1) =
        0.0272    0.6831    0.6085
        0.3127    0.0928    0.0158
        0.0129    0.0353    0.0164
        0.3840    0.6124    0.1901
    X(:,:,2) =
        0.5869    0.7176    0.4418
        0.0576    0.6927    0.3533
        0.3676    0.0841    0.1536
        0.6315    0.4544    0.6756

Requesting as many leading vectors as each mode's dimension (4, 3, and 2 here) and forming the core with ttm gives an exact higher-order SVD (HOSVD) of X:

U1 = nvecs(X,1,4); %<-- Mode 1
U2 = nvecs(X,2,3); %<-- Mode 2
U3 = nvecs(X,3,2); %<-- Mode 3
S = ttm(X,{pinv(U1),pinv(U2),pinv(U3)}); %<-- Core
Y = ttensor(S,{U1,U2,U3}) %<-- HOSVD of X

Y is a ttensor of size 4 x 3 x 2
    Y.core is a tensor of size 4 x 3 x 2
        Y.core(:,:,1) =
        1.9279    0.0684   -0.0009
        0.0669   -0.1193   -0.1012
       -0.0229    0.3216    0.0848
        0.0013    0.0852   -0.0282
        Y.core(:,:,2) =
        0.0560   -0.2314   -0.0911
       -0.2194    0.5059   -0.0932
       -0.0393    0.0200   -0.3189
       -0.1786   -0.0452    0.0974
    Y.U{1} =
        0.6809   -0.4132   -0.6031    0.0424
        0.3358    0.8973   -0.2230    0.1799
        0.1548   -0.1551    0.3452    0.9126
        0.6321    0.0064    0.6836   -0.3647
    Y.U{2} =
        0.4538    0.8780   -0.1521
        0.7111   -0.4597   -0.5319
        0.5370   -0.1332    0.8330
    Y.U{3} =
        0.5412    0.8409
        0.8409   -0.5412
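
Because the columns of U1, U2, and U3 are orthonormal, pinv(Ui) equals Ui', so the core can also be formed with transposed multiplies. A sketch, assuming ttm's transpose option ('t'); S2 is an illustrative name:

S2 = ttm(X,{U1,U2,U3},'t');   % multiply each mode by the transposed factor
norm(S2 - S)                  % expect a value near zero (roundoff only)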

norm(full(Y) - X) %<-- The full HOSVD reproduces X to machine precision

ans =
  1.2486e-015

Keeping only the two leading vectors in each mode gives a rank-(2,2,2) HOSVD approximation of X instead:

U1 = nvecs(X,1,2); %<-- Mode 1
U2 = nvecs(X,2,2); %<-- Mode 2
U3 = nvecs(X,3,2); %<-- Mode 3
S = ttm(X,{pinv(U1),pinv(U2),pinv(U3)}); %<-- Core
Y = ttensor(S,{U1,U2,U3}) %<-- Rank-(2,2,2) HOSVD approximation of X

Y is a ttensor of size 4 x 3 x 2
    Y.core is a tensor of size 2 x 2 x 2
        Y.core(:,:,1) =
        1.9279    0.0684
        0.0669   -0.1193
        Y.core(:,:,2) =
        0.0560   -0.2314
       -0.2194    0.5059
    Y.U{1} =
        0.6809   -0.4132
        0.3358    0.8973
        0.1548   -0.1551
        0.6321    0.0064
    Y.U{2} =
        0.4538    0.8780
        0.7111   -0.4597
        0.5370   -0.1332
    Y.U{3} =
        0.5412    0.8409
        0.8409   -0.5412
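
Since the truncated factors are orthonormal and Y is the orthogonal projection of X onto their span, the approximation error can also be read off from the core via the Pythagorean identity ||X - Y||^2 = ||X||^2 - ||S||^2. A sketch to verify this numerically; err_direct and err_core are illustrative names:

err_direct = norm(full(Y) - X);            % error computed directly
err_core   = sqrt(norm(X)^2 - norm(S)^2);  % error from the Pythagorean identity
[err_direct err_core]                      % the two numbers should agree to roundoff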

100*(1-norm(full(Y)-X)/norm(X)) %<-- Percentage of the norm of X explained by the approximation

ans =
   74.1571

Published with MATLAB(R) 7.2
