
📄 chain.m

📁 Markov MATLAB code: a detailed implementation of a Markov chain simulation procedure
%  chain.m simulates a Markov chain on the states {1, 2, ..., length(mu)}
%  given an initial distribution mu and a transition matrix P.
%  Requires rando.m, which samples an index from a discrete distribution.
%  You might want to set up another .m file that defines mu and P.
%  The program assumes that the states are labeled 1, 2, ...
%  Below is a sample, which you can use by uncommenting these lines:
% mu=[1 0 0];                              % initial distribution
% P=[[.6 .3 .1]; [.3 .3 .4]; [.4 .1 .5]]; % transition matrix
n=80;           % number of time steps to take
x=zeros(1,n+1); % clear out any old values
t=0:n;          % time indices
x(1)=rando(mu); % generate first x value (time 0, not time 1)
for i=1:n
  x(i+1) = rando(P(x(i),:)); % next state drawn from row x(i) of P
end
plot(t, x, '*');
axis([0 n 0 (length(mu)+1)]);
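The script depends on a helper `rando` that is not included in this listing. A minimal sketch of such a helper, assuming it takes a row vector of probabilities and returns a state index sampled from that distribution (the standard inverse-CDF trick via `cumsum`):

```matlab
function index = rando(p)
% RANDO  Draw a random index according to the discrete distribution p.
%   p is a row vector of nonnegative weights summing to 1;
%   index is returned with probability p(index).
u = rand;                        % uniform sample on (0,1)
index = find(cumsum(p) >= u, 1); % first entry of the CDF that exceeds u
end
```

Save this as rando.m on the MATLAB path; for example, `rando([.5 .5])` returns 1 or 2 with equal probability.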
