ann.mdl

A neural-network program written in MATLAB (a Simulink .mdl model file).
"its \n  of the hidden layer. This structure is used in an attempt to mirror "
"\n  the topology of the input manifold. <br>\n  As in the RAN, the learning a"
"lgorithm, in order to decrease the error, \n  changes weights, positions and "
"widths of the basis functions.\n  The estimation error e(k) is accumulated lo"
"cally to each neuron and \n  used to determine where (and if) to activate a n"
"ew neuron. <br> \n  DSC stands for Dynamic Cell Structure.<br>\n<br>\n  OUTPU"
"T EQUATION: <br>\n  At any given time t, if x(t) is the input vector, then th"
"e h-th output \n  of the neural network is : <br>\n  ys(h,t)=W(h,t)*g(x(t),S("
"h,t),C(h,t)) <br>\n  where W(h,t) is the output weight matrix related to the "
"h-th output, \n  g is the vector of radial basis functions of the input x(t),"
" and\n  finally S(h,t) and C(h,t) are vectors of widths and centers \n  (rela"
"tive to the h-th output).<br>\n<br>\n  STATE EQUATION (Learning Algorithm): <"
"br>\n  Being e(h,t)=y(h,t)-ys(h,t) the h-th element of the error vector, \n  "
"at a time t, and x(t) the input vector at the same time, we indicate \n  with"
" bmu the nearest unit (among those related to the h-th output) \n  to the cur"
"rent position of the input, and with sec is the nearest \n  unit after the bm"
"u. The neighborhood of the bmu is defined as the set of \n  all the units tha"
"t are connected to the bmu by the interlayer connection \n  matrix CN, that i"
"s all the units i such that CN(h,t,bmu,i) > 0. <br>\n  Firstly, the connectio"
"n matrix is updated, by setting to 1 the strength of \n  the connection betwe"
"en bmu and sec, ( that is CN(h,t+1,bmu,sec)=1 and \n  CN(h,t+1,sec,bmu)=1 ), "
"and by multiplying by a value alpha < 1 the \n  strength of all the other con"
"nections ( that is for every i,j <> bmu,sec \n  CN(h,t+1,i,j)=CN(h,t,i,j)*alp"
"ha ). Also, all the connections whose \n  strength is less than a threshold t"
"heta are deleted (i.e. their strength \n  is set to 0). This kind of updating"
" is also called \"Hebbian Learning\". <br>\n  The next step in the network ad"
"aptation algorithm consist in moving the \n  positions of the BMU and its nei"
"ghborhood toward the current input x(t), \n  following a so called \"Kohonen"
"\" rule. Specifically, if C(h,t,i) is\n  the position of the neuron i, relate"
"d to the output h, at time t,\n  then for each neuron i belonging to the neig"
"hborhood of the bmu \n  we have C(h,t+1,i)=epsilon(i)*(x(t)-C(h,t,i)). <br>\n"
"  Each neuron i is associated with a value called resource, R(h,t,i) and \n  "
"at this point in the algorithm, the resource of the bmu is updated, \n  speci"
"fically, the resource of the bmu is set to the error e(h,t) \n  divided by th"
"e number of times that the unit has been selected as bmu. <br>\n  If the mean"
" value of the resource of the whole network is greater than \n  a certain thr"
"eshold RsThr, and if the last neuron activation was more than \n  lambda step"
"s ago, then a new neuron is activated. <br>\n  The new neuron n is placed bet"
"ween the position of the unit with highest \n  resource w and the position of"
" the unit with highest resource within the \n  neighborhood of w, excluding w"
" itself, let us indicate it with v. \n  In detail, C(h,t+1,n)=C(h,t,w)+b*(C(h"
",t,v)-C(h,t,w)), where \n  b=R(h,t,w)/(R(h,t,w)+R(h,t,v)). <br>\n  The interl"
"ayer connections from w to n and from n to v are set to 1, \n  the original c"
"onnection between w and v is set to 0. Both resource and \n  weight of the ne"
"w neuron are computed by interpolating the resource and \n  weight of the two"
" neurons w and v:  <br> \n  R(h,t+1,n)=R(h,t,w)+b*(R(h,t,v)-R(h,t,w)) <br>\n "
" W(h,t+1,n)=W(h,t,w)+b*(W(h,t,v)-W(h,t,w)) <br>\n  The width of the basis fun"
"ction of n is set to \n  S(h,t,n)=overlap*(C(h,t,v)-C(h,t,w)), where overlap "
"is the so called\n  overlapping factor. <br>\n  Finally, as a last step of th"
"e adaptation algorithm, the vector X(t)\n  containing all the neural network "
"weights and widths,\n  is updated according to the gradient rule: <br>\n  X(t"
"+T)=X(t)-eta*(dys/dX)*e(t) <br>\n  where eta is the learning rate, dys/dX is "
"a jacobian matrix,\n  and T is the sampling time.\n</p>\n<p>\n  The final mas"
"k parameter is the sampling time of the block, T.\n</p>\n<p>\n  This block is"
" implemented with the matlab s-function dcsg5.m\n  to use it you should have "
"that file, (or its p-coded  version dcsg5.p),\n  in the matlab path. <br>\n  "
"For further reference see some papers on DSC Networks.<br>\n</p>\n<p>\n  Giam"
"piero Campa, June 4 2003\n</p>"
      MaskPromptString	      "[Ni No]|[Nmax Overlap RsThr Lambda Alpha Theta]"
"|[Epsb Epsn etaW etaS etaC]|Initial Condition, size = (Nmax*(Nmax+Ni+5)+2)*No"
"|Sample Time"
      MaskStyleString	      "edit,edit,edit,edit,edit"
      MaskTunableValueString  "on,on,on,on,on"
      MaskCallbackString      "||||"
      MaskEnableString	      "on,on,on,on,on"
      MaskVisibilityString    "on,on,on,on,on"
      MaskVariables	      "Dim=@1;norlat=@2;eta=@3;S=@4;T=@5;"
      MaskInitialization      "if prod(size(Dim))==1,Dim=[Dim 1]; end"
      MaskIconFrame	      on
      MaskIconOpaque	      on
      MaskIconRotate	      "none"
      MaskIconUnits	      "autoscale"
      MaskValueString	      "[4 1]|[50     0.2        0.01    600       0.99"
"    0.005]|[0.03  0.003  0.1  0.001   0 ]|0|0.05"
      Port {
	PortNumber		1
	Name			"nyn"
	TestPoint		off
	RTWStorageClass		"Auto"
      }
      System {
	Name			"DCS (Matlab)"
	Location		[82, 283, 391, 459]
	Open			off
	ModelBrowserVisibility	off
	ModelBrowserWidth	200
	ScreenColor		"automatic"
	PaperOrientation	"landscape"
	PaperPositionMode	"auto"
	PaperType		"usletter"
	PaperUnits		"inches"
	ZoomFactor		"100"
	AutoZoom		on
	Block {
	  BlockType		  Inport
	  Name			  "x"
	  Position		  [30, 53, 60, 67]
	  Port			  "1"
	  PortWidth		  "-1"
	  SampleTime		  "-1"
	  DataType		  "auto"
	  SignalType		  "auto"
	  Interpolate		  on
	}
	Block {
	  BlockType		  Inport
	  Name			  "e"
	  Position		  [30, 78, 60, 92]
	  Port			  "2"
	  PortWidth		  "-1"
	  SampleTime		  "-1"
	  DataType		  "auto"
	  SignalType		  "auto"
	  Interpolate		  on
	}
	Block {
	  BlockType		  Inport
	  Name			  "LE"
	  Position		  [30, 103, 60, 117]
	  Port			  "3"
	  PortWidth		  "-1"
	  SampleTime		  "-1"
	  DataType		  "auto"
	  SignalType		  "auto"
	  Interpolate		  on
	}
	Block {
	  BlockType		  Demux
	  Name			  "Demux1"
	  Ports			  [1, 2, 0, 0, 0]
	  Position		  [210, 45, 215, 120]
	  BackgroundColor	  "black"
	  ShowName		  off
	  Outputs		  "[Dim(2) (2+norlat(1)*(norlat(1)+Dim(1)+5))*"
"Dim(2)]"
	}
	Block {
	  BlockType		  Mux
	  Name			  "Mux5"
	  Ports			  [3, 1, 0, 0, 0]
	  Position		  [90, 45, 95, 125]
	  ShowName		  off
	  Inputs		  "[Dim(1) Dim(2) 1]"
	  DisplayOption		  "bar"
	}
	Block {
	  BlockType		  "S-Function"
	  Name			  "S-Function"
	  Ports			  [1, 1, 0, 0, 0]
	  Position		  [120, 61, 185, 109]
	  FunctionName		  "dcsg5"
	  Parameters		  "Dim,norlat,eta,S,T"
	  PortCounts		  "[]"
	  SFunctionModules	  "''"
	  MaskIconFrame		  on
	  MaskIconOpaque	  on
	  MaskIconRotate	  "none"
	  MaskIconUnits		  "autoscale"
	}
	Block {
	  BlockType		  Outport
	  Name			  "ys"
	  Position		  [235, 58, 265, 72]
	  Port			  "1"
	  OutputWhenDisabled	  "held"
	  InitialOutput		  "[]"
	}
	Block {
	  BlockType		  Outport
	  Name			  "X"
	  Position		  [235, 93, 265, 107]
	  Port			  "2"
	  OutputWhenDisabled	  "held"
	  InitialOutput		  "[]"
	}
	Line {
	  SrcBlock		  "LE"
	  SrcPort		  1
	  DstBlock		  "Mux5"
	  DstPort		  3
	}
	Line {
	  SrcBlock		  "e"
	  SrcPort		  1
	  DstBlock		  "Mux5"
	  DstPort		  2
	}
	Line {
	  SrcBlock		  "x"
	  SrcPort		  1
	  DstBlock		  "Mux5"
	  DstPort		  1
	}
	Line {
	  SrcBlock		  "S-Function"
	  SrcPort		  1
	  DstBlock		  "Demux1"
	  DstPort		  1
	}
	Line {
	  SrcBlock		  "Mux5"
	  SrcPort		  1
	  DstBlock		  "S-Function"
	  DstPort		  1
	}
	Line {
	  SrcBlock		  "Demux1"
	  SrcPort		  2
	  DstBlock		  "X"
	  DstPort		  1
	}
	Line {
	  SrcBlock		  "Demux1"
	  SrcPort		  1
	  DstBlock		  "ys"
	  DstPort		  1
	}
      }
    }
    Block {
      BlockType		      SubSystem
      Name		      "DCSL (Matlab)"
      Ports		      [3, 2, 0, 0, 0]
      Position		      [265, 369, 335, 451]
      BackgroundColor	      "darkGreen"
      ShowPortLabels	      on
      MaskType		      "LDCS NN"
      MaskDescription	      " Self Adaptive Discrete Time Piecewise Linear D"
"CS Neural Network"
      MaskHelp		      "<p>\n  This Neural Network is used to adaptivel"
"y approximate\n  a vector field y=f(x), where x(t) is a vector of size Ni, \n"
"  and y(t) is a vector of size No.\n</p>\n<p>\n  The first input is x (normal"
"ly but not necessarily \n  scaled between -1 and 1).<br>\n  The second input "
"is the error signal (i.e. e=y-ys).<br>\n  The third input is the learning ena"
"ble:\n       with LE=1 the learning is enabled, \n       with LE=0 the learni"
"ng is disabled.\n</p>\n<p>\n  The first output is the learned function ys(x)."
" <br>\n  The second output is the states matrix reshaped columnwise.\n  Note "
"that the network has in total No*(Nmax*(Nmax+Ni+5)+2) states,\n  where Nmax i"
"s the maximum number of neurons per output.\n</p>\n<p>\n  The first parameter"
" in the mask is a vector containing Ni and No, \n  namely the dimensions (num"
"ber of elements) of x and y.\n</p>\n<p>\n  The second parameter is a vector c"
"ontaining:<br>\n  1) Nmax : the maximum number of active neurons for a single"
" output. \n      The total maximum number of active neurons in the whole netw"
"ork \n      is this value multplied by the number of outputs, that is Nmax*No"
". <br>\n  2) Resource Threshold, a new neuron is added only if the \n      re"
"source is greater than than this threshold. <br>\n  3) Lambda, is the number "
"of steps between insertion,\n      a new neuron is activated only if the numb"
"er of steps \n      from the last activation is greater than lambda. <br>\n  "
"4) Alpha, is the connection weight decay constant\n      for the neighborhood"
" of the best matching unit.<br>\n  5) Theta is the connection deletion thresh"
"old, that is when \n      the weight falls below this threshold, it is set to"
" 0, \n      so the neuron is deactivated.\n</p>\n<p>\n  The third parameters "
"is a 3 elements vector containing : <br>\n  1) The two Kohonen update coeffic"
"ients for the best \n  matching unit and its neighborhood <br>\n  2) The lear"
"ning rate for all the neurons weights.\n</p>\n<p>\n  The initial condition mu"
"st be a vector of size \n  No*(Nmax*(Nmax+Ni+4)+2). If its norm is zero, then"
" an \n  appropriate initial condition (two neurons near zero) is chosen.\n</p"
">\n<p>\n  STATE VECTOR MEANING: <br>\n  Each output h is related to a contigu"
"ous vector of Nmax*(Nmax+Ni+5)+2 \n  states that are organized as follows: <b"
"r>\n  ofY + 1 counter :  counts sampling times since last neuron activation. "
"<br>\n  ofY + 1 + [1..Nmax*Nmax] : interlayer connection matrix. <br>\n  ofY "
"+ 1 + Nmax*Nmax + [1..Nmax*Ni] : neurons centers <br>\n  ofY + 1 + Nmax*(Nmax"
"+Ni) + [1..Nmax]: neurons weigths <br>\n  ofY + 1 + Nmax*(Nmax+Ni+1) + [1..Nm"
"ax] : neurons cumulative resource <br>\n  ofY + 1 + Nmax*(Nmax+Ni+2) + [1..Nm"
"ax] : neurons resource<br>\n  ofY + 1 + Nmax*(Nmax+Ni+3) + [1..Nmax] : number"
" of times that each neuron has been bmu <br>\n  ofY + 1 + Nmax*(Nmax+Ni+4) + "
"1 : number of active neurons <br>\n  where ofY=(h-1)*(Nmax*(Nmax+Ni+4)+2) is "
"the offset related to the h-th output.<br>\n  It is important to note that th"
"e states related to a certain output\n  are independent from the states relat"
"ed to a different output.\n</p>\n<p>\n  BRIEF EXPLANATION OF THE ALGORITHM: <"
"br>\n  This Neural Network is essentially a piecewise linear network with an "
"\n  additional lateral connection structure between the neural units \n  of t"
"he hidden layer. This structure is used in an attempt to mirror \n  the topol"
"ogy of the input manifold. <br>\n  The learning algorithm, in order to decrea"
"se the error, \n  changes weights and positions of the basis functions.\n  Th"
"e estimation error e(k) is accumulated locally to each neuron and \n  used to"
" determine where (and if) to activate a new neuron. <br> \n  DSC stands for D"
"ynamic Cell Structure.<br>\n<br>\n  OUTPUT EQUATION: <br>\n  At any given tim"
"e t, if x(t) is the input vector, we indicate \n  with bmu the nearest unit ("
"among those related to the h-th output) \n  to the current position of the in"
"put, and with sec is the nearest \n  unit after the bmu. The neighborhood of "
"the bmu is defined as the set of \n  all the units that are connected to the "
"bmu by the interlayer connection \n  matrix CN, that is all the units i such "
"that CN(h,t,bmu,i) > 0. <br>\n  Being W(h,t,bmu) and W(h,t,sec) the weights a"
"ssociated with the \n  bmu and the sec units, and dist(bmu,x) and dist(sec,x)"
" their distances \n  from the current input point x(t), then the h-th output "
"of the neural \n  network is :<br>\n  ys(h,t)=W(h,t,bmu) if dist(sec,x)>dist("
"sec,bmu) <br>  \n  ys(h,t)=W(h,t,bmu)+b*(W(h,t,sec)-W(h,t,bmu)) otherwise <br"
\n  where b=dist">
">\n  where b=dist(bmu,x)/(dist(bmu,x)+dist(sec,x)). <br>\n<br>\n"
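\n  where b=dist">
"  A minimal MATLAB sketch of this output rule (hypothetical variable names, \n"
"  not the actual dcslin4.m code): <br>\n<pre>\n"
"  % C: Ni-by-N centers of the active units; W: 1-by-N weights; x: Ni-by-1\n"
"  d = sqrt(sum((x - C).^2, 1));   % distances from x to every center\n"
"  [~, ix] = sort(d);\n"
"  bmu = ix(1);  sec = ix(2);      % nearest and second-nearest units\n"
"  if d(sec) > norm(C(:,sec) - C(:,bmu))    % x lies beyond the bmu\n"
"      ys = W(bmu);\n"
"  else\n"
"      b  = d(bmu) / (d(bmu) + d(sec));\n"
"      ys = W(bmu) + b * (W(sec) - W(bmu)); % linear interpolation\n"
"  end\n"
"</pre>\n"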
"ON (Learning Algorithm): <br>\n  As a first step, the connection matrix is up"
"dated, by setting to 1 the \n  strength of the connection between bmu and sec"
", ( that is \n  CN(h,t+1,bmu,sec)=1 and CN(h,t+1,sec,bmu)=1 ), and by multipl"
"ying by a \n  value alpha < 1 the strength of all the other connections ( tha"
"t is for \n  every i,j <> bmu,sec CN(h,t+1,i,j)=CN(h,t,i,j)*alpha ). \n  Also"
", all the connections whose strength is less than a threshold theta \n  are d"
"eleted (i.e. their strength is set to 0). \n  This kind of update law is also"
" called \"Hebbian Learning\". <br>\n  The next step in the network adaptation"
" algorithm consist in moving the \n  positions of the BMU and its neighborhoo"
"d toward the current input x(t), \n  following a so called \"Kohonen\" rule. "
"Specifically, if C(h,t,i) is\n  the position of the neuron i, related to the "
"output h, at time t,\n  then for each neuron i belonging to the neighborhood "
"of the bmu \n  we have C(h,t+1,i)=epsilon(i)*(x(t)-C(h,t,i)). <br>\n  Each ne"
"uron i is associated with a value called resource, R(h,t,i) and \n  at this p"
"oint in the algorithm, the resource of the bmu is updated, \n  specifically, "
"the resource of the bmu is set to the error e(h,t),\n  divided by the number "
"of times that the unit has been selected as bmu. \n  Note that e(h,t)=y(h,t)-"
"ys(h,t) is the h-th element of the error vector. <br>\n  If the mean value of"
" the resource of the whole network is greater than \n  a certain threshold Rs"
"Thr, and if the last neuron activation was more than \n  lambda steps ago, th"
"en a new neuron is activated. <br>\n  The new neuron n is placed between the "
"position of the unit with highest \n  resource w and the position of the unit"
" with highest resource within the \n  neighborhood of w, excluding w itself, "
"let us indicate it with v. \n  In detail, C(h,t+1,n)=C(h,t,w)+b*(C(h,t,v)-C(h"
",t,w)), where \n  b=R(h,t,w)/(R(h,t,w)+R(h,t,v)). <br>\n  The interlayer conn"
"ections from w to n and from n to v are set to 1, \n  the original connection"
" between w and v is set to 0. Both resource and \n  weight of the new neuron "
"are computed by interpolating the resource and \n  weight of the two neurons "
"w and v:  <br> \n  R(h,t+1,n)=R(h,t,w)+b*(R(h,t,v)-R(h,t,w)) <br>\n  W(h,t+1,"
"n)=W(h,t,w)+b*(W(h,t,v)-W(h,t,w)) <br>\n  The width of the basis function of "
"n is set to \n  S(h,t,n)=overlap*(C(h,t,v)-C(h,t,w)), where overlap is the so"
" called\n  overlapping factor. <br>\n  Finally, as a last step of the adaptat"
"ion algorithm, the vector W(t)\n  containing all the neural network weights i"
"s updated according to \n  the gradient rule: <br>\n  W(t+T)=W(t)-eta*(dys/dW"
")*e(t) <br>\n  where eta is the learning rate, dys/dW is the jacobian matrix,"
"\n  and T is the sampling time.\n</p>\n<p>\n  The final mask parameter is the"
" sampling time of the block, T.\n</p>\n<p>\n  This block is implemented with "
"the matlab s-function dcslin4.m\n  to use it you should have that file, (or i"
"ts p-coded  version dcslin4.p),\n  in the matlab path. <br>\n  For further re"
"ference see some papers on DSC Networks.<br>\n</p>\n<p>\n  Giampiero Campa, J"
"une 10 2003\n</p>"
      MaskPromptString	      "[Ni No]|[Nmax RsThr Lambda Alpha Theta]|[Epsb E"
"psn etaW]|Initial Condition , size=(Nmax*(Nmax+Ni+4)+2)*No|Sample Time"
      MaskStyleString	      "edit,edit,edit,edit,edit"
      MaskTunableValueString  "on,on,on,on,on"
      MaskCallbackString      "||||"
      MaskEnableString	      "on,on,on,on,on"
      MaskVisibilityString    "on,on,on,on,on"
      MaskVariables	      "Dim=@1;nrlat=@2;eta=@3;S=@4;T=@5;"
      MaskInitialization      "if prod(size(Dim))==1,Dim=[Dim 1]; end"
      MaskIconFrame	      on
      MaskIconOpaque	      on
      MaskIconRotate	      "none"
      MaskIconUnits	      "autoscale"
      MaskValueString	      "[4 1]|[50     0.01    600       0.99    0.005]|"
