<title>Bayesian Self-Organizing Map Simulation using Java Applet</title>
<body bgcolor="#ddeeee">
<h1>Bayesian Self-Organizing Map Simulation (Ver. 1.3)</h1>

<p>The <i>Bayesian self-organizing map</i> (BSOM) is a method for estimating the probability distribution that generates data points, on the basis of a Bayesian stochastic model. It can also be regarded as a learning method for a kind of neural network. The black dots in the figure below denote artificially generated data points. The blue circles denote the centroids of a BSOM model, which are the parameters specifying the configuration of the component distributions. Initially the centroids are positioned randomly. The blue links between the centroids represent a predetermined topology of the model, which imposes a constraint on the estimation of the parameters. In the Bayesian framework, such a constraint is expressed as a prior probability over the parameters and is used to stabilize the estimation. In the present simulation, a line-segment topology is used.

<p>This applet searches for the maximum <i>a posteriori</i> (MAP) estimates of the centroid parameters using an <i>expectation-maximization</i> (EM) algorithm. You can start the algorithm by pressing the `learn' button, and re-initialize the centroids randomly with the `init' button.

<p>The BSOM model has a pair of hyperparameters, alpha and beta, which represent the strength of the topological constraint and the estimated noise level in the data, respectively. You can vary them using the sliders. Observe how the centroid configuration changes with the hyperparameter values to get a feel for their meaning, and then try to find the values that give the best configuration. Note that the configuration depends not only on the current hyperparameter values but also on their history: moving the hyperparameters carelessly will leave the model in a poor locally optimal configuration. The BSOM can also search for good hyperparameter values by itself; this is activated by pressing the `auto' button. (A code sketch of one EM iteration is given just before the applet below.)

<p>You can vary the distribution of the artificial data using the sliders named `width', `height', `phase' and `noise level'. You can also change the number of centroids by entering a number in the #unit field and pressing the return key.

<hr>
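<p>For readers who want to see the shape of the update without reading the applet's source (linked under `Sources' below), here is a minimal Java sketch of one EM iteration for the MAP estimate. It assumes equal mixing weights, a spherical Gaussian noise model with precision beta, and a first-difference Gaussian smoothness prior of strength alpha along the line-segment (chain) topology; the class and method names (<tt>BsomEmSketch</tt>, <tt>emStep</tt>) are illustrative only and are not part of the applet.

<pre>
import java.util.Random;

/**
 * Illustrative sketch of one EM iteration for MAP estimation of BSOM
 * centroids (not the applet's source code).  Assumptions: equal mixing
 * weights, spherical Gaussian noise with precision beta, and a
 * first-difference smoothness prior of strength alpha on a chain topology.
 */
public class BsomEmSketch {

    /** data[i][d]: N points; w[j][d]: M centroids (updated in place). */
    static void emStep(double[][] data, double[][] w, double alpha, double beta) {
        int n = data.length, m = w.length, dim = data[0].length;

        // E-step: responsibility r[i][j] of centroid j for data point i.
        double[][] r = new double[n][m];
        for (int i = 0; i < n; i++) {
            double[] logp = new double[m];
            double max = Double.NEGATIVE_INFINITY;
            for (int j = 0; j < m; j++) {
                double d2 = 0;
                for (int d = 0; d < dim; d++) {
                    double diff = data[i][d] - w[j][d];
                    d2 += diff * diff;
                }
                logp[j] = -0.5 * beta * d2;
                if (logp[j] > max) max = logp[j];
            }
            double sum = 0;
            for (int j = 0; j < m; j++) { r[i][j] = Math.exp(logp[j] - max); sum += r[i][j]; }
            for (int j = 0; j < m; j++) r[i][j] /= sum;
        }

        // M-step: for each coordinate d solve (beta*G + alpha*L) w_d = beta * R^T x_d,
        // where G = diag(sum_i r[i][j]) and L is the graph Laplacian of the chain
        // (the line-segment topology), which comes from the smoothness prior.
        double[][] a = new double[m][m];
        for (int j = 0; j < m; j++) {
            double g = 0;
            for (int i = 0; i < n; i++) g += r[i][j];
            a[j][j] = beta * g + 1e-9;                          // small ridge for safety
            a[j][j] += alpha * ((j == 0 || j == m - 1) ? 1 : 2);
            if (j + 1 < m) { a[j][j + 1] = -alpha; a[j + 1][j] = -alpha; }
        }
        for (int d = 0; d < dim; d++) {
            double[] b = new double[m];
            for (int j = 0; j < m; j++)
                for (int i = 0; i < n; i++) b[j] += beta * r[i][j] * data[i][d];
            double[] wd = solve(a, b);                          // A is copied inside solve
            for (int j = 0; j < m; j++) w[j][d] = wd[j];
        }
    }

    /** Gaussian elimination with partial pivoting; inputs are not modified. */
    static double[] solve(double[][] aIn, double[] bIn) {
        int m = bIn.length;
        double[][] a = new double[m][];
        for (int i = 0; i < m; i++) a[i] = aIn[i].clone();
        double[] b = bIn.clone();
        for (int k = 0; k < m; k++) {
            int p = k;
            for (int i = k + 1; i < m; i++) if (Math.abs(a[i][k]) > Math.abs(a[p][k])) p = i;
            double[] tr = a[k]; a[k] = a[p]; a[p] = tr;
            double tb = b[k]; b[k] = b[p]; b[p] = tb;
            for (int i = k + 1; i < m; i++) {
                double f = a[i][k] / a[k][k];
                for (int j = k; j < m; j++) a[i][j] -= f * a[k][j];
                b[i] -= f * b[k];
            }
        }
        double[] x = new double[m];
        for (int i = m - 1; i >= 0; i--) {
            double s = b[i];
            for (int j = i + 1; j < m; j++) s -= a[i][j] * x[j];
            x[i] = s / a[i][i];
        }
        return x;
    }

    public static void main(String[] args) {
        // Toy run: noisy arc-shaped data, 10 randomly initialized centroids.
        Random rng = new Random(0);
        double[][] data = new double[200][2];
        for (int i = 0; i < data.length; i++) {
            double t = Math.PI * rng.nextDouble();
            data[i][0] = Math.cos(t) + 0.05 * rng.nextGaussian();
            data[i][1] = Math.sin(t) + 0.05 * rng.nextGaussian();
        }
        double[][] w = new double[10][2];
        for (double[] wj : w) { wj[0] = rng.nextGaussian(); wj[1] = rng.nextGaussian(); }
        for (int it = 0; it < 50; it++) emStep(data, w, 1.0, 100.0);
        for (double[] wj : w) System.out.println(wj[0] + " " + wj[1]);
    }
}
</pre>

<p>The point to notice is that the topological constraint enters only through the alpha*L term in the M-step: a large alpha pulls neighbouring centroids together, while alpha = 0 decouples them completely.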
<applet archive="bsom1.zip" code="BsomDemo1.class" width=420 height=450>
<param name="scale" value="70">
<param name="xd" value="210">
<font color="#ff0000"><h1>Your browser cannot run Java applets.</h1></font>
So I cannot show my demonstration to you. Please use the newest version of Netscape Navigator or another browser (Internet Explorer, HotJava, etc.) that can run Java applets.
The newest version of Netscape Navigator can be obtained from the site below:<br>
<A HREF="http://home.netscape.com/comprod/mirror/index.html"><IMG SRC="../../gif/netnow3.gif" BORDER=1 WIDTH=88 HEIGHT=31></A><br>
The newest version of Internet Explorer can be obtained from the site below:<br>
<A HREF="http://www.microsoft.com/ie/download/"><img src="../../gif/btn_download.gif"></a>
<p>Here, I show static figures instead. The first figure shows an initial state and the second a stationary state.
<p><img src="za.gif"><img src="zb.gif">
</applet>

<hr>
<h2>Notes</h2>

<p>(1) The values displayed above the sliders are all relative. The sliders for the hyperparameters are on log scales.

<p>(2) You should start learning from a high value of alpha and a low value of beta; otherwise the BSOM will fall into an entangled configuration. If it does become entangled, you can simplify the configuration again by increasing alpha or decreasing beta. Alpha and 1/beta correspond to the temperature of a physical system, and the strategy of avoiding poor local-optimum traps by slowly lowering the temperature from a high-temperature state is called <i>simulated annealing</i>.

<p>(3) The automatic hyperparameter search may fail if alpha is too large or beta is too small at the start. In that case, slightly decreasing alpha or increasing beta may lead to a successful search. However, when the noise level is too high, the BSOM gives up trying to detect a signal and makes its configuration as simple as possible (i.e., a straight line segment).

<p>(4) You can move a centroid directly by dragging it with the mouse.

<p>(5) Pressing the `density' button displays the estimated density in gray scale.

<p>(6) I strongly recommend using a browser with a just-in-time (JIT) compiler.

<hr>
<h2>Related Models and Methods</h2>

<p>When alpha is fixed at an infinitely large value, BSOM behaves like <i>principal component analysis</i> (PCA). On the other hand, when alpha is fixed at zero (i.e., the topological constraint is ignored), BSOM reduces to <i>clustering analysis</i> based on a spherical Gaussian mixture model. Moreover, when beta is infinitely large, BSOM is almost the same as the <i>k-means</i> algorithm, vector quantization (VQ) and competitive learning; a small code sketch of this limit is appended at the end of this page. Thus BSOM can be regarded as an intermediate method between PCA and clustering. The <i>elastic net</i> is also an estimation algorithm for BSOM, based on gradient ascent, though I used an EM algorithm here.

<hr>
<h2><a href="http://www.aist.go.jp/NIBH/~b0616/Lab/BSOM1/sources.html">Sources</a></h2>

<hr>
<h2>Plan</h2>
<ul>
<li> Support for user-supplied data.
<li> Extension to higher dimensions.
</ul>

<hr>
<h2>References</h2>
A. Utsugi (1996) <a href="http://www.aist.go.jp/NIBH/~b0616/tssom.html">"Topology selection for self-organizing maps"</a>, Network: Computation in Neural Systems, vol. 7, no. 4 (in press)<br>
A. Utsugi (1997) <a href="http://www.aist.go.jp/NIBH/~b0616/hssom.html">"Hyperparameter selection for self-organizing maps"</a>, Neural Computation, vol. 9, no. 3 (in press)

<hr>
<a href="http://www.aist.go.jp/NIBH/~b0616/Lab/index-e.html">Back to Lab</a><br>
<a href="http://www.aist.go.jp/NIBH/~b0616/index.html">Go to my home page</a>
<p>Akio Utsugi <address><a href="mailto:utsugi@nibh.go.jp">utsugi@nibh.go.jp</a></address>
<p>Jun 24 1996 created<br>
Nov 12 1996 updated
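<p><b>Appendix.</b> As a companion to the EM sketch above and to the `Related Models and Methods' section: a minimal illustration, using the same data/centroid array conventions, of why letting beta grow without bound (with alpha fixed at zero) turns the soft E-step into a hard nearest-centroid assignment, i.e. the familiar k-means update. The class and method names are again illustrative only.

<pre>
/** Companion sketch: the beta -&gt; infinity, alpha = 0 limit of the EM step. */
class KmeansLimitSketch {

    /** As beta grows, each point's responsibility concentrates on its nearest
     *  centroid, so the M-step with alpha = 0 becomes the k-means mean update. */
    static void kmeansStep(double[][] data, double[][] w) {
        int m = w.length, dim = data[0].length;
        double[][] sum = new double[m][dim];
        int[] count = new int[m];
        for (double[] x : data) {
            int best = 0;
            double bestD2 = Double.POSITIVE_INFINITY;
            for (int j = 0; j < m; j++) {
                double d2 = 0;
                for (int d = 0; d < dim; d++) {
                    double diff = x[d] - w[j][d];
                    d2 += diff * diff;
                }
                if (d2 < bestD2) { bestD2 = d2; best = j; }
            }
            count[best]++;                          // hard assignment (limit of the E-step)
            for (int d = 0; d < dim; d++) sum[best][d] += x[d];
        }
        for (int j = 0; j < m; j++)                 // mean of assigned points (M-step)
            if (count[j] > 0)
                for (int d = 0; d < dim; d++) w[j][d] = sum[j][d] / count[j];
    }
}
</pre>
</body>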