⭐ 欢迎来到虫虫下载站! | 📦 资源下载 📁 资源专辑 ℹ️ 关于我们
⭐ 虫虫下载站

📄 tutor_nn_excute.m

📁 这是个神经网络工具箱的一点MATALAB 程序
💻 M
字号:
%   File Name      : tutor_nn_excute.m
%   Purpose        : excuting the commands associated with the buttons of the
%                    RNN tutorial window 
%   Author         : Hossam E. Mostafa Abdelbaki, School of Computer Science, 
%                    University of Centeral Florida (UCF). 
%   Release        : ver. 1.0.
%   Date           : October 1998.
%
%       RNNSIM is a software program available to the user without any 
%   license or royalty fees. Permission is hereby granted to use, copy, 
%   modify, and distribute this software for any purpose. The Author 
%   and UCF give no warranty, express, implied, or statuary for the 
%   software including, without limitation, waranty of merchantibility 
%   and warranty of fitness for a particular purpose. The software 
%   provided hereunder is on an "as is"  basis, and the Author and the 
%   UCF has no obligation to provide maintenance, support, updates, 
%   enhancements, or modifications. 
%
%       RNNSIM  is available for any platform (UNIX, PCWIN, MACHITOCH). 
%   It runs under MATLAB ver. 5.0 or highrer. 
%
%       User feedback, bugs, or software and manual suggestions can 
%   be sent via electronic mail to :   ahossam@cs.ucf.edu

function tutor_nn_excute(dir)

  RnnTutorFigHndl = findobj('Tag','RnnTutorFig');
  RnnTutorFigST1Hndl = findobj(RnnTutorFigHndl,'Tag','RnnTutorFigST1');
  RnnTutorFigPrevPB = findobj(RnnTutorFigHndl,'Tag','RnnTutorFigPrevPB');
  RnnTutorFigNextPB = findobj(RnnTutorFigHndl,'Tag','RnnTutorFigNextPB');
  UD_PB1 = str2num(get(RnnTutorFigPrevPB,'UserData'));
  set(RnnTutorFigST1Hndl,'FontSize',13 );
  set(RnnTutorFigST1Hndl,'HorizontalAlignment','left');

  switch dir
   
     case  'Next'  
        UD_PB1 = UD_PB1 + 1;
        
     case 'Prev'  
        UD_PB1 = UD_PB1 - 1;
        
     case 'Quit'  
        close(RnnTutorFigHndl);
        return
  end   
  
  if (UD_PB1 < -1)
     UD_PB1 = -1;
  end
  
  if(UD_PB1 >= 12)
     UD_PB1 = 12;
  end
  
  qq = num2str(UD_PB1);
  set(RnnTutorFigPrevPB,'UserData', qq);
  
  if(UD_PB1 == -1 )
     set(RnnTutorFigPrevPB, 'Enable', 'off');
     set(RnnTutorFigNextPB, 'Enable', 'on');
     
     tutorStr = ...
        ['                                   RNN TUTORIAL                 '];      
     set(RnnTutorFigST1Hndl,'String',tutorStr);
  end
  
  if(UD_PB1 == 0 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   1/11                       '      
        '     The RNN model allows arbitrary interconnections          '
        '     between neurons. It has a mathematical proof that        '
        '     it has a unique solution if the stability conditions are '
        '     met. By appropriately mapping  external signals and      ' 
        '     neuron states into certain physical quantities, it has   '
        '     been applied successfully  to several practical problems '   
        '     The algorithm implemented in this program is based       '
        '     on the following publication:                            '
        '                                                              '
        '     E. Gelenbe "Learning in the recurrent random             '
        '     network", Neural Computation, 5, pp 154-164, 1993.       ']; 
     set(RnnTutorFigST1Hndl,'String',tutorStr);
  end
  
  
  if(UD_PB1 == 1 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                               2/11                     '      
        'Difference between the Random Neural Network (model/learning  algorithm)'  
        'and other supervised Neural Network (models/learning algorithms):       '                           
        '     ==============================================================     '
        '1- Its representation for the artificial neurons is more close to the   '
        '   biophysical neurons. Where a neuron fires a train of impulses along  '
        '   its axon when it is exited and after some time the neuron may fire   '
        '   another train of pulses as a result of the same excitation.          '
        '   (see Kandel, Schwartz,  Principle of neural science , Elsevier,      '
        '   Amsterdam, 1995.                                                     '
        '                                                                        '
        '2- It has a general learning algorithm that can be applied to the feed  '
        '    forward model ad the recurrent model.                               '
        '                                                                        '
        '3- It needs more time during training than other models but once the    '
        '   network is trained it becomes very fast (faster than the other       '
        '   trained NN models) since there is no calculation of a non linear     '
        '   function in the Random Neural Network ( RNN ) model.                 '];
     set(RnnTutorFigST1Hndl,'Fontsize',10);
     set(RnnTutorFigST1Hndl,'String',tutorStr);
  end
     
  if(UD_PB1 == 2 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
     ['                                                 3/11                   '      
      '4- The trained network has simple hardware implementation.              '
      '                                                                        '
      '5- All the weights resulting from training of the RNN are unsigned      '
      '   numbers  and this eliminates problems in the hardware implementation '
      '   of negative weights. Most NN models do not put restriction on the    '
      '   sign or value of the weights. This limitation is hardly noticeable in'
      '   computer simulation but it becomes a critical issue when it comes to '
      '   dedicated hardware implementation, be they digital or analog.        '
      '  (see: P.H. Graf and L.D. Jackel., "Analog electronic neural network   '
      '   circuits," IEEE circuits and Devices magazine, pp.  44-55, July 1989 '
      '  T. Kohonen, Self Organization and Associative Memory. Springer Verlag,'
      '  Berlin, 1989.)                                                        '];
   set(RnnTutorFigST1Hndl,'Fontsize',10);
   set(RnnTutorFigST1Hndl,'String',tutorStr);
  end
  
  if(UD_PB1 == 3 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   4/11                       '      
        '     You can learn more about  the RNN from  the              '
        '     following puplications :                                 '
        '      ============================                            '
        '       A- THE BASIC RNN THEORY                                '  
        '      ============================                            ' 
        '      1- E. Gelenbe, "Random neural networks with             '
        '         positive and negative signals and product            '
        '         form solution," Neural Computation, Vol. 1,          '
        '         No. 4, pp 502-510, 1989.                             '];
     set(RnnTutorFigST1Hndl,'String',tutorStr);  
  end
  
  if(UD_PB1 == 4)
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   5/11                  '      
        '2-  E. Gelenbe,  "Stability of the random neural network '
        '    model," Neural Computation, Vol. 2, No. 2,           '
        '    pp 239-247, 1990.                                    ' 
        '3-  E. Gelenbe, A. Stafylopatis, "Global behavior of     '  
        '     homogenous random neural systems,"  Applied         ' 
        '     Mathematical Modelling, Vol. 15,  pp 534-541, 1991. '
        '4-  E. Gelenbe "Learning in the recurrent random         '
        '    network", Neural Computation, 5, pp 154-164, 1993.   '];
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
  end
  
  if(UD_PB1 == 5)
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   6/11                  '      
        '      ================================                   '
        '       B- APPLICATIONS IN OPTIMIZTION                    '
        '      ================================                   ' 
        '1-  E. Gelenbe, V. Koubi, and F. Pekergin "Dynamical     '
        '        random neural network approach to the traveling  '
        '        salesman problem," ELEKTRIK, Vol. 2, No. 2,      '
        '        pp l-10, 1994.                                   ' 
        '2-  A. Ghanwani, "A qualitative comparison of neural     '
        '       network models applied to the vertex covering     '
        '       problem, ELECTRIK, Vol. 2, No. 2, pp 1l-18, 1994. '];
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
  end
  if(UD_PB1 == 6 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   7/11                   '      
        '3- A. Ghanwani, "Neural networks for network optimization '
        '        Master thesis, Duke University, 1995.             '
        '4- E. Gelenbe "Learning in the recurrent random network", '
        '    Neural Computation, 5, pp 154-164, 1993.              '
        '  ========================                                '
        '   C- TEXTURE GENERATION                                  '
        '  ========================                                '
        '1- V. Atalay, E. Gelenbe, and N. Yalabik "The random      '
        '       neural network model for texture generation,"      '
        '       International Journal of Pattern Recognition and   ' 
        '       Artificial Intelligence, Vol. 6(1),pp 131-141, 1992'];
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
  end
  
  if(UD_PB1 == 7 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   8/11                   '      
        '2- V. Atalay, and E. Gelenbe "Parallel algorithm for      '
        '    colour texture generation using the random neural     '
        '    network model." International Journal of Pattern      '
        '    Recognition and Artificial and-telligence, Vol. 6,    '
        '    No. 2 & 3, pp 437-446, 1992.                          '
        '  ============================                            '
        '   C- FUNCTION APPROXIMATION                              ' 
        '  ===========================                             '
        '1- E. Gelenb, Z. H. Mao, and Y. D. Li, "Function          '
        '    approximation with random neural network," to appear  '
        '    in IEEE Trans. on Neural Networks Nov. 1998.          '];           
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
  end
  
  if(UD_PB1 == 8 )
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   9/11                   '      
        '  ==============================                          '
        '  D- IMAGE AND VIDEO COMPRESSION                          ' 
        '  ==============================                          '
        '1- E. Gelenbe, C. Cramer, and M. Sungur,"Random           '
        '    neural network learning and image compression,"       '
        '     Proceedings of the IEEE International Conference     '
        '     on Neural Networks, pp. 3996-3999, 1994              '
        '2- E. Gelenbe, C. Cramer, and M. Sungur, P. Gelenbe       '
        '     "Traffic and video quality in adaptive neural        '
        '     compression," Multimedia Systems, Vol. 4, pp.        '
        '     357-369, 1996.                                       ']; 
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
  end
  
  if(UD_PB1 == 9)
     set(RnnTutorFigPrevPB,'Enable','on');
     set(RnnTutorFigNextPB,'Enable','on');
     tutorStr = ...
       ['                                   10/11                  '      
        '3- C. Cramer, E. Gelenbe, and H. Bakircioglu "Low bit     ' 
        '     rate video compression with neural networks and      '
        '     temporal subsampling," Proceedings of the IEEE,      '
        '      Vol. 84, No. 10, pp. 1529-1543, October 1996.       '
        '4- E. Gelenbe, T. Feng, and K.R.R. Krishnan "Neural       '
        '       network methods for volumetric magnetic resonance  '
        '       imaging of the human brain," Proceedings of the    '
        '       IEEE, Vol. 84, No. 10, pp. 1488-1496, October 1996.']; 
     set(RnnTutorFigST1Hndl,'String',tutorStr);     
    end
    
    if(UD_PB1 == 10)
       set(RnnTutorFigPrevPB,'Enable','on');
       set(RnnTutorFigNextPB,'Enable','off');
       tutorStr = ...
         ['                                   11/11                  '      
          '       =================                                  '
          '            E- BOOKS                                      ' 
          '       =================                                  '
          '1- E. Gelenbe (ed. and co-author), "Neural Networks:      '
          '       Advances and Applications I, North-Holland Pub.    '
          '       Co. (Amsterdam), 1991.                             '
          '2- E. Gelenbe (ed. and co-author), "Neural Networks:      '
          '       Advances and Applications II, North-Holland Pub.   '
          '       Co. (Amsterdam), 1992.                             '];
       set(RnnTutorFigST1Hndl,'String',tutorStr);     
    end
    

   

   
  

⌨️ 快捷键说明

复制代码 Ctrl + C
搜索代码 Ctrl + F
全屏模式 F11
切换主题 Ctrl + Shift + D
显示快捷键 ?
增大字号 Ctrl + =
减小字号 Ctrl + -