📄 nn_var

# neural net variables
#
# file to be read by commander.p
# each line is turned into a structure entry        (str)
#       a default-setting entry                     (def)
#       a usage-printing entry                      (usg)
#       and a command line reading entry            (clr)
# of these, the S, U and C can be disabled by >No S, etc.
# and the D can be disabled by -.
#
# variable          type deflt     flag          short_name  extra_action    long_name
#
nc->report          d    1         -nnr          report      -               reporting style
nc->verbose         d    1         -nnv          verbose     -               verbosity 0/1/2
nc->decverbose      d    1         -nndv         decverbose  -               verbosity in decoding
nc->decwrite        d    0         -nndwrite     verbosity   -               write decoding info to file
nc->decfile         s    -         -nndecfile    file        nc->decwrite=2; file for decoding info
#
>PU   NLNE; fprintf( fp, " Neural net initialization:");
>PS  /* Neural net initialization */
#
#  int write ; /* if write == 2 then default outfile is overridden */
#  int read ;  /* if read  == 2 then default infile is overridden */
#
nc->read            d    0         -nnread       read        -               whether to read wts
nc->infile          s    -         -nnin         infile      nc->read=2;     weights from (instead of default)
nc->outfile         s    -         -nnout        outfile     nc->write=2;    weights out (instead of default)
nc->init_rule       d    1         -nninit       rule        -               how to init wts
nc->def_w           f    1.0       -nndef_w      def_w       -               default initial weight
nc->def_b           f    0.0       -nndef_b      def_b       -               default initial bias
nc->sigma_w0        f    0.3       -nnsigmaw0    sigma       -               initial random wts
nc->wseed           ld   2489      -nnwseed      wseed       -               weight randomization
#
>PU   NLNE; fprintf( fp, " Neural net training:");
>PS  /* About training */
#
nc->train           d    0         -nntrain      train       -               whether to train
nc->train_n         d    100       -nnn          n           -               training number
nc->test_n          d    1000      -nntn         n           -               test number
nc->trseed          ld   4896      -nntrseed     trseed      -               defines training set
nc->teseed          ld   126999    -nnteseed     teseed      -               test set
nc->regularize      d    1         -nnregularize r           -               type of regularization 0/1/2
>No S
nc->alpha[1]        f    0.000001  -nna1         a1          -               regularization of bias
nc->alpha[2]        f    0.01      -nna2         a2          -               regularization of inps
nc->alpha[3]        f    20.0      -nna3         a3          -               regularization of 2nd type inps
>S
#   regularization constants:
#   bias is 1, inputs are 2
#   initial runs gave sigma1 = 35 and s2 = 0.31
#
>PU  NLNE; fprintf( fp, " Neural net optimizer:");
>PS  /* Optimization procedure */
#
nc->opt             d    2         -nnopt        opt         -               macopt1 or 2
nc->LOOP            d    5         -nnloops      loops       -               Number of macopt runs
nc->itmax           d    100       -nnitmax      itmax       -               max no line searches
nc->tolmin          f    0.00001   -nntolmin     tolmin      -               final tolerance in training
nc->tol0            f    0.1       -nntol0       tol0        -               initial tolerance in training
nc->rich            d    0         -nnrich       rich        -               expensive optimizer?
nc->end_on_step     d    1         -nneos        eos         -               termination condition is that step is small
nc->CG              d    0         -cg           cg          -               whether to check gradient, on how many
nc->epsilon         f    0.0001    -nneps        epsilon     -               epsilon for check gradient
nc->evalH           d    1         -nnevalH      evalH       -               evaluate hard performance measures
#
>PU  NLNE; fprintf( fp, " Neural net decoding procedure:");
>PS  /* Neural net decoding procedure */
#
nc->hitlist_policy  d    2         -nnhp         hp          -               1=if threshold exceeded; 2=sort
nc->hitlist_thresh  f    0.99      -nnhpt        t           -               hitlist threshold
nc->hitlist_n       d    10        -nnhpn        n           -               number to aim to hit
nc->hitlist_low     f    0.5       -nnhpl        l           -               -
nc->decodits        d    10        -nndecodits   its         -               max number of iterations to do when decoding
nc->decodn          d    1000      -nndecodn     n           -               number of examples to try to decode
nc->decodseed       ld   126999    -nndecodseed  seed        -               seed for decoding tests
#
>PS  /* various leftovers */
#
>No CU
nc->write           d    1         -             -           -               whether to write weights
nc->writeit         d    0         -             -           -               undefined weight writing flag
nc->RC              d    2         -             -           -               no of reg classes
nc->tolf            f    0.5       -             -           -               factor by which tolerance decreases
nc->tol             f    0.01      -             -           -               tolerance in training
nc->itEH            d    -1        -             -           -               iterative decoder's error
nc->itEHwb          d    -1        -             -           -               iterative decoder's error (whole blocks)
>CU
>No S
net->thresh         f    0.5       -nnthresh     thresh      -               hard decision boundary
>S
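The header comments above say that commander.p turns each declaration line into four pieces of generated code: a structure entry (str), a default-setting entry (def), a usage-printing entry (usg) and a command-line-reading entry (clr), with >No S etc. suppressing individual pieces and - suppressing the default. As a rough illustration only, the C sketch below shows what those four pieces could look like for two of the variables above, nc->report and nc->def_w. The struct name nn_control, the function names and the usage format are assumptions made for this sketch; they are not taken from commander.p or from the rest of the source.

/* Hypothetical sketch only: names and layout are assumed, not generated
 * by commander.p.  It illustrates the str/def/usg/clr entries that the
 * header of nn_var describes, for nc->report and nc->def_w.            */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct {                 /* (str) structure entries              */
    int    report;               /* nc->report : type d, default 1       */
    double def_w;                /* nc->def_w  : type f, default 1.0     */
} nn_control;

static void set_defaults(nn_control *nc) {     /* (def) default setting  */
    nc->report = 1;
    nc->def_w  = 1.0;
}

static void usage(FILE *fp) {                  /* (usg) usage printing   */
    fprintf(fp, "  -nnr     report  reporting style         (default 1)\n");
    fprintf(fp, "  -nndef_w def_w   default initial weight  (default 1.0)\n");
}

static void read_command_line(nn_control *nc, int argc, char **argv) {
    /* (clr) command-line reading: the flag column selects the member    */
    for (int i = 1; i < argc - 1; i++) {
        if      (strcmp(argv[i], "-nnr")     == 0) nc->report = atoi(argv[++i]);
        else if (strcmp(argv[i], "-nndef_w") == 0) nc->def_w  = atof(argv[++i]);
    }
}

int main(int argc, char **argv) {
    nn_control nc;
    set_defaults(&nc);                    /* defaults first ...           */
    read_command_line(&nc, argc, argv);   /* ... then flags override them */
    if (argc == 1) usage(stderr);
    printf("report = %d, def_w = %g\n", nc.report, nc.def_w);
    return 0;
}

Under these assumptions, a call such as prog -nnr 2 -nndef_w 0.5 would override both defaults, which is the behaviour the flag and deflt columns of the table describe.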
