
todo

Lin-Kernighan heuristic for the TSP and minimum weight perfect matching
Medium term (post-proposal, pre-writing thesis):
------------------------------------------------
Add depth logging to match, or factor it out from jbmr and then add to
both jbmr and match.

Still need to do non-metric inspired instances.

Medium long term (post-writing, pre-defense):
---------------------------------------------

Long term (post-defense):
-------------------------
Factor out options into an lkoption object.  This should elide many of the
spurious reasons for including lk.h.  That should help in moving to
a library, and it should reduce make times when lk.w changes.

Implement Martin-Otto-Felten-style chained optimization (maybe)
(randomized acceptance of worse tours).

Optional:
---------
Fix tspreorder so it works on all edge types (can't reorder upper row matrices
right now...).  Maybe use a fixed-up TSP.pm.
    Update: I'm fixing up TSP.pm.  It is functional enough to create a simple
    tsplib2rothberg.pl.

For multiple runs, DSJ uses randomized construction heuristics.  See
email from him August 1 1998.
    Update: With randomized construction routines, this isn't so critical,
    so I downgraded its priority.

Insert code into match.w so that we can get graphs out of it.

Upgrade nn so it does nq. (Done)
    Be smarter about it, e.g. don't search saturated quadrants.

Write an edge length distribution sampler.
    This might not be as good as I think, because the clones are
    not that interesting, visually anyway (for now).

Convert to use HAVE_* macros. (partially done)

Write a self-check target for lk.

Deprecated items:
-----------------
Write better scripts for generating graphs.
    1998/10/17.  See data dir, *.pl, and gen*.pl in this dir.

Create a script to generate all the data in the data/ directory
according to the rules given in my methodology section.

Write a script to perform Held-Karp computations on the instances
for which I haven't got values from elsewhere.
    1998/10/17.  See data dir, *.pl, and gen*.pl in this dir.

Non-metric instance data.
    1998/10/17.  DSJRandom now included; see data dir.

Adapt others' code for max-weight matching to min-weight matching.
Must learn primal-dual.
    Update: Rothberg's code does min-weight and max-weight matching, but it
    uses type int as the length type.
    Done. 1998/10.  Watch out in Rothberg's code for 2 args vs. 3 args to
    perfect_match.
    Even better, get blossom4 from William Cook & Andre Rohe.  That works
    much better and faster.

Figure out that fscking GEO distance function.  Check the reference
implementations.  (A hedged C sketch of the TSPLIB formula appears a few
items below.)
    Done. 1998/10.  Got in touch with DSJ about DSJ_RANDOM, too,
    and made its behaviour the same as in Concorde, w.r.t. seeds, etc.

Fix Held-Karp to try to use the nnlists for all but the last
iteration.  This affects decluster (MST computation) too.
    Done. 1998/10/16.  I also tuned parameters.

Implement randomized construction routines.  See mail from DSJ Aug 1 1998.
    1998/8/9: Done.  Now I don't have to rely on tspreorder to transform
    instances.  But it was trickier than I thought it would be.

Right now I do all possibilities for t6, t7, t8 when t1 through t6 is not
a valid 3-change.  JBMR don't: they do only the first valid one (feasible
and cumgain).  See the Johnson and McGeoch chapter, page 257, first full
paragraph.  So my code potentially does more work than theirs.  I must
fix that with a compile-time option.
    Update: 1998/8/8 (lk-0.4.6).  This is a command-line option:
    --no-extra-backtrack (the default), and
    --extra-backtrack.
    For a -c nq 10 -i 100 run (40 quadrant neighbours, ILK(100)) on dsj1000,
    the optimization phase is 2-8 times faster, with better answers
    (sometimes 2 times closer to optimal).  Sounds good.  But variance in
    run times is still moderate: up to 25% on different tspreorder'ed runs.
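A note for the GEO item above: a minimal sketch of the geographical distance
as defined by TSPLIB, assuming DDD.MM coordinates (x = latitude, y =
longitude).  The names geo_radians and geo_dist are invented for this note and
are not part of lk; whether the degree part is truncated (as shown) or rounded
to the nearest integer seems to be exactly the detail the reference
implementations disagree on.

    #include <math.h>

    #define GEO_PI  3.141592
    #define GEO_RRR 6378.388        /* idealized Earth radius, in km */

    /* Convert a TSPLIB DDD.MM coordinate to radians.  Truncation of the
     * degree part is shown here; the TSPLIB write-up rounds it instead. */
    static double geo_radians(double x)
    {
        int deg = (int) x;
        double min = x - deg;
        return GEO_PI * (deg + 5.0 * min / 3.0) / 180.0;
    }

    /* Geographical distance in km between node i (xi = latitude coordinate,
     * yi = longitude coordinate) and node j, per the TSPLIB definition. */
    int geo_dist(double xi, double yi, double xj, double yj)
    {
        double lat_i = geo_radians(xi), lon_i = geo_radians(yi);
        double lat_j = geo_radians(xj), lon_j = geo_radians(yj);
        double q1 = cos(lon_i - lon_j);
        double q2 = cos(lat_i - lat_j);
        double q3 = cos(lat_i + lat_j);
        return (int) (GEO_RRR *
                      acos(0.5 * ((1.0 + q1) * q2 - (1.0 - q1) * q3)) + 1.0);
    }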
Modify all code so it can handle arbitrary instances.
    Now LK runs from start to finish with arbitrary instances supported by
    read (i.e. LOWER_DIAG_ROW, UPPER_ROW, EXPLICIT, EUC_2D, CEIL_2D).
    1998/4/16.

Finish jitter.
    src/jitter.w

Write shake.
    src/shake.w

Generate Bentley distributions.
    script/tspbgen.pl.in

Run experiments with --maxdepth 50.
    This is now supported in lkdoit via --lk-option.

Generate TSP instances based on distribution of edge lengths.
    DONE.  They all look the same: very sparse, fractal dimension 1.1?

Switch over from LINEAR to SPLAY in the tabu check.
    In light of declustering this might not matter much.

Implement CEIL_2D, and check that kdtree is ok with it.
Implement supports_kd_tree.
    DONE October 1997.  Both work.

Write to log: cost(t[1],t[2]) and decluster_d(t[1],t[2]).
    DONE.  It appears variation in runtimes is mainly
    due to start tour variation, and these effects might be
    exacerbated by very clustered instances.

Be smarter about which city we next choose for t[1]:
    - most recently changed
    - longest edge remaining among dirty cities
    - edge with longest excess over cluster distance (this might be great!)
    - edge with longest excess over nearest neighbour (this might be better...)
    In light of the above, this might not matter much.

Dust off HK code from 1994.
    Done.  It works with read, error, memory as in LK.
    Now integrate it with lk?
    Add better option handling.

Graft Held-Karp code onto lk, or write a script to get its output and
dump it into an answer database.
    Integrate:
        prim.w
        hk.w
        ascend.w
        ee.w
        branch.w
        kruskal.w
        upper.w
    Namespace clean:
        prim
    Sequential only:
        prim
    This item has been finished.  The output isn't pretty, but it's functional.

Write a perl script to combine results into a latex table.
    Done.
    See script/least.pl for getting the least length among runs on an instance.
    See script/results.pl for generating a table.  It's semi-automatic.

Extend expt.pl to allow for clones (perhaps a parameter inside expt.data).
    Done.  See also the ``require'' directive for expt scripts.

Adapt match for graphs with an odd number of vertices (see DIMACS1).
    Done.  See src/read.w.

Release the code as lk-0.1.0.
    Update: Released as lk-0.4.0.
    See http://www.cs.utoronto.ca/~neto/research/lk

Check the distribution on IRIX.
    Use the native C compiler on IRIX.
    (CC=cc ./configure; also, make sure we aren't using COFF.)

Check the distribution on cygwin32.
    (I get an internal compiler error with beta 18.  Did not investigate.)
    Update: Under beta 19, it compiles once I change the return type of
    getpagesize to size_t, as it properly should be.

JBMR and Cook et al. use a FIFO for the dirty/active cities.  Do the same,
with a compile-time option for the previous behaviour.
    Update: done in lk-0.4.5.

ILK for matching.
    Done.  match.w version 1.12.
    Iterating LK for matching doesn't seem to improve solutions.  But it
    *does* take longer.... :-)

Use hash tables (even broken ones) for the TABU check. (maybe)
    ->Update: Cook et al.'s Concorde is released.  It uses a conservative
    test: the hash function is h(u,v) = u xor v.  It doesn't resolve
    collisions with chains.  It just answers ``present'' possibly too many
    times.  Simple, effective, and mostly right.  (A sketch of that
    conservative test appears right after this item.)  But IIRC, the splay
    tree dictionary code uses no more than 6% of the runtime, and it is
    precise.
    ->Update: Finished 1998/8/20.  I implemented a precise tabu for TSP lists
    based on a hash (xor func) with chaining.  Same deg vs. no_d dichotomy.
    Thank goodness!  Still don't have it for matching.  But it's slightly
    more complicated because matching is simpler.  (It backs off one by one
    in the backtracking phase.)  I'd have to change the tabu_hash module
    interface.
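A minimal sketch of the conservative test described above: hash an undirected
edge with h(u,v) = u xor v, keep a single flag per bucket, never chain, and
accept that a lookup may say ``present'' for an edge that was never inserted.
The names tabu_clear, tabu_add, and tabu_maybe_present, and the fixed
power-of-two table size, are invented for this illustration; they are not the
lk tabu_hash interface.

    #include <string.h>

    #define TABU_BITS 16
    #define TABU_SIZE (1 << TABU_BITS)

    static unsigned char tabu_bucket[TABU_SIZE];

    /* h(u,v) = u xor v, reduced to the table size.  Symmetric, so the
     * edge (u,v) and the edge (v,u) hash to the same bucket. */
    static unsigned tabu_hash(int u, int v)
    {
        return ((unsigned) u ^ (unsigned) v) & (TABU_SIZE - 1);
    }

    void tabu_clear(void)
    {
        memset(tabu_bucket, 0, sizeof tabu_bucket);
    }

    void tabu_add(int u, int v)
    {
        tabu_bucket[tabu_hash(u, v)] = 1;
    }

    /* Conservative: may return 1 for an edge that was never added (a hash
     * collision), but never returns 0 for one that was. */
    int tabu_maybe_present(int u, int v)
    {
        return tabu_bucket[tabu_hash(u, v)];
    }

The precise variant described in the 1998/8/20 update would instead store the
(u,v) pairs in each bucket and chain on collisions, trading a little memory
and time for exact answers.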
MST dangle and construct (and other inspired distributions).
    ->Update: Sun Aug 23 17:45:21 EDT 1998
    Dangle and construct now works.  See tspgen.  However, geometry
    sometimes makes pure dangling and constructing impossible.  We try
    our greedy best, though, and report how far off we are.
    The instances produced by dangle and construct are *locally* similar
    in TSP terms, but globally are very stringy and too widely separated,
    much like tspgen output when the packing factor and join length bias
    are small.  Alas.
