readme (SVDD toolbox)
  For this an example script is also supplied.
- Removed the 'oldplot' option in plotroc, due to very unclear spaghetti
  code and potential disastrous bugs.
- Improved dd_crossval to take class priors into account.
- Added ex_dd9 to show dd_crossval.
- Added a variant of the ROC plot, the askerplot.

* Notes on version 1.1.2
- Optimized simpleroc.m significantly.
- Added normalization of the classifier output to something like
  probabilities.
- Removed a very, very small bug from incsvdd.
- Made preparations for OCC normalisation by introducing featdom ranges.
- Some bugfixes.
- Improved the output of outmog_dd: use Bayes' rule to find the
  posteriors.
- Made find_target work on just labels as well.
- Fixed a very rare situation in oc_set when only outlier data is
  available.

* Notes on version 1.1.1
- Changed dd_crossval to take the class information into account.
- Added the feature in incsvdd that the kernel type and kernel parameter
  can be exchanged (to make incsvdd compatible with consistent_occ).
- Removed some superfluous blanks in the display of oc_set.
- Tiny improvement in incsvdd.

* Notes on version 1.1.0
- Changed the revision numbering. I'm moving more toward the Linux
  kernel numbering, but we will see if I can stay consistent ;-)
- Added gendatouts.m.
- Added gendatoutg.m.
- Split nndd.m into dnndd.m and nndd.m.
- Split kcenter_dd.m into dkcenter_dd.m and kcenter_dd.m, and simplified
  the individual classifiers.
- Added dknndd.m (now knndd.m is actually *not* a wrapper for
  dknndd.m).
- Changed gauss_dd such that it stores the inverted covariance matrix
  instead of the original covariance matrix. Saves double work.
- scale_range.m is now back to a linear distribution over the distances
  instead of logarithmic.
  It looks better that way.
- Changed oc_set to be able to handle several classes that become the
  target class.
- Added outmog_dd.m, which makes it possible to train a Mixture of
  Gaussians using outlier objects during training.

* Notes on version 1.12:
- Changed the implementation of dd_auc such that the AUC over a
  restricted domain is more interpretable when you consider a
  'standard'/'traditional' ROC curve.
- Added the incremental SVDD for very large datasets, for user-defined
  kernels, or for the case that no good QP optimizer is present. I
  consider this still a bit experimental, but it actually works pretty
  well!
- Added the Mixture of Gaussians which can also model outlier data
  (using an almost-uniform outlier class combined with normal
  Gaussian clusters).
- Made dlpdd work on non-square distance matrices (by Ela Pekalska).
- Removed a bug in dd_roc_old (thanks to Piotr Juszczak).
- Removed a bug in dlpdd (thanks to Elzbieta Pekalska).

* Notes on version 1.11:
- Changed some of the implementation of newsvdd such that it uses the
  standard optimizer.
- Plotsom is now standard in PRTools, so it is removed from Contents.m.
- Included an index in the manual.

* Notes on version 1.10:
- Significantly rewrote and rearranged oc_set.m and target_class.m.
- Changed dd_error.m and dd_roc.m to mimic testc.m.
  Also included the computation of the precision and recall.
- Completely rewrote the ROC computation. Large amounts of complexity
  were simply removed (and thus also some features, I'm sorry).
- Support the selection of hyperparameters using the consistency
  criterion.
- Added the robustified Gaussian (rob_gauss_dd) and the minimum
  covariance determinant Gaussian (mcd_gauss_dd).
- Removed a bad, bad, bad bug from gausspdf.m.
- Made all the Gaussian methods use mahaldist.m for their evaluation.
- Completely rewrote the SVDD.
  The confusing parameters fracrej and fracerr are removed, and all the
  quadratic optimizers (libsvm, qld, quadprog) are integrated.
- Added the SVDD using general kernel definitions: ksvdd.m (although
  it has a very annoying feature that you have to supply the values
  of K(z,z) during the evaluation of object z when you just supply
  the kernel matrix: have a look at the help).
- Rewrote the LPDD in terms of DLPDD, MYPROXM and DISSIM.
- Implemented the SOM nicely now and removed the most obvious bugs.
- Added dd_crossval.
- Added dd_f1, for the computation of the F1 score.

* Notes on version 1.01:
- Changed the order of the mtimes: w*a is replaced by a*w.
- Removed a bug in the creation of a one-class dataset from a
  more-than-two-class dataset in oc_set.m.

* Notes on version 1.00:
- There is a *significant* change when updating from prtools3 to
  prtools4 (prtools3.2.2 or higher). The definitions of the objects
  'dataset' and 'mapping' have been upgraded. This required the
  rewriting of almost all code! It can therefore happen that new
  results are not identical to results obtained by previous versions
  of the tools (but the differences should not be very large).
- dd_error is totally rewritten.
- is_ocset and is_occ are renamed to isocset and isocc to be more
  consistent with the rest of Matlab and PRTools.
- som_dd is added.

* Notes on version 0.99:
- Introduced dissim.m.

* Notes on version 0.95:
- Added a bit of help to each of the m-files.
- Programmed my own very basic k-means clustering, because I needed it
  for other things as well. Therefore added mykmeans.m.
- Added plotroc.m to plot the classical ROC curve.
- Made an extra check in dd_roc to see where the outputs of the target
  class are stored (for my OCCs it is always in the first column, but
  for general PRTools classifiers this does not have to be the case).
  Now dd_roc should work for all PRTools classifiers (trained on data
  with 'target' and 'outlier' labels, of course).
- dd_fp.m added: compute the error on the outliers (fraction false
  positive) of a trained classifier for a given error on the target
  class (fraction false negative).
- Made my own version of proxm.m (myproxm.m) which uses lpdistm.m.
  It is used in kwhiten.m.
- Removed a horrible bug in lpdd! (one bloody minus sign...)
- Removed another horrible bug from kwhiten, in the case a fixed
  dimensionality was requested... Furthermore, in case a fraction
  of retained variance was requested, the threshold is now set such
  that *at least* this fraction is retained (it could also be higher).
- Corrected the nu parameter in svdd and newsvdd in cases when example
  outliers are used in training. Note that it cannot be done
  completely correctly in newsvdd, because there only a single nu
  parameter is allowed for all data.
- Included in plotg.m the possibility to plot just the decision
  boundary.

* Notes on version 0.9:
! In the early versions of the svdd, the support vectors were
  classified as outliers. Now they are forced to be target objects.
  This will therefore change the classification results!
- Added gendatout: generation of spherically distributed outlier
  objects.
- Changed the place in which distm(a) is computed in the original
  version of svdd. In previous versions it was done over and over
  again in f_svs, but now it is moved to the main svdd.m.
- Removed a bug in range_svdd, where the sqrt of D has to be taken
  for the range of sigma.
- Fixed a bug in dd_roc. Now it is possible to supply 1D datasets for
  computing the ROC curve.
- Fixed an error in the help of dd_auc.
- Added the function relabel.
- Replaced all explicit references to the function name by 'mfilename'
  in all one-class classifiers.
- Added random_dd, which randomly assigns labels.
- Added lpdd.m, the linear programming data description.
  It works on distances, and therefore I also had to add ddistm.m and
  lpdistm.m.
- Added kwhiten.m, normalization to unit variance in the kernel space.
  For that, center.m was also needed.
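The dd_auc change in version 1.12 concerns the AUC computed over a restricted domain of the ROC curve. The toolbox itself is MATLAB; the Python sketch below only illustrates one common way to make a restricted-domain AUC interpretable (it is an assumption about the general technique, not dd_auc's actual implementation): integrate the piecewise-linear ROC curve over an FPR window [lo, hi] and normalize by the window width, so a perfect classifier still scores 1.

```python
def restricted_auc(fpr, tpr, lo=0.0, hi=1.0):
    """AUC of a piecewise-linear ROC curve over the FPR window [lo, hi],
    normalized by (hi - lo). fpr and tpr are non-decreasing lists of
    ROC breakpoints. Illustrative sketch, not dd_auc itself."""
    area = 0.0
    for i in range(1, len(fpr)):
        x0, x1 = fpr[i - 1], fpr[i]
        y0, y1 = tpr[i - 1], tpr[i]
        if x1 <= lo or x0 >= hi or x1 == x0:
            continue  # segment outside the window, or a vertical jump
        a, b = max(x0, lo), min(x1, hi)      # clip segment to the window
        slope = (y1 - y0) / (x1 - x0)
        ya = y0 + slope * (a - x0)
        yb = y0 + slope * (b - x0)
        area += (b - a) * (ya + yb) / 2       # trapezoid on the clipped part
    return area / (hi - lo)
```

With this normalization a random classifier (the diagonal ROC) scores below 0.5 on a low-FPR window, while a perfect classifier scores 1.0 on any window, which matches the stated goal of staying interpretable against a 'standard' ROC curve.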
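Version 1.10 adds precision and recall to dd_error, and dd_f1 computes the F1 score from them. As a language-neutral illustration of that standard formula (the toolbox is MATLAB; the function name and count-based interface here are purely illustrative):

```python
def f1_score(tp, fp, fn):
    """F1 score: harmonic mean of precision and recall.

    In one-class terms: tp = targets accepted, fp = outliers accepted,
    fn = targets rejected."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

For example, with 80 targets accepted, 20 outliers accepted and 20 targets rejected, precision and recall are both 0.8, so the F1 score is 0.8 as well.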
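The version 1.1.0 change to gauss_dd (storing the inverted covariance matrix instead of the original) saves recomputing the inverse at every evaluation, since the Mahalanobis distance only ever needs the inverse. A minimal numpy sketch of that idea (function and field names are illustrative, not the toolbox's mahaldist.m):

```python
import numpy as np

def fit_gauss(X):
    """Fit a Gaussian model; invert the covariance ONCE, at training time."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    return {"mu": mu, "icov": np.linalg.inv(cov)}  # stored inverse

def mahal_dist2(model, Z):
    """Squared Mahalanobis distance of each row of Z to the stored mean.
    No inversion happens here -- that is the double work being saved."""
    d = Z - model["mu"]
    return np.einsum("ij,jk,ik->i", d, model["icov"], d)
```

With an identity covariance this reduces to the squared Euclidean distance, which makes the sketch easy to sanity-check.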
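The gendatout entry in version 0.9 generates spherically distributed outlier objects. A common recipe for this, sketched below in Python as an assumption about the general technique (not necessarily gendatout's exact method): draw a Gaussian vector and normalize it for a uniform direction, then scale the radius by u^(1/d) so the density is uniform inside the d-dimensional ball.

```python
import random, math

def sample_in_ball(n, dim, radius=1.0, center=None, rng=random):
    """Draw n points uniformly from a dim-dimensional ball (hypothetical
    helper; gendatout itself is a MATLAB function)."""
    center = center or [0.0] * dim
    points = []
    for _ in range(n):
        g = [rng.gauss(0.0, 1.0) for _ in range(dim)]   # uniform direction
        norm = math.sqrt(sum(x * x for x in g))
        r = radius * rng.random() ** (1.0 / dim)        # uniform radial density
        points.append([c + r * x / norm for c, x in zip(center, g)])
    return points
```

The u^(1/d) radius correction matters: sampling the radius uniformly instead would concentrate points near the center as the dimensionality grows.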
