<title>Uncertain Reasoning</title><h1>Uncertain Reasoning</h1>To view a paper, click on the open book image. <br> <br><ol><! ===========================================================================><a name="rapture-dissertation-96.ps.Z"></a><b><li>Combining Symbolic and Connectionist Learning Methods to Refine Certainty-Factor Rule-Bases<br></b>J. Jeffrey Mahoney<br>Ph.D. Thesis, Department of Computer Sciences, University of Texas at Austin, May 1996.<p><blockquote>This research describes the system RAPTURE, which is designed to revise rule bases expressed in certainty-factor format. Recent studies have shown that learning is facilitated when biased with domain-specific expertise, and have also shown that many real-world domains require some form of probabilistic or uncertain reasoning in order to successfully represent target concepts. RAPTURE was designed to take advantage of both of these results. <p>Beginning with a set of certainty-factor rules, along with accurately-labelled training examples, RAPTURE uses both symbolic and connectionist learning techniques to revise the rules so that they correctly classify all of the training examples. A modified version of backpropagation adjusts the certainty factors of the rules, ID3's information-gain heuristic adds new rules, and the UPSTART algorithm creates new hidden terms in the rule base. <p>Results on refining four real-world rule bases demonstrate the effectiveness of this combined approach. Two of these rule bases were designed to identify particular areas in strands of DNA, one identifies infectious diseases, and the fourth diagnoses soybean diseases. The results of RAPTURE are compared with those of backpropagation, C4.5, KBANN, and other learning systems. RAPTURE generally produces sets of rules that are more accurate than these other systems, often creating smaller rule sets and using less training time.
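The certainty-factor format these rules use is MYCIN's, in which independent evidence for the same hypothesis combines incrementally rather than by simple addition. A minimal sketch of the standard MYCIN combination rule (illustrative Python only, not RAPTURE's implementation):

```python
# MYCIN-style certainty-factor combination: two pieces of evidence for the
# same hypothesis, each with a certainty factor in [-1, 1], yield a combined
# certainty factor. Illustrative sketch, not code from RAPTURE.
def combine_cf(cf1: float, cf2: float) -> float:
    """Combine two certainty factors for the same hypothesis."""
    if cf1 >= 0 and cf2 >= 0:
        # Two supporting rules reinforce, but never exceed 1.
        return cf1 + cf2 * (1 - cf1)
    if cf1 <= 0 and cf2 <= 0:
        # Two disconfirming rules reinforce toward -1.
        return cf1 + cf2 * (1 + cf1)
    # Conflicting evidence partially cancels.
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

print(combine_cf(0.6, 0.5))   # two supporting rules: 0.8
print(combine_cf(0.6, -0.5))  # conflicting evidence: 0.2
```

RAPTURE's modified backpropagation tunes the per-rule certainty factors that feed into combinations of this form.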
<p></blockquote><!WA0><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/rapture-dissertation-96.ps.Z"><!WA1><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><! ===========================================================================><a name="banner-proposal-95.ps.Z"></a><b><li>Refinement of Bayesian Networks by Combining Connectionist and Symbolic Techniques<br></b>Sowmya Ramachandran<br>Ph.D. proposal, Department of Computer Sciences, University of Texas at Austin, 1995. <p><blockquote>Bayesian networks provide a mathematically sound formalism for representing and reasoning with uncertain knowledge and as such are widely used. However, acquiring and capturing knowledge in this framework is difficult. There is a growing interest in formulating techniques for learning Bayesian networks inductively. While the problem of learning a Bayesian network given complete data has been explored in some depth, the problem of learning networks with unobserved causes is still open. In this proposal, we view this problem from the perspective of theory revision and present a novel approach which adapts techniques developed for revising theories in symbolic and connectionist representations. Thus, we assume that the learner is given an initial approximate network (usually obtained from an expert). Our technique inductively revises the network to fit the data better. Our proposed system has two components: one component revises the parameters of a Bayesian network of known structure, and the other component revises the structure of the network. The component for parameter revision maps the given Bayesian network into a multi-layer feedforward neural network, with the parameters mapped to weights in the neural network, and uses standard backpropagation techniques to learn the weights. The structure revision component uses qualitative analysis to suggest revisions to the network when it fails to predict the data accurately.
The first component has been implemented, and we will present results from experiments on real-world classification problems which show our technique to be effective. We will also discuss our proposed structure revision algorithm, our plans for experiments to evaluate the system, and some extensions to the system.</blockquote><!WA2><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/banner-proposal-95.ps.Z"><!WA3><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><! ===========================================================================><a name="banner-icnn-96.ps.Z"></a><b><li>Revising Bayesian Network Parameters Using Backpropagation<br></b>Sowmya Ramachandran and Raymond J. Mooney<br><cite>Proceedings of the International Conference on Neural Networks (ICNN-96)</cite>, Special Session on Knowledge-Based Artificial Neural Networks, Washington DC, June 1996. <p><blockquote>The problem of learning Bayesian networks with hidden variables is known to be a hard problem. Even the simpler task of learning just the conditional probabilities of a Bayesian network with hidden variables is hard. In this paper, we present an approach that learns the conditional probabilities of a Bayesian network with hidden variables by transforming it into a multi-layer feedforward neural network (ANN). The conditional probabilities are mapped onto weights in the ANN, which are then learned using standard backpropagation techniques. To avoid the problem of exponentially large ANNs, we focus on Bayesian networks with noisy-or and noisy-and nodes. Experiments on real-world classification problems demonstrate the effectiveness of our technique.</blockquote><!WA4><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/banner-icnn-96.ps.Z"><!WA5><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><!
===========================================================================><a name="rapture-ml-94.ps.Z"></a><b><li>Comparing Methods for Refining Certainty-Factor Rule-Bases</b><br>J. Jeffrey Mahoney and Raymond J. Mooney<br><cite>Proceedings of the Eleventh International Workshop on Machine Learning</cite>, pp. 173-180, Rutgers, NJ, July 1994. (ML-94) <p><blockquote>This paper compares two methods for refining uncertain knowledge bases using propositional certainty-factor rules. The first method, implemented in the RAPTURE system, employs neural-network training to refine the certainties of existing rules but uses a symbolic technique to add new rules. The second method, based on the one used in the KBANN system, initially adds a complete set of potential new rules with very low certainty and allows neural-network training to filter and adjust these rules. Experimental results indicate that the former method results in significantly faster training and produces much simpler refined rule bases with slightly greater accuracy.</blockquote><!WA6><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/rapture-ml-94.ps.Z"><!WA7><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><! ===========================================================================><a name="rapture-isiknh-94.ps.Z"></a><b><li>Modifying Network Architectures for Certainty-Factor Rule-Base Revision</b><br>J. Jeffrey Mahoney and Raymond J. Mooney<br><cite>Proceedings of the International Symposium on Integrating Knowledge and Neural Heuristics 1994</cite>, pp. 75-85, Pensacola, FL, May 1994. (ISIKNH-94) <p><blockquote>This paper describes RAPTURE --- a system for revising probabilistic rule bases that converts symbolic rules into a connectionist network, which is then trained via connectionist techniques. It uses a modified version of backpropagation to refine the certainty factors of the rule base, and uses ID3's information-gain heuristic (Quinlan) to add new rules.
Work is currently under way on improved techniques for modifying network architectures, including adding hidden units using the UPSTART algorithm (Frean). A case is made, via comparison with fully-connected connectionist techniques, for keeping the rule base as close to the original as possible, adding new input units only as needed.</blockquote><!WA8><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/rapture-isiknh-94.ps.Z"><!WA9><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><! ===========================================================================><a name="rapture-connsci-94.ps.Z"></a><b><li>Combining Connectionist and Symbolic Learning to Refine Certainty-Factor Rule-Bases</b><br>J. Jeffrey Mahoney and Raymond J. Mooney<br><cite>Connection Science</cite>, 5 (1993), pp. 339-364. (Special issue on Architectures for Integrating Neural and Symbolic Processing) <p><blockquote>This paper describes RAPTURE --- a system for revising probabilistic knowledge bases that combines connectionist and symbolic learning methods. RAPTURE uses a modified version of backpropagation to refine the certainty factors of a MYCIN-style rule base, and it uses ID3's information-gain heuristic to add new rules. Results on refining three actual expert knowledge bases demonstrate that this combined approach generally performs better than previous methods.</blockquote><!WA10><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/rapture-connsci-94.ps.Z"><!WA11><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p><! ===========================================================================><a name="rapture-mlw-92.ps.Z"></a><b><li>Combining Symbolic and Neural Learning to Revise Probabilistic Theories</b><br>J. Jeffrey Mahoney and Raymond J. Mooney<br><cite>Proceedings of the 1992 Machine Learning Workshop on Integrated Learning in Real Domains</cite>, Aberdeen, Scotland, July 1992.
<p><blockquote>This paper describes RAPTURE --- a system for revising probabilistic theories that combines symbolic and neural-network learning methods. RAPTURE uses a modified version of backpropagation to refine the certainty factors of a MYCIN-style rule base, and it uses ID3's information-gain heuristic to add new rules. Results on two real-world domains demonstrate that this combined approach performs as well as or better than previous methods.</blockquote><!WA12><a href="file://ftp.cs.utexas.edu/pub/mooney/papers/rapture-mlw-92.ps.Z"><!WA13><img align=top src="http://www.cs.utexas.edu/users/ml/paper.xbm"></a><p></ol><hr><address><!WA14><a href="http://www.cs.utexas.edu/users/estlin/">estlin@cs.utexas.edu</a></address>
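The ICNN-96 entry above restricts attention to noisy-or and noisy-and nodes to avoid exponentially large networks: a noisy-or node needs only one parameter per parent link, rather than a full conditional probability table with an entry for every combination of parent values. A small sketch of the noisy-or model (illustrative Python, not code from these papers):

```python
# Noisy-or node: the child is true unless every active parent independently
# fails to trigger it, so P(child | parents) is determined by one link
# probability per parent instead of a 2^n-entry table. Illustrative sketch.
def noisy_or(parent_states, link_probs):
    """P(child = true) given boolean parent states and per-link probabilities."""
    p_fail = 1.0
    for active, p in zip(parent_states, link_probs):
        if active:
            p_fail *= (1.0 - p)  # each active parent fails independently
    return 1.0 - p_fail

# Two active parents with link probabilities 0.8 and 0.5:
print(noisy_or([True, True, False], [0.8, 0.5, 0.9]))  # 1 - 0.2*0.5 = 0.9
```

Because each link probability is a single independent parameter, mapping it to one weight in a feedforward network keeps the resulting ANN linear in the number of parents.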