http://www.cs.utexas.edu/users/sowmya/pubs.html
Papers

To view a paper, follow the download link after its abstract.

Uncertain Reasoning

1. Refinement of Bayesian Networks by Combining Connectionist and Symbolic Techniques
   Sowmya Ramachandran
   Ph.D. proposal, Department of Computer Sciences, University of Texas at Austin, 1995.

   Bayesian networks provide a mathematically sound formalism for representing and reasoning with uncertain knowledge and are as such widely used. However, acquiring and capturing knowledge in this framework is difficult. There is a growing interest in formulating techniques for learning Bayesian networks inductively. While the problem of learning a Bayesian network, given complete data, has been explored in some depth, the problem of learning networks with unobserved causes is still open. In this proposal, we view this problem from the perspective of theory revision and present a novel approach which adapts techniques developed for revising theories in symbolic and connectionist representations. Thus, we assume that the learner is given an initial approximate network (usually obtained from an expert). Our technique inductively revises the network to fit the data better. Our proposed system has two components: one component revises the parameters of a Bayesian network of known structure, and the other component revises the structure of the network. The component for parameter revision maps the given Bayesian network into a multi-layer feedforward neural network, with the parameters mapped to weights in the neural network, and uses standard backpropagation techniques to learn the weights. The structure revision component uses qualitative analysis to suggest revisions to the network when it fails to predict the data accurately. The first component has been implemented, and we will present results from experiments on real world classification problems which show our technique to be effective. We will also discuss our proposed structure revision algorithm, our plans for experiments to evaluate the system, as well as some extensions to the system.

   Paper: file://ftp.cs.utexas.edu/pub/mooney/papers/banner-proposal-95.ps.Z

2. Revising Bayesian Network Parameters Using Backpropagation
   Sowmya Ramachandran and Raymond J. Mooney
   To appear in the Proceedings of the International Conference on Neural Networks (ICNN-96), Washington, D.C., June 1996, Special Session on Knowledge-Based Artificial Neural Networks.

   The problem of learning Bayesian networks with hidden variables is known to be a hard problem. Even the simpler task of learning just the conditional probabilities on a Bayesian network with hidden variables is hard. In this paper, we present an approach that learns the conditional probabilities on a Bayesian network with hidden variables by transforming it into a multi-layer feedforward neural network (ANN). The conditional probabilities are mapped onto weights in the ANN, which are then learned using standard backpropagation techniques. To avoid the problem of exponentially large ANNs, we focus on Bayesian networks with noisy-or and noisy-and nodes. Experiments on real world classification problems demonstrate the effectiveness of our technique.

   Paper: file://ftp.cs.utexas.edu/pub/mooney/papers/banner-icnn-96.ps.Z
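The two entries above both rest on the same idea: the conditional probabilities of a noisy-or node are mapped onto trainable weights of a feedforward network and fitted by gradient descent on a prediction loss. As a rough, hypothetical illustration of that idea only (not the authors' BANNER system; the data, learning rate, and function names below are invented), the following sketch fits the link probabilities of a single noisy-or node by gradient descent on the cross-entropy loss:

```python
# Hypothetical sketch: learn the link probabilities of one noisy-or node
# by gradient descent, i.e. the "parameters as weights + backprop" idea.
import numpy as np

def sigmoid(w):
    return 1.0 / (1.0 + np.exp(-w))

def noisy_or(p, x):
    """P(child = 1 | parent values x) for a noisy-or node with link probabilities p."""
    return 1.0 - np.prod((1.0 - p) ** x)

def train_noisy_or(X, y, steps=2000, lr=0.5):
    """Fit the link probabilities of one noisy-or node to 0/1 data."""
    w = np.zeros(X.shape[1])                    # unconstrained weights; p = sigmoid(w) stays in (0, 1)
    for _ in range(steps):
        p = sigmoid(w)
        grad = np.zeros_like(w)
        for xi, yi in zip(X, y):
            s = np.prod((1.0 - p) ** xi)        # probability that every active cause is inhibited
            q = np.clip(1.0 - s, 1e-6, 1.0 - 1e-6)          # P(child = 1 | xi)
            dL_dq = -(yi / q) + (1.0 - yi) / (1.0 - q)      # d(cross-entropy)/dq
            grad += dL_dq * s * xi * p          # chain rule through the noisy-or and the sigmoid
        w -= lr * grad / len(X)
    return sigmoid(w)

# Toy usage: two causes whose true link probabilities are 0.9 and 0.3.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 2))
true_p = np.array([0.9, 0.3])
y = (rng.random(500) < np.array([noisy_or(true_p, xi) for xi in X])).astype(float)
print(train_noisy_or(X, y))                     # recovered probabilities approach [0.9, 0.3]
```

A full conditional probability table over n parents has on the order of 2^n entries, whereas the noisy-or parameterization above needs only n link probabilities, which is why the ICNN-96 abstract restricts attention to noisy-or and noisy-and nodes to keep the transformed network from growing exponentially.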
Qualitative Modeling & Diagnosis

3. Learning Qualitative Models for Systems with Multiple Operating Regions
   Sowmya Ramachandran, Raymond J. Mooney and Benjamin J. Kuipers
   Proceedings of the Eighth International Workshop on Qualitative Reasoning about Physical Systems (QR-94), pp. 212-223, Nara, Japan, June 1994.

   The problem of learning qualitative models of physical systems from observations of their behaviour has been addressed by several researchers in recent years. Most current techniques limit themselves to learning a single qualitative differential equation to model the entire system. However, many systems have several qualitative differential equations underlying them. In this paper, we present an approach to learning the models for such systems. Our technique divides the behaviours into segments, each of which can be explained by a single qualitative differential equation. The qualitative model for each segment can be generated using any of the existing techniques for learning a single model. We show the results of applying our technique to several examples and demonstrate that it is effective.

   Paper: file://ftp.cs.utexas.edu/pub/mooney/papers/misq-rt-qr-94.ps.Z

Neural Networks

4. Information Measure Based Skeletonisation
   Sowmya Ramachandran and Lorien Y. Pratt
   Advances in Neural Information Processing Systems, Vol. 4, pp. 1080-1087, Denver, Colorado, 1992.

   Automatic determination of proper neural network topology by trimming over-sized networks is an important area of study, which has previously been addressed using a variety of techniques. In this paper, we present Information Measure Based Skeletonisation (IMBS), a new approach to this problem where superfluous hidden units are removed based on their information measure (IM). This measure, borrowed from decision tree induction techniques, reflects the degree to which the hyperplane formed by a hidden unit discriminates between training data classes. We show the results of applying IMBS to three classification tasks and demonstrate that it removes a substantial number of hidden units without significantly affecting network performance.
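As a rough illustration of the pruning criterion described in the abstract above (a hypothetical sketch, not the published IMBS implementation; the threshold, cutoff, and toy data are invented), the following computes a decision-tree-style information gain for each hidden unit's activations and flags units that barely discriminate between the classes:

```python
# Hypothetical sketch: rank hidden units by the information gain of the
# split their activations induce on the class labels, then flag low-gain
# units as pruning candidates.
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a vector of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_measure(activations, labels, threshold=0.5):
    """Information gain of splitting the training set on one hidden unit's output."""
    on = activations >= threshold
    gain = entropy(labels)
    for side in (on, ~on):
        if side.any():
            gain -= side.mean() * entropy(labels[side])
    return gain

def prune_candidates(H, labels, cutoff=0.05):
    """Indices of hidden units whose hyperplane barely discriminates the classes.

    H is an (examples x hidden units) matrix of hidden-unit activations."""
    gains = np.array([information_measure(H[:, j], labels) for j in range(H.shape[1])])
    return np.where(gains < cutoff)[0], gains

# Toy usage: unit 0 separates the two classes cleanly, unit 1 is pure noise.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, size=200)
H = np.column_stack([0.8 * labels + 0.1 * rng.random(200),
                     rng.random(200)])
to_prune, gains = prune_candidates(H, labels)
print(to_prune, np.round(gains, 3))             # unit 1 is flagged for removal
```

Units flagged this way would be candidates for removal, after which the smaller network can be retrained or fine-tuned.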