http://www.cs.utexas.edu/users/ml/nl.html

This data set contains WWW pages collected from the computer science departments of various universities.

…logic programming (ILP). ILP, which investigates the learning of relational (first-order) rules, provides an empirical method for acquiring knowledge within traditional, symbolic parsing frameworks.

This dissertation details the architecture, implementation and evaluation of CHILL, a computer system for acquiring natural language parsers by training over corpora of parsed text. CHILL treats language acquisition as the learning of search-control rules within a logic program that implements a shift-reduce parser. Control rules are induced using a novel ILP algorithm which handles difficult issues arising in the induction of search-control heuristics. Both the control-rule framework and the induction algorithm are crucial to CHILL's success.

The main advantage of CHILL over propositional counterparts is its flexibility in handling varied representations. CHILL has produced parsers for various analyses including case-role mapping, detailed syntactic parse trees, and a logical form suitable for expressing first-order database queries. All of these tasks are accomplished within the same framework, using a single, general learning method that can acquire new syntactic and semantic categories for resolving ambiguities.

Experimental evidence from both artificial and real-world corpora demonstrates that CHILL learns parsers as well as or better than previous artificial neural network or probabilistic approaches on comparable tasks. In the database query domain, which goes beyond the scope of previous empirical approaches, the learned parser outperforms an existing hand-crafted system. These results support the claim that ILP techniques as implemented in CHILL represent a viable alternative with significant potential advantages over neural-network, propositional, and probabilistic approaches to empirical parser construction.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-dissertation-95.ps.Z

Acquisition of a Lexicon from Semantic Representations of Sentences
Cynthia A. Thompson
33rd Annual Meeting of the Association for Computational Linguistics (ACL-95), pp. 335-337, Boston, MA, July 1995.

A system, WOLFIE, that acquires a mapping of words to their semantic representation is presented and a preliminary evaluation is performed. Tree least general generalizations (TLGGs) of the representations of input sentences are performed to assist in determining the representations of individual words in the sentences. The best guess for a meaning of a word is the TLGG which overlaps with the highest percentage of sentence representations in which that word appears. Some promising experimental results on a non-artificial dataset are presented.

Download: http://www.cs.utexas.edu/users/ml/ftp/papers/wolfie-acl-95.ps.Z
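
The TLGG operation described in the WOLFIE abstract above is essentially a least general generalization (anti-unification) over tree-structured semantic representations. As a rough illustration only (this is not WOLFIE's code; the predicate names are invented for the sketch, and for brevity it generalizes every mismatch to a fresh variable rather than reusing one variable per repeated disagreement pair), a simplified version in Prolog might look like this:

    % Simplified sketch of a tree least general generalization: keep
    % matching functors, recurse on arguments, and generalize any
    % mismatch to a fresh variable.  (Illustrative only; a full LGG
    % would reuse variables for repeated disagreement pairs.)
    tlgg(T1, T2, T1) :-
        T1 == T2, !.                      % identical subterms are kept
    tlgg(T1, T2, G) :-
        nonvar(T1), nonvar(T2),
        T1 =.. [F|Args1],
        T2 =.. [F|Args2],
        length(Args1, N), length(Args2, N), !,
        tlgg_args(Args1, Args2, GArgs),   % same functor/arity: recurse
        G =.. [F|GArgs].
    tlgg(_, _, _).                        % otherwise: a fresh variable

    tlgg_args([], [], []).
    tlgg_args([A|As], [B|Bs], [G|Gs]) :-
        tlgg(A, B, G),
        tlgg_args(As, Bs, Gs).

For example, the query tlgg(ate(person(john), food(pasta)), ate(person(mary), food(pasta)), G) binds G to ate(person(_), food(pasta)), the structure shared by the two representations.
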
An Inductive Logic Programming Method for Corpus-based Parser Construction
John M. Zelle and Raymond J. Mooney
Submitted to Computational Linguistics.

In recent years there has been considerable research into corpus-based methods for parser construction. A common thread in this research has been the use of propositional representations for learned knowledge. This paper presents an alternative approach based on techniques from a subfield of machine learning known as inductive logic programming (ILP). ILP, which investigates the learning of relational (first-order) rules, provides a way of using empirical methods to acquire knowledge within traditional, symbolic parsing frameworks. We describe a novel method for constructing deterministic Prolog parsers from corpora of parsed sentences. We also discuss several advantages of this approach compared to propositional alternatives and present experimental results on learning complete parsers using several corpora, including the ATIS corpus from the Penn Treebank.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-compling-95.ps.Z

A Comparison of Two Methods Employing Inductive Logic Programming for Corpus-based Parser Construction
John M. Zelle and Raymond J. Mooney
Working Notes of the IJCAI-95 Workshop on New Approaches to Learning for Natural Language Processing, pp. 79-86, Montreal, Quebec, August 1995.

This paper presents results from recent experiments with CHILL, a corpus-based parser acquisition system. CHILL treats grammar acquisition as the learning of search-control rules within a logic program. Unlike many current corpus-based approaches that use propositional or probabilistic learning algorithms, CHILL uses techniques from inductive logic programming (ILP) to learn relational representations. The reported experiments compare CHILL's performance to that of a more naive application of ILP to parser acquisition. The results show that ILP techniques, as employed in CHILL, are a viable alternative to propositional methods and that the control-rule framework is fundamental to CHILL's success.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-ijcai-nll-95.ps.Z

Inducing Logic Programs without Explicit Negative Examples
John M. Zelle, Cynthia A. Thompson, Mary Elaine Califf, and Raymond J. Mooney
Proceedings of the Fifth International Workshop on Inductive Logic Programming, Leuven, Belgium, September 1995.

This paper presents a method for learning logic programs without explicit negative examples by exploiting an assumption of output completeness. A mode declaration is supplied for the target predicate, and each training input is assumed to be accompanied by all of its legal outputs. Any other outputs generated by an incomplete program implicitly represent negative examples; however, large numbers of ground negative examples never need to be generated. This method has been incorporated into two ILP systems, CHILLIN and IFOIL, both of which use intensional background knowledge. Tests on two natural language acquisition tasks, case-role mapping and past-tense learning, illustrate the advantages of the approach.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-ifoil-ml-95.ps.Z
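
To make the output-completeness assumption concrete, here is a small Prolog sketch (my own illustration, not the CHILLIN or IFOIL code; the predicate names, the toy training facts, and the candidate hypothesis are all invented for the example). The target predicate has mode past(+,-), each training input is listed with all of its legal outputs, and any extra output derived by an overly general candidate program counts as an implicit negative:

    :- use_module(library(lists)).    % for append/3

    % Hypothetical training data: each input with ALL of its legal outputs.
    legal_output([w,a,l,k],   [w,a,l,k,e,d]).
    legal_output([s,l,e,e,p], [s,l,e,p,t]).

    % A deliberately overgeneral candidate hypothesis: always add "ed".
    hyp_past(Stem, Past) :- append(Stem, [e,d], Past).

    % An output the hypothesis derives for a training input that is not
    % among that input's recorded outputs is an implicit negative example.
    implicit_negative(In, Out) :-
        legal_output(In, _),          % In is a training input
        hyp_past(In, Out),            % the candidate derives Out for it
        \+ legal_output(In, Out).     % but Out is not a legal output

    implicit_negative_count(N) :-
        findall(In-Out, implicit_negative(In, Out), Bad),
        length(Bad, N).

Here implicit_negative_count(N) gives N = 1: the candidate wrongly derives [s,l,e,e,p,e,d], which the output-completeness assumption marks as a negative without any ground negative examples ever being supplied.
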
Induction of First-Order Decision Lists: Results on Learning the Past Tense of English Verbs
Raymond J. Mooney and Mary Elaine Califf
Journal of Artificial Intelligence Research, 3 (1995), pp. 1-24.

This paper presents a method for inducing logic programs from examples that learns a new class of concepts called first-order decision lists, defined as ordered lists of clauses each ending in a cut. The method, called FOIDL, is based on FOIL but employs intensional background knowledge and avoids the need for explicit negative examples. It is particularly useful for problems that involve rules with specific exceptions, such as learning the past tense of English verbs, a task widely studied in the context of the symbolic/connectionist debate. FOIDL is able to learn concise, accurate programs for this problem from significantly fewer examples than previous methods (both connectionist and symbolic).

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/foidl-jair-95.ps.Z
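
For readers unfamiliar with the representation, a first-order decision list is simply an ordered Prolog program in which each clause ends in a cut, so specific exceptions are tried before more general defaults. The hand-written toy below only illustrates the form for the past-tense task; it is not output of FOIDL, and the particular rules are invented for the sketch:

    :- use_module(library(lists)).    % for append/3

    % A hand-written first-order decision list for English past tense,
    % in the representation FOIDL induces: exceptions first, each clause
    % ending in a cut, with the most general rule last as the default.
    past([s,l,e,e,p], [s,l,e,p,t]) :- !.      % irregular exception
    past([g,o],       [w,e,n,t])   :- !.      % irregular exception
    past(Stem, Past) :-                       % stems ending in 'e' add 'd'
        append(_, [e], Stem), !,
        append(Stem, [d], Past).
    past(Stem, Past) :-                       % default rule: add 'ed'
        append(Stem, [e,d], Past).

The cuts are what make the list ordered: past([d,a,n,c,e], P) commits to the "add d" clause and yields only [d,a,n,c,e,d], while past([w,a,l,k], P) falls through to the default and yields [w,a,l,k,e,d].
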
Inducing Deterministic Prolog Parsers From Treebanks: A Machine Learning Approach
John M. Zelle and Raymond J. Mooney
Proceedings of the Twelfth National Conference on Artificial Intelligence (AAAI-94), pp. 748-753, Seattle, WA, July 1994.

This paper presents a method for constructing deterministic, context-sensitive, Prolog parsers from corpora of parsed sentences. Our approach uses recent machine learning methods for inducing Prolog rules from examples (inductive logic programming). We discuss several advantages of this method compared to recent statistical methods and present results on learning complete parsers from portions of the ATIS corpus.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-aaai-94.ps.Z

Learning Search-Control Heuristics for Logic Programs: Applications to Speedup Learning and Language Acquisition
John M. Zelle
Ph.D. proposal, Department of Computer Sciences, University of Texas at Austin, 1993.

This paper presents a general framework, learning search-control heuristics for logic programs, which can be used to improve both the efficiency and accuracy of knowledge-based systems expressed as definite-clause logic programs. The approach combines techniques of explanation-based learning and recent advances in inductive logic programming to learn clause-selection heuristics that guide program execution. Two specific applications of this framework are detailed: dynamic optimization of Prolog programs (improving efficiency) and natural language acquisition (improving accuracy). In the area of program optimization, a prototype system, DOLPHIN, is able to transform some intractable specifications into polynomial-time algorithms, and outperforms competing approaches in several benchmark speedup domains. A prototype language acquisition system, CHILL, is also described. It is capable of automatically acquiring semantic grammars, which uniformly incorporate syntactic and semantic constraints to parse sentences into case-role representations. Initial experiments show that this approach is able to construct accurate parsers which generalize well to novel sentences and significantly outperform previous approaches to learning case-role mapping based on connectionist techniques. Planned extensions of the general framework and the specific applications, as well as plans for further evaluation, are also discussed.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/dolphin-chill-proposal-93.ps.Z

Learning Semantic Grammars With Constructive Inductive Logic Programming
John M. Zelle and Raymond J. Mooney
Proceedings of the Eleventh National Conference of the American Association for Artificial Intelligence (AAAI-93), pp. 817-822, Washington, D.C., July 1993.

Automating the construction of semantic grammars is a difficult and interesting problem for machine learning. This paper shows how the semantic-grammar acquisition problem can be viewed as the learning of search-control heuristics in a logic program. Appropriate control rules are learned using a new first-order induction algorithm that automatically invents useful syntactic and semantic categories. Empirical results show that the learned parsers generalize well to novel sentences and outperform previous approaches based on connectionist techniques.

Download: file://ftp.cs.utexas.edu/pub/mooney/papers/chill-aaai-93.ps.Z
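
Several of the abstracts above (the dissertation, AAAI-94, and AAAI-93 entries) share one architecture: a shift-reduce parser written as a logic program whose nondeterministic operator choices are guarded by induced search-control rules. The toy Prolog sketch below shows only the shape of that architecture; it is not CHILL's parser, the predicate names are invented, and the two control/3 clauses are hand-written stand-ins for the clauses CHILL would learn from a treebank:

    % Toy shift-reduce shell: the state is a stack and an input buffer,
    % and control/3 decides which operator applies next (in CHILL these
    % clauses are the learned search-control rules).
    parse(Words, Analysis) :-
        sr([], Words, Analysis).

    sr([Analysis], [], Analysis).                 % done: one item, empty buffer
    sr(Stack, Buffer, Analysis) :-
        control(Op, Stack, Buffer),               % choose an operator
        apply_op(Op, Stack, Buffer, Stack1, Buffer1),
        sr(Stack1, Buffer1, Analysis).

    % Generic operators of the overly general parsing shell.
    apply_op(shift, Stack, [W|Ws], [W|Stack], Ws).
    apply_op(build_frame, [Obj, Verb, Subj], Ws, [Frame], Ws) :-
        Frame =.. [Verb, agt(Subj), pat(Obj)].    % simple case-role frame

    % Hand-written stand-ins for induced control rules.
    control(shift, _Stack, [_|_]).                % shift while words remain
    control(build_frame, [_, _, _], []).          % reduce once three items are stacked

With this stand-in control knowledge, parse([john, loves, mary], A) binds A to loves(agt(john), pat(mary)); CHILL's contribution is inducing control clauses that make such choices correctly for realistic grammars and corpora.
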
