http://www.cs.jhu.edu/~kasif/
Date: Wed, 20 Nov 1996 23:04:16 GMT
Server: NCSA/1.5.1
Last-modified: Sun, 31 Mar 1996 17:20:42 GMT
Content-type: text/html
Content-length: 12622
<title> Simon Kasif's home page </title><h1> Simon Kasif's home page </h1><!WA0><img align = top src = "http://www.cs.jhu.edu/~kasif/kasif.gif"> <p><hr>
<h1> Research and Teaching </h1>
<pre>
High-Performance Intelligent Systems:
  Computational modelling and control of complex systems
  Machine Learning Systems for data analysis and data mining
  Parallel Intelligent Systems
  Automated Reasoning

General Interests:
  Parallel Computation
  Computational Neuroscience
  Computational Biology
</pre>
<p><H2> <!WA1><a href = "http://www.cs.jhu.edu/~kasif/send-out"> CFP: AAAI Symposium on Learning Complex Behaviors </a> </H2>
<h2> Research Interests </h2>
Intelligent information processing systems are likely to play a most significant role in the 21st century, and the design and analysis of these systems is one of the great scientific challenges we face today. Consider the following realistic possibilities, which are currently the main focus of ongoing work in intelligent systems research around the globe.
<p>An intelligent vision system installed in a car tracks the road and wakes the dozing driver when the car begins to head towards the shoulder. Thousands of lives could be saved annually. Another data-driven system could help to perform precise radioactive therapy on a cancerous tumor in a way that minimizes damage to adjacent cells. An interactive automated system could consider a set of symptoms and warn of a likely heart attack in an emergency room patient. Another medical system can perform sophisticated analysis of a collection of tests, help assess the probability of cancer recurrence, and suggest a schedule for repeated evaluations.
<p>The basic principle driving these systems is that they can LEARN to function from very large corpora of data. What makes this technology possible is the remarkable match between the massive amount of information that can be stored on modern computers and the amazing computational capabilities (speed) of these machines.
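The principle above, inducing behavior from examples rather than hand-coding it, can be illustrated with a minimal nearest-neighbour classifier. This is a purely illustrative sketch, not code from any system described on this page, and the "patient" feature vectors and labels below are invented:

```python
# A 1-nearest-neighbour classifier: the simplest form of "learning to
# function from data". Illustrative only; the data are made up.
import math

def nearest_neighbour(train, query):
    """Return the label of the training point closest to `query`."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    point, label = min(train, key=lambda pl: dist(pl[0], query))
    return label

# Toy labeled data: (feature vector, diagnosis label)
train = [((1.0, 1.0), "low risk"),
         ((1.2, 0.8), "low risk"),
         ((4.0, 4.2), "high risk"),
         ((4.5, 3.9), "high risk")]

print(nearest_neighbour(train, (4.1, 4.0)))  # -> high risk
print(nearest_neighbour(train, (0.9, 1.1)))  # -> low risk
```

With more data the same scheme scales naturally, which is exactly the match between stored information and raw computing speed that the paragraph above describes.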
<p>Our society is currently entering a new period in which massive amounts of heterogeneous information are generated, stored on computers, and readily available for exploration over academic networks, digital libraries, and commercial information services. At the same time, modern networks of computers can perform billions of computation steps per second. This rather unique match of raw computer power and a sea of available information creates many previously unfeasible opportunities for building intelligent systems. Coping with the information explosion, scientific data mining, interactive modelling of unstructured data and complex processes, and intelligent data-driven decision-making under uncertainty are among the major current focus areas in computer science.
<p>One striking example of the use of this computing power was a recent application developed at Johns Hopkins University. A chess end-game analysis program (designed by Lewis Stiller), running on a 65,536-processor machine and using gigabytes of storage, found chess end-games that absolutely require more than 220 moves to win. Chess masters for generations used a rule that declared a draw after only 50 moves. The old rule was dictated by human experience, which could not perform this type of analysis.
<p>A different program, implemented by an interdisciplinary group of researchers from Astronomy and Computer Science, automatically finds stars and galaxies and filters out cosmic rays in images obtained by the Hubble Telescope. This program uses a decision tree induction system implemented by S.
Murthy at the Johns Hopkins University. The program is very fast and can easily process millions of astronomical objects in a short amount of time.
<p>My current research program focuses on high-performance computer systems and efficient algorithms that provide the computational capabilities to support this critical information technology. Our main focus is the design and analysis of algorithms for high-performance intelligent systems. Specifically, we are interested in efficient parallel and sequential algorithms for modelling and controlling complex processes (e.g., computer systems or neural networks); intelligent information retrieval and modelling of large data repositories (which can be used in educational environments); constraint networks and probabilistic networks (which can be used for decision-making under uncertainty, e.g., medical diagnosis); and machine learning. The algorithms designed by our group have been applied to a variety of applications, such as public health, computer vision, computer game playing, astronomy, and several biomedical applications.
<p><h2> Academic Experience </h2>
<pre>
Jun 1991 - present  Associate Professor, Dept. of Computer Science;
                    joint appointment in Cognitive Science since 1989.
July 1994 - Jun 95  Sabbatical Leave: NEC Research Laboratories and
                    Computer Science Department, Princeton University.
Sep 1987 - Jun 91   Assistant Professor, Dept. of Computer Science,
                    The Johns Hopkins University.
Sep 1989 - Jun 91   Assistant Professor, joint appointment,
                    Cognitive Science Center, The Johns Hopkins University.
Sep 1985 - Sep 87   Assistant Professor, Dept. of Electrical Engineering
                    and Computer Science, The Johns Hopkins University.
Jun 1987 - Aug 87   Visiting Scientist, Weizmann Institute of Science.
Jun 1985 - Aug 85   Research Scientist, University of Maryland.
May 1985            Ph.D., Computer Science Department, University of Maryland.
</pre>
<h2> Recent Journal Publications (since 1990) </h2>
<ol>
<li>Kasif, S., On the Parallel Complexity of Discrete Relaxation in Constraint Networks, Artificial Intelligence Journal, pp. 275-286, October 1990.
<li>Delcher, A. and S. Kasif, Efficient Parallel Term Matching and Anti-Unification, Journal of Automated Reasoning, pp. 391-406, 1992.
<li>Kasif, S., S. Banerjee, A. Delcher and G. Sullivan, Some Results on the Complexity of Symmetric Connectionist Networks, Annals of Mathematics and Artificial Intelligence, Nov. 1993, 327-344.
<li>Kasif, S. and A. Delcher, Analysis of Local Consistency in Parallel Constraint Networks, Artificial Intelligence, 307-327, 1994.
<li>Kasif, S., Optimal Parallel Algorithms for Quad-Tree Problems, Journal of Computer Vision and Image Processing, pp. 281-285, May 1994.
<li>Salzberg, S., D. Heath, A. Delcher and S. Kasif, Best Case Analysis of Nearest Neighbours Algorithms, IEEE Transactions on Pattern Analysis and Machine Intelligence, June 1995, 17:6, 599-608.
<li>Heath, D., S. Kasif, S. R. Kosaraju, S. Salzberg and G. Sullivan, Learning Nested Concept Classes with Limited Memory, to appear in the Journal of Experimental and Theoretical AI, 1996.
<li>Heath, D. and S. Kasif, On Voronoi Covers with Applications to Machine Learning, Computational Geometry: Theory and Applications, Nov. 1993, 289-305.
<li>Delcher, A. and S. Kasif, Term Matching on a Mesh-Connected Array of Processors, Annals of Mathematics and Artificial Intelligence, Volume 14, 177-186, 1995.
<li>Murthy, S., S. Kasif and S. Salzberg, A System for Induction of Oblique Decision Trees, Journal of Artificial Intelligence Research, 2:1 (1994), 1-33.
<li>Delcher, A., A. Grove, S. Kasif and J. Pearl, Logarithmic Time Queries and Updates in Probabilistic Networks, to appear, Journal of Artificial Intelligence Research, 1996.
<li>Waltz, D. and S. Kasif, On Reasoning from Data, to appear, Computing Surveys, 1996.
<li>Delcher, A. and S.
Kasif, The Complexity of Incremental Parallel Computations, in review.
<li>Bright, J., S. Kasif and L. Stiller, Memory-Efficient Parallel Algorithms for Bi-Directional Search, in review.
<li>Kasif, S., S. Salzberg, D. Waltz, J. Rachlin and D. Aha, Towards a Framework for Memory-Based Reasoning, in review.
<li>Dobkin, D., D. Gunopoulous, T. Fulton, S. Kasif and S. Salzberg, Induction of Shallow Decision Trees, IEEE Transactions on Pattern Analysis and Machine Intelligence, in review.
</ol>
<h2> Selected Conference Publications (since 1990) </h2>
<ol>
<li>Heath, D., S. Kasif, S. R. Kosaraju, S. Salzberg and G. Sullivan, "Learning Nested Concept Classes with Limited Storage", Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-91), pp. 777-782, 1991.
<li>Salzberg, S., D. Heath, A. Delcher and S. Kasif, "Learning with a Helpful Teacher", Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-91), pp. 705-711, 1991.
<li>Kasif, S. and A. Delcher, "Improved Decision Making in Game Trees: Recovering from Pathology", Proceedings of the National Conference on Artificial Intelligence (AAAI-92), pp. 513-518, July 1992.
<li>Heath, D., S. Kasif and S. Salzberg, "Learning Oblique Decision Trees", Computational Learning Theory and Natural Learning Systems, 1992.
<li>Kasif, S. and A. Delcher, "Analysis of Local Consistency in Parallel Constraint Networks", International Conference on Artificial Intelligence and Vision, pp. 217-231, 1992.
<li>Kasif, S., "Iterative Focusing and Hashing: An Alternative to Alpha-Beta", International Conference on Artificial Intelligence and Vision, pp. 59-72, 1992.
<li>Heath, D., S. Kasif and S. Salzberg, "Learning Oblique Decision Trees", Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI-93), pp. 1002-1007, August 1993.
<li>Murthy, S., S. Kasif, S. Salzberg and R.
Beigel, "OC1: A Randomized Algorithm for Building Oblique Decision Trees", Proceedings of the National Conference on Artificial Intelligence (AAAI-93), pp. 322-327, July 93.
<li>Delcher, A., S. Kasif, H. Goldberg and W. Hsu, "Protein Secondary-Structure Modeling with Probabilistic Networks", International Conference on Intelligent Systems and Molecular Biology, pp. 109-117, 1993.
<li>Delcher, A., S. Kasif, H. Goldberg and W. Hsu, "Application of Probabilistic Causal Trees to Analysis of Protein Secondary Structure" (a short version of the above), Proceedings of the National Conference on Artificial Intelligence, pp. 316-321, July 1993.
<li>Bright, J., S. Kasif and L. Stiller, "Exploiting Algebraic Structure in Parallel State-Space Search", Proc. of the 11th National Conf. on Artificial Intelligence (AAAI-94), July 1994, pp. 1341-1346.
<li>Rachlin, J., S. Kasif, S. Salzberg and D. Aha, "Toward a Better Understanding of Memory-Based Classifiers", Proceedings of the 11th Intern. Conf. on Machine Learning, pp. 242-250, July 1994.
<li>Fulton, T., S. Kasif and S. Salzberg, "Efficient Algorithms for Finding Multi-Way Splits for Decision Trees", JHU TR, December 1993; Proceedings of the 12th Intern. Conf. on Machine Learning, July 1995.
<li>Delcher, A., A. Grove, S. Kasif and J. Pearl, "Logarithmic Time Queries and Updates in Probabilistic Networks", Proceedings of the 1995 Conference on Uncertainty in AI, August 1995.
<li>Dobkin, D., D. Gunopoulous and S. Kasif, "Induction of Low-Depth Decision Trees", International Conference on Mathematics and Artificial Intelligence, January 1996.
<li>Weiss, S., S. Kasif and E. Brill, "Towards a Framework for Adaptive Information Retrieval", AAAI Spring Symposium on Information Retrieval (to appear 1996).
</ol>
<h2> Ph.D. Students </h2>
<ol>
<li> <!WA2><a href = "http://blaze.cs.jhu.edu/delcher/home.html"> Art Delcher, </a> Ph.D. 1989, M.A. in Mathematics, Johns Hopkins University.
Thesis area: Parallel Algorithms for Artificial Intelligence. Currently Full Professor and Chairman, CS Department, Loyola College, and Adjunct Research Faculty, Johns Hopkins Computer Science Department.
<li> <!WA3><a href = " http://mission.amil.jhu.edu/projects/liver"> David Heath, </a> Ph.D. 1992, BS in EE, Cal Tech, Wolman Fellow. Thesis area: Algorithms for Machine Learning. Currently faculty in Johns Hopkins Medical School, Computer-Assisted Radiology.
<li>Lewis Stiller, BS in Mathematics, Johns Hopkins University, Ph.D. May 1995, National Defense Fellowship. Thesis area: Exploiting Symmetry in Parallel Computation. Currently at Berkeley, CA.
<li> <!WA4><a href = "http://sugar.cs.jhu.edu/~trux/"> Truxton Fulton, </a> BS in Computer Science, Cal Tech, Wolman Fellow. Research area: Adaptive Computer Systems, expected 1996.
<li> <!WA5><a href = "http://www.cs.jhu.edu/~weiss/"> Scott Weiss, </a> BS in Computer Science, Carnegie-Mellon University, Wolman Fellow. Research area: Adaptive Information Retrieval, expected 1996.
</ol>
<!WA6><A HREF="http://www.cs.jhu.edu/~kasif/hot.html">Information</A>