📄 http://www.ai.mit.edu/projects/handarm-haptics/manipulation.html
<table border=5 cellspacing=5 cellpadding=5><tr><a name="S2.1"><th bgcolor=steelblue>Robust Grasping in Unstructured Environments</th></a></tr><tr><td align=center><img src="http://www.ai.mit.edu/projects/handarm-haptics/images/picksort.gif" width=478 height=357></td></tr><tr><td>One of our current projects, funded by NASA/JPL, is to develop a fundamental understanding of the problem of combining real-time vision and touch sensor data with robot control, to yield robust, autonomous and semi-autonomous grasping and grasp stabilization. The research is focused on providing conceptual and experimental support for planned and ongoing NASA missions utilizing earth-orbiting and planetary-surface robotics.<P>We have implemented a high-speed active vision system, a multi-processor operating system, and basic algorithms for the acquisition and grasp of stationary spherical and cylindrical objects using coordinated robotic vision, touch sensing, and control. Preliminary experiments on the tracking of moving objects have also been completed. Concurrently, research into an integrated wrist-hand design for performing sensor-guided grasps, and a preliminary design for a next-generation miniature end-effector, are being completed.</td></tr></table></center><p><center><table border=5 cellspacing=5 cellpadding=5><tr><a name="S2.2"><th colspan=2 bgcolor=steelblue>Robotic Catching of Free-Flying Objects</th></a></tr><tr><td colspan=2>Another direction of our research, funded by Fujitsu, Furukawa, and the Sloan Foundation, is to accomplish real-time, robust catching of free-flying objects. We are currently focusing on spherical balls of various sizes.
We are also experimenting with additional objects with different dynamic characteristics, such as sponge balls, cylindrical cans, and paper airplanes.</td></tr><tr><td colspan=2 align=center><img src="http://www.ai.mit.edu/projects/handarm-haptics/images/wam7-2.gif" width=446 height=297></td></tr><tr><td colspan=2>Our system uses low-cost vision processing hardware for simple information extraction. Each camera signal is processed independently on vision boards designed by other members of the MIT AI Laboratory (the <a href="http://www.newtonlabs.com/">Cognachrome Vision Tracking System</a>). These vision boards provide us with the center of area, major axis, number of pixels, and aspect ratio of the color-keyed image. The two Fast Eye Gimbals allow us to locate and track fast, randomly moving objects using "Kalman-like" filtering methods that assume no fixed model for the behavior of the motion. Independent of the tracking algorithms, we use least-squares techniques to fit polynomial curves to prior object location data to determine the future path. With this knowledge in hand, we can calculate a path for the WAM to match trajectories with the object, accomplishing the catch and a smooth object/WAM post-catch deceleration.<P>In addition to the basic least-squares techniques for path prediction, we study experimentally nonlinear estimation algorithms that give "long-term" real-time prediction of the path of moving objects, with the goal of robust acquisition. The algorithms are based on stable on-line construction of approximation networks composed of state-space basis functions localized in both space and spatial frequency. As an initial step, we have studied the network's performance in predicting the path of light objects thrown in air.
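The least-squares path prediction above can be sketched in a few lines. This is an illustrative example, not the lab's actual code: the sample times, initial conditions, and quadratic model of ballistic flight are assumptions chosen for the demo, and the function name `predict_height` is hypothetical.

```python
import numpy as np

def predict_height(times, heights, t_future, degree=2):
    """Least-squares fit of a polynomial to observed (t, y) samples,
    then extrapolation to a future time, as in the path prediction
    described above (here for the vertical coordinate only)."""
    coeffs = np.polyfit(times, heights, degree)   # least-squares fit
    return np.polyval(coeffs, t_future)           # extrapolated height

# Simulated ballistic flight: y(t) = y0 + v0*t - 0.5*g*t^2
g, y0, v0 = 9.81, 1.0, 6.0
t_obs = np.linspace(0.0, 0.3, 10)                 # early observations only
y_obs = y0 + v0 * t_obs - 0.5 * g * t_obs**2
y_hat = predict_height(t_obs, y_obs, 0.8)         # predict height at t = 0.8 s
y_true = y0 + v0 * 0.8 - 0.5 * g * 0.8**2
print(abs(y_hat - y_true))                        # tiny for noise-free data
```

A degree-2 fit recovers a noise-free ballistic arc essentially exactly; with real vision data the same fit averages out measurement noise, at the cost of assuming the polynomial model holds over the prediction horizon.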
Further applications may include motion prediction of objects rolling, bouncing, or breaking up on rough terrain.<P>Some recent successful results for the application of this network have been obtained in the catching of sponge balls and even paper airplanes!</td></tr><tr><td align=center><a href="http://www.ai.mit.edu/projects/handarm-haptics/catching.html"><img src="http://www.ai.mit.edu/projects/handarm-haptics/images/wamcatch.gif" width=166 height=250></a><br>Click to view <a href="http://www.ai.mit.edu/projects/handarm-haptics/catching.html">WAM catching</a>.</td><td align=center><a href="http://www.ai.mit.edu/projects/handarm-haptics/airplane.html"><img src="http://www.ai.mit.edu/projects/handarm-haptics/images/airp-catch.gif" width=168 height=250></a><br>Click to view <a href="http://www.ai.mit.edu/projects/handarm-haptics/airplane.html">WAM airplane catching</a>.</td></tr><tr><td>© 1995 Photo courtesy of <a href="http://www.scinetphotos.com/">Hank Morgan</a></td></tr></table></center><p><hr><center><a href="#S0">[Introduction]</a> <a href="#S1">[Our Robots]</a> <a href="#S2">[Our Research]</a> <a href="#S3">[References]</a></center><hr><!---------------------------------------><!------------ References ---------------><!---------------------------------------><center><a name="S3"><h1>Partial List of References</h1></a></center><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/aiaa96-abst.html"><I>Autonomous Rock Acquisition</I></A>, D.A. Theobald, W.J. Hong, A. Madhani, B. Hoffman, G. Niemeyer, L. Cadapan, J.J.-E. Slotine, J.K.
Salisbury, Proceedings of the AIAA Forum on Advanced Development in Space Robotics, Madison, Wisconsin, August 1-2, 1996.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/iser95-abst.html"><I>Experiments in Hand-Eye Coordination Using Active Vision</I></A>, W. Hong and J.J.E. Slotine, Proceedings of the Fourth International Symposium on Experimental Robotics, ISER'95, Stanford, California, June 30-July 2, 1995.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/hong-msthesis-abst.html"><I>Robotic Catching and Manipulation Using Active Vision</I></A>, W. Hong, M.S. Thesis, Department of Mechanical Engineering, MIT, September 1995.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/neurocomp95-abst.html"><I>Space-Frequency Localized Basis Function Networks for Nonlinear System Estimation and Control</I></A>, M. Cannon and J.J.E. Slotine, Neurocomputing, 9(3), 1995.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/adaptvis-abst.html"><I>Adaptive Visual Tracking and Gaussian Network Algorithms for Robotic Catching</I></A>, H. Kimura and J.J.E. Slotine, DSC-Vol. 43, Advances in Robust and Nonlinear Control Systems, Winter Annual Meeting of the ASME, Anaheim, CA, pp. 67-74, November 1992.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/robotcatch-abst.html"><I>Experiments in Robotic Catching</I></A>, B.M. Hove and J.J.E. Slotine, Proceedings of the 1991 American Control Conference, Vol. 1, Boston, MA, pp.
380-385, June 1991.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/perfadapt-abst.html"><I>Performance in Adaptive Manipulator Control</I></A>, G. Niemeyer and J.J.E. Slotine, International Journal of Robotics Research 10(2), December 1988.<p><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <i>Preliminary Design of a Whole-Arm Manipulation System (WAM)</i>, J.K. Salisbury, W.T. Townsend, B.S. Eberman, D.M. DiPietro, Proceedings 1988 IEEE International Conference on Robotics and Automation, Philadelphia, PA, April 1988.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <i>The Effect of Transmission Design on Force-Controlled Manipulator Performance</i>, W.T. Townsend, Ph.D. Thesis, Department of Mechanical Engineering, MIT, April 1988. See also MIT AI Lab Technical Report 1054.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <i>Whole Arm Manipulation</i>, J.K. Salisbury, Proceedings 4th International Symposium on Robotics Research, Santa Cruz, CA, August 1987.<P><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <a href="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/fegthesis-abst.html"><i>Design and Control of a Two-Axis Gimbal System for Use in Active Vision</i></a>, N. Swarup, S.B. Thesis, Dept. of Mechanical Engineering, MIT, Cambridge, MA, 1993.<p><IMG SRC="http://www.ai.mit.edu/gifs/ball.red.gif" ALT="*"> <A HREF="http://www.ai.mit.edu/projects/handarm-haptics/abstracts/portvis-abst.html"><I>A High Speed Low-Latency Portable Vision Sensing System</I></A>, A.
Wright, SPIE, September 1993.<P><hr><center><a href="#S0">[Introduction]</a> <a href="#S1">[Our Robots]</a> <a href="#S2">[Our Research]</a> <a href="#S3">[References]</a></center><hr><p><i>Maintainer: <a href="mailto:jesse@ai.mit.edu">jesse@ai.mit.edu</a>, Comments to: <a href="mailto:wam@ai.mit.edu">wam@ai.mit.edu</a><br>Last Updated: Mon Aug 26 15:18:36 EDT 1996, jesse@ai.mit.edu<br>© 1996, All rights reserved.</i></body>