<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3c.org/TR/1999/REC-html401-19991224/loose.dtd">
<!-- saved from url=(0109)http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html -->
<HTML xmlns="http://www.w3.org/1999/xhtml"><HEAD><TITLE>Cornell's Autonomous PeanutBot</TITLE><!-- This document provides the basis of a semantically structured web page authored in XHTML 1.0 Transitional using established Cornell University naming conventions.-->
<META http-equiv=Content-Type content="text/html; charset=ISO-8859-1">
<META http-equiv=Content-Language content=en-us><LINK href="favicon.ico"
type=image/x-icon rel="shortcut icon"><!-- All layout and formatting should be controlled through Cascading Stylesheets (CSS). The following link tag should appear in the head of every page in the website. see styles/screen.css.--><LINK
media=screen href="Cornell's Autonomous PeanutBot.files/screen.css"
type=text/css rel=stylesheet><LINK media=print
href="Cornell's Autonomous PeanutBot.files/print.css" type=text/css
rel=stylesheet>
<META content="MSHTML 6.00.2900.2873" name=GENERATOR></HEAD>
<BODY class=twocolumn><!-- The following link provides a way for people using text-based browsers and screen readers to skip over repetitive navigation elements so that they can get directly to the content. It is hidden from general users through CSS.-->
<DIV id=skipnav><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#content">Skip
to main content</A> </DIV>
<HR>
<!-- The following div contains the Cornell University logo and search link -->
<DIV id=cu-identity>
<DIV id=cu-logo><A href="http://www.cornell.edu/"><IMG height=75
alt="Cornell University"
src="Cornell's Autonomous PeanutBot.files/cu_logo_unstyled.gif" width=263
border=0></A> </DIV><!-- The search-form div contains a form that allows the user to search either pages or people within cornell.edu directly from the banner. -->
<DIV id=search-form>
<FORM action=/search/ method=get>
<DIV id=search-input><LABEL for=search-form-query>SEARCH:</LABEL> <INPUT
id=search-form-query name=q> <INPUT id=search-form-submit type=submit value=go name=submit> </DIV>
<DIV id=search-filters><INPUT id=search-filters1 type=radio CHECKED value=""
name=tab> <LABEL for=search-filters1>Pages</LABEL> <INPUT id=search-filters2
type=radio value=people name=tab> <LABEL for=search-filters2>People</LABEL> <A
href="http://instruct1.cit.cornell.edu/search/">more options</A>
</DIV></FORM></DIV></DIV>
<HR>
<!-- The header div contains the main identity and main navigation for the site -->
<DIV id=header>
<DIV id=identity>
<H1>PeanutBot, The Audio Homing Robot</H1></DIV>
<DIV id=navigation>
<UL>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#main">Main</A>
</LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#photos">Photos</A>
</LI></UL></DIV></DIV>
<HR>
<DIV id=wrap><!-- The content div contains the main content of the page -->
<DIV id=content><!-- The section-navigation div contains the second level of site navigation. These links appear at the top of the left sidebar of the two-column page. -->
<DIV id=section-navigation>
<UL>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#intro">Introduction</A>
</LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#highlevel">High-Level
Design</A> </LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#hardware">Hardware
Design</A> </LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#software">Software
Design</A> </LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#results">Results</A>
</LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#conclusions">Conclusions</A>
</LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#appendices">Appendices</A>
</LI>
<LI><A
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2007/ai45_hkc2_sbs43/ai45_hkc2_sbs43/index.html#ref">References</A>
</LI></UL></DIV>
<HR>
<!-- The main div contains the main contents of the page. It will be displayed as the wide right column with the beige background. -->
<DIV id=main>
<DIV id=intro>
<H2>Introduction</H2>
<P>Sensing for autonomous vehicles is a growing field, driven by a wide array of
military and reconnaissance applications. The Adaptive Communications and
Signal Processing (ACSP) research group at Cornell specializes in
various aspects of autonomous vehicle control and has previously
examined video sensing for autonomous control. Our goal is to build on that
research by incorporating audio source tracking for autonomous control.
</P>
<P>Our project implements a signal processing system for audio sensing to
control an autonomous vehicle. We are working with ACSP to develop PeanutBot
and advance their research in audio sensor networks. The system has two modes:
autonomous and control. In autonomous mode, the robot detects pulses from a
predetermined set of frequencies and approaches the source. In control mode,
the robot executes commands issued by an administrator on a PC and transmitted
to the robot via an RS-232 serial connection. </P></DIV>
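<P>Control mode behavior like this is often implemented as a small command
dispatcher. The sketch below assumes a hypothetical single-character command
set (forward, back, left, right, stop); the actual protocol PeanutBot uses
over RS-232 is not given in this writeup.</P>

```c
/* Hypothetical single-character command set for control mode; the actual
   protocol used by PeanutBot is not specified here. Speeds are signed:
   positive = forward, negative = reverse. */
typedef struct { int left_speed; int right_speed; } servo_cmd_t;

/* Map one command byte (received over RS-232) to servo speed settings.
   Returns 1 if the byte was a recognized command, 0 otherwise. */
int parse_command(char c, servo_cmd_t *out) {
    switch (c) {
    case 'f': out->left_speed =  1; out->right_speed =  1; return 1; /* forward */
    case 'b': out->left_speed = -1; out->right_speed = -1; return 1; /* reverse */
    case 'l': out->left_speed = -1; out->right_speed =  1; return 1; /* spin left */
    case 'r': out->left_speed =  1; out->right_speed = -1; return 1; /* spin right */
    case 's': out->left_speed =  0; out->right_speed =  0; return 1; /* stop */
    default:  return 0; /* unrecognized byte is ignored */
    }
}
```

<P>On the MCU the command byte would come from the UART receive interrupt;
here the parser is shown host-testable and free of AVR dependencies.</P>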
<DIV class=main-photo-large><IMG alt=Photo1
src="Cornell's Autonomous PeanutBot.files/PeanutPhoto.jpg"> </DIV>
<DIV id=highlevel>
<H2>High-Level Design</H2>
<P>The PeanutBot robot consists of three microphone circuits, three servo
motors, an MCU and a PC. The concept chart of the system and communication
protocols is shown in Figure 1. </P>
<DIV class=main-photo-large><IMG alt="Block Diagram"
src="Cornell's Autonomous PeanutBot.files/high_level_image.gif">
<P class=caption>Figure 1 </P></DIV>
<P>The PC is used to communicate with the MCU in control mode for transmitting
commands. During development, the PC communication was useful for testing,
debugging and verification. A basic block diagram of the system is shown below
in Figure 2. </P>
<DIV class=main-photo-large><IMG alt="High Level Design"
src="Cornell's Autonomous PeanutBot.files/high_level_design.gif">
<P class=caption>Figure 2 </P></DIV>
<P>The three microphones were used to triangulate the angle of the source
relative to the robot. The audio source plays a continuous stream of pulses.
Pulses were chosen over a continuous tone because, instead of detecting phase
difference in the audio signal, our system detects the arrival time of the
signal at a certain amplitude at each microphone. The robot is designed to be
autonomous and is, therefore, not synchronized with the pulse generator. As a
result, the time of flight of each impulse is not available and the robot is
unable to quantify the distance to the source. Instead, the robot advances by a
small predetermined distance and listens for the signal again. To find the sound
source, the robot listens for the arrival of an impulse on any of the three
microphones. Once an impulse has been detected at one of the microphones, the
robot records the microphone data at 10 microsecond intervals for 10
milliseconds. Using this data, the arrival time of the impulse at each
microphone is calculated and the direction of the source is obtained. Once the
angle of the source has been identified, the robot rotates and pursues the
source for a short period, and then promptly resumes triangulation of the signal
to repeat the process. </P>
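<P>The detection-and-capture step described above reduces, in software, to
finding the first sample that crosses an amplitude threshold in each
microphone's capture buffer. A minimal sketch, with the threshold value
assumed (the writeup does not state it):</P>

```c
#include <stddef.h>

/* Sketch of the arrival-time extraction step. The samples array stands in
   for one microphone's 10 ms capture taken at 10 us intervals (1000
   samples). The threshold is an assumed amplitude level; the project's
   actual value is not given. Returns the index of the first sample at or
   above the threshold, or -1 if the pulse never appears. */
int first_arrival(const int *samples, size_t n, int threshold) {
    for (size_t i = 0; i < n; i++)
        if (samples[i] >= threshold)
            return (int)i;
    return -1;
}
```

<P>The differences between the three returned indices, multiplied by the
10 microsecond sample period, give the arrival-time differences used for
triangulation.</P>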
<H4>Background Mathematics</H4>
<P>The three microphones are placed at equal distances (7 inches apart), and
one is designated as the first microphone. To find the location of the sound
source, the differences in the arrival time of the signal at the microphones
are calculated according to the equations shown below in Figure 3. </P>
<DIV class=main-photo-large><IMG alt=triangulation
src="Cornell's Autonomous PeanutBot.files/triangulation.jpg">
<P class=caption>Figure 3 </P></DIV>
<P>To calculate the angle of the source with respect to the front of the car, a
lookup table of arrival times and angles is used. The arrival times in the
table are calculated from the speed of sound at Ithaca's altitude
(343.966 m/s) and the distance between microphone one and each other microphone
along the plane of the sound wave fronts for each angle in the table. The table
maps the time differences t1 and t2 to a specific angle with a resolution of 1
degree. Once the arrival times are observed, the angle whose tabulated t1 and
t2 most closely match the measured time differences is chosen. </P>
<H4>Logical Structure</H4>
<P>PeanutBot has three software state machines: servo control, user control
mode, and autonomous control mode. The robot boots in autonomous mode but
transfers to user-controlled mode when instructions arrive on its serial port.
The selected control mode operates on its data, updates the appropriate servo
variables, and hands control to the servo control state machine, which reads
and acts on those variables and, once finished, returns control to the mode
that called it.</P>
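<P>The handoff described above can be sketched as a top-level loop. The state
names and transitions here are illustrative, not PeanutBot's actual
variables:</P>

```c
typedef enum { CTRL_AUTONOMOUS, CTRL_USER } ctrl_mode_t;

typedef struct {
    ctrl_mode_t mode;             /* boots in autonomous mode */
    int serial_pending;           /* set when a byte arrives on the serial port */
    int servo_left, servo_right;  /* variables read by the servo state machine */
} robot_state_t;

/* One pass of the top-level loop: the active control mode updates the
   servo variables, then hands off to the servo state machine, which
   returns control when the movement step is done. */
void tick(robot_state_t *s) {
    if (s->serial_pending)        /* serial traffic switches to user mode */
        s->mode = CTRL_USER;
    if (s->mode == CTRL_AUTONOMOUS) {
        /* autonomous machine would run triangulation here; placeholder:
           drive forward one step */
        s->servo_left = s->servo_right = 1;
    } else {
        /* user machine would decode the received command here; placeholder:
           stop until commanded */
        s->servo_left = s->servo_right = 0;
    }
    /* servo state machine: read the servo variables, drive the servos,
       then return control to the calling mode */
}
```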
<H4>Hardware / Software Tradeoffs</H4>
<P>The design of the robot involved tradeoffs between hardware and software.
Most notably, a fairly involved circuit conditions the microphone signals
before the software on the MCU processes the data. Although the MCU has an
8-channel A/D converter, significantly more than the 3 channels required for
triangulation, each on-board conversion takes several hundred microseconds and
only one channel can be read at a time. Reading all three microphones on the
MCU would therefore take about 1-2 ms. Since the microphones are positioned
7 inches apart, the sound wave travels from the first microphone to the second
in less time than the first A/D conversion takes to complete. Furthermore,
reading the microphones serially instead of in parallel would add an
asymmetric delay across the microphones, making it difficult to triangulate
the source. Consequently, most of the signal conditioning was done in hardware
to preserve the functionality of the robot.</P>
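<P>A quick back-of-the-envelope check of this tradeoff, assuming roughly 600
microseconds per on-chip conversion (an assumed figure, consistent with the
writeup's "several hundred microseconds" and its 1-2 ms total for three serial
reads):</P>

```c
#define CONV_US   600.0    /* assumed per-channel A/D conversion time, us */
#define SPACING_M 0.1778   /* 7 inch microphone spacing, in metres */
#define SOUND_MS  343.966  /* speed of sound, m/s */

/* Time to read all three microphones one after another on the MCU. */
double serial_read_time_us(void)  { return 3.0 * CONV_US; }

/* Time for the wavefront to cross the 7 inch spacing: about 517 us,
   i.e. less than a single assumed conversion time. */
double wavefront_transit_us(void) { return SPACING_M / SOUND_MS * 1e6; }
```

<P>With these numbers, three serial reads (~1.8 ms) take several times longer
than the wavefront needs to cross the array, which is why the arrival-time
detection had to be moved into hardware.</P>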
<H4>Standards</H4>
<P>The design of the robot conforms to IEEE standards such as the RS-232
standard for serial communication. </P>
Ctrl + -