protocol. </P></DIV>
<DIV id=hardware>
<H2>Hardware Design</H2>
<P>The hardware design consists of three microphone circuits, a microcontroller, 
a robot frame, three servo motors, three omni-directional wheels, a 9V battery 
for the MCU and circuitry, and an AA-battery pack for the servos. Each microphone 
circuit consists of amplifiers, filters, and comparators. A diagram of the 
hardware and software is shown below in Figure 4.</P>
<DIV class=main-photo-large><IMG alt="Mid Level Design" 
src="Cornell's Autonomous PeanutBot.files/mid_level_diagram.gif"> 
<P class=caption>Figure 4 </P></DIV>
<DIV id=figure5>
<H4>Microphone Circuitry</H4>
<P>To control the level of the microphone output, resistors are used to center 
the signal around 1.5V. This level-shifted output is amplified by an 
operational amplifier and then passed through a passive lowpass filter. It is 
then put through a half-wave rectifier, with a capacitor added to bridge the gaps 
between the positive swings that the rectifier leaves behind. That output is then 
passed through an analog comparator to discretize the signal for reading on a 
port pin of the MCU. The discrete output of each microphone circuit is 
approximately 4V when sound is detected and 0V when no sound is detected. All three 
microphone circuit outputs are read in parallel using PORTA on the 
microcontroller. Each output also drives an LED circuit to ground, which indicates 
for debugging when a particular microphone circuit detects sound. A schematic 
of the circuit is shown in Figure 5.</P>
<DIV class=main-photo-large><IMG alt="Microphone Schematic" 
src="Cornell's Autonomous PeanutBot.files/microphone_circuit.gif"> 
<P class=caption>Figure 5</P></DIV></DIV>
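<P>As a minimal illustration of how such a discretized reading might be taken, the 
sketch below polls the three comparator outputs in parallel through PORTA. The pin 
assignments (PA0-PA2) and function names are assumptions made for this example, not 
the project's documented mapping.</P>
<PRE>
/* Minimal sketch: read the three comparator outputs in parallel on PORTA.
   The PA0-PA2 pin assignment is an assumption for illustration only. */
#include <avr/io.h>

#define MIC_MASK 0x07                /* PA0, PA1, PA2 */

void mic_init(void)
{
    DDRA  &= ~MIC_MASK;              /* comparator outputs are inputs to the MCU */
    PORTA &= ~MIC_MASK;              /* no pull-ups; the comparators drive the pins */
}

/* Returns a 3-bit value; bit n is 1 while microphone n hears the pulse. */
unsigned char mic_read(void)
{
    return PINA & MIC_MASK;
}
</PRE>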
<H4>Servo Circuitry</H4>
<P>The three servos were connected to their own 6V power supply, which was 
filtered using a small capacitor. Each servo was connected to its own pin on 
PORTB of the MCU and was controlled by a PWM signal that actuated the servo as 
desired. The servo circuit is shown in Figure 6.</P>
<DIV class=main-photo-large><IMG alt="Servo Schematic" 
src="Cornell's Autonomous PeanutBot.files/servo_circuit.gif"> 
<P class=caption>Figure 6</P></DIV>
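<P>The sketch below shows, in rough form, how a single ~1.5 ms control pulse might be 
produced on one PORTB pin. The clock frequency, pin choice, and busy-wait timing are 
assumptions for illustration; the actual firmware paces its pulses from the 
interrupt-driven hardware timer described in the software section.</P>
<PRE>
/* Rough sketch: one ~1.5 ms servo control pulse on an assumed pin (PB0).
   F_CPU and the busy-wait are illustrative; the project times its pulses
   with an interrupt-driven hardware timer instead. */
#define F_CPU 16000000UL             /* assumed clock frequency */
#include <avr/io.h>
#include <util/delay.h>

#define SERVO0 PB0                   /* assumed pin assignment */

void servo_init(void)
{
    DDRB |= (1 << SERVO0);           /* servo control line is an output */
}

void servo_pulse_neutral(void)
{
    PORTB |= (1 << SERVO0);          /* raise the control line */
    _delay_ms(1.5);                  /* hold it for ~1.5 ms */
    PORTB &= ~(1 << SERVO0);         /* drop it; repeat roughly every 20 ms */
}
</PRE>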
<H4>Robot Frame Design</H4>
<P>The robot's mechanical frame comes as a kit that requires minor assembly of 
its provided pieces and connectors. The omni-directional wheels were provided 
separately from the kit and proved to be slightly too large for both the frame and 
the servos that came with the kit. To make the wheels fit, the servos were mounted 
lower than intended, giving each wheel enough clearance from the frame. To mount 
each wheel on its servo's powered axle, we purchased flame-retardant nylon tubing, 
which provided a tighter fit between the servo axle and the inside of the wheel. The 
inside of the wheel was also lined with cement to ensure a tight fit. 
The battery packs were mounted to the bottom of the robot frame with duct tape, 
and the circuits were all placed on top of the robot and held in place by wire 
tension.</P>
<H4>Hardware Stuff That Didn't Work</H4>
<P>Many routes were experimented with before the final hardware design was found. 
The servo control circuitry was simple and therefore did not require multiple 
revisions; however, the analog filter circuit for the microphones went through 
multiple revisions of both design and fabrication. A bandpass filter was originally 
implemented around our op-amp to pass a 2 kHz audio pulse from the microphone, a 
frequency selected based on its wavelength relative to the dimensions of the 
PeanutBot. The bandpass was implemented on the same op-amp as our signal amplifier 
to save fabrication space and complexity; however, its protoboard design gave 
unreliable results and poor attenuation of frequencies outside the passband. 
Therefore, additional highpass and lowpass filters were added to the output of the 
op-amp bandpass/amplifier combination to form a 2-pole bandpass filter with greater 
attenuation outside the passband. However, the quality of the combined filter was 
still lacking, and the signal attenuation was too great. Adding another amplifier 
stage to raise the signal amplitude to the desired level was unattractive because 
of the added fabrication complexity. Next, the amplifier circuit was separated from 
the filter circuitry and passive filters were used instead. However, the highpass 
filter was still attenuating the signal heavily at all frequencies rather than 
filtering selectively. Therefore, the design was changed to its current form, 
lowering the target frequency to the sub-1kHz range and using only a passive 
low-pass filter with the op-amp amplifier.</P>
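<P>For reference, the cutoff frequency of a single passive RC low-pass stage is 
f<SUB>c</SUB> = 1/(2πRC). The component values here are purely illustrative, not the 
ones used on the board: with R = 1.6 kΩ and C = 0.1 µF, f<SUB>c</SUB> ≈ 
1/(2π · 1600 Ω · 0.1 µF) ≈ 1 kHz, and increasing either R or C pushes the cutoff 
further below 1 kHz.</P></DIV>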
<DIV id=software>
<H2>Software Design</H2>
<P>The MCU software has three state machines that control the servos, user 
control mode, and autonomous control mode. During execution, the system uses an 
interrupt-driven hardware timer, interrupt-driven communication with the PC, and 
input from the microphones to update the state machines. The MCU continuously 
loops over code that updates the state machines and pulses the servos as needed 
to rotate the robot and move it forward. The servo machine has four states: 
Idle, Waiting, Rotating, and Forward. When the robot moves, it transitions from 
the Idle state to the Rotating or Forward state and pulses the servos for 1.5 ms; 
it then transitions from the Rotating or Forward state to the Waiting state (a 
rough sketch of this machine appears below). When the system is in user control 
mode, it uses an interrupt-driven serial communication protocol to receive user 
commands and send debugging information to the terminal. The code was designed 
for easy integration into a wireless mesh network; a preprocessor flag 
incorporates the optional flexibility of wireless PDA control while still allowing 
debugging via HyperTerm. The servo code and most of the user code were derived from <A 
href="http://instruct1.cit.cornell.edu/courses/ee476/FinalProjects/s2006/jzs3_da65/jzs3_da65/">SearchBot</A>. 
The format for a valid input to control mode is described below in Table 1.</P>
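<P>The sketch below illustrates those four servo states; the transition details are 
simplified assumptions, and the actual pulse timing is handled by the interrupt-driven 
timer.</P>
<PRE>
/* Rough sketch of the four servo states (Idle, Waiting, Rotating, Forward).
   Transitions are simplified assumptions; the real firmware paces the ~1.5 ms
   pulses and the return to Idle from its timer interrupt. */
typedef enum { SERVO_IDLE, SERVO_WAITING, SERVO_ROTATING, SERVO_FORWARD } servo_state_t;

static servo_state_t servo_state = SERVO_IDLE;

void servo_update(int want_rotate, int want_forward)
{
    switch (servo_state) {
    case SERVO_IDLE:                      /* a move request starts a pulse */
        if (want_rotate)       servo_state = SERVO_ROTATING;
        else if (want_forward) servo_state = SERVO_FORWARD;
        break;
    case SERVO_ROTATING:                  /* pulse the servos for ~1.5 ms ... */
    case SERVO_FORWARD:
        /* ...drive the appropriate PORTB pins here... */
        servo_state = SERVO_WAITING;      /* ...then wait out the servo period */
        break;
    case SERVO_WAITING:                   /* the timer interrupt returns us to Idle */
        break;
    }
}
</PRE>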
<TABLE cellSpacing=0 cellPadding=5 border=1>
  <THEAD>
  <TR>
    <TH>Control</TH>
    <TH>Rotation Angle</TH>
    <TH>Distance</TH></TR></THEAD>
  <TBODY>
  <TR class=row1>
    <TD>0 to 2</TD>
    <TD>-180&deg; to 180&deg;</TD>
    <TD>-300 to 300</TD></TR></TBODY></TABLE>
<P>The control states are described in Table 2 below: </P>
<TABLE cellSpacing=0 cellPadding=5 border=1>
  <THEAD>
  <TR>
    <TH>Value</TH>
    <TH>Mode</TH>
    <TH>Function</TH></TR></THEAD>
  <TBODY>
  <TR class=row1>
    <TD>0</TD>
    <TD>User Control</TD>
    <TD>Rotate the given angle, then move forward the given distance</TD></TR>
  <TR class=row2>
    <TD>1</TD>
    <TD>Autonomous Control</TD>
    <TD>Track and home onto the audio pulse</TD></TR>
  <TR class=row1>
    <TD>2</TD>
    <TD>Update Servo Scaling</TD>
    <TD>Modify the servo scaling factors to increase or decrease the robot's 
      responsiveness</TD></TR></TBODY></TABLE>
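<P>For illustration, the hedged sketch below shows how a command in the Table 1 format 
might be validated and dispatched according to Table 2. The whitespace-separated text 
format, the range checks, and the handler comments are assumptions, not the project's 
exact serial protocol.</P>
<PRE>
/* Hedged sketch: parse "control angle distance" per Table 1 and dispatch per
   Table 2. The text format and checks are assumptions for illustration. */
#include <stdio.h>

void handle_command(const char *line)
{
    int control, angle, distance;

    if (sscanf(line, "%d %d %d", &control, &angle, &distance) != 3)
        return;                                     /* malformed input */
    if (control < 0 || control > 2)        return;  /* Table 1: control 0 to 2 */
    if (angle < -180 || angle > 180)       return;  /* Table 1: -180 to 180 degrees */
    if (distance < -300 || distance > 300) return;  /* Table 1: -300 to 300 */

    switch (control) {
    case 0: /* user control: rotate by angle, then move forward by distance */ break;
    case 1: /* autonomous control: track and home in on the audio pulse */     break;
    case 2: /* update the servo scaling factors to tune responsiveness */      break;
    }
}
</PRE>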
<P>The heart of the software is the autonomous mode. The autonomous mode has 
five states: Stabilize, Listen, Locate, Move, and Done. When the robot enters 
autonomous mode, it enters the Stabilize state, which waits for silence on all 
three microphones to ensure that the robot only listens to the start of an audio 
pulse, and not the middle or end of one. Once the robot hears silence, it 
transitions to the Listen state. In the Listen state, the MCU constantly samples 
the microphone port pins until the start of an audio pulse is detected. Once a 
pulse is detected, the robot records the next 10 ms of audio and determines the 
time of the first pulse on each microphone. If only one or two of the microphones 
hear the pulse, the sample is discarded. The robot repeats this sampling process 
until it has collected a total of four consistent pulses. After the data from all 
four audio pulses are sampled, the machine transitions into the Locate state. In 
the Locate state the robot averages the four time samples. The three average 
microphone timestamps are then used as indexes into a Matlab-generated lookup 
table stored in flash memory. The mathematics in the Matlab script are described 
above. The lookup table is used to calculate the direction of the source. The 
robot then records the result and transitions to the Move state. In the Move 
state, the robot updates the servo states to rotate by the calculated angle and 
move forward by 20 cm.</P>
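<P>A rough sketch of this five-state loop is shown below. The state names and the 
four-pulse average follow the description above, but the helper routines and the 
lookup-table indexing are illustrative assumptions.</P>
<PRE>
/* Rough sketch of the autonomous mode. Helper routines are hypothetical and
   assumed to exist elsewhere in the firmware. */
typedef enum { STABILIZE, LISTEN, LOCATE, MOVE, DONE } auto_state_t;

#define NUM_PULSES 4                      /* four consistent pulses are averaged */

int  all_mics_silent(void);               /* hypothetical helpers */
int  capture_pulse(unsigned int t_sum[3]);/* adds first-pulse times; 0 if a mic missed it */
int  direction_lookup(unsigned int t0, unsigned int t1, unsigned int t2);
void set_heading(int angle);              /* rotate request for the servo machine */
void step_forward(void);                  /* ~20 cm forward request */

static auto_state_t auto_state = STABILIZE;
static unsigned int t_sum[3];             /* summed first-pulse times per microphone */
static unsigned char pulses;

void autonomous_update(void)
{
    switch (auto_state) {
    case STABILIZE:                       /* wait for silence to catch a pulse start */
        if (all_mics_silent()) {
            pulses = 0;
            t_sum[0] = t_sum[1] = t_sum[2] = 0;
            auto_state = LISTEN;
        }
        break;
    case LISTEN:                          /* keep a sample only if all three mics heard it */
        if (capture_pulse(t_sum) && ++pulses == NUM_PULSES)
            auto_state = LOCATE;
        break;
    case LOCATE:                          /* average, then index the flash lookup table */
        set_heading(direction_lookup(t_sum[0] / NUM_PULSES,
                                     t_sum[1] / NUM_PULSES,
                                     t_sum[2] / NUM_PULSES));
        auto_state = MOVE;
        break;
    case MOVE:                            /* rotate to the bearing, step forward */
        step_forward();
        auto_state = STABILIZE;           /* Done is reserved for the future check below */
        break;
    case DONE:                            /* remain here until the user resets the robot */
        break;
    }
}
</PRE>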
<P>In the future, code should be written that analyzes the last several moves 
to determine whether the robot's net movement is negligible. If there is no net 
movement, the robot would transition into the Done state and remain there until 
reset by the user. Otherwise, if the robot has made progress towards the source, 
it returns to the Stabilize state and the process repeats.</P>
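<P>A hedged sketch of that proposed check appears below; it is not implemented in the 
current firmware, and the history length, units, and threshold are assumptions.</P>
<PRE>
/* Hedged sketch of the proposed (unimplemented) termination check: sum the
   last few commanded displacements and report when the net movement is
   negligible. The history length and threshold are assumptions. */
#include <math.h>

#define HISTORY    5                     /* number of recent moves to examine */
#define MIN_NET_CM 10.0                  /* below this, treat movement as negligible */

static double dx[HISTORY], dy[HISTORY];
static unsigned char next;

void record_move(double heading_rad, double dist_cm)
{
    dx[next] = dist_cm * cos(heading_rad);   /* heading measured in the room frame */
    dy[next] = dist_cm * sin(heading_rad);
    next = (next + 1) % HISTORY;
}

int net_movement_negligible(void)            /* true: transition to the Done state */
{
    double sx = 0.0, sy = 0.0;
    for (unsigned char i = 0; i < HISTORY; i++) { sx += dx[i]; sy += dy[i]; }
    return sqrt(sx * sx + sy * sy) < MIN_NET_CM;
}
</PRE>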
<H4>Software Stuff That Didn't Work</H4>
<P>Several designs of the software were considered before the final design was 
implemented. One failed implementation called for very simple microphone circuits 
that would be sampled continuously by a multi-channel A/D converter. However, 
because the Atmel Mega32 has a single A/D converter multiplexed across its 
channels, it cannot sample three inputs truly simultaneously, so this design was 
not feasible. Additionally, due to variation in the microphone circuits, the 
microphones needed to be sampled multiple times and averaged. Furthermore, 
outliers in the samples had to be ignored to prevent them from skewing the 
average.</P>
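<P>A hedged sketch of averaging repeated timestamp samples while ignoring outliers is 
shown below; the simple "distance from the mean" test and its threshold are 
assumptions, not the project's exact method.</P>
<PRE>
/* Hedged sketch: average repeated timestamp samples, ignoring outliers.
   The deviation test and threshold are assumptions for illustration. */
#define OUTLIER_TICKS 50                 /* assumed maximum deviation from the mean */

unsigned int robust_average(const unsigned int s[], unsigned char n)
{
    unsigned long sum = 0;
    unsigned char i, kept = 0;
    unsigned int mean;

    if (n == 0) return 0;
    for (i = 0; i < n; i++) sum += s[i]; /* first pass: plain mean */
    mean = (unsigned int)(sum / n);

    sum = 0;
    for (i = 0; i < n; i++) {            /* second pass: drop far-off samples */
        unsigned int d = (s[i] > mean) ? s[i] - mean : mean - s[i];
        if (d <= OUTLIER_TICKS) { sum += s[i]; kept++; }
    }
    return kept ? (unsigned int)(sum / kept) : mean;
}
</PRE>
</DIV>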
<DIV id=results>
<H2>Results</H2>
<P>Testing shows that the hardware of the project works properly. The robot can 
be commanded to rotate and move via serial in the user controlled mode, and it 
properly detects audio stimuli in the autonomous control mode. The direction the 
robot moves in autonomous mode is sometimes inconsistent with the direction of 
the audio source, possibly due to audio reflections or inconsistent readings from 
the microphones. The microphone readings are output to the serial port if a 
"debug" mode is enabled in the software, and they show that the robot's three 
microphones typically detect sound when they are expected to. However, at times 
the robot still reacts unpredictably and moves in an unexpected direction. 
Consistency has improved slightly since a four-reading average was implemented, 
and it improves further when the stimulus is loud and clear enough that all three 
microphones detect it cleanly. </P>
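<P>As a sketch of how such debug output might be produced, the fragment below prints 
the three microphone bits over the serial port only when a compile-time flag is set, 
in the spirit of the preprocessor switch mentioned in the software section. The flag 
name and printf-style output are assumptions.</P>
<PRE>
/* Hedged sketch: compile-time guarded debug printout of the three microphone
   readings. The DEBUG flag and printf-style serial output are assumptions. */
#include <stdio.h>

#define DEBUG 1

void report_mics(unsigned char mics)     /* mics: 3-bit reading from PORTA */
{
#if DEBUG
    printf("mic0=%u mic1=%u mic2=%u\r\n",
           (unsigned)(mics & 1), (unsigned)((mics >> 1) & 1), (unsigned)((mics >> 2) & 1));
#endif
}
</PRE>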
<P>The analog circuit requires time to settle between audio pulses while its 
capacitors discharge. Therefore, a two-second wait is implemented between sets of 
readings. Additionally, averaging four readings increases the time needed to 
locate the audio source. </P>
<P>Safety is ensured by an accessible power switch on the side of the robot 
chassis which disables power to both the servos and the MCU. The user may put the 
robot in user control mode at any time, as the communication is implemented via 
interrupts. Our robot's primary interface with the administrator is via the PC, 
and as a result there are no interface design considerations for people with 
special needs. We do not have any intellectual property issues, since we designed 
the system and code except for the wireless code, which is used with permission 
from the ACSP. Lastly, we operate within FCC regulations since our communication 
occurs between two FCC-compliant devices. Additionally, the robot does not 
interfere with other people's designs, as there is no RF communication and we 
produce minimal electrical and acoustic noise. The robot will, however, react to 
loud audio pulses generated by other people's designs. </P>
<DIV id=conclusions>
<H2>Conclusions</H2>
<P>The microphone data observed through the serial port to the PC meets the 
project expectations when the audio source is relatively close and the volume is 
high. However, the sensitivity of the microphones and the analog circuit did not 
meet our expectations. When the volume of the audio signal is too low, the 
results are somewhat unreliable. Additionally, the capacitors used in the 
circuit proved to have a large tolerance (~20%). The design of the circuit 
was altered to minimize this effect, but the error was not eliminated. 
Consequently, the capacitors limited the filtering of the audio signal and 
therefore the quality of the inputs provided to the MCU. This cascaded through 
the design and could be causing some of the unexpected movements the robot 
occasionally makes. Future designs could take this into account and either 
further reduce the number of capacitors or order higher-quality capacitors 
with tighter tolerances. However, since the user controlled mode does not 
rely on the capacitors to determine the robot's response, it responds 
excellently to the commands it receives and fully meets expectations. 
Another solution to the circuit error problem would be to analyze the 
signal in software; this requires an MCU with an A/D converter that can 
sample three inputs simultaneously. Since the design is derived from the 
SearchBot project from Spring 2006, the source code for the control modes and 
the servo control acted as a shell for the PeanutBot code.</P>
<H4>Ethical Considerations</H4>
<P>The project has been designed with the IEEE Code of Ethics in mind. As 
required, the project team accepts all responsibility for issues concerning the 
safety, health and welfare of the public, and the project team is responsible 
for disclosing factors that might endanger the public or the environment. Since 
the project involves an autonomous control mode where the robot is not under 
direct human control, this is an ever-present consideration. PeanutBot also 
strives to avoid real or perceived conflicts of interest whenever possible, and 
to disclose them to affected parties when they do exist. The project team has 
strived to be honest and realistic in stating claims or estimates based on 
available data. The results of testing have been presented in an unaltered 
manner. The team has not been subject to bribery, and will not be involved in 
bribery in the future. The team strives to improve the understanding of 
technology, its appropriate application, and potential consequences by 
understanding the military and reconnaissance applications of the robot and 
