⭐ 虫虫下载站

📄 low.c

📁 Carnegie Mellon University robot simulation software (installed under Red Hat Linux 9)
💻 C
📖 Page 1 of 4
//
// This Program is provided by Duke University and the authors as a service to the
// research community. It is provided without cost or restrictions, except for the
// User's acknowledgement that the Program is provided on an "As Is" basis and User
// understands that Duke University and the authors make no express or implied
// warranty of any kind. Duke University and the authors specifically disclaim any
// implied warranty of merchantability or fitness for a particular purpose, and make
// no representations or warranties that the Program will not infringe the
// intellectual property rights of others. The User agrees to indemnify and hold
// harmless Duke University and the authors from and against any and all liability
// arising out of User's use of the Program.
//
// low.c
// Copyright 2005 Austin Eliazar, Ronald Parr, Duke University
//

#include "low.h"
#include "mt-rand.h"

TOdo odometry;

//
// Error model for motion
// Note that the var terms are really the standard deviations. See our paper on
// learning motion models for mobile robots for a full explanation of the terms.
// This model is for our ATRV Jr on a carpeted surface, at a specific period in
// time (and thus a specific state of repair). Do not take this model as anything
// indicative of a model you should expect on your own robot.
// For a more modular design, you may find it useful to move these definitions out to
// the file "ThisRobot.c"
//
double meanC_D = 0.0;
double meanC_T = 0.0;
double varC_D = 0.2;
double varC_T = 0.2;

double meanD_D = 1.0;
double meanD_T = 0.0;
double varD_D = 0.4;
double varD_T = 0.2;

double meanT_D = 0.0;
double meanT_T = 1.0;
double varT_D = 0.2;
double varT_T = 0.4;

// Number of passes to use to cull good particles
#define PASSES 9

// Maximum error we will allow for a trace in grid squares. Basically, this is the
// level of "background noise", or the base chance that something weird and completely
// random happened with the sensor, not conforming to our observation model.
#define MAX_TRACE_ERROR exp(-24.0/LOW_VARIANCE)

// A constant used for culling in Localize
#define WORST_POSSIBLE -10000

// Used for recognizing the format of some data logs.
#define LOG 0
#define REC 1
#define USC 2
int FILE_FORMAT;

carmen_FILE *readFile;
carmen_logfile_index_p logfile_index;

//
// Structures
//

// A sample is a lot like a short-lived particle. More samples are generated than particles,
// since we are certain that most of the generated samples will not get resampled. The basic
// difference is that since samples are not expected to be resampled, they don't need an entry
// in the ancestry tree, nor do they need to update the map.
struct TSample_struct {
  // The position of the sample, with theta being the facing angle of the robot.
  double x, y, theta;
  // The incremental motion which this sample moved during this iteration.
  // D is the major axis of lateral motion, which is along the average facing angle during this time step
  // C is the minor axis, which is rotated +pi/2 from D
  // T is the angular change in facing angle.
  double C, D, T;
  // The current probability of this sample, given the observations and this sample's map
  double probability;
  // The index into the array of particles, indicating the parent particle that this sample was
  // resampled from. This is mostly used to determine the correct map to use when evaluating the sample.
  int parent;
};
typedef struct TSample_struct TSample;

// The number of iterations between writing out the map as a png. 0 is off.
int L_VIDEO = 0;

// Every particle needs a unique ID number. This stack keeps track of the unused IDs.
int cleanID;
int availableID[ID_NUMBER];

// We generate a large number of extra samples to evaluate during localization, much larger
// than the number of true particles.
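The particle-ID bookkeeping declared above (`cleanID`, `availableID`) is a plain free-list stack: every unused ID sits in the array, and creating or retiring a particle pops or pushes one. As a minimal, self-contained sketch of how such a stack behaves (the names `init_ids`, `acquire_id`, `release_id`, and `ID_COUNT` are illustrative, not part of low.c):

```c
#include <assert.h>

#define ID_COUNT 8               /* illustrative stand-in for ID_NUMBER */

static int available[ID_COUNT];  /* unused IDs, stack storage */
static int top;                  /* number of IDs currently free */

/* Fill the stack: initially every ID is unused. */
static void init_ids(void) {
    for (int i = 0; i < ID_COUNT; i++)
        available[i] = i;
    top = ID_COUNT;
}

/* Pop an unused ID for a newly created particle. */
static int acquire_id(void) {
    assert(top > 0);             /* must not run out of IDs */
    return available[--top];
}

/* Push an ID back when its particle's ancestry entry is freed. */
static void release_id(int id) {
    assert(top < ID_COUNT);
    available[top++] = id;
}
```

Acquire pops from the top and release pushes back, so IDs are reused in LIFO order, which keeps the set of live IDs dense.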
// We store the samples that are being localized over in newSample, rather than keep a true
// particle for each.
TSample newSample[SAMPLE_NUMBER];

// In order to compute the amount of perceived motion from the odometry, the last odometry
// readings are recorded. The actual perceived movement is the current odometry readings
// minus these recorded 'last' readings.
double lastX, lastY, lastTheta;

// No. of children each particle gets, based on random resampling
int children[PARTICLE_NUMBER];

// savedParticle is where we store the current set of samples which were resampled, before we have
// created an ID and an entry in the ancestry tree for each one.
TParticle savedParticle[PARTICLE_NUMBER];
int cur_saved_particles_used;

// Keeps track of what iteration the SLAM process is currently on.
int curGeneration;

// Stores the most recent set of laser observations.
TSense sense;

// This array stores the color values for each grid square when printing out the map. For some reason,
// moving this out as a global variable greatly increases the stability of the code.
unsigned char map[MAP_WIDTH][MAP_HEIGHT];

FILE *diffFile;

//
// AddToWorldModel
//
// sense - the current laser observations
// particleNum - the index into the particle array to the entry that is making additions to the map
//
void AddToWorldModel(TSense sense, int particleNum)
{
  int j;

  // Run through each point that the laser found an obstruction at
  for (j = 0; j < SENSE_NUMBER; j++)
    // Normalize readings relative to the pose of the current assumed position
    LowAddTrace(l_particle[particleNum].x, l_particle[particleNum].y, sense[j].distance,
                (sense[j].theta + l_particle[particleNum].theta),
                l_particle[particleNum].ancestryNode->ID, (sense[j].distance < MAX_SENSE_RANGE));
}

//
// CheckScore
//
// Determine the fitness of a laser endpoint
//
// sense - the set of all current laser observations
// index - the specific number of the laser observation that we are going to score
// sampleNum - the index into the sample array for the sample we are currently concerned with
//
// Returns the unnormalized posterior for the current particle
//
inline double CheckScore(TSense sense, int index, int sampleNum)
{
  double a;

  a = LowLineTrace(newSample[sampleNum].x, newSample[sampleNum].y,
                   (sense[index].theta + newSample[sampleNum].theta),
                   sense[index].distance, l_particle[ newSample[sampleNum].parent ].ancestryNode->ID, 0);
  return MAX(MAX_TRACE_ERROR, a);
}

//
// QuickScore
//
// Basically the same as CheckScore, except that the value returned is just a heuristic approximation,
// based on a small distance near the perceived endpoint of the scan, which is intended to be used to
// quickly cull out bad samples, without having to evaluate the entire trace.
// The evaluation area is currently set at 3.5 grid squares before the endpoint to 3 grid squares past
// the endpoint. There is no special reason for these specific values, if you want to change them.
//
inline double QuickScore(TSense sense, int index, int sampleNum)
{
  double distance, eval;

  if (sense[index].distance >= MAX_SENSE_RANGE)
    return 1;
  distance = MAX(0, sense[index].distance - 3.5);
  eval = LowLineTrace((int)(newSample[sampleNum].x + (cos(sense[index].theta + newSample[sampleNum].theta) * distance)),
                      (int)(newSample[sampleNum].y + (sin(sense[index].theta + newSample[sampleNum].theta) * distance)),
                      (sense[index].theta + newSample[sampleNum].theta),
                      3.5, l_particle[ newSample[sampleNum].parent ].ancestryNode->ID, 3);
  return MAX(MAX_TRACE_ERROR, eval);
}

//
// Localize
//
// This is where the bulk of evaluating and resampling the particles takes place.
// Also applies the motion model
//
void Localize(TSense sense)
{
  double ftemp;
  double threshold;  // threshold for discarding particles (in log prob.)
  double total, normalize;
  double turn, distance, moveAngle;  // The incremental motion reported by the odometer
  double CCenter, DCenter, TCenter, CCoeff, DCoeff, TCoeff;
  double tempC, tempD;  // Temporary variables for the motion model.
  int i, j, k, p, best;  // Incremental counters.
  int keepers = 0;  // How many particles finish all rounds
  int newchildren[SAMPLE_NUMBER];  // Used for resampling

  // Take the odometry readings from both this time step and the last, in order to figure out
  // the base level of incremental motion. Convert our measurements from meters and degrees
  // into terms of map squares and radians
  distance = sqrt( ((odometry.x - lastX) * (odometry.x - lastX))
                 + ((odometry.y - lastY) * (odometry.y - lastY)) ) * MAP_SCALE;
  turn = (odometry.theta - lastTheta);
  // Keep motion bounded between pi and -pi
  if (turn > M_PI/3)
    turn = turn - 2*M_PI;
  else if (turn < -M_PI/3)
    turn = turn + 2*M_PI;

  // Our motion model consists of motion along three variables; D is the major axis of motion,
  // which is lateral motion along the robot's average facing angle for this time step, C is the
  // minor axis of lateral motion, which is perpendicular to D, and T is the amount of turn in
  // the robot's facing angle.
  // Since the motion model is probabilistic, the *Center terms compute the expected center of the
  // distributions of C, D and T. Note that these numbers are each a function of the reported
  // odometric distance and turn which have been observed. The constant meanC_D is the amount of
  // effect that the distance reported from the odometry has on our motion model's expected motion
  // along the minor axis. All of these constants are defined at the top of this file.
  CCenter = distance*meanC_D + (turn*meanC_T*MAP_SCALE);
  DCenter = distance*meanD_D + (turn*meanD_T*MAP_SCALE);
  TCenter = (distance*meanT_D/MAP_SCALE) + turn*meanT_T;

  // *Coeff computes the standard deviation for each parameter when generating gaussian noise.
  // These numbers are limited to have at least some minimal level of noise, regardless of the
  // reported motion. This is especially important for dealing with a robot skidding or sliding,
  // or just general unexpected motion which may not be reported at all by the odometry (it
  // happens more often than we would like).
  CCoeff = MAX((fabs(distance*varC_D) + fabs(turn*varC_T*MAP_SCALE)), 0.8);
  DCoeff = MAX((fabs(distance*varD_D) + fabs(turn*varD_T*MAP_SCALE)), 0.8);
  TCoeff = MAX((fabs(distance*varT_D/MAP_SCALE) + fabs(turn*varT_T)), 0.10);

  // To start this function, we have already determined which particles have been resampled, and
  // how many times. What we still need to do is move them from their parent's position, according
  // to the motion model, so that we have the appropriate scatter.
  i = 0;
  // Iterate through each of the old particles, to see how many times it got resampled.
  for (j = 0; j < PARTICLE_NUMBER; j++) {
    // Now create a new sample for each time this particle got resampled (possibly 0)
    for (k = 0; k < children[j]; k++) {
      // We make a sample entry. The first, most important value is which of the old particles
      // is this new sample's parent. This defines which map is being inherited, which will be
      // used during localization to evaluate the "fitness" of that sample.
      newSample[i].parent = j;

      // Randomly calculate the 'probable' trajectory, based on the movement model. The starting
      // point is of course the position of the parent.
      tempC = CCenter + GAUSSIAN(CCoeff);  // The amount of motion along the minor axis of motion
      tempD = DCenter + GAUSSIAN(DCoeff);  // The amount of motion along the major axis of motion

      // Record this actual motion. If we are using hierarchical SLAM, it will be used to keep track
      // of the "corrected" motion of the robot, to define this step of the path.
      newSample[i].C = tempC / MAP_SCALE;
      newSample[i].D = tempD / MAP_SCALE;
      newSample[i].T = TCenter + GAUSSIAN(TCoeff);
      newSample[i].theta = l_particle[j].theta + newSample[i].T;

      // Assuming that the robot turned continuously throughout the time step, the major direction
      // of movement (D) should be the average of the starting angle and the final angle
      moveAngle = (newSample[i].theta + l_particle[j].theta) / 2.0;

      // The first term is to correct for the LRF not being mounted on the pivot point of the robot's turns.
      // The second term is to allow for movement along the major axis of movement (D).
      // The last term is movement perpendicular to the major axis (C). We add pi/2 to give a consistent
      // "positive" direction for this term. MeanC significantly shifted from 0 would mean that the robot
      // has a distinct drift to one side.
      newSample[i].x = l_particle[j].x + (TURN_RADIUS * (cos(newSample[i].theta) - cos(l_particle[j].theta))) +
                       (tempD * cos(moveAngle)) + (tempC * cos(moveAngle + M_PI/2));
      newSample[i].y = l_particle[j].y + (TURN_RADIUS * (sin(newSample[i].theta) - sin(l_particle[j].theta))) +
                       (tempD * sin(moveAngle)) + (tempC * sin(moveAngle + M_PI/2));
      newSample[i].probability = 0.0;
      i++;
    }
  }

  // Go through these particles in a number of passes, in order to find the best particles. This is
  // where we cull out obviously bad particles, by performing evaluation in a number of distinct
  // steps. At the end of each pass, we identify the probability of the most likely sample. Any sample
  // which is not within the defined threshold of that probability can be removed, and no longer
  // evaluated, since the probability of that sample ever becoming "good" enough to be resampled is
  // negligible.
  // Note: this first evaluation is based solely off of QuickScore - that is, the evaluation is only
  // performed for a short section of the laser trace, centered on the observed endpoint. This can
  // provide a good, quick heuristic for culling off bad samples, but should not be used for final
  // weights. Something which looks good in this scan can very easily turn out to be low probability
  // when the entire laser trace is considered.
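The pass-based culling that the comments above describe (score each live sample, find the best, discard anything too far below it) can be isolated into a small sketch. This is a simplified stand-in, assuming precomputed per-sample log probabilities instead of real QuickScore traces; `cull_pass` and its parameters are hypothetical names, not functions from low.c:

```c
#include <math.h>

/* One culling pass: find the best live sample's log probability, then
 * deactivate every live sample more than |threshold| below it.
 * Returns the number of samples still alive after the pass. */
static int cull_pass(const double *logProb, int *alive, int n, double threshold) {
    double best = -INFINITY;
    int kept = 0;

    for (int i = 0; i < n; i++)
        if (alive[i] && logProb[i] > best)
            best = logProb[i];

    for (int i = 0; i < n; i++) {
        if (alive[i] && logProb[i] < best + threshold)
            alive[i] = 0;   /* too improbable to ever be resampled */
        kept += alive[i];
    }
    return kept;
}
```

Repeating such passes while evaluating progressively more of each laser trace lets obviously bad samples be dropped before paying for a full evaluation, which is the point of running QuickScore before CheckScore.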
