16-199: Building the Future
Spring 2004

Mobot Competition Entry

Project Description
Jack Wu and I applied for two SURG grants pertaining to the biped project; the second was to consider the unique sensing and navigation issues that arise when dealing with a biped robot. Due to some of the unforeseen difficulties we encountered with the primary biped project, we decided to use this grant as an opportunity to experiment with the navigation issues present in the annual Mobot competition.

Our entry in the Mobot competition used the chassis from a Ford R/C toy truck, with the drive motors controlled directly by a motor controller that communicated with a BASIC Stamp. Steering was handled by a second BASIC Stamp that received streaming data from a CMUcam2 and then issued commands to a servo, attached to the CMUcam2 itself, to direct the front wheels.
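The steering loop can be sketched roughly as follows. The "T" tracking packet format (centroid, bounding box, pixel count, confidence) comes from the CMUcam2 manual; everything else here, including the gain, the servo range, and the function names, is illustrative rather than the code we actually ran on the BASIC Stamp.

```python
# Illustrative sketch: parse a CMUcam2 "T" tracking packet and turn the
# tracked line's horizontal centroid into a proportional servo command.
# Packet format: "T mx my x1 y1 x2 y2 pixels confidence" (CMUcam2 manual).

FRAME_WIDTH = 88          # CMUcam2 default horizontal resolution
CENTER_X = FRAME_WIDTH // 2
SERVO_CENTER = 128        # assumed neutral servo position (0-255 range)
GAIN = 2                  # proportional gain, tuned by hand

def parse_t_packet(line):
    """Return (mx, my, x1, y1, x2, y2, pixels, confidence), or None."""
    parts = line.split()
    if len(parts) != 9 or parts[0] != "T":
        return None
    return tuple(int(p) for p in parts[1:])

def steering_command(packet):
    """Map the centroid's horizontal error to a clamped servo position."""
    mx = packet[0]
    error = mx - CENTER_X
    return max(0, min(255, SERVO_CENTER + GAIN * error))
```

A tighter loop would also use the packet's confidence field to ignore frames where the line was lost.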

Project State
Initial tests were very positive, demonstrating that the CMUcam2 could very effectively locate the white line and track it under varying lighting conditions.

Some difficulty was encountered at the hills, where the robot must be made to slow down lest it fly off the course. At first, we attempted to use an inertial measurement package that provided tilt and acceleration, but we were unable to apply it to sensing the hills because its output was rather noisy and easily corrupted by the motion of the vehicle itself. In the end, a timer triggered by a gate sensor initiated a brief motor reversal that slowed the vehicle. Though not the preferred solution, we found it to work quite well in practice.
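The gate-triggered braking scheme amounts to very little logic; a sketch, with illustrative gate numbers and reversal duration (not the values we actually tuned), might look like:

```python
# Illustrative sketch of gate-triggered braking: when the gate counter
# reaches a gate known (from the course map) to precede a hill, reverse
# the motor briefly to bleed off speed, then resume driving forward.

import time

HILL_GATES = {3, 7}        # assumed gates just before hills
BRAKE_SECONDS = 0.5        # hand-tuned reversal duration (illustrative)

def on_gate(gate_count, motor):
    """Called each time the gate sensor fires; motor is any object with
    forward() and reverse() methods."""
    if gate_count in HILL_GATES:
        motor.reverse()
        time.sleep(BRAKE_SECONDS)
        motor.forward()
```

The appeal of this approach is that it depends only on a reliable gate count, not on noisy inertial data.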

The second major difficulty was properly tracking the splits and joins of the white line in the decision-point portion of the course. We attempted a method that relied on the relative size of the bounding box supplied by the CMUcam2's color-tracking function, but this proved noisy and far from perfect. An optimal solution would use odometry to determine the robot's location on the course and choose the appropriate branch from a pre-determined map. Short on time, we instead settled on a combination of timing, gate counting, and CMUcam2 input for this portion of the course. Even at its best, our solution could not reliably solve the decision points.
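The bounding-box heuristic can be sketched as follows. The width threshold and the edge-following decision are illustrative assumptions; as noted above, in practice this signal was noisy.

```python
# Illustrative sketch of the bounding-box heuristic for decision points:
# when the line forks, the tracked blob's bounding box grows much wider
# than a single line segment, so box width can flag an upcoming split.

SPLIT_WIDTH = 30           # assumed width threshold, in pixels

def classify(x1, x2):
    """Return 'split' if the tracked box is wide enough to be a fork."""
    return "split" if (x2 - x1) > SPLIT_WIDTH else "single"

def choose_edge(x1, x2, take_left=True):
    """At a fork, steer toward one edge of the box instead of its
    centroid, so the robot commits to a single branch."""
    return x1 if take_left else x2
```

The weakness is evident: any wide blob of glare or a worn patch of paint looks like a fork, which is why an odometry-and-map approach would be preferable.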

Despite these difficulties, our robot was still very competitive: it was likely the fastest in the undergraduate class, and we won the $99 mini-challenge for the linear portion of the course.

Unfortunately, race day brought us quite a lot of bad luck. Less than 15 minutes before the race began, our robot flipped and struck the concrete steps adjacent to a hill, snapping the voltage regulator off the CMUcam2. We managed to repair it in time, but the pre-programmed timings used for slowing down on hills then appeared to be off. On our first attempt, the robot failed to stay on the course while descending the first hill. On our second attempt, the robot failed to steer at all, which we traced to an improper peripheral initialization sequence. After the race, however, our robot easily navigated the first half of the course. Had our second attempt succeeded, we would surely have placed at least third.

Future Intentions
We intend, very simply, to win the Mobot competition next year. To make this a realistic goal, we will continue work on the mobot over the summer. Odometry will be a definite addition, along with a better motor controller with additional features, possibly including a full PID loop. Additionally, I intend to use a far more powerful processor (the MPC555 discussed in another project) that will ease programming significantly and let us consider more advanced vision algorithms.
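The PID loop mentioned above is standard; a minimal sketch of what the new motor controller might run, assuming encoder-based odometry supplies a measured wheel speed (gains and the update interface are illustrative):

```python
# Minimal sketch of a PID speed controller: the motor command is a
# weighted sum of the speed error, its integral, and its derivative.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        """Return a motor command from the current speed error."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

Closing the loop on measured speed, rather than commanding open-loop throttle, would also make the hill behavior far less dependent on hand-tuned timings.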

Relevant Links

Pictures and movies will be posted once uploaded by Jack.