Introduction
For our final project we decided to build a robot that could navigate from its starting location to a given target while avoiding obstacles in its way. The robot body used a very primitive design: a cardboard box for the chassis and Minute Maid bottle caps for wheels. The two-wheeled design allowed us to program the bot to turn in place. Stepper motors propelled the wheels, and navigation and obstacle avoidance were implemented using IR sensors.
High Level Design
The inspiration for the robot and its functionality, as mentioned briefly in the summary, came from our second-to-last project in ECE 314. In that project we had to design a state machine that made a robot drive from its home to a target, steal a flag, and return, all while avoiding collisions with tricky obstacles and beating a competitor's robot that was trying to do the same.
Obstacle avoidance seemed fairly straightforward, and there were a number of sensors on the market that would do the job. We chose the Sharp IR distance-ranging sensor due to its low cost. Distance accuracy was not an issue for us since we only needed to know whether or not there was an obstacle in our way. We therefore decided against the Devantech sensor, which was fairly expensive and would have eaten into our budget completely.
Target finding was more difficult to implement from both the hardware and the software perspective. We experimented with various IR LED transmitters and phototransistors but were unable to acquire a decent signal beyond 2 feet. We also experimented with the idea of using dead reckoning.
Given that we were using stepper motors for movement, it was theoretically possible to measure the distance moved by each wheel. Knowing that our bot turned in place, we could also track the angle it rotated through and use that in subsequent distance calculations with respect to the target location. This failed, however, because the stepper motors skipped steps considerably, as they were not powerful enough to deliver a steady torque to the wheels.
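For reference, a minimal sketch of the dead-reckoning idea we abandoned; the wheel radius and track width below are illustrative assumptions, not measurements from our bot.

/* Dead-reckoning sketch: estimate the bot's pose from step counts.
 * WHEEL_RAD_CM and TRACK_CM are illustrative values, not our bot's. */
#include <math.h>

#define PI           3.14159265
#define STEP_DEG     7.5          /* 7.5-degree stepper motors               */
#define WHEEL_RAD_CM 3.0          /* assumed wheel radius in cm              */
#define TRACK_CM     12.0         /* assumed distance between the two wheels */

static float x_cm, y_cm, heading; /* estimated position and heading (rad)    */

/* Update the pose from the number of steps each wheel has taken since the
 * last call; positive steps mean the wheel rolled forward. */
void dead_reckon(int stepsL, int stepsR)
{
    float arc  = STEP_DEG * PI / 180.0 * WHEEL_RAD_CM;  /* cm per step      */
    float dl   = stepsL * arc;
    float dr   = stepsR * arc;
    float dist = 0.5 * (dl + dr);          /* distance moved by the center  */

    heading += (dr - dl) / TRACK_CM;       /* turning in place: dl = -dr    */
    x_cm    += dist * cos(heading);
    y_cm    += dist * sin(heading);
}

Because the motors skipped steps under load, the accumulated error made this approach unusable in practice.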
Two weeks into the project we stumbled across the idea of using a standard TV remote control for target finding. From experience we knew that an IR remote control has a fairly good range of about 10 feet. What we needed were sensors that could pick up this signal. We were able to sample four 38 kHz Vishay IR sensors from Newark One Electronics. The combination of the IR remote and these sensors solved the problem of target finding.
The bot body was constructed out of a simple cardboard box. Minute Maid caps were used as rear wheels and a ping-pong ball was used as a third supporting wheel.
Hardware Design
Motion:
Bot movement was achieved using two stepper motors provided by Prof. Land. We made extensive use of the Vertical Plotter project (SP 01) to understand the operation of the stepper motors.
The operation of these 7.5-degree stepper motors is documented below. Each motor had two brown leads that needed to be connected to Vcc. The remaining wires had to be excited in the sequence brown, red, orange, white to step the motor. The following sequence was used to test motor operation in the beginning.
Brown | Red | Orange | White
  +   |     |        |
      |  +  |        |
      |     |   +    |
      |     |        |   +
Thereafter, to provide more torque, two adjacent wires were excited at a time in the following overlapping sequence, stepping the motor once per cycle. When operated at 12 V the motors provided sufficient torque to move the bot.
Brown | Red | Orange | White
  +   |  +  |        |
      |  +  |   +    |
      |     |   +    |   +
  +   |     |        |   +
Initially the motors were driven using TIP31C transistors. Driving both motors this way would have required eight transistors, and assembling that circuit would have been cumbersome; it would also have occupied a considerable amount of space on the bot, whose size was limited. We therefore used the ULN2003 chip to drive each motor and isolate it from the microcontroller. The microcontroller outputs were connected to the chip's inputs on pins 1 through 7, and the outputs on pins 10 through 16 were connected to the stepper motor windings. Two of these chips were used, one for each motor.
ULN2003 Pin Config
The full stepping code is given in the Program Design section below; a short sketch of how the step patterns reach the ULN2003 inputs follows.
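A minimal sketch, assuming the wiring used in the Code Listing (right-motor phases on the low nibble of PORTA, left-motor phases on the high nibble, each pin feeding one ULN2003 input):

/* Write the two 4-bit step patterns out to the ULN2003 inputs.
 * Each set bit turns on one Darlington channel, which sinks current
 * through the corresponding motor winding. */
#include <Mega32.h>

unsigned char stepR = 1+2;   /* current right-motor phase pattern */
unsigned char stepL = 1+2;   /* current left-motor phase pattern  */

void drive_motors(void)
{
    PORTA = (stepL << 4) | stepR;
}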
A speed differential between the wheels was used to turn the bot. A ping-pong ball placed at the front acted as a third supporting wheel. The two-wheel configuration allowed us to turn the bot in place.
Obstacle Detection:
The Sharp GP2D02 sensor was used for obstacle detection. Using the sensor was fairly straightforward: on the microcontroller we implemented a state machine that output a sequence of clock signals to the sensor, as illustrated below, after which the sensor's output was read in serially to obtain the distance information. If an obstacle was found within about 7 cm, a flag was set so that the bot would avoid it.
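The 8-bit reading is converted to centimetres with the reciprocal fit used in the Code Listing (kgC = 1770 and koC = 29 are the constants from that code); a minimal sketch:

/* Convert the GP2D02's 8-bit serial reading to an approximate distance in cm,
 * using the reciprocal fit and constants from the Code Listing. */
#define kgC 1770
#define koC 29

float gp2d02_to_cm(unsigned char reading)
{
    if (reading <= koC)          /* guard against division by zero  */
        return 255.0;            /* treat as "no obstacle in range" */
    return (float)kgC / (reading - koC);
}

/* Example: set the obstacle flag when something is closer than a threshold. */
unsigned char near_obstacle(unsigned char reading, float threshold_cm)
{
    return gp2d02_to_cm(reading) < threshold_cm;
}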
More information on the Sharp GP2D02 sensor may be found in its datasheet.
Direction Tracking:
The robot needed a mechanism for tracking some kind of input sent from far away and using that information to determine its direction of motion. We decided to build an infrared sensor capable of picking up an IR beam from 4-7 feet away. We were inspired by the IR sensing used in a Spring 2002 project entitled "Line Following Autonomous Car." We got a RadioShack emitter-receiver pair from Professor Land and built a source and receiver circuit.
The circuit schematic is shown below:
Schematic of initial IR source-receiver circuit
The sensor would pick up the IR beam from 7 feet away, but the received voltage was very small, ranging from 50 mV at 5 feet to 1 V at 2 inches. The first problem was that the sensor picked up noise from the 60 Hz room lights and from sunlight. This was substantially reduced with a simple RC high-pass filter with a cutoff at 1 kHz. We used a 555 timer to drive the emitter at frequencies above 220 Hz, such as 230 Hz, 345 Hz and 0.5 kHz, using different RC pairs, which solved the noise problem. Next, we realised that the sensor voltage would have to be fed into an MCU port, which meant we needed a voltage on the order of 2.5-3 V to get any meaningful output. We imagined that a simple gain stage would solve the problem, so we tried an inverting amplifier, then a non-inverting amplifier, and finally a comparator, all based on the LMC7111 op-amp. The gain drove the voltage to the power rail at short distances but it actually destroyed the receptivity: the sensor would no longer pick up a signal from farther than a foot and a half. One alternative was the analog comparator in the Mega32 MCU, but since we needed four such blocks, one for each sensor, the single on-chip comparator was not sufficient.
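A back-of-the-envelope check of the two analog formulas involved; the component values below are illustrative assumptions, not the ones on our board:

/* Sanity-check the RC high-pass cutoff and the 555 astable frequency.
 * All component values here are illustrative assumptions. */
#include <stdio.h>

#define PI 3.14159265

int main(void)
{
    /* High-pass cutoff: fc = 1 / (2*pi*R*C).
     * R = 16 kOhm and C = 10 nF give roughly 1 kHz. */
    double R = 16e3, C = 10e-9;
    printf("high-pass cutoff: %.0f Hz\n", 1.0 / (2.0 * PI * R * C));

    /* 555 astable output: f = 1.44 / ((R1 + 2*R2) * C1).
     * R1 = 1 kOhm, R2 = 10 kOhm, C1 = 0.15 uF give roughly 0.5 kHz. */
    double R1 = 1e3, R2 = 10e3, C1 = 0.15e-6;
    printf("555 emitter frequency: %.0f Hz\n", 1.44 / ((R1 + 2.0 * R2) * C1));
    return 0;
}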
We were also having problems with the 555 timer on the source side. This led us to use a standard DVD remote control, which sends IR bursts at 38 kHz. Shaun found cheap IR receiver modules online and was able to sample four Vishay TSOP1138 sensors. Looking at the pin-out and block diagram of the TSOP1138 given below, the reader can appreciate the level of complexity needed in a good sensor.
Schematic of Vishay TSOP1138 sensor used to receive a DVD remote-control IR beam
To control the receptivity of the sensor, we decided to try PVC electrical tape, and it worked perfectly. The trick is to drape the sensor in layers of airtight tape, cut a small hole in the middle, and then tunnel the hole using a small tape-covered tube. This restricts the sensor's vision to one direction only; the hole size and the length of the tubing determine the field of view. Shaun mounted the four sensors on a cardboard arch and our robot was ready to go!
Program design
Motor Control:
The C code below was used to operate the stepper motors. A simple state machine generates the stepping sequence.
// FORWARD STEP
char forward(char x){
    switch(x){
        case (1+2): x = 2+4; break;
        case (2+4): x = 4+8; break;
        case (4+8): x = 8+1; break;
        case (8+1): x = 1+2; break;
        default:    x = 1+2; break;
    }
    return x;
}

// BACKWARD STEP
char backward(char x){
    switch(x){
        case (1+2): x = 8+1; break;
        case (2+4): x = 1+2; break;
        case (4+8): x = 2+4; break;
        case (8+1): x = 4+8; break;
        default:    x = 1+2; break;
    }
    return x;
}
To run a motor in the opposite direction it was stepped through the reverse sequence. Using this code we could move the bot in each of the four directions. To turn right, the left motor was made to move forward while the right motor moved backwards, allowing the bot to turn in place.
Direction Sensing:
Navigating the bot relied on two AIs: one for obstacle avoidance and the other for direction finding. The Sharp sensor was used for obstacle detection, while four Vishay 38 kHz IR sensors were used for direction sensing. A standard TV remote control was used to home the bot in on its target location. When a sensor detected the remote's signal, its output would go low. We programmed the AI so that each of the sensors was polled once every 100 microseconds.
Programming direction finding proved harder than expected. We had issues with more than one sensor picking up the signal, so we had to come up with an AI scheme that would prioritize the sensor inputs and use only the relevant information. The bot was programmed to reset about once every 3 seconds and rotate to the left in place, scanning the vicinity for the target. This let us correct the bot's heading every once in a while and also helped deal with the issue of two or more sensors picking up the IR signal.
The directional AI was programmed in the following manner (a condensed code sketch follows the list):
- While scanning, if the forward sensor output went low, the bot was facing the IR transmitter, so the direction flag was set to move forward.
- Once the bot started moving towards the target, any other sensor inputs were ignored. This prevented the bot's direction from changing constantly due to noise picked up by the left and right sensors while moving forward.
- Every 3 seconds the bot AI was reset so that it began scanning for the IR transmitter once again.
- If either the left or the right sensor was triggered, the bot set a direction flag so that it kept scanning towards that side.
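A condensed sketch of these priority rules, using the active-low pin assignment and direction codes from the Code Listing (forward on PINB.0, right on PINB.1, backward on PINB.2, left on PINB.3); the full version in the listing interleaves this logic with the obstacle states and differs slightly in detail:

/* Condensed directional AI, called on every 0.1 ms tick while no obstacle
 * is in view. */
#include <Mega32.h>

#define FORWARD 1
#define RIGHT   2
#define LEFT    3

extern unsigned char dir;   /* current direction of movement (Code Listing global) */
extern int count3;          /* 0.1 ms tick counter used for the 3 s rescan         */

void pick_direction(void)
{
    if (dir != FORWARD) {              /* once headed for the target, ignore the sides */
        if (!PINB.0) dir = FORWARD;    /* forward sensor low: facing the transmitter   */
        else if (!PINB.1) dir = RIGHT; /* right sensor low: keep turning right         */
        else if (!PINB.3) dir = LEFT;  /* left sensor low: keep turning left           */
    }
    if (!PINB.2) dir = LEFT;           /* facing away from the target: scan left       */

    if (count3++ > 30000) {            /* roughly every 3 s, force a fresh scan        */
        dir = LEFT;
        count3 = 0;
    }
}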
Obstacle Avoidance:
A state machine was implemented for obstacle avoidance. When no obstacle was detected, the bot continued to move in the direction determined by the Vishay sensors as described above. When an obstacle was detected, the bot immediately turned in place to avoid it and kept turning for as long as the obstacle remained in view.
While implementing obstacle avoidance there were cases where the bot would turn, avoid the obstacle, and then clip it with one of its edges. This happened because the distance-ranging sensor was mounted at the center of the bot and could not detect objects that were not directly in front of it. To resolve this, we programmed the AI so that the bot would conservatively continue turning away from the obstacle for a short time even after the no-obstacle flag was set. This ensured that the bot would not collide with anything it had just avoided.
Once an obstacle was avoided, the bot would continue moving forward in that direction for a while, ignoring all sensor inputs, in order to steer past longer obstacles. After that, the bot AI was reset so that it would again begin to scan for the IR transmitter.
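For reference, the loop-count thresholds in the Code Listing translate to roughly the following times, given the 0.1 ms software tick (these figures are derived from the code, not separately measured):

/* Timeouts implied by the 0.1 ms tick used in the main loop. */
#define EXTRA_TURN_COUNT    7500   /* count2: keep turning ~0.75 s after the obstacle clears */
#define CLEAR_FORWARD_COUNT 20000  /* count4: drive forward ~2 s past the obstacle           */
#define RESCAN_COUNT        30000  /* count3: rescan for the transmitter about every 3 s     */
#define STEP_PERIOD_COUNT   250    /* count1: step the motors about every 25 ms              */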
Bot avoiding obstacles
Results
Our bot was satisfactorily able to navigate and avoid obstacles.
Obstacle detection worked fairly accurately using the Sharp sensor. Issues with the edge of the bot colliding with an object it had just avoided were fixed in the software AI. These issues arose because the Sharp sensor was mounted at the center of the bot and had a limited field of view.
The stepper motors worked satisfactorily. Using a 12 V external power supply we were able to provide them with enough power to steer the bot.
Using black tape to gag the sensors, we were able to cut down their receptivity and create tunnel vision. With four such sensors, one for each direction, we had a mechanism for tracking an IR beam, which let the robot choose its direction of motion. The accuracy of the sensors is very good at ranges of two to five feet. At closer distances, since two sensors are at 45 degrees to the target, it becomes harder to choose a single direction; this is resolved by the AI in the software.
We went through ten days of pain trying out different variations, but to no avail. A big stumbling block was our lack of experience in analog design, leading to over-dependence on Professor Land for troubleshooting and new ideas. After this period we knew what we wanted but also realised the enormity of the task: designing an IR receiver module is essentially a project in itself. Here's some sound advice: if you know what you want, look through catalogues and on the internet for cheap parts that will do the job, then contact the vendor and request samples.
We were overjoyed when we first tested the TSOP1138 because it was powerful enough to pick up beams from more than 20 feet away. Ironically, and perhaps comically, we now faced the opposite problem. Earlier we had laboured to get adequate gain; now we had too much receptivity, and it was proving extremely difficult to restrict the sensor's vision to a particular direction. Like any consumer IR receiver it is designed to get the input at all costs, which means that even without line of sight between the remote control and the sensor it would still receive beams reflected by surrounding objects, giving it effectively a 360-degree field of view. We tried all sorts of tricks, wrapping it with layers of cardboard and cupping it with our hands, but to no avail. Layers of aluminium foil worked but would often short the Vcc connection to the other pins. It is easy to see how Electrical Engineering is EE (electrifyingly entertaining).
Learning how to beg, scavenge, and nag is a big part of this course. You have a strict limit on your budget, and you have to be very resourceful.
Code Listing
#include <Mega32.h>
#include <stdio.h> //for debugging using printf, etc
#include <stdlib.h>
#include <math.h>
//timeout values for each task
#define prescale0 3
#define t1 1
#define kgC 1770
#define koC 29
#define LCDwidth 16 //characters
// State variables used by sensor for obstacle detection
#define RESET 0
#define INIT 1
#define MEAS 2
#define WAIT 3
// State variables used by navigation AI
#define NOMOVE 0
#define FORWARD 1
#define RIGHT 2
#define LEFT 3
#define BACKWARD 4
// State variables used for obstacle detection
#define NOOBSTACLE 0
#define OBSTACLE 1
#define TURN 2
#define GOFORWARD 3
//the subroutines
void sensor(void); // detect distance to obstacle
void initialize(void); // all the usual mcu stuff
unsigned char reload, //timer 0 reload to set 1 mSec
lcd_buffer[17]; // LCD display buffer
int count, count1, count2, count3, count4, Cdis;
char time1, state, index, Ct[8], C[12];
float Ccm;
unsigned int temp;
unsigned char stepR, // Right motor step
stepL, // Left motor step
obstacle; // Obstacle detection flag
unsigned char dir, // Direction of movement
target, // Direction of target
obsDir; // Obstacle detection
// Declare Timer 0 interrupt
interrupt [TIM0_OVF] void timer0_overflow(void)
{
TCNT0=reload;
if(time1 > 0) --time1;
}
// Stepping sequence (two windings excited at a time)
// B R O W
// + +
//   + +
//     + +
// +     +
// Step the motor in the FORWARD direction
char forward(char x){
switch(x){
case (1+2):
x = 2+4;
break;
case (2+4):
x = 4+8;
break;
case (4+8):
x = 8+1;
break;
case (8+1):
x = 1+2;
break;
default:
x = 1+2;
break;
}
return x;
}
// Step the motor in the backward direction
char backward(char x){
switch(x){
case (1+2):
x = 8+1;
break;
case (2+4):
x = 1+2;
break;
case (4+8):
x = 2+4;
break;
case (8+1):
x = 4+8;
break;
default:
x = 1+2;
break;
}
return x;
}
// Determine direction of movement
void movement(void){
// Turn RIGHT
if( dir == RIGHT ){
stepR = forward(stepR);
stepL = forward(stepL);
}
// Turn LEFT
else if( dir == LEFT ){
stepR = backward(stepR);
stepL = backward(stepL);
}
// Move FORWARD
else if( dir == FORWARD ){
stepR = forward(stepR);
stepL = backward(stepL);
}
// Move BACKWARD
else if( dir == BACKWARD ){
stepR = backward(stepR);
stepL = forward(stepL);
}
// Output stepper sequence to motor
PORTA = (stepL<<4) | stepR;
}
void initialize(void)
{
// Vin
DDRC.0=1;
PORTC.0=1;
// Vout
DDRC.1=0;
PORTC.1=0;
// PORTS A AND D OUTPUT
DDRA = 0xff;
DDRD = 0xff;
// PORT B INPUT
DDRB = 0x00;
// Used for RS232 debugging
UCSRB = 0x10 + 0x08;
UBRRL = 103;
UBRRH = 0;
// Initially scan the vicinity for the target
dir = RIGHT;
// Initially reset to no obstacle
obsDir = NOOBSTACLE;
//set up timer 0
reload=256-26; //value for 0.1 msec
TCNT0=reload; //preload timer 0 so that it interrupts every 0.1 msec
TCCR0=prescale0; //prescaler set to 64
TIMSK=1; //turn on timer 0 overflow ISR
// Initiate counters
time1 = t1;
count = 0;
count2 =0;
count3 = 0;
count4 = 0;
count1 = 0;
// Initiate obstacle distance to 0
Cdis = 0;
// Reset the distance counter
for(index=0;index<8;index++){
Ct[index]=0;
}
// Initialize obstacle sensor
index = 0;
state = RESET;
#asm
sei
#endasm
}
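// Standalone version of the obstacle-avoidance state machine.  The main loop
// below inlines the same NOOBSTACLE/OBSTACLE/TURN logic and never calls this
// helper; it is kept here for reference.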
void getObsDir(void){
switch(obsDir){
case NOOBSTACLE:
dir = FORWARD;
if(obstacle)
obsDir = OBSTACLE;
break;
case OBSTACLE:
dir = RIGHT;
if(!obstacle)
obsDir = TURN;
break;
case TURN:
if( count2++ > 7500 ){
count2 = 0;
obsDir = NOOBSTACLE;
}
if(obstacle)
obsDir = OBSTACLE;
}
}
void main(void)
{
// Initialize
initialize();
while(1) {
// If there is an obstacle in the vicinity set a flag
if(Ccm > 34) obstacle = 0;
else obstacle = 1;
// Interrupt
if(time1 == 0){
// AI to find direction
switch(obsDir){
// In the event that there is no obstacle
// the bot scans the vicinity for the target every 3 sec
// Once it is facing the target it starts moving forward
// in that direction
// Every 3 seconds the bot begins to scan again
case NOOBSTACLE:
// In case an obstacle is detected
// deal with the obstacle
if(obstacle){
obsDir = OBSTACLE;
break;
}
// If bot is scanning and the forward sensor comes on
// then move in the forward direction
if(~PINB.0 == 1 && ( dir != FORWARD) ){ // FORWARD blue
dir = FORWARD;
}
// If the bot is scanning and the right sensor comes on
// continue scanning in the right direction
if( ~PINB.1 == 1 && ( dir != FORWARD)){ // RIGHT red
dir = RIGHT;
}
// If the bot is scanning and the left sensor comes on
// continue scanning in the left direction
if( ~PINB.3 == 1 && ( dir != FORWARD)){ // LEFT black
dir = LEFT;
}
// If the bot is facing backwards
// start scanning in the left direction
if( ~PINB.2 == 1 ){ // BACKWARD green
dir = LEFT;
}
// Reset the bot every 3 secs so that it starts scanning again
if(count3++ > 30000){
dir = LEFT;
count3 = 0;
}
break;
case OBSTACLE:
// In case of an obstacle turn right to avoid the obstacle
dir = RIGHT;
if(!obstacle){
count2 = 0;
obsDir = TURN;
}
break;
case TURN:
// Continue avoiding the obstacle for a small time
// and then move in the forward direction
if( count2++ > 7500 ){
obsDir = GOFORWARD;
count4 = 0;
}
if(obstacle)
obsDir = OBSTACLE;
break;
case GOFORWARD:
// Move ahead in the forward direction once
// the obstacle has been avoided
dir = FORWARD;
if( count4++ > 20000){
if(!obstacle){
obsDir = NOOBSTACLE;
count3 = 0;
}else
obsDir = OBSTACLE;
}
if(obstacle)
obsDir = OBSTACLE;
break;
}
// Check for obstacle
sensor();
// Step motors in the intended direction of movement
if(count1++ > 250){
count1 = 0;
movement();
}
}
}
}
void sensor(void){
// Reset timer
time1 = t1;
switch(state){
case RESET:
PORTC.0 = 0;
count = 0;
state = INIT;
break;
case INIT:
// Set Vin low for 70 ms
PORTC.0 = 0;
if(count++ > 700){
count = 0;
index = 0;
state = MEAS;
}
break;
case MEAS:
// Toggle Vin high and low alternately 8 times
if(count++ <= 15){
// Record one bit of the sensor's serial output per clock-high phase
PORTC.0 = !PORTC.0;
if(PORTC.0 && (index < 8))
Ct[index++] = PINC.1;
}else{
// Use the array to compute the distance in cm
PORTC.0 = 1;
count = 0;
index = 0;
state = WAIT;
Cdis = (Ct[0]*128)+(Ct[1]*64)+(Ct[2]*32)+(Ct[3]*16)+(Ct[4]*8)+(Ct[5]*4)+(Ct[6]*2)+(Ct[7]*1);
Ccm = kgC/(Cdis-koC);
ftoa(Ccm,3,C);
printf("%s",C); // debug
}
break;
case WAIT:
// Set Vin low for 1.5 msec and then continue measuring distance
if(count++ > 15){
count = 0;
state = INIT;
}
break;
default:
state = RESET;
break;
}
}
Conclusions
We are quite happy and satisfied with the performance of our robot. If we had more time, we would have made the AI more robust, and taken the robot closer to its software compatriot in the ECE 314 project. Essentially, the fun lies in designing state variables to help the robot remember its past, to enable intelligent decisions about direction of motion. This would make the robot immune to L-shaped obstacles.
Ethical Considerations: Referring to the IEEE Code of Ethics, accessible at http://www.ieeeusa.org/documents/CAREER/CAREER_LIBRARY/ethics.html, we, Aditya Jhunjhunwala and Shaun D’Souza, hereby declare on the second day of May, 2003 that:
1. We accept responsibility in making engineering decisions consistent with the safety, health and welfare of the public. Our project objective does not contradict this goal.
2. Our work was consistent with the spirit of improving the understanding of technology. This project has been a great learning experience. It was immensely enjoyable with lots of intellectual satisfaction. We understood our limitations in Analog Design, and appreciated the fact that people have designed and commercialised very useful analog circuits.
3. We will strive to maintain and improve our technical competence and to undertake technological tasks for others only if qualified by training or experience, or after full disclosure of pertinent limitations. Our project decision described in the Summary section proves that we did not venture into a field in which we lacked expertise, namely Kalman filtering. We accepted our limitations in Analog Design, and used commercial sensors.
4. We will seek, accept, and offer honest criticism of technical work, acknowledge and correct errors, and credit properly the contributions of others. We sought the advice of Professor Land and our class-mates while encountering different problems, while offering our own opinion to fellow students in their undertakings.
5. We will treat fairly all persons regardless of such factors as race, religion, gender, disability, age, or national origin. Our project will definitely excite a person regardless of his caste, colour, or creed.