
Final Report

Zack Bell

ZMAPS Red Bot Blue Bot

EEL 4665/5666 Intelligent Machines Design Laboratory

Instructors: Dr. A. Antonio Arroyo, Dr. Eric M. Schwartz

TAs: Andy Gray, Nick Cox


TABLE OF CONTENTS

Abstract
Executive Summary
Introduction
Integrated System
Mobile Platform
Actuation
Sensors
Behaviors
Experimental Layout and Results
Conclusion
Documentation
Appendices


Abstract

Computer vision has been used for many years to extract meaningful information from images. What that information is depends on the application. This project uses computer vision to determine the two dimensional location of an object with known dimensions in a room. Using this position, it is possible to approximate two dimensional maps of rooms or other enclosed areas with inexpensive equipment. This goal is accomplished by a pair of robots, named ZMAPS Red Bot and ZMAPS Blue Bot, that work together to accomplish the task.

Executive Summary

This paper describes the work I have put into a pair of robots over the past four months. The robots map out enclosed spaces using a camera on a servo driven pan and tilt mechanism on one robot and an object of known dimensions and color on the other. The robots use velocity controllers to drive around and IR measurements for obstacle avoidance, with rotary encoders providing velocity feedback. The tracked object is a 9 inch by 9 inch pink cylinder, which is followed by the camera on the pan and tilt mechanism. Using the pinhole camera model, the position of the object is measured relative to the camera, and the angle of the measured distance is determined from the pan servo angle. The robot carrying the object drives around, and an occupancy map is made by recording where it drives. During this process the assumption is made that anywhere the robot drives is open space, which in turn allows the assumption that all remaining locations are occupied. This allows maps within a 15 foot radius to be formed within 2.5% of true measurement.

Introduction

I have enjoyed path planning and mapping for some time. While this appreciation did not begin with robots, I have recently found great interest in using robots to accomplish the goal of planning and mapping. I also enjoy control of dynamic systems, and one of the more complex and interesting problems is planning and mapping with autonomous vehicles. As time goes on, the auto industry is making cars safer by adding many autonomous features, such as lane detection, parking, and obstacle detection to apply or assist in applying the brakes. Many of these features use computer vision and radar to determine the positions of objects, and the car plans a way to stay in a lane, park, or avoid an obstacle.

Outside of the auto industry, the robotics community is growing rapidly. Nearly all of these robots need at least one thing: an ability to sense the world around them and make control decisions based on observations. I believe one of the most powerful and affordable ways to do this is through computer vision. My interest in computer vision sensing for control led me to come up with Zack's Mapping Squad (ZMAPS).

This project introduces the first members of the ZMAPS family, Red Bot and Blue Bot. The ZMAPS robots work together using a camera on a pan and tilt mechanism and an object of known dimensions. Using these tools, the ZMAPS robots build a map of their local environment, starting with a small enclosed space.


This paper explains the overall ZMAPS network, the ZMAPS robot platforms, how the robots move, what the robots use for sensing, how the robots build maps, and some experiments used to tune the system.

Integrated System

The ZMAPS network is composed of a router, an Asus laptop, an ODROID U3 computer (Red

Bot), an ODROID C1 computer (Blue Bot), and two Arduino Mega2560 microcontrollers (Mega).

The computers run ROS in Ubuntu Linux. These computers are used for communication, image

processing, object detection, object tracking, obstacle avoidance, and building two dimensional

maps. The microcontrollers are used for reading the IR range finders, running the dual motor

drivers to drive the motors, driving the camera servos, and reading the motor encoders.

Red Bot uses a PS3 Eye camera to observe Blue Bot, to measure the distance from Red Bot to Blue Bot, and to compute the angle error between the centroid of an object on Blue Bot and the center of the camera frame. It uses a PID controller to produce angle commands in a form its Mega can understand, and it sends this information to its Mega to update the servo positions, keeping Blue Bot in the center of the frame. Feedback from the servos gives the servos' current angles. Red Bot streams this information on and off the network using a Panda wireless USB adapter.

Blue Bot uses sensor information from its Mega to determine its position relative to objects. It responds by changing its velocity based on the distance to objects in an effort to avoid obstacles. It uses a PID controller to convert these velocity commands into integers its Mega can understand, and the Mega then transmits the commands to the motor drivers. Feedback on speed comes from the rotary encoders on the motors. Blue Bot also uses a Panda wireless USB adapter for streaming information on and off the network.

The laptop takes input from an Xbox controller to determine whether the system should be in autonomous mode. If the system is in manual mode, the Xbox controller is used to drive the robots around. The laptop also takes information from the robots to build occupancy maps based on the angle and distance measurements given by Red Bot, and it hosts the core for the network. Figure 1 shows a block diagram of all the devices in the network and how they communicate or transfer power.


Figure 1. Block diagram of the ZMAPS network. The arrows show the flow of information or

power. A double arrow shows information goes both ways.


Mobile Platform

Red Bot and Blue Bot have very similar mobile platforms. Most of each body is made from balsa wood: the bases are circular balsa discs, and each layer is separated by four inch tall pieces of red oak. Red oak was used because it bonds easily to the balsa with adhesives and it was readily available. Two parallel rectangular holes were cut offset from the edge of each disc for the drive wheels. This keeps the outer diameter of the platform equal and smooth around the entire circumference, making it less likely for the platform to become trapped on an edge. This especially helps Blue Bot when it gets close to an object.

Arranging the wheels like this also allows for differential driving and steering of the platform. The wheels are ABS hubs with silicone treads for grip. The motors attach to the wheels by M3 tapped aluminum couplers and M3 machine screws, and to the base of the platform by ninety degree aluminum brackets and M3 machine screws. The platform also has two plastic ball caster wheels that help with support and balance but allow for slip. The base also holds the three IR sensors for detecting the relative position of obstacles; these were mounted using more balsa wood bonded to the base with adhesives. The motor driver is attached to the base using M3 standoffs. The entire drive assembly was placed on the base layer to keep it together, apart from the LiPo battery, which was put one layer above to make it easy to remove.

The second layer holds the Mega, the power distribution board, and the motor LiPo battery. This layer is partially circular but has the front cut back slightly to allow easier access to the bottom of the platform. The center of the layer has a large hole cut out so the wires from the base layer can easily pass through to the Mega, LiPo battery, and power distribution board. The Mega, power distribution board, and LiPo battery are attached to the layer using Velcro and adhesives.

The third layer holds the ODROID and the ODROID LiPo battery. This layer is cut back very far to allow easy access to the second layer and the Mega. The ODROID and its battery are attached to the layer using Velcro and adhesives.

The final layers of the two robots are where they primarily differ. Red Bot has a camera on a pan and tilt mechanism driven by two servos to track Blue Bot, while Blue Bot has a pink cylinder of known dimensions on top to enable easy tracking. This layer is cut back to allow the antenna of the Panda wireless USB adapter on the ODROID to sit upright and have the best transmitting ability; its design is otherwise identical to the third layer on both robots. The cylinder is attached to Blue Bot using Velcro and the pan and tilt mechanism is attached to Red Bot using adhesives. A picture of the layout of Blue Bot is shown in Figure 2, and a picture of the layout of Red Bot is shown in Figure 3.

It became difficult to attach anything to the robots after they were constructed. This was not the best way to attach everything, but it works. I would rather have drilled holes for everything and had more solid connections, but the final design was not known at the time and there was no way to predict what it would look like. Another thing that has become obvious is that the lack of suspension causes the robots to rock back and forth on initial acceleration from a stopped position. This is irritating, but with the current model's wheel layout and the wheels themselves a simple solution is not available, and it does not have a large effect on the robots. Originally the platform was made not to rock, but after it got stuck on the lip of some carpet because there was no suspension, the rocking was allowed. In the future suspension will be considered.

Figure 2. Image of Blue Bot. Each layer and its respective components can be seen.


Figure 3. Image of Red Bot. Each layer and its respective components can be seen.


Actuation

Red Bot and Blue Bot both drive their wheels with two twelve volt Pololu brushed motors. These motors are geared to reduce rotational speed and increase torque, and they have built in quadrature rotary encoders to give feedback on the rotational rate of the motors. The motors are driven by Pololu dual bidirectional motor drivers, each with two full H-bridges. Each H-bridge accepts a two bit number to set forward or reverse direction and to allow braking to the positive reference or the ground reference. The motor drivers accept a pulse width modulated (PWM) signal to set the effort level of the output, and they also give feedback on the amount of current being drawn. The PWM signals can be as high as 20 kHz, which is convenient because it is out of the audible range.
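As a rough illustration, a signed effort level maps naturally onto these driver inputs; the sketch below is a minimal Python version of that mapping, and the (in1, in2) direction encoding is an assumption rather than the encoding in the actual .ino files.

    def motor_outputs(effort):
        """Map a signed effort in [-1, 1] to H-bridge direction bits and PWM duty.

        The (in1, in2) encoding is assumed: (1, 0) forward, (0, 1) reverse,
        (0, 0) brake toward the ground reference.
        """
        if effort > 0:
            in1, in2 = 1, 0
        elif effort < 0:
            in1, in2 = 0, 1
        else:
            in1, in2 = 0, 0
        duty = min(abs(effort), 1.0)  # fraction of the PWM period
        return in1, in2, duty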

Red Bot and Blue Bot use their Megas to drive the motors with a PID velocity controller in software that passes an effort level to the Mega. Red Bot's Mega uses a PID position controller for the two servos that operate the pan and tilt mechanism holding Red Bot's camera. The servos' built in controllers are convenient for the project but also hinder the accuracy of the tracking; in the future a motor and rotary encoder will be used to control the pan and tilt mechanism. A servo hinders tracking because its command signal must be the desired angle itself, passed by holding a low frequency signal high for a certain amount of time, which limits the overall resolution possible. A rotary encoder, by contrast, is limited only by the number of counts it is capable of. Additionally, it would then be easy to include both a velocity and a position controller instead of just a position controller, which will in the future allow for smoother tracking.

It was difficult to generate good command signals for the servos and motors because the loop rate was so low. The 10 Hz loop rate does not work well at all for tracking or velocity control. It works acceptably, but the system has large steady state error because the integral gain of the PID cannot be set very high, otherwise wind-up occurs at the low loop rate. The proportional gain also cannot be set very high, otherwise the oscillations grow until the system goes unstable, and the derivative gain has the same problem. Overall the loop rate needs to be increased by well over a factor of ten. Unfortunately the IR sensors cannot be sampled faster than 20 Hz, so in the future they will be sampled conditionally rather than on every loop iteration, which will allow for better control. Another problem with control was the motor driving frequency not being above the audible range, which made the motors less stable. The motors are currently driven at around 4000 Hz; this is too low for these motors, and it shows in the stability of the system. This compounds with the low loop rate to greatly decrease the controllability of the motors. It will be fixed in the future.

The code for the robots' control is found in the appendix. The hardware code on the Megas is in red_bot_v1.ino and blue_bot_v1.ino. These files contain all the code for driving the motors, reading the encoders, reading the servos, driving the servos, and reading the IR proximity sensors.

The higher level code for driving the motors is broken up into many parts. The three IR sensors on each robot each generate a subtractive velocity command depending on how far the sensor reads it is from any object; the raw IR value is passed from the Mega. Each sensor has its own program, found in the appendix as front_side_error_v3.py, right_side_error_v3.py, and left_side_error_v3.py. These programs use a hyperbolic tangent to generate the error command from the difference between the measured distance and a minimum allowed distance. This keeps the vehicle off of objects. The hyperbolic function is shaped using a scalar, and the result has one subtracted from it. Using this technique it is possible to shape the response so that when everything is far away it has no effect, but as an object gets closer the effect grows stronger until it saturates at the minimum distance.

The formula is:

cmd = tanh(1.75 * (sensor_read - min_distance)) - 1

where the minimum distance used was 12 inches, sensor_read is the read voltage converted to distance, and cmd is the output command. This leads to the response shown in Figure 4.

Figure 4. Plot of the response cmd = tanh(1.75 * (distance - 12)) - 1. As shown, the response saturates exactly at 12, but the distance has almost no effect until it is within about three inches of that point.
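A minimal sketch of this shaping function, with distances in inches:

    import math

    MIN_DISTANCE = 12.0  # inches; the point where the response saturates

    def ir_velocity_command(distance):
        """Subtractive velocity command from one IR sensor.

        Near zero when obstacles are far away, approaching -2 as the
        reading falls to and below the minimum allowed distance.
        """
        return math.tanh(1.75 * (distance - MIN_DISTANCE)) - 1.0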

The current motor speed is calculated for each motor by reading in the counts and count time from the encoders, which are passed from the Mega. This gives counts per second, which is converted to inches per second. These programs can be found in the appendix as right_motor_measured_speed.py and left_motor_measured_speed.py.
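A sketch of this conversion; the encoder resolution and wheel diameter below are assumed values for illustration, since the report does not list them:

    import math

    COUNTS_PER_REV = 3200.0  # assumed encoder counts per wheel revolution
    WHEEL_DIAMETER = 2.75    # assumed wheel diameter in inches

    def measured_speed(counts, count_time):
        """Convert encoder counts over an interval (seconds) to inches per second."""
        if count_time <= 0:
            return 0.0
        rev_per_s = (float(counts) / count_time) / COUNTS_PER_REV
        return rev_per_s * math.pi * WHEEL_DIAMETER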

In manual mode a proportional controller converts joystick commands into desired velocity commands: the left thumbstick sets forward or reverse power and the right thumbstick sets turning power. The operator decides whether the system is in autonomous or manual mode by pressing a button on the controller. This button must be held for autonomous mode, otherwise the system stays in manual mode. This is done to prevent accidents where the controller is dropped and the robots run into something during testing. Having a finger on the kill button seems like the best choice for starting out: nobody can predict what will happen, and it is much faster to let go than to try to press something. The controller is read using the joy package in ROS. The controller commands are split between the two robots by the controller_splitter.py program, which can be found in the appendix. This allows the robots to be driven independently using the same joysticks. If the right bumper is pressed, commands go to Red Bot; if the left bumper is pressed, commands go to Blue Bot; if both are pressed, commands go to both; and if the A button is pressed, the robots go autonomous.
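A sketch of this routing logic, assuming typical joy message button and axis indices for an Xbox controller (the indices in the real controller_splitter.py may differ):

    def split_command(buttons, axes):
        """Route Xbox controller input; buttons/axes follow the ROS joy layout."""
        a_button = buttons[0]      # hold for autonomous (dead man's switch)
        left_bumper = buttons[4]   # assumed index; commands to Blue Bot
        right_bumper = buttons[5]  # assumed index; commands to Red Bot
        drive, turn = axes[1], axes[4]  # left stick fwd/rev, right stick turn

        if a_button:
            return {"mode": "autonomous"}
        targets = []
        if right_bumper:
            targets.append("red_bot")
        if left_bumper:
            targets.append("blue_bot")
        return {"mode": "manual", "targets": targets, "drive": drive, "turn": turn}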

All of these programs pass their values to the main controllers, tele_motor_bluebot.py and tele_motor_redbot.py, which can be found in the appendix. In these programs the command values from the controller or the IR sensors are converted into desired velocity commands. The desired velocity is then passed to a closed loop PID controller, which uses the measured velocity programs for feedback. The maximum velocity in the program is 18 inches per second. In controller mode the thumbsticks generate a desired velocity proportional to the maximum; in autonomous mode the IR commands increase or decrease the desired velocity to avoid obstacles, as previously shown in Figure 4. The gains used for testing are shown in Table I.

Table I
Gains Used for PID Control of Motors

kp       kd        ki
0.012    0.00008   0.0

As shown in Table I, the integral gain was set to zero. This was done because the system sampling rate was too low, and integrator wind-up occurred if the gain was high enough to have any effect.
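A minimal sketch of such a velocity loop using the Table I gains and the 10 Hz update rate discussed earlier; the real tele_motor programs are structured around ROS topics and may differ in detail:

    KP, KD, KI = 0.012, 0.00008, 0.0  # gains from Table I
    DT = 0.1                          # seconds per update at 10 Hz

    class VelocityPID:
        """PID on velocity error; the output is an effort level for the Mega."""

        def __init__(self):
            self.prev_error = 0.0
            self.integral = 0.0

        def update(self, desired, measured):
            error = desired - measured        # inches per second
            self.integral += error * DT       # no effect while KI = 0
            derivative = (error - self.prev_error) / DT
            self.prev_error = error
            return KP * error + KD * derivative + KI * self.integral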

These gains resulted in about four inches per second of steady state error. In the future a better digital controller will be used. This controller allowed the system to run in a stable state, but the system could be much better if the peak time and rise time were reduced. To do this accurately, step response data could be taken and used, but there was not enough time in the current iteration. The major issue is the sampling rate, which really needs to increase, but a lot of time is required to fix the sampling rates for everything in a way that keeps the network, as well as the robots, stable. The problem is mainly determining the best way to pass information over the network.

Sensors

The ZMAPS robot network is designed to accomplish the task of two dimensional map building. As stated previously, Red Bot has a camera on its highest platform mounted to a pan and tilt mechanism, which allows Red Bot to easily look around at its environment, and Blue Bot has a colored object, a 9 inch by 9 inch pink cylinder, on its top platform. The cylinder and the camera on the pan and tilt mechanism allow Red Bot to find and track Blue Bot as Blue Bot drives through a room. The location of Blue Bot is reported relative to the location of Red Bot.


Red Bot uses image processing techniques to find the colored object in an image, then uses the pinhole camera model to approximate the distance to the object. This location of Blue Bot is passed to the laptop, which builds a map from the location values sent by Red Bot. The map shows all the locations Blue Bot travels and also the places it does not travel, allowing wall and obstacle locations to be determined.

The camera used by Red Bot is a PlayStation 3 Eye (PS3 Eye) camera, which works out of the box with Ubuntu. The camera was chosen because it is inexpensive but can run at 60 frames per second at 640x480 pixels, or at 125 frames per second at 320x240. Red Bot uses the camera in the 320x240 configuration at 30 fps, because the U3 was not able to process the code very fast at the full frame size but can easily run it at the smaller size. The camera is mounted on the pan and tilt mechanism driven by two servos, allowing Red Bot to easily follow Blue Bot around.

The pan and tilt mechanism is designed specifically for the PS3 Eye camera. As shown in Figure

5 the mechanism consists of the tilt mount, the pan mount, the camera, and the two servos.

The tilt mount and the pan mount are both made on an additive manufacturing machine out of polycarbonate (PC). Because the parts were printed from PC at a lower density build, they have lower stress carrying ability. As shown in Figure 5, the servo horn mount point on the pan mount has adhesive foam tape to help hold it together. Although adhesives mainly work in shear, the tape was still applied to help if the PC started to crack, which it did. Adhesive foam tape was also added to the tilt mount where the tilt servo is attached. The screws used to mount the tilt servo were not self-tapping and caused the small piece of wall between the servo and the holes to crack slightly. The servo is secure, but the foam tape was added for extra support.

The tilt mount was designed to mount to the existing holes in the camera's printed circuit board, as shown in Figure 5, using the PS3 Eye screws to fasten the board to the mount. The tilt servo horn has a cutout in the tilt mount; this takes the load off the screws and lets the horn carry it. The tilt servo itself and the pan servo horn also have cutouts in the pan mount for the same reason. The entire assembly is held securely to the highest level using the foam tape.

The pan and tilt mechanism was designed to let Red Bot rotate the camera about the axes of the lens, that is, the lens center stays in place as the camera rotates about both axes. This makes the camera tracking more accurate and keeps the images much smoother and more stable.


Figure 5. This image shows a view of the pan mount, the tilt mount, the camera, and the two servos

assembled on Red Bot’s highest level.

The object sits on top of Blue Bot as shown in Figure 2. A 9 inch by 9 inch cylinder was chosen because it appears as almost a square from Red Bot's viewing angle, which allows the aspect ratio of the object to be used as a tracking feature. The color pink was chosen because it stands out very well against most other colors. The size was chosen because that is about as big a cylinder as can fit on Blue Bot.

The program used by Red Bot is called camera_centroid_radius_2.py and can be found in the appendix. The image processing is done using the Open Source Computer Vision (OpenCV) Python libraries. The image processing has gone through multiple stages and by media day will be in its tenth iteration. The code runs faster than the network can, so that was decided to be fast enough.

The current implementation tracks the object very well across a large lighting gradient as long as the object remains almost upright. This is done by preventing the camera from automatically adjusting the gain while allowing the white balance and exposure to remain automatic. Previously the object was easily lost if the lighting gradient was too large, but a solution was found by making the process less dependent on lightness or darkness. This was primarily done by increasing the hue of the images read in. By doing this, each image could span the entire range of value and be filtered by saturation and hue alone, which accommodated large lighting changes. This also helped immensely in rooms with incandescent light; the effect of the yellow light was greatly reduced by shifting the image hue.

The image is then filtered using a low pass Gaussian filter with a 5x5 neighborhood: the filter looks at a small portion of the image, called a kernel, and determines the value of each pixel from its neighbors within that portion. The filter removes high frequency noise and effectively blurs the image. Removing the noise allows for better color tracking because the object becomes more consistent in color.

The blurred image is then converted from OpenCV's default color representation, Blue, Green, and Red (BGR), to Hue, Saturation, and Value (HSV). Thresholding is easier to perform in HSV and similar representations because changes in color map to tints and shades, with pink sitting at the upper end of the hue range. The HSV image is put through a threshold, after which only the pink or close to pink tints and shades remain. The camera settings and the thresholding values found to track the pink cylinder in varying lighting conditions are shown in Table II.

Table II
Camera Settings for Images and the Threshold Values Used

Brightness                  0.0
Contrast                    35.0
Saturation                  65.0
Hue                         120.0
Gain                        15.0
Blur Value                  21.0
Upper Hue Threshold         179.0
Upper Saturation Threshold  255.0
Upper Value Threshold       255.0
Lower Hue Threshold         136.0
Lower Saturation Threshold  60.0
Lower Value Threshold       0.0

The result of this process is a binary image of white true values and black false values, where true means the pixel passed the HSV threshold conditions. Despite the earlier blurring, this image still has noise. Noise is removed by eroding and then dilating the image using a 5x5 neighborhood. Eroding looks through a pixel's neighbors, defined by a kernel, and keeps the pixel true only if all of its neighbors are true; it removes a large amount of noise but also a large amount of the cylinder. This loss is recovered by dilating the image: dilating looks through a pixel's neighbors, defined by a kernel, and makes the pixel true if any of its neighbors are true. The eroded and dilated image is a binary mask of the original BGR image in which the true pixels represent the desired shades and tints, pink in this case.
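A condensed sketch of this masking pipeline using the Table II thresholds; the 5x5 kernels follow the neighborhood sizes described in the text:

    import cv2
    import numpy as np

    LOWER = np.array([136, 60, 0], dtype=np.uint8)     # lower H, S, V (Table II)
    UPPER = np.array([179, 255, 255], dtype=np.uint8)  # upper H, S, V (Table II)
    KERNEL = np.ones((5, 5), np.uint8)

    def pink_mask(bgr_frame):
        """Blur, convert to HSV, threshold for pink, then erode and dilate."""
        blurred = cv2.GaussianBlur(bgr_frame, (5, 5), 0)
        hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        mask = cv2.erode(mask, KERNEL, iterations=1)
        return cv2.dilate(mask, KERNEL, iterations=1)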

The contours in the binary mask are found, and the contour with the largest area has its aspect ratio checked. If the ratio is within 40% of 1, the object is assumed to be the cylinder; if not, that contour is thrown out and the next largest is checked. If a contour passes both tests, it is taken to be the cylinder, and its centroid is used to generate an angle error between the centroid and the center of the frame. The camera's internal values in pixels were determined beforehand and are shown in Table III.

Table III
Internal Camera Values in Pixels

Focal Length   Width   Height
279.667        319.5   239.5

Using these values, it is possible to use the right triangle relations of the pinhole camera model to determine the angle error between the two locations. This model is shown in Figure 6.

Figure 6. Pinhole camera model. If the focal length f and the height y1 are known, it is possible to determine the angle of the triangle, which is the tracking error when point Q is the centroid of the cylinder and the projection of point Q on the X3 axis is the center of the frame in pixels. Similarly, if x1 is also known, it is possible to solve for x3 using similar triangles; this is used to calculate the distance when the centroid and the center of frame are about the same.

As shown in Figure 6, if the difference between the center of the frame and the centroid is known, as well as the focal length, it is possible to calculate the tracking angle error as:

theta_error_pan = atan((px - cx) / f)

where px is Q (the centroid), cx is the projection of Q (the center of the frame), f is the focal length, and theta_error_pan is the tracking error in the pan. The same formula applies for the tilt, using py and cy instead:

theta_error_tilt = atan((py - cy) / f)

The focal length was found not to change between the two axes.

Also shown in Figure 6, if the height (or equally the width) of the cylinder in pixels is known, then it is possible to calculate the distance to the object, because the true height and width of the cylinder are known and the focal length in pixels is also known. This allows the distance to be calculated as:

-y1 / f = x1 / x3,  or  x3 = x1 * f / y1

This relation can be applied to both the width and the height, which was done after fitting a minimum area rectangle to the contour in pixel space. This fit is done in the area and aspect ratio checking routine, and it is how the aspect ratio is checked. The distance is calculated from both the width and the height, and the two values are averaged to find the distance. Experimentation showed the accuracy at 8 feet is approximately 3 inches; this uncertainty was found to increase quickly past this distance, which shows in the maps later in this report.
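A condensed sketch of this measurement step using the Table III internals and the 9 inch cylinder; reading "within 40% of 1" as a width to height ratio between 0.6 and 1.4 is an assumption, and camera_centroid_radius_2.py certainly differs in detail:

    import math
    import cv2

    F = 279.667            # focal length in pixels (Table III)
    CX, CY = 319.5, 239.5  # frame center in pixels (Table III)
    OBJECT_SIZE = 9.0      # cylinder width and height in inches

    def locate_cylinder(mask):
        """Return (pan error, tilt error, distance in inches) or None."""
        # index [-2] picks the contour list across OpenCV versions
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        for contour in sorted(contours, key=cv2.contourArea, reverse=True):
            (px, py), (w, h), _ = cv2.minAreaRect(contour)
            if w == 0 or h == 0 or not 0.6 <= w / h <= 1.4:
                continue  # aspect ratio check failed; try the next largest
            theta_pan = math.atan((px - CX) / F)
            theta_tilt = math.atan((py - CY) / F)
            z_w = OBJECT_SIZE * F / w  # distance from apparent width
            z_h = OBJECT_SIZE * F / h  # distance from apparent height
            return theta_pan, theta_tilt, (z_w + z_h) / 2.0
        return None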

By calculating the tracking error it was possible to keep the cylinder at the center of the frame. Using the tracking error angles, a PID position controller was made to minimize the error. The program that takes in the angle values and calculates the command is servo_tracking_v1.py, which can be found in the appendix. The program takes the current pan and tilt angle errors from the camera threshold program and uses them as position commands in the PID controller. The gains for the PID controller are shown in Table IV.

Table IV
Gains Used for PID Control of Pan and Tilt Servos

kp      kd     ki
0.15    0.0    0.0

The derivative and integral gains were both zero because the loop rate was too low for them to help. This loop itself could run as fast as it wanted, but the Mega was still only updating at 10 Hz, which caused very slow but very large oscillations. In the future a faster loop rate will be tried, as previously stated, and the servos will be replaced by brushed motors with rotary encoders, also as previously stated.
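With only the proportional term active, the tracking update reduces to something like the following hypothetical simplification of servo_tracking_v1.py:

    KP_TRACK = 0.15  # proportional gain from Table IV

    def servo_command(current_angle, angle_error):
        """Nudge the servo angle command toward zeroing the tracking error.

        Angles are in radians here; the real program must convert to the
        encoding the Mega expects for the servos.
        """
        return current_angle + KP_TRACK * angle_error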

An image of an angle estimate and distance estimate is shown in Figure 7. Note that the cylinder shown is not the 9 inch by 9 inch cylinder but its predecessor, a 5 inch by 10 inch cylinder.


Figure 7. An example run calculating the error in the pan and tilt angles, shown as thetax and thetay respectively, and the distance to the cylinder from its width and height, shown as zw and zh respectively. The distance used is the average, called z; zdiff is the difference between the two. The frame rate was slowed down for this run. The cx and cy values are the centroid of the cylinder, shown as a blue dot; the white dot is the center of the image. The image is 640x480. A positive pan is to the left and a positive tilt is down, so both angles shown are negative because the camera needs to go right and up. The actual distance was around one meter, as approximated; note the distance measurements are in millimeters. This image also shows how the masking process removes all of the surroundings: the original image is shown in the bottom right and the masked image in the top right. As shown, only the cylinder remains, and its contour is outlined in green in the original image. Note that in this image the hue is normal, 90 degrees; it was not until the white light died a few days later that the yellow light issue previously mentioned was realized.

As shown in Figure 7, the cylinder is tracked well, and the angle and distance measurements are accurate enough for tracking. The current measurements are off by a small amount, largely because the center of the frame is not on the centroid. This exposes the camera's largest distortion, radial distortion. The radial distortion coefficients were found, but the added calculations slowed the program down too much to make the benefits worthwhile. Because the pan and tilt try to keep the centroid at the center of the frame, the centroid is typically close to the center, where the effects of radial distortion are minimal and the benefits of compensating for them equally minimal.

The other sensors, that is the IR range finders and the encoders, were previously discussed in the Actuation section as they are more related to that part of the robot.

Behaviors

For now, the robots are placed in an unknown enclosed environment, and Red Bot is placed where it can easily observe the course, typically the bottom right corner because it is easiest to explain. In the future Red Bot will find the optimal location itself, but that was too much for the time available. Blue Bot drives around randomly, and when Red Bot sees Blue Bot it begins tracking it. Blue Bot wanders randomly, avoiding obstacles using the previously described hyperbolic tangent method.

As Blue Bot drives around, its location is passed to the computer by Red Bot, using the previously described distance measurement and the current pan angle of the servo. With this information an occupancy grid is formed showing all the locations Blue Bot travels relative to Red Bot. If Blue Bot moves through a space twice, it is assumed that there is nothing in that location. As Blue Bot drives around, the locations it travels are recorded on an image, which displays a blue marker at Blue Bot's location and a red marker at Red Bot's location. The map starts out all black, and the locations Blue Bot passes over are increased in value, that is, they become increasingly whiter. Every minute a sweep of the map is done to clean it up by closing (dilating then eroding) a mask of the pixels that were hit twice. After five minutes the image is closed one final time and the final mask is all that remains. The kernel size for closing is 3 pixels; since each pixel in the image represents an inch, that equals 3 inches. This value was chosen because it is consistently smaller than the resolution of the system, which is typically around 3 to 6 inches. An example course and the resulting map are shown in Figure 8 and Figure 9.


Figure 8. Image of a course during a run. Blue Bot (left) is driving around while Red Bot (right) tracks it. A slanted board runs along the bottom left, a box sits in the top left, Red Bot sits in the top right, and two boxes block half the center.

Figure 9. The resulting occupancy map after about 10 minutes. The bottom left shows the slant of the wood. The top left corner, where the box is, was somewhat covered up. The boxes are clearly shown in the center, and Red Bot is clearly shown in the top right. Most of the corners are rounded because Blue Bot avoids getting that close to anything.


The map shown in Figure 8 and Figure 9 makes clear that there is a large uncertainty associated with measurements at increasing distance; this is seen in the box in the top left being slightly covered and in the roundness of the corners. In future iterations Blue Bot will make its own measurements, allowing for faster mapping and more accurate maps. Overall the maps turned out well, but they show a lot of room for improvement, as previously mentioned.
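A sketch of the periodic cleanup sweep described above, assuming the map is kept as a per-pixel hit count at one pixel per inch:

    import cv2
    import numpy as np

    CLOSE_KERNEL = np.ones((3, 3), np.uint8)  # 3 px, below the system resolution

    def sweep_map(hit_counts):
        """Close (dilate then erode) a mask of cells Blue Bot crossed twice."""
        visited_twice = np.where(hit_counts >= 2, 255, 0).astype(np.uint8)
        return cv2.morphologyEx(visited_twice, cv2.MORPH_CLOSE, CLOSE_KERNEL)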

Experimental Layout and Results

Much of the experimentation was done while tuning the PID controllers and calibrating the IR sensors. A single IR sensor was calibrated and assumed to be a good representation of all the IR sensors because they are the same model. A plot of the calibration curve is shown in Figure 10.

Figure 10. Plot of the IR data fit. The vertical axis is the distance measured in inches and the horizontal axis is the number returned by the analog to digital converter.

As shown in the plot, the IR calibration equation was determined to be:

distance = 7389.8 * (ir_value)^(-1.074)

This was found to give about 1 inch of accuracy over the range shown in Figure 10, which is accurate enough for the IR sensors' purpose: keeping the robots from hitting things.
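In code the fit is a one-liner:

    def ir_distance(adc_value):
        """Convert a raw ADC reading from the Sharp IR into inches (Figure 10 fit)."""
        return 7389.8 * float(adc_value) ** -1.074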

The other experiments performed were tuning the PID controllers, whose values are reported in Table I for the motors and Table IV for the servos. A large amount of tuning also went into the camera settings and the hue, saturation, and value thresholds; the final values are reported in Table II. Another experiment was finding the focal length and image center of the camera, reported in Table III. Beyond these experiments and tuning, the amount of data pushed over the network was also reduced drastically, as previously discussed.


Conclusion

Overall the mapping procedure turned out successful, but there are many areas that need improvement. Over the past four months I have learned a great amount about Linux, ROS, OpenCV, networks, and programming in general. I have learned the most about the Python language and have come to appreciate the simplicity of writing in an interpreted language; I have found that Python is fast enough but still easy to pick up and use. I would have liked to get better at C++, but there was not enough time to move everything over to C++ for this project, which was my original goal.

The maps my robots make have turned out to be off by about 2.5% at about 15 feet. Much farther than this and the map accuracy greatly decreases because of the resolution of the camera. In the future the resolution will not be as large an issue, because the robots will be able to move through the space and build larger maps by stitching together smaller ones.

The update frequency proved to be the largest hurdle because it substantially reduced the overall accuracy of tracking, which in turn resulted in less accurate maps. The next iteration of the system is now being developed and will only be better for these realizations.

I do not know what I could have done differently to get the project going better initially, besides knowing a bunch of things I did not know. I came into this project having only just heard of ROS and OpenCV, and I was not used to Linux or the terminal. Now I feel much more confident in all of these things and in programming in general. None of them were simple, but time spent on them made them much easier. I feel confident I took a good path in building the system, but I could have wasted less time on little things and run the system on the ODROIDs earlier. I wish I had used GitHub, though I managed iterations of the software just fine. I also wish I had broken the programs up into even smaller pieces, though I tested things often enough; it could always be better. Learning to use the Transmission Control Protocol (TCP) and the Internet Protocol (IP) on my own would have been helpful, but I feel confident that letting ROS handle those things for me was a good choice.

I would have liked to spend more time on the mapping portion of the system, but other things took much longer than I anticipated. A better computer and camera would also be nice, but the hardware I used was adequate. I wish I had thought about suspension. Thanks for a great semester of learning.


Documentation

ODROID C1: http://www.hardkernel.com/main/products/prdt_info.php?g_code=G141578608433&tab_idx=2

ODROID U3: http://www.hardkernel.com/main/products/prdt_info.php?g_code=G138745696275&tab_idx=2

Arduino Mega 2560: http://arduino.cc/en/Main/ArduinoBoardMega2560

Sharp GP2Y0A02YK0F: https://www.sparkfun.com/datasheets/Sensors/Infrared/gp2y0a02yk_e.pdf

Pololu Motor Driver: https://www.pololu.com/product/708/resources

Pololu Motor and Encoder: https://www.pololu.com/product/1442/specs

Appendices

The project website contains a list of the sensors, actuators, and boards used in this project in the informal proposal section. It also contains links to the packages and libraries mentioned in the report. The website address is:

https://sites.google.com/site/zacksimdlmappingsquad/home

Program Code to run the Network:

Red Bot:

Launch file for all the Red Bot programs: red_bot.launch

Front IR error command generator: front_side_error_v3.py

Right IR error command generator: right_side_error_v3.py

Left IR error command generator: left_side_error_v3.py

Left motor measured speed generator: left_motor_measured_speed.py

Right motor measured speed generator: right_motor_measured_speed.py

Visual feedback generator, finds the object: camera_centroid_and_radius_2.py

Servo command generator: servo_tracking_1.py

Motor command generator: tele_motor_redbot_1.py

Serial communication driver between Odroid and Arduino ROS package: serial_node.py

Low level driver for all the sensors: red_bot_v1.ino

Blue Bot:

Launch file for all the Blue Bot programs: blue_bot.launch

Front IR error command generator: front_side_error_v3.py

Right IR error command generator: right_side_error_v3.py

Left IR error command generator: left_side_error_v3.py

Left motor measured speed generator: left_motor_measured_speed.py

Right motor measured speed generator: right_motor_measured_speed.py

Motor command generator: tele_motor_bluebot_1.py

Serial communication driver between Odroid and Arduino ROS package: serial_node.py

Low level driver for all the sensors: blue_bot_v1.ino


Laptop:

Xbox controller driver ROS package: joy_node

Xbox controller command generator: controller_splitter.py

Map maker: map_display_2.py

Map viewer: map_viewer_1.py

All of the code can be found on the website described above, in the final report section as a PDF and in the code section as a zip file.