
FYP Progress Report

Title: Camera guided robot

BE Electronic & Computer Engineering
College of Engineering and Informatics, National University of Ireland, Galway

Student: Joseph Fleury, 09432493

17th December 2012

Project Supervisor: Martin Glavin

Contents

1 The Aim For The Year
2 Project Development Plan
   2.1 Research Done
   2.2 Time Line Plan (Figure 2.2.1)
3 Progress To Date
   3.1 Pandaboard (Figure 3.1.1)
   3.2 Application
   3.3 Robot, Hardware & Software
4 Future Plans
5 Health and Safety Issues
6 Conclusion
7 References

1 The Aim For The Year

The objective of this project is to use a conventional webcam to guide a robot to a target, and to identify the target accurately using a pen or laser pointer. The system should be capable of fully autonomous control over a given distance. The camera will point at the ground, and will detect and track objects on the ground to guide the motion of the robot towards them. The system will “mark” each target on the ground using either a laser pointer or a felt-tip pen, and will move to the next object in the sequence until the sequence is complete.

The first phase of the project will involve getting OpenCV (Open Source Computer Vision) to detect objects in a desktop environment, and subsequently on an embedded applications board. The camera system will then be mounted on a robotic platform capable of moving forward and steering accurately.

The next stage of the project will be to get a robot chassis and, using an embedded processing board, use the camera to guide the robot. The system can use circle detection and tracking to position the vehicle approximately over the target.

After that, the accuracy of the system will be improved. The system should be capable of following a trail of circles on the ground, tracking them and accurately identifying their positions by marking them on the ground (possibly using a pen).

The final phase of the project will be to track plants on the ground (against a relatively benign background of soil), and use the pen (attached to the robot) to draw a line approaching, but not touching, each plant.

2 Project Development Plan

2.1 Research Done

Before starting to develop any software or robotic platform, a lot of research was done into suitable embedded platforms, off-the-shelf robots and OpenCV itself. A lot of time was put into choosing the type of object tracking that would suit the application best, as well as the best techniques for extracting information about the object being tracked.

The development platform is very important for the project. It needs to be capable of handling threads and must have expansion ports. The Pandaboard was the embedded platform I decided on because it has a lot of support available, and because of its powerful computing ability: it has an OMAP4460 dual-core ARM processor as well as 1 GB of RAM. After some time spent deciding on an OS suitable for developing OpenCV applications, I concluded that Ubuntu OMAP4 was the most suitable, since it is supported by TI for the Pandaboard and for ARM development. This version of Ubuntu uses very little RAM, leaving a large amount available for processing.

The O’Reilly book “Learning OpenCV” provided me with great resources for learning how to use the functions in the OpenCV library, such as the threshold and circle-detection examples. In the end, after thinking the system through, I realised that a circle-detection algorithm alone was not going to be enough for tracking an object in an image.

After looking into alternative ways to track an object in an image, I discovered a library hosted on Google Code that handles blobs in an image, the CvBlobs library. This allowed me to focus on finding the best way to threshold the image to extract a blob. It was clear, after advice from my project supervisor, that the best way for me to keep track of an object in an image was to use Kalman tracking. This provides the ability to predict where the object is going to be, which is useful if the object is not recognised in any given frame.

After some time spent looking for a cheap robot, I discovered the Dagu Rover 5. It has encoders and two motors. One of the main reasons I chose this robot is that its encoders can inform the application of the speed at which the robot is travelling.

2.2 Time Line Plan (Figure 2.2.1)

Figure 2.2.1

3 Progress To Date

3.1 Pandaboard (Figure 3.1.1)

The system runs on an embedded platform called the Pandaboard. This board has a 1.2 GHz dual-core ARM processor as well as one gigabyte of RAM, which makes it very good for real-time processing: it has plenty of memory and is very powerful. It also has an SGX540 graphics core supporting all major APIs. The Linux distribution installed on the board is Ubuntu OMAP4. It was chosen for development because it has straightforward instructions for setting up OpenCV and the CvBlobs library, and because it is a lightweight operating system.

Figure 3.1.1

3.2 Application

The program reads a video stream from the camera and detects red circles in the images using several different techniques. First, a threshold is applied in the HSV colour space (it is best to use HSV in situations where lighting may vary), which is good for detecting a wide range of colours (figure 3.2.2). The resulting image is then run through a morphological kernel that checks for elliptical shapes (since circles are a special kind of ellipse, shown in figure 3.2.1, and CvBlobs can only take a rectangular or elliptical shape as an argument). This is applied only once, so as not to distort the image too much.

Figure 3.2.1

Figure 3.2.2: HSV image with red ball

From this, the image is passed to a CvBlobs function, where blobs (the circles) that fall within the specified boundaries (minimum and maximum pixel size) are detected and returned in the form of a blob array. This array is then iterated through, and the centroid and radius of each suitable blob are found. Once these two pieces of information have been found, it is just a matter of using them to track the object through a Kalman filter.

Figure 3.2.3 shows the result after finding a circular blob in figure 3.2.2.

Figure 3.2.3: Resulting binary image after segmentation.

A Kalman filter is used to track the object as it moves in the image. It does this by using the measured series of centre points from above as the tracking data (i.e. the position of the object in the (x, y) plane). It also models variances, such as types of noise and other inaccuracies, which help to produce an estimate of where the object’s centre point (i.e. the object) will be. This is important because any given frame can become corrupted by some unknown event, and the robot will then need to continue on a fairly accurate assumption of where the object is.

The Kalman filter works as follows. In the prediction phase, the filter produces an estimate of the centre point. Once the next centre point is measured, the prediction is updated using a weighted average that gives more weight to estimates with higher certainty. To make this run in real time, only two quantities are needed: the centre point at the present time and the previously calculated centre point. This is possible because of the recursive nature of the Kalman filter. So far the filter has been used to track a circular red-topped pen, as well as circular objects in a picture, as they move through different frames (the noise and variance parameters are updated to suit the different environments). The result has been reasonably accurate in both prediction and measurement. Figure 3.2.4 below shows the Kalman filter working.

Figure 3.2.4: Green circle is the actual position; red circle is the predicted position.
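The predict/correct cycle described above can be sketched as a minimal constant-velocity Kalman filter. This is plain NumPy rather than the OpenCV Kalman structures used in the project, and all parameter values are illustrative assumptions.

```python
import numpy as np

class CentreTracker:
    """Minimal constant-velocity Kalman filter for an (x, y) centre point."""

    def __init__(self, process_noise=1e-2, measure_noise=1e-1):
        dt = 1.0  # one frame per step
        # State is [x, y, vx, vy]; the measurement is [x, y]
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * process_noise   # process noise covariance
        self.R = np.eye(2) * measure_noise   # measurement noise covariance
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def predict(self):
        # Prediction phase: project the state and covariance forward
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        # Correction phase: a weighted average of prediction and
        # measurement, weighted by the certainty of each (Kalman gain)
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```

Calling `predict()` without a subsequent `update()` is exactly the fallback behaviour described above for frames where the object is not recognised.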

3.3 Robot, Hardware & Software

Figure 3.3.1: Dagu Rover 5 robot

The robot is not yet up and running, due to an issue with the Pandaboard’s logic-high voltage (2 V) on the expansion ports. The H-bridge I am using, the MC33887 dual motor driver, needs a minimum 3 V input to ensure the PWM is registered. This device was damaged during testing when too much current was drawn from the power supply.

To solve the problem of inadequate voltage from the Pandaboard, I am using an SN74LS04N inverter as a buffer for the driver. This produces an output voltage within the driver’s input range. To run the motors, I need the Pandaboard to output a 10 kHz PWM signal. This may be tricky, as the Pandaboard OS is not an RTOS.

The robot will feed information about its speed back to the Pandaboard over a serial connection, using the encoders built into both motors (1000 state transitions = 3 rotations). The program will interpret this information and deduce the length of time it will take to reach the object. This time will depend on a number of factors, such as the surface the robot is on and the speed of the robot. The entire system, when put together, should be able to track a sequence of objects.
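The speed calculation can be sketched as follows, using the encoder spec above (1000 state transitions = 3 rotations) plus a wheel diameter that is assumed for illustration, since the report does not give one.

```python
import math

TRANSITIONS_PER_ROTATION = 1000 / 3  # from the encoder spec above
WHEEL_DIAMETER_M = 0.06              # assumed for illustration only

def speed_m_per_s(transitions, interval_s):
    """Wheel speed from encoder transitions counted over an interval."""
    rotations = transitions / TRANSITIONS_PER_ROTATION
    distance = rotations * math.pi * WHEEL_DIAMETER_M
    return distance / interval_s

def time_to_target(distance_m, transitions, interval_s):
    """Estimated time to reach an object at the given distance,
    assuming the current speed is held constant."""
    return distance_m / speed_m_per_s(transitions, interval_s)
```

In practice the estimate would also need the surface-dependent correction factors mentioned above.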

A robotic arm, controlled by the application, will hold a pen and draw a point at the centre of the circle. It will also be able to draw a line outlining the circle’s circumference.

4 Future Plans

Next on the agenda is to get the robot running with the Pandaboard and motor driver. An Arduino will be needed to run the robot, since the Pandaboard does not have an RTOS kernel. This is an issue because it is not possible to ensure that a steady 10 kHz PWM will be output from the Pandaboard ports, since the generation process might get interrupted. This will hopefully be done by the second week back, and then it is down to calibrating the robot with the camera. Calibration includes interpreting the centre points in the 2D image and applying them to the real world using a robotic arm, which has still not been decided on (this should also be done within the first two weeks back). It also includes issues such as noise being introduced into the image by the movement of the robot, and the change in light as the robot moves to different areas.

I plan on improving my circle-detection application with several different techniques, such as blurring and smoothing the image before any other operation is applied. This should make it easier for the morphological kernel and the CvBlobs library to find a more accurate circle in the image. I also plan on adding other functionality, such as a method to detect the closest circle in the image, so that the Kalman filter can focus on the nearest ball first. This will make it easier for the robot to distinguish between the real-world object closest to it and the next object in the scene.
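A minimal version of the nearest-circle selection might look like this, under the assumption (not stated in the report) that with the camera facing the ground ahead of the robot, nearer objects appear lower in the frame, i.e. with a larger y coordinate.

```python
def closest_blob(blobs):
    """Pick the blob nearest the robot from a list of (cx, cy, radius)
    tuples, assuming larger y means closer. Returns None if empty."""
    return max(blobs, key=lambda b: b[1], default=None)
```

The chosen blob's centroid would then seed the Kalman filter, with the remaining blobs queued as later targets.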

Once calibration has been accurately carried out, the robot will be able to track a sequence of plants against a soil background and use the pen to draw a line approaching each plant and around it, without touching the plant.

To test the robot’s performance as soon as possible, I will be using simpler applications to test its responses. This test will use a white sheet of paper, with the percentage of white in a frame controlling the movement of the robot (e.g. if the left half of the image is not white, turn left; if more than half the image is not white, stop). This will help me in the calibration process.
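The white-sheet rule above can be sketched as a simple decision function. The half-frame thresholds are the report's rough rule, not tuned values, and the command names are hypothetical.

```python
import numpy as np

def steer_from_white(mask):
    """Toy steering rule for the white-sheet test. 'mask' is a binary
    image where white pixels are 255. Returns a command string."""
    white = mask > 0
    if white.mean() < 0.5:
        return "stop"            # more than half the frame is not white
    half = mask.shape[1] // 2
    left = white[:, :half].mean()
    right = white[:, half:].mean()
    if left < right:
        return "turn_left"       # the left half has lost the sheet first
    if right < left:
        return "turn_right"
    return "forward"
```

Running this per frame and sending the command to the motor driver would exercise the whole control path before the vision pipeline is calibrated.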

After this, to test the performance of the robot, it will first be placed on a sheet of paper with a number of circles of different sizes, accurately drawn a known distance from each other. The robot should be able to drive along and place a dot in the centre of each circle, and stop automatically when it reaches the end of the sequence. It will then be placed on a similar sheet of paper and must draw a line up to a circle and accurately around its circumference. The distance of the line from the circle’s circumference (edge) will be controlled by a scalar factor that can be set manually at the start of the application.

Movement of the robot is a big issue, as the robot will move differently on different surfaces. For example, on a slippery surface it might turn quicker due to sliding, but on a rough surface it might turn noticeably slower. This causes a problem when calibrating the length of time the robot needs to reach the desired position. Different levels of noise might also interfere with the application if it is not calibrated to suit different surfaces.

Once the robot can carry out the above tests accurately and reliably, the targets will change to coloured circular objects on a white sheet of paper on the ground. The objects will be detected by separating the video frames into their HSV values, with further processing done on the frames. The objects will be placed slightly out of line with the robot, and the robot must be able to steer accurately towards them and draw an outline of each as it moves.

The final part of this project will be to get the robot to detect a plant on a piece of paper in any given frame and draw a line around the plant. This, in theory, would be the line along which a robotic arm capable of weeding would move.

5 Health and Safety Issues

This project involves some work with batteries and motors. The robotic system will be bought in kit form. Battery power will be used, and battery charging should be done using a charger approved for that type of battery. For testing of autonomous behaviour, the robot should be fitted with a safety cut-off switch to ensure that it can be disabled quickly should anything go wrong with the navigation system.

6 Conclusion

To date I have been able to successfully capture real-time video images and perform operations on them. I have object tracking working reasonably well using Kalman filters, CvBlobs and different thresholding and morphological techniques, and I can track a red circle in real time.

The robot is not working yet, but I expect to have it working by January. Getting it working involves obtaining another motor driver and setting it up correctly with an Arduino (since the Pandaboard does not have an RTOS kernel) and the robot chassis. As for the robot arm, no choice has yet been made.

However, there are still many obstacles to overcome which, as mentioned in the Future Plans section, will need to be dealt with as quickly as possible; tackled in the right manner, they will result in the desired outcomes, which I am certain I can achieve. The project seems to be going along very well, and the software issues to date seem to be resolved.

Below are the tasks not yet complete:

Distance Detection to object:

Figure 6.1: How the calibration would change depending on the camera angle after mounting.

This is not done because the robot is not fully assembled, and the position and angle of the camera are not yet known. The setup is similar to figure 8.1-5, except that the camera is oriented to face the ground (i.e. the x axis is on the ground).

Get Robot, mount camera, calibrate:

This has not been fully completed. The robot has been obtained, but it is not fully functional, since a motor driver has not yet been got working with it. Calibration cannot occur until the robot is fully functional. The other robot tasks are not complete, but I expect to complete them very quickly once I have the robot up and running.

7 References

1. “Learning OpenCV”, O’Reilly, September 2008 edition, used September 2012. This was used for learning the libraries provided by OpenCV.

2. OpenCV documentation, http://docs.opencv.org, accessed September 2012. This was used for learning the libraries provided by OpenCV.

3. Pololu Robotics, http://www.pololu.com, accessed September 2012. This was used for learning about the robot chassis.

4. OCF Berkeley, http://www.ocf.berkeley.edu, accessed September 2012. This contains information on similar projects that use motion tracking.

5. Pandaboard, http://pandaboard.org/, accessed September 2012. This was used to gain information on the Pandaboard.

6. Figure 8.1-5, used to explain how the camera angle affects the maximum viewable area.
