

The Gaze Controlled Robotic Platform creates a sensor system using a webcam. A specialized robot built on the Arduino platform responds to the webcam input on the user’s cue. By processing the captured images, the system directs the robot to the target destination indicated by the user’s simulated gaze.

Website: http://www.ce.rit.edu/research/projects/2010_spring/Gaze-Controlled_Robotic_Platform/index.html

The user interface consists of two components: a push-button console and a pair of glasses. The glasses simulate the user’s gaze using a switch-activated laser pointer. The console has two buttons, one to initiate robot movement and another to recalibrate it. A status LED indicates when the robot has reached its destination. The same LED will flash in the event of an error, directing the user to the PC debug output.
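As a rough illustration of the console behavior described above, here is a minimal Python sketch of the status-LED logic. The state names and the tick-based flashing scheme are assumptions for illustration, not the project’s actual implementation.

```python
# Hedged sketch of the console status-LED behavior: solid on arrival,
# flashing on error, off otherwise. States and timing are assumptions.

IDLE, MOVING, ARRIVED, ERROR = range(4)

def led_output(state, tick):
    """Return True if the status LED should be lit on this timer tick."""
    if state == ARRIVED:
        return True              # solid on: destination reached
    if state == ERROR:
        return tick % 2 == 0     # flashing: check PC debug output
    return False                 # off while idle or moving
```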

Gaze Controlled Robotic PlatformRIT Computer Engineering Senior Design Project Spring 2010

Nicholas Harezga, Jeremy Thornton, Matthew Wozniak

System Description

User Interface

Constraints

Environmental
- Surface: Flat with no obstructions
- Lighting: Indoor lighting, dim to bright (detection range dependent on lighting)

Robot and Software
- Camera detection range: 7’ x 6’ (camera mounted 3-4’ high)
- Battery life: ~10 hours idle; ~2 hours in use (motor power will decrease as charge drops)
- Wheel encoder accuracy: 11.25 degrees
- Turning accuracy: 4.5 degrees
- Forward motion step: 0.1 inches
- Forward speed: ~1.6 ft/s
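The listed motion constraints imply some simple conversion arithmetic, sketched below in Python. The tick count and the quantization helpers are illustrative assumptions derived from the numbers above, not the team’s actual firmware.

```python
# Conversion helpers implied by the listed constraints. The specific
# function names and the assumption of a uniform tick/step model are
# illustrative only.

ENCODER_RESOLUTION_DEG = 11.25   # per wheel-encoder tick
TURN_RESOLUTION_DEG = 4.5        # smallest commandable turn
STEP_IN = 0.1                    # forward motion step, inches

def wheel_ticks_per_rev():
    """Encoder ticks in one full wheel revolution: 360 / 11.25 = 32."""
    return round(360 / ENCODER_RESOLUTION_DEG)

def quantize_turn(angle_deg):
    """Snap a requested turn to the robot's 4.5-degree turning accuracy."""
    return round(angle_deg / TURN_RESOLUTION_DEG) * TURN_RESOLUTION_DEG

def forward_steps(distance_in):
    """Number of 0.1-inch steps needed to cover a straight-line distance."""
    return round(distance_in / STEP_IN)
```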

System Overview

Costs

Component                                                  Cost      Cost to group
Asus EEE PC                                                $291.99   Already owned
Xbee 1 mW Chip (x2)                                        $45.90    $45.90
Xbee Explorer                                              $9.95     Already owned
Xbee Explorer USB                                          $24.95    $24.95
FT245RL to USB Board                                       $17.95    $17.95
33mm Push Button (x2)                                      $3.80     $3.80
Logitech Pro 9000 Webcam                                   $59.99    Already owned
Laser Module                                               $12.90    $12.90
Arduino Pro Mini                                           $18.95    Already owned
Phototransistor for Wheel Encoders (x2)                    $2.26     $2.26
Ardumoto Motor Shield                                      $24.95    $0.00*
Lego DC Motors                                             $29.99    Already owned
7.2V 3000mAh Battery                                       $24.00    Already owned
3.7V Polymer Lithium Ion 2000mAh Battery                   $16.95    $0.00*
Miscellaneous Items (Enclosure parts, breadboard, etc.)    $40.00    $40.00
Total Cost                                                 $624.53   $147.76

Computer Vision

Acknowledgements

Note: Items marked with an asterisk were paid for by the CE Department.

- Dr. Zack Butler, RIT Computer Science Department: mobile robot navigation and computer vision ideas.
- Karl Voelker, 5th-year CS BS/MS student: debugging and design help with the computer vision algorithms.
- Thomas Schellenberg, 4th-year CS BS/MS student: help with computer vision algorithms and concepts.
- Andrew Wozniak, 3rd-year Graphic Design student: poster design tips.

The Team

Jeremy Thornton, Nicholas Harezga, Matthew Wozniak

The intent of this system was to be as simple as possible, so that users of varying technical backgrounds can set it up and use it with minimal hassle. There are four main components: a PC with wireless communication, a scene camera to capture the operational field, the user control module, and the robotic platform. The computer interprets the scene, determines the location of the laser relative to the robot, and wirelessly transmits instructions to the robot.
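The PC-side pipeline described above (find the laser dot in the scene, compute where it lies relative to the robot, form a move command) can be sketched roughly as follows. This is a hedged illustration: the grayscale-frame input format, the `inches_per_px` scale, and both function names are assumptions, not the team’s actual code.

```python
# Illustrative sketch of the vision-to-command pipeline. A real system
# would use a camera library and a calibrated pixel-to-world mapping;
# here the frame is a plain 2D list and the scale is assumed.

import math

def find_laser(frame, threshold=250):
    """Return (row, col) of the brightest pixel above `threshold`, else None.
    `frame` is a 2D list of 0-255 grayscale values (assumed input format)."""
    best, best_val = None, threshold
    for r, row in enumerate(frame):
        for c, val in enumerate(row):
            if val > best_val:
                best, best_val = (r, c), val
    return best

def command(robot_px, robot_heading_deg, laser_px, inches_per_px=0.25):
    """Turn angle (degrees) and forward distance (inches) toward the laser dot.
    Pixel coordinates and the scale factor are illustrative assumptions."""
    dx = laser_px[0] - robot_px[0]
    dy = laser_px[1] - robot_px[1]
    bearing = math.degrees(math.atan2(dy, dx))
    turn = (bearing - robot_heading_deg + 180.0) % 360.0 - 180.0  # shortest turn
    distance = math.hypot(dx, dy) * inches_per_px
    return turn, distance
```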