
Gesture-Based Wheelchair Control for the Physically Challenged

THE TEAM
ARJUN ASHOK V · ARUNKUMAR M · DANIEL C J · NIVED KRISHNAN

CONTENTS
- Abstract
- Problem Definition
- Project Overview
- Implementation
- Verification
- Future Work
- References

ABSTRACT
We have developed a gesture-controlled wheelchair system for use by quadriplegics and other physically disabled persons.

Salient features:
- Effortless to use
- Customizable
- Economical
- Power-efficient
- Non-intrusive

PROBLEM DEFINITION
Existing powered wheelchairs demand physical force to control them, making them unusable for a class of the physically challenged: quadriplegics.

Movements such as pressing buttons or steering a joystick are impossible for quadriplegics, who lack fine motor control.

PROJECT OVERVIEW
Quadriplegics often retain some imprecise motion of their fingers. Therefore, the best option is gesture-based interaction with their environment, in particular their wheelchairs.

We have developed a robust, real-time, vision-based hand gesture recognition engine reliable enough to steer a wheelchair.

PROJECT OVERVIEW
Hardware specification:
- IR-sensitive USB webcam
- Diffusion masks + mounting/enclosure
- x64/x86 PC platform
- Stripped USB keyboard circuit
- PIC16F877A microcontroller-based interface
- DC motor steering mechanism with ULN2804 and relay-based H-bridge

Software specification:
- MATLAB® from The MathWorks™
- Win32/64-based operating system
- PIC-based H-bridge control

IMPLEMENTATION
Gesture Capture Module developed.

A regular webcam was modified into an IR-sensitive version.

An IR-illuminated backlit surface for gesture capture was created. We used IR LEDs in an 8×8 matrix layout.

We used an assembly of tracing-paper sheets to create a diffuser for the IR radiation.

We used an aluminium foil mask to define an active area.

IMPLEMENTATION
Gesture Recognition Module developed.

Successfully captured and processed images from the IR camera in real time.

Defined a protocol of gestures. These are kept extremely simple so that they are easy for the physically challenged to use:
- Move forward
- Move backward
- Turn left
- Turn right
- Stop/brake

IMPLEMENTATION
Gesture Recognition Module developed.

Created a set of templates for these commands, which are then correlated with the hand positions detected by the gesture-capture module.

There is an initial training phase for the software, where the user is able to customize all movement gestures according to their specific needs.
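The correlation step can be sketched as follows. This is an illustrative Python/NumPy version (the project itself implemented this in MATLAB); `templates` stands in for the per-user gestures recorded during the training phase, and the acceptance threshold is an assumption, not the project's actual value.

```python
import numpy as np

def match_gesture(frame, templates):
    """Pick the stored template best correlated with the captured frame.

    frame     -- 2-D array: thresholded IR image of the hand
    templates -- dict mapping command name -> 2-D array of the same shape,
                 recorded during the per-user training phase
    Returns the best-matching command name, or None if no template
    correlates strongly enough.
    """
    best_name, best_score = None, 0.0
    f = frame - frame.mean()                      # zero-mean for correlation
    for name, tpl in templates.items():
        t = tpl - tpl.mean()
        denom = np.sqrt((f ** 2).sum() * (t ** 2).sum())
        score = (f * t).sum() / denom if denom > 0 else 0.0
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score > 0.6 else None  # threshold is illustrative
```

Normalizing by the energies of both images makes the score independent of overall IR brightness, which matters when ambient lighting varies around the backlit surface.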


IMPLEMENTATION
Interfacing Module developed.

A USB keyboard circuit was stripped to give access to the three indicator LEDs – Scroll Lock, Caps Lock, Num Lock.

The LEDs were controlled from keyboard control libraries linked to MATLAB®, according to the gesture recognized.

This gives a 3-bit code for each gesture. These codes are fed to a PIC16F877A microcontroller, which produces outputs to control an H-bridge circuit.
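The LED encoding can be sketched as below. This is a hypothetical Python illustration (the real system drove the lock LEDs from keyboard-control libraries linked to MATLAB), and the assignment of which LED carries which bit is an assumption for the sketch.

```python
# Hypothetical sketch of the 3-bit gesture encoding over the keyboard's
# indicator LEDs. Bit assignment (scroll=MSB, num=LSB) is an assumption.
GESTURE_TO_CODE = {
    "brake": 0b000, "forward": 0b001, "reverse": 0b010,
    "turn_left": 0b011, "turn_right": 0b100,
}

def led_states(gesture):
    """Return (scroll, caps, num) lock-LED on/off states for a gesture."""
    code = GESTURE_TO_CODE.get(gesture, 0b000)  # unknown gestures -> brake
    return bool(code & 0b100), bool(code & 0b010), bool(code & 0b001)
```

Defaulting unrecognized gestures to the brake code is a fail-safe choice: any glitch in recognition stops the chair rather than moving it.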

IMPLEMENTATION

3-bit signal | Gesture    | H-bridge control signal
000          | Brake      | 0000 0000
001          | Forward    | 1001 1001
010          | Reverse    | 0110 0110
011          | Turn left  | 1001 0110
100          | Turn right | 0110 1001
101          | Brake      | 0000 0000
110          | Brake      | 0000 0000
111          | Brake      | 0000 0000
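The mapping above is a straight lookup; a minimal sketch follows. The actual decoding runs as PIC16F877A firmware, so this Python table only mirrors the truth table, with each H-bridge signal an 8-bit relay pattern (4 bits per motor).

```python
# Python mirror of the slide's truth table (the real decode step is
# firmware on the PIC16F877A).
H_BRIDGE_TABLE = {
    0b000: 0b1001_1001 & 0,  # Brake: all relays open
    0b001: 0b1001_1001,      # Forward
    0b010: 0b0110_0110,      # Reverse
    0b011: 0b1001_0110,      # Turn left
    0b100: 0b0110_1001,      # Turn right
}

def decode(code3):
    """Map a 3-bit gesture code to its 8-bit relay pattern.

    Codes 101-111 are not in the table and fall through to Brake
    (all relays open), matching the truth table above.
    """
    return H_BRIDGE_TABLE.get(code3 & 0b111, 0b0000_0000)
```

Note the symmetry: each turn pattern is one motor's forward nibble paired with the other motor's reverse nibble, which is exactly the differential-drive rule described in the Motor Control Module.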

IMPLEMENTATION
Motor Control Module developed.

Two DC motors are controlled based on inputs from the Interfacing Module.

These inputs drive an H-bridge circuit through a ULN2804 Darlington driver.

An H-bridge circuit consisting of 8 relays is used to direct the two motors attached to the rear wheels of the wheelchair.

IMPLEMENTATION
Both rear wheels should turn in the same direction for forward/reverse motion of the wheelchair.

To turn left, the left wheel should turn backward while the right wheel turns forward. The reverse holds for right turns.

To brake, the wheels are made to turn opposite to their current spin for a quarter of a second; then the motors are disconnected from the supply.
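The steering rules above can be sketched as follows (illustrative Python, not the project's MATLAB/PIC code): each wheel's spin is +1 for forward, -1 for reverse, and 0 for disconnected, and the quarter-second counter-spin brake is modeled as two motor updates; `set_motors` is a hypothetical callback to the relay H-bridge.

```python
import time

def steer(command):
    """(left, right) wheel spin for a motion command: +1 fwd, -1 rev."""
    return {
        "forward":    (1, 1),    # both wheels forward
        "reverse":    (-1, -1),  # both wheels backward
        "turn_left":  (-1, 1),   # left wheel back, right wheel forward
        "turn_right": (1, -1),   # mirror image for right turns
    }[command]

def brake(current, set_motors, pulse_s=0.25):
    """Counter-spin against the current motion for pulse_s seconds,
    then disconnect both motors from the supply."""
    set_motors((-current[0], -current[1]))  # brief opposing torque
    time.sleep(pulse_s)
    set_motors((0, 0))                      # motors disconnected
```

The brief counter-spin gives active braking instead of coasting to a stop, which matters on inclines; the exact 0.25 s figure comes from the slide above.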

VERIFICATION
Gesture detection verification

Captured gestures have been correlated in real time with the stored templates. Satisfactory results have been achieved.

DC motor motion test

All possible combinations of 3-bit data were fed externally to the microcontroller to test the motion of the DC motors.

The wheels connected to the DC motors turned as expected.

VERIFICATION
System Test

The motion of the wheels has been tested against input generated by the gesture capture module; the input fed by the PC via USB cable is decoded by the microcontroller.

Each gesture generates a distinct 3-bit signal. No spurious signals were generated.

FUTURE PROSPECTS
Porting the whole system to an embedded platform based on the Intel® Atom™ processor.

Fine-tuning the system for power efficiency and compactness.

REFERENCES
- Qing Chen, Real-Time Vision-Based Hand Tracking and Gesture Recognition.
- William K. Pratt, Digital Image Processing.
- Anil K. Jain, Fundamentals of Digital Image Processing.
- Image Processing Toolbox for use with MATLAB, The MathWorks.

THANK YOU