
Advanced Wheelchair

VATSAL SHAH (UG Student)

Electronics & Communication Dept.

INDUS UNIVERSITY

Ahmedabad, India

[email protected]

Abstract—This paper describes an automatically controlled wheelchair for disabled people. The chair enables the user to move it using finger and hand gestures. Flex sensors and an accelerometer mounted on a glove generate ASL-coded signals, which are decoded to control the chair. The system also displays the information intended by the user and converts it to speech. The wireless link between the glove and the wheelchair allows any person to operate it. This advanced wheelchair system helps physically disabled and deaf/dumb people move around easily and communicate with other people.

Index Terms—Accelerometer and flex sensor controlled wheelchair, speech synthesizer, American Sign Language, XBee.

I. INTRODUCTION

American Sign Language detection and voice conversion is implemented by designing a system in which a sensor glove detects the ASL signs performed by the user. ASL is considered the standard for communication among deaf/dumb people, and over 100 million people worldwide with physical disabilities require the assistance of a wheelchair. Two hardware boards are used: one is mounted on the wheelchair (the receiver/robot side) and the other is worn by the user (the transmitter side). Once the sensor voltages are read by the transmitter-side microcontroller, they are sent to the other side of the system, the wheelchair, by the transmitting circuit on the hand glove, realizing wireless communication between the chair and the glove. The glove comprises flex sensors on the fingers and an accelerometer on the back of the palm to measure static and dynamic gestures; the position of each finger is detected by monitoring the bending of the flex sensor mounted on it. The sensor circuit output is sent to the microcontroller through an ADC. The corresponding pre-stored message is then displayed on the LCD and spoken through the speaker. In this way the system provides a medium through which deaf/dumb people can communicate more easily with everyone else in society.

The different directions of motion possible are listed below; a minimal code sketch of this mapping follows the list.

1) Forward: both motors run in the forward direction.
2) Backward: both motors run in the reverse direction.
3) Left: left motor in the reverse direction, right motor in the forward direction.
4) Right: right motor in the reverse direction, left motor in the forward direction.
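A minimal sketch of this mapping, assuming the four L293D inputs are wired to port pins P1.0-P1.3 of the receiver-side microcontroller; the pin assignment and the single-character direction codes are illustrative, not the paper's actual wiring.

#include <reg51.h>   /* Keil C51 register definitions for the AT89S51 */

sbit L_IN1 = P1^0;   /* left motor inputs of the L293D  (assumed pins) */
sbit L_IN2 = P1^1;
sbit R_IN1 = P1^2;   /* right motor inputs of the L293D (assumed pins) */
sbit R_IN2 = P1^3;

void drive(unsigned char dir)   /* 'F', 'B', 'L', 'R'; anything else stops */
{
    switch (dir) {
    case 'F': L_IN1 = 1; L_IN2 = 0; R_IN1 = 1; R_IN2 = 0; break; /* both forward  */
    case 'B': L_IN1 = 0; L_IN2 = 1; R_IN1 = 0; R_IN2 = 1; break; /* both reverse  */
    case 'L': L_IN1 = 0; L_IN2 = 1; R_IN1 = 1; R_IN2 = 0; break; /* turn left     */
    case 'R': L_IN1 = 1; L_IN2 = 0; R_IN1 = 0; R_IN2 = 1; break; /* turn right    */
    default:  L_IN1 = 0; L_IN2 = 0; R_IN1 = 0; R_IN2 = 0; break; /* stop          */
    }
}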

In this project we have used two microcontrollers, a speech IC with a speaker to produce the voice output, a 16x2 LCD display, XBee (ZigBee) modules, and flex sensors.

The remainder of the paper is organized as follows. Section 2 introduces the fingerspelling used in sign languages. The block diagram is presented in Section 3. Section 4 details the system components: the flex sensors, the accelerometer, the microcontroller, the XBee module, and the receiver side comprising the display unit, the speech synthesizer, and the motor driver IC. The system flow chart is described in Section 5, and results and discussion are presented in Section 6. Finally, we conclude the paper in Section 7 and outline future avenues for our work. [5]

II. FINGERSPELLING

As the third or fourth most widely used language in the United States [1], American Sign Language (ASL) is the primary means of communication used by members of the North American deaf community. In the ASL manual alphabet, fingerspelling is used primarily for spelling out names or English terms that do not have established signs. Most of the letters are shown as the viewer would see them, but some (C, D, G, H, K, P, Q, and to a lesser extent F, O, X) are shown from the side for clarity (Fig. 1). Fingerspelling is also used for emphasis, for clarity, and for instruction. The device only translates the alphabet, but a hand movement can be customized to mean a particular word. [15] [7]

Fig.1 American Sign Language Hand Gesture


III. BLOCK DIAGRAM

Fig. 2 shows the block diagram of the wireless American Sign Language detection and voice conversion flex-sensor-controlled wheelchair for physically disabled and deaf/dumb people. Flex sensors, which are variable-resistance sensors, are placed on each of the fingers and are used to determine the position/angle of the fingers. The accelerometer is interfaced directly to the digital ports of the microcontroller. The transmitter-side microcontroller reads the data from the different sensors for each gesture made and transmits them to the receiver side. If the received data match a stored gesture, the corresponding text is sent to the LCD screen and the speaker [2].

Fig. 2 Block diagram
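For concreteness, the data travelling over the wireless link can be pictured as a small frame assembled by the transmitter each cycle. The field layout below is an illustrative assumption, not a protocol documented in the paper.

#include <stdint.h>

/* Illustrative layout of one frame sent from the glove (transmitter)
 * to the wheelchair (receiver) over the XBee link each cycle. */
typedef struct {
    uint8_t  start;     /* fixed marker so the receiver can resynchronize */
    uint16_t flex[5];   /* one ADC reading per finger                     */
    uint16_t accel_x;   /* digitized accelerometer axes                   */
    uint16_t accel_y;
    uint8_t  checksum;  /* simple sum of the payload bytes                */
} gesture_frame_t;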

IV. SYSTEM DESCRIPTION

A. Flex Sensor

Flex sensors (Fig. 3) change resistance depending on the amount of bend applied to the sensor. They convert the change in bend to a change in electrical resistance: the more the sensor is bent, the higher the resistance. They usually come as thin strips from 1" to 5" long and can be made in unidirectional or bidirectional form. [7]

Fig. 3 4.5” Unidirectional Flex Sensor.

Because flex sensors are analog resistors, they work as variable analog voltage dividers: when the substrate is bent, the sensor produces a resistance output relative to the bend radius (Fig. 4). [7] [11]

Fig. 4 Flex Sensor Offers Variable Resistance Readings.

The impedance buffer in the basic flex sensor circuit is a single-supply operational amplifier; it is used with these sensors because the low bias current of the op-amp reduces the error due to the source impedance of the flex sensor acting as a voltage divider (Fig. 5). Suggested op-amps are the LM358 or LM324.

Fig. 5 Basic Flex Sensor Circuit.

Fig. 6 Characteristics of the Flex Sensor [11]
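The divider relation behind Fig. 5 is Vout = Vcc * Rfixed / (Rfixed + Rflex), so in this configuration the buffered voltage falls as the sensor is bent. The short sketch below evaluates it for illustrative resistance values; the 10 kilo-ohm fixed resistor, 5 V supply, and the flat/bent resistances are assumptions, not component values from the paper.

#include <stdio.h>

#define VCC      5.0      /* assumed supply voltage (V)             */
#define R_FIXED  10000.0  /* assumed lower leg of the divider (ohm) */

/* Voltage seen by the op-amp buffer for a given flex-sensor resistance. */
double divider_vout(double r_flex_ohm)
{
    return VCC * R_FIXED / (R_FIXED + r_flex_ohm);
}

int main(void)
{
    /* Illustrative flat vs. bent resistance values. */
    printf("flat  (~10k): %.2f V\n", divider_vout(10000.0));
    printf("bent  (~30k): %.2f V\n", divider_vout(30000.0));
    return 0;
}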

B. Accelerometer Sensor

To detect the letters 'J' and 'Z', which require movement in addition to hand position, we add an accelerometer to detect the movement of the glove/hand. The ADXL335 is a small, thin, low-power, complete 3-axis accelerometer with signal-conditioned voltage outputs. The output signals are analog voltages proportional to acceleration. The accelerometer can measure the static acceleration of gravity in tilt-sensing applications as well as dynamic acceleration resulting from motion, shock, or vibration. Deflection of the structure is measured using a differential capacitor that consists of independent fixed plates and plates attached to the moving mass. The fixed plates are driven by 180° out-of-phase square waves. Acceleration deflects the moving mass and unbalances the differential capacitor, resulting in a sensor output whose amplitude is proportional to acceleration. [10]
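As a rough illustration, the digitized X and Y readings can be mapped to a chair direction with a simple dead-band comparison. The mid-scale value and dead-band width below are assumptions for a 10-bit ADC reading roughly half scale when the hand is level, not calibration values from the paper.

#define MID        512   /* assumed reading when the axis sees 0 g       */
#define DEAD_BAND   60   /* assumed counts of tilt ignored to kill jitter */

/* Returns 'F', 'B', 'L', 'R', or 'S' (stop) for a digitized X/Y pair. */
char tilt_direction(int x, int y)
{
    if (y > MID + DEAD_BAND) return 'F';   /* tilted forward  */
    if (y < MID - DEAD_BAND) return 'B';   /* tilted backward */
    if (x > MID + DEAD_BAND) return 'R';   /* tilted right    */
    if (x < MID - DEAD_BAND) return 'L';   /* tilted left     */
    return 'S';                            /* roughly level   */
}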

[Fig. 2 blocks: Transmitter side: flex sensor, accelerometer sensor, microcontroller. Receiver side: microcontroller, display unit, speech synthesizer with speaker, motor driver IC driving the left and right motors.]


C. Microcontroller

The AT89S51 is a low-power, high-performance 8-bit microcontroller with 4K bytes of In-System Programmable Flash memory. It is compatible with the industry-standard 80C51 instruction set and pinout, and the on-chip Flash allows the program memory to be reprogrammed in-system. The AT89S51 is a powerful microcontroller that provides a highly flexible and cost-effective solution to many embedded control applications. It offers the following standard features: 4K bytes of Flash, 128 bytes of RAM, 32 I/O lines, two data pointers, two 16-bit timer/counters, a five-vector two-level interrupt architecture, a full-duplex serial port, an on-chip oscillator, and clock circuitry. The Idle mode stops the CPU while allowing the RAM, timer/counters, serial port, and interrupt system to continue functioning. The Power-down mode saves the RAM contents but freezes the oscillator, disabling all other chip functions until the next external interrupt or hardware reset.

D. XBee Module

The XBee RF modules are engineered to meet the IEEE 802.15.4 standard and support the unique needs of low-cost, low-power wireless sensor networks. The modules require minimal power and provide reliable delivery of data between devices. They operate within the 2.4 GHz ISM frequency band, are pin-for-pin compatible with each other, and provide embedded wireless end-point connectivity to devices. They are designed specifically to replace the proliferation of individual remote controls.

Fig. 7 XBee module.
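Because the XBee in its default transparent mode simply relays UART bytes over the air, the transmitter firmware only has to write to the 8051 serial port. A minimal sketch, assuming the Keil C51 toolchain, an 11.0592 MHz crystal, and 9600 baud:

#include <reg51.h>   /* Keil C51 register definitions */

void uart_init(void)
{
    TMOD |= 0x20;    /* timer 1, mode 2 (8-bit auto-reload)      */
    TH1   = 0xFD;    /* reload value for 9600 baud @ 11.0592 MHz */
    SCON  = 0x50;    /* UART mode 1, receiver enabled            */
    TR1   = 1;       /* start timer 1                            */
}

void xbee_send(unsigned char b)
{
    SBUF = b;        /* load the byte into the transmit buffer   */
    while (!TI) ;    /* wait until transmission completes        */
    TI = 0;          /* clear the transmit flag for the next byte */
}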

E. Display Unit

A 16 × 2 line LCD is used to display the status of the two inputs (flex sensors and speech synthesis). The LCD requires little power and provides a backlight for low-light viewing. It is interfaced with the microcontroller in byte mode, i.e. 8 bits of command/data are transmitted at a time. [12]
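A minimal sketch of such a byte-mode write, assuming an HD44780-compatible controller with the data bus on port P2 and the RS/EN lines on P3.0/P3.1; the pin choices are illustrative, not taken from the paper's schematic.

#include <reg51.h>

sbit LCD_RS = P3^0;                 /* 0 = command, 1 = character data */
sbit LCD_EN = P3^1;                 /* falling edge latches the byte   */

static void lcd_delay(void)
{
    volatile unsigned int i;
    for (i = 0; i < 500; i++) ;     /* crude busy-wait, ample at 11 MHz */
}

static void lcd_write(unsigned char byte, unsigned char is_data)
{
    LCD_RS = (is_data != 0);
    P2     = byte;                  /* put all 8 bits on the bus at once */
    LCD_EN = 1;
    lcd_delay();
    LCD_EN = 0;                     /* latch on the falling edge of EN   */
    lcd_delay();
}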

F. Speech Synthesizer

This module of the system consists of a microcontroller (AT89C51), an SP0256 speech synthesizer IC, amplifier circuitry, and a speaker. Its function is to produce a voice output for the recognized gesture. The microcontroller receives the eight-bit data from the bend detection module and compares it with predefined values; from this comparison it determines which gesture the hand is making, and therefore what the system should speak. The synthesizer output is amplified and fed to the speaker. [12] [8]
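A hedged sketch of how the microcontroller might drive the SP0256: the allophone address is presented on a port, latched with a low pulse on ALD, and the next allophone is loaded once the standby line indicates the chip is idle. The port and pin assignments (P0 for the address, P3.5 for ALD, P3.4 for SBY) are assumptions for illustration; the allophone codes themselves come from the SP0256-AL2 allophone table.

#include <reg51.h>

sbit SP_ALD = P3^5;    /* active-low address-load strobe (assumed pin)      */
sbit SP_SBY = P3^4;    /* standby: high when idle (left at 1 to read input) */

void speak_allophone(unsigned char code)
{
    while (!SP_SBY) ;  /* wait until the previous sound finishes */
    P0     = code;     /* present the allophone address          */
    SP_ALD = 0;        /* pulse ALD low to latch it              */
    SP_ALD = 1;
}

void speak_word(const unsigned char *codes, unsigned char n)
{
    unsigned char i;
    for (i = 0; i < n; i++)
        speak_allophone(codes[i]);   /* codes taken from the allophone table */
}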

G. Motor Driver IC

The L293D is a dual H-bridge driver, so with one IC we can interface two DC motors that can be controlled in both the clockwise and counter-clockwise directions, or four motors with a fixed direction of motion if all I/Os are used [16]. The L293D supplies an output current of 600 mA, with a peak output current of 1.2 A per channel, and accepts a wide motor supply range from 4.5 V to 36 V. [3]

Fig. 8 H-Bridge Driver

An H-bridge driver has four switching elements, often called high-side left, high-side right, low-side right, and low-side left (traversing the bridge clockwise). The switches are turned on in pairs, either high-side left with low-side right or low-side left with high-side right, but never both switches on the same side of the bridge. [7] [10]

TABLE I. TRUTH TABLE OF L293D
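As an illustration of Table I, the standard L293D behaviour for one channel pair can be written as a small function; the enum names below are illustrative.

typedef enum { COAST, BRAKE, FORWARD, REVERSE } motor_state_t;

/* Standard L293D truth-table logic for one motor channel pair. */
motor_state_t l293d_state(int en, int in1, int in2)
{
    if (!en)         return COAST;    /* enable low: outputs disabled, motor free-runs */
    if (in1 && !in2) return FORWARD;  /* unequal inputs: motor turns one way           */
    if (!in1 && in2) return REVERSE;  /* ...or the other way                            */
    return BRAKE;                     /* equal inputs with enable high: fast motor stop */
}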

V. FLOW CHART

This section explains the basic working of the system in a simple way through the flow chart. Initially, the gestures from the glove are acquired. The analog sensor outputs are converted to digital values by the ADC connected to the microcontroller. This output is then compared with the previously stored data for the corresponding letters and checked for validity. If the gestured value matches any of the pre-stored values, the corresponding entry from the database is displayed on the LCD and voiced through the speaker; otherwise the system goes back into the loop. The figure below illustrates the flow chart of the system, and a code-level sketch of the same loop follows it. [12]


Fig.9 Flow chart of system execution
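The same loop can be sketched in code. Every helper declared here is an assumed stand-in for the paper's firmware routines, declared only so the control flow reads concretely.

typedef struct { unsigned int flex[5], accel_x, accel_y; } gesture_frame_t;

extern int  read_frame(gesture_frame_t *f);           /* 1 when a full packet arrived        */
extern int  match_gesture(const gesture_frame_t *f);  /* index into stored table, -1 if none */
extern void lcd_print(const char *text);
extern void speak(const char *text);
extern void drive(unsigned char dir);
extern const char *gesture_text[];
extern const unsigned char gesture_dir[];

void receiver_loop(void)
{
    gesture_frame_t f;
    int idx;

    for (;;) {
        if (!read_frame(&f))
            continue;                     /* keep polling for a packet       */
        idx = match_gesture(&f);
        if (idx < 0)
            continue;                     /* no match: go back into the loop */
        lcd_print(gesture_text[idx]);     /* display on the 16x2 LCD         */
        speak(gesture_text[idx]);         /* voice through the synthesizer   */
        drive(gesture_dir[idx]);          /* drive the wheelchair motors     */
    }
}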

VI. RESULTS AND DISCUSSION

The advanced wheelchair is a prototype for establishing easy communication between deaf/dumb people and other people, which will help them be independent and express themselves confidently. When a person wearing the glove fitted with the accelerometer bends a finger, the wheelchair moves in the corresponding direction based on the bend of that finger. For better and more sophisticated sign detection and conversion, a matrix technique has been implemented. Each sensor bend is divided into three distinct states: Complete Bend (CB), Partial Bend (PB) and Straight (S). The range of values associated with each bend of the respective sensor is calculated and its digital equivalent is found. Table II below lists the bend ranges for each of the five fingers (thumb, index, middle, ring and little), which is the concept behind the matrix technique: the CB, PB and S values for each sensor are measured and the corresponding ranges are specified. [15]

TABLE II. VALUES FOR CB, PB, S

FINGER    CB       PB        S
THUMB     <=550    <=550     >550
INDEX     <=380    381-500   >500
MIDDLE    <=340    341-450   >450
RING      <=390    391-480   >480
LITTLE    <=460    461-500   >500
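A sketch of how the matrix technique can be coded, using the CB/PB upper bounds copied from Table II; the enum and struct names are illustrative.

typedef enum { CB, PB, S } bend_t;   /* complete bend, partial bend, straight */

typedef struct { unsigned int cb_max, pb_max; } finger_range_t;

/* Thumb, index, middle, ring, little: upper bounds taken from Table II. */
static const finger_range_t ranges[5] = {
    {550, 550}, {380, 500}, {340, 450}, {390, 480}, {460, 500}
};

bend_t classify(unsigned char finger, unsigned int reading)
{
    if (reading <= ranges[finger].cb_max) return CB;
    if (reading <= ranges[finger].pb_max) return PB;
    return S;
}

Note that for the thumb the CB and PB bounds coincide, which is consistent with Table III using only bent (B) or straight (S) states for that finger.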

The accelerometer is calibrated so that it produces a particular analog voltage for a corresponding tilt. At the end of this work we expect high accuracy (up to 90-95%) of hand gesture recognition using the sensory data glove. The flex sensor and accelerometer data are therefore combined and fed together to the microcontroller; using both sensors increases accuracy, reliability and user comfort.

Fig. 10 Hand Gesture for Wheelchair System [6]

TABLE III. FLEX SENSOR RANGE

LETTER  THUMB  INDEX  MIDDLE  RING  LITTLE
A       S      CB     CB      CB    CB
B       B      S      S       S     S
C       S      PB     PB      PB    PB
D       B      S      CB      CB    CB
E       B      PB     PB      PB    PB
F       B      CB     S       S     S
G       B      S      CB      S     CB
H       B      S      S       CB    CB
I       B      CB     CB      CB    S
J       B      CB     CB      CB    PB
K       S      S      S       CB    CB
L       S      S      CB      CB    CB
M       B      PB     PB      PB    CB
N       B      PB     PB      CB    CB
O       B      PB     PB      PB    S
P       S      S      PB      CB    CB
Q       S      S      CB      CB    S
R       B      S      PB      CB    CB
S       B      CB     CB      CB    CB
T       S      PB     CB      CB    CB
U       B      PB     S       CB    CB
V       S      PB     PB      CB    CB
W       B      S      S       S     CB
X       B      PB     CB      CB    CB
Y       S      CB     CB      CB    S
Z       B      S      S       CB    S
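Matching a measured five-finger pattern against Table III then reduces to a table lookup. The sketch below reproduces only the first few rows and uses a single-character encoding ('B'/'S' for the thumb, 'C'/'P'/'S' abbreviating CB/PB/S for the other fingers) chosen only for compactness.

#include <string.h>

/* Each row: letter, space, then thumb/index/middle/ring/little states. */
static const char *rows[] = {
    "A SCCCC",   /* A: thumb straight, other fingers completely bent */
    "B BSSSS",   /* B: thumb bent, other fingers straight            */
    "C SPPPP",   /* C: thumb straight, other fingers partially bent  */
    /* ... remaining rows follow Table III ...                       */
};

char lookup_letter(const char states[5])   /* states in the encoding above */
{
    unsigned int i;
    for (i = 0; i < sizeof rows / sizeof rows[0]; i++)
        if (memcmp(rows[i] + 2, states, 5) == 0)
            return rows[i][0];             /* first character is the letter */
    return '?';                            /* no matching gesture           */
}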

Fig. 11 Implementation of an alphabet ‘A’


VII. CONCLUSION AND FUTURE WORK

This automatically controlled chair is useful for speech-impaired and partially paralysed patients, as it bridges the communication gap between patients, doctors and relatives. Users can move around easily, and any person can operate the chair with finger movements. It will give mute users a voice to speak for their needs and to express their gestures. System efficiency is improved by the wireless transmission, which helps in longer-distance communication. Future work on the system includes supporting more signs and different language modes; the various operations such as taking turns and starting or stopping the vehicle can then be implemented more efficiently. The system will be developed further in both hardware and software.

ACKNOWLEDGMENT

I would like to thank my HOD, Prof. R N Mutagi, ECE Department, who guided me throughout the task to complete the work successfully. I would also like to thank Assoc. Prof. Hansa Shingrakhia, ECE Department, and the other staff for extending their help and support, giving technical ideas about the paper, and motivating me to complete the work effectively and successfully.

REFERENCES

[1] M. Sternberg, "The American Sign Language Dictionary", Multicom, 1996.

[2] Ambikagujrati, Kartigya Singh, Khushboo, Lovika Soral, Mrs. Ambikapathy, "Hand-Talk Gloves with Flex Sensor: A Review", International Journal of Engineering Science Invention, Vol. 2, Issue 4, pp. 43-46, April 2013.

[3] Shruti Warad, Vijayalaxmi Hiremath, Preeti Dhandargi, Vishwanath Bharath, P. B. Bhagavati, "Speech and Flex Sensor Controlled Wheelchair for Physically Disabled People", unpublished.

[4] Dr. Shaik Meeravali, M. Aparna, "Design and Development of a Hand-Glove Controlled Wheel Chair Based on MEMS", International Journal of Engineering Trends and Technology (IJETT), Vol. 4, Issue 8, August 2013.

[5] Sv Anusha, M. Srinivasa Rao, P. V. Ramesh, "Design and Development of a Hand Movement Controlled Wheel Chair", Global Journal of Advanced Engineering Technologies, Vol. 1, Issue 4, 2012.

[6] Ajinkya Raut, Vineeta Singh, Vikrant Rajput, Ruchika Mahale, "Hand Sign Interpreter", The International Journal of Engineering and Science (IJES), Vol. 1, Issue 2, pp. 19-25, 2012.

[7] Jamal Haydar, Bayan Dalal, Shahed Hussainy, Lina El Khansa, Walid Fahs, "ASL Fingerspelling Translator Glove", IJCSI International Journal of Computer Science Issues, Vol. 9, Issue 6, No. 1, November 2012.

[8] Ata-Ur-Rehman, Salman Afghani, Muhammad Akmal, Raheel Yousaf, "Microcontroller and Sensors Based Gesture Vocalizer", Proceedings of the 7th WSEAS International Conference on Signal Processing, Robotics and Automation (ISPRA '08), University of Cambridge, UK, February 20-22, 2008.

[9] Anbarasi Rajamohan, Hemavathy R., Dhanalakshmi M., "Deaf-Mute Communication Interpreter", International Journal of Scientific Engineering and Technology, Vol. 2, Issue 5, 1 May 2013.

[10] S. Tameemsultana and N. Kali Saranya, "Implementation of Head and Finger Movement Based Automatic Wheel Chair", Bonfring International Journal of Power Systems and Integrated Circuits, Vol. 1, Special Issue, December 2011.

[11] Praveenkumar S. Havalagi, Shruthi Urf Nivedita, "The Amazing Digital Gloves That Give Voice to the Voiceless", International Journal of Advances in Engineering & Technology, March 2013.

[12] V. Padmanabhan, M. Sornalatha, "Hand Gesture Recognition and Voice Conversion System for Dumb People", International Journal of Scientific & Engineering Research, Vol. 5, Issue 5, May 2014.

[13] Prakash B. Gaikwad, Dr. V. K. Bairagi, "Hand Gesture Recognition for Dumb People Using Indian Sign Language", International Journal of Advanced Research in Computer Science and Software Engineering, Vol. 4, Issue 12, December 2014.

[14] J. Thilagavathy, A. Jeyapaul Murugan, S. Darwin, "Embedded Based Hand Talk Assisting System for Deaf and Dumb", International Journal of Engineering Research & Technology (IJERT), Vol. 3, Issue 3, March 2014.

[15] Kiratey Patil, Gayatri Pendharkar, Prof. G. N. Gaikwad, "American Sign Language Detection", International Journal of Scientific and Research Publications, Vol. 4, Issue 11, November 2014.