
A Robust Implementation of Virtual

Mouse Using Webcam

Project report submitted

in partial fulfillment of the requirement for the degree

Bachelor of Technology

in

Computer Science and Engineering

By

Sai Maheswara Reddy Manda (R082881)

Damodhar V (R082766)

Raja Paluri (R082894)

Rajiv Gandhi University of Knowledge Technologies

R.K.Valley, Idupulapaya, Vempally, Kadapa(Dist) - 516 329

Andhra Pradesh, India


BONAFIDE CERTIFICATE

This is to certify that the thesis entitled “A Robust Implementation of Virtual Mouse Using Webcam” submitted by Sai Maheswara Reddy Manda (R082881), Damodhar V (R082766) and Raja Paluri (R082894), in partial fulfillment of the requirements for the award of Bachelor of Technology in Computer Science and Engineering, is a bonafide work carried out by them under my supervision and guidance.

The thesis has not been submitted previously, in part or in full, to this or any other University or Institution for the award of any degree or diploma.

(signature)

Mr. S. Chandrasekhar

(Head of the Department)

Department of CSE

RGUKT, RK Valley

(signature)

Mr. Siva Rama Sastry Gumma

(Lecturer)

Department of CSE

RGUKT, RK Valley


DECLARATION

We, Sai Maheswara Reddy Manda, Damodhar V and Raja Paluri, hereby declare that this thesis entitled “A Robust Implementation of Virtual Mouse Using Webcam”, submitted by us under the guidance and supervision of Siva Rama Sastry Gumma, is a bonafide work. We also declare that it has not been submitted previously, in part or in full, to this or any other University or Institution for the award of any degree or diploma.

Date:

Place:

(Sai Maheswara Reddy Manda (R082881))

(Damodhar V (R082766))

(Raja Paluri (R082894))


To,

My Parents and my well-wishers


Acknowledgments

We would like to express our sincere gratitude to Siva Rama Sastry Gumma, our project guide, for his valuable suggestions and keen interest throughout the progress of our research.

We are grateful to Chandrasekhar Syamala, Department Coordinator, Computer Science Department, for providing excellent computing facilities and a congenial atmosphere for progressing with our project.

We would also like to thank RGUKT for providing all the necessary resources for the successful completion of our course work. Last but not least, we thank our classmates and the other students of computer science for their physical and moral support.

With Sincere Regards,

Sai Maheswara Reddy Manda (R082881)

Damodhar V (R082766)

Raja Paluri (R082894)


Abstract

A virtual interface offers a natural and intelligent mode of interaction with machines. Hand gesture recognition is more efficient and easier than traditional devices like the keyboard and mouse. In this thesis, a vision based virtual mouse interface that can recognize pre-defined mouse movements in real time, regardless of context, is proposed. Computer vision techniques such as image segmentation and gesture recognition, coupled with an intuitive algorithm, can control mouse tasks (left and right clicking, double clicking, holding and scrolling). A fixed set of manual commands and a reasonably structured environment are considered to develop a simple yet effective interfacing procedure that decreases the complexity and boosts the efficiency of present-day Human Computer Interaction. The proposed algorithm is invariant to translation, rotation and scale of the hand. The thesis begins with a brief introduction and a few insights into the state of the art. The proposed methodology is then presented. The effectiveness of the technique is demonstrated on real imagery and the results are furnished. Finally, opportunities and challenges, with a brief note on future research possibilities, are highlighted.


List of Figures

3.1 Detecting the pointer
3.2 Detected Pointer from RGB image
3.3 Detected Centroid of the pointer
3.4 Measured Distance between the Pointers
3.5 Defined gesture for Left Click
3.6 Left Click View
3.7 Defined gesture for Right Click
3.8 Right Click View
3.9 Mouse Holding View
3.10 Defined gesture for Scrolling Up
3.11 Scrolling Up View
3.12 Defined gesture for Scrolling Down
3.13 Scrolling Down View
4.1 Virtual Mouse GUI window
4.2 Window showing 'File' Option
4.3 Window showing compatibility
4.4 Window showing 'Help' Option
4.5 Window showing Information Set


List of Tables

5.1 Left Click: Selecting the icon
5.2 Right Click: Drop Down Menu
5.3 Double Click: Opening the icon
5.4 Dragging Icon from one corner to another corner
5.5 Scrolling the GOOGLE NEWS Home Page


Contents

Acknowledgments
Abstract
Abbreviations

1 Introduction
  1.1 Introduction to HCI
  1.2 State of the art
  1.3 Motivation
  1.4 Software Requirements
  1.5 Hardware Requirements
  1.6 Other Requirements
  1.7 Problem Statement
  1.8 Applications of HCI

2 Literature Survey
  2.1 Background
  2.2 HSV
  2.3 Literature Survey

3 Proposed Methodology
  3.1 Detection of pointer
  3.2 Detection of Centroid of Pointer
  3.3 Distance between the pointers
  3.4 Mouse Events
    3.4.1 Left Click
    3.4.2 Double Click
    3.4.3 Right Click
    3.4.4 Mouse Holding
    3.4.5 Scrolling Up
    3.4.6 Scrolling Down

4 Graphical User Interface (GUI)
  4.1 Graphic User Interface (GUI)
  4.2 Starting Virtual Mouse
  4.3 Stopping Virtual Mouse
  4.4 Compatibility
  4.5 Help
  4.6 Instruction Set

5 Results

6 Conclusion

7 References
  7.1 Reference Projects
  7.2 Reference Books
  7.3 Reference Websites


Abbreviations

3D Three Dimensional

BlueX X-Coordinate of Blue label

BlueY Y-Coordinate of Blue label

GUI Graphical User Interface

HCI Human Computer Interaction

HDD Hard Disk Drive

HSV Hue, Saturation, Value

IplImage Image structure type in OpenCV's C API

Mat Matrix structure for data storage

ORB Oriented FAST and Rotated BRIEF

RAM Random Access Memory

RedX X-Coordinate of Red label

RedY Y-Coordinate of Red label

RGB Red, Green, Blue

SIFT Scale-invariant feature transform

SURF Speeded Up Robust Features


Chapter 1

Introduction

1.1 Introduction to HCI

Human computer interaction (HCI) is one of the most important areas of research where

researchers are in a continuous quest towards improved interfaces. Of late, users have been demanding smaller and smaller devices. Most present-day interfaces need some amount of space to use and cannot be used while moving. Though touch screens have been globally accepted for their efficiency in control applications, they cannot be applied to desktop systems because of cost and other hardware limitations. By applying vision technology, coloured pointers and controlling the mouse by natural hand gestures, the work space required can be reduced. Optimizing a virtual mouse interface is a challenging problem in its general form. Though the challenge seems to be an age-old one, to this day there hasn't been a single version qualitative enough to meet the desires of users. The proposed technique can be seen as an attempt to improve the interface in terms of speed, efficiency and reliability.

Computer vision based techniques are non-invasive and based on the way human beings perceive information about their surroundings. Although it is difficult to design a vision based interface for generic usage, it is feasible to design such an interface for a controlled environment. This process involves seven major steps:

• Capturing image through webcam

• Smoothing the captured image

• Converting RGB scale to HSV scale

• Thresholding the converted image


• Detecting red and blue colours

• Finding centroid and radius of red and blue colours

• Implementing respective functionalities
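As an illustration, the seven steps above can be sketched in Python using only the standard library. The per-pixel `colorsys` conversion, the toy 3x3 frame, the hue cut-offs and the function names are assumptions for this sketch, not the thesis implementation (which uses OpenCV); the smoothing step is omitted for brevity.

```python
import colorsys
import math

def rgb_to_hsv_frame(frame):
    """Step 3: convert a frame (rows of (r, g, b) tuples, 0-255) to HSV (0-1)."""
    return [[colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
             for (r, g, b) in row] for row in frame]

def threshold(hsv_frame, hue_lo, hue_hi, min_s=0.4, min_v=0.3):
    """Step 4: binary mask, 1 where the hue lies in [hue_lo, hue_hi]."""
    return [[1 if hue_lo <= h <= hue_hi and s >= min_s and v >= min_v else 0
             for (h, s, v) in row] for row in hsv_frame]

def centroid_and_radius(mask):
    """Step 6: centroid of the white pixels, radius from the white area."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    cx = sum(x for x, _ in pts) / len(pts)
    cy = sum(y for _, y in pts) / len(pts)
    return cx, cy, math.sqrt(len(pts) / math.pi)

def process_frame(frame):
    """Steps 3-6 for the blue pointer (hue near 2/3 on colorsys's 0-1 scale)."""
    hsv = rgb_to_hsv_frame(frame)
    mask = threshold(hsv, 0.55, 0.75)
    return centroid_and_radius(mask)

# A toy 3x3 "frame": one blue pixel in the centre, the rest black.
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][1] = (0, 0, 255)
print(process_frame(frame))  # centroid at (1.0, 1.0)
```

Step 7 (executing the mouse events) is covered by the conditions of Chapter 3.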

1.2 State of the art

Gestures for virtual and augmented reality applications have experienced one of the greatest levels of uptake in computing. Virtual reality interactions use gestures to enable realistic manipulation of virtual objects using one's hands, for 3D display interactions or 2D displays that simulate 3D interactions. Most existing approaches involve changing mouse parts, such as adding more buttons or changing the position of the tracking ball.

Many researchers in the human computer interaction and robotics fields have tried to

control mouse movement using video devices. However, all of them used different methods

to make a clicking event. One approach, by Erdem et al, used fingertip tracking to control

the motion of the mouse. A click of the mouse button was implemented by defining a

screen such that a click occurred when a users hand passed over the region [1]. Another

approach was developed by Chu-Feng Lien [2]. He used only the finger-tips to control the

mouse cursor and click. His clicking method was based on image density, and required

the user to hold the mouse cursor on the desired spot for a short period of time. Kamran

Miyaji and Vikram Kumar[3] implemented using two coloured tapes. They used yellow

tape to move the cursor, red yellow together for left click, yellow waiting for 7 seconds used

for right click, red yellow waiting for 7 seconds used for double click. Another approach

was developeed by Hojoon Park[4]. He used index finger for moving the cursor and angle

between the index and thumb fingers for clicking events. Another method is developed by

Asanterabi Malima[5], a finger counting system to control behavior of a robot.

To detect the object, salient feature detection algorithms such as SIFT, SURF or ORB can be used. However, such complex algorithms might slow the system down so that it cannot be used in real time; the speed-accuracy trade-off needs to be balanced.


1.3 Motivation

Drawbacks in the existing systems motivated us to develop a gesture-supported system to operate computers.

• The existing systems do not provide effective results.

• Existing algorithms take considerable time to detect gestures.

1.4 Software Requirements

• Windows Operating System

• Webcam Drivers

• Microsoft .NET Framework 4.0 or above

1.5 Hardware Requirements

• Intel CORE i3 Processor

• 2 GB RAM

• 1 GB HDD Space

• Webcam at 30 frames/second, 640x480 resolution

1.6 Other Requirements

• Blue tape is for moving the cursor

• Red tape is to implement other functionalities

1.7 Problem Statement

To provide a set of tools and methods for preprocessing frames captured through the camera module, enabling computer operations to be controlled with gestures.


1.8 Applications of HCI

• Operating computers remotely (at a distance)

• Gesture supported drawing

• For game playing

• Application Handling


Chapter 2

Literature Survey

2.1 Background

Everyone in this world communicates through word of mouth or through gestures, but with computers one communicates through specific devices. Human and machine interaction has so far been only through simple means of communication like a mouse, keyboard or switches. Voice recognition and gesture recognition are two powerful and more natural means of communicating with machines, as with human beings. There are many places where the existing system could be replaced with a gesture-supported system, which would make humans more comfortable with computers and let them carry out their work in the right manner. In many cases, for efficient detection of gestures, hand gloves and colour caps are used against a still background; detection also depends on the smoothness of the background and the lighting conditions. Touch screens are also a good control interface and are nowadays used globally in many applications. However, touch screens cannot be applied to desktop systems because of cost and other hardware limitations. By applying vision technology and controlling the mouse by natural hand gestures, we can reduce the work space required.

Computer vision is just one of the latest of those new tools, and is similarly powering

changes in the arts where it is used, as well as being a significant component of the growing

disciplines of new media and interactive art. As a sensing modality rather than a sensible

medium, computer vision is not itself a medium for art, but it is a powerful tool that

can be applied in many art forms, from photography and sculpture to dance and music.

OpenCV is written in C++ and its primary interface is in C++, but it still retains a less

comprehensive though extensive older C interface. There are now full interfaces in Python,

Java and MATLAB.


2.2 HSV

In practice, the choice of RGB is not optimal for segmentation. The HSV colour space is preferred because one of its axes is aligned with brightness: empirically, most of the variation in the background tends to lie along the brightness axis, not the colour axes. In HSV, a particular colour can be selected by its hue angle.
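A small check with Python's standard `colorsys` module illustrates the point (the specific RGB triples are arbitrary examples): halving the illumination of a red surface changes the V channel but leaves the hue untouched.

```python
import colorsys

# The same red surface under two illumination levels.
bright = colorsys.rgb_to_hsv(200 / 255, 20 / 255, 20 / 255)
dim    = colorsys.rgb_to_hsv(100 / 255, 10 / 255, 10 / 255)

print(bright[0], dim[0])                      # hue: identical for both
print(round(bright[2], 2), round(dim[2], 2))  # value: 0.78 vs 0.39
```

Thresholding on hue (with a minimum saturation and value) is therefore far more stable under varying illumination than thresholding directly on RGB.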

2.3 Literature Survey

There are many methodologies used to interact with a system. Different methodologies follow different methods and ways of interaction.

1. Mouse Simulation Using Two Coloured Tapes by Kamran Niyazi, Vikram Kumar. Functionalities:

• Move Cursor: Yellow Tape

• Left Click: Red Yellow together

• Right Click: Yellow Waiting for 7 seconds

• Double Click: Red Yellow Waiting for 7 seconds

Limitations:

• Mouse hold Scrolling not implemented.

• For Right Click and Double Click, the coloured tapes should wait for 7 seconds on the desired spot.

2. A Method for Controlling Mouse Movement using a Real-Time Camera by Hojoon Park.

Functionalities:

• Move Cursor: Index Finger

• Clicking events: Angle between index and thumb fingers

Limitations:


• Mouse hold Scrolling not implemented.

3. Real-time Hand Mouse System on Handheld Devices by Chu-Feng Lien. Functionalities:

• Used finger-tips to control the mouse cursor and clicks

Limitations:

• For click events, the user should hold the mouse cursor on the desired spot for a short period of time.

4. Computer vision based mouse by A. Erdem. Functionalities:

• Move Cursor: Finger tip

• Clicking events: when a users hand passed over the region

Limitations:

• Produces some unwanted clicks.

5. Vision-Based Hand Gesture Recognition by Asanterabi Malima. Functionalities:

• Developed a finger counting system to control behavior of a robot.


Chapter 3

Proposed Methodology

A coloured pointer has been used for the object recognition and tracking. Left and right

click events of the mouse have been achieved by detecting the position of the pointers with respect to each other on the image. In the object-tracking application, one of the main

problems is object detection. Based on precise thresholding with defined colour range in

the HSV colour space, object can be segmented from the image and can be used for further

execution of the mouse click events. Instead of finger tips, a colour pointer has been used

to make the object detection easy and fast. To simulate the click events of the mouse the

fingers with these colour pointers have been used.

3.1 Detection of pointer

Three tracking methods are widely employed for detecting objects in the input image: feature-based methods, differential methods and correlation-based methods. Feature-based methods extract characteristics such as points and line segments from image sequences and have proved reliable for detecting and tracking objects. In the feature-based approach, four features, namely centroid, dispersion, grey-scale distribution and object texture, can be used for detecting pointers. The present technique is built upon centroid-based detection.

The input image, initially in the RGB colour space, is converted to the HSV colour space for ease and efficiency in identifying colours. Thresholding is then applied to the converted image with a predefined range of the required colour to be segmented. This results in a binary image with a concentration of white pixels at the detected pointer.
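A minimal sketch of this thresholding step, assuming a frame stored as nested lists of RGB tuples and illustrative hue/saturation/value cut-offs. Note that red's hue sits at the wrap-around point of the hue circle, so its range covers both ends:

```python
import colorsys

def is_red(r, g, b, tol=0.05, min_s=0.4, min_v=0.3):
    """Red's hue is 0 on the hue circle, so the accepted range wraps past 1.0."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return (h <= tol or h >= 1 - tol) and s >= min_s and v >= min_v

def binary_mask(frame, predicate):
    """White (1) where the predicate matches, black (0) elsewhere."""
    return [[1 if predicate(r, g, b) else 0 for (r, g, b) in row]
            for row in frame]

frame = [[(255, 0, 0), (0, 0, 255)],     # pure red, pure blue
         [(30, 30, 30), (250, 20, 40)]]  # dark grey, reddish
mask = binary_mask(frame, is_red)
print(mask)  # [[1, 0], [0, 1]]
```

The grey pixel fails the saturation test and the blue pixel fails the hue test, so only the two reddish pixels survive into the binary image.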


Figure 3.1: Detecting the pointer

Figure 3.2: Detected Pointer from RGB image

3.2 Detection of Centroid of Pointer

After removing all connected components (objects) other than the pointer, the pointer region can be labelled. Moments can then be used to calculate the position of the centre of the object: we calculate the 1st order spatial moments and the 0th order central moment of the binary image.

The 0th order central moment of the binary image is equal to the white area of the image in pixels.

• X-coordinate of the position of the centre of the object = 1st order spatial moment

around x-axis / 0th order central moment.

• Y-coordinate of the position of the centre of the object = 1st order spatial moment


around y-axis / 0th order central moment.

After segmenting the coloured pointer from the image, it is approximated as an ellipse

and the centre of the pointer can be calculated for further processing with the following

equation.

x′ = (1/k) ∑_{i=1}^{k} x_i ,    y′ = (1/k) ∑_{i=1}^{k} y_i        (3.1)

where x_i and y_i are the x and y coordinates of the i-th pixel in the pointer region, and k denotes the number of pixels in the region.
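Equation 3.1 is equivalent to dividing the 1st order spatial moments by the 0th order moment, as described above. A small sketch, assuming the binary image is a nested list of 0/1 values:

```python
def centroid(mask):
    """Centre of the pointer from image moments (Eq. 3.1):
    M00 is the white area; x' = M10 / M00, y' = M01 / M00."""
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                m00 += 1   # 0th order moment: white area in pixels
                m10 += x   # 1st order spatial moment about the x-axis
                m01 += y   # 1st order spatial moment about the y-axis
    if m00 == 0:
        return None        # no pointer detected in this frame
    return m10 / m00, m01 / m00

mask = [[0, 1, 1],
        [0, 1, 1],
        [0, 0, 0]]
print(centroid(mask))  # (1.5, 0.5)
```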

Figure 3.3: Detected Centroid of the pointer

3.3 Distance between the pointers

Distance between pointers can be calculated from the mathematical formula mentioned

below with the two coordinates as the centroids of the detected pointers.

Distance

D = √((RedX − BlueX)² + (RedY − BlueY)²)        (3.2)

Here, RedX and RedY refer to the x and y coordinates of the red coloured pointer, whereas BlueX and BlueY refer to the x and y coordinates of the blue coloured pointer. Based

on this distance between pointers, click events and scrolling are primarily differentiated

and executed. Distance between the pointers coupled with angle can form the basis for

distinction between left, right and double click events.


Figure 3.4: Measured Distance between the Pointers

The distance is calculated essentially to determine whether contact has been established between the two pointers.
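A sketch of the distance test, with the margin value taken from the click conditions of Section 3.4 (the function names are illustrative):

```python
import math

def pointer_distance(red, blue):
    """Euclidean distance between the two centroids (Eq. 3.2)."""
    (rx, ry), (bx, by) = red, blue
    return math.sqrt((rx - bx) ** 2 + (ry - by) ** 2)

def in_contact(red, blue, red_radius, blue_radius, margin=40):
    """The pointers are in contact when the centroid distance is below
    the sum of their radii plus a noise margin."""
    return pointer_distance(red, blue) < red_radius + blue_radius + margin

print(pointer_distance((3, 0), (0, 4)))  # 5.0
```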

3.4 Mouse Events

The thumb is used as a cursor controller to control the mouse cursor. This can be described as mapping cursor control: the blue coloured pointer, assumed to be at the thumb position, maps to a desktop screen position. The proposed solution has very little time delay compared to weighted-speed cursor control. The method's efficiency is proportional to the resolution of the camera; even so, the proposed technique works efficiently on low-resolution images.

Depending on the pointer detected in the sequence of image frames, the coordinates of the pointer, mapped onto the screen resolution, change and the cursor moves accordingly.
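The mapping from camera coordinates to screen coordinates can be sketched as below; the horizontal mirroring and the specific camera/screen resolutions are assumptions for illustration, not details fixed by the thesis:

```python
def to_screen(cam_x, cam_y, cam_w=640, cam_h=480, scr_w=1920, scr_h=1080):
    """Map a pointer centroid in camera coordinates to screen coordinates.
    The camera image is mirrored horizontally so the cursor follows the hand
    (an assumption: a webcam facing the user sees a mirror image)."""
    x = (cam_w - 1 - cam_x) * scr_w / cam_w  # flip, then scale to screen width
    y = cam_y * scr_h / cam_h                # scale to screen height
    return int(x), int(y)

print(to_screen(639, 0))   # pointer at camera's top-right -> screen (0, 0)
print(to_screen(0, 479))   # pointer at camera's bottom-left -> (1917, 1077)
```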

Mouse clicking is one of the most challenging parts of the virtual mouse application. As we all know, the left button of the mouse performs different tasks for single and double clicks. To simulate this feature, another pointer is used: when the second pointer is detected in the image, a left mouse click event is executed. Depending on the time for which the second pointer is detected, single and double clicks are simulated.

3.4.1 Left Click

In order to perform a left click, the red coloured pointer needs to be placed on the thumb finger-tip, which is already associated with the blue band.


Figure 3.5: Defined gesture for Left Click

Figure 3.6: Left Click View

If distance < (RedRadius + BlueRadius + 40) and RedY < (BlueY − 20), then Left Click        (3.3)

The distance between the two centroids should be less than the sum of the radii of the red and blue pointers plus an additional margin, so as to confirm the existence of contact between the two pointers. The margin acts as a noise allowance; it is directly influenced by illumination and matters mainly for low-resolution imagery.
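Combining Eq. 3.3 with the right-click condition of Eq. 3.4 (Section 3.4.3), the click decision can be sketched as follows; the coordinate convention (y growing downward, as in image coordinates) and the helper name are assumptions:

```python
import math

def classify_click(red, blue, red_radius, blue_radius, margin=40, y_noise=20):
    """Eq. 3.3 / 3.4: with the pointers in contact, the red pointer above the
    blue one means Left Click, at or below it means Right Click.
    ('Above' is a smaller y, since image y grows downward.)"""
    rx, ry = red
    bx, by = blue
    dist = math.sqrt((rx - bx) ** 2 + (ry - by) ** 2)
    if dist >= red_radius + blue_radius + margin:
        return None                 # pointers not in contact: no click
    if ry < by - y_noise:
        return "left_click"         # Eq. 3.3
    return "right_click"            # Eq. 3.4

print(classify_click((100, 50), (100, 100), 10, 10))   # left_click
print(classify_click((100, 140), (100, 100), 10, 10))  # right_click
```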


3.4.2 Double Click

Double click can be approximated as two consecutive left clicks performed with a time delay greater than 1/10th of a second. The implementation corresponds to the left click event mentioned above; the conditions remain the same for both.

3.4.3 Right Click

In order to perform a right click, the red coloured pointer needs to be placed at the bottom end of the blue pointer. As with the left click, the distance between the two centroids should be less than the sum of the radii of the red and blue pointers plus the additional margin.

If distance < (RedRadius + BlueRadius + 40) and RedY > (BlueY − 20), then Right Click        (3.4)

Figure 3.7: Defined gesture for Right Click


Figure 3.8: Right Click View

3.4.4 Mouse Holding

In order to perform mouse holding, the red coloured pointer needs to be placed beside the blue pointer such that they are on the same horizontal axis. As with the clicks above, the distance between the two centroids should be less than the sum of the radii of the red and blue pointers plus the additional margin.

Figure 3.9: Mouse Holding View

3.4.5 Scrolling Up

Scrolling is a very useful task to support with hand gestures, and we implemented this feature in our system. Based on the distance between the two centroids, the pointers are first confirmed not to be in contact. The action then depends on the position of the pointers with respect to each other: if the blue coloured pointer is above the red coloured pointer, scrolling down is executed, and if the red coloured pointer is above the blue coloured pointer, scrolling up is executed.

In order to perform up scrolling, the y coordinate of the red pointer needs to be greater than the y coordinate of the blue pointer. On both sides of RedX, a margin of 20 pixels is considered for a better user interface. The event will not be executed if the centroid of one pointer falls out of this range, i.e., 20 pixels. This condition is considered so as not to interfere with the scrolling-down condition. The margin of 20 pixels can be increased or decreased based upon user convenience.

RedY > BlueY and (BlueX − 20) < RedX < (BlueX + 20)        (3.5)

Figure 3.10: Defined gesture for Scrolling Up

Figure 3.11: Scrolling Up View


3.4.6 Scrolling Down

In order to perform down scrolling, the y coordinate of the red pointer needs to be less than the y coordinate of the blue pointer. On both sides of RedX, a margin of 20 pixels is considered for a better user interface. The reasoning developed for scrolling up applies to scrolling down too.

RedY < BlueY and (BlueX − 20) < RedX < (BlueX + 20)        (3.6)
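The two scrolling conditions (Eqs. 3.5 and 3.6) can be combined into one small decision function; the direction mapping follows the equations, and the function name is illustrative:

```python
def classify_scroll(red, blue, x_margin=20):
    """Eq. 3.5 / 3.6: with the pointers apart, RedX within +/- x_margin of
    BlueX selects scrolling; the vertical order picks the direction."""
    rx, ry = red
    bx, by = blue
    if not (bx - x_margin < rx < bx + x_margin):
        return None                 # outside the vertical band: no scroll
    if ry > by:
        return "scroll_up"          # Eq. 3.5
    if ry < by:
        return "scroll_down"        # Eq. 3.6
    return None

print(classify_scroll((105, 200), (100, 100)))  # scroll_up
print(classify_scroll((95, 50), (100, 100)))    # scroll_down
```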

Figure 3.12: Defined gesture for Scrolling Down

Figure 3.13: Scrolling Down View


Chapter 4

Graphical User Interface (GUI)

4.1 Graphic User Interface (GUI)

The Graphical User Interface of this project is created in Visual Studio 2010. This GUI

window contains the link to run the Virtual Mouse. The GUI window contains a menu bar with File, Compatibility and Help as menu options. Each option contains a list of actions dedicated to that specific menu.

Figure 4.1: Virtual Mouse GUI window

File option contains two functionalities. Those are

• Start Virtual Mouse

• Stop Virtual Mouse

4.2 Starting Virtual Mouse

To start the virtual mouse, go to Start Virtual Mouse in the File menu or press the shortcut (Alt+Shift+S).


Figure 4.2: Window showing ’File’ Option

This starts the Virtual Mouse; after a few seconds the Virtual Mouse window is displayed and the user can operate the system without a mouse.

The user is allowed to perform the operations mentioned above.

4.3 Stopping Virtual Mouse

• In order to quit the Virtual Mouse, go to Stop Virtual Mouse in the File menu, press the shortcut (Alt+Shift+W) on the active Virtual Mouse window, or press the ESC key on the Virtual Mouse Preview window.

4.4 Compatibility

• The Compatibility menu button gives the basic requirements to run the project. The software works only on Windows systems having .NET Framework 4.0 or above, and the resolution of the webcam should be not less than 1.3 MP (HD).

4.5 Help

• The Help menu addresses common problems the user may face.

• It helps the user operate the software smoothly.

• The Help menu button contains User Guide, Instructions and Developer Team as sub-menu items.


Figure 4.3: Window showing compatibility

• User guide graphically shows how to perform all six functionalities.

Figure 4.4: Window showing ’Help’ Option

4.6 Instruction Set

The Instruction Set window contains the list of instructions for using the Virtual Mouse easily and effectively.


Figure 4.5: Window showing Information Set


Chapter 5

Results

The method followed for evaluating performance is to note the time taken for a task such as clicking, double clicking or scrolling. Five experiments were designed to measure performance. In the first experiment, we placed an icon at the centre of the desktop and put the cursor in the top-left corner, then measured how long it takes to select the icon. In the second experiment, the position of the icon is the same and we measured only the time to show the drop-down menu on the icon. In the third experiment, the position of the icon is the same as in the second experiment and we measured the time to open the icon. In the fourth experiment, an icon is dragged from one corner of the screen to another and the time was measured. In the final experiment, we opened the Google News home page and measured the time until the scroll bar moves from top to bottom. All the experiments were performed with people new to the interface. Except for the left and double click events, all the events match the capabilities of a traditional mouse. The results are shown below (time in seconds).


Users    Trial-1  Trial-2  Trial-3  Average  Traditional Mouse
User-1   2.62     2.58     2.68     2.627    1.832
User-2   2.41     2.54     2.48     2.48     1.742
User-3   2.84     2.68     2.62     2.71     2.12
User-4   2.82     2.74     2.76     2.78     1.964

Table 5.1: Left Click: Selecting the icon

Users    Trial-1  Trial-2  Trial-3  Average  Traditional Mouse
User-1   0.68     0.64     0.72     0.68     0.54
User-2   0.54     0.62     0.68     0.61     0.48
User-3   0.71     0.67     0.70     0.69     0.62
User-4   0.82     0.78     0.73     0.78     0.59

Table 5.2: Right Click: Drop Down Menu

Users    Trial-1  Trial-2  Trial-3  Average  Traditional Mouse
User-1   1.48     1.52     1.46     1.49     0.81
User-2   1.39     1.36     1.41     1.39     0.79
User-3   1.52     1.38     1.48     1.46     0.72
User-4   1.56     1.48     1.53     1.52     0.86

Table 5.3: Double Click: Opening the icon

Users    Trial-1  Trial-2  Trial-3  Average  Traditional Mouse
User-1   1.72     1.70     1.73     1.72     1.68
User-2   1.86     1.91     1.95     1.91     1.83
User-3   1.89     1.87     1.79     1.85     1.74
User-4   1.73     1.71     1.74     1.73     1.69

Table 5.4: Dragging Icon from one corner to another corner

Users    Trial-1  Trial-2  Trial-3  Average  Traditional Mouse
User-1   1.32     1.28     1.26     1.29     1.14
User-2   1.46     1.38     1.42     1.42     1.21
User-3   1.68     1.56     1.62     1.62     1.42
User-4   1.49     1.67     1.56     1.57     1.38

Table 5.5: Scrolling the GOOGLE NEWS Home Page


Chapter 6

Conclusion

A vision based virtual mouse interface to control the mouse cursor using a real-time camera was developed that can perform all mouse tasks, such as left and right clicking, double clicking and scrolling. Most vision algorithms have illumination issues and the proposed method is no exception. If vision algorithms can be made to work in all environments, the system will work more efficiently.

This system could be useful in presentations and for reducing work space. In the future, our focus would be extended to developing a virtual mouse interface with bare fingers, thereby removing the coloured pointers and promoting more interaction with the user. More features, such as enlarging and shrinking windows, closing windows, etc., can also be added, thereby making it an integrated version of the touch screen and the traditional mouse.

However, the system has some disadvantages: it is variant to illumination up to some scale, and the movement of the cursor is very sensitive to motion. In the near future, a robust virtual mouse interface overcoming the above challenges can be developed with small modifications to the existing system.


Chapter 7

References

7.1 Reference Projects

1 Erdem, E., Yardimci, Y., Atalay, V., Cetin, A. E., "Computer vision based mouse", IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2002, Proceedings.

2 Chu-Feng Lien, "Portable Vision-Based HCI: A Real-time Hand Mouse System on Handheld Devices", National Taiwan University, Computer Science and Information Engineering Department.

3 Kamran Niyazi, Vikram Kumar, "Mouse Simulation Using Two Coloured Tapes", implemented using a colour detection technique.

4 Hojoon Park, "A Method for Controlling the Mouse Movement using a Real Time Camera", 2008, Brown University, Providence, RI, USA, Department of Computer Science.

5 Asanterabi Malima, "Vision-Based Hand Gesture Recognition", a finger counting system to control the behavior of a robot.

• Dix, A., Finlay, J., Abowd, G. D., and Beale, R., Human-Computer Interaction, 3rd ed., Pearson Education Limited, 2004.

• Colombo, C., Bimbo, A. D., and Valli, A., "Visual Capture and Understanding of Hand Pointing Actions in a 3-D Environment", IEEE Transactions on Systems, Man, and Cybernetics, 33(4), University of Florence, Italy, pp. 677-686, August 2003.


7.2 Reference Books

• Mastering OpenCV with Practical Computer Vision Projects

• Learning OpenCV by Gary Bradski and Adrian Kaehler

7.3 Reference Websites

• http://www.opencv.org/

• http://stackoverflow.com/

• http://www.dreamincode.net/

• http://msdn.microsoft.com/

• http://www.technical-recipes.com/

• http://www.intorobotics.com/

• http://www.progcode.in/
