
Real-time processing of electrooculographic signal to type with a virtual keyboard

K. Ben Si Saïd, N. Ababou and A. Ababou
Laboratory of Instrumentation

University of Science and Technology Houari Boumediene, BP 32 El Alia, 16111 Bab Ezzouar, Algiers, Algeria

[email protected], [email protected], [email protected]

Abstract— When a disabled person loses the ability to speak, he or she is deprived of all his or her usual communication tools. Several communication methods and techniques have been developed to offer new means of communication based on alternative modalities. Among them, eye movement can be considered an interesting technique. Because of its efficiency, the electrooculographic (EOG) signal has been widely used to detect human eye activity. Several applications use eye tracking as a primary input source or command generator. In this paper, a method for writing text on a virtual keyboard using EOG eye tracking is proposed. It was developed using a new type of keyboard with 5x6 pads, controlled by eye blinks as well as left/right saccades. Data acquisition and analysis software were developed in LabVIEW. Different digital filters were applied and compared, and a derivative method was used. A typing speed test and the accuracy of the system's response are discussed.

Keywords—electrooculography, eye tracking, speech disabled, virtual keyboard.

I. INTRODUCTION

When people are motor disabled, they become dependent on others and usually need assistance. Over the last decades, several systems have therefore been developed with the aim of offering them a degree of autonomy. When these people are unable to control their upper limbs, alternative methods are proposed, such as generating commands and instructions for dedicated systems from their physiological signals, such as electroencephalography (EEG) [1], electromyography (EMG) [2], or electrooculography (EOG) [3].

Usually, a motor disabled person can still control his eye movements, and these movements have been used in several devices and systems as a source of control or instructions. To track the movements of the eyes, two methods are generally used: (i) video-oculography (VOG), which essentially consists of using a camera with image processing algorithms to detect eye gaze rotations, and (ii) EOG, where five surface Ag-AgCl electrodes placed around the eyes and a conditioning circuit are generally used. Among the existing systems, in the prototype described by Arai and Mardiyanto [4], an eye tracking device based on VOG was proposed to control an electric wheelchair by placing a camera 15 cm from the face to follow the eye gaze movements and to generate four commands (left, right, forward, and stop). Other systems are based on electrooculography (EOG), such as those described in [5,6], where five surface electrodes were used with a conditioning card to let quadriplegic disabled people control a wheelchair with their eyes. Sung et al. [7] proposed a device based on EOG eye tracking using only vertical eye movements to control an Android application on a tablet; the principal aim of their work was to help musicians turn pages while playing their instruments. The authors in ref. [8] presented a low cost device that can control some home appliances using only left-right eye movements. Martinez et al. [9] combined EOG and voice recognition to control a robotic arm and discussed the results obtained with the two methods.

Recently, some applications have aimed to offer communication support for paralyzed people by using EOG eye tracking to type text on virtual keyboards with different methods. Tangsuksant et al. [10] proposed a method in which a square virtual keyboard with 5x5 buttons was developed. Eye movement direction was used to control the virtual keyboard; closing the eyes for a fixed time was used as the selection command, to avoid triggering on involuntary eye blinking. Soltani et al. [11] proposed a wearable eye tracking system based on EOG communicating with a human-computer interface (HCI) via a Bluetooth unit. The keyboard developed was a classic phone keypad with nine buttons. By analyzing the EOG signals, eight directions of eye movement (up, down, right, left, up-left, up-right, down-left, down-right) were recognized and used to select a button. Voluntary blinking was employed to open a second page containing characters and to choose the one to be displayed. In both methods, the accuracy of eye movement direction detection depends mainly on the EOG signal voltage level, which can be considered a limitation.

The virtual keyboard proposed in this work is based on a linear rather than a matrix architecture, so as to simplify the processing and to reduce the risk of errors inherent to matrix keyboards. In order to prevent the problems associated with EOG voltage signal drift, horizontal saccadic eye movement was selected to command the virtual keyboard, and the time derivative of the EOG signal was used to ensure better discrimination of eye movement amplitudes. The methodology and the experimental setup are briefly described in Sections II and III respectively. Results and discussion are presented in Section IV.

Research supported by MESRS JO200220140002 CNEPRU.


II. METHODOLOGY

Electrooculography is the measurement of the resting potential (potential difference), arising from hyperpolarization and depolarization, that exists between the cornea and the retina. This potential varies as the eyeball rotates; the human eyeball can therefore be considered as a spherical battery. To record the EOG signals, five Ag-AgCl surface electrodes were used: two for the horizontal movements (blue), two for the vertical movements (red), as shown in Fig. 1, and the last one (purple) as a reference electrode.

Fig. 1. Electrode placement and resulting signals

Using EOG, the movements of human eyes can provide three informational signals: right-left eyeball movement, up-down eyeball movement, and eye blinks. This paper proposes a method to control a virtual keyboard using only two of these signals: one from eye blinking and the other from right-left eyeball movements. Differentiating the EOG signal enhances the discrimination of saccadic eye movement amplitudes. However, differentiation also amplifies noise, so filtering is needed. Four EOG signal filters presented in the literature have been considered in this work: the Butterworth low-pass filter [12,13], the FFT filter [14], the Haar wavelet filter [15,16] and the Savitzky-Golay filter [17,18].
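As an illustration of these four candidates, the sketch below (Python with NumPy, SciPy and PyWavelets, not the authors' LabVIEW code) applies each filter to a synthetic EOG-like trace. The 50 Hz sampling rate and 3 Hz cutoff follow the paper; the test signal, the wavelet decomposition depth and the Savitzky-Golay window are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, savgol_filter
import pywt

fs, fc = 50.0, 3.0                       # sampling rate and cutoff (Hz), per the paper
t = np.arange(0, 10, 1 / fs)
eog = np.repeat([0.0, 1.0, 0.0, -1.0, 0.0], len(t) // 5)  # made-up saccade staircase
eog += 0.1 * np.random.randn(len(eog))                    # additive measurement noise

# (b) third-order Butterworth low-pass, applied zero-phase
b, a = butter(3, fc / (fs / 2), btype="low")
y_butter = filtfilt(b, a, eog)

# (c) FFT filter: zero out all spectral bins above the cutoff
spec = np.fft.rfft(eog)
freqs = np.fft.rfftfreq(len(eog), d=1 / fs)
spec[freqs > fc] = 0.0
y_fft = np.fft.irfft(spec, n=len(eog))

# (d) Haar wavelet filter: discard the finest detail coefficients
coeffs = pywt.wavedec(eog, "haar", level=4)
coeffs[-1][:] = 0.0
coeffs[-2][:] = 0.0
y_haar = pywt.waverec(coeffs, "haar")[: len(eog)]

# (e) Savitzky-Golay filter (window and order chosen here, not from the paper)
y_sg = savgol_filter(eog, window_length=17, polyorder=3)
```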

A virtual keyboard with five green bars (1 to 5), each carrying six characters, is presented to the user. To control the keyboard, the user must first look at the middle bar. He can then use right-left eye movements to select one of the five bars; the selected bar turns light green. Finally, he uses voluntary eye blinks to select the desired character among the six on the chosen bar (see Fig. 2).

Fig. 2. Virtual keyboard with 5x6 pads: five column positions, each containing six different characters in six rows.
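As a compact illustration, the 5x6 grid can be represented as a simple lookup structure. The sketch below is Python (the original interface is LabVIEW), and the character layout itself is hypothetical; the paper specifies only the 5-column by 6-row geometry.

```python
# Hypothetical 5x6 layout: five columns (chosen by left/right saccades),
# six characters per column (chosen by counting blinks).
LAYOUT = [
    list("ABCDEF"),   # column 1
    list("GHIJKL"),   # column 2
    list("MNOPQR"),   # column 3 (the starting, middle bar)
    list("STUVWX"),   # column 4
    list("YZ.,?!"),   # column 5
]

def char_at(column: int, blinks: int) -> str:
    """Look up a character: column in 1..5, blink count in 1..6."""
    return LAYOUT[column - 1][blinks - 1]

assert char_at(3, 3) == "O"   # e.g. middle column, third blink
```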

III. EXPERIMENTAL

A. EOG analog conditioning

The EOG amplitude varies from 50 to 3500 µV (around 20 µV per degree) in a frequency range from DC to 100 Hz [19]. The raw signals from the electrodes had low amplitude and were noisy. To get a better signal output, the conditioning circuit shown in Fig. 3 was used for the horizontal as well as the vertical electrodes. It contains five stages, starting with the pre-amplification stage (a), an instrumentation amplifier with a gain of 1000.

When EOG is used to track eye movements, the resulting signals contain low frequency noise caused by luminance, EEG, head movements, electrode placement, etc. It is therefore necessary to eliminate the shifting resting potential (mean value), because this value changes over time [20]. This was accomplished by the second order high-pass active filter (b) with a cutoff frequency of 0.5 Hz. The third part of the circuit (c) is a third order active low-pass filter with a cutoff frequency of 30 Hz. Block (d) is an inverting amplifier with a gain of 3.

The resulting signal after amplification and analog filtering takes negative and positive values depending on the eyeball movements. To allow conversion to a digital signal, a summing amplifier (e) was added. The output signal of the conditioning circuit has a maximum magnitude of 4 V, so it can be processed by a USB acquisition card (see Fig. 3.2) based on a PIC18F2550 microcontroller, which was configured and programmed in this application as a slave. The analog EOG signals were converted into digital data at a sampling frequency of 50 Hz and sent to the host computer using the USB 2.0 protocol.
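The sketch below models the pass-band of this chain: a second-order 0.5 Hz high-pass cascaded with a third-order 30 Hz low-pass, followed by the overall x3000 gain of stages (a) and (d). It is a rough model only; the paper does not state the active-filter topology, so the Butterworth responses used here are an assumption.

```python
import numpy as np
from scipy import signal

# Analog prototypes of the two filter stages (topology assumed Butterworth)
b_hp, a_hp = signal.butter(2, 2 * np.pi * 0.5, btype="high", analog=True)
b_lp, a_lp = signal.butter(3, 2 * np.pi * 30.0, btype="low", analog=True)

w = 2 * np.pi * np.logspace(-2, 3, 500)       # 0.01 Hz .. 1 kHz, in rad/s
_, h_hp = signal.freqs(b_hp, a_hp, worN=w)
_, h_lp = signal.freqs(b_lp, a_lp, worN=w)

# Cascade both filter stages with the x1000 preamp (a) and x3 inverter (d)
gain_db = 20 * np.log10(np.abs(h_hp * h_lp) * 1000 * 3)
```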

Fig. 3. Hardware parts of the system: (1) EOG conditioning circuit, (2) microcontroller-based USB acquisition card


B. Signal processing on LabVIEW

Fig. 4. Diagram of the virtual keyboard developed in LabVIEW

The main code of the virtual keyboard, developed in LabVIEW, is executed in a loop as represented in Fig. 4. It opens the USB communication with the acquisition card via block (A), then receives the converted horizontal and vertical EOG data in the reception buffer (B). Block (C) separates the data into two parts, one for the horizontal and one for the vertical EOG channel. Each channel signal is updated via block (D), and the two signals are then low-pass filtered in block (E) with fc = 3 Hz.

To detect saccadic eye movements, a first order central difference numerical derivative was used to eliminate the dc shift; this is performed by block (F). The resulting values are compared to thresholds by block (G) for the vertical EOG channel, yielding the number of blinks, and by block (H) for the horizontal EOG channel, determining which pad the user looked at. Block (I) determines when a character is validated, using the method described in the next subsection, and displays it in the text box (see Fig. 2). The next block (J) is a switch-case structure that lights up the horizontal (1-5) and vertical (6-11) LEDs on the front face of the interface. The last part (K) is a graphical representation of the filtered and differentiated signals associated with the vertical and horizontal EOG. All processing was done in real time at the 50 Hz sampling frequency.
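A minimal text-language sketch of blocks (E)-(H) is given below (Python/SciPy standing in for the LabVIEW blocks). The 50 Hz rate, 3 Hz cutoff and central-difference derivative come from the paper; the filter order, the streaming filter-state handling and the threshold value are assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter, lfilter_zi

FS = 50.0
B, A = butter(3, 3.0 / (FS / 2), btype="low")     # block (E); order assumed

def process(channel: np.ndarray, threshold: float) -> np.ndarray:
    """Return sample indices where the derivative of the filtered EOG
    crosses the threshold (block (G) or (H), depending on the channel)."""
    zi = lfilter_zi(B, A) * channel[0]            # settle the filter state
    y, _ = lfilter(B, A, channel, zi=zi)          # causal, streaming-style filter
    # Block (F): first-order central difference; in a real-time loop this
    # costs one sample of latency, since y[n+1] must be available.
    dy = np.zeros_like(y)
    dy[1:-1] = (y[2:] - y[:-2]) * (FS / 2.0)
    # Blocks (G)/(H): threshold comparison (the value itself is hypothetical)
    return np.flatnonzero(np.abs(dy) > threshold)
```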

C. Using the virtual keyboard

The distance between the user's eyes and the screen must be between 30 and 40 cm. Two indicator colors, red and light green, show the current position (the letter 'O' in the Fig. 2 example). The red highlight of the selected row moves from top to bottom according to the eye blinks; the light green highlight of the selected column moves from the central position to the left or right columns according to the horizontal eye movements. In the Fig. 2 example, the last row (red) and the middle column (light green) are selected. The intersection of the selected row and column corresponds to the character selected by the user. The text box above the keyboard contains the chosen and validated characters. To select a character, the user first chooses a horizontal position (indicated by the light green color), then blinks, making the red highlight jump from the first row down to the row containing the desired character.

Fig. 5. Flowchart describing the typing steps

Fig. 5 shows the flowchart describing the steps needed to validate a character on the virtual keyboard. The selected bar is indicated by the light green color. If no blink is detected, no character can be selected or displayed. If one blink is performed and its related signal detected, a red LED lights up on the first row of the active bar. As the number of blinks increases, the next pads light up successively. If this number exceeds 6, the blink counter returns to zero. If the eye gaze then changes horizontally, the character is validated and displayed in the text box, and the procedure restarts for the next character.
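The flowchart's counting and validation rules can be condensed into a small state machine, sketched below in Python. The on_blink and on_saccade handlers are hypothetical names standing in for the thresholded vertical and horizontal derivative detections of Section III-B; the wrap-after-six and validate-on-gaze-change rules follow the flowchart.

```python
class Typist:
    """Minimal model of Fig. 5: blinks move the red row highlight,
    a horizontal gaze change validates the highlighted character."""

    def __init__(self, layout):
        self.layout = layout   # 5 columns x 6 rows, as in Fig. 2
        self.column = 3        # the user starts on the middle bar
        self.blinks = 0        # 0 means no row highlighted yet
        self.text = ""

    def on_blink(self):
        self.blinks += 1
        if self.blinks > 6:    # past the last row: the counter returns to zero
            self.blinks = 0

    def on_saccade(self, step):
        if self.blinks > 0:    # gaze change with a highlighted row: validate
            self.text += self.layout[self.column - 1][self.blinks - 1]
            self.blinks = 0
        self.column = min(5, max(1, self.column + step))
```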

IV. RESULTS AND DISCUSSION

A. Selecting a convenient filter

Four different digital low-pass filters (fc = 3 Hz) were tested in order to select the most efficient one for EOG signals; their outputs are compared with the raw data (without digital filtering) in Fig. 6. The black curve (a) is the raw data, while curves (b) to (e) show the same data filtered respectively by the Butterworth filter, the FFT filter, the Haar wavelet filter and the Savitzky-Golay filter. The vertical scale of the graph corresponds to the Savitzky-Golay filtered data. For clarity, the other curves have been shifted up by 0.9 V for the Haar wavelet curve (d), 1.8 V for the FFT curve (c), 2.7 V for the Butterworth curve (b) and 3.6 V for the raw data (curve a).

From the curves' shapes, one can consider, to a first approximation, that the digital third-order Butterworth filter gives better results than the other three.


Fig. 6. EOG signals after applying 3 Hz low-pass digital filters: (a) raw data without digital filtering; (b) third-order Butterworth filter; (c) FFT filter; (d) Haar wavelet filter; (e) Savitzky-Golay filter

Considering the effect of these four filters on the time derivative of the EOG signal, shown in Fig. 7, the third-order Butterworth filter (red curve (b)) again appears to give the best results, with the highest signal-to-noise (S/N) ratio (37.43 dB).

Fig.7. Filter effect on time derivative of EOG signal
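The paper does not spell out how the S/N ratio was estimated; a common power-ratio definition, sketched below as an assumption, compares the filtered output against a noise-free reference.

```python
import numpy as np

def snr_db(reference: np.ndarray, filtered: np.ndarray) -> float:
    """Ratio of reference-signal power to residual-noise power, in dB."""
    noise = filtered - reference
    return 10.0 * np.log10(np.sum(reference**2) / np.sum(noise**2))
```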

B. Derivative method efficiency

A graphical representation of the signals was made in our laboratory to show the efficiency of the derivative method for detecting saccadic movements of the eye gaze. A subject with five Ag-AgCl electrodes placed on his face, as shown in Fig. 1, performed a test. First, one-step saccadic eye movements were considered: the subject was asked to look at the middle of the keyboard (column 3 in Fig. 2) for a while. The resulting signal, illustrated in Fig. 8, shows a random variation of the filtered signal magnitude for the same eye gaze direction, because the current magnitude depends on the preceding one (corresponding to the last eye gaze direction).

Fig. 8. Horizontal EOG signal and related time derivative signal for one-step saccadic eye movement

Next, two-step saccadic eye movements were considered: the subject was asked to look at position (5) before returning to the middle position (3), then to look successively at positions (1), (3) and so on, over a succession of intervals. The same magnitude can be noted in the derivative curve (see Fig. 9) for the same horizontal eye movement amplitude (1-3, 3-5, 5-3, etc.), because differentiation eliminates the dc-shift component, so that only the signal associated with the saccadic horizontal movement of the eyes remains. Thus, the differences in signal magnitude related to eye saccades become significant and can easily be compared with thresholds to determine the eye gaze direction angle.

Fig. 9. Horizontal EOG signal and related time derivative signal for two-step saccadic eye movement
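The baseline-rejection property is easy to verify numerically. The toy demonstration below (illustrative values only) shows that two saccades of equal angular amplitude produce derivative peaks of equal magnitude even when the raw signal rides on a constant offset.

```python
import numpy as np

fs = 50.0  # sampling rate used in this work (Hz)
# Horizontal EOG as a staircase: gaze 3 -> 5 -> 3 -> 1, riding on an offset
level = np.concatenate([np.full(50, 0.0),    # gaze at column 3
                        np.full(50, 1.0),    # two-step saccade to column 5
                        np.full(50, 0.0),    # back to column 3
                        np.full(50, -1.0)])  # two-step saccade to column 1
level += 0.3                                 # constant baseline drift
dy = np.gradient(level) * fs                 # central-difference derivative
print(dy.max(), dy.min())                    # equal-magnitude peaks, +/-25
```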


C. Calibration

Fig. 10. Scatter plot, trend line, and thresholds (blue dash-dot lines) of the horizontal EOG calibration

For the horizontal EOG calibration, a subject was asked to sit so that his eyes were approximately 30 to 40 cm from the virtual keyboard displayed on the PC monitor. This corresponds to an angular displacement of the eye gaze direction of approximately 15° between two adjacent columns. EOG signals were then recorded for different saccadic eye movements. For each horizontal position (1-5), six points of the derivative signal were used to plot the experimental data, the trend line (linear fit) and the thresholds for each column, as represented in Fig. 10.
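A sketch of this fit-and-threshold step is given below (Python/NumPy; the peak values are synthetic and the midpoint threshold rule is an assumption, since the paper presents the thresholds only graphically in Fig. 10).

```python
import numpy as np

rng = np.random.default_rng(0)
positions = np.repeat([1, 2, 3, 4, 5], 6)            # six samples per column
# Synthetic derivative peaks, roughly linear in column position plus noise
peaks = 0.5 * (positions - 3) + 0.05 * rng.standard_normal(positions.size)

slope, intercept = np.polyfit(positions, peaks, 1)   # linear trend line
centers = slope * np.arange(1, 6) + intercept        # fitted value per column
thresholds = (centers[:-1] + centers[1:]) / 2.0      # 4 decision boundaries

def column_from_peak(p: float) -> int:
    """Classify a derivative peak into a column 1..5 using the thresholds."""
    return int(np.searchsorted(thresholds, p)) + 1
```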

D. Performance evaluation

To evaluate the performance of our system, we carried out two tests: the first concerned the typing speed, and the second the accuracy of the system's response.

Typing speed test

Two subjects, aged 24 and 23, participated in this test. They were asked to sit as in the calibration experiment and first to train with the system for five minutes, trying to write some words. They were then asked to type the word "HELLO" three times, and the time was recorded for each trial. The results are shown in Table I; the average typing speeds for the two subjects were respectively 8.25 and 8.65 characters/min. Compared with other authors, the typing speed of this virtual keyboard was nearly 3.5 times higher than that of Tangsuksant et al. [10] and twice that of Soltani et al. [11].

TABLE I: TYPING TIME (s) FOR THE WORD "HELLO"

User    Test 1    Test 2    Test 3    Average
1       40        35        34        36.33
2       37        32        35        34.67
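As a worked check, five characters typed in an average of 36.33 s correspond to (5 / 36.33) x 60 ≈ 8.26 characters/min, and 34.67 s to (5 / 34.67) x 60 ≈ 8.65 characters/min, in line with the averages quoted above.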

Accuracy of the system's response

To determine the accuracy of the virtual keyboard, a test was carried out by a subject who sat as in the calibration procedure and performed 25 saccades between the middle column (3) and each other position. A detection was counted as correct (CP) when the correct LED lit up, and as wrong (WP) otherwise. Table II contains the results as well as the success ratios. Columns 1 and 5 were reached from the middle column 3 by an eye rotation of two steps, and columns 2 and 4 by one step.

TABLE II: ACCURACY OF THE SYSTEM'S RESPONSE

              Test 1            Test 2
Position    CP      WP        CP      WP      Success ratio
Pos. 1      20      05        19      06      78%
Pos. 2      23      02        22      03      90%
Pos. 4      22      03        23      02      90%
Pos. 5      21      04        20      05      82%

Success ratio per test: 86% (Test 1), 84% (Test 2)

The results show that the typing accuracy of the system was around 85%. Several factors could influence the typing accuracy, namely the electrode placement, the tiredness of the user, and his practice with the system; the accuracy can certainly be improved by training. Compared with other authors, the typing accuracy of this virtual keyboard was 9% lower than the results presented by Tangsuksant et al. [10] or Soltani et al. [11].

V. CONCLUSION

The aim of this paper was to present a communication system for people who are simultaneously speech and motor disabled. The system is based on EOG eye gaze tracking to write on a virtual keyboard developed in LabVIEW, driven by left-right eye movements and eye blinks. The results showed that the time derivative of the filtered EOG signal is effective for detecting the eye gaze saccades that drive writing on the virtual keyboard. The typing speed tests showed that our system is faster, but slightly less accurate, than two others presented in the literature. In future work, this virtual keyboard should be improved by eliminating the modulated noise contained in the low frequency part of the EOG signal.

REFERENCES

[1] L. Bi, X. A. Fan and Y. Liu, "EEG-based brain-controlled mobile robots: A survey," IEEE Trans. Hum.-Mach. Syst., vol. 43, no. 2, pp. 161-176, March 2013.

[2] A. Ferreira, R. L. Silva, W. C. Celeste, T. F. Bastos-Filho, and M. Sarcinelli-Filho, "Human-machine interface based on muscular and brain signals applied to a robotic wheelchair," J. Phys. Conf. Ser., vol. 90, pp. 1-8, 2007.

[3] M. Shazzad Hossain, K. Huda, M. Ahmad, "Command the computer with your eye – An electrooculography based approach," Proc. 8th Int. Conf. Software, Knowledge, Information Management and Applications (SKIMA), pp. 1-6, Dhaka, Bangladesh, December 2014.


[4] K. Arai and R. Mardiyanto, "A prototype of electric wheelchair controlled by eye-only for paralyzed user," Journal of Robotics and Mechatronics, vol. 23, pp. 66-74, February 2011.

[5] R. Barea, L. Boquete, J. M. Rodriguez-Ascariz, S. Ortega, and E. López, "Sensory system for implementing a human-computer interface based on electrooculography," Sensors, vol. 11, no. 1, pp. 310-328, January 2011.

[6] X. Zheng, X. Li, J. Liu, W. Chen and Y. Hao, "A portable wireless eye movement-controlled human-computer interface for the disabled," Proc. Intern. Conf. Complex Medical Eng. (ICME), Tempe, USA, April 2009.

[7] W. T. Sung, J. H. Chen, and K. Y. Chang, "ZigBee based multi-purpose electronic score design and implementation using EOG," Sensors Actuators A Phys., vol. 190, pp. 141-152, February 2013.

[8] V. Aswin Raj and V. Karthik Raj, "EOG based low cost device for controlling home appliances," Int. Journal Innov. Res. Sci. Eng. Technol. (IJIRSET), vol. 3, no. 3, pp. 708-711, March 2014.

[9] J. A. Martinez, A. Ubeda, E. Ianez, J. M. Azorin and C. Perez-Vidal, "Multimodal system based on electrooculography and voice recognition to control a robot arm," Int. Journ. Adv. Robot. Syst., vol. 10, pp. 1-9, 2013.

[10] W. Tangsuksant, C. Aekmunkhongpaisal, P. Cambua, T. Charoenpong, and T. Chanwimalueang, "Directional eye movement detection system for virtual keyboard controller," The 2012 Biomed. Eng. Int. Conf. (BMEiCON-2012), pp. 1-5, 2012.

[11] S. Soltani, A. Mahnam, "Design of a novel wearable human computer interface based on electrooculography," 21st Iranian Conference on Electrical Engineering (ICEE), pp. 1-5, 2013.

[12] S. Yathunanthan, L. U. R. Chandrasena, A. Umakanthan, V. Vasuki and S. R. Munasinghe, "Controlling a wheelchair by use of EOG signal," 4th IEEE International Conference on Information and Automation for Sustainability (ICIAFS), pp. 283-288, December 2008.

[13] N. Kim-Tien, N. Truong-Thinh, "Using electrooculogram and electromyogram for powered wheelchair," IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 1585-1590, December 2011.

[14] T. Q. D. Khoa, V. Van Toi, "A tool for analysis and classification of sleep stages," IEEE International Conference on Advanced Technologies for Communications (ATC), pp. 307-310, August 2011.

[15] D. M. Malini and D. K. SubbaRao, "Analysis of EOG signal using Haar wavelet," International Journal of Computer Applications, vol. 2, pp. 28-31, 2011.

[16] K. Pettersson, S. Jagadeesan, K. Lukander, A. Henelius, E. Hæggström, and K. Müller, "Algorithm for automatic analysis of electro-oculographic data," Biomedical Engineering Online, vol. 12, no. 1, pp. 110-117, 2013.

[17] T. Pander, R. Czabański, T. Przybyła and D. Pojda-Wilczek, "Saccades detection in optokinetic nystagmus – a fuzzy approach," Journal of Medical Informatics & Technologies, vol. 19, pp. 33-39, 2012.

[18] P. Ebrahim, W. Stolzmann and B. Yang, "Eye movement detection for assessing driver drowsiness by electrooculography," IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 4142-4148, October 2013.

[19] H. Singh and J. Singh, "Human eye tracking and related issues: A review," Intern. Journ. Scient. Res. Publ., vol. 2, no. 9, pp. 1-9, 2012.

[20] R. Barea, L. Boquete, L. M. Bergasa, E. López and M. Mazo, "Electro-oculographic guidance of a wheelchair using eye movements codification," The International Journal of Robotics Research, vol. 22, no. 7-8, pp. 641-652, 2003.
