research article a robust and fast computation touchless...

18
Research Article A Robust and Fast Computation Touchless Palm Print Recognition System Using LHEAT and the IFkNCN Classifier Haryati Jaafar, Salwani Ibrahim, and Dzati Athiar Ramli Intelligent Biometric Group, School of Electrical and Electronic Engineering, Universiti Sains Malaysia Engineering Campus, 14300 Nibong Tebal, Penang, Malaysia Correspondence should be addressed to Dzati Athiar Ramli; [email protected] Received 20 October 2014; Revised 25 April 2015; Accepted 29 April 2015 Academic Editor: Dominic Heger Copyright © 2015 Haryati Jaafar et al. is is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Mobile implementation is a current trend in biometric design. is paper proposes a new approach to palm print recognition, in which smart phones are used to capture palm print images at a distance. A touchless system was developed because of public demand for privacy and sanitation. Robust hand tracking, image enhancement, and fast computation processing algorithms are required for effective touchless and mobile-based recognition. In this project, hand tracking and the region of interest (ROI) extraction method were discussed. A sliding neighborhood operation with local histogram equalization, followed by a local adaptive thresholding or LHEAT approach, was proposed in the image enhancement stage to manage low-quality palm print images. To accelerate the recognition process, a new classifier, improved fuzzy-based k nearest centroid neighbor (IFkNCN), was implemented. By removing outliers and reducing the amount of training data, this classifier exhibited faster computation. Our experimental results demonstrate that a touchless palm print system using LHEAT and IFkNCN achieves a promising recognition rate of 98.64%. 1. Introduction Palm print recognition has been widely investigated for the last decade in the field of pattern recognition. Similar to fin- gerprint recognition, palm print technology is based on the aggregate of information presented in a friction ridge impression. Although the image quality of a fingerprint is robust because of multiple lines, wrinkles, and ridges, a palm print includes even more information. A palm print covers a wider area than a fingerprint and contains characteristics such as palmar creases and triradius that are useful for recognition [1]. More importantly, ridge structures remain unchanged throughout life, except for a change in size [2]. A palm print is distinctive and thick, enabling easy cap- ture by low-resolution devices. erefore, palm print detec- tion systems have a low cost and require minimum user cooperation for extraction [3]. Most palm print biometrics utilizes scanners or charge-coupled device (CCD) cameras as the input sensor [4, 5]. Because users must touch the sensor to acquire their hand images, users are concerned about hygiene, particularly in public areas, such as hospitals, malls, and streets [6, 7]. Disease-causing organisms, such as influenza virus, can be passed by indirect contact, and a susceptible individual can be infected from contact with a contaminated surface. e surface can become contaminated easily [6]; therefore, a touchless approach is required for palm print biometric technology. e development of a touchless palm print recognition system is not straightforward. e hand position of the user during image acquisition is always changing. A touchless sys- tem does not require the user to touch or hold any platform or guidance peg. Users can open their hand, close their hand, or pose in a natural manner [6], and the hand can be deformed in other manners, including rotation, scale variability, and palm stretching, compared with touch-based systems [8]. erefore, hand tracking and valley detection are challenging. As a result, hand tracking and region of interest (ROI) seg- mentation are difficult to implement. Complex backgrounds, poor ridge structures, and small image areas result in low- quality palm print images. e presence of noise/degradation (linear or nonlinear) and illumination changes [9] may reduce recognition accuracy. e computation times for the recognition process also must be considered. Because palm print systems consist of many major processes, such Hindawi Publishing Corporation Computational Intelligence and Neuroscience Volume 2015, Article ID 360217, 17 pages http://dx.doi.org/10.1155/2015/360217

Upload: dinhxuyen

Post on 21-Jul-2019

214 views

Category:

Documents


0 download

TRANSCRIPT

Page 1: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Research ArticleA Robust and Fast Computation Touchless Palm PrintRecognition System Using LHEAT and the IFkNCN Classifier

Haryati Jaafar Salwani Ibrahim and Dzati Athiar Ramli

Intelligent Biometric Group School of Electrical and Electronic Engineering Universiti Sains Malaysia Engineering Campus14300 Nibong Tebal Penang Malaysia

Correspondence should be addressed to Dzati Athiar Ramli dzatiusmmy

Received 20 October 2014 Revised 25 April 2015 Accepted 29 April 2015

Academic Editor Dominic Heger

Copyright copy 2015 Haryati Jaafar et al This is an open access article distributed under the Creative Commons Attribution Licensewhich permits unrestricted use distribution and reproduction in any medium provided the original work is properly cited

Mobile implementation is a current trend in biometric design This paper proposes a new approach to palm print recognition inwhich smart phones are used to capture palmprint images at a distance A touchless systemwas developed because of public demandfor privacy and sanitation Robust hand tracking image enhancement and fast computation processing algorithms are required foreffective touchless and mobile-based recognition In this project hand tracking and the region of interest (ROI) extraction methodwere discussed A sliding neighborhood operation with local histogram equalization followed by a local adaptive thresholdingor LHEAT approach was proposed in the image enhancement stage to manage low-quality palm print images To accelerate therecognition process a new classifier improved fuzzy-based k nearest centroid neighbor (IFkNCN) was implemented By removingoutliers and reducing the amount of training data this classifier exhibited faster computationOur experimental results demonstratethat a touchless palm print system using LHEAT and IFkNCN achieves a promising recognition rate of 9864

1 Introduction

Palm print recognition has been widely investigated for thelast decade in the field of pattern recognition Similar to fin-gerprint recognition palm print technology is based onthe aggregate of information presented in a friction ridgeimpression Although the image quality of a fingerprint isrobust because of multiple lines wrinkles and ridges a palmprint includes even more information A palm print coversa wider area than a fingerprint and contains characteristicssuch as palmar creases and triradius that are useful forrecognition [1] More importantly ridge structures remainunchanged throughout life except for a change in size [2]A palm print is distinctive and thick enabling easy cap-ture by low-resolution devices Therefore palm print detec-tion systems have a low cost and require minimum usercooperation for extraction [3] Most palm print biometricsutilizes scanners or charge-coupled device (CCD) camerasas the input sensor [4 5] Because users must touch thesensor to acquire their hand images users are concernedabout hygiene particularly in public areas such as hospitalsmalls and streets [6 7] Disease-causing organisms such

as influenza virus can be passed by indirect contact and asusceptible individual can be infected from contact with acontaminated surfaceThe surface can become contaminatedeasily [6] therefore a touchless approach is required for palmprint biometric technology

The development of a touchless palm print recognitionsystem is not straightforward The hand position of the userduring image acquisition is always changing A touchless sys-temdoes not require the user to touch or hold any platformorguidance peg Users can open their hand close their hand orpose in a natural manner [6] and the hand can be deformedin other manners including rotation scale variability andpalm stretching compared with touch-based systems [8]Therefore hand tracking and valley detection are challengingAs a result hand tracking and region of interest (ROI) seg-mentation are difficult to implement Complex backgroundspoor ridge structures and small image areas result in low-quality palm print imagesThe presence of noisedegradation(linear or nonlinear) and illumination changes [9] mayreduce recognition accuracy The computation times forthe recognition process also must be considered Becausepalm print systems consist of many major processes such

Hindawi Publishing CorporationComputational Intelligence and NeuroscienceVolume 2015 Article ID 360217 17 pageshttpdxdoiorg1011552015360217

2 Computational Intelligence and Neuroscience

HTTP

HTTP

Image username

Authentication

Matching

Acceptreject

Data collection

Palm print database

Input from user (image and username)

Accept or reject

Client(user and smart phone)

Internet(WAP HTTP PHP)

Server(MATLAB and database)

Figure 1 Overall research architecture

as data acquisition preprocessing feature extraction andclassification fast processing algorithms are crucial [10 11]

This paper focuses on solutions for low-quality palmprintimages and computation times and includes a brief discussionof hand tracking and ROI segmentationThe overall researchcan be divided into three parts which are the client or smartphone side internet side and the server side which areillustrated as in Figure 1

For the client side the Android application for capturingbiometric data is developed and it programs by using thelatest few versions of Android OS ranging from version16 to version 412 Its programs support the mobile phonecamera with the resolution up to 32 megapixels hence onlya few smart phones can be used for testing Due to theexisting camera application that varies for almost all smartphones and tablets a customized camera application withthe integration of enrolment and identification functions isdeveloped for this research The internet site is to connectthe communication between smart phone and server and theconnection is done viaWi-Fi and the PHP script is created toinvoke the MATLAB program in the server

The last part is server side where all the MATLAB pro-gramming including hand image identification ROI extrac-tion palm print feature extraction and pattern matchingalgorithms is written The server software used in the projectis free software where a personal computer serves as a serverand has limited access from the client Several palm printfeature extraction algorithms which are based on subspacemethod are developed and evaluated for the fast and efficientmobile biometric system Details of these operations can befound in Ibrahim and Ramli [12]

This study focused on the server side where two majorcontributions that is image enhancement and classificationprocesses have been developed to improve the quality of

touchless palm print recognition systems We propose a localhistogram equalization and adaptive thresholding (LHEAT)technique for image enhancement This technique is animproved version of the local histogram equalization (LHE)and local adaptive thresholding (LAT) techniques Unlikeprevious methods [13ndash16] we used the sliding neighborhoodoperation for faster computation [17] To accelerate therecognition process the improved fuzzy-based 119896 nearestcentroid neighbor (IFkNCN) was used as the classifier for thesystem The sliding neighborhood operation in the LHEATtechnique also reduces the processing time of the imageenhancement stage comparedwith the baseline LHE andLATtechniques

This paper is organized as follows Section 2 presentsrelated works and motivation The proposed classifier for thepalm print recognition system is described in Section 3 Theexperimental results are explained in Section 4 and Section 5summarizes the work

2 Related Works and Motivation

Many methods have been proposed to overcome the chal-lenges associated with palm print recognition Han and Lee[5] described two CMOS web cameras placed in parallelto segment the ROI of 1200 palm print images of identicalsize The first camera captures the infrared image for handdetection and the second camera is used to acquire the colorimage in normal lighting The images are normalized usinginformation on skin color and hand shape The normalizedimages are then segmented to determine the ROI using theordinal code approach and then classified with the Hammingdistance classifier Experimental results have shown that theequal error rate (EER) of the verification test is 054 and thatthe average acquisition time is 12 seconds Feng et al [18]

Computational Intelligence and Neuroscience 3

used the Viola-Jones method [19] to detect the hand posi-tion after capturing 2000 images In this study images wereacquired in different positions with various lighting andcluster backgrounds Subsequently a coarse-to-fine strategywas used to detect the key points on the hand The key handpoints were then verified with the shape context descriptorbefore the images were segmented into the ROIThe boostingclassifier cascade [20] has previously been applied and theaccuracy rate was 938 with a 178ms average processingtime for one image Michael et al [2] described a touchlesspalm print recognition system that was designed usinga low-resolution CMOS web camera to acquire real-timepalm print images A hand tracking algorithm that is skincolor thresholding and hand valley detection algorithm wasdeveloped to automatically track and detect the ROI of thepalm print The Laplacian isotropic derivative operator wasused to enhance the contrast and sharpness of the palm printfeature and a Gaussian low-pass filter was applied to smooththe palm print image and bridge some small gaps in the lineThemodified probabilistic neural network (PNN)was used toclassify the palm print texture The accuracy rate was greaterthan 90 Similar to previous studies Michael et al [21] usedlocal-ridge-enhancement (LRE) to enhance the contrast andsharpness of images of both the right and left handsThe LREwas used to determine which section of the image containsimportant lines and ridge patterns and then amplify onlythose areas The support vector machine (SVM) was usedand the average accuracy rates for the left and right handswere 97 to 98 respectively

Although previous researchers have achieved greater than90 accuracy the palm print image was captured in a semi-closed environment in a boxlike setup with an illuminationsource on topThis setup results in clean images with prefixedillumination settings [22] The high accuracy is not reflectiveof the real environment In the present study an Androidsmart phone was used to capture the images allowing usersto easily access their system every day Because the imageswere captured in the real environment they were exposed todifferent levels of noises and blurring because of variations inillumination background and focus Noise can also be dueto bit errors in transmission or introduced during the signalacquisition stage

We propose a touchless palm print recognition systemthat can manage real environment variability The two areasdiscussed are image enhancement and classification In imageenhancement a LHEAT technique was used The purposeof LHE is to ensure that the brightness levels are distributedequally [15 23] In the LHE the image is divided into smallblocks or local 119873 times 119872 neighborhood regions Each blockor inner window is surrounded by a larger block or outerwindow which is used to calculate the mapping functionlookup for the inner window To remove the borders ofthe block the mapping function is interpolated betweenneighborhood blocks [15] The LHE is an excellent imageenhancement method However in the palm print imageconsiderable background noise and variation in contrast andillumination exist Occasionally the LHE overenhances theimage contrast and causes degradation of the image [13 1416]Then the binarization technique LAT is applied In LAT

the threshold extracts the useful information from an imagethat has been enhanced by LHE and separates the foregroundfrom the background with nonuniform illumination Severalmethods such as those described in Bersen Niblack ChowandKaneko and Sauvola [24] have been used to calculate thethreshold values Sauvolarsquos method is most frequently usedand was implemented here because of its promising resultsfor degraded images

In the pattern recognition system there are two modesof recognition verification and identification This studyfocuses on the touchless palm print recognition system withidentification modeThe identification mode is the time dur-ing which the system recognizes the userrsquos identity by com-paring the presented sample against the entire database tofind a possible match [2] Choosing the correct classificationmodel becomes an important issue in palm print recognitionto ensure that the system can identify a person in a short timeThe 119896 nearest neighbor (kNN) method is a nonparametricclassifier widely used for pattern classification This classifieris simple and easy to implement [25] Nevertheless thereare some problems with this classifier the performance ofkNN often fails because of the lack of sample distributioninformation [26 27] and not carefully assigning the class labelbefore classification [28] IFkNCN may resolve these limita-tionsThis classifier incorporates centroid-based distance andfuzzy rule approaches with triangle inequality The classifierremoves the training samples that are far from the testingpoint or the query point by setting a threshold The trainingsamples that are located outside of the threshold are calledoutliers and defined as a noisy sample which does not fit tothe assumed class label for the query point By removing theoutliers future processing focuses on the important trainingsamples or candidate training samples and this focus reducesthe computational complexity in the searching stage Thequery point is classified based on the centroid-distance andfuzzy rule system The centroid-distance method is appliedto ensure that the selected training samples are distributedsufficiently in the region of the neighborhoodwith the nearestneighbors located around the query point Consequentlythe fuzzy-based rule is used to solve the ambiguity of theweighting distance between the query point and its nearestneighbors

3 Proposed Method

Figure 2 displays the overall procedure for a touchless palmprint recognition system

In this work a new comprehensive collection of palmprint database was developed This database currently wascontaining 2400 color images corresponding to 40 userswho were Asian race students where each user had 60 palmprint images This database will be released to the publicas benchmark data and it can be downloaded from thewebsite of Intelligent BiometricGroup (IBG)Universiti SainsMalaysia (USM) for research and educational purposes Allthe users who are taking part in the data collection arecompletely voluntary and each volunteer gave verbal consentbefore collecting the image The age of the user ranged from

4 Computational Intelligence and Neuroscience

Hand tracking and ROI segmentation

Image enhancement (i) LHEAT

Feature extraction with PCA

Classification with IFkNCN

Decision

InputPreprocessing

Noise corruption(i) Motion blur(ii) Salt and pepper

Figure 2 Block diagram of a touchless palm print recognition system

Figure 3 Data enrolment process

19 to 23 years An input image is acquired using a HTCOne XAndroidmobile phone with 8megapixels of image resolutionand a stable background The data collection is divided into3 sessions the first session is used for training purpose Thelatter two sessions are used for testing purpose The timeinterval for each session is in two weeksrsquo time

For enrolment process a user needs to follow the instruc-tion displayed on the smart phone screen as shown inFigure 3 Firstly the userwas required to sign in andkey in theimage name Subsequently the users were simply asked to puttheir palm print naturally in front of the acquisition deviceA semitransparent pink color box acts as a constraint box toensure the palm and fingers lie inside the boxThe pixels thatlie outside of the constraint will be cropped So the distancebetween hand and device is set as constant Once the imagewas captured it was saved into the database and this processwas repeated for new image and user

As no peg or other tool is used in the system the usersmay place their hands at different heights above the mobilephone camera The palm image appears large and clear when

the palm is placed near the camera Many line features andridges are captured at near distance However if the hand ispositioned too close to themobile phone the entire handmaynot be captured in the image and someparts of the palmprintimage may be occluded as shown in Figure 4(a) [6] Whenthe hand is moved away from the camera the focus fadesand some print information disappears (Figure 4(b)) [2]Theoptimal distance between the hand and mobile phone is setaccording to the image preview in the enrolment process inFigure 3 enabling the whole hand image to be captured asshown in Figure 4(c) Some examples of image of the wholepalm print are shown in Figure 5

The file were stored in JPEG format Each folder wasnamed as ldquoS xrdquo ldquoS xrdquo represents the identity of the userwhich ranges from 1 to 40 Each folder had 60 palm printimages During preprocessing the image was segmented todetermine the ROI This process is called hand tracking andROI segmentation The image was then corrupted by addingnoises such as motion blur noise and salt and pepper noiseSubsequently the LHEAT method was applied to enhance

Computational Intelligence and Neuroscience 5

(a) (b) (c)

Figure 4 Hand image detection (a) original RGB hand image (b) binarized image Hand position (a) too close (b) too far and (c) suitabledistance

User 1

User 2

User 3

Figure 5 Original hand images captured by a smart phone camera for 5 different samples

the image Then feature extraction was performed Principleanalysis component (PCA) was employed to extract theimage data and reduce the dimensionality of the input dataFinally the image was classified by the IFkNCN classifier

31 Preprocessing There are three major steps in the handtracking and ROI segmentation stage hand image identifi-cation peak and valley detection and ROI extraction [12]In the hand image identification step the RGB image is

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 2: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

2 Computational Intelligence and Neuroscience

HTTP

HTTP

Image username

Authentication

Matching

Acceptreject

Data collection

Palm print database

Input from user (image and username)

Accept or reject

Client(user and smart phone)

Internet(WAP HTTP PHP)

Server(MATLAB and database)

Figure 1 Overall research architecture

as data acquisition preprocessing feature extraction andclassification fast processing algorithms are crucial [10 11]

This paper focuses on solutions for low-quality palmprintimages and computation times and includes a brief discussionof hand tracking and ROI segmentationThe overall researchcan be divided into three parts which are the client or smartphone side internet side and the server side which areillustrated as in Figure 1

For the client side the Android application for capturingbiometric data is developed and it programs by using thelatest few versions of Android OS ranging from version16 to version 412 Its programs support the mobile phonecamera with the resolution up to 32 megapixels hence onlya few smart phones can be used for testing Due to theexisting camera application that varies for almost all smartphones and tablets a customized camera application withthe integration of enrolment and identification functions isdeveloped for this research The internet site is to connectthe communication between smart phone and server and theconnection is done viaWi-Fi and the PHP script is created toinvoke the MATLAB program in the server

The last part is server side where all the MATLAB pro-gramming including hand image identification ROI extrac-tion palm print feature extraction and pattern matchingalgorithms is written The server software used in the projectis free software where a personal computer serves as a serverand has limited access from the client Several palm printfeature extraction algorithms which are based on subspacemethod are developed and evaluated for the fast and efficientmobile biometric system Details of these operations can befound in Ibrahim and Ramli [12]

This study focused on the server side where two majorcontributions that is image enhancement and classificationprocesses have been developed to improve the quality of

touchless palm print recognition systems We propose a localhistogram equalization and adaptive thresholding (LHEAT)technique for image enhancement This technique is animproved version of the local histogram equalization (LHE)and local adaptive thresholding (LAT) techniques Unlikeprevious methods [13ndash16] we used the sliding neighborhoodoperation for faster computation [17] To accelerate therecognition process the improved fuzzy-based 119896 nearestcentroid neighbor (IFkNCN) was used as the classifier for thesystem The sliding neighborhood operation in the LHEATtechnique also reduces the processing time of the imageenhancement stage comparedwith the baseline LHE andLATtechniques

This paper is organized as follows Section 2 presentsrelated works and motivation The proposed classifier for thepalm print recognition system is described in Section 3 Theexperimental results are explained in Section 4 and Section 5summarizes the work

2 Related Works and Motivation

Many methods have been proposed to overcome the chal-lenges associated with palm print recognition Han and Lee[5] described two CMOS web cameras placed in parallelto segment the ROI of 1200 palm print images of identicalsize The first camera captures the infrared image for handdetection and the second camera is used to acquire the colorimage in normal lighting The images are normalized usinginformation on skin color and hand shape The normalizedimages are then segmented to determine the ROI using theordinal code approach and then classified with the Hammingdistance classifier Experimental results have shown that theequal error rate (EER) of the verification test is 054 and thatthe average acquisition time is 12 seconds Feng et al [18]

Computational Intelligence and Neuroscience 3

used the Viola-Jones method [19] to detect the hand posi-tion after capturing 2000 images In this study images wereacquired in different positions with various lighting andcluster backgrounds Subsequently a coarse-to-fine strategywas used to detect the key points on the hand The key handpoints were then verified with the shape context descriptorbefore the images were segmented into the ROIThe boostingclassifier cascade [20] has previously been applied and theaccuracy rate was 938 with a 178ms average processingtime for one image Michael et al [2] described a touchlesspalm print recognition system that was designed usinga low-resolution CMOS web camera to acquire real-timepalm print images A hand tracking algorithm that is skincolor thresholding and hand valley detection algorithm wasdeveloped to automatically track and detect the ROI of thepalm print The Laplacian isotropic derivative operator wasused to enhance the contrast and sharpness of the palm printfeature and a Gaussian low-pass filter was applied to smooththe palm print image and bridge some small gaps in the lineThemodified probabilistic neural network (PNN)was used toclassify the palm print texture The accuracy rate was greaterthan 90 Similar to previous studies Michael et al [21] usedlocal-ridge-enhancement (LRE) to enhance the contrast andsharpness of images of both the right and left handsThe LREwas used to determine which section of the image containsimportant lines and ridge patterns and then amplify onlythose areas The support vector machine (SVM) was usedand the average accuracy rates for the left and right handswere 97 to 98 respectively

Although previous researchers have achieved greater than90 accuracy the palm print image was captured in a semi-closed environment in a boxlike setup with an illuminationsource on topThis setup results in clean images with prefixedillumination settings [22] The high accuracy is not reflectiveof the real environment In the present study an Androidsmart phone was used to capture the images allowing usersto easily access their system every day Because the imageswere captured in the real environment they were exposed todifferent levels of noises and blurring because of variations inillumination background and focus Noise can also be dueto bit errors in transmission or introduced during the signalacquisition stage

We propose a touchless palm print recognition systemthat can manage real environment variability The two areasdiscussed are image enhancement and classification In imageenhancement a LHEAT technique was used The purposeof LHE is to ensure that the brightness levels are distributedequally [15 23] In the LHE the image is divided into smallblocks or local 119873 times 119872 neighborhood regions Each blockor inner window is surrounded by a larger block or outerwindow which is used to calculate the mapping functionlookup for the inner window To remove the borders ofthe block the mapping function is interpolated betweenneighborhood blocks [15] The LHE is an excellent imageenhancement method However in the palm print imageconsiderable background noise and variation in contrast andillumination exist Occasionally the LHE overenhances theimage contrast and causes degradation of the image [13 1416]Then the binarization technique LAT is applied In LAT

the threshold extracts the useful information from an imagethat has been enhanced by LHE and separates the foregroundfrom the background with nonuniform illumination Severalmethods such as those described in Bersen Niblack ChowandKaneko and Sauvola [24] have been used to calculate thethreshold values Sauvolarsquos method is most frequently usedand was implemented here because of its promising resultsfor degraded images

In the pattern recognition system there are two modesof recognition verification and identification This studyfocuses on the touchless palm print recognition system withidentification modeThe identification mode is the time dur-ing which the system recognizes the userrsquos identity by com-paring the presented sample against the entire database tofind a possible match [2] Choosing the correct classificationmodel becomes an important issue in palm print recognitionto ensure that the system can identify a person in a short timeThe 119896 nearest neighbor (kNN) method is a nonparametricclassifier widely used for pattern classification This classifieris simple and easy to implement [25] Nevertheless thereare some problems with this classifier the performance ofkNN often fails because of the lack of sample distributioninformation [26 27] and not carefully assigning the class labelbefore classification [28] IFkNCN may resolve these limita-tionsThis classifier incorporates centroid-based distance andfuzzy rule approaches with triangle inequality The classifierremoves the training samples that are far from the testingpoint or the query point by setting a threshold The trainingsamples that are located outside of the threshold are calledoutliers and defined as a noisy sample which does not fit tothe assumed class label for the query point By removing theoutliers future processing focuses on the important trainingsamples or candidate training samples and this focus reducesthe computational complexity in the searching stage Thequery point is classified based on the centroid-distance andfuzzy rule system The centroid-distance method is appliedto ensure that the selected training samples are distributedsufficiently in the region of the neighborhoodwith the nearestneighbors located around the query point Consequentlythe fuzzy-based rule is used to solve the ambiguity of theweighting distance between the query point and its nearestneighbors

3 Proposed Method

Figure 2 displays the overall procedure for a touchless palmprint recognition system

In this work a new comprehensive collection of palmprint database was developed This database currently wascontaining 2400 color images corresponding to 40 userswho were Asian race students where each user had 60 palmprint images This database will be released to the publicas benchmark data and it can be downloaded from thewebsite of Intelligent BiometricGroup (IBG)Universiti SainsMalaysia (USM) for research and educational purposes Allthe users who are taking part in the data collection arecompletely voluntary and each volunteer gave verbal consentbefore collecting the image The age of the user ranged from

4 Computational Intelligence and Neuroscience

Hand tracking and ROI segmentation

Image enhancement (i) LHEAT

Feature extraction with PCA

Classification with IFkNCN

Decision

InputPreprocessing

Noise corruption(i) Motion blur(ii) Salt and pepper

Figure 2 Block diagram of a touchless palm print recognition system

Figure 3 Data enrolment process

19 to 23 years An input image is acquired using a HTCOne XAndroidmobile phone with 8megapixels of image resolutionand a stable background The data collection is divided into3 sessions the first session is used for training purpose Thelatter two sessions are used for testing purpose The timeinterval for each session is in two weeksrsquo time

For enrolment process a user needs to follow the instruc-tion displayed on the smart phone screen as shown inFigure 3 Firstly the userwas required to sign in andkey in theimage name Subsequently the users were simply asked to puttheir palm print naturally in front of the acquisition deviceA semitransparent pink color box acts as a constraint box toensure the palm and fingers lie inside the boxThe pixels thatlie outside of the constraint will be cropped So the distancebetween hand and device is set as constant Once the imagewas captured it was saved into the database and this processwas repeated for new image and user

As no peg or other tool is used in the system the usersmay place their hands at different heights above the mobilephone camera The palm image appears large and clear when

the palm is placed near the camera Many line features andridges are captured at near distance However if the hand ispositioned too close to themobile phone the entire handmaynot be captured in the image and someparts of the palmprintimage may be occluded as shown in Figure 4(a) [6] Whenthe hand is moved away from the camera the focus fadesand some print information disappears (Figure 4(b)) [2]Theoptimal distance between the hand and mobile phone is setaccording to the image preview in the enrolment process inFigure 3 enabling the whole hand image to be captured asshown in Figure 4(c) Some examples of image of the wholepalm print are shown in Figure 5

The file were stored in JPEG format Each folder wasnamed as ldquoS xrdquo ldquoS xrdquo represents the identity of the userwhich ranges from 1 to 40 Each folder had 60 palm printimages During preprocessing the image was segmented todetermine the ROI This process is called hand tracking andROI segmentation The image was then corrupted by addingnoises such as motion blur noise and salt and pepper noiseSubsequently the LHEAT method was applied to enhance

Computational Intelligence and Neuroscience 5

(a) (b) (c)

Figure 4 Hand image detection (a) original RGB hand image (b) binarized image Hand position (a) too close (b) too far and (c) suitabledistance

User 1

User 2

User 3

Figure 5 Original hand images captured by a smart phone camera for 5 different samples

the image Then feature extraction was performed Principleanalysis component (PCA) was employed to extract theimage data and reduce the dimensionality of the input dataFinally the image was classified by the IFkNCN classifier

31 Preprocessing There are three major steps in the handtracking and ROI segmentation stage hand image identifi-cation peak and valley detection and ROI extraction [12]In the hand image identification step the RGB image is

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 3: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 3

used the Viola-Jones method [19] to detect the hand posi-tion after capturing 2000 images In this study images wereacquired in different positions with various lighting andcluster backgrounds Subsequently a coarse-to-fine strategywas used to detect the key points on the hand The key handpoints were then verified with the shape context descriptorbefore the images were segmented into the ROIThe boostingclassifier cascade [20] has previously been applied and theaccuracy rate was 938 with a 178ms average processingtime for one image Michael et al [2] described a touchlesspalm print recognition system that was designed usinga low-resolution CMOS web camera to acquire real-timepalm print images A hand tracking algorithm that is skincolor thresholding and hand valley detection algorithm wasdeveloped to automatically track and detect the ROI of thepalm print The Laplacian isotropic derivative operator wasused to enhance the contrast and sharpness of the palm printfeature and a Gaussian low-pass filter was applied to smooththe palm print image and bridge some small gaps in the lineThemodified probabilistic neural network (PNN)was used toclassify the palm print texture The accuracy rate was greaterthan 90 Similar to previous studies Michael et al [21] usedlocal-ridge-enhancement (LRE) to enhance the contrast andsharpness of images of both the right and left handsThe LREwas used to determine which section of the image containsimportant lines and ridge patterns and then amplify onlythose areas The support vector machine (SVM) was usedand the average accuracy rates for the left and right handswere 97 to 98 respectively

Although previous researchers have achieved greater than90 accuracy the palm print image was captured in a semi-closed environment in a boxlike setup with an illuminationsource on topThis setup results in clean images with prefixedillumination settings [22] The high accuracy is not reflectiveof the real environment In the present study an Androidsmart phone was used to capture the images allowing usersto easily access their system every day Because the imageswere captured in the real environment they were exposed todifferent levels of noises and blurring because of variations inillumination background and focus Noise can also be dueto bit errors in transmission or introduced during the signalacquisition stage

We propose a touchless palm print recognition systemthat can manage real environment variability The two areasdiscussed are image enhancement and classification In imageenhancement a LHEAT technique was used The purposeof LHE is to ensure that the brightness levels are distributedequally [15 23] In the LHE the image is divided into smallblocks or local 119873 times 119872 neighborhood regions Each blockor inner window is surrounded by a larger block or outerwindow which is used to calculate the mapping functionlookup for the inner window To remove the borders ofthe block the mapping function is interpolated betweenneighborhood blocks [15] The LHE is an excellent imageenhancement method However in the palm print imageconsiderable background noise and variation in contrast andillumination exist Occasionally the LHE overenhances theimage contrast and causes degradation of the image [13 1416]Then the binarization technique LAT is applied In LAT

the threshold extracts the useful information from an imagethat has been enhanced by LHE and separates the foregroundfrom the background with nonuniform illumination Severalmethods such as those described in Bersen Niblack ChowandKaneko and Sauvola [24] have been used to calculate thethreshold values Sauvolarsquos method is most frequently usedand was implemented here because of its promising resultsfor degraded images

In the pattern recognition system there are two modesof recognition verification and identification This studyfocuses on the touchless palm print recognition system withidentification modeThe identification mode is the time dur-ing which the system recognizes the userrsquos identity by com-paring the presented sample against the entire database tofind a possible match [2] Choosing the correct classificationmodel becomes an important issue in palm print recognitionto ensure that the system can identify a person in a short timeThe 119896 nearest neighbor (kNN) method is a nonparametricclassifier widely used for pattern classification This classifieris simple and easy to implement [25] Nevertheless thereare some problems with this classifier the performance ofkNN often fails because of the lack of sample distributioninformation [26 27] and not carefully assigning the class labelbefore classification [28] IFkNCN may resolve these limita-tionsThis classifier incorporates centroid-based distance andfuzzy rule approaches with triangle inequality The classifierremoves the training samples that are far from the testingpoint or the query point by setting a threshold The trainingsamples that are located outside of the threshold are calledoutliers and defined as a noisy sample which does not fit tothe assumed class label for the query point By removing theoutliers future processing focuses on the important trainingsamples or candidate training samples and this focus reducesthe computational complexity in the searching stage Thequery point is classified based on the centroid-distance andfuzzy rule system The centroid-distance method is appliedto ensure that the selected training samples are distributedsufficiently in the region of the neighborhoodwith the nearestneighbors located around the query point Consequentlythe fuzzy-based rule is used to solve the ambiguity of theweighting distance between the query point and its nearestneighbors

3 Proposed Method

Figure 2 displays the overall procedure for a touchless palmprint recognition system

In this work a new comprehensive collection of palmprint database was developed This database currently wascontaining 2400 color images corresponding to 40 userswho were Asian race students where each user had 60 palmprint images This database will be released to the publicas benchmark data and it can be downloaded from thewebsite of Intelligent BiometricGroup (IBG)Universiti SainsMalaysia (USM) for research and educational purposes Allthe users who are taking part in the data collection arecompletely voluntary and each volunteer gave verbal consentbefore collecting the image The age of the user ranged from

4 Computational Intelligence and Neuroscience

Hand tracking and ROI segmentation

Image enhancement (i) LHEAT

Feature extraction with PCA

Classification with IFkNCN

Decision

InputPreprocessing

Noise corruption(i) Motion blur(ii) Salt and pepper

Figure 2 Block diagram of a touchless palm print recognition system

Figure 3 Data enrolment process

19 to 23 years An input image is acquired using a HTCOne XAndroidmobile phone with 8megapixels of image resolutionand a stable background The data collection is divided into3 sessions the first session is used for training purpose Thelatter two sessions are used for testing purpose The timeinterval for each session is in two weeksrsquo time

For enrolment process a user needs to follow the instruc-tion displayed on the smart phone screen as shown inFigure 3 Firstly the userwas required to sign in andkey in theimage name Subsequently the users were simply asked to puttheir palm print naturally in front of the acquisition deviceA semitransparent pink color box acts as a constraint box toensure the palm and fingers lie inside the boxThe pixels thatlie outside of the constraint will be cropped So the distancebetween hand and device is set as constant Once the imagewas captured it was saved into the database and this processwas repeated for new image and user

As no peg or other tool is used in the system the usersmay place their hands at different heights above the mobilephone camera The palm image appears large and clear when

the palm is placed near the camera Many line features andridges are captured at near distance However if the hand ispositioned too close to themobile phone the entire handmaynot be captured in the image and someparts of the palmprintimage may be occluded as shown in Figure 4(a) [6] Whenthe hand is moved away from the camera the focus fadesand some print information disappears (Figure 4(b)) [2]Theoptimal distance between the hand and mobile phone is setaccording to the image preview in the enrolment process inFigure 3 enabling the whole hand image to be captured asshown in Figure 4(c) Some examples of image of the wholepalm print are shown in Figure 5

The file were stored in JPEG format Each folder wasnamed as ldquoS xrdquo ldquoS xrdquo represents the identity of the userwhich ranges from 1 to 40 Each folder had 60 palm printimages During preprocessing the image was segmented todetermine the ROI This process is called hand tracking andROI segmentation The image was then corrupted by addingnoises such as motion blur noise and salt and pepper noiseSubsequently the LHEAT method was applied to enhance

Computational Intelligence and Neuroscience 5

(a) (b) (c)

Figure 4 Hand image detection (a) original RGB hand image (b) binarized image Hand position (a) too close (b) too far and (c) suitabledistance

User 1

User 2

User 3

Figure 5 Original hand images captured by a smart phone camera for 5 different samples

the image Then feature extraction was performed Principleanalysis component (PCA) was employed to extract theimage data and reduce the dimensionality of the input dataFinally the image was classified by the IFkNCN classifier

31 Preprocessing There are three major steps in the handtracking and ROI segmentation stage hand image identifi-cation peak and valley detection and ROI extraction [12]In the hand image identification step the RGB image is

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 4: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

4 Computational Intelligence and Neuroscience

Hand tracking and ROI segmentation

Image enhancement (i) LHEAT

Feature extraction with PCA

Classification with IFkNCN

Decision

InputPreprocessing

Noise corruption(i) Motion blur(ii) Salt and pepper

Figure 2 Block diagram of a touchless palm print recognition system

Figure 3 Data enrolment process

19 to 23 years An input image is acquired using a HTCOne XAndroidmobile phone with 8megapixels of image resolutionand a stable background The data collection is divided into3 sessions the first session is used for training purpose Thelatter two sessions are used for testing purpose The timeinterval for each session is in two weeksrsquo time

For enrolment process a user needs to follow the instruc-tion displayed on the smart phone screen as shown inFigure 3 Firstly the userwas required to sign in andkey in theimage name Subsequently the users were simply asked to puttheir palm print naturally in front of the acquisition deviceA semitransparent pink color box acts as a constraint box toensure the palm and fingers lie inside the boxThe pixels thatlie outside of the constraint will be cropped So the distancebetween hand and device is set as constant Once the imagewas captured it was saved into the database and this processwas repeated for new image and user

As no peg or other tool is used in the system the usersmay place their hands at different heights above the mobilephone camera The palm image appears large and clear when

the palm is placed near the camera Many line features andridges are captured at near distance However if the hand ispositioned too close to themobile phone the entire handmaynot be captured in the image and someparts of the palmprintimage may be occluded as shown in Figure 4(a) [6] Whenthe hand is moved away from the camera the focus fadesand some print information disappears (Figure 4(b)) [2]Theoptimal distance between the hand and mobile phone is setaccording to the image preview in the enrolment process inFigure 3 enabling the whole hand image to be captured asshown in Figure 4(c) Some examples of image of the wholepalm print are shown in Figure 5

The file were stored in JPEG format Each folder wasnamed as ldquoS xrdquo ldquoS xrdquo represents the identity of the userwhich ranges from 1 to 40 Each folder had 60 palm printimages During preprocessing the image was segmented todetermine the ROI This process is called hand tracking andROI segmentation The image was then corrupted by addingnoises such as motion blur noise and salt and pepper noiseSubsequently the LHEAT method was applied to enhance

Computational Intelligence and Neuroscience 5

(a) (b) (c)

Figure 4 Hand image detection (a) original RGB hand image (b) binarized image Hand position (a) too close (b) too far and (c) suitabledistance

User 1

User 2

User 3

Figure 5 Original hand images captured by a smart phone camera for 5 different samples

the image Then feature extraction was performed Principleanalysis component (PCA) was employed to extract theimage data and reduce the dimensionality of the input dataFinally the image was classified by the IFkNCN classifier

31 Preprocessing There are three major steps in the handtracking and ROI segmentation stage hand image identifi-cation peak and valley detection and ROI extraction [12]In the hand image identification step the RGB image is

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 5: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 5

(a) (b) (c)

Figure 4 Hand image detection (a) original RGB hand image (b) binarized image Hand position (a) too close (b) too far and (c) suitabledistance

User 1

User 2

User 3

Figure 5 Original hand images captured by a smart phone camera for 5 different samples

the image Then feature extraction was performed Principleanalysis component (PCA) was employed to extract theimage data and reduce the dimensionality of the input dataFinally the image was classified by the IFkNCN classifier

31 Preprocessing There are three major steps in the handtracking and ROI segmentation stage hand image identifi-cation peak and valley detection and ROI extraction [12]In the hand image identification step the RGB image is

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 6: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

6 Computational Intelligence and Neuroscience

(a) (b) (c) (d)

Figure 6 Hand image detection (a) original RGB hand image (b) binarized image (c) hand contour with the Canny method (d) perfecthand boundary plot

T1

P1P2 P3

P4

T2T3

T4

T5

Figure 7 Five peaks and four valleys indicate the tips and roots ofthe fingers

transformed into a grayscale image and then converted to abinary image Because the lighting conditions in the camerasetup are uncontrolled straightforward hand identification isnot possible Noise results inmany small holesThe noise andunsmooth regions are removed by filling the small holes inthe hand region Once the noise is removed the edge of theimage is detected using the Canny edge detection algorithmThe hand boundary of the image is traced before the perfecthand counter is acquired as shown in Figure 6

Because the image was captured without pegs or guidingbars the palm print alignment varied in each collectionThis variation caused the palm print image to be affected byrotation andmayhamper accurate recognitionTherefore thelocal minima and local maxima methods were used to detectpeaks and valleys [29] As shown in Figure 7 the peak andvalley points in the hand boundary image were sorted andnamed before ROI segmentation

The locations of three reference points P1 P2 and P3need to be detected in order to set up a coordinate systemfor palm print alignment The size of ROI is dynamicallydetermined by the distance between P1 and P3 It makes theROI extraction scale invariant To locate the ROI a line wasdrawn between reference points for example P1 and P3 areshown in Figure 8(a) and labeled as ldquo119889rdquo The image was thenrotated using a command ldquoimrotaterdquo in MATLAB function

in order to ensure that the line was drawn horizontally asshown in Figure 8(b) The rotated image has the same sizeas the input image A square shape was drawn as shown inFigure 8(c) in which the length and width of the square wereobtained as

119886 = 119889+119889

65 (1)

The ROI was segmented and the region outside the squarewas discarded Then the ROI was converted from RGB tograyscale

To investigate the performance of the proposed methodin noisy environments the ROI image was corrupted usingmotion blur noise and salt and pepper noise as shown inFigure 9 The level of source noise (120590) was set to 013

32 Image Enhancement Image enhancement is an impor-tant process that improves the image quality Similar tothe LHE and LAT methods in the LHEAT method theinput image is broken into small blocks or local windowneighborhoods that contain a pixel In the LHEAT the LHEis firstly obtained to ensure an equal distribution of thebrightness levels The LAT is employed to extract the usefulinformation of the image that had been enhanced by theLTE and separated the foreground from the nonuniformillumination background An input image is broken intosmall blocks or local window neighborhoods containing apixelThis is similar in the LHE LAT and LHEAT Each blockis surrounded by a larger block The input image is definedas 119883 isin 119877

119867times119882 with dimensions of 119867 times 119882 pixels and theenhanced image is defined as 119884 isin 119877

119867times119882 with119867 times119882 pixelsThe input image is then divided into the block 119879

119894= 1 119899

of window neighborhoods with the size119908times119908 where119908 lt 119882119908 lt 119867 and 119899 = [(119867 times119882)(119908 times 119908)]

Each pixel in the small block is calculated using amapping function and threshold The size of 119908 shouldbe sufficient to calculate the local illumination level bothobjects and the background [24] However this processresults in a complex computation To reduce the computationcomplexity and accelerate the computation we used thesliding neighborhood operation [17] Figure 10 shows anexample of the sliding neighborhood operation An imagewith a size of 6 times 5 pixels was divided into blocks of window

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 7: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 7

T1

P1P2 P3

d

P4

T2T3

T4

T5

(a)

T1

P1 P2 P3

d

P4

T2 T3

T4

T5

(b)

T1

a

T2 T3

T4

T5

a = d + d65

(c)

Figure 8 ROI segmentation process (a) line drawn from P1 to P3 (b) rotated image (c) ROI selection and detection

(a) (b) (c)

Figure 9 ROI image (a) original (b) degraded with salt and pepper noise (c) degraded with motion blur noise

neighborhoods with a size of 3 times 3 pixels It is shown inFigure 10(a) The 6 times 5 image matrix was first rearranged intoa 30-column (6 times 5 = 30) temporary matrix as shown inFigure 10(b) Each column contained the value of the pixelsin its nine-row (3 times 3 = 9) window The temporary matrixwas then reduced by using the local mean (119872

119894)

119872119894=

1119873

119899

sum

119895=1119908119895 (2)

where 119908 was size of window neighborhoods 119895 was thenumber of pixels contained in each neighborhood 119894 was thenumber of columns in temporary matrix and 119873 was thetotal number of pixels in the block After determining thelocal mean in (2) there was only one row left as shown inFigure 10(c) Subsequently this row was rearranged into theoriginal shape as shown in Figure 10(d)

There are three steps in the LHE technique the probabil-ity density (PD) the cumulative distribution function (CDF)and the mapping function The probability distribution ofimage PD for each block can be expressed as follows

119875 (119894) =119899119894

119873for 119894 = 0 1 119871 minus 1 (3)

where 119899119894is the input pixel number of level 119894 is the input

luminance gray level and 119871 is gray level which is 256Subsequently the LHE uses an input-output mapping

derived from CDF of the input histogram defined as follows

119862 (119894) =

119899

sum

119894=0119875 (119894) (4)

Finally the mapping function is determined from the CDF asfollows

119892 (119894) = 119872+ [(119909119894minus119872)times119862 (119894)] (5)

where119872 is the mean value from (2)Although the image has been enhanced it remainsmildly

degraded because of the background noise and variation incontrast and illuminationThe image was corrupted with twonoises motion blur noise and salt and pepper noise Themedian filter which has a 3 times 3 mask was applied over thegrayscale image For an enhanced image 119892(119894) 119902(119894) is theoutputmedian filter of length 119897 where 119897 is the number of pixelsover which median filtering takes place When 119897 is odd themedian filter is defined as follows

119902 (119894) = median119892 (119894 minus 119896 119894 + 119896) 119896 = (119897 minus 1)2

(6)

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 8: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

8 Computational Intelligence and Neuroscience

j = 1 j = 2 j = 3

j = 4 j = 5 j = 6

j = 7 j = 8 j = 9

w = 3

w = 3

H = 6

W = 5

(a) Original image with window neighborhoods

w1

w2

w3

w4

w5

w6

w7

w8

w9

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30

(b) Temporary matrix

M1M2 M30

(c) One row matrix (d) Rearranged row into the origi-nal shape

Figure 10 The sliding neighborhood operation

When 119897 is even themean of the two values at the center of thesorted sample list is usedThe purpose of filtering is to reducethe effect of salt and pepper noise and the blur of the edge ofthe image

Once the image has been filtered the image is segmentedusing the LAT technique The LAT separates the foregroundfrom the background by converting the grayscale image intobinary form Sauvolarsquos method was applied here resulting inthe following formula for the threshold

119879ℎ (119894) = 119872[1+ 119896 (119885

119877minus 1)] (7)

where119879ℎis the threshold 119896 is a positive value parameter with

119896 = 05 119877 is the maximum value of the standard deviation

whichwas set at 128 for grayscale image and119885 is the standarddeviation which can be found as

119885 = radic1

119873 minus 1

119899

sum

119895=1(119908119895minus119872) (8)

According to (8) the binarization results of Sauvolarsquos methodcan be denoted as follows

119910 (119894) =

1 if 119902 (119894) gt 119879ℎ (119894)

0 otherwise(9)

Figure 11 shows the comparison of output results afterapplying the LHE and LHEAT techniques The detail inthe enhanced image using LHEAT was sharper and finedetails such as ridges were more visible Section 41 depicts

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 9: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 9

(A) (B) (C) (D)

(a) Clean

(A) (B) (C) (D)

(b) Salt and pepper noise

(A) (B) (C) (D)

(c) Motion blur

Figure 11 Comparison of image enhancement (A) original image (B) LHE (C) LAT and (D) LHEAT techniques

the reduction in processing time and increased accuracy byapplying the proposed image enhancement techniques

33 Feature Extraction Touchless palm print recognitionmust extract palm print features that can discriminate oneindividual from another Occasionally the captured imagesare difficult to extract because the line structures are dis-criminated individually The creases and ridges of the palmcross and overlap one another complicating the featureextraction task [30] Recognition accuracy may decrease ifthe extraction is not performed properly

In this paper PCA was applied to create a set of com-pact features for effective recognition This extraction tech-nique has been widely used for dimensionality reduction incomputer vision This technique was selected because thefeatures were more robust compared with other palm printrecognition systems such as eigenpalm [31] Gabor filters[32] Fourier transform [33] and wavelets [34]

The PCA transforms the original data from large spaceto a small subspace using a variance-covariance matrixstructure The first principle component shows the mostvariance while the last few principle components have lessvariance that is usually neglected since it has a noise effect

Suppose a dataset 119909119894where 119894 = 1 2 119873 and 119909

119894is rear-

ranged in 1198752 dimension The PCA first computes the averagevector of 119909

119894and defined as

119909 =1119873

119899

sum

119894=1119909119894 (10)

whereas the deviations from 119909119894can be calculated by subtract-

ing 119909

Φ119894= 119909119894minus119909 (11)

This step obtains a new matrix

119860 = [Φ1 Φ2 Φ119899] (12)

That produces a dataset whose mean is zero 119860 is the 1198752 times 119873dimensions

Next the covariance matrix is computed

119862 =

119873

sum

119894=1Φ119894Φ119894

119879= 119860119860

119879 (13)

However (13) will produce a very large covariance matrixwhich is 1198752 times 119875

2 dimensions This causes the computation

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 10: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

10 Computational Intelligence and Neuroscience

Training samples

Threshold(i) Triangle inequality(ii) Fuzzy rule

Remove outliers

Centroid-distances

Fuzzy-based rule

Building stage

Searching stage

Figure 12 Architecture of the IFkNCN classifier

required to be huge and the systemmay slow down terribly orrun out of memory As a solution the dimensional reductionis employed where the covariance matrix is expressed as

119862 = 119860119879119860 (14)

Thus the lower dimension of covariance matrix in119873 times 119873 isobtained

Next the eigenvalues and eigenvectors of the 119862 are com-puted If the matrix 119881 = (119881

1 1198812 119881

119901) contains the eigen-

vectors of a symmetric matrix119862 then119881 is orthogonal and 119862can be decomposed as

119862 = 119881119863119881119879 (15)

where 119863 is a diagonal matrix of the eigenvalues and 119881 isa matrix of eigenvectors Then the eigenvalues and corre-sponding eigenvectors are sorted in the order to decrease thedimensions Finally the optimum eigenvectors are chosenbased on the largest value of eigenvalues The details of theseprocedures can be found in Connie et al [30]

34 Image Classification This section describes the methodsused for the IFkNCN classifier There were two stages forthis classifier the building stage and the searching stage(Figure 12) In the building stages triangle inequality andfuzzy IF-THEN rules were used to separate the samples intooutliers and train candidate samples For the searching stagethe surrounding rule was based on centroid-distance andthe weighting fuzzy-based rule was applied The query pointwas classified by the minimum distances of the 119896 neighborsand sample placement considering the assignment of fuzzymembership to the query point

Building Stage In this stage the palm print images weredivided into 15 training sets and 40 testing sets The distanceof testing samples or query point and training sets wascalculated and the Euclidean distance was used

Given a query point 119910 and training sets 119879 = 119909119895119873

119895=1 with

119909119895= 1198881 1198882 119888

119872119873 is the number of training sets 119909

119895is the

sample from the training sample119872 is the number class and

119888 is the class label of119872The distance between the query pointand training samples can be determined as follows

119889 (119910 119909119895) = radic(119910 minus 119909

119895)119879

(119910 minus 119909119895) (16)

where 119889(119910 119909119895) is the Euclidean distance119873 is the number of

training samples 119909119895is the training sample and 119910 is the query

pointThedistanceswere sorted in ascending order to determine

the minimum and maximum distance The threshold was setsuch that the training samples fell within a selected thresholddistance and were considered inliers Otherwise they wereconsidered to be outliers To determine the threshold trian-gle inequality was applied The triangle inequality methodrequires that the distance between two objects (referencepoint and training samples reference point and query point)cannot be less than the difference between the distances toany other object (query point and the training samples) [35]More specifically the distance between the query point andtraining samples satisfies the triangle inequality condition asfollows

119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (17)

where119889(119910 119911) is the distance from the query point to referencesample In this study the maximum distance obtained from(16) was assumed to be 119889(119910 119911) For faster computationthe distance between training sample and reference sample119889(119909119895 119911) was discarded To eliminate the computation of

119889(119909119895 119911) (17) was rewritten as follows

2119889 (119910 119909119895) le 119889 (119909

119895 119911) + 119889 (119910 119911) (18)

Because 119889(119910 119909119895) le 119889(119909

119895 119911) the value of 119889(119909

119895 119911) is not

necessary and (18) can be rearranged as follows

119889 (119910 119909119895) le

12119889 (119910 119911) (19)

The choice of threshold values is important because a largethreshold value requires more computation A small thresh-old makes the triangle inequality computation useless Totackle the problem the candidate outlier detection can beexpressed by the fuzzy IF-THEN rules Each input set wasmodeled by two functions as depicted in Figure 13

The membership functions were formed by Gaussianfunctions or a combination of Gaussian functions given bythe following equation

119891 (119909 120590 119888) = 119890minus(119909minus119888)

221205902

(20)

where 119888 indicates the center of the peak and 120590 controls thewidth of the distribution The parameters for each of themembership functions were determined by taking the bestperforming values using the development set [21]

The output membership functions were provided asOutlierness = High Intermediate Low and were modeledas shown in Figure 14 They have distribution functionssimilar to the input sets (which are Gaussian functions)

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 11: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 11

Short Medium Long

00

05

1

5 10 15 20 25Input variable ldquodistancerdquo

(a) The distance parameter

Medium LongClose

0

05

1

0 5 10 15 20 25Input variable ldquothresholdrdquo

(b) The threshold parameter

Figure 13 Input membership function

Low Intermediate High

00

01

05

02 03 04 05 06 07 08 09 1

1

Output variable ldquooutliernessrdquo

Figure 14 Output membership function

The training sample was determined as an outlier if thedistance of the training sample was long and the thresholdwas far and vice versa

The Mamdani model was used to interpret the fuzzyset rules This technique was used because it is intuitiveand works well with human input Nine rules were usedto characterize the fuzzy rules The main properties are asfollows

(i) If the distance is short and threshold is small thenoutlierness is low

(ii) If the distance is short and threshold is large thenoutlierness is intermediate

(iii) If the distance is long and threshold is small thenoutlierness is intermediate

(iv) If the distance is long and threshold is far thenoutlierness is high

The defuzzified output of the fuzzy procedure is influenced bythe value of 119889(119910 119909

119895) and 119889(119910 119911) The fuzzy performance with

a training sample with 119889(119910 119909119895) = 631 and reference sample

with 119889(119910 119911) = 20 is shown in Figure 15 The outlierness was0381 and the training sample was accepted as a candidatetraining sample By removing the outlier future processingonly focuses on the candidate training samples

Searching Stage A surrounding fuzzy-based rule was pro-posed in which the rule is modified by the surrounding ruleand the applied fuzzy rule The main objective of this stage

was to optimize the performance results while consideringthe surrounding fuzzy-based rules which are as follows

(i) The 119896 centroid nearest neighbors should be as close tothe query point as possible and located symmetricallyaround the query point

(ii) The query point is classified by considering the fuzzymembership values

Given a query point 119910 a set of candidate training samples119879 = 119909

119895isin 119877119898119873

119895=1 with 119909

119895= 1198881 1198882 119888

119872 where 119873 is the

number of training samples 119909119895is the training sample 119872 is

the number of classes and 119888 is the class label of 119872 theprocedures of the IFkNCN in building stage can be definedas follows

(i) Select the candidate training sample as the firstnearest centroid neighbor by sorting the distance ofthe query point and candidate training sample Let thefirst nearest centroid neighbor be 119909

1

NCN(ii) For 119896 = 2 find the first centroid of 119909

1

NCN and theother candidate training samples are given as follows

1199092119862=1199091

NCN+ 119909119895

2 (21)

(iii) Then determine the second nearest centroid neigh-bors by finding the nearest distance of the firstcentroid and query point

(iv) For 119896 gt 2 repeat the second step to find the othernearest centroid neighbors by determining the cen-troid between the training samples and previousnearest neighbors

119909119896

119888=1119896

119896

sum

119894=1119909119895

NCN+119909119895 (22)

(v) Let the set of 119896 nearest centroid neighbors119879119895119896

NCN(119910) = 119909

119895119896

NCNisin 119877119898119896

119895=1 and assign the fuzzymembership of the query point in every 119896 nearest

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 12: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

12 Computational Intelligence and Neuroscience

0

1

2

3

4

5

6

7

8

9

1 0 1

0 1

Distance = 631 Threshold = 20 Outlierness = 0381

Figure 15 Example of the fuzzy IF-THEN rules

centroid neighbor The fuzzy membership is as fol-lows

119906119894

NCN(119910) =

sum119896

119895=1 119906119894119895 (110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1)

)

sum119896

119895=1 110038171003817100381710038171003817119910 minus 119909119895119896

NCN100381710038171003817100381710038172(119898minus1) (23)

where 119894 = 1 2 119888 119888 is the number of classes 119906119894119895

is the membership degree of training sample 119909119895119896

selected as the nearest neighbor 119910 minus 119909119895119896

NCN is the

119871-norm distance between the query point 119909 and itsnearest neighbor and119898 is a fuzzy strength parameterwhich is used to determine how heavily the distance

is weighted when calculating each neighborrsquos contri-bution to the fuzzy membership values

(vi) For the value of the fuzzy strength parameter thevalue of119898 is set to 2 If119898 is 2 the fuzzy membershipvalues are proportional to the inverse of the squareof the distance providing the optimal result in theclassification process

(vii) There are two methods to define 119906119894119895 One definition

uses the crisp membership in which the trainingsamples assign all of the memberships to their knownclass and nonmemberships to other classesThe otherdefinition uses the constraint of fuzzy membership

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 13: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 13

Table 1 Performance with different sizes of the window neighborhood

119908 3 11 15 19

Image

Time (s) 007 084 109 230

that is when the 119896 nearest neighbors of each trainingsample are found (say 119909

119896) the membership of 119909

119896in

each class can be assigned as follows

119906119894119895(119909119896) =

051 + 049 (119899119895

119896) 119895 = 119894

049 (119899119895

119896) 119895 = 119894

(24)

where 119899119895denotes the number of neighbors of the 119895th

training samplesThe membership degree 119906

119894119895was defined using the

constraint of fuzzy membership The fuzzy member-ship constraint ensures that higher weight is assignedto the training samples in their own class and thatlower weight is assigned to the other classes

(ix) The query point to the class label can be classified byobtaining the highest fuzzy membership value

119862 (119910) = argmax (119906119894

NCN(119910)) (25)

(x) Repeat steps (i) to (vii) for a new query point

4 Experimental Results

As mentioned in Section 3 this study was conducted basedon 2400 palm print images from 40 users For each user15 images from the first session were randomly selectedfor training samples and the remaining 40 images fromthe second and third session were used as testing samplesTherefore a total of 600 (15 times 40) and 1600 (40 times 40) imageswere used in the experiment In order to gain an unbiasedestimate of the generalization accuracy the experiment wasthen run 10 times The advantage of this method is that all ofthe test sets are independent and the reliability of the resultscan be improved

Two major experiments image enhancement and imageclassification were conducted to evaluate the proposedtouchless palm print recognition system In the imageenhancement experiment three experiments were per-formed The first experiment determined the optimal size of

the window neighborhood for the LHEAT technique Thesecond experiment validated the usefulness of the imageenhancement technique by comparing the results with andwithout applying the image enhancement technique Thethird experiment compared the proposed LHEAT techniquewith the LHE [23] and LAT [24] techniques In the imageclassification the first experiment determined the optimalvalue of 119896 and size of feature dimensions for the IFkNCNclassifier and compared the performance of the IFkNCNwithkNN [25] 119896 nearest centroid neighborhood (kNCN) [27]and fuzzy kNN (FkNN) [28] classifiers

The performance for both image enhancement and imageclassification experiments was evaluated based on processingtime and classification accuracy (119862

119860) where the119862

119860is defined

as follows

119862119860=119873119862

119873119879

times 100 (26)

where 119873119862is the number of query points which is classified

correctly and119873119879is the total number of the query points

All experiments were performed in MATLAB R2007 (b)and tested on Intel Core i7 21 GHz CPU 6G RAM andWindows 8 operating system

41 Image Enhancement To determine the optimal size ofwindow neighborhood 119908 for the proposed method a cleanimage was obtained and the values of 119908 were set to 3 9 15and 19 The performance result was based on image qualityand processing time The results are shown in Table 1 Thewindow neighborhood of 119908 = 15 provided the best imagequality Although the image quality for119908 = 19 was similar to119908 = 15 the processing time was longer Therefore to size thewindow neighborhood 119908 = 15 was used in the subsequentexperiments

This section also validates the utility of the imageenhancement techniques discussed in Section 32 In thisexperiment the palm print features were extracted usingPCA with a feature dimension size fixed at 80 Then theIFkNCN classifier was obtained in which the value of 119896was set to 5 Table 2 shows the performance results withand without applying the image enhancement techniquesAn improvement gain of approximately 361 in the 119862

119860was

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 14: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

14 Computational Intelligence and Neuroscience

Table 2 Comparison of the image enhancement techniques

MethodCA ()

Clean Salt andpepper noise

Motion blurnoise

Without imageenhancement 9640 plusmn 114 8640 plusmn 207 8880 plusmn 148

With LHEATtechnique 9842 plusmn 055 9040 plusmn 089 9360 plusmn 089

968 9723 9868

8742 8805

9445

8468643

9122

75

80

85

90

95

100

LHE LAT LHEAT

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 16 Performance of the LHE LAT and LHEAT methods for119862119860

achieved when the proposed image enhancement methodwas applied Although the performance decreased becauseof degradation in image quality in the corrupted image theimage enhancement technique was able to recover more than90 of the image compared with results without the imageenhancement technique

The next experiment investigated how the proposedLHEAT technique compared with previous techniques suchas LHE and LAT The settings used in this experiment werethe same as in the previous experiments The result of thethree experiments is shown in Figure 16 LHEAT performedbetter than LHE and LAT yielding a 119862

119860of more than

90 for the clean and corrupted images LHE enhancedbrightness levels by distributing the brightness equally andrecovered original images that were over- and underexposedWhen LAT was applied the threshold changed dynamicallyacross the image LAT can remove background noise andvariations in contrast and illumination LHE and LAT inLHEATcomplement one another and yield promising results

LHEAT gives another advantage over other methodsin terms of its simplicity in computation Normally LHEand LAT require a time complexity of 119874(1199082 times 119899

2) with an

image of size (119899 times 119899) with a size of window neighborhood(119908 times 119908) However in the proposed LHEAT technique thetime complexity is 119874(1198992) because the sliding neighborhoodis only used to obtain local mean (119872) and local standarddeviation (119885) Hence the time required for LHEAT is muchcloser to global techniques Figure 17 shows a comparison ofcomputation times during the image enhancement processThe LHEAT technique outperformed the LHE and LATtechniques

952315242

2134

10965

29376

2645

1046

28643

27220

50100150200250300350

LHE LAT LHEAT

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 17 Performance of LHE LAT and LHEAT in processingtime

42 Image Classification Following the image enhancementexperiments the efficiency and robustness of the proposedIFkNCN classifier were evaluated The first experiment inthis section determined the optimal 119896 value for the IFkNCNclassifier To avoid situations in which the classifier ldquotiesrdquo(identical number of votes for two different classes) an oddnumber for 119896 such as 1 3 5 7 9 11 13 15 and 17 was usedand the size of feature dimensionwas fixed to 80 Comparisonresults are summarized in Table 3 IFkNCN achieved thehighest 119862

119860results when 119896 was 5 and 7 The best 119862

119860values

were 9854 plusmn 084 (119896 = 5) 9402 plusmn 054 (119896 = 5) and9120 plusmn 110 (119896 = 7) for clean salt and pepper noise andmotion blur images respectively Because there was only a012 difference between 119896 = 7 and 119896 = 5 for IFkNCN inmotion blur images the value of 119896 is set to 5 to ease thecalculation in the subsequent experiments The results alsoshowed that increasing the value of 119896 further lowers the119862119860 When 119896 increases the number of nearest neighbors

of the query point also increases In this situation sometraining samples from different classes which have similarcharacteristics were selected as the nearest neighbor andthese training samples were defined as overlapping samplesMisclassification often occurs near class boundaries in whichan overlap occurs

The second experiment determined the optimal featuredimension size for the IFkNCN classifier The 119896 value was setto 5 and the size of the feature dimensionwas set to 20 60 80100 and 120 The results are shown in Table 4 As expectedthe palm print recognition achieved optimal results whenthe size of the feature dimension was set to 120 Howeverthe value also had the highest processing time When thefeature dimension was set to 100 the processing time wasreduced twofold lower than the feature dimension of 120Thedifference in 119862

119860between the 100 and 120 feature dimensions

was relatively small (approximately 010) Therefore afeature dimension of 100 was selected as the optimal value forIFkNCN and this size was used for the next experiment

The subsequent experiment evaluated the proposed clas-sifier A comparison of IFkNCN with other previous nearestneighbor classifiers such as kNN kNCN and FkNN wasperformed The optimal parameter values that is 119896 = 5 and

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 15: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 15

Table 3 Comparison of the CA results for different 119896 values (results are in )

Image k = 1 k = 3 k = 5 k = 7 k = 9 k = 11 k = 13 k = 15 k = 17Clean 9602 plusmn 114 9635 plusmn 095 9854 plusmn 084 9812 plusmn 098 9767 plusmn 116 9658 plusmn 124 9634 plusmn 064 9682 plusmn 114 9634 plusmn 102Salt andpepper 9112 plusmn 082 9354 plusmn 126 9402 plusmn 054 9384 plusmn 096 9321 plusmn 112 9315 plusmn 145 9302 plusmn 098 9234 plusmn 126 9189 plusmn 066

Motion blur 8802 plusmn 134 8972 plusmn 122 9108 plusmn 098 9120 plusmn 110 9033 plusmn 088 8978 plusmn 045 8954 plusmn 066 8896 plusmn 182 8902 plusmn 182

Table 4 Comparison of IFkNCN of different feature dimension values

Dim Clean Salt and pepper Motion blurTime (s) CA () Time (s) CA () Time (s) CA ()

20 065 9332 plusmn 122 074 9150 plusmn 201 099 8962 plusmn 15240 086 9356 plusmn 100 083 9206 plusmn 188 103 9012 plusmn 09460 117 9534 plusmn 094 115 9295 plusmn 105 164 9095 plusmn 13280 154 9864 plusmn 126 144 9367 plusmn 122 171 9102 plusmn 098100 132 9896 plusmn 055 146 9411 plusmn 114 192 9245 plusmn 114120 543 9902 plusmn 125 498 9421 plusmn 135 524 9249 plusmn 132

9125

9425 94869878

87438945 8951

9424

88358976 8982 9365

75

80

85

90

95

100

kNN kNCN FkNN IFkNCN

Accu

racy

()

CleanSalt and pepperMotion blur

Figure 18 Comparison of IFkNCN with kNN kNCN and FkNNclassifiers based on 119862

119860

a feature dimension of 100 were used The overall perfor-mance results based on 119862

119860are described in Figure 18 By

utilizing the strength of the centroid neighborhood whilesolving the ambiguity of the weighting distance between thequery point and its nearest neighbors the IFkNCN classifieroutperformed the kNN kNCN and FkNN classifiersThe119862

119860

of the IFkNCN increased approximately 753 681 and53 in the clean salt and pepper and motion blur imagesrespectively compared with kNN kNCN and FkNN

In addition to better accuracy the proposed IFkNCNclassifier also had better processing times in all conditionsas shown in Figure 19 By using the triangle inequality andfuzzy IF-THEN rules the training samples that were notrelevant to additional processing were removed Accuracydid not decrease but the processing time was 239 s whereasthe processing times for kNN kNCN and FkNN were 782 s10917 s and 959 s respectively

The time required to execute each process that is imagepreprocessing image enhancement feature extraction and

705

12053

956 245835

10693

96 262806

10004

962 211020406080

100120140

kNN kNCN FkNN IFkNCN

Proc

essin

g tim

e (s)

CleanSalt and pepperMotion blur

Figure 19 Comparison of IFkNCN with the kNN kNCN andFkNN classifiers based on processing time

184

255

186

115

187

249

188

123

186

243

186

131

kNN kNCN FkNN IFkNCN

CleanSalt and pepperMotion blur

0

50

100

150

200

250

300

Proc

essin

g tim

e (m

s)

Figure 20 Processing speed of a touchless palm print system

image classification in the touchless palm print recognitionis shown in Figure 20 The reported time is the averagetime required to process an input image from a userThe total time to identify a user was less than 130ms

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 16: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

16 Computational Intelligence and Neuroscience

The speed demonstrated by the proposed system demon-strates that it has the potential for implementation in real-world applications

5 Conclusions and Future Works

This paper presents a touchless palm print recognitionmethod using anAndroid smart phoneThe proposed systemis accessible and practical In addition the device is cost-effective and does not require expensive hardwareThis paperfocused on image enhancement and image classificationTo enhance the quality of the acquired images we proposethe LHEAT technique Because the sliding neighborhoodoperation is applied in the LHEAT technique the compu-tation was much faster compared with previous techniquessuch as LHE and LAT The proposed technique was alsoable to reduce noise and increase the dominant line edgesin the palm print image Moreover this method workswell in noisy environments This paper also presents anew type of classifier called IFkNCN that has advantagescompared with the kNN classifier The major advantage ofthe IFkNCN classifier is that it can remove the outliersand that its computation is efficient Extensive experimentswere performed to evaluate the performance of the systemin terms of image enhancement and image classificationThe proposed system exhibits promising results Specificallythe 119862

119860with the LHEAT technique was more than 90

and the processing time was threefold lower than with theLHE and LAT methods In addition the 119862

119860achieved by

the IFkNCN method was improved to more than 90 forclean and corrupted images and the processing time wasless than 120ms which was substantially less compared withthe other tested classifiers The proposed touchless palmprint system is convenient and able to manage real-timerecognition challenges such as environmental noise andlighting changes

Although the purpose of this research has been achievedthere are some aspects that need to be taken into con-sideration for future work Firstly in order to ensure thedevelopment of touchless palm print system is more appli-cable in real application experiment in various types ofnoises needs to be extracted before the ROI extraction Sothe filtered process can be improved before the subsequentprocess is applied Secondly additional algorithms in theimage enhancement can be added to improve the LHEATperformance especially when the image is captured in var-ious types of illumination background and focus Howeveraddition of other algorithms may slow down the speed ofthis technique Thus this problem should be considered ifthe online or real-time processing algorithm is requiredFor the classification process the code optimization couldbe conducted to increase the computational efficiency ofthe IFkNCN classifier during the searching stage Since thecomplexity of each training sample in searching stage ishigh the code optimization process will be beneficial inoffering better solution to overcome this complexity prob-lem

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

The authors would like to express their gratitude for thefinancial support provided by Universiti Sains MalaysiaResearch University Grant 814161 and Research UniversityPostgraduate Grant Scheme 8046019 for this project

References

[1] Y Zhou Y Zeng and W Hu ldquoApplication and development ofpalm print researchrdquo Technology and Health Care vol 10 no 5pp 383ndash390 2002

[2] G K OMichael T Connie and A B J Teoh ldquoTouch-less palmprint biometrics novel design and implementationrdquo Image andVision Computing vol 26 no 12 pp 1551ndash1560 2008

[3] P Somvanshi and M Rane ldquoSurvey of palmprint recognitionrdquoInternational Journal of Scientific amp Engineering Research vol 3no 2 p 1 2012

[4] H Imtiaz and S A Fattah ldquoA wavelet-based dominant featureextraction algorithm for palm-print recognitionrdquoDigital SignalProcessing vol 23 no 1 pp 244ndash258 2013

[5] W-Y Han and J-C Lee ldquoPalm vein recognition using adaptiveGabor filterrdquo Expert Systems with Applications vol 39 no 18pp 13225ndash13234 2012

[6] G K O Michael C Tee and A T Jin ldquoTouch-less palm printbiometric systemrdquo inProceedings of the International ConferenceonComputer VisionTheory andApplications pp 423ndash430 2005

[7] H Sang Y Ma and J Huang ldquoRobust palmprint recognitionbase on touch-less color palmprint images acquiredrdquo Journal ofSignal and Information Processing vol 4 no 2 pp 134ndash139 2013

[8] XWuQ Zhao andWBu ldquoA SIFT-based contactless palmprintverification approach using iterative RANSAC and local palm-print descriptorsrdquo Pattern Recognition vol 47 pp 3314ndash33262014

[9] A K Jain and J Feng ldquoLatent palmprintmatchingrdquo IEEETrans-actions on Pattern Analysis and Machine Intelligence vol 31 no6 pp 1032ndash1047 2009

[10] L Fang M K H Leung T Shikhare V Chan and K F ChoonldquoPalmprint classificationrdquo in Proceedings of the IEEE Interna-tional Conference on Systems Man and Cybernetics pp 2965ndash2969 October 2006

[11] H Imtiaz and S A Fattah ldquoA spectral domain dominant featureextraction algorithm for palm-print recognitionrdquo InternationalJournal of Image Processing vol 5 pp 130ndash144 2011

[12] S Ibrahim and D A Ramli ldquoEvaluation on palm-print ROIselection techniques for smart phone based touch-less biomet-ric systemrdquo American Academic amp Scholarly Research Journalvol 5 no 5 pp 205ndash211 2013

[13] T Celik ldquoTwo-dimensional histogram equalization and con-trast enhancementrdquo Pattern Recognition vol 45 no 10 pp3810ndash3824 2012

[14] M Eramian and D Mould ldquoHistogram equalization usingneighborhood metricsrdquo in Proceedings of the 2nd CanadianConference on Computer and Robot Vision pp 397ndash404 May2005

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 17: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Computational Intelligence and Neuroscience 17

[15] B Kang C Jeon D K Han and H Ko ldquoAdaptive height-mod-ified histogram equalization and chroma correction in YCbCrcolor space for fast backlight image compensationrdquo Image andVision Computing vol 29 no 8 pp 557ndash568 2011

[16] T R Singh S Roy O I Singh andK Singh ldquoA new local adapt-ive thresholding technique in binarizationrdquo International Jour-nal of Computer Science Issues vol 8 no 6 p 271 2012

[17] J L Semmlow Biosignal and Medical Image Processing CRCPress 2011

[18] Y Feng J Li L Huang and C Liu ldquoReal-time ROI acquisitionfor unsupervised and touch-less palmprintrdquoWorld Academy ofScience Engineering and Technology vol 78 pp 823ndash827 2011

[19] P Viola and M Jones ldquoRapid object detection using a boostedcascade of simple featuresrdquo in Proceedings of the IEEE ComputerSociety Conference on Computer Vision and Pattern Recognition(CVPR rsquo01) pp I511ndashI518 December 2001

[20] N Vasconcelos and M J Saberian ldquoBoosting classifier cas-cadesrdquo in Advances in Neural Information Processing Systemspp 2047ndash2055 2010

[21] GKOMichael T Connie andA B J Teoh ldquoA contactless bio-metric system using multiple hand featuresrdquo Journal of VisualCommunication and Image Representation vol 23 no 7 pp1068ndash1084 2012

[22] C Methani Camera based palmprint recognition [Doctoral Dis-sertation] International Institute of Information TechnologyHyderabad India 2010

[23] H Zhu F H Y Chan and F K Lam ldquoImage contrast enhance-ment by constrained local histogram equalizationrdquo ComputerVision and Image Understanding vol 73 no 2 pp 281ndash2901999

[24] Y-T Pai Y-F Chang and S-J Ruan ldquoAdaptive thresholdingalgorithm efficient computation technique based on intelligentblock detection for degraded document imagesrdquo Pattern Recog-nition vol 43 no 9 pp 3177ndash3187 2010

[25] T Cover and P Hart ldquoNearest neighbor pattern classificationrdquoIEEE Transactions on Information Theory vol 13 no 1 pp 21ndash27 1967

[26] X Wu V Kumar J Ross Quinlan et al ldquoTop 10 algorithms indata miningrdquo Knowledge and Information Systems vol 14 no 1pp 1ndash37 2008

[27] B B Chaudhuri ldquoA new definition of neighborhood of a pointin multi-dimensional spacerdquo Pattern Recognition Letters vol 17no 1 pp 11ndash17 1996

[28] J Wang P Neskovic and L N Cooper ldquoImproving nearestneighbor rule with a simple adaptive distance measurerdquo PatternRecognition Letters vol 28 no 2 pp 207ndash213 2007

[29] L Q Zhu and S Y Zhang ldquoMultimodal biometric identificationsystembased on finger geometry knuckle print and palmprintrdquoPattern Recognition Letters vol 31 no 12 pp 1641ndash1649 2010

[30] T Connie A Teoh M Goh and D Ngo ldquoPalmprint recogni-tion with PCA and ICArdquo in Proceedings of the Image and VisionComputing Palmerston North New Zealand 2003

[31] G Lu D Zhang and K Wang ldquoPalmprint recognition usingeigenpalms featuresrdquo Pattern Recognition Letters vol 24 no 9-10 pp 1463ndash1467 2003

[32] W K Kong D Zhang andW Li ldquoPalmprint feature extractionusing 2-D Gabor filtersrdquo Pattern Recognition vol 36 no 10 pp2339ndash2347 2003

[33] W Li D Zhang and Z Xu ldquoPalmprint identification by Fouriertransformrdquo International Journal of Pattern Recognition andArtificial Intelligence vol 16 no 4 pp 417ndash432 2002

[34] A Kumar and H C Shen ldquoRecognition of palmprints usingwavelet-based featuresrdquo in Proceedings of the IEEE InternationalConference on Systematic Cybernetics and Informatics (SCI rsquo02)Orlando Fla USA 2002

[35] A Berman and L G Shapiro ldquoSelecting good keys for triangle-inequality-based pruning algorithmsrdquo in Proceedings of theIEEE International Workshop on Content-Based Access of Imageand Video Database pp 12ndash19 Bombay India 1998

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 18: Research Article A Robust and Fast Computation Touchless …downloads.hindawi.com/journals/cin/2015/360217.pdf · A sliding neighborhood operation with local hist ogram equalization,

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014