2013 IEEE RO-MAN: The 22nd IEEE International Symposium on Robot and Human Interactive Communication, Gyeongju, Korea, August 26-29, 2013
TuA1.1P.25
978-1-4799-0509-6/13/$31.00 ©2013 IEEE 220
rates above 80% are achieved when identifying one out of 30-36 subjects [9,10,14,15]. Most approaches segment the continuous data into single strides and calculate the similarity between the test sample and the training templates assembled in the gait gallery. Only a few approaches avoid segmentation into single strides, e.g., [9,16]. The closest match between a test gait cycle and a template in the gait gallery is found by a similarity score applied to pre-processed measures of the IMU [4] or by dynamic time warping [5,7]. Descriptive statistics of each gait cycle are calculated as features in [17], and a decision tree or neural network is used for classification. [10] calculates the correlation between the test cycle and the template separately for the left and the right step. They segment the data into single steps but do not recognize whether a step is a left or a right step. During classification, they correlate the templates of the left and the right step with both steps of the test cycle; the maximum correlation describes the closeness to a template of the gallery. However, they do not consider the correlation between the left step and the right step of the same stride. We investigate whether the asymmetry in walking can be used to identify a person. Therefore, we introduce a feature which correlates the measures of the left step with the measures of the right step of the same stride.
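The stride-level symmetry feature described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name `symmetry_feature`, the nearest-index resampling step, and the use of a per-dimension Pearson correlation are all assumptions.

```python
import numpy as np

def symmetry_feature(stride, step_boundary):
    """Correlate the left-step half of a stride with the right-step half.

    stride:        (T, D) array of IMU samples covering one full stride
    step_boundary: sample index where the second step begins
    Returns one correlation coefficient per sensor dimension; values near
    1 indicate a symmetric gait, lower values indicate asymmetry.
    """
    left = stride[:step_boundary]
    right = stride[step_boundary:]
    # Resample both steps to a common length so they can be correlated.
    n = min(len(left), len(right))
    idx_l = np.round(np.linspace(0, len(left) - 1, n)).astype(int)
    idx_r = np.round(np.linspace(0, len(right) - 1, n)).astype(int)
    return np.array([np.corrcoef(left[idx_l, d], right[idx_r, d])[0, 1]
                     for d in range(stride.shape[1])])
```

A perfectly symmetric stride (identical left and right steps) yields a correlation of 1 in every dimension.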
Related work investigates the following conditions which may affect recognition accuracy: different footwear [30], walking over several days [5,15], and carrying a backpack [8]. Different sensor placement between trials is included when recording over several days [5,11,15] or is considered explicitly for recording in one session [8]. This aspect is important because small variations in the position of the sensor on the body cannot be avoided. We also include slight variations of the sensor placement when recording our dataset. Furthermore, we investigate the robustness of identification against variations in speed. [11] includes slow, normal, and fast gait cycles in the gait gallery and reports an equal error rate of 7%. We further investigate the impact of speed when slow or fast walking is not included in the training data, and how the symmetry-based approach is influenced by speed changes. Furthermore, walking on a curved route versus straight walking has not yet been considered when recording gait data with IMUs. Walking on a curved route is a great challenge for computer vision because the perspective changes; considering this aspect, motion capture with IMUs is preferable. We investigate to what extent an identification approach developed for straight walking can be transferred to walking on a curved route without further modifying the method.
III. SEGMENTATION
In order to realize online recognition and increase the number of gait samples, an auto-segmentation process is necessary. Segmentation is the process of extracting the temporal extents of single strides from continuous time series data of gait. In this paper, we are aiming for a computationally light segmentation method to detect a single stride based only on the sensor at the belt.
A. Significant dimension for segmentation
First, we analyzed single-stride hip movement waveforms of 20 subjects during normal gait and concluded that they share the same significant features. One participant's single-stride data from an IMU attached to the pelvis (forward centered) are extracted manually, and spectral analysis results are shown in Fig. 2 as an example. The z-axis is the axis of the forward direction of movement, and this direction can be determined automatically in the case of arbitrary sensor orientation. As can be seen in Fig. 2, the angular velocity about the z-axis, ω_z(t), has less noise than the other components. Furthermore, ω_z(t) has two principal frequencies, which appear as two positive peaks: the smaller one (around 1.5 Hz) represents the frequency of a single stride, and the larger one (around 3 Hz) represents the frequency of single steps. The waveform of a single stride of ω_z(t) has more obvious features than the other degrees of freedom, even after applying a Butterworth low-pass filter (3 Hz). Therefore, the rotational speed ω_z(t) from the sensor on the pelvis is chosen as the significant dimension for segmentation.
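The spectral analysis used to select the significant dimension can be sketched as below, assuming a plain FFT magnitude peak search and a 3 Hz Butterworth low-pass filter as mentioned in the text. The function names and the peak-selection heuristic are illustrative, not the authors' exact code.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def dominant_frequencies(signal, fs, n_peaks=2):
    """Return the strongest frequencies (Hz) in a gait signal, DC excluded."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(spectrum)[::-1]          # strongest bins first
    return sorted(freqs[order[:n_peaks]])

def lowpass(signal, fs, cutoff=3.0, order=4):
    """Zero-phase 3 Hz Butterworth low-pass, as applied before segmentation."""
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, signal)
```

For a pelvis yaw-rate signal sampled at the 128 Hz used later in the experiment, the two returned peaks should land near the stride frequency (~1.5 Hz) and the step frequency (~3 Hz).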
Figure 2. Frequency analysis and waveform of single strides of a participant.
B. Feature based segmentation
To enhance the features of ω_z(t) of the pelvis, equation (1) is applied after ω_z(t) is filtered. The results of this transformation for one participant are shown in Fig. 3. Furthermore, to make the segment points easy to detect, we can approximately regard the waveform shown in Fig. 4 as the feature f(t) of a single stride.
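Equation (1) itself is not reproduced in this excerpt, so the following is only a generic sketch of locating segment points as the negative peaks of the filtered feature signal (written f(t) here). The use of `scipy.signal.find_peaks` and the minimum-distance heuristic are assumptions; they stand in for the paper's thresholds T1-T4 and M1-M6.

```python
import numpy as np
from scipy.signal import find_peaks

def segment_strides(f, fs, stride_hz=1.5, height=None):
    """Detect candidate stride boundaries as negative peaks of f(t).

    f:         transformed/filtered feature signal
    fs:        sampling frequency (Hz)
    stride_hz: rough stride frequency; peaks closer together than ~70%
               of one stride period are rejected as duplicates.
    Returns sample indices of the detected segment points.
    """
    min_dist = int(0.7 * fs / stride_hz)
    peaks, _ = find_peaks(-f, distance=min_dist, height=height)
    return peaks
```

On a clean 1.5 Hz periodic signal the detector returns one segment point per stride, spaced roughly one stride period apart.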
One gait cycle begins with the heel contact of one foot and ends with the heel contact of the same foot; in this paper, we use the right foot. The gait cycle can also be divided into two phases, the stance phase and the swing phase. The cycle samples of the hip in Fig. 3 are manually extracted based on the ground truth of the right heel contacting the ground. Thus, in Fig. 4, the first negative peak P1 represents the beginning of the stance phase of the right foot with the heel contact of the
V. TRAINING AND CLASSIFICATION
In order to test whether the proposed asymmetry-based features work for human identification, a simple classifier is used: a Bayesian Gaussian classifier for training and classification. Compared to using Hidden Markov Models for gait recognition, our approach is computationally lighter and hence faster. To retain all features, all components of m_i are used for computation without any dimensionality reduction. According to Bayes' rule, the class Y to be identified from the maximum posterior probability of Y given the test data X can be transformed from (3) to (4).
Y(X) = argmax_Y P(Y|X) (3)
Y(X) = argmax_Y [P(X|Y) P(Y)] (4)
To ensure that the prior probability is uniform, one sample is left out from every labeled class in the same order during leave-one-out cross-validation. Therefore, (4) can be simplified to (5) and (6) (i = 1 ... total number of classes):
Y(X) = argmax_i [P(X|Y_i)] (5)
P(X|Y_i) = Gaussian(X; μ_i, σ_i) (6)
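Equations (5) and (6) amount to picking the class whose per-class Gaussian assigns the highest likelihood to the test sample. Below is a minimal sketch assuming a diagonal (independent-dimension) Gaussian per class; the class name and the small variance floor are illustrative additions, not from the paper.

```python
import numpy as np

class GaussianGaitClassifier:
    """Per-class diagonal Gaussian; decision by maximum likelihood (eq. 5-6)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        # Per-class mean and standard deviation of each feature dimension.
        self.mu_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.sigma_ = np.array([X[y == c].std(axis=0) + 1e-9  # variance floor
                                for c in self.classes_])
        return self

    def predict(self, X):
        # log Gaussian(X; mu_i, sigma_i), summed over independent dimensions;
        # the constant term is identical for all classes and can be dropped.
        ll = (-(((X[:, None, :] - self.mu_) / self.sigma_) ** 2).sum(-1) / 2.0
              - np.log(self.sigma_).sum(-1))
        return self.classes_[np.argmax(ll, axis=1)]
```

With a uniform prior (enforced by the balanced leave-one-out split described above), maximizing P(X|Y_i) is equivalent to maximizing the posterior.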
VI. EXPERIMENT
To validate the proposed approach, a human gait database was collected. This experiment was conducted at the University of Waterloo. 20 participants (12 males and 8 females) were asked to walk on a straight route and two different curved routes while wearing IMUs. Participants' demographic information is summarized in Table I. The experiment was approved by the University of Waterloo Research Ethics Board, and signed consent was obtained from all participants.
TABLE I. PARTICIPANTS' DEMOGRAPHIC INFORMATION.

           Age    Height [cm]   Weight [kg]
Mean       27     172           73
Std. dev.  4.84   10.62         9.37
The data collection was carried out with the Shimmer IMU, a small wearable wireless inertial measurement sensor device [25]. It collects linear acceleration and angular velocity and transmits this information via Bluetooth to a nearby computer.
In this experiment, the sampling frequency was set to 128 Hz. Three IMUs were used. The first is attached to the pelvis (forward centered); this sensor was used for both single-stride segmentation and human identification. The second is attached to the right ankle and is used only to provide the ground truth for validating the single-stride segmentation during walking, since the z-component of its acceleration shows a spike due to impact when the right foot contacts the ground. The third sensor is attached to the thorax but is not used in this paper.
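The ankle-based ground truth described above (an acceleration spike at right-foot impact) can be extracted with a simple threshold detector. This sketch, including the k-sigma threshold and the refractory period, is a hypothetical illustration rather than the authors' exact procedure.

```python
import numpy as np

def heel_strikes(acc_z, fs, k=3.0, refractory=0.5):
    """Detect impact spikes in ankle z-acceleration as candidate heel strikes.

    A sample is a candidate when it exceeds mean + k*std of the whole
    signal; detections within `refractory` seconds of the previous one
    are suppressed so each impact is counted once.
    """
    thresh = acc_z.mean() + k * acc_z.std()
    min_gap = int(refractory * fs)
    strikes, last = [], -min_gap
    for n, a in enumerate(acc_z):
        if a > thresh and n - last >= min_gap:
            strikes.append(n)
            last = n
    return np.array(strikes)
```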
A. Straight route
10 rounds of walking on a straight route (approximately 12 m) at different speeds were collected from each participant: normal, slow, and fast. Participants were asked to walk in the order normal, slow, fast in each round, wearing the 3 IMUs; thus, in total, 30 trials were recorded from each participant. Considering that walking speed is related to the personality of the individual, we did not give any explicit guidelines on how fast is 'fast' and how slow is 'slow', so the speed of the gait was chosen by each individual. Additionally, before each round the placement of the IMU on the trunk was altered by asking the participant to move the belt, in order to exclude the possibility that persons are identified based on the sensor placement.
B. Curved route
Participants were asked to walk both clockwise and counter-clockwise on circles of two different sizes: a bigger circle (r = 3.7 m) and a smaller circle (r = 1.7 m). Thus, in total, 4 trials of curved-route walking were recorded per participant.
In summary, there are 30 trials per subject for straight-route walking and 4 trials per subject for curved-route walking. 5 continuous single strides were extracted from the middle of each trial via the segmentation algorithm described in Section III; the number of samples available for validation is shown in Table II. Since we only use the data recorded by the unit attached to the pelvis, each sample contains 6-dimensional time-series data consisting of the 3 dimensions of angular velocity and the 3 dimensions of linear acceleration. Hence, in equation (2), q_i[k] ∈ ℝ^6.
TABLE II. SUMMARY OF THE NUMBER OF SAMPLES PER SUBJECT.

Straight route
Condition   normal   slow   fast
Trial       10       10     10
Stride      50       50     50
Total strides: 150

Curved route
Condition   small counter-clockwise   small clockwise   big counter-clockwise   big clockwise
Trial       1                         1                 1                       1
Stride      5                         5                 5                       5
Total strides: 20
VII. EXPERIMENTAL RESULTS AND VALIDATION
The feature-based segmentation algorithm is applied to all trials collected in our experiment, with the thresholds set to the same values for every speed and participant; the parameter values used are shown in Table III. The segmentation results for Participant 1 (P1) are shown in Fig. 5 as an example. The red lines are the ground truth obtained from the unit attached at the ankle, and the black lines are the segment points obtained from the feature-based algorithm described above.
TABLE III. THRESHOLDS USED IN SEGMENTATION.

Threshold   T1   T2   T3    T4    M1    M2    M3    M4   M5   M6
Value       75   10   230   320   -30   -12   480   60   60   90
• We find a significant dimension (ω_z(t) in this paper) for feature-based segmentation. Identification is based on the observation of a single stride (one left and one right step) after segmentation. This makes it possible to identify people online by producing the identification result right after a single stride. In addition, a voting algorithm can be combined with our approach, identifying people from three or more strides instead of one to enhance robustness.
• The robustness of the approach is tested under different speed conditions and sensor placements. Additionally, we investigate whether walking on a curved line affects the identification process.
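The voting idea mentioned above can be sketched as a simple majority vote over per-stride identification results; the earliest-first tie-breaking rule shown here is an assumption.

```python
from collections import Counter

def vote(stride_predictions):
    """Majority vote over per-stride identification results.

    Ties are broken in favour of the label that appears earliest
    in the prediction list.
    """
    counts = Counter(stride_predictions)
    best = max(counts.values())
    for label in stride_predictions:   # earliest-first tie break
        if counts[label] == best:
            return label
```

Voting over an odd number of strides (e.g. three or five) avoids most ties while keeping the decision latency to a few strides.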
Future tasks include automatically optimizing the thresholds in the segmentation algorithm based on walking speed, finding new gait features that are robust to path curvature, and comparing the proposed approach with other existing approaches.
REFERENCES
[1] D. Gafurov. "A survey of biometric gait recognition: Approaches,
security and challenges," in Proceedings of Annual Norwegian
Computer Science Conference, 2007.
[2] D. Gafurov, E. Snekkenes, and P. Bours. Spoof attacks on gait. IEEE
Transactions on Information Forensics and Security,Special Issue on
Human Detection and Recognition, 2007.
[3] D. Gafurov. Security analysis of impostor attempts with respect to
gender in gait biometrics. In Biometrics: Theory, Applications, and
Systems, First IEEE International Conference on, 2007.
[4] D. Gafurov and P. Bours. Improved hip based individual recognition
using wearable motion recording sensor. In Security Technology,
Disaster Recovery and Business Continuity, volume 122 of
Communications in Computer and Information Science, pages 179-186.
Springer Berlin Heidelberg, 2010.
[5] M. O. Derawi, P. Bours, and K. Holien. Improved cycle detection for
accelerometer based gait authentication. In Intelligent Information
Hiding and Multimedia Signal Processing (IIH-MSP), Sixth
International Conference on, 2010.
[6] T. Kobayashi, K. Hasida, and N. Otsu. Rotation invariant feature
extraction from 3-d acceleration signals. In International Conference
on Acoustics, Speech, and Signal Processing, pages 3684-3687, 2011.
[7] M. O. Derawi, C. Nickely, P. Bours, and C. Busch. Unobtrusive
user-authentication on mobile phones using biometric gait recognition.
In Intelligent Information Hiding and Multimedia Signal Processing
(IIH-MSP), 2010 Sixth International Conference on, 2010.
[8] D. Gafurov, E. Snekkenes, and P. Bours. Gait authentication and
identification using wearable accelerometer sensor. In 5th IEEE
Workshop on Automatic Identification Advanced Technologies, 2007.
[9] J. R. Kwapisz, G. M. Weiss, and S. A. Moore. Cell phone based
biometric identification. In IEEE Fourth International Conference on
Biometrics: Theory, Applications and Systems, 2010.
[10] H. J. Ailisto, M. Lindholm, J. Mantyjarvi, E. Vildjiounaite, and S.
Makela. Identifying people from gait pattern with accelerometers. In
Biometric Technology for Human Identification, SPIE, volume 5779,
2005.
[11] J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H.
Ailisto. Identifying users of portable devices from gait pattern with
accelerometers. In Proc. of IEEE International Conference on
Acoustics, Speech, and Signal Processing, 2005.
[12] L. Rong, Z. Jianzhong, L. Ming, and H. Xiangfeng. A wearable
acceleration sensor system for gait recognition. In 2nd IEEE
Conference on Industrial Electronics and Applications, pages
2654-2659, 2007.
[13] N. Trung, Y. Makihara, H. Nagahara, R. Sagawa, Y. Mukaigawa, and
Y. Yagi. Phase registration in a gallery improving gait authentication.
In the International Joint Conference on Biometrics (IJCB2011). IEEE
and IAPR, 2011.
[14] E. Vildjiounaite, S.-M. Makela, M. Lindholm, R. Riihimaki, V.
Kyllonen, J. Mantyjarvi, and H. Ailisto. Unobtrusive multimodal
biometrics for ensuring privacy and information security with personal
devices. In Pervasive Computing, 4th International Conference,
PERVASIVE, pages 187-201, 2006.
[15] L. Rong, D. Zhiguo, Z. Jianzhong, and L. Ming. Identification of
individual walking patterns using gait acceleration. In The 1st
International Conference on Bioinformatics and Biomedical
Engineering, pages 543-546, 2007.
[16] T. Zhang and G. Venture. Individual recognition from gait using
feature value method. Cybernetics and Information Technology,
Vol. 12, No. 3, pages 86-95, 2012.
[17] B. Huang, M. Chen, P. Huang, and Y. Xu. Gait modeling for human
identification. In IEEE International Conference on Robotics and
Automation, 2007.
[18] N. Trung, Y. Makihara, H. Nagahara, Y. Mukaigawa, and Y. Yagi.
"Performance evaluation of gait recognition using the largest inertial
sensor-based gait database," Proc. of the 5th IAPR Int. Conf. on
Biometrics, Paper ID 182, pages 1-7, New Delhi, India, Mar. 2012.
[19] M. Karg, K. Kühnlenz, and M. Buss. Recognition of affect based on
gait patterns. IEEE Transactions on Systems, Man, and Cybernetics,
Part B: Cybernetics, Vol. 40, No. 4, pages 1050-1061, 2010.
[20] B. Gelder. Towards the neurobiology of emotional body language.
Nature Reviews Neuroscience, vol. 7, pages 242-249,
March 2006.
[21] C. Roether, L. Omlor, A. Christensen, and M. Giese. "Critical features
for the perception of emotion from gait," Journal of Vision, vol. 9, no. 6,
pages 1-32, 2009.
[22] H. Sadeghi. Local or global asymmetry in gait of people without
impairments. Gait & Posture, 17(3), pages 197-204, 2003.
[23] M. K. Seeley, B. R. Umberger, J. L. Clasey, and R. Shapiro. The relation
between mild leg-length inequality and able-bodied gait asymmetry.
Journal of Sports Science and Medicine, 9, pages 572-579, 2010.
[24] K. R. Kaufman, L. S. Miller, and D. H. Sutherland. Gait asymmetry in
patients with limb-length inequality. Journal of Pediatric Orthopaedics,
16(2), pages 144-150, 1996.
[25] A. Burns, B. R. Greene, M. J. McGrath, T. J. O'Shea, B. Kuris, S. M.
Ayer, F. Stroiescu, and V. Cionca. Shimmer: A wireless sensor
platform for noninvasive biomedical research. IEEE Sensors Journal,
10:1527-1534, 2010.
[26] S. Sarkar, P. J. Phillips, Z. Liu, I. R. Vega, P. Grother, and K. W.
Bowyer. "The HumanID gait challenge problem: Data sets,
performance, and analysis." IEEE Trans. on Pattern Analysis and
Machine Intelligence, Vol. 27, No. 2, pages 162-177, 2005.
[27] N. Boulgouris, D. Hatzinakos, and K. N. Plataniotis. "Gait
recognition: A challenging signal processing technology for biometric
identification," IEEE Signal Processing Magazine, Vol. 22, No. 6,
pages 78-90, 2005.
[28] K. Frank, M. Nadales, P. Robertson and M. Angermann. "Reliable
real-time recognition of motion related human activities using MEMS
inertial sensors." Proc. of the 23rd Int. Technical Meeting of the
Satellite Division of the Institute of Navigation, 2010.
[29] C. Strohrmann, H. Harms, C. Setz, and G. Tröster. "Monitoring
kinematic changes with fatigue in running using body-worn
sensors," IEEE Transactions on Information Technology in
Biomedicine, Vol. 16, pages 983-990, 2012.
[30] D. Gafurov and E. Snekkenes. "Towards understanding the uniqueness
of gait biometric," 8th IEEE Int. Conf. on Automatic Face & Gesture
Recognition, pages 1-8, 2008.