
Research Article
Low-Rank Linear Dynamical Systems for Motor Imagery EEG

Wenchang Zhang,1,2 Fuchun Sun,1 Chuanqi Tan,1 and Shaobo Liu1

1 The State Key Laboratory of Intelligent Technology and Systems, Computer Science and Technology School, Tsinghua University, FIT Building, Beijing 100084, China
2 Institute of Medical Equipment, Wandong Road, Hedong District, Tianjin, China

Correspondence should be addressed to Fuchun Sun; fcsun@mail.tsinghua.edu.cn

Received 18 September 2016; Revised 14 November 2016; Accepted 16 November 2016

Academic Editor: Feng Duan

Copyright © 2016 Wenchang Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The common spatial pattern (CSP) and other spatiospectral feature extraction methods have become the most effective and successful approaches to motor imagery electroencephalography (MI-EEG) pattern recognition from multichannel neural activity in recent years. However, these methods require considerable preprocessing and postprocessing, such as filtering, mean removal, and spatiospectral feature fusion, which readily affect the classification accuracy. In this paper, we utilize linear dynamical systems (LDSs) for EEG feature extraction and classification. The LDS model offers several advantages: it generates spatial and temporal feature matrices simultaneously, requires no preprocessing or postprocessing, and has low computational cost. Furthermore, a low-rank matrix decomposition approach is introduced to remove noise and resting-state components in order to improve the robustness of the system. We then propose a low-rank LDSs (LR-LDSs) algorithm that decomposes the LDS feature subspace on a finite Grassmannian and obtains better performance. Extensive experiments are carried out on the public datasets "BCI Competition III Dataset IVa" and "BCI Competition IV Database 2a." The results show that our three proposed methods yield higher accuracies than prevailing approaches such as CSP and CSSP.

1. Introduction

With the development of simpler brain rhythm sampling techniques and powerful low-cost computing equipment over the past two decades, the noninvasive brain-computer interface (BCI) based on electroencephalography (EEG) has attracted more attention than other BCIs such as magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), and near infrared spectroscopy (NIRS). Among various EEG signals, certain neurophysiological patterns can be recognized to determine the user's intentions, such as visual evoked potentials (VEPs), P300 evoked potentials, slow cortical potentials (SCPs), and sensorimotor rhythms. EEG brings hope to patients with amyotrophic lateral sclerosis, brainstem stroke, and spinal cord injury [1]. Motor imagery (MI), known as the mental rehearsal of a motor act without real body movement, represents a new approach to accessing the motor system for rehabilitation at all stages of stroke recovery. People with severe motor disabilities can use EEG-BCI to communicate and control devices and even to restore their motor abilities [2, 3]. Therefore, an increasing number of researchers are working on MI-BCI for stroke patient rehabilitation [4, 5].

MI-BCI concentrates on the sensorimotor μ- and β-rhythms, which exhibit the phenomena known as event-related synchronization (ERS) and event-related desynchronization (ERD). However, MI pattern recognition is still a challenge due to the low signal-to-noise ratio, highly subject-specific data, and low processing speed. For these reasons, more and more digital signal processing (DSP) methods and machine learning algorithms are applied to MI-BCI analysis. Unlike static signals such as images and semantics, EEG signals are dynamic and lie in a spatiotemporal feature space. Thus a large variety of feature extraction algorithms have been proposed, including power spectral density (PSD) values [6, 7], autoregressive (AR) parameters [8, 9], and time-frequency features [10]. For MI-BCI pattern recognition, there are mainly three types of methods: autoregressive components (AR) [11], wavelet transform (WT) [12, 13], and CSP [14, 15].


Because of its effectiveness and simplicity in extracting spatial features, CSP has become one of the most popular and successful solutions for MI-BCI analysis, according to the winners' methods in "BCI Competition III Dataset IVa" [16, 17] and "BCI Competition IV Database 2a" [18, 19]. Therefore, many researchers devote themselves to improving the original CSP method for better performance, for example, the common spatiospectral pattern (CSSP) [20], common sparse spectral spatial pattern (CSSSP) [21], subband common spatial pattern (SBCSP) [22], filter bank common spatial pattern (FBCSP) [23], wavelet common spatial pattern (WCSP) [24], and separable common spatiospectral patterns (SCSSP) [25]. Most of these improved CSP methods fuse spectral and spatial characteristics in a spatiospectral feature space and achieve success in comparison experiments.

Despite its effectiveness in extracting MI-BCI features, CSP requires considerable preprocessing and postprocessing, such as filtering, mean removal, and spatiospectral feature fusion, which readily affect the classification accuracy. In this paper, we utilize linear dynamical systems (LDSs) to process EEG signals in MI-BCI. Although LDSs have succeeded in the field of control, to the best of our knowledge this model has barely been tried for feature extraction in EEG analysis so far. Compared with the CSP method, LDSs have the following advantages: first, an LDS can simultaneously generate a spatiotemporal dual-feature matrix; second, there is no need to preprocess or postprocess the signals, and the raw data can be fed directly into the model; third, it is easy to use and of low cost; last, the features extracted by the LDS are much more effective for classification.

Furthermore, we apply low-rank matrix decomposition approaches [26-28], which can learn a representative matrix even in the presence of corrupted data. The noise in the data can thus be removed, improving robustness. There are two ways to apply low-rank decomposition to EEG: one operates on the raw EEG data, and the other operates on the features extracted from LDSs, which is proposed by us and called low-rank LDSs (LR-LDSs).

This paper makes the following contributions: (1) we utilize LDSs for MI-EEG feature extraction to solve the MI pattern recognition problem; (2) a low-rank matrix decomposition method is applied to improve the robustness of the raw data analysis; (3) we propose LR-LDSs on a finite Grassmannian feature space; (4) extensive comparison experiments demonstrate the effectiveness of these approaches.

The rest of this paper is organized as follows. Section 2 presents the LDS model used for EEG feature extraction. Section 3 presents the low-rank matrix decomposition method for raw EEG data analysis. Section 4 introduces the LR-LDSs method. The classification algorithm is explained in Section 5. Section 6 compares the three proposed methods (LDSs, LR+CSP, and LR-LDSs) with other state-of-the-art algorithms on different databases. Finally, the summary and conclusion are presented in Section 7.

2. LDSs Modeling

LDSs, also known as linear Gaussian state-space models, have been used successfully in modeling and controlling dynamical systems. In recent years, more and more problems extending to computer vision [29, 30], speech recognition [31], and tactile perception [32] have been solved by the LDS model. EEG signals are sequences of sampled brain electrical activity that exhibit typical dynamic textures. We represent the features of EEG dynamic textures by LDS modeling and apply machine learning (ML) algorithms to capture the essence of these dynamic textures for feature extraction and classification.

Let $\{Y(t)\}_{t=1,\dots,\tau}$, $Y(t) \in R^m$, be a sequence of $\tau$ EEG signal samples, one at each time instant $t$. If there is a set of $n$ spatial filters $\varphi_\alpha: R \to R^m$, $\alpha = 1, \dots, n$, we have $x(t) = \sum_{i=1}^{k} A_i x(t-i) + Bv(t)$, with $A_i \in R^{n\times n}$, $B \in R^{n\times n_v}$, and an independent and identically distributed realization term $v(t) \in R^{n_v}$, and we suppose that the sequence of observed variables $Y(t)$ can be represented approximately by a function of the $n$-dimensional hidden state $x(t)$: $y(t) = \varphi(x(t)) + \omega(t)$, where $\omega(t) \in R^m$ is an independent and identically distributed sequence drawn from a known distribution, resulting in a positive measured sequence. We redefine the hidden state $x(t)$ to be $[x(t)^T\ x(t-1)^T \cdots x(t-k)^T]^T$ and consider the linear dynamical system as an autoregressive moving average process without a firm input distribution:

$x(t+1) = A x(t) + B v(t)$, $\quad y(t) = \varphi(x(t)) + \omega(t)$, $\quad x(0) = x_0$,  (1)

with the distributions of $v(t)$ and $\varphi(x(t))$ unknown, however. In order to solve the above problem, we can regard it as a linear dynamical system with white, zero-mean Gaussian noise and propose a simplified, closed-form solution:

$x(t+1) = A x(t) + B v(t)$, $v(t) \sim N(0, Q)$; $\quad y(t) = C x(t) + \omega(t) + \bar{y}$, $\omega(t) \sim N(0, R)$; $\quad x(0) = x_0$,  (2)

where $A \in R^{n\times n}$ is the transition matrix that describes the dynamics, $C \in R^{m\times n}$ is the measurement matrix that describes the spatial appearance, $\bar{y} \in R^m$ is the mean of $y(t)$, and $v(t)$ and $\omega(t)$ are noise components. We estimate the model parameters $A, C, Q, R$ from the measurements $y(1), \dots, y(\tau)$ as the maximum-likelihood solution

$(\hat{A}(\tau), \hat{C}(\tau), \hat{Q}(\tau), \hat{R}(\tau)) = \arg\max_{A,C,Q,R} p(y(1), \dots, y(\tau));$  (3)

however, optimal solutions of this problem bring high computational complexity.

We apply matrix decomposition to simplify the computation through a closed-form solution. The singular value decomposition (SVD) gives the best estimate of $C$ under the Frobenius norm:

$(\hat{C}(\tau), \hat{X}(\tau)) = \arg\min_{C,X} \|W(\tau)\|_F$ subject to $Y(\tau) = C X(\tau) + W(\tau)$, $C^T C = I$.  (4)

Let $Y = U \Sigma V^T$; then we get the parameter estimates

$\hat{C} = U$, $\quad \hat{X} = \Sigma V^T$,  (5)

where $\hat{X} = [X(1)\ X(2) \cdots X(\tau)]$. The transition matrix estimate $\hat{A}(\tau)$ can be determined under the Frobenius norm:

$\hat{A}(\tau) = \arg\min_A \|X_2(\tau) - A X_1(\tau - 1)\|_F,$  (6)

where $X_2(\tau) = [X(2)\ X(3) \cdots X(\tau)]$ and $X_1(\tau-1) = [X(1)\ X(2) \cdots X(\tau-1)]$. So the solution is in closed form using the estimated states:

$\hat{A}(\tau) = [\hat{X}(2)\ \hat{X}(3) \cdots \hat{X}(\tau)] \cdot [\hat{X}(1)\ \hat{X}(2) \cdots \hat{X}(\tau - 1)]^{\dagger},$  (7)

where $\dagger$ denotes the matrix pseudoinverse. We thus obtain the result $[A, C]$, a pair of spatiotemporal feature matrices. The MATLAB program of LDSs can be found in Supplementary Material Algorithm 1, available online at http://dx.doi.org/10.1155/2016/2637603.
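As a concrete illustration of this closed-form identification, the following NumPy sketch estimates $(A, C)$ from a single multichannel trial using (5)-(7). It is an illustrative reimplementation under assumed conventions (the trial $Y$ stored as channels x samples, $n$ denoting the hidden dimension), not the authors' supplementary MATLAB code.

```python
import numpy as np

def estimate_lds(Y, n):
    """Closed-form LDS identification for one EEG trial.

    Y : (m, tau) array, m channels by tau time samples.
    n : hidden-state dimension (the 'hidden parameter' discussed in Section 6).
    Returns the transition matrix A and the measurement matrix C.
    """
    Y = Y - Y.mean(axis=1, keepdims=True)            # remove the channel means (y-bar in (2))
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    C = U[:, :n]                                     # measurement matrix, cf. (5)
    X = np.diag(s[:n]) @ Vt[:n, :]                   # hidden-state sequence, cf. (5)
    X1, X2 = X[:, :-1], X[:, 1:]
    A = X2 @ np.linalg.pinv(X1)                      # transition matrix, cf. (7)
    return A, C

# Example: fit a 5-dimensional LDS to a toy 21-channel, 250-sample trial.
if __name__ == "__main__":
    Y = np.random.default_rng(0).standard_normal((21, 250))
    A, C = estimate_lds(Y, n=5)
    print(A.shape, C.shape)                          # (5, 5) (21, 5)
```

The returned pair $(A, C)$ corresponds to the spatiotemporal feature pair that the classifiers in Section 5 compare.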

3. Low-Rank Matrix Decomposition

EEG signals have poor quality because they are usually recorded by electrodes placed noninvasively on the scalp, so the signals have to cross the scalp, skull, and many other layers. They are therefore severely affected by background noise generated either inside the brain or externally over the scalp. Low-rank (LR) matrix decomposition can often capture the global information by reconstructing the top few singular values and the corresponding singular vectors. This method is widely applied in the fields of image denoising and face recognition (FR). Concretely, low-rank matrix recovery seeks to decompose a data matrix $X$ into $A + E$, where $A$ is a low-rank matrix and $E$ is the associated sparse error. Candes et al. [33] propose to relax the original problem into the following tractable formulation:

$\min_{A,E} \|A\|_* + \alpha \|E\|_1$ s.t. $X = A + E$,  (8)

where the nuclear norm $\|A\|_*$ (the sum of the singular values) approximates the rank of $A$ and the $\ell_1$-norm $\|E\|_1$ is the sparsity constraint.

Zhang and Li [34] decompose each image into a common component, a condition component, and a sparse residual. Siyahjani et al. [35] introduce invariant components into the sparse representation and low-rank matrix decomposition approaches and successfully apply them to computer vision problems; they add an orthogonality constraint to ensure that the invariant and variant components are linearly independent. Therefore, we decompose EEG signals as a combination of three components: a resting-state component, a motor imagery component represented by a low-rank matrix, and a sparse residual. In practice, however, some digital signal processing (DSP), that is, a wavelet transform or discrete Fourier transform, is needed before decomposition; in particular, raw time-domain signals without any preprocessing are not suitable for direct low-rank matrix decomposition. The training dataset $X$ can be decomposed as $X := A + B + E$, where $A \in R^{m\times n}$ is a low-rank matrix that collects the event-related EEG signal components, $B \in R^{m\times n}$ is approximately invariant and denotes the resting-state signal components that are sampled from subjects without any motor imagery, and $E \in R^{m\times n}$ is the matrix of sparse noise. Therefore, the training dataset can be decomposed as the following formulation:

$X := A + B + E.$  (9)

Ideally, each sampling channel of the subject's resting-state EEG is similar; in other words, the difference between any two rows of the resting-state component should be minimal. Hence $B$ is given a commonality constraint of the following form:

$\sum_{i \neq j} \|B_i - B_j\|_F^2.$  (10)

We propose the optimization problem

$\min_{A,B,E} \|A\|_* + \alpha\|E\|_1 + \beta \sum_{i \neq j} \|B_i - B_j\|_F^2$ s.t. $X = A + B + E.$  (11)

Then the augmented Lagrange multiplier (ALM) [36] method is utilized to solve the above problem. The augmented Lagrangian function $L(A, B, E, \lambda)$ is given by

$L(A, B, E, \lambda) = \|A\|_* + \alpha\|E\|_1 + \beta \sum_{i \neq j} \|B_i - B_j\|_F^2 + \langle \lambda, X - A - B - E \rangle + \frac{\mu}{2}\|X - A - B - E\|_F^2,$  (12)

where $\mu$ is a positive scalar and $\lambda$ is a Lagrange multiplier matrix. We employ an inexact ALM (IALM) method, described in Algorithm 1, to solve this problem, where $J(X) = \max(\mathrm{lansvd}(X), \alpha^{-1}\|X\|_F)$ in the initialization $\lambda_0 = X / J(X)$ and $\mathrm{lansvd}(\cdot)$ computes the largest singular value.

Once the low-rank matrix $A$, denoting the event-related EEG signal components, is generated, we can apply feature extraction methods such as CSP and CSSP to classify MI-BCI. In other words, the low-rank matrix decomposition method in this section can be considered a preprocessing step before feature extraction and classification.


Input: observation matrix $X$; penalty weights $\alpha$, $\beta$
(1) $\lambda_0 = X / J(X)$; $\mu_0 > 0$; $\rho > 1$; $k = 0$
(2) while not converged do
(3)   Lines (4)-(12) solve $(A_{k+1}, B_{k+1}, E_{k+1}) = \arg\min_{A,B,E} L(A, B, E, \lambda_k, \mu_k)$
(4)   $j = 0$; $A_k^0 = A_k$; $B_k^0 = B_k$; $E_k^0 = E_k$
(5)   while not converged do
(6)     Lines (7)-(8) solve $A_k^{j+1} = \arg\min_A L(A, B_k^j, E_k^j, \lambda_k, \mu_k)$
(7)     $(U, S, V) = \mathrm{svd}(X - B_k^j - E_k^j + \lambda_k / \mu_k)$
(8)     $A_k^{j+1} = U\, S_{\mu_k^{-1}}[S]\, V^T$
(9)     Line (10) solves $E_k^{j+1} = \arg\min_E L(A_k^{j+1}, B_k^j, E, \lambda_k, \mu_k)$
(10)    $E_k^{j+1} = S_{\alpha\mu_k^{-1}}[X - A_k^{j+1} - B_k^j + \lambda_k / \mu_k]$
(11)    Update $B_k^{j+1}$ by solving $B_k^{j+1} = \arg\min_B L(A_k^{j+1}, B, E_k^{j+1}, \lambda_k, \mu_k)$
(12)    $j \leftarrow j + 1$
(13)  end while
(14)  $A_{k+1} = A_k^{j+1}$; $B_{k+1} = B_k^{j+1}$; $E_{k+1} = E_k^{j+1}$
(15)  $\mu_{k+1} = \rho\mu_k$; $\lambda_{k+1} = \lambda_k + \mu_k(X - A_{k+1} - B_{k+1} - E_{k+1})$
(16)  $k \leftarrow k + 1$
(17) end while
Output: $(A_k, B_k, E_k)$

Algorithm 1: Low-rank decomposition via the inexact ALM method, where $S_\varepsilon[\cdot]$ denotes the soft-thresholding (shrinkage) operator.
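To complement the listing above, the following is a minimal NumPy sketch of the same inexact-ALM machinery for the simpler two-term problem of (8) and (13), that is, splitting a matrix into a low-rank part plus a sparse residual without the resting-state component $B$. The shrinkage operator, the default choices of $\alpha$, $\mu_0$, and $\rho$, and the function names are standard robust-PCA conventions assumed by us, not values taken from the paper.

```python
import numpy as np

def soft_threshold(M, eps):
    """Elementwise shrinkage operator S_eps[M] = sign(M) * max(|M| - eps, 0)."""
    return np.sign(M) * np.maximum(np.abs(M) - eps, 0.0)

def ialm_low_rank_sparse(X, alpha=None, rho=1.5, tol=1e-7, max_iter=500):
    """Inexact ALM for  min ||A||_* + alpha*||E||_1  s.t.  X = A + E  (cf. (8), (13))."""
    m, n = X.shape
    if alpha is None:
        alpha = 1.0 / np.sqrt(max(m, n))                 # common robust-PCA default
    norm_X = np.linalg.norm(X, 'fro')
    J = max(np.linalg.norm(X, 2), np.linalg.norm(X, 'fro') / alpha)
    Lam = X / J                                          # lambda_0 = X / J(X)
    mu = 1.25 / np.linalg.norm(X, 2)                     # illustrative initial penalty
    A = np.zeros_like(X)
    E = np.zeros_like(X)
    for _ in range(max_iter):
        # A-step: singular value thresholding of X - E + Lam/mu
        U, s, Vt = np.linalg.svd(X - E + Lam / mu, full_matrices=False)
        A = U @ np.diag(soft_threshold(s, 1.0 / mu)) @ Vt
        # E-step: elementwise shrinkage of X - A + Lam/mu
        E = soft_threshold(X - A + Lam / mu, alpha / mu)
        R = X - A - E                                    # constraint residual
        Lam = Lam + mu * R                               # dual (multiplier) update
        mu = rho * mu                                    # increase the penalty
        if np.linalg.norm(R, 'fro') / norm_X < tol:
            break
    return A, E
```

In the three-term setting of (11), the same loop would additionally update the resting-state component $B$ in each pass, as in line (11) of Algorithm 1.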

4. LR-LDSs on Finite Grassmannian

Beginning at an initial state $x_1$, the expected observation sequence generated by a time-invariant model $M = (A, C)$ is obtained as $E[y_1, y_2, y_3, \dots] = [C^T\ (CA)^T\ (CA^2)^T \cdots]^T x_1$, which lies in the column space of the extended observability matrix given by $O_\infty = [C^T\ (CA)^T\ (CA^2)^T \cdots]^T \in R^{\infty\times n}$. LDSs can use the extended observability subspace $\mathcal{O}$ as a descriptor, but it is hard to compute. Turaga et al. [37, 38] approximate the extended observability matrix by taking the $L$-order observability matrix, that is, $O(n, L) = [C^T\ (CA)^T \cdots (CA^{L-1})^T]^T$. In this way, an LDS model can alternatively be identified with an $n$-dimensional subspace of $R^{Lm}$.
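As a small illustration of the finite descriptor $O(n, L)$, the sketch below stacks $C, CA, \dots, CA^{L-1}$ for an identified model and orthonormalizes the result so that each trial corresponds to a point on the Grassmann manifold; the QR-based orthonormalization is a standard Grassmannian convention we assume here rather than a detail stated in the paper.

```python
import numpy as np

def finite_observability(A, C, L):
    """Return O(n, L) = [C; CA; ...; CA^(L-1)] and an orthonormal basis of its columns."""
    blocks, CAk = [], C.copy()
    for _ in range(L):
        blocks.append(CAk)
        CAk = CAk @ A                 # next block C A^k
    O = np.vstack(blocks)             # shape (L*m, n)
    Q, _ = np.linalg.qr(O)            # subspace representative on the Grassmannian
    return O, Q
```

The matrix $O$ is what (13) decomposes, while the orthonormal basis $Q$ is convenient for subspace distances such as the Martin Distance of Section 5.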

Given a database of EEG, we can estimate the LDS model and calculate the finite observability matrix, whose spanned subspace is a point on the Riemannian manifold. Then, based on low-rank and sparse matrix decomposition, the observability matrix $O$ can be decomposed into $D + E$ as the following formulation:

$\min_{D,E} \|D\|_* + \alpha\|E\|_1$ s.t. $O = D + E,$  (13)

where $D$ is a low-rank matrix and $E$ is the associated sparse error.

The inexact ALM method can also be used to solve this optimization problem, as in Algorithm 1. The output $D$ represents a low-rank descriptor of the LDS and can be employed for the classification of EEG trials.

5. Classification Algorithm

We extract features by the above LDS model and obtain two feature matrices, $A$ and $C$. Unfortunately, $A$ and $C$ have different modal properties and dimensionalities, so they cannot be represented directly by a feature vector. A Riemannian geometry metric for the space of LDSs is hard to determine and needs to satisfy several constraints. Common classifiers such as Nearest Neighbors (NN), Linear Discriminant Analysis (LDA), and Support Vector Machines (SVM) cannot classify features in matrix form; the feature matrix must be mapped to a vector space. We use the Martin Distance [39, 40], which is based on the principal angles between the subspaces of two extended observability matrices, as a kernel to measure the distance between LDS feature matrices. It is defined as

$D^2(\Theta_a, \Theta_b) = -2 \sum_{i=1}^{n} \log \lambda_i,$  (14)

where $\Theta_a = (C_a, A_a)$, $\Theta_b = (C_b, A_b)$, and the $\lambda_i$ are the eigenvalues obtained by solving the following equation:

$\begin{bmatrix} 0 & \mathcal{O}_{ab} \\ (\mathcal{O}_{ab})^T & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \lambda \begin{bmatrix} \mathcal{O}_{aa} & 0 \\ 0 & \mathcal{O}_{bb} \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}$, s.t. $x^T \mathcal{O}_{aa} x = 1$, $y^T \mathcal{O}_{bb} y = 1,$  (15)

where the extended observability matrices are $O_a = [C_a^T\ A_a^T C_a^T \cdots (A_a^T)^n C_a^T]$, $O_b = [C_b^T\ A_b^T C_b^T \cdots (A_b^T)^n C_b^T]$, and $\mathcal{O}_{ab} = (O_a)^T O_b$ (with $\mathcal{O}_{aa}$ and $\mathcal{O}_{bb}$ defined analogously). Algorithm 2 in the Supplementary Material presents the Martin Distance function programmed in MATLAB.
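Numerically, (14)-(15) can be evaluated through the principal angles between the column spaces of the two finite observability matrices: after orthonormalizing $O_a$ and $O_b$, the singular values of $Q_a^T Q_b$ are the cosines of the principal angles, and the Martin Distance follows from their logarithms. The sketch below takes this route; it is an equivalent subspace-angle reformulation of the generalized eigenvalue problem above, not the authors' supplementary MATLAB routine, and the truncation order L is an illustrative choice.

```python
import numpy as np

def martin_distance(Aa, Ca, Ab, Cb, L=10):
    """Squared Martin distance between two LDS models (A, C) via principal angles."""
    def obs_basis(A, C):
        blocks, CAk = [], C.copy()
        for _ in range(L):
            blocks.append(CAk)
            CAk = CAk @ A
        Q, _ = np.linalg.qr(np.vstack(blocks))              # orthonormal observability basis
        return Q
    Qa, Qb = obs_basis(Aa, Ca), obs_basis(Ab, Cb)
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)    # cosines of principal angles
    cosines = np.clip(cosines, 1e-12, 1.0)                  # guard against round-off
    return -2.0 * np.sum(np.log(cosines))                   # cf. (14)
```

Nearest-neighbor classification as described below then amounts to assigning each test trial the label of the training trial with the smallest martin_distance.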

We can classify EEG signals by comparing the Martin Distance between training data and testing data: the nearest samples are likely to belong to the same class, so the predicted label and the prediction accuracy can be calculated. Algorithm 3 in the Supplementary Material is the KNN classification method.

Considering that the LR-LDSs method generates a single low-rank descriptor on the finite Grassmannian, unlike the pair of feature matrices $(A, C)$ produced by LDSs, the Euclidean Distance and the Mahalanobis Distance can describe the distance between two feature spaces of EEG trials after LR-LDSs. They are simple, efficient, and commonly used for measuring the distance between two points. In order to improve the classification accuracy, we can also employ metric learning methods that use the label information to learn a new metric or pseudometric, such as neighborhood components analysis and large margin nearest neighbor.

Table 1: Experimental accuracy results (%) obtained for each subject in BCI Competition III Dataset IVa for CSP, CSSP, and our proposed algorithms (LDSs, LR+CSP, and LR-LDSs).

Subject    aa       al       av       aw       ay       Mean
CSP        71.43    94.64    61.22    89.28    73.02    77.918
CSSP       77.68    96.43    63.27    90.63    79.37    81.476
LDSs       78.57    96.43    64.29    90.18    79.76    81.846
LR+CSP     77.68    96.43    63.78    90.18    79.76    81.566
LR-LDSs    79.46    98.21    63.78    90.18    80.56    82.438

6. Experimental Evaluation

In the above sections, we have proposed three methods for EEG pattern recognition: LDSs, LR+CSP, and LR-LDSs. Two motor imagery EEG datasets, BCI Competition III Dataset IVa and BCI Competition IV Database 2a, are used to evaluate these three methods against other state-of-the-art algorithms such as CSP and CSSP. All experiments are carried out with MATLAB on an Intel Core i7 2.90-GHz CPU with 8 GB RAM.

6.1. BCI Competition III Dataset IVa. Dataset IVa is recorded from five healthy subjects, labeled "aa," "al," "av," "aw," and "ay," performing right hand and foot motor imagery with visual cues shown for 3.5 s. The EEG signal has 118 channels and markers that indicate the time points of 280 cues for each subject, band-pass filtered between 0.05 and 200 Hz and downsampled to 100 Hz.

Before feature extraction for the comparison experiments, the raw data needs some preprocessing. First, we extract a time segment from 0.5 to 3 s and employ FastICA to remove artifacts arising from eye and muscle movements. Second, we choose 21 channels over the motor cortex (CP6, CP4, CP2, C6, C4, C2, FC6, FC4, FC2, CPZ, CZ, FCZ, CP1, CP3, CP5, C1, C3, C5, FC1, FC3, and FC5) that are related to motor imagery.

In order to improve the performance of CSP and CSSP, we apply a Butterworth filter to the EEG signals within a specific frequency band between 8 and 30 Hz, which encompasses both the alpha rhythm (8-13 Hz) and the beta rhythm (14-30 Hz) related to motor imagery. Then we program MATLAB code to obtain the spatial filter parameters and the variance-based feature vectors. Finally, an LDA classifier is used to find a separating hyperplane for the feature vectors.
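For readers who want to reproduce this baseline, the sketch below shows an equivalent band-pass plus CSP pipeline in Python; the Butterworth order, the zero-phase filtering, and the generalized-eigenvalue formulation of CSP are our implementation choices (the paper only states that a Butterworth filter in the 8-30 Hz band, CSP variance features, and an LDA classifier were used, programmed in MATLAB).

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.linalg import eigh

def bandpass(eeg, fs=100.0, lo=8.0, hi=30.0, order=4):
    """Zero-phase Butterworth band-pass of a (channels x samples) trial."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype='bandpass')
    return filtfilt(b, a, eeg, axis=-1)

def csp_filters(trials_a, trials_b, n_pairs=3):
    """CSP spatial filters from two lists of band-passed (channels x samples) trials."""
    def mean_cov(trials):
        covs = []
        for X in trials:
            X = X - X.mean(axis=1, keepdims=True)
            S = X @ X.T
            covs.append(S / np.trace(S))          # normalized spatial covariance
        return np.mean(covs, axis=0)
    Ca, Cb = mean_cov(trials_a), mean_cov(trials_b)
    vals, vecs = eigh(Ca, Ca + Cb)                # generalized eigenvalue problem
    order = np.argsort(vals)
    keep = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return vecs[:, keep]                          # columns are spatial filters

def csp_features(trial, W):
    """Log-variance features of the spatially filtered trial, fed to LDA."""
    Z = W.T @ trial
    var = Z.var(axis=1)
    return np.log(var / var.sum())
```

A two-class LDA (or any linear classifier) trained on csp_features of the filtered training trials reproduces the CSP baseline used for comparison.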

In the LDS model, the value of the hidden parameter describing the dimension of the Riemannian feature space is closely related to the final accuracy. We chose the subject with the highest accuracy, "al," and the subject with the lowest accuracy, "av," to show the relationship between the hidden parameter and the classification accuracy. The result of this experiment is presented in Figure 1, which indicates that the accuracy tends to increase as the value of the hidden parameter grows and that the highest accuracy occurs near a hidden parameter value of 16.

Figure 1: The relationship between hidden parameter and accuracy for LDSs. We choose "al" and "av," which have the highest and lowest accuracy performance, respectively, to show the relationship between hidden parameter and accuracy.

Then five methods, CSP, CSSP, LDSs, LR+CSP, and LR-LDSs, are compared with each other. The results are listed in Table 1.

In Table 1, LR-LDSs achieves the best result for the majority of subjects. The last row shows that the mean classification accuracy of LR-LDSs is much better than that of CSP and a little higher than the others. Comparing CSP with LR+CSP, the LR method is efficient and useful for improving accuracy. The LDS-related methods outperform CSP and CSSP because they extract both spatial and temporal features.

6.2. BCI Competition IV Database 2a. Database 2a consists of EEG data from 9 subjects. There are four different motor imagery tasks: movement of the left hand, the right hand, both feet, and the tongue. At the beginning of each trial, a fixation cross and a short acoustic warning tone appear. After two seconds, the subject is cued by an arrow pointing left, right, down, or up, denoting movement of the left hand, right hand, foot, or tongue, for 1.25 s. Then the subject carries out the motor imagery task for about 3 s. The BCI signals are sampled from 25 channels, including 22 EEG channels and 3 EOG channels, at 250 Hz and band-pass filtered between 0.5 Hz and 100 Hz.

Table 2: Experimental accuracy results (%) obtained from each subject in BCI Competition IV Database 2a for the CSP, CSSP, LDSs, LR+CSP, and LR-LDSs methods.

Subject    A01E     A02E     A03E     A04E     A05E     A06E     A07E     A08E     A09E     Mean
CSP        90.27    53.13    91.67    71.18    61.11    64.24    79.86    91.32    92.36    77.24
CSSP       90.97    56.94    92.01    72.92    61.81    65.28    79.86    93.06    92.71    78.40
LDSs       91.67    55.56    93.06    74.31    62.50    70.83    80.56    93.75    93.06    79.48
LR+CSP     92.01    58.68    95.14    74.65    61.81    65.28    81.25    94.44    93.40    79.63
LR-LDSs    92.01    59.02    94.44    75.35    63.19    69.44    81.25    95.14    93.06    80.32

Different from Dataset IVa, Database 2a poses a multiclass classification problem, whereas LDA is a two-class classifier. Therefore, we uniformly choose K-NN algorithms for the CSP, CSSP, LDSs, LR+CSP, and LR-LDSs methods. Table 2 describes the classification accuracy results of the five methods. Similar to the results on BCI Competition III Dataset IVa, the mean accuracies of LDSs, LR+CSP, and LR-LDSs are higher than those of the CSP and CSSP methods. Furthermore, the LR-LDSs method obtains the best performance.

7. Conclusion

CSP has achieved much success in past MI-BCI research. However, CSP is only a spatial filter and is sensitive to the frequency band; it requires prior knowledge to choose channels and frequency bands, and without preprocessing its classification accuracy may be poor. LDSs can overcome these problems by extracting both spatial and temporal features simultaneously to improve the classification performance. Furthermore, we utilize a low-rank matrix decomposition approach to remove noise and resting-state components in order to improve the robustness of the system, and the LR+CSP and LR-LDSs methods are proposed on this basis. Comparison experiments are demonstrated on two datasets. The major contribution of our work is the realization of the LDS model and the LR algorithm for MI-BCI pattern recognition. The proposed LR-LDSs method achieves better performance than CSP and CSSP.

Competing Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant nos. 91420302 and 91520201).

References

[1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, "Brain-computer interfaces for communication and control," Clinical Neurophysiology, vol. 113, no. 6, pp. 767–791, 2002.
[2] K. K. Ang and C. Guan, "Brain-computer interface in stroke rehabilitation," Journal of Computing Science and Engineering, vol. 7, no. 2, pp. 139–146, 2013.
[3] N. Sharma, V. M. Pomeroy, and J.-C. Baron, "Motor imagery: a backdoor to the motor system after stroke," Stroke, vol. 37, no. 7, pp. 1941–1952, 2006.
[4] S. Silvoni, A. Ramos-Murguialday, M. Cavinato et al., "Brain-computer interface in stroke: a review of progress," Clinical EEG and Neuroscience, vol. 42, no. 4, pp. 245–252, 2011.
[5] O. Bai, P. Lin, S. Vorbach, M. K. Floeter, N. Hattori, and M. Hallett, "A high performance sensorimotor beta rhythm-based brain-computer interface associated with human natural motor behavior," Journal of Neural Engineering, vol. 5, no. 1, pp. 24–35, 2008.
[6] S. Chiappa and S. Bengio, "HMM and IOHMM modeling of EEG rhythms for asynchronous BCI systems," in Proceedings of the European Symposium on Artificial Neural Networks (ESANN), Bruges, Belgium, April 2004.
[7] J. D. R. Millan and J. Mourino, "Asynchronous BCI and local neural classifiers: an overview of the adaptive brain interface project," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 11, no. 2, pp. 159–161, 2003.
[8] W. D. Penny, S. J. Roberts, E. A. Curran, and M. J. Stokes, "EEG-based communication: a pattern recognition approach," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 214–215, 2000.
[9] G. Pfurtscheller, C. Neuper, A. Schlogl, and K. Lugger, "Separability of EEG signals recorded during right and left motor imagery using adaptive autoregressive parameters," IEEE Transactions on Rehabilitation Engineering, vol. 6, no. 3, pp. 316–325, 1998.
[10] T. Wang, J. Deng, and B. He, "Classifying EEG-based motor imagery tasks by means of time-frequency synthesized spatial patterns," Clinical Neurophysiology, vol. 115, no. 12, pp. 2744–2753, 2004.
[11] D. J. Krusienski, D. J. McFarland, and J. R. Wolpaw, "An evaluation of autoregressive spectral estimation model order for brain-computer interface applications," in Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS '06), pp. 1323–1326, IEEE, New York, NY, USA, September 2006.
[12] T. Demiralp, J. Yordanova, V. Kolev, A. Ademoglu, M. Devrim, and V. J. Samar, "Time-frequency analysis of single-sweep event-related potentials by means of fast wavelet transform," Brain and Language, vol. 66, no. 1, pp. 129–145, 1999.
[13] D. Farina, O. F. do Nascimento, M.-F. Lucas, and C. Doncarli, "Optimization of wavelets for classification of movement-related cortical potentials generated by variation of force-related parameters," Journal of Neuroscience Methods, vol. 162, no. 1-2, pp. 357–363, 2007.
[14] H. Ramoser, J. Muller-Gerking, and G. Pfurtscheller, "Optimal spatial filtering of single trial EEG during imagined hand movement," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 4, pp. 441–446, 2000.
[15] M. Grosse-Wentrup and M. Buss, "Multiclass common spatial patterns and information theoretic feature extraction," IEEE Transactions on Biomedical Engineering, vol. 55, no. 8, pp. 1991–2000, 2008.
[16] B. Blankertz, K.-R. Muller, D. J. Krusienski et al., "The BCI competition III: validating alternative approaches to actual BCI problems," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 2, pp. 153–159, 2006.
[17] http://www.bbci.de/competition/iii/results
[18] M. Tangermann, K. Muller, A. Aertsen et al., "Review of the BCI competition IV," Frontiers in Neuroscience, vol. 6, article no. 55, 2012.
[19] http://www.bbci.de/competition/iv/results
[20] S. Lemm, B. Blankertz, G. Curio, and K.-R. Muller, "Spatio-spectral filters for improving the classification of single trial EEG," IEEE Transactions on Biomedical Engineering, vol. 52, no. 9, pp. 1541–1548, 2005.
[21] G. Dornhege, B. Blankertz, M. Krauledat, F. Losch, G. Curio, and K.-R. Muller, "Combined optimization of spatial and temporal filters for improving brain-computer interfacing," IEEE Transactions on Biomedical Engineering, vol. 53, no. 11, pp. 2274–2281, 2006.
[22] Q. Novi, C. Guan, T. H. Dat, and P. Xue, "Sub-band common spatial pattern (SBCSP) for brain-computer interface," in Proceedings of the 3rd International IEEE/EMBS Conference on Neural Engineering (CNE '07), pp. 204–207, IEEE, Kohala Coast, Hawaii, USA, May 2007.
[23] K. K. Ang, Z. Y. Chin, H. Zhang, and C. Guan, "Filter Bank Common Spatial Pattern (FBCSP) in brain-computer interface," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2390–2397, Hong Kong, China, June 2008.
[24] E. A. Mousavi, J. J. Maller, P. B. Fitzgerald, and B. J. Lithgow, "Wavelet common spatial pattern in asynchronous offline brain computer interfaces," Biomedical Signal Processing and Control, vol. 6, no. 2, pp. 121–128, 2011.
[25] A. S. Aghaei, M. S. Mahanta, and K. N. Plataniotis, "Separable common spatio-spectral patterns for motor imagery BCI systems," IEEE Transactions on Biomedical Engineering, vol. 63, no. 1, pp. 15–29, 2016.
[26] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis," Journal of the ACM, vol. 58, no. 3, article 11, 2011.
[27] C.-F. Chen, C.-P. Wei, and Y.-C. F. Wang, "Low-rank matrix recovery with structural incoherence for robust face recognition," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '12), pp. 2618–2625, June 2012.
[28] R. Vidal and P. Favaro, "Low rank subspace clustering (LRSC)," Pattern Recognition Letters, vol. 43, no. 1, pp. 47–61, 2014.
[29] P. Saisan, G. Doretto, Y. N. Wu, and S. Soatto, "Dynamic texture recognition," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '01), pp. II-58–II-63, Kauai, Hawaii, USA, December 2001.
[30] G. Doretto, A. Chiuso, Y. N. Wu, and S. Soatto, "Dynamic textures," International Journal of Computer Vision, vol. 51, no. 2, pp. 91–109, 2003.
[31] B. Mesot and D. Barber, "Switching linear dynamical systems for noise robust speech recognition," IEEE Transactions on Audio, Speech, and Language Processing, vol. 15, no. 6, pp. 1850–1858, 2007.
[32] R. Ma, H. Liu, F. Sun, Q. Yang, and M. Gao, "Linear dynamic system method for tactile object classification," Science China Information Sciences, vol. 57, no. 12, pp. 1–11, 2014.
[33] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis," Journal of the ACM, vol. 58, no. 3, 2011.
[34] Q. Zhang and B. Li, "Mining discriminative components with low-rank and sparsity constraints for face recognition," in Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '12), pp. 1469–1477, ACM, August 2012.
[35] F. Siyahjani, R. Almohsen, S. Sabri, and G. Doretto, "A supervised low-rank method for learning invariant subspaces," in Proceedings of the IEEE International Conference on Computer Vision (ICCV '15), pp. 4220–4228, Santiago, Chile, December 2015.
[36] Z. Lin, M. Chen, and Y. Ma, "The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices," 2010.
[37] P. Turaga, A. Veeraraghavan, A. Srivastava, and R. Chellappa, "Statistical computations on Grassmann and Stiefel manifolds for image and video-based recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 11, pp. 2273–2286, 2011.
[38] M. Harandi, R. Hartley, C. Shen, B. Lovell, and C. Sanderson, "Extrinsic methods for coding and dictionary learning on Grassmann manifolds," International Journal of Computer Vision, vol. 114, no. 2-3, pp. 113–136, 2015.
[39] R. J. Martin, "A metric for ARMA processes," IEEE Transactions on Signal Processing, vol. 48, no. 4, pp. 1164–1170, 2000.
[40] A. B. Chan and N. Vasconcelos, "Classifying video with kernel dynamic textures," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '07), June 2007.


Page 2: Research Article Low-Rank Linear Dynamical Systems for

2 Computational Intelligence and Neuroscience

spatial features CSP becomes one of the most popular andsuccessful solutions for MI-BCI analysis according to thewinnersrsquo methods analysis of ldquoBCI Competition III DatasetIVardquo [16 17] and ldquoBCI Competition IV Database 2ardquo [18 19]Therefore many researchers devote theirselves to improvingthe original CSP method for better performances such ascommon spatiospectral pattern (CSSP) [20] common sparsespectral spatial pattern (CSSSP) [21] subband common spa-tial pattern (SBCSP) [22] filter bank common spatial pattern(FBCSP) [23] wavelet common spatial pattern (WCSP) [24]and separable common spatiospectral patterns (SCSSP) [25]Most of these improved CSP methods fuse spectral andspatial characteristics in the spatiospectral feature space andfinally achieve success by comparison experiments

Despite its effectiveness in extracting features of MI-BCICSP needs a lot of preprocessing and postprocessing suchas filtering demean and spatiospectral feature fusion whichinfluence the classification accuracy easily In this paper weutilize linear dynamical systems (LDSs) for processing EEGsignals in MI-BCI Although LDSs succeed in the field ofcontrol to the best of our knowledge this model has barelybeen tried in the feature extraction of EEG analysis so farComparedwithCSPmethod LDSs have the following advan-tages first LDS can simultaneously generate spatiospectraldual-feature matrix second there is no need to preprocessor postprocess signals and the raw data can be directly fedinto the model third it is easy to use and of low cost last theextracted features from the LDS are much more effective forclassification

Furthermore we apply low-rank matrix decompositionapproaches [26ndash28] that have the ability to learn represen-tational matrix even in presence of corrupted data Thenoise of the data can be get rid of hence to improve therobustness However there are two ways for the EEG low-rank decomposition One aims at the EEG raw data the otheraims at features extracted fromLDSs which is proposed by usand called low-rank LDSs (LR-LDSs)

This paper mainly has the following contributions (1)We utilize LDSs for MI-EEG feature extraction to solvethe MI pattern recognition problem (2) Low-rank matrixdecomposition method is applied to improve the robustnessfor the raw data analysis (3) We propose LR-LDSs on finiteGrassmannian feature space (4) Plenty of comparison exper-iments demonstrate the effectiveness of these approaches

The rest of this paper is organized as follows Section 2provides LDSs model to realize the feature extraction of EEGsignals Section 3 presents low-rank matrix decompositionmethod for the EEG raw data analysis Section 4 introducesLR-LDSs method Then the proper classification algorithmis explained in Section 5 Section 6 compares the threeproposedmethods (LDSs LR+CSP and LR-LDSs) with otherstate-of-the-art algorithms in different databases Finally thesummary and conclusion are presented in Section 7

2 LDSs Modeling

LDSs also known as linear Gaussian state-space modelshave been used successfully in modeling and controllingdynamical systems In recent few years more and more

problems extending to computer vision [29 30] speechrecognition [31] and tactile perception [32] have been solvedby LDSs model EEG signals are sequences of brain electronsampling that have typical dynamic textures We presentthe features of EEG dynamic textures by LDSs modelingand apply machine learning (ML) algorithms to capturethe essence of dynamic textures for feature extraction andclassification

Let 119884(119905)119905=1120591 119884(119905) isin 119877119898 be a sequence of 120591 EEGsignal sample at each instant of time 119905 If there is a set of 119899spatial filters 120593120572 119877 rarr 119877119898 120572 = 1 119899 we have 119909(119905) =sum119896119894=1 119860 119894119909(119905 minus 119894) + 119861V(119905) with 119860 119894 isin 119877119899times119899 119861 isin 119877119899times119899V inde-pendent and identically distributed realization item V(119905) isin119877119899V and suppose that sequence of observed variables 119884(119905)can be represented approximately by function of dimensionalhidden state 119909(119905) 119910(119905) = 120593(119909(119905)) + 120596(119905) where 120596(119905) isin119877119898 is an independent and identically distributed sequencedrawn from a known distribution resulting in a positivemeasured sequence We redefine the hidden state of 119909(119905) tobe [119909(119905)119879 119909(119905 minus 1)119879 sdot sdot sdot 119909(119905 minus 119896)119879]119879 and consider a lineardynamic system as an autoregressive moving average processwithout firm input distribution

119909 (119905 + 1) = 119860119909 (119905) + 119861V (119905) 119910 (119905) = 120593 (119909 (119905)) + 120596 (119905)

119909 (0) = 1199090(1)

with V(119905) 120593(119909(119905)) distribution unknown howeverIn order to solve the above problem we can regard it as a

white and zero-meanGaussian noise linear dynamical systemand propose a simplified and closed-form solution

119909 (119905 + 1) = 119860119909 (119905) + 119861V (119905) V (119905) sim 119873 (0 119876) 119910 (119905) = 119862119909 (119905) + 120596 (119905) + 119910 120596 (119905) sim 119873 (0 119877)

119909 (0) = 1199090(2)

where 119860 isin 119877119899times119899 is the transition matrix that describesthe dynamics property 119862 isin 119877119898times119899 is the measurementmatrix that describes the spatial appearance 119910 isin 119877119898 is themean of 119910(119905) and V(119905) and 120596(119905) are noise components Weshould estimate the model parameters 119860119862119876 119877 from themeasurements 119910(119905) 119910(120591) and transform them into themaximum-likelihood solution

(120591) (120591) (120591) (120591)= argmax119860119862119876119877

119901 (119910 (1) 119910 (120591)) (3)

and however optimal solutions of this problem bring com-putational complexity

We apply matrix decomposition to simplify the com-putation by the closed-form solution The singular value

Computational Intelligence and Neuroscience 3

decomposition (SVD) solution is the best estimate of 119862 inFrobenius function

(120591) (120591) = argmin119862119883

119882 (120591)119865subject to 119884 (120591) = 119862119883 (120591) + 119882 (120591)

119862119879119862 = 119868(4)

Let119884 = 119880Σ119881119879 and we get the parameter estimation of

= 119880 = Σ119881119879 (5)

where = [119883(1) 119883(2) 119883(120591)] can be determined byFrobenius

(120591) = argmin119860

10038171003817100381710038171198832 (120591) minus 1198601198831 (120591 minus 1)1003817100381710038171003817119865 (6)

where 1198832(120591) = [119883(2) 119883(3) 119883(120591)] So the solution is inclosed-form using the state estimated

(120591) = [119883 ( 2) 119883 ( 3) sdot sdot sdot 119883 ( 120591)]lowast [119883 ( 1) 119883 ( 2) sdot sdot sdot 119883 ( 120591 minus 1)]dagger (7)

where dagger denotes matrix pseudoinverseWe can obtain the result [119860119862] a couple of spatiotem-

poral feature matrix The MATLAB program of LDSs canbe found in Supplementary Material algorithm 1 availableonline at httpdxdoiorg10115520162637603

3 Low-Rank Matrix Decomposition

EEG signals have poor quality because they are usuallyrecorded by electrodes placed on the scalp in a noninvasivemanner that has to cross the scalp skull and many otherlayers Therefore they are moreover severely affected bybackground noise generated either inside the brain or exter-nally over the scalp Low-rank (LR) matrix decompositioncan often capture the global information by reconstructingthe top few singular values and the corresponding singularvectors This method is widely applied in the field of imagedenoising and face recognition (FR) Concretely low-rank(LR)matrix recovery seeks to decompose a datamatrix119883 into119860 + 119864 where 119860 is a low-rank matrix and 119864 is the associatedsparse error Candes et al [33] propose to relax the originalproblem into the following tractable formulation

min119860119864

119860lowast + 120572 1198641st 119883 = 119860 + 119864 (8)

where the nuclear norm 119860lowast (the sum of the singular values)approximates the rank of 119860 and the 1198971-norm 1198641 is sparseconstraint

Then Zhang and Li [34] decompose each image intocommon component condition component and a sparse

residual Siyahjani et al [35] introduce the invariant com-ponents to the sparse representation and low-rank matrixdecomposition approaches and successfully apply to solvecomputer vision problems They add orthogonal constraintto assume that invariant and variant components are linearindependent Therefore we decompose EEG signals as acombination of three components resting state componentmotor imagery component represented by low-rank matrixand a sparse residual However in practice it needs somedigital signal processing (DSP) that is wavelet transformor discrete Fourier transform before decomposition Particu-larly raw time-domain signals without any preprocessing arenot suitable for low-rank matrix decomposition directly Thetraining dataset 119883 can be decomposed by 119883 fl 119860 + 119861 + 119864where 119860 isin 119877119898times119899 is a low-rank matrix and collects event-related EEG signal components 119861 isin 119877119898times119899 approximatesinvariant and denotes resting state signal components thatare sampled by subjects without any motor imagery and 119864 isin119877119898times119899 is the matrix of sparse noiseTherefore training datasetcan be decomposed as the following formulation

119883 fl 119860 + 119861 + 119864 (9)

On ideal condition each sampling channel of subjectrsquosbrain EEG signals in resting state is similar In other wordssum of each different 119860 raw is minimum 119861 should addcommon constraint as the following formulation

sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865 (10)

We propose optimization problem formulation as

min119860119861119864

119860lowast + 120572 1198641 + 120573sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865st 119883 = 119860 + 119861 + 119864

(11)

Then augmented Lagrange multiplier (ALM) [36]method is utilized to solve the above problem Theaugmented Lagrangian function 119871(119860 119861 119864 120582) is givenby

119871 (119860 119861 119864 120582) = 119860lowast + 120572 1198641 + 120573sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865+ ⟨120582119883 minus 119860 minus 119861 minus 119864⟩+ 1205832 119883 minus 119860 minus 119861 minus 1198642119865

(12)

where 120583 is a positive scalar and 120582 is a Lagrange multi-plier matrix We employ an inexact ALM (IALM) methoddescribed in Algorithm 1 to solve this problem where 119869(119883) =max(lansvd(119883) 120572minus1119883119865) in the initialization 1205820 = 119883119869(119883)and lansvd(sdot) computes the largest singular value

When low-rank matrix 119860 denoting event-related EEGsignal components are generated we can utilize some featureextractionmethods such asCSP andCSSP to classifyMI-BCIIn other words low-rank matrix decomposition method inthis section can be considered as a preprocessing part beforefeature extraction and classification

4 Computational Intelligence and Neuroscience

Input Observation matrix 119883 120582 penalty weights 120572 120573(1) 1205820 = 119883119869(119883) 1205830 gt 0 120588 gt 1 119896 = 0(2) while not converged do(3) Lines (4)ndash(12) solve (119860119896 119861119896 119864119896) = argmin119860119861119864 119871(119860 119861 119864 120582119896 120583119896)(4) 119895 = 0 1198600119896 = 119860119896 1198610119896 = 119861119896 1198640119896 = 119864119896(5) while not converged do(6) Line (7)-(8) solves 119860119896+1 = argmin119860 119871(119860 119861119896 119864119896 120582119896 120583119896)(7) (119880 119878 119881) = svd(119883 minus 119861119896+1 minus 119864119896+1 + 120582119896120583119896)(8) 119860119895+1

119896= 119880119878120583119896minus1 [119878]119881119879

(9) Line (10) solves 119861119896+1 = argmin119861 119871(119860119896+1 119861 119864119896 120582119896 120583119896)(10) 119864119895+1

119896= 119878120572120583119896minus1 [119883 minus 119860119895+1

119896minus 119861119895119896+ 120582119896120583119896]

(11) Update 119861119895+1119896

by solving 119864119896+1 = argmin119864 119871(119860119896+1 119861119896+1 119864 120582119896 120583119896)(12) 119895 larr 119895 + 1(13) end while(14) 119860119896+1 = 119860119895+1

119896 119861119896+1 = 119861119895+1

119896 119864119896+1 = 119864119895+1

119896

(15) 120583119896+1 = 120588120583119896 120582119896+1 = 120582119896 + 120583119896(119883 minus 119860119896+1 minus 119861119896+1 minus 119864119896+1)(16) 119896 larr 119896 + 1(17) end whileOutput (119860119896 119861119896 119864119896)Algorithm 1 Low-rank decomposition via the inexact ALMmethod

4 LR-LDSs on Finite Grassmannian

Beginning at an initial state 1199091 the expected observationsequence generated by a time-invariant model 119872 = (119860 119862)is obtained as 119864[1199101 1199102 1199101 ] = [119862119879 (119862119860)119879 (1198621198602)119879 ]1198791199091that lies in the column space of the extended observabil-ity matrix given by 119874119879infin = [119862119879 (119862119860)119879 (1198621198602)119879 ]119879 isin119877infintimes119899 LDSs can apply the extended observability subspaceO as descriptor but it is hard to calculate Turaga et al[37 38] approximate the extended observability by tak-ing the 119871-order observability matrix that is 119874(119899 119871) =[119862119879 (119862119860)119879 (119862119860119871minus1)119879]119879 In this way an LDS model canbe alternately identified as an 119899-dimensional subspace of119877119871119898

Given a database of EEG we can estimate LDSs modeland calculate the finite observability matrix that span sub-space as a point on the Riemannian manifold Then basedon low-rank and sparse matrix decomposition observabilitymatrix 119874 can be decomposed into 119863 + 119864 as the followingformulation

$\min_{D,E} \|D\|_* + \alpha\|E\|_1 \quad \text{s.t.} \quad O = D + E$,  (13)

where D is a low-rank matrix and E is the associated sparse error.

The inexact ALM method can also be used to solve this optimization problem, in the same way as Algorithm 1. The output D represents a low-rank descriptor of the LDSs and can be employed for the classification of EEG trials.
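A compact MATLAB sketch of such an IALM solver for problem (13) is given below; the parameter defaults (α = 1/sqrt(max(m, n)), the μ schedule, and the stopping tolerance) are common choices from the RPCA literature, not values reported in this paper.

function [D, E] = lr_decompose(O, alpha)
% Inexact ALM sketch for min ||D||_* + alpha*||E||_1  s.t.  O = D + E.
    [m, n] = size(O);
    if nargin < 2, alpha = 1 / sqrt(max(m, n)); end
    Y  = O / max(norm(O, 2), norm(O(:), inf) / alpha);   % dual variable initialization
    mu = 1.25 / norm(O, 2);  rho = 1.5;
    D  = zeros(m, n);  E = zeros(m, n);
    for iter = 1:500
        [U, S, V] = svd(O - E + Y/mu, 'econ');
        D = U * diag(max(diag(S) - 1/mu, 0)) * V';       % nuclear-norm proximal step
        T = O - D + Y/mu;
        E = sign(T) .* max(abs(T) - alpha/mu, 0);        % l1 proximal step
        Z = O - D - E;                                   % constraint residual
        Y = Y + mu * Z;
        mu = rho * mu;
        if norm(Z, 'fro') / norm(O, 'fro') < 1e-7, break; end
    end
end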

5 Classification Algorithm

We extract features with the above LDSs model and obtain two feature matrices, A and C. Unfortunately, A and C have different modal properties and dimensionalities, so they cannot be represented directly by a feature vector. A Riemannian geometry metric for the space of LDSs is hard to determine and needs to satisfy several constraints, and common classifiers such as Nearest Neighbors (NNs), Linear Discriminant Analysis (LDA), and Support Vector Machines (SVM) cannot classify features in matrix form; the feature matrix must be mapped to a vector space. We use the Martin Distance [39, 40], which is based on the principal angles between the subspaces of two extended observability matrices, as a kernel to present the distance between different LDS feature matrices. It can be defined as

$D^2(\Theta_a, \Theta_b) = -2 \sum_{i=1}^{n} \log \lambda_i$,  (14)

where Θ_a = (C_a, A_a), Θ_b = (C_b, A_b), and the λ_i are the eigenvalues obtained by solving the following equation:

$\begin{bmatrix} 0 & O_{ab} \\ (O_{ab})^T & 0 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \lambda \begin{bmatrix} O_{aa} & 0 \\ 0 & O_{bb} \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} \quad \text{s.t.} \quad x^T O_{aa} x = 1,\; y^T O_{bb} y = 1$,  (15)

where the extended observability matrices are O_a = [C_a^T, A_a^T C_a^T, ..., (A_a^T)^n C_a^T], O_b = [C_b^T, A_b^T C_b^T, ..., (A_b^T)^n C_b^T], and O_{ab} = (O_a)^T O_b, with O_{aa} = (O_a)^T O_a and O_{bb} = (O_b)^T O_b defined analogously. Algorithm 2 in the Supplementary Material presents the Martin Distance function programmed in MATLAB.
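In practice, the eigenvalues λ_i in (15) are the cosines of the principal angles between the two observability subspaces, so the Martin Distance can be computed from the singular values of the product of their orthonormal bases. The MATLAB sketch below reuses the observability helper introduced above (assumed names; this is not the supplementary Algorithm 2).

function d2 = martin_distance(A1, C1, A2, C2, L)
% Squared Martin Distance between two LDS models via principal angles (sketch).
    O1 = observability(A1, C1, L);   % orthonormal basis of subspace 1
    O2 = observability(A2, C2, L);   % orthonormal basis of subspace 2
    s  = svd(O1' * O2);              % singular values = cosines of principal angles
    s  = min(max(s, eps), 1);        % clamp against round-off
    d2 = -2 * sum(log(s));           % D^2 = -2 * sum_i log(lambda_i)
end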

We can classify EEG signals by comparing the Martin Distance between training data and testing data: the two nearest samples are likely to belong to the same class, so the predicted label and the prediction accuracy can be calculated. Algorithm 3 in the Supplementary Material gives the KNN classification method.
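A nearest-neighbor rule on top of this distance is then straightforward. The fragment below is a 1-NN sketch with assumed container names (models{i} holding the estimated A and C of trial i); it is not the supplementary Algorithm 3.

function pred = nn_classify(train_models, train_labels, test_models, L)
% 1-NN classification of EEG trials using the Martin Distance (sketch).
    pred = zeros(numel(test_models), 1);
    for i = 1:numel(test_models)
        d = zeros(numel(train_models), 1);
        for j = 1:numel(train_models)
            d(j) = martin_distance(test_models{i}.A, test_models{i}.C, ...
                                   train_models{j}.A, train_models{j}.C, L);
        end
        [~, idx] = min(d);           % nearest training trial
        pred(i)  = train_labels(idx);
    end
end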

Considering that the LR-LDSs method generates a single low-rank feature matrix on the finite Grassmannian, unlike the two feature matrices (A, C) produced by LDSs, Euclidean Distance and Mahalanobis Distance can describe the distance between two feature spaces of EEG trials after LR-LDSs. They are simple, efficient, and common measures of the distance between two points. In order to improve the classification accuracy, we can also employ metric learning methods that use the label information to learn a new metric or pseudometric, such as neighborhood components analysis and large margin nearest neighbor.

Table 1: Experimental accuracy results (%) obtained for each subject in BCI Competition III Dataset IVa for CSP, CSSP, and our proposed algorithms (LDSs, LR+CSP, and LR-LDSs).

Subject    aa       al       av       aw       ay       Mean
CSP        71.43    94.64    61.22    89.28    73.02    77.918
CSSP       77.68    96.43    63.27    90.63    79.37    81.476
LDSs       78.57    96.43    64.29    90.18    79.76    81.846
LR+CSP     77.68    96.43    63.78    90.18    79.76    81.566
LR-LDSs    79.46    98.21    63.78    90.18    80.56    82.438

6 Experimental Evaluation

In the above sections we propose three methods for EEG pattern recognition: LDSs, LR+CSP, and LR-LDSs. Two motor imagery EEG datasets, BCI Competition III Dataset IVa and BCI Competition IV Database 2a, are used to evaluate the three methods against state-of-the-art algorithms such as CSP and CSSP. All experiments are carried out with MATLAB on an Intel Core i7 2.90-GHz CPU with 8 GB RAM.

6.1 BCI Competition III Dataset IVa. Dataset IVa is recorded from five healthy subjects, labeled "aa", "al", "av", "aw", and "ay", performing right hand and foot motor imagery with visual cues presented for 3.5 s. The EEG signal has 118 channels and markers that indicate the time points of the 280 cues for each subject; it is band-pass filtered between 0.05 and 200 Hz and downsampled to 100 Hz.

Before feature extraction, the raw data needs some preprocessing for the comparison experiment. Firstly, we extract a time segment located from 0.5 to 3 s and employ FastICA to remove artifacts arising from eye and muscle movements. Secondly, we choose 21 channels over the motor cortex (CP6, CP4, CP2, C6, C4, C2, FC6, FC4, FC2, CPZ, CZ, FCZ, CP1, CP3, CP5, C1, C3, C5, FC1, FC3, and FC5) that are related to motor imagery.
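The segment and channel selection step can be sketched as follows in MATLAB; the variable names (eeg as a samples-by-channels matrix, cue_pos as cue sample indices, clab as the channel labels) are assumptions for illustration, and the FastICA artifact-removal step is omitted.

fs  = 100;                                         % sampling rate after downsampling (Hz)
win = round(0.5*fs) : round(3*fs);                 % 0.5 s to 3 s after each cue
keep = {'CP6','CP4','CP2','C6','C4','C2','FC6','FC4','FC2','CPZ','CZ','FCZ', ...
        'CP1','CP3','CP5','C1','C3','C5','FC1','FC3','FC5'};
sel = ismember(upper(clab), keep);                 % logical index of the 21 motor channels
trials = cell(numel(cue_pos), 1);
for t = 1:numel(cue_pos)
    trials{t} = eeg(cue_pos(t) + win, sel);        % one trial: time samples x 21 channels
end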

In order to improve the performance of CSP and CSSP, we apply a Butterworth filter to the EEG signals within a specific frequency band between 8 and 30 Hz, which encompasses both the alpha rhythm (8–13 Hz) and the beta rhythm (14–30 Hz) related to motor imagery. Then we use MATLAB code to obtain the spatial filter parameters and compute feature vectors from the variance. Finally, an LDA classifier is used to find a separating hyperplane for the feature vectors.
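The band-pass and variance-feature steps can be sketched as below; the Butterworth design uses butter/filtfilt from the Signal Processing Toolbox, and the filter order and the CSP projection matrix W are assumptions for illustration.

fs = 100;
[b, a] = butter(5, [8 30] / (fs/2), 'bandpass');   % Butterworth band-pass, 8-30 Hz
x  = filtfilt(b, a, trials{t});                    % zero-phase filtering of one trial
Z  = x * W;                                        % project onto CSP spatial filters (W assumed)
feat = log(var(Z));                                % log-variance feature vector for the LDA classifier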

In the LDSs model, the value of the hidden parameter, which describes the dimension of the Riemannian feature space, is closely related to the final accuracy. We chose the subject with the highest accuracy, "al", and the subject with the lowest accuracy, "av", to show the relationship between the hidden parameter and classification accuracy. The result of this experiment is presented in Figure 1, which indicates that the accuracy tends to increase as the value of the hidden parameter grows and that the highest accuracy occurs near a hidden parameter value of 16.

Figure 1: The relationship between the hidden parameter (swept from 0 to 20) and classification accuracy for LDSs, shown for subjects "al" and "av", which give the highest and lowest accuracy performance, respectively.

Then five methods, including CSP, CSSP, LDSs, LR+CSP, and LR-LDSs, are compared with each other. The results are listed in Table 1.

In Table 1, LR-LDSs gives the best result for the majority of subjects. The last row shows that the mean classification accuracy of LR-LDSs (82.44%) is about 4.5 percentage points higher than that of CSP and slightly higher than the other methods. Comparing CSP with LR+CSP shows that the LR step is efficient and useful for improving accuracy. LDSs-related methods outperform CSP and CSSP because they extract both spatial and temporal features.

6.2 BCI Competition IV Database 2a. Database 2a consists of EEG data from 9 subjects. There are four different motor imagery tasks: movement of the left hand, the right hand, both feet, and the tongue. At the beginning of each trial, a fixation cross appears together with a short acoustic warning tone. After two seconds, the subject is cued by an arrow pointing either left, right, down, or up (denoting movement of the left hand, right hand, foot, or tongue) for 1.25 s. The subjects then carry out the motor imagery task for about 3 s. The BCI signals are sampled by 25 channels, including 22 EEG channels and 3 EOG channels, at 250 Hz and band-pass filtered between 0.5 Hz and 100 Hz.

Table 2: Experimental accuracy results (%) obtained for each subject in BCI Competition IV Database 2a for the CSP, CSSP, LDSs, LR+CSP, and LR-LDSs methods.

Subject    A01E    A02E    A03E    A04E    A05E    A06E    A07E    A08E    A09E    Mean
CSP        90.27   53.13   91.67   71.18   61.11   64.24   79.86   91.32   92.36   77.24
CSSP       90.97   56.94   92.01   72.92   61.81   65.28   79.86   93.06   92.71   78.40
LDSs       91.67   55.56   93.06   74.31   62.50   70.83   80.56   93.75   93.06   79.48
LR+CSP     92.01   58.68   95.14   74.65   61.81   65.28   81.25   94.44   93.40   79.63
LR-LDSs    92.01   59.02   94.44   75.35   63.19   69.44   81.25   95.14   93.06   80.32

Different from Dataset IVa, Database 2a poses a multiclass classification problem, whereas LDA is a two-class classifier. Therefore, we uniformly use the K-NN algorithm for the CSP, CSSP, LDSs, LR+CSP, and LR-LDSs methods. Table 2 lists the classification accuracies of the five methods. Similar to the results on BCI Competition III Dataset IVa, the mean accuracies of LDSs, LR+CSP, and LR-LDSs are higher than those of the CSP and CSSP methods. Furthermore, the LR-LDSs method obtains the best performance.

7 Conclusion

CSP has gained much success in past MI-BCI research. However, it is reported that CSP is only a spatial filter and is sensitive to the frequency band; it needs prior knowledge to choose channels and frequency bands, and without preprocessing the classification accuracy may be poor. LDSs can overcome these problems by extracting both spatial and temporal features simultaneously to improve the classification performance. Furthermore, we utilize a low-rank matrix decomposition approach to remove noise and the resting-state component in order to improve the robustness of the system, and on this basis the LR+CSP and LR-LDSs methods are proposed. Comparison experiments are demonstrated on two datasets. The major contribution of our work is the realization of the LDSs model and the LR algorithm for MI-BCI pattern recognition. The proposed LR-LDSs method achieves better performance than CSP and CSSP.

Competing Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant nos. 91420302 and 91520201).

References

[1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, "Brain-computer interfaces for communication and control," Clinical Neurophysiology, vol. 113, no. 6, pp. 767–791, 2002.
[2] K. K. Ang and C. Guan, "Brain-computer interface in stroke rehabilitation," Journal of Computing Science and Engineering, vol. 7, no. 2, pp. 139–146, 2013.
[3] N. Sharma, V. M. Pomeroy, and J.-C. Baron, "Motor imagery: a backdoor to the motor system after stroke?" Stroke, vol. 37, no. 7, pp. 1941–1952, 2006.
[4] S. Silvoni, A. Ramos-Murguialday, M. Cavinato et al., "Brain-computer interface in stroke: a review of progress," Clinical EEG and Neuroscience, vol. 42, no. 4, pp. 245–252, 2011.
[5] O. Bai, P. Lin, S. Vorbach, M. K. Floeter, N. Hattori, and M. Hallett, "A high performance sensorimotor beta rhythm-based brain-computer interface associated with human natural motor behavior," Journal of Neural Engineering, vol. 5, no. 1, pp. 24–35, 2008.
[6] S. Chiappa and S. Bengio, "HMM and IOHMM modeling of EEG rhythms for asynchronous BCI systems," in Proceedings of the European Symposium on Artificial Neural Networks (ESANN), Bruges, Belgium, April 2004.
[7] J. D. R. Millan and J. Mourino, "Asynchronous BCI and local neural classifiers: an overview of the adaptive brain interface project," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 11, no. 2, pp. 159–161, 2003.
[8] W. D. Penny, S. J. Roberts, E. A. Curran, and M. J. Stokes, "EEG-based communication: a pattern recognition approach," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 214–215, 2000.
[9] G. Pfurtscheller, C. Neuper, A. Schlogl, and K. Lugger, "Separability of EEG signals recorded during right and left motor imagery using adaptive autoregressive parameters," IEEE Transactions on Rehabilitation Engineering, vol. 6, no. 3, pp. 316–325, 1998.
[10] T. Wang, J. Deng, and B. He, "Classifying EEG-based motor imagery tasks by means of time-frequency synthesized spatial patterns," Clinical Neurophysiology, vol. 115, no. 12, pp. 2744–2753, 2004.
[11] D. J. Krusienski, D. J. McFarland, and J. R. Wolpaw, "An evaluation of autoregressive spectral estimation model order for brain-computer interface applications," in Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS '06), pp. 1323–1326, IEEE, New York, NY, USA, September 2006.
[12] T. Demiralp, J. Yordanova, V. Kolev, A. Ademoglu, M. Devrim, and V. J. Samar, "Time-frequency analysis of single-sweep event-related potentials by means of fast wavelet transform," Brain and Language, vol. 66, no. 1, pp. 129–145, 1999.
[13] D. Farina, O. F. do Nascimento, M.-F. Lucas, and C. Doncarli, "Optimization of wavelets for classification of movement-related cortical potentials generated by variation of force-related parameters," Journal of Neuroscience Methods, vol. 162, no. 1-2, pp. 357–363, 2007.
[14] H. Ramoser, J. Muller-Gerking, and G. Pfurtscheller, "Optimal spatial filtering of single trial EEG during imagined hand movement," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 4, pp. 441–446, 2000.
[15] M. Grosse-Wentrup and M. Buss, "Multiclass common spatial patterns and information theoretic feature extraction," IEEE Transactions on Biomedical Engineering, vol. 55, no. 8, pp. 1991–2000, 2008.
[16] B. Blankertz, K.-R. Muller, D. J. Krusienski et al., "The BCI competition III: validating alternative approaches to actual BCI problems," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 2, pp. 153–159, 2006.
[17] http://www.bbci.de/competition/iii/results
[18] M. Tangermann, K. Muller, A. Aertsen et al., "Review of the BCI competition IV," Frontiers in Neuroscience, vol. 6, article 55, 2012.
[19] http://www.bbci.de/competition/iv/results
[20] S. Lemm, B. Blankertz, G. Curio, and K.-R. Muller, "Spatio-spectral filters for improving the classification of single trial EEG," IEEE Transactions on Biomedical Engineering, vol. 52, no. 9, pp. 1541–1548, 2005.
[21] G. Dornhege, B. Blankertz, M. Krauledat, F. Losch, G. Curio, and K.-R. Muller, "Combined optimization of spatial and temporal filters for improving brain-computer interfacing," IEEE Transactions on Biomedical Engineering, vol. 53, no. 11, pp. 2274–2281, 2006.
[22] Q. Novi, C. Guan, T. H. Dat, and P. Xue, "Sub-band common spatial pattern (SBCSP) for brain-computer interface," in Proceedings of the 3rd International IEEE/EMBS Conference on Neural Engineering (CNE '07), pp. 204–207, IEEE, Kohala Coast, Hawaii, USA, May 2007.
[23] K. K. Ang, Z. Y. Chin, H. Zhang, and C. Guan, "Filter Bank Common Spatial Pattern (FBCSP) in brain-computer interface," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2390–2397, Hong Kong, China, June 2008.
[24] E. A. Mousavi, J. J. Maller, P. B. Fitzgerald, and B. J. Lithgow, "Wavelet common spatial pattern in asynchronous offline brain computer interfaces," Biomedical Signal Processing and Control, vol. 6, no. 2, pp. 121–128, 2011.
[25] A. S. Aghaei, M. S. Mahanta, and K. N. Plataniotis, "Separable common spatio-spectral patterns for motor imagery BCI systems," IEEE Transactions on Biomedical Engineering, vol. 63, no. 1, pp. 15–29, 2016.
[26] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis?" Journal of the ACM, vol. 58, no. 3, article 11, 2011.
[27] C.-F. Chen, C.-P. Wei, and Y.-C. F. Wang, "Low-rank matrix recovery with structural incoherence for robust face recognition," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '12), pp. 2618–2625, June 2012.
[28] R. Vidal and P. Favaro, "Low rank subspace clustering (LRSC)," Pattern Recognition Letters, vol. 43, no. 1, pp. 47–61, 2014.
[29] P. Saisan, G. Doretto, Y. N. Wu, and S. Soatto, "Dynamic texture recognition," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '01), pp. II-58–II-63, Kauai, Hawaii, USA, December 2001.
[30] G. Doretto, A. Chiuso, Y. N. Wu, and S. Soatto, "Dynamic textures," International Journal of Computer Vision, vol. 51, no. 2, pp. 91–109, 2003.
[31] B. Mesot and D. Barber, "Switching linear dynamical systems for noise robust speech recognition," IEEE Transactions on Audio, Speech, and Language Processing, vol. 15, no. 6, pp. 1850–1858, 2007.
[32] R. Ma, H. Liu, F. Sun, Q. Yang, and M. Gao, "Linear dynamic system method for tactile object classification," Science China Information Sciences, vol. 57, no. 12, pp. 1–11, 2014.
[33] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis?" Journal of the ACM, vol. 58, no. 3, 2011.
[34] Q. Zhang and B. Li, "Mining discriminative components with low-rank and sparsity constraints for face recognition," in Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '12), pp. 1469–1477, ACM, August 2012.
[35] F. Siyahjani, R. Almohsen, S. Sabri, and G. Doretto, "A supervised low-rank method for learning invariant subspaces," in Proceedings of the IEEE International Conference on Computer Vision (ICCV '15), pp. 4220–4228, Santiago, Chile, December 2015.
[36] Z. Lin, M. Chen, and Y. Ma, "The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices," 2010.
[37] P. Turaga, A. Veeraraghavan, A. Srivastava, and R. Chellappa, "Statistical computations on Grassmann and Stiefel manifolds for image and video-based recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 11, pp. 2273–2286, 2011.
[38] M. Harandi, R. Hartley, C. Shen, B. Lovell, and C. Sanderson, "Extrinsic methods for coding and dictionary learning on Grassmann manifolds," International Journal of Computer Vision, vol. 114, no. 2-3, pp. 113–136, 2015.
[39] R. J. Martin, "A metric for ARMA processes," IEEE Transactions on Signal Processing, vol. 48, no. 4, pp. 1164–1170, 2000.
[40] A. B. Chan and N. Vasconcelos, "Classifying video with kernel dynamic textures," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '07), June 2007.

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 3: Research Article Low-Rank Linear Dynamical Systems for

Computational Intelligence and Neuroscience 3

decomposition (SVD) solution is the best estimate of 119862 inFrobenius function

(120591) (120591) = argmin119862119883

119882 (120591)119865subject to 119884 (120591) = 119862119883 (120591) + 119882 (120591)

119862119879119862 = 119868(4)

Let119884 = 119880Σ119881119879 and we get the parameter estimation of

= 119880 = Σ119881119879 (5)

where = [119883(1) 119883(2) 119883(120591)] can be determined byFrobenius

(120591) = argmin119860

10038171003817100381710038171198832 (120591) minus 1198601198831 (120591 minus 1)1003817100381710038171003817119865 (6)

where 1198832(120591) = [119883(2) 119883(3) 119883(120591)] So the solution is inclosed-form using the state estimated

(120591) = [119883 ( 2) 119883 ( 3) sdot sdot sdot 119883 ( 120591)]lowast [119883 ( 1) 119883 ( 2) sdot sdot sdot 119883 ( 120591 minus 1)]dagger (7)

where dagger denotes matrix pseudoinverseWe can obtain the result [119860119862] a couple of spatiotem-

poral feature matrix The MATLAB program of LDSs canbe found in Supplementary Material algorithm 1 availableonline at httpdxdoiorg10115520162637603

3 Low-Rank Matrix Decomposition

EEG signals have poor quality because they are usuallyrecorded by electrodes placed on the scalp in a noninvasivemanner that has to cross the scalp skull and many otherlayers Therefore they are moreover severely affected bybackground noise generated either inside the brain or exter-nally over the scalp Low-rank (LR) matrix decompositioncan often capture the global information by reconstructingthe top few singular values and the corresponding singularvectors This method is widely applied in the field of imagedenoising and face recognition (FR) Concretely low-rank(LR)matrix recovery seeks to decompose a datamatrix119883 into119860 + 119864 where 119860 is a low-rank matrix and 119864 is the associatedsparse error Candes et al [33] propose to relax the originalproblem into the following tractable formulation

min119860119864

119860lowast + 120572 1198641st 119883 = 119860 + 119864 (8)

where the nuclear norm 119860lowast (the sum of the singular values)approximates the rank of 119860 and the 1198971-norm 1198641 is sparseconstraint

Then Zhang and Li [34] decompose each image intocommon component condition component and a sparse

residual Siyahjani et al [35] introduce the invariant com-ponents to the sparse representation and low-rank matrixdecomposition approaches and successfully apply to solvecomputer vision problems They add orthogonal constraintto assume that invariant and variant components are linearindependent Therefore we decompose EEG signals as acombination of three components resting state componentmotor imagery component represented by low-rank matrixand a sparse residual However in practice it needs somedigital signal processing (DSP) that is wavelet transformor discrete Fourier transform before decomposition Particu-larly raw time-domain signals without any preprocessing arenot suitable for low-rank matrix decomposition directly Thetraining dataset 119883 can be decomposed by 119883 fl 119860 + 119861 + 119864where 119860 isin 119877119898times119899 is a low-rank matrix and collects event-related EEG signal components 119861 isin 119877119898times119899 approximatesinvariant and denotes resting state signal components thatare sampled by subjects without any motor imagery and 119864 isin119877119898times119899 is the matrix of sparse noiseTherefore training datasetcan be decomposed as the following formulation

119883 fl 119860 + 119861 + 119864 (9)

On ideal condition each sampling channel of subjectrsquosbrain EEG signals in resting state is similar In other wordssum of each different 119860 raw is minimum 119861 should addcommon constraint as the following formulation

sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865 (10)

We propose optimization problem formulation as

min119860119861119864

119860lowast + 120572 1198641 + 120573sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865st 119883 = 119860 + 119861 + 119864

(11)

Then augmented Lagrange multiplier (ALM) [36]method is utilized to solve the above problem Theaugmented Lagrangian function 119871(119860 119861 119864 120582) is givenby

119871 (119860 119861 119864 120582) = 119860lowast + 120572 1198641 + 120573sum119894 =119895

10038171003817100381710038171003817119861119894 minus 119861119895100381710038171003817100381710038172119865+ ⟨120582119883 minus 119860 minus 119861 minus 119864⟩+ 1205832 119883 minus 119860 minus 119861 minus 1198642119865

(12)

where 120583 is a positive scalar and 120582 is a Lagrange multi-plier matrix We employ an inexact ALM (IALM) methoddescribed in Algorithm 1 to solve this problem where 119869(119883) =max(lansvd(119883) 120572minus1119883119865) in the initialization 1205820 = 119883119869(119883)and lansvd(sdot) computes the largest singular value

When low-rank matrix 119860 denoting event-related EEGsignal components are generated we can utilize some featureextractionmethods such asCSP andCSSP to classifyMI-BCIIn other words low-rank matrix decomposition method inthis section can be considered as a preprocessing part beforefeature extraction and classification

4 Computational Intelligence and Neuroscience

Input Observation matrix 119883 120582 penalty weights 120572 120573(1) 1205820 = 119883119869(119883) 1205830 gt 0 120588 gt 1 119896 = 0(2) while not converged do(3) Lines (4)ndash(12) solve (119860119896 119861119896 119864119896) = argmin119860119861119864 119871(119860 119861 119864 120582119896 120583119896)(4) 119895 = 0 1198600119896 = 119860119896 1198610119896 = 119861119896 1198640119896 = 119864119896(5) while not converged do(6) Line (7)-(8) solves 119860119896+1 = argmin119860 119871(119860 119861119896 119864119896 120582119896 120583119896)(7) (119880 119878 119881) = svd(119883 minus 119861119896+1 minus 119864119896+1 + 120582119896120583119896)(8) 119860119895+1

119896= 119880119878120583119896minus1 [119878]119881119879

(9) Line (10) solves 119861119896+1 = argmin119861 119871(119860119896+1 119861 119864119896 120582119896 120583119896)(10) 119864119895+1

119896= 119878120572120583119896minus1 [119883 minus 119860119895+1

119896minus 119861119895119896+ 120582119896120583119896]

(11) Update 119861119895+1119896

by solving 119864119896+1 = argmin119864 119871(119860119896+1 119861119896+1 119864 120582119896 120583119896)(12) 119895 larr 119895 + 1(13) end while(14) 119860119896+1 = 119860119895+1

119896 119861119896+1 = 119861119895+1

119896 119864119896+1 = 119864119895+1

119896

(15) 120583119896+1 = 120588120583119896 120582119896+1 = 120582119896 + 120583119896(119883 minus 119860119896+1 minus 119861119896+1 minus 119864119896+1)(16) 119896 larr 119896 + 1(17) end whileOutput (119860119896 119861119896 119864119896)Algorithm 1 Low-rank decomposition via the inexact ALMmethod

4 LR-LDSs on Finite Grassmannian

Beginning at an initial state 1199091 the expected observationsequence generated by a time-invariant model 119872 = (119860 119862)is obtained as 119864[1199101 1199102 1199101 ] = [119862119879 (119862119860)119879 (1198621198602)119879 ]1198791199091that lies in the column space of the extended observabil-ity matrix given by 119874119879infin = [119862119879 (119862119860)119879 (1198621198602)119879 ]119879 isin119877infintimes119899 LDSs can apply the extended observability subspaceO as descriptor but it is hard to calculate Turaga et al[37 38] approximate the extended observability by tak-ing the 119871-order observability matrix that is 119874(119899 119871) =[119862119879 (119862119860)119879 (119862119860119871minus1)119879]119879 In this way an LDS model canbe alternately identified as an 119899-dimensional subspace of119877119871119898

Given a database of EEG we can estimate LDSs modeland calculate the finite observability matrix that span sub-space as a point on the Riemannian manifold Then basedon low-rank and sparse matrix decomposition observabilitymatrix 119874 can be decomposed into 119863 + 119864 as the followingformulation

min119863119864

119863lowast + 120572 1198641st 119874 = 119863 + 119864 (13)

where 119863 is a low-rank matrix and 119864 is the associated sparseerror

The inexact ALM method can be also used to solvethe optimization problem like Algorithm 1 The output 119863represents low-rank descriptor for LDSs and can be employedfor the classification of EEG trails

5 Classification Algorithm

We extract features by the above LDSs model and get twofeature matrices 119860 and 119862 Unfortunately 119860 and 119862 have dif-ferent modal properties and dimensionalities So they cannot

be represented directly by a feature vector Riemannian geom-etry metric for the space of LDSs is hard to determine andneeds to satisfy several constraints Common classifiers suchas Nearest Neighbors (NNs) Linear Discriminant Analysis(LDA) and Support Vector Machines (SVM) cannot classifyfeatures in matrix form The feature matrix must be mappedto vector space We use Martin Distance [39 40] which isbased on the principal angles between two subspaces of theextended observability matrices as kernel to present distanceof different LDS feature matrix It can be defined as

1198632 (Θ119886 Θ119887) = minus2 119899sum119894=1

log 120582119894 (14)

where Θ119886 = 119862119886 119860119886 Θ119887 = 119862119887 119860119887 120582119894 is the eigenvaluesolving as the following equation

[ 0 O119886119887

(O119886119887)119879 0 ][119909119910] = 120582[O119886119886 0

0 O119887119887][119909

119910]st 119909119879O119886119886119909 = 1 119910119879O119887119887119910 = 1

(15)

where the extended observability matrices O119886 =[119862119879119886 119860119879119886119862119879119886 (119860119879119886)119899119862119879119886 ] O119887 = [119862119879119887 119860119879119887119862119879119887 (119860119879119887)119899119862119879119887 ]O119886119887 = (O119886)119879O119887 Algorithm 2 in Supplementary Materialpresents Martin Distance function programed by MATLAB

We can classify EEG signals by comparing Martin Dis-tance between training data and testing data Nearest twosamples mean that they may be of the same class Sothe forecast label and predict accuracy can be calculatedAlgorithm 3 in Supplementary Material is the classificationmethod of KNN

Considering LR-LDSs methods generating 119860 on FiniteGrassmannian unlike two feature matrices (119860119862) by LDSsEuclidean Distance and Mahalanobis Distance can describe

Computational Intelligence and Neuroscience 5

Table 1 Experimental accuracy results () obtained from each subject in BCI Competition III Dataset IVa for CSP CSSP and our proposedalgorithm (LDS)

Subject aa al av aw ay MeanCSP 7143 9464 6122 8928 7302 77918CSSP 7768 9643 6327 9063 7937 81476LDSs 7857 9643 6429 9018 7976 81846LR+CSP 7768 9643 6378 9018 7976 81566LR-LDSs 7946 9821 6378 9018 8056 82438

the distance between two feature spaces of EEG trails afterLR-LDS They are simple efficient and common for mea-suring distance between two points In order to improve theaccuracy of classification we can also employmetric learningmethods using the label information to learn a new metricor pseudometric such as neighborhood components analysisand large margin nearest neighbor

6 Experimental Evaluation

From the above sections we propose three methods forEEG pattern recognition LDSs LR+CSP and LR-LDSs Twodatasets of motor imagery EEG including BCI CompetitionIII Dataset IVa and BCI Competition IVDatabase 2a are usedto evaluate our three methods compared with other state-of-the-art algorithms such as CSP andCSSP All experiments arecarried out with MATLAB on Intel Core i7 290-GHz CPUwith 8GB RAM

61 BCI Competition III Dataset IVa Dataset IVa is recordedfrom five healthy subjects labeled as ldquoaardquo ldquoalrdquo ldquoavrdquo ldquoawrdquoand ldquoayrdquo with visual cues indicated for 35 s performingright hand and foot motor imagery The EEG signal has 118channels andmarkers that indicate the timepoints of 280 cuesfor each subject band-pass filtered between 005 and 200Hzand downsampled to 100Hz

Before feature extracting for comparison experiment theraw data needs some preprocessing Firstly we extract atime segment located from 05 to 3 s and employ FastICAto remove artifacts arising from eye and muscle movementsSecondly we chose 21 channels over the motor cortex (CP6CP4 CP2 C6 C4 C2 FC6 FC4 FC2 CPZ CZ FCZ CP1CP3 CP5 C1 C3 C5 FC1 FC3 and FC5) that related tomotor imagery

In order to improve the performance of CSP and CSSPwe apply Butterworth filter for EEG signals filtering withina specific frequency band between 8 and 30Hz whichencompasses both the alpha rhythm (8ndash13Hz) and the betarhythm (14ndash30Hz) that relate to motor imagery Then weprogram MATLAB code to get spatial filter parameters andfeature vectors by variance Finally a LDA classifier is used tofind a separating hyperplane of the feature vectors

In LDSs model the value of a hidden parameter describ-ing dimension of Riemannian feature space is closely relatedto final accuracyWe chose the highest accuracy performancesubject ldquoalrdquo and the lowest accuracy performance subjectldquoavrdquo to show the relationship between hidden parameter and

0 5 10 15 200

01

02

03

04

05

06

07

08

09

1

Hidden parameter

Accu

racy

Subject alSubject av

Figure 1 The relationship between hidden parameter and accuracyfor LDSs We choose ldquoalrdquo and ldquoavrdquo which are the highest andlowest accuracy performance respectively to show the relationshipbetween hidden parameter and accuracy

classification accuracy The result of experiment is presentedin Figure 1 which indicates that the accuracy tends to increasewhen the value of hidden parameter augments approximatelyand the highest accuracy happens near hidden parametervalue of 16

Then five methods including CSP CSSP LDSs LR+CSPand LR-LDSs are compared with each other The results arelisted in Table 1

From Table 1 the bold figures present the best perfor-mance results LR-LDSs are in the majority The last rowshows that the mean of LR-LDS classification accuracy ismuch better than CSP and a little higher than the othersComparing with CSP and LR+CSP LR method is very effi-cient and useful to improve accuracy LDSs related methodsoutperform CSP and CSSP due to their both spatial andtemporal features extraction

62 BCI Competition IV Database 2a Database 2a consistsof EEG data from 9 subjects There are four different motorimagery tasks including movement of the left hand righthand both feet and tongue At the beginning of each triala fixation cross and a short acoustic warning tone appear

6 Computational Intelligence and Neuroscience

Table 2 Experimental accuracy results () obtained from each subject in BCI Competition IV Database 2a for CSP CSSP LDSs LR+CSPand LR-LDSs methods

Subject A01E A02E A03E A04E A05E A06E A07E A08E A09E MeanCSP 9027 5313 9167 7118 6111 6424 7986 9132 9236 7724CSSP 9097 5694 9201 7292 6181 6528 7986 9306 9271 7840LDSs 9167 5556 9306 7431 6250 7083 8056 9375 9306 7948LR+CSP 9201 5868 9514 7465 6181 6528 8125 9444 9340 7963LR-LDSs 9201 5902 9444 7535 6319 6944 8125 9514 9306 8032

After two seconds the subject is cued by an arrow pointing toeither the left right down or up that denote the movementof left hand right hand foot or tongue for 125 s Then thesubjects carry out the motor imagery task for about 3 s TheBCI signals are sampled by 25 channels including 22 EEGchannels and 3 EOG channels with 250Hz and bandpass-filtered between 05Hz and 100Hz

Different from dataset IVa database 2a is a multiclas-sification problem However LDA is a two-class classifierTherefore we choose119870-NN algorithms for CSP CSSP LDSsLR+CSP and LR-LDSs methods uniformly Table 2 describesthe classification accuracies results of five above concernedmethods Similar to the results of BCI Competition IIIDataset IVa the mean accuracies of LDSs LR+CSP and LR-LDSs are higher than CSP and CSSP methods FurthermoreLR-LDSs method abstains the best performance

7 Conclusion

CSP has gained much success in the past MI-BCI researchHowever it is reported that CSP is only a spatial filterand sensitive to frequency band It needs prior knowledgeto choose channels and frequency bands Without prepro-cessing the result of classification accuracy may be poorLDSs can overcome these problems by extracting both spatialand temporal features simultaneously to improve the clas-sification performance Furthermore we utilize a low-rankmatrix decomposition approach to get rid of noise and restingstate component in order to improve the robustness of thesystem Then LR+CSP and LR-LDSs methods are proposedComparison experiments are demonstrated on two datasetsThe major contribution of our work is realization of LDSsmodel and LR algorithm forMI-BCI pattern recognitionTheproposed LR-LDSs methods achieve a better performancethan CSP and CSSP

Competing Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This work was supported by the National Natural ScienceFoundation of China (Grant nos 91420302 and 91520201)

References

[1] J R Wolpaw N Birbaumer D J McFarland G Pfurtschellerand T M Vaughan ldquoBrainndashcomputer interfaces for communi-cation and controlrdquo Clinical Neurophysiology vol 113 no 6 pp767ndash791 2002

[2] K K Ang and C Guan ldquoBrain-computer interface in strokerehabilitationrdquo Journal of Computing Science and Engineeringvol 7 no 2 pp 139ndash146 2013

[3] N Sharma V M Pomeroy and J-C Baron ldquoMotor imagery abackdoor to the motor system after strokerdquo Stroke vol 37 no7 pp 1941ndash1952 2006

[4] S Silvoni A Ramos-Murguialday M Cavinato et al ldquoBrain-computer interface in stroke a review of progressrdquoClinical EEGand Neuroscience vol 42 no 4 pp 245ndash252 2011

[5] O Bai P Lin S Vorbach M K Floeter N Hattori and MHallett ldquoA high performance sensorimotor beta rhythm-basedbrain-computer interface associated with human natural motorbehaviorrdquo Journal of Neural Engineering vol 5 no 1 pp 24ndash352008

[6] S Chiappa and S Bengio ldquoHMM and IOHMM modeling ofEEG rhythms for asynchronous bci systemsrdquo in Proceedings ofthe European Symposium on Artificial Neural Networks ESANNBruges Belgium April 2004

[7] J D R Millan and J Mourino ldquoAsynchronous BCI and localneural classifiers an overview of the adaptive brain interfaceprojectrdquo IEEE Transactions on Neural Systems and Rehabilita-tion Engineering vol 11 no 2 pp 159ndash161 2003

[8] W D Penny S J Roberts E A Curran andM J Stokes ldquoEEG-based communication a pattern recognition approachrdquo IEEETransactions on Rehabilitation Engineering vol 8 no 2 pp 214ndash215 2000

[9] G Pfurtscheller C Neuper A Schlogl and K Lugger ldquoSep-arability of EEG signals recorded during right and left motorimagery using adaptive autoregressive parametersrdquo IEEE Trans-actions on Rehabilitation Engineering vol 6 no 3 pp 316ndash3251998

[10] T Wang J Deng and B He ldquoClassifying EEG-based motorimagery tasks by means of time-frequency synthesized spatialpatternsrdquo Clinical Neurophysiology vol 115 no 12 pp 2744ndash2753 2004

[11] D J Krusienski D J McFarland and J R Wolpaw ldquoAnevaluation of autoregressive spectral estimation model orderfor brain-computer interface applicationsrdquo in Proceedings of the28th Annual International Conference of the IEEE Engineering inMedicine and Biology Society (EMBS rsquo06) pp 1323ndash1326 IEEENew York NY USA September 2006

[12] T Demiralp J Yordanova V Kolev A Ademoglu M Devrimand V J Samar ldquoTime-frequency analysis of single-sweep

Computational Intelligence and Neuroscience 7

event-related potentials by means of fast wavelet transformrdquoBrain and Language vol 66 no 1 pp 129ndash145 1999

[13] D Farina O F do Nascimento M-F Lucas and C DoncarlildquoOptimization of wavelets for classification of movement-related cortical potentials generated by variation of force-relatedparametersrdquo Journal of Neuroscience Methods vol 162 no 1-2pp 357ndash363 2007

[14] H Ramoser J Muller-Gerking and G Pfurtscheller ldquoOptimalspatial filtering of single trial EEG during imagined handmovementrdquo IEEE Transactions on Rehabilitation Engineeringvol 8 no 4 pp 441ndash446 2000

[15] M Grosse-Wentrup and M Buss ldquoMulticlass common spatialpatterns and information theoretic feature extractionrdquo IEEETransactions on Biomedical Engineering vol 55 no 8 article no11 pp 1991ndash2000 2008

[16] B Blankertz K-R Muller D J Krusienski et al ldquoThe BCIcompetition III validating alternative approaches to actualBCI problemsrdquo IEEE Transactions on Neural Systems andRehabilitation Engineering vol 14 no 2 pp 153ndash159 2006

[17] httpwwwbbcidecompetitioniiiresults[18] M Tangermann KMuller A Aertsen et al ldquoReview of the BCI

competition IVrdquo Frontiers in Neuroscience vol 6 article no 552012

[19] httpwwwbbcidecompetitionivresults[20] S Lemm B Blankertz G Curio and K-R Muller ldquoSpatio-

spectral filters for improving the classification of single trialEEGrdquo IEEE Transactions on Biomedical Engineering vol 52 no9 pp 1541ndash1548 2005

[21] G Dornhege B Blankertz M Krauledat F Losch G Curioand K-R Muller ldquoCombined optimization of spatial andtemporal filters for improving brain-computer interfacingrdquoIEEE Transactions on Biomedical Engineering vol 53 no 11 pp2274ndash2281 2006

[22] Q Novi C Guan T H Dat and P Xue ldquoSub-band com-mon spatial pattern (SBCSP) for brain-computer interfacerdquo inProceedings of the 3rd International IEEEEMBS Conference onNeural Engineering (CNE rsquo07) pp 204ndash207 IEEE Kohala CoastHawaii USA May 2007

[23] K K Ang Z Y Chin H Zhang and C Guan ldquoFilterBank Common Spatial Pattern (FBCSP) in Brain-ComputerInterfacerdquo in Proceedings of the International Joint Conferenceon Neural Networks (IJCNN rsquo08) pp 2390ndash2397 Hong KongChina June 2008

[24] E A Mousavi J J Maller P B Fitzgerald and B J LithgowldquoWavelet common spatial pattern in asynchronous offline braincomputer interfacesrdquo Biomedical Signal Processing and Controlvol 6 no 2 pp 121ndash128 2011

[25] A S Aghaei M S Mahanta and K N Plataniotis ldquoSeparablecommon spatio-spectral patterns for motor imagery BCI sys-temsrdquo IEEE Transactions on Biomedical Engineering vol 63 no1 pp 15ndash29 2016

[26] E J Candes X Li Y Ma and J Wright ldquoRobust principalcomponent analysisrdquo Journal of the ACM vol 58 no 3 article11 2011

[27] C-F Chen C-P Wei and Y-C F Wang ldquoLow-rank matrixrecovery with structural incoherence for robust face recogni-tionrdquo in Proceedings of the IEEE Conference on Computer Visionand Pattern Recognition (CVPR rsquo12) pp 2618ndash2625 June 2012

[28] R Vidal and P Favaro ldquoLow rank subspace clustering (LRSC)rdquoPattern Recognition Letters vol 43 no 1 pp 47ndash61 2014

[29] P Saisan G Doretto Y NWu and S Soatto ldquoDynamic texturerecognitionrdquo in Proceedings of the IEEE Computer SocietyConference on Computer Vision and Pattern Recognition (CVPRrsquo01) pp II-58ndashII-63 Kauai Hawaii USA December 2001

[30] G Doretto A Chiuso Y N Wu and S Soatto ldquoDynamictexturesrdquo International Journal of Computer Vision vol 51 no2 pp 91ndash109 2003

[31] BMesot andD Barber ldquoSwitching linear dynamical systems fornoise robust speech recognitionrdquo IEEE Transactions on AudioSpeech and Language Processing vol 15 no 6 pp 1850ndash18582007

[32] R Ma H Liu F Sun Q Yang and M Gao ldquoLinear dynamicsystem method for tactile object classificationrdquo Science ChinaInformation Sciences vol 57 no 12 pp 1ndash11 2014

[33] E J Candes X Li Y Ma and J Wright ldquoRobust principalcomponent analysisrdquo Journal of the ACM vol 58 no 3 2011

[34] Q Zhang and B Li ldquoMining discriminative components withlow-rank and sparsity constraints for face recognitionrdquo inProceedings of the 18th ACM SIGKDD International Conferenceon Knowledge Discovery and Data Mining (KDD rsquo12) pp 1469ndash1477 ACM August 2012

[35] F Siyahjani R Almohsen S Sabri and G Doretto ldquoA super-vised low-rank method for learning invariant subspacesrdquo inProceedings of the IEEE International Conference on ComputerVision (ICCV rsquo15) pp 4220ndash4228 Santiago Chile December2015

[36] Z Lin M Chen and Y Ma ldquoThe Augmented LagrangeMultiplier Method for Exact Recovery of Corrupted Low-RankMatricesrdquo 2010

[37] P Turaga A Veeraraghavan A Srivastava and R ChellappaldquoStatistical computations on grassmann and stiefel manifoldsfor image and video-based recognitionrdquo IEEE Transactions onPattern Analysis and Machine Intelligence vol 33 no 11 pp2273ndash2286 2011

[38] M Harandi R Hartley C Shen B Lovell and C Sander-son ldquoExtrinsic methods for coding and dictionary learningon Grassmann manifoldsrdquo International Journal of ComputerVision vol 114 no 2-3 pp 113ndash136 2015

[39] R J Martin ldquoAmetric for ARMA processesrdquo IEEE Transactionson Signal Processing vol 48 no 4 pp 1164ndash1170 2000

[40] A B Chan and N Vasconcelos ldquoClassifying video with kerneldynamic texturesrdquo in Proceedings of the IEEE Computer SocietyConference on Computer Vision and Pattern Recognition (CVPRrsquo07) 6 1 pages June 2007

Submit your manuscripts athttpwwwhindawicom

Computer Games Technology

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Distributed Sensor Networks

International Journal of

Advances in

FuzzySystems

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014

International Journal of

ReconfigurableComputing

Hindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Applied Computational Intelligence and Soft Computing

thinspAdvancesthinspinthinsp

Artificial Intelligence

HindawithinspPublishingthinspCorporationhttpwwwhindawicom Volumethinsp2014

Advances inSoftware EngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Journal of

Computer Networks and Communications

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation

httpwwwhindawicom Volume 2014

Advances in

Multimedia

International Journal of

Biomedical Imaging

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

ArtificialNeural Systems

Advances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Computational Intelligence and Neuroscience

Industrial EngineeringJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Human-ComputerInteraction

Advances in

Computer EngineeringAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Page 4: Research Article Low-Rank Linear Dynamical Systems for

4 Computational Intelligence and Neuroscience

Input Observation matrix 119883 120582 penalty weights 120572 120573(1) 1205820 = 119883119869(119883) 1205830 gt 0 120588 gt 1 119896 = 0(2) while not converged do(3) Lines (4)ndash(12) solve (119860119896 119861119896 119864119896) = argmin119860119861119864 119871(119860 119861 119864 120582119896 120583119896)(4) 119895 = 0 1198600119896 = 119860119896 1198610119896 = 119861119896 1198640119896 = 119864119896(5) while not converged do(6) Line (7)-(8) solves 119860119896+1 = argmin119860 119871(119860 119861119896 119864119896 120582119896 120583119896)(7) (119880 119878 119881) = svd(119883 minus 119861119896+1 minus 119864119896+1 + 120582119896120583119896)(8) 119860119895+1

119896= 119880119878120583119896minus1 [119878]119881119879

(9) Line (10) solves 119861119896+1 = argmin119861 119871(119860119896+1 119861 119864119896 120582119896 120583119896)(10) 119864119895+1

119896= 119878120572120583119896minus1 [119883 minus 119860119895+1

119896minus 119861119895119896+ 120582119896120583119896]

(11) Update 119861119895+1119896

by solving 119864119896+1 = argmin119864 119871(119860119896+1 119861119896+1 119864 120582119896 120583119896)(12) 119895 larr 119895 + 1(13) end while(14) 119860119896+1 = 119860119895+1

119896 119861119896+1 = 119861119895+1

119896 119864119896+1 = 119864119895+1

119896

(15) 120583119896+1 = 120588120583119896 120582119896+1 = 120582119896 + 120583119896(119883 minus 119860119896+1 minus 119861119896+1 minus 119864119896+1)(16) 119896 larr 119896 + 1(17) end whileOutput (119860119896 119861119896 119864119896)Algorithm 1 Low-rank decomposition via the inexact ALMmethod

4 LR-LDSs on Finite Grassmannian

Beginning at an initial state 1199091 the expected observationsequence generated by a time-invariant model 119872 = (119860 119862)is obtained as 119864[1199101 1199102 1199101 ] = [119862119879 (119862119860)119879 (1198621198602)119879 ]1198791199091that lies in the column space of the extended observabil-ity matrix given by 119874119879infin = [119862119879 (119862119860)119879 (1198621198602)119879 ]119879 isin119877infintimes119899 LDSs can apply the extended observability subspaceO as descriptor but it is hard to calculate Turaga et al[37 38] approximate the extended observability by tak-ing the 119871-order observability matrix that is 119874(119899 119871) =[119862119879 (119862119860)119879 (119862119860119871minus1)119879]119879 In this way an LDS model canbe alternately identified as an 119899-dimensional subspace of119877119871119898

Given a database of EEG we can estimate LDSs modeland calculate the finite observability matrix that span sub-space as a point on the Riemannian manifold Then basedon low-rank and sparse matrix decomposition observabilitymatrix 119874 can be decomposed into 119863 + 119864 as the followingformulation

min119863119864

119863lowast + 120572 1198641st 119874 = 119863 + 119864 (13)

where 119863 is a low-rank matrix and 119864 is the associated sparseerror

The inexact ALM method can be also used to solvethe optimization problem like Algorithm 1 The output 119863represents low-rank descriptor for LDSs and can be employedfor the classification of EEG trails

5 Classification Algorithm

We extract features by the above LDSs model and get twofeature matrices 119860 and 119862 Unfortunately 119860 and 119862 have dif-ferent modal properties and dimensionalities So they cannot

be represented directly by a feature vector Riemannian geom-etry metric for the space of LDSs is hard to determine andneeds to satisfy several constraints Common classifiers suchas Nearest Neighbors (NNs) Linear Discriminant Analysis(LDA) and Support Vector Machines (SVM) cannot classifyfeatures in matrix form The feature matrix must be mappedto vector space We use Martin Distance [39 40] which isbased on the principal angles between two subspaces of theextended observability matrices as kernel to present distanceof different LDS feature matrix It can be defined as

1198632 (Θ119886 Θ119887) = minus2 119899sum119894=1

log 120582119894 (14)

where Θ119886 = 119862119886 119860119886 Θ119887 = 119862119887 119860119887 120582119894 is the eigenvaluesolving as the following equation

[ 0 O119886119887

(O119886119887)119879 0 ][119909119910] = 120582[O119886119886 0

0 O119887119887][119909

119910]st 119909119879O119886119886119909 = 1 119910119879O119887119887119910 = 1

(15)

where the extended observability matrices O119886 =[119862119879119886 119860119879119886119862119879119886 (119860119879119886)119899119862119879119886 ] O119887 = [119862119879119887 119860119879119887119862119879119887 (119860119879119887)119899119862119879119887 ]O119886119887 = (O119886)119879O119887 Algorithm 2 in Supplementary Materialpresents Martin Distance function programed by MATLAB

We can classify EEG signals by comparing Martin Dis-tance between training data and testing data Nearest twosamples mean that they may be of the same class Sothe forecast label and predict accuracy can be calculatedAlgorithm 3 in Supplementary Material is the classificationmethod of KNN

Considering LR-LDSs methods generating 119860 on FiniteGrassmannian unlike two feature matrices (119860119862) by LDSsEuclidean Distance and Mahalanobis Distance can describe

Computational Intelligence and Neuroscience 5

Table 1 Experimental accuracy results () obtained from each subject in BCI Competition III Dataset IVa for CSP CSSP and our proposedalgorithm (LDS)

Subject aa al av aw ay MeanCSP 7143 9464 6122 8928 7302 77918CSSP 7768 9643 6327 9063 7937 81476LDSs 7857 9643 6429 9018 7976 81846LR+CSP 7768 9643 6378 9018 7976 81566LR-LDSs 7946 9821 6378 9018 8056 82438

the distance between two feature spaces of EEG trails afterLR-LDS They are simple efficient and common for mea-suring distance between two points In order to improve theaccuracy of classification we can also employmetric learningmethods using the label information to learn a new metricor pseudometric such as neighborhood components analysisand large margin nearest neighbor

6 Experimental Evaluation

From the above sections we propose three methods forEEG pattern recognition LDSs LR+CSP and LR-LDSs Twodatasets of motor imagery EEG including BCI CompetitionIII Dataset IVa and BCI Competition IVDatabase 2a are usedto evaluate our three methods compared with other state-of-the-art algorithms such as CSP andCSSP All experiments arecarried out with MATLAB on Intel Core i7 290-GHz CPUwith 8GB RAM

61 BCI Competition III Dataset IVa Dataset IVa is recordedfrom five healthy subjects labeled as ldquoaardquo ldquoalrdquo ldquoavrdquo ldquoawrdquoand ldquoayrdquo with visual cues indicated for 35 s performingright hand and foot motor imagery The EEG signal has 118channels andmarkers that indicate the timepoints of 280 cuesfor each subject band-pass filtered between 005 and 200Hzand downsampled to 100Hz

Before feature extracting for comparison experiment theraw data needs some preprocessing Firstly we extract atime segment located from 05 to 3 s and employ FastICAto remove artifacts arising from eye and muscle movementsSecondly we chose 21 channels over the motor cortex (CP6CP4 CP2 C6 C4 C2 FC6 FC4 FC2 CPZ CZ FCZ CP1CP3 CP5 C1 C3 C5 FC1 FC3 and FC5) that related tomotor imagery

In order to improve the performance of CSP and CSSPwe apply Butterworth filter for EEG signals filtering withina specific frequency band between 8 and 30Hz whichencompasses both the alpha rhythm (8ndash13Hz) and the betarhythm (14ndash30Hz) that relate to motor imagery Then weprogram MATLAB code to get spatial filter parameters andfeature vectors by variance Finally a LDA classifier is used tofind a separating hyperplane of the feature vectors

In LDSs model the value of a hidden parameter describ-ing dimension of Riemannian feature space is closely relatedto final accuracyWe chose the highest accuracy performancesubject ldquoalrdquo and the lowest accuracy performance subjectldquoavrdquo to show the relationship between hidden parameter and

0 5 10 15 200

01

02

03

04

05

06

07

08

09

1

Hidden parameter

Accu

racy

Subject alSubject av

Figure 1 The relationship between hidden parameter and accuracyfor LDSs We choose ldquoalrdquo and ldquoavrdquo which are the highest andlowest accuracy performance respectively to show the relationshipbetween hidden parameter and accuracy

classification accuracy The result of experiment is presentedin Figure 1 which indicates that the accuracy tends to increasewhen the value of hidden parameter augments approximatelyand the highest accuracy happens near hidden parametervalue of 16

Then five methods including CSP CSSP LDSs LR+CSPand LR-LDSs are compared with each other The results arelisted in Table 1

From Table 1 the bold figures present the best perfor-mance results LR-LDSs are in the majority The last rowshows that the mean of LR-LDS classification accuracy ismuch better than CSP and a little higher than the othersComparing with CSP and LR+CSP LR method is very effi-cient and useful to improve accuracy LDSs related methodsoutperform CSP and CSSP due to their both spatial andtemporal features extraction

62 BCI Competition IV Database 2a Database 2a consistsof EEG data from 9 subjects There are four different motorimagery tasks including movement of the left hand righthand both feet and tongue At the beginning of each triala fixation cross and a short acoustic warning tone appear

6 Computational Intelligence and Neuroscience

Table 2 Experimental accuracy results () obtained from each subject in BCI Competition IV Database 2a for CSP CSSP LDSs LR+CSPand LR-LDSs methods

Subject A01E A02E A03E A04E A05E A06E A07E A08E A09E MeanCSP 9027 5313 9167 7118 6111 6424 7986 9132 9236 7724CSSP 9097 5694 9201 7292 6181 6528 7986 9306 9271 7840LDSs 9167 5556 9306 7431 6250 7083 8056 9375 9306 7948LR+CSP 9201 5868 9514 7465 6181 6528 8125 9444 9340 7963LR-LDSs 9201 5902 9444 7535 6319 6944 8125 9514 9306 8032

After two seconds the subject is cued by an arrow pointing toeither the left right down or up that denote the movementof left hand right hand foot or tongue for 125 s Then thesubjects carry out the motor imagery task for about 3 s TheBCI signals are sampled by 25 channels including 22 EEGchannels and 3 EOG channels with 250Hz and bandpass-filtered between 05Hz and 100Hz

Different from Dataset IVa, Database 2a poses a multiclass classification problem, whereas LDA is a two-class classifier. Therefore, we uniformly choose K-NN algorithms for the CSP, CSSP, LDSs, LR+CSP, and LR-LDSs methods. Table 2 reports the classification accuracies of the five methods. Similar to the results on BCI Competition III Dataset IVa, the mean accuracies of LDSs, LR+CSP, and LR-LDSs are higher than those of the CSP and CSSP methods. Furthermore, the LR-LDSs method obtains the best performance.
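For illustration, assuming the pairwise distances between LDS feature subspaces (for example, Martin distances) have already been computed elsewhere, a K-NN classifier over a precomputed distance matrix could look like the sketch below. The function name and the default k = 1 are assumptions, since the paper does not state the neighbourhood size it used.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def knn_on_distance_matrices(D_train, y_train, D_test, k=1):
    """k-NN classification from precomputed pairwise distances.

    D_train: (n_train, n_train) distances between training trials
             (e.g. Martin/subspace distances between fitted LDS models).
    D_test:  (n_test, n_train) distances from test trials to training trials.
    """
    clf = KNeighborsClassifier(n_neighbors=k, metric="precomputed")
    clf.fit(D_train, np.asarray(y_train))
    return clf.predict(D_test)
```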

7. Conclusion

CSP has gained much success in past MI-BCI research. However, CSP is only a spatial filter and is sensitive to the frequency band; it needs prior knowledge to choose channels and frequency bands, and without preprocessing the classification accuracy may be poor. LDSs can overcome these problems by extracting both spatial and temporal features simultaneously to improve the classification performance. Furthermore, we utilize a low-rank matrix decomposition approach to remove noise and the resting-state component in order to improve the robustness of the system, and on this basis the LR+CSP and LR-LDSs methods are proposed. Comparison experiments are demonstrated on two datasets. The major contribution of our work is the realization of the LDSs model and the LR algorithm for MI-BCI pattern recognition. The proposed LR-LDSs method achieves a better performance than CSP and CSSP.

Competing Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant nos. 91420302 and 91520201).

References

[1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, "Brain–computer interfaces for communication and control," Clinical Neurophysiology, vol. 113, no. 6, pp. 767–791, 2002.

[2] K. K. Ang and C. Guan, "Brain-computer interface in stroke rehabilitation," Journal of Computing Science and Engineering, vol. 7, no. 2, pp. 139–146, 2013.

[3] N. Sharma, V. M. Pomeroy, and J.-C. Baron, "Motor imagery: a backdoor to the motor system after stroke," Stroke, vol. 37, no. 7, pp. 1941–1952, 2006.

[4] S. Silvoni, A. Ramos-Murguialday, M. Cavinato et al., "Brain-computer interface in stroke: a review of progress," Clinical EEG and Neuroscience, vol. 42, no. 4, pp. 245–252, 2011.

[5] O. Bai, P. Lin, S. Vorbach, M. K. Floeter, N. Hattori, and M. Hallett, "A high performance sensorimotor beta rhythm-based brain-computer interface associated with human natural motor behavior," Journal of Neural Engineering, vol. 5, no. 1, pp. 24–35, 2008.

[6] S. Chiappa and S. Bengio, "HMM and IOHMM modeling of EEG rhythms for asynchronous BCI systems," in Proceedings of the European Symposium on Artificial Neural Networks (ESANN), Bruges, Belgium, April 2004.

[7] J. D. R. Millan and J. Mourino, "Asynchronous BCI and local neural classifiers: an overview of the adaptive brain interface project," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 11, no. 2, pp. 159–161, 2003.

[8] W. D. Penny, S. J. Roberts, E. A. Curran, and M. J. Stokes, "EEG-based communication: a pattern recognition approach," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 214–215, 2000.

[9] G. Pfurtscheller, C. Neuper, A. Schlogl, and K. Lugger, "Separability of EEG signals recorded during right and left motor imagery using adaptive autoregressive parameters," IEEE Transactions on Rehabilitation Engineering, vol. 6, no. 3, pp. 316–325, 1998.

[10] T. Wang, J. Deng, and B. He, "Classifying EEG-based motor imagery tasks by means of time-frequency synthesized spatial patterns," Clinical Neurophysiology, vol. 115, no. 12, pp. 2744–2753, 2004.

[11] D. J. Krusienski, D. J. McFarland, and J. R. Wolpaw, "An evaluation of autoregressive spectral estimation model order for brain-computer interface applications," in Proceedings of the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS '06), pp. 1323–1326, IEEE, New York, NY, USA, September 2006.

[12] T. Demiralp, J. Yordanova, V. Kolev, A. Ademoglu, M. Devrim, and V. J. Samar, "Time-frequency analysis of single-sweep event-related potentials by means of fast wavelet transform," Brain and Language, vol. 66, no. 1, pp. 129–145, 1999.

[13] D. Farina, O. F. do Nascimento, M.-F. Lucas, and C. Doncarli, "Optimization of wavelets for classification of movement-related cortical potentials generated by variation of force-related parameters," Journal of Neuroscience Methods, vol. 162, no. 1-2, pp. 357–363, 2007.

[14] H. Ramoser, J. Muller-Gerking, and G. Pfurtscheller, "Optimal spatial filtering of single trial EEG during imagined hand movement," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 4, pp. 441–446, 2000.

[15] M. Grosse-Wentrup and M. Buss, "Multiclass common spatial patterns and information theoretic feature extraction," IEEE Transactions on Biomedical Engineering, vol. 55, no. 8, pp. 1991–2000, 2008.

[16] B. Blankertz, K.-R. Muller, D. J. Krusienski et al., "The BCI competition III: validating alternative approaches to actual BCI problems," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 2, pp. 153–159, 2006.

[17] http://www.bbci.de/competition/iii/results

[18] M. Tangermann, K. Muller, A. Aertsen et al., "Review of the BCI competition IV," Frontiers in Neuroscience, vol. 6, article 55, 2012.

[19] http://www.bbci.de/competition/iv/results

[20] S. Lemm, B. Blankertz, G. Curio, and K.-R. Muller, "Spatio-spectral filters for improving the classification of single trial EEG," IEEE Transactions on Biomedical Engineering, vol. 52, no. 9, pp. 1541–1548, 2005.

[21] G. Dornhege, B. Blankertz, M. Krauledat, F. Losch, G. Curio, and K.-R. Muller, "Combined optimization of spatial and temporal filters for improving brain-computer interfacing," IEEE Transactions on Biomedical Engineering, vol. 53, no. 11, pp. 2274–2281, 2006.

[22] Q. Novi, C. Guan, T. H. Dat, and P. Xue, "Sub-band common spatial pattern (SBCSP) for brain-computer interface," in Proceedings of the 3rd International IEEE/EMBS Conference on Neural Engineering (CNE '07), pp. 204–207, IEEE, Kohala Coast, Hawaii, USA, May 2007.

[23] K. K. Ang, Z. Y. Chin, H. Zhang, and C. Guan, "Filter Bank Common Spatial Pattern (FBCSP) in brain-computer interface," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '08), pp. 2390–2397, Hong Kong, China, June 2008.

[24] E. A. Mousavi, J. J. Maller, P. B. Fitzgerald, and B. J. Lithgow, "Wavelet common spatial pattern in asynchronous offline brain computer interfaces," Biomedical Signal Processing and Control, vol. 6, no. 2, pp. 121–128, 2011.

[25] A. S. Aghaei, M. S. Mahanta, and K. N. Plataniotis, "Separable common spatio-spectral patterns for motor imagery BCI systems," IEEE Transactions on Biomedical Engineering, vol. 63, no. 1, pp. 15–29, 2016.

[26] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis," Journal of the ACM, vol. 58, no. 3, article 11, 2011.

[27] C.-F. Chen, C.-P. Wei, and Y.-C. F. Wang, "Low-rank matrix recovery with structural incoherence for robust face recognition," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR '12), pp. 2618–2625, June 2012.

[28] R. Vidal and P. Favaro, "Low rank subspace clustering (LRSC)," Pattern Recognition Letters, vol. 43, no. 1, pp. 47–61, 2014.

[29] P. Saisan, G. Doretto, Y. N. Wu, and S. Soatto, "Dynamic texture recognition," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '01), pp. II-58–II-63, Kauai, Hawaii, USA, December 2001.

[30] G. Doretto, A. Chiuso, Y. N. Wu, and S. Soatto, "Dynamic textures," International Journal of Computer Vision, vol. 51, no. 2, pp. 91–109, 2003.

[31] B. Mesot and D. Barber, "Switching linear dynamical systems for noise robust speech recognition," IEEE Transactions on Audio, Speech, and Language Processing, vol. 15, no. 6, pp. 1850–1858, 2007.

[32] R. Ma, H. Liu, F. Sun, Q. Yang, and M. Gao, "Linear dynamic system method for tactile object classification," Science China Information Sciences, vol. 57, no. 12, pp. 1–11, 2014.

[33] E. J. Candes, X. Li, Y. Ma, and J. Wright, "Robust principal component analysis," Journal of the ACM, vol. 58, no. 3, 2011.

[34] Q. Zhang and B. Li, "Mining discriminative components with low-rank and sparsity constraints for face recognition," in Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '12), pp. 1469–1477, ACM, August 2012.

[35] F. Siyahjani, R. Almohsen, S. Sabri, and G. Doretto, "A supervised low-rank method for learning invariant subspaces," in Proceedings of the IEEE International Conference on Computer Vision (ICCV '15), pp. 4220–4228, Santiago, Chile, December 2015.

[36] Z. Lin, M. Chen, and Y. Ma, "The augmented Lagrange multiplier method for exact recovery of corrupted low-rank matrices," 2010.

[37] P. Turaga, A. Veeraraghavan, A. Srivastava, and R. Chellappa, "Statistical computations on Grassmann and Stiefel manifolds for image and video-based recognition," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 33, no. 11, pp. 2273–2286, 2011.

[38] M. Harandi, R. Hartley, C. Shen, B. Lovell, and C. Sanderson, "Extrinsic methods for coding and dictionary learning on Grassmann manifolds," International Journal of Computer Vision, vol. 114, no. 2-3, pp. 113–136, 2015.

[39] R. J. Martin, "A metric for ARMA processes," IEEE Transactions on Signal Processing, vol. 48, no. 4, pp. 1164–1170, 2000.

[40] A. B. Chan and N. Vasconcelos, "Classifying video with kernel dynamic textures," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '07), June 2007.
