
TENSOR DECOMPOSITION USING TUCKER AND PARAFAC IN EEG SIGNAL

    SYNOPSIS


    Problem Statement

Increasingly large amounts of multidimensional data are being generated on a daily basis in many applications. This leads to a strong demand for learning algorithms that can extract useful information from these massive data sets. Feature extraction and selection are key factors in model reduction, classification and pattern recognition problems. This is especially important for input data of large dimensions, such as brain recordings or multiview images, where appropriate feature extraction is a prerequisite to classification. To ensure that the reduced data set retains maximum information about the input data, we propose algorithms for feature extraction and classification based on orthogonal or nonnegative tensor (multi-way array) decompositions.

    Introduction

Electrical impulses generated by nerve firings in the brain diffuse through the head and can be measured by electrodes placed on the scalp. The resulting record, known as the electroencephalogram (EEG), was first measured in humans by Hans Berger in 1929. The EEG is an important clinical tool for diagnosing, monitoring and managing neurological disorders.

The analysis of EEG data and the extraction of information from it is a difficult problem, exacerbated by the introduction of extraneous biologically and externally generated signals into the recording. EEG data are also used for the development of brain-computer interfaces (BCIs). As a record of neuronal electrical activity, the EEG is a good indicator of abnormality in the central nervous system.

Modern applications such as those in neuroscience, text mining, and pattern recognition generate massive amounts of multimodal data of high dimensionality. Tensors (i.e., multi-way arrays) provide a natural representation for such data, and tensor decompositions and factorizations are emerging as promising tools for exploratory analysis of multidimensional data. Tensor decompositions, especially the TUCKER and PARAFAC models, are important tools for feature extraction and classification problems because they capture multi-linear and multi-aspect structures in large-scale higher-order data sets. They have found applications in diverse disciplines, especially feature extraction, feature selection, classification and multi-way clustering [13]. Supervised and unsupervised dimensionality reduction and feature extraction methods with tensor representation have recently attracted great interest [14]. Although many real-world data (e.g., brain signals, images, videos) are conveniently represented by tensors, traditional algorithms such as PCA, LDA and ICA treat the data as matrices or vectors [5-8] and are therefore often not efficient. Since the curse of high dimensionality is a major limitation of many practical methods, dimensionality reduction is a prerequisite to practical applications in classification, data mining, vision and pattern recognition.


In classification and pattern recognition problems there are three main stages: feature extraction, feature selection, and classifier design. The key issue is to extract and select statistically significant (dominant, leading) features that allow us to discriminate between different classes or clusters. Classifier design involves choosing an appropriate method such as Fisher discriminant analysis, the k-nearest neighbour (KNN) rule, or support vector machines (SVM). In a nutshell, the classifier computes distances or similarities between the features extracted from the training data in order to assign the test data to a specific class.
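As an illustration of the classifier stage, the following is a minimal nearest-neighbour sketch in base MATLAB; the feature matrices and labels are hypothetical placeholders for features extracted from core tensors, not data from this work.

    % Minimal 1-NN classifier sketch. Each row of trainF/testF is one
    % feature vector (hypothetical stand-ins for core-tensor features);
    % trainLabels holds the class of each training epoch.
    trainF = rand(40, 10);  trainLabels = [ones(20, 1); 2 * ones(20, 1)];
    testF  = rand(5, 10);
    predicted = zeros(size(testF, 1), 1);
    for i = 1:size(testF, 1)
        % squared Euclidean distance from test sample i to all training samples
        d = sum((trainF - repmat(testF(i, :), size(trainF, 1), 1)).^2, 2);
        [dmin, j] = min(d);              % nearest training sample
        predicted(i) = trainLabels(j);   % assign its class label
    end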

In this paper we propose a suite of algorithms for feature extraction and classification that is especially suitable for large-scale problems. In our approach, we first decompose the multi-way data under the TUCKER decomposition, with or without constraints, to retrieve basis factors and significant features from the core tensors.

Existing System

A diagram of the existing feature dimensionality reduction approach for EEG signal classification is shown in Fig. 1. In this approach, L-second epochs from the EEG signals are decomposed by Gabor functions and represented in the spatial, spectral and temporal domains as a third-order tensor. A general tensor discriminant analysis (GTDA) algorithm is then used to extract a multi-way discriminative subspace from the third-order tensors and form the optimal projection vector.

Figure 1. Block diagram of the existing feature dimensionality reduction approach: EEG signals → divide each class into L-second epochs → feature extraction based on Gabor functions → dimensionality reduction based on general tensor discriminant analysis.
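The Gabor stage amounts to a Gaussian-windowed short-time Fourier transform of each channel. A minimal MATLAB sketch of building the channel x frequency x time tensor for one epoch might look as follows; the sampling rate, window length and epoch data are illustrative assumptions, and spectrogram/gausswin come from the Signal Processing Toolbox.

    % Build a (channel x frequency x time) tensor from one EEG epoch via a
    % Gaussian-windowed STFT, used here as a stand-in for the Gabor transform.
    fs = 256;                           % sampling rate in Hz (assumed)
    epoch = randn(14, 4 * fs);          % 14 channels, 4-second epoch (dummy data)
    win = gausswin(128);                % Gaussian analysis window
    for ch = 1:size(epoch, 1)
        [S, freqs, times] = spectrogram(epoch(ch, :), win, 64, 128, fs);
        X(ch, :, :) = reshape(abs(S), [1 size(S)]);   % magnitude spectrum
    end
    % X is the third-order tensor passed on to the reduction stage.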


Proposed System

In this paper we propose a suite of algorithms for feature extraction and classification that is especially suitable for large-scale problems. In our approach, we first decompose the multi-way data under the TUCKER decomposition, with or without constraints, to retrieve basis factors and significant features from the core tensors.

Figure 2. Dimensionality reduction.

Figure 3. Block diagram of the proposed feature dimensionality reduction approach: EEG signals → divide each class into L-second epochs → feature extraction based on Gabor functions → dimensionality reduction using TUCKER and PARAFAC, compared with the general method.
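A minimal sketch of this reduction stage using the Tensor Toolbox named in the software requirements is given below; tensor, tucker_als and cp_als are Toolbox routines, while the input tensor X and the ranks are illustrative assumptions, not values from this work.

    % Reduction step with the MATLAB Tensor Toolbox. X is the
    % (channel x frequency x time) epoch tensor; ranks are assumed values.
    T = tensor(X);                      % wrap the array as a Toolbox tensor
    Mtucker  = tucker_als(T, [4 8 8]);  % Tucker: core tensor + factor matrices
    Mparafac = cp_als(T, 8);            % PARAFAC: sum of 8 rank-one tensors
    G = double(Mtucker.core);           % core tensor as an ordinary array
    features = G(:)';                   % vectorised core = reduced feature vector

The feature vectors of all training epochs can then be fed to the classifier sketched earlier, and the resulting accuracy compared against the GTDA-based pipeline of Fig. 1.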


    PARAFAC

PARAFAC is one of several decomposition methods for multi-way data. It is a generalization of PCA to higher-order arrays, although some characteristics of the method are quite different from the ordinary two-way case.
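For a three-way tensor, the PARAFAC model can be written as a sum of R rank-one terms:

$$\mathcal{X} \approx \sum_{r=1}^{R} \lambda_r \, \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r$$

where $\circ$ denotes the vector outer product, $\mathbf{a}_r$, $\mathbf{b}_r$, $\mathbf{c}_r$ are the columns of the three factor matrices, and $\lambda_r$ are scaling weights (see [2], [10]).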

    TUCKER

The TUCKER decomposition, here for the three-way case, is a basic model for high-dimensional tensors which makes it possible to perform model reduction and feature extraction effectively. In contrast with PARAFAC, which decomposes a tensor into rank-one tensors, the TUCKER decomposition is a form of higher-order principal component analysis that decomposes a tensor into a core tensor multiplied by a matrix along each mode.
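For the three-way case this reads:

$$\mathcal{X} \approx \mathcal{G} \times_1 A \times_2 B \times_3 C$$

where $\mathcal{G}$ is the core tensor, $A$, $B$ and $C$ are the factor matrices, and $\times_n$ denotes the mode-$n$ tensor-matrix product [2], [9]. PARAFAC is recovered as the special case in which the core tensor is superdiagonal and all mode ranks are equal.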

    Hardware and Software Requirements

    Hardware Requirements

1. Intel processor with a minimum of 20 GB hard disk capacity.

Operating System

1. Windows XP or Windows 7

Software Requirements

1. Matlab 7
2. Tensor Toolkit


References

[1] A. Cichocki, R. Zdunek, A.-H. Phan, and S. Amari, Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-Way Data Analysis and Blind Source Separation, Wiley, Chichester, 2009.

[2] T. Kolda and B. Bader, "Tensor decompositions and applications," SIAM Review, vol. 51, no. 3, pp. 455–500, September 2009.

[3] J. Sun, D. Tao, S. Papadimitriou, P.S. Yu, and C. Faloutsos, "Incremental tensor analysis: theory and applications," TKDD, vol. 2, no. 3, 2008.

[4] D. Tao, X. Li, X. Wu, and S.J. Maybank, "General tensor discriminant analysis and Gabor features for gait recognition," IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 10, pp. 1700–1715, 2007.

[5] A.K. Jain, R.P. Duin, and J. Mao, "Statistical pattern recognition: a review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, pp. 4–37, 2000.

[6] J.H. Friedman, "Exploratory projection pursuit," Journal of the American Statistical Association, vol. 82, no. 397, pp. 249–266, 1987, http://www.jstor.org/stable/2289161.

[7] J.H. Friedman, "Regularized discriminant analysis," Journal of the American Statistical Association, vol. 84, no. 405, pp. 165–175, 1989, http://dx.doi.org/10.2307/2289860.

[8] A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis, John Wiley & Sons, New York, 2001.

[9] L. Tucker, "Some mathematical notes on three-mode factor analysis," Psychometrika, vol. 31, pp. 279–311, 1966.

[10] R. Harshman, "Foundations of the PARAFAC procedure: models and conditions for an explanatory multimodal factor analysis," UCLA Working Papers in Phonetics, vol. 16, pp. 1–84, 1970.

[11] M. Mørup, L. Hansen, and S. Arnfred, "Algorithms for sparse nonnegative Tucker decompositions," Neural Computation, vol. 20, pp. 2112–2131, 2008, http://www2.imm.dtu.dk/pubdb/.


[12] Y.-D. Kim and S. Choi, "Nonnegative Tucker decomposition," Proc. of Conf. Computer Vision and Pattern Recognition (CVPR-2007), Minneapolis, June 2007.

[13] L. De Lathauwer, B. De Moor, and J. Vandewalle, "A multilinear singular value decomposition," SIAM Journal on Matrix Analysis and Applications, vol. 21, pp. 1253–1278, 2000.

[14] L. De Lathauwer, B. De Moor, and J. Vandewalle, "On the best rank-1 and rank-(R1, R2, ..., RN) approximation of higher-order tensors," SIAM Journal on Matrix Analysis and Applications, vol. 21, no. 4, pp. 1324–1342, 2000.

[15] L. Tucker, "The extension of factor analysis to three-dimensional matrices," in Contributions to Mathematical Psychology, eds. H. Gulliksen and N. Frederiksen, pp. 110–127, Holt, Rinehart and Winston, New York, 1964.

[16] R. Harshman, "PARAFAC2: mathematical and technical notes," UCLA Working Papers in Phonetics, vol. 22, pp. 30–44, 1972.

[17] B. Bader, R. Harshman, and T. Kolda, "Temporal analysis of social networks using three-way DEDICOM," Sandia National Laboratories, Albuquerque, NM and Livermore, CA, SAND2006-2161, April 2006, http://www.prod.sandia.gov/cgi-bin/techlib/access-control.pl/2006/062161.pdf.

[18] J.-F. Cardoso and A. Souloumiac, "Jacobi angles for simultaneous diagonalization," SIAM Journal on Matrix Analysis and Applications, vol. 17, no. 1, pp. 161–164, January 1996.

[19] L. Badea, "Extracting gene expression profiles common to colon and pancreatic adenocarcinoma using simultaneous nonnegative matrix factorization," Proceedings of the Pacific Symposium on Biocomputing PSB-2008, pp. 267–278, World Scientific, 2008.

[20] L. De Lathauwer and J. Vandewalle, "Dimensionality reduction in higher-order signal processing and rank-(R1, R2, ..., RN) reduction in multilinear algebra," Linear Algebra and its Applications, vol. 391, pp. 31–55, November 2004.