The Second Seminar


EEG Brain Signal Classification for BCI Applications

http://eegclassifyandrecognize.blogspot.com/

Project Members

Supervisors
• Prof. Dr. Mostafa Gad-Haqq
• Prof. Dr. Tareq Gharib
• Dr. Howida Shded

Assistants
• T.A. Manal Tantawy

Team Members
• Ahmed Khaled Abd El-glil (Information Systems)
• Ahmed Mohamed Ahmed Mahany (Computer Systems)
• Islam Ahmed Hamed Elgarhy (Computer Systems)
• Kamal Ashraf Kamal El-deen (Information Systems)
• Mohammed Saeed Ibrahim (Scientific Computing)

Agenda

Project Overview

• Objectives

• Problem Statement

EEG Signal Overview

System Architecture

Methodology

Past, Present, Future Work

Time Plan

References

Project Overview (Objectives)

Develop a generic EEG classification system that can be used in different brain-computer interface (BCI) applications.

Project Overview (Problem Statement)

The EEG signal is a composite signal: each electrode records a mixture of activity from many underlying brain sources (plus artifacts), which makes reliable classification difficult.

EEG Signal Overview

An electroencephalogram is a measure of the brain's voltage changes as detected from scalp electrodes.

It is measured in microvolts (µV).

EEG Signal Overview (Cont.)

DELTA
• Up to 4 Hz
• Occurs in deep sleep, in childhood, and in serious organic brain diseases

THETA
• 4 – 7 Hz
• Hand movement, idling
• Occurs during childhood

ALPHA
• 8 – 12 Hz
• Relaxed/reflecting
• Closing the eyes

BETA
• 12 – 30 Hz
• Thinking, alert and working

GAMMA
• 30 – 100 Hz
• Short-term memory of stimuli (sound, tactile, ...)

EEG Signal for a Normal Human

[Figures: example EEG traces for hand movement and tongue movement]

EEG Signal for a Normal Human

[Figures: example EEG traces for walking and rotation, and for eye blink]

System Architecture

[Block diagram: EEG signal acquisition → Pre-processing → Feature extraction → Feature selection → Classification → Post-processing]

The whole classification system consists of four parts (a rough code sketch follows this list):

• Preprocessing

• Feature Extraction and Feature Selection (Dimensionality Reduction)

• Classification

• Post processing
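As a rough illustration of how the four parts chain together, here is a minimal Python sketch; the stage functions are hypothetical placeholders rather than the project's implementation, and each stage is elaborated on the following slides.

```python
import numpy as np

# Hypothetical placeholder stages; each is detailed on later slides.
def preprocess(raw):          # filter noise, remove artifacts
    return raw

def extract_features(eeg):    # e.g. PSD bands or ICA components
    return eeg

def select_features(feats):   # dimensionality reduction
    return feats

def classify(feats):          # e.g. a trained neural network
    return np.zeros(feats.shape[0], dtype=int)

def postprocess(labels):      # correct erroneous decisions
    return labels

def run_pipeline(raw_eeg):
    """Chain the acquired EEG through the four parts of the system."""
    return postprocess(classify(select_features(extract_features(preprocess(raw_eeg)))))
```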

System Architecture (Cont.)

Methodology

Independent Component Analysis (ICA) for feature extraction.

Convolutional Neural Network (CNN) for classification.

Preprocessing

Filter out noise and remove artifacts (a filtering sketch follows below):

• Removing eye blinks and muscular movements
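As one possible implementation of this step, here is a minimal sketch using a SciPy band-pass filter; the sampling rate, band edges, and filter order are illustrative assumptions rather than values taken from the project.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_filter(eeg, fs=256.0, low=1.0, high=40.0, order=4):
    """Band-pass filter one EEG channel to suppress slow drift and high-frequency noise.

    eeg : 1-D array of samples from a single electrode
    fs  : sampling rate in Hz (assumed value)
    """
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt runs the filter forward and backward, so the phase is not distorted
    return filtfilt(b, a, eeg)

# Eye blinks and muscular artifacts are usually handled separately,
# e.g. by discarding the ICA components that capture them (see the ICA slides).
```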

Feature Extraction & Selection

Power Spectral Density

• Temporal information from a window of data is extracted and then processed using a static classifier (see the PSD sketch after this list).

Spatial information alone is indeed powerful enough to produce state-of-the-art performance.

• Independent Component Analysis (explained next)
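A minimal sketch of PSD-based band-power features, using SciPy's Welch estimator and the band limits from the earlier slide; the sampling rate and window length are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power_features(eeg, fs=256.0):
    """Average PSD power in the classic EEG bands for one channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2-second segments
    bands = {"delta": (0.5, 4), "theta": (4, 7), "alpha": (8, 12),
             "beta": (12, 30), "gamma": (30, 100)}
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands.values()])
```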

Feature Extraction & Selection

The goal of feature subspace projections is to improve classifier robustness by reducing data dimensionality in order to facilitate better generalization, as well as reducing the learning and operating complexity of the classifiers.

Independent Component Analysis

Definition

• Mixed signals in matrix notation:

$x_i = \sum_{j=1}^{n} a_{ij}\, s_j \qquad \Longleftrightarrow \qquad x = A\, s$

where $s$ ~ independent (source) signals, $A$ ~ multiplexing (mixing) matrix, and $x$ ~ multiplexed (mixed) signals.

Find W using the ICA Algorithm

ICA Block Diagram (2 Signals)

$\hat{s} = W x = W (A s) = (W A)\, s = I\, s = s$

[Block diagram: source Signal #1 and Signal #2 are mixed through the coefficients $a_{11}, a_{12}, a_{21}, a_{22}$ into Multiplexed Signal #1 and Multiplexed Signal #2]
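A minimal sketch of this unmixing in code, using scikit-learn's FastICA (one standard implementation of ICA); the synthetic two-signal example and all parameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)

# Two independent source signals s (unknown in practice)
s = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]

# Mix them with an unknown mixing matrix A:  x = A s
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
x = s @ A.T

# ICA estimates the unmixing matrix W so that  s_hat = W x  approximates  s
ica = FastICA(n_components=2, random_state=0)
s_hat = ica.fit_transform(x)   # recovered sources (up to permutation and scaling)
W = ica.components_            # estimated unmixing matrix
```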

Independent Component Analysis

[Block diagram: Sensor #1–#4 feed into ICA, which outputs Feature #1–#4]

Independent Component Analysis (Cont.)

Classification

Neural Networks

• An information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information.

• It is composed of a large number of interconnected neurons working together to solve a specific problem.

How Does the Human Brain Learn?

A neuron collects signals from other neurons through its dendrites.

The neuron sends out spikes of electrical activity through a long axon.

A synapse converts the activity from the axon into electrical effects that excite or inhibit the connected neurons.

Neural Networks (Cont.)

The general firing condition of a neuron can be formalized as

X1W1 + X2W2 + X3W3 + ... > T

Xp : input pattern

W : weights

T : firing threshold
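A minimal sketch of this thresholded weighted sum for a single artificial neuron; the input, weight, and threshold values are illustrative.

```python
import numpy as np

def neuron_fires(x, w, threshold):
    """Return True if the weighted sum x1*w1 + x2*w2 + ... exceeds the threshold T."""
    return float(np.dot(x, w)) > threshold

x = np.array([0.2, 0.7, 0.1])    # input pattern Xp
w = np.array([0.5, 0.9, -0.3])   # weights W
print(neuron_fires(x, w, threshold=0.5))  # True: 0.10 + 0.63 - 0.03 = 0.70 > 0.5
```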

Neural Networks (Cont.)

Training over a set of patterns:

Tp : desired output for pattern p

An error function measures how far the network output is from the desired output.

The weights are then adjusted to find optimal or near-optimal weights.
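One standard formulation of this training step (an assumption here; the presentation may use a different error function) is the squared error minimized by gradient descent:

```latex
% Squared error over the training set (T_p: desired output, O_p: actual network output)
E(\mathbf{w}) = \frac{1}{2} \sum_{p} \left( T_p - O_p \right)^2

% Gradient-descent update toward (near-)optimal weights, with learning rate \eta
\Delta w_i = -\eta \, \frac{\partial E}{\partial w_i}, \qquad w_i \leftarrow w_i + \Delta w_i
```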

Neural Networks (Multi-Layer Perceptron)

Multi-layer Perceptron

• A set (P) of several perceptrons, each of which classifies the input data differently, can be combined via an additional perceptron which receives all outputs from (P) as input.

Neural Networks (Multi-Layer Perceptron)

The same error function is used for calculating the error.

Backpropagation of that error is used for calculating optimal or near-optimal weights.

The output of neuron j in the output layer
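The usual expression for this output, assuming a sigmoid activation function (the specific activation is an assumption here), is the squashed weighted sum of the previous layer's outputs:

```latex
o_j = f\!\left( \sum_{i} w_{ij}\, o_i \right), \qquad f(z) = \frac{1}{1 + e^{-z}}
```

where the sum runs over the neurons i of the preceding layer.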

Neural Networks (Multi-Layer Perceptron)

Multi-layer Perceptron problems:

• The number of trainable parameters becomes extremely large.

• It offers little or no invariance to shifting, scaling, and other forms of distortion.

• The topology of the input data is completely ignored, yielding similar training results for all permutations of the input data.

These problems are addressed by convolutional neural network concepts.

Convolutional Neural Networks

The input pattern is processed through a set of sub-sampling layers.

Each sub-sampling layer consists of a set of feature maps that have a slightly smaller resolution than the input pattern.

Convolutional Neural Networks (Cont.)

It is trained through backpropagation; all neurons in one feature map share the same weights.

This weight sharing leads to an implicit reduction of the generalization gap.

The sub-sampling layers have only one trainable weight each, so the number of free parameters in them is lower than in conventional layers.

Training requires less computational effort than the training of multi-layer perceptrons.
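A minimal sketch of a convolutional network of this kind for EEG windows, written with PyTorch purely for illustration; the input shape, channel counts, kernel sizes, and number of classes are assumptions, not the project's actual design.

```python
import torch
import torch.nn as nn

class SmallEEGConvNet(nn.Module):
    """Convolution + sub-sampling layers with shared weights, then a classifier."""
    def __init__(self, n_channels=4, n_samples=256, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            # Convolution over time: every neuron in a feature map shares the same kernel
            nn.Conv1d(n_channels, 8, kernel_size=5),
            nn.ReLU(),
            nn.AvgPool1d(2),              # sub-sampling: feature maps at reduced resolution
            nn.Conv1d(8, 16, kernel_size=5),
            nn.ReLU(),
            nn.AvgPool1d(2),
        )
        with torch.no_grad():
            n_feat = self.features(torch.zeros(1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_feat, n_classes)

    def forward(self, x):                 # x: (batch, n_channels, n_samples)
        f = self.features(x)
        return self.classifier(f.flatten(1))

# Trained with backpropagation like an MLP, but with far fewer free parameters
# thanks to weight sharing and sub-sampling.
model = SmallEEGConvNet()
out = model(torch.randn(3, 4, 256))      # -> (3, 2) class scores
```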

Post processing

The output of the classification system may contain erroneous decisions.
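One common way to correct such erroneous decisions is to smooth the predicted label sequence over time; the sliding-window majority vote below is an illustrative assumption, not the method specified by the project.

```python
import numpy as np
from collections import Counter

def majority_vote_smoothing(labels, window=5):
    """Replace each predicted label with the most frequent label in a window around it."""
    labels = np.asarray(labels)
    half = window // 2
    smoothed = labels.copy()
    for i in range(len(labels)):
        lo, hi = max(0, i - half), min(len(labels), i + half + 1)
        smoothed[i] = Counter(labels[lo:hi].tolist()).most_common(1)[0][0]
    return smoothed

print(majority_vote_smoothing([0, 0, 1, 0, 0, 1, 1, 1, 0, 1]))  # isolated flips are corrected
```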

Past, Present and Future

Past
• Research about the EEG signal

Present
• Study feature extraction and classification algorithms

Future
• Implement the classification system

Time Plan

References

1. David Bouchain, "Character Recognition Using Convolutional Neural Networks", Seminar Statistical Learning Theory, Institute for Neural Information Processing, University of Ulm, Germany, 2006/2007.

2. Christos Stergiou and Dimitrios Siganos, "Neural Networks".

3. Tian Lan, Deniz Erdogmus, Andre Adami, and Michael Pavel, "Feature Selection by Independent Component Analysis and Mutual Information Maximization in EEG Signal Classification", Department of Biomedical Engineering, OGI School of Science and Engineering, Oregon Health & Science University, Beaverton, Oregon 97006, USA.

4. Rave Harpaz, "Independent Component Analysis: An Introduction", Pattern Recognition Laboratory, The Graduate Center, City University of New York, 365 Fifth Avenue, New York, NY 10016, USA, Nov 15, 2005.

Thanks