Distributed Representative Reading Group

Uploaded by: burton

Posted on: 15-Feb-2016



TRANSCRIPT

Page 1: Distributed Representative Reading Group

Distributed Representative Reading Group

Page 2: Distributed Representative Reading Group

Research Highlights

1 Support vector machines can robustly decode semantic information from EEG and MEG

2 Multivariate decoding techniques allow for detection of subtle, but distributed, effects

3 Semantic categories and individual words have distributed spatiotemporal representations

4 Representations are consistent between subjects and stimulus modalities

5 A scalable hierarchical tree decoder further improves decoding performance

Page 3: Distributed Representative Reading Group

Why do reported results vary from study to study?

Partly because of the statistical analysis of high-dimensional neuroimaging data with traditional univariate techniques, which:

require correction for multiple comparisons to control false positives

are insensitive to subtle, but widespread, effects within the brain

yield differing results depending on the specific responses elicited by the particular experiment performed
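The multiple-comparisons point can be illustrated with a quick simulation (a generic Bonferroni sketch; the test counts and threshold are illustrative, not taken from the paper):

```python
# Sketch: many univariate tests produce false positives by chance alone;
# a Bonferroni-corrected threshold controls them. Counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_tests = 10_000   # e.g. channels x time points
alpha = 0.05

# p-values under the null hypothesis: no real effect anywhere.
p_values = rng.uniform(0.0, 1.0, size=n_tests)

uncorrected_hits = int(np.sum(p_values < alpha))           # ~500 false positives
bonferroni_hits = int(np.sum(p_values < alpha / n_tests))  # almost always 0
print(uncorrected_hits, bonferroni_hits)
```

With 10,000 tests and no true effects, roughly 500 voxels/timepoints pass the uncorrected threshold; the corrected threshold removes essentially all of them, at the cost of sensitivity to weak effects.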

Page 4: Distributed Representative Reading Group

Why choose SVM?

Robust to high-dimensional data

attempts to find a separating boundary that maximizes the margin between the classes

reduces over-fitting and allows for good generalization when classifying novel data

allows for a multivariate examination of the spatiotemporal dynamics
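These properties can be shown on synthetic high-dimensional "trials" with a small shift spread over every feature, a stand-in for a subtle but distributed effect (all sizes and numbers below are made up for the sketch, not the paper's data):

```python
# Sketch: linear SVM decoding a small, distributed class difference in
# high-dimensional synthetic data. All dimensions are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(42)
n_trials, n_features = 120, 600      # few trials, many features, as in EEG/MEG

X = rng.normal(size=(n_trials, n_features))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1] += 0.15                    # tiny per-feature shift for class 1

# Max-margin linear classifier; cross-validation checks generalization.
scores = cross_val_score(LinearSVC(C=1.0, max_iter=10_000), X, y, cv=5)
print(scores.mean())                 # well above the 0.5 chance level
```

No single feature separates the classes here, but the margin-maximizing hyperplane pools the tiny shifts across all 600 features into reliable decoding.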

Page 5: Distributed Representative Reading Group

Why hierarchical tree decoding?

Although a single multiclass decoder can distinguish individual word representations well, it does not directly incorporate prior knowledge about semantic classes or about the features that best discriminate these categories.

To combine information from the classifier models generated to decode semantic category and individual words, a hierarchical tree framework that attempts to decode word properties sequentially was implemented.

Given an unknown word, the tree decoder

First, it classifies the word as either a large (target) or small (nontarget) object

Second, as a living or nonliving object

Finally, as an individual word within the predicted semantic category

Advantages:

allows the appropriate features to be used to decode each word property, narrowing the search space before individual words are decoded.

such a tree construct is easily scalable and could allow for the eventual decoding of larger libraries of words.
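A minimal sketch of such a tree decoder, assuming one linear SVM per node and a single shared feature matrix (the paper uses different feature sets at each level; the class, label mappings, and data below are illustrative):

```python
# Hypothetical sketch of the three-level tree decoder: one linear SVM
# per node, with a branch-specific word classifier at the leaves.
import numpy as np
from sklearn.svm import LinearSVC

class TreeDecoder:
    """Decode large/small, then living/nonliving, then the word itself."""

    def fit(self, X, size, living, word):
        self.size_clf = LinearSVC(max_iter=10_000).fit(X, size)
        self.living_clf = LinearSVC(max_iter=10_000).fit(X, living)
        self.word_clfs = {}  # one word classifier per (size, living) branch
        for s in np.unique(size):
            for liv in np.unique(living):
                mask = (size == s) & (living == liv)
                self.word_clfs[(s, liv)] = LinearSVC(max_iter=10_000).fit(
                    X[mask], word[mask])
        return self

    def predict(self, X):
        preds = []
        for x in X:
            x = x.reshape(1, -1)
            s = self.size_clf.predict(x)[0]      # level 1: large vs small
            liv = self.living_clf.predict(x)[0]  # level 2: living vs nonliving
            preds.append(self.word_clfs[(s, liv)].predict(x)[0])  # level 3
        return np.array(preds)

# Demo on synthetic clusters: 8 "words", 2 per (size, living) branch.
rng = np.random.default_rng(0)
words = np.arange(8)
size_of = words // 4          # illustrative mapping of word -> size label
living_of = (words // 2) % 2  # illustrative mapping of word -> living label
centers = rng.normal(scale=3.0, size=(8, 20))
idx = rng.integers(0, 8, size=400)
X = centers[idx] + rng.normal(scale=0.3, size=(400, 20))
dec = TreeDecoder().fit(X, size_of[idx], living_of[idx], idx)
train_acc = (dec.predict(X) == idx).mean()
print(train_acc)
```

Each leaf classifier only has to separate the words within one branch, which is how the tree narrows the search space; adding more words only adds leaves, which is the scalability argument.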

Page 6: Distributed Representative Reading Group

Experiment

visual (SV) and auditory version (SA) language tasks

Task: Subjects were instructed to press a button if the presented word represented an object larger than 1 foot in any dimension

Stimuli: objects larger than 1 foot vs. objects smaller than 1 foot in a 1:1 ratio

living objects (animals and animal parts) vs. nonliving objects (man-made items) in a 1:1 ratio

Stimulus presentation:

Half of the trials presented a novel word that was shown only once during the experiment,

while the other half of the trials presented 1 of 10 repeated words (each shown multiple times during the experiment).

Page 7: Distributed Representative Reading Group

Decoding framework

Features: average amplitude in six 50-ms time windows was sampled from every channel and concatenated into a large feature vector for each trial

Decoding living versus nonliving: windows at 200, 300, 400, 500, 600, and 700 ms poststimulus

Decoding individual words: windows at 250, 300, 350, 400, 450, and 500 ms poststimulus
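A sketch of this feature extraction, assuming a sampling rate and treating the listed times as window centers (both are assumptions for the illustration; the slide does not specify them):

```python
# Sketch of the windowed-amplitude features: average each channel over
# six 50-ms windows and concatenate into one vector per trial.
# The sampling rate and window centering are assumed for illustration.
import numpy as np

def window_features(trial, sfreq, centers_ms, width_ms=50.0):
    """trial: (n_channels, n_samples), with sample 0 at stimulus onset.
    Returns mean amplitude per channel in each window, concatenated."""
    feats = []
    for c in centers_ms:
        start = int((c - width_ms / 2) / 1000.0 * sfreq)
        stop = int((c + width_ms / 2) / 1000.0 * sfreq)
        feats.append(trial[:, start:stop].mean(axis=1))
    return np.concatenate(feats)

# Example: 64 channels, 1 s of data at 1000 Hz, the living/nonliving windows.
trial = np.random.default_rng(1).normal(size=(64, 1000))
f = window_features(trial, sfreq=1000.0,
                    centers_ms=[200, 300, 400, 500, 600, 700])
print(f.shape)  # (384,) = 64 channels x 6 windows
```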

Page 8: Distributed Representative Reading Group

Decoding accuracy

Compared to a naive Bayes classifier, the SVM is better able to handle high-dimensional data

Page 9: Distributed Representative Reading Group

SVM weights show important times and locations for decoding

Page 10: Distributed Representative Reading Group

Decoding is not based on low-level stimulus properties

It is possible that the generated classifiers are utilizing neural activity related to low-level visual or auditory stimulus properties when decoding individual words

To test this, a shuffling analysis based on stimulus properties was performed to evaluate this potential confounding factor.
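A generic label-permutation test (a stand-in for, not the paper's exact, stimulus-property shuffle) illustrates the logic of this kind of control: shuffling destroys the label-linked structure, so decoding should fall to chance unless the classifier is exploiting something else:

```python
# Sketch of a permutation control: intact labels decode well; shuffled
# labels drive accuracy to chance. Data sizes are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 40))
y = np.repeat([0, 1], 50)
X[y == 1] += 0.5                       # a real, decodable class effect

true_acc = cross_val_score(LinearSVC(max_iter=10_000), X, y, cv=5).mean()

null_accs = []                         # accuracies after label shuffling
for _ in range(20):
    y_perm = rng.permutation(y)
    null_accs.append(
        cross_val_score(LinearSVC(max_iter=10_000), X, y_perm, cv=5).mean())
print(true_acc, float(np.mean(null_accs)))
```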

Page 11: Distributed Representative Reading Group

Inter-modality and inter-subject decoding show shared neural representations

Inter-modality: the classifier was trained on data from one modality and tested on data from the other modality

Inter-subject: the classifier was trained on data from all but one subject within a single modality, and the held-out subject's data was used as the test set
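Both schemes can be sketched on synthetic stand-in data in which a single class-difference pattern is shared across "subjects" and "modalities" (the structure below is illustrative, not the paper's data):

```python
# Sketch: cross-modality and leave-one-subject-out decoding succeed when
# the class effect is shared. All sizes and labels are illustrative.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(7)
n_sub, n_trial, n_feat = 4, 60, 50
effect = rng.normal(size=n_feat)       # shared class-difference pattern

def make_data(n_subjects):
    X, y, sub = [], [], []
    for s in range(n_subjects):
        labels = rng.integers(0, 2, n_trial)
        data = (rng.normal(size=(n_trial, n_feat))
                + 0.4 * np.outer(2 * labels - 1, effect))
        X.append(data); y.append(labels); sub.append(np.full(n_trial, s))
    return np.vstack(X), np.concatenate(y), np.concatenate(sub)

# Inter-modality: train on "visual" data, test on "auditory" data.
Xv, yv, _ = make_data(n_sub)
Xa, ya, _ = make_data(n_sub)
cross_mod = LinearSVC(max_iter=10_000).fit(Xv, yv).score(Xa, ya)

# Inter-subject: train on all but one subject, test on the held-out one.
X, y, sub = make_data(n_sub)
held_out = sub == n_sub - 1
loso = LinearSVC(max_iter=10_000).fit(X[~held_out], y[~held_out]).score(
    X[held_out], y[held_out])
print(cross_mod, loso)
```

Above-chance accuracy in both tests is what licenses the claim of representations shared across modalities and subjects; if the effect were idiosyncratic per subject or modality, both scores would sit near 0.5.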

Page 12: Distributed Representative Reading Group

Hierarchical tree decoding improves decoding performance

A three-level hierarchical tree decoder was utilized to first decode the large/small distinction (utilizing amplitude and spectral features), then the living/nonliving object category (utilizing 200–700 ms amplitude features), and finally the individual word (utilizing 250–500 ms amplitude features).

Page 13: Distributed Representative Reading Group

Thanks for your attention!