확률 그래프 모델과 추론 (Inference with Probabilistic Graphical Models)

Byoung-Tak Zhang
Computer Science and Engineering & Cognitive Science and Brain Science Programs, Seoul National University
Biointelligence Lab & Institute for Cognitive Science, http://bi.snu.ac.kr/
SKT R&D, HMI Tech Lab, March 10 (Mon), 2014


Page 1: Title

확률 그래프 모델과 추론
Inference with Probabilistic Graphical Models

Byoung-Tak Zhang
Computer Science and Engineering &
Cognitive Science and Brain Science Programs
Seoul National University
Biointelligence Lab & Institute for Cognitive Science
http://bi.snu.ac.kr/
SKT R&D, HMI Tech Lab, March 10 (Mon), 2014

Page 2: What is Machine Learning? (기계학습이란?)

• Learning system: "a system that automatically constructs a model M from empirical data D acquired through interaction with an environment E, and thereby improves its own performance P"
– Environment E
– Data D
– Model M
– Performance P

• Property 1: Self-improving Systems (artificial-intelligence perspective)
• Property 2: Knowledge Discovery (data-mining perspective)
• Property 3: Data-Driven Software Design (software-engineering perspective)
• Property 4: Automatic Programming (computer-science perspective)

Zhang, B.-T. (장병탁), Introduction to Machine Learning (기계학습 개론), 2014 (to appear)

Page 3: Machine Learning as Automatic Programming

Traditional Programming:  Data + Program → [Computer] → Output
Machine Learning:         Data + Output → [Computer] → Program

Page 4: Machine Learning (ML): Three Tasks

(c) 2012 SNU Biointelligence Lab, http://bi.snu.ac.kr/

• Supervised Learning
– Estimate an unknown mapping from known input and target output pairs
– Learn f_w from a training set D = {(x, y)} s.t. f_w(x) ≈ f(x) = y
– Classification: y is discrete; regression: y is continuous

• Unsupervised Learning
– Only input values are provided
– Learn f_w from D = {(x)} s.t. f_w(x) ≈ x
– Density estimation and compression; clustering, dimension reduction

• Sequential (Reinforcement) Learning
– Not targets, but rewards (critiques) are provided "sequentially"
– Learn a heuristic function f_w from D_t = {(s_t, a_t, r_t) | t = 1, 2, …} s.t. f_w(s_t, a_t) ≈ r_t
– With respect to the future, not just the past
– Sequential decision-making; action selection and policy learning

Zhang, B.-T., Next-Generation Machine Learning Technologies, Communications of KIISE, 25(3), 2007

Page 5: Machine Learning Models (기계학습 모델)

Supervised models: Neural Nets, Decision Trees, K-Nearest Neighbors, Support Vector Machines
Unsupervised models: Self-Organizing Maps, Clustering Algorithms, Manifold Learning, Evolutionary Learning
Probabilistic graphical models: Bayesian Networks, Markov Networks, Hidden Markov Models, Hypernetworks
Dynamic-system models: Kalman Filters, Sequential Monte Carlo, Particle Filters, Reinforcement Learning

Zhang, B.-T. (장병탁), Introduction to Machine Learning (기계학습 개론), 2014 (to appear)

Page 6:

Outline

• Bayesian Inference

– Monte Carlo

– Importance Sampling

– MCMC

• Probabilistic Graphical Models

– Bayesian Networks

– Markov Random Fields

• Hypernetworks

– Architecture and Algorithms

– Application Examples

• Discussion

Page 7: Bayes Theorem

For a hypothesis h and data D:

P(h | D) = P(D | h) P(h) / P(D)

Page 8: MAP vs. ML

• What is the most probable hypothesis given the data? From Bayes theorem: P(h | D) = P(D | h) P(h) / P(D)

• MAP (Maximum A Posteriori):
h_MAP = argmax_h P(h | D) = argmax_h P(D | h) P(h)

• ML (Maximum Likelihood):
h_ML = argmax_h P(D | h)

Pages 9–14: (figures from Prof. Schrater's Lecture Notes, Univ. of Minnesota)

Page 15: Graphical Models

(Figure: taxonomy of graphical models (GM).)
- Graphical Models (GM): Causal Models, Chain Graphs, Other Semantics
- Directed GMs: Bayesian Networks (DBNs, FSTs, HMMs — Factorial HMMs, Mixed-Memory Markov Models, BMMs — Kalman filters, Segment Models), Mixture Models (Decision Trees, Simple Models: PCA, LDA)
- Dependency Networks
- Undirected GMs: Markov Random Fields / Markov Networks (Gibbs/Boltzmann Distributions)

Page 16: Bayesian Networks

Page 17: Recommendation Systems

"Your friends attended this lecture already and liked it. Therefore, we would like to recommend it to you!"

Page 18: Bayesian Networks

A Bayesian network is a DAG (Directed Acyclic Graph) that:
- expresses dependence relations between variables
- can use prior knowledge about the data (parameters)

Example over nodes A, B, C, D, E:

P(A, B, C, D, E) = P(A) P(B|A) P(C|B) P(D|A,B) P(E|B,C,D)

In general:

P(X) = ∏_{i=1}^{n} P(X_i | pa(X_i))
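The factorization above can be checked numerically. This is a minimal sketch, not from the slides: binary variables with made-up CPT entries, showing that any consistent set of conditional tables yields a normalized joint distribution.

```python
from itertools import product

# P(A,B,C,D,E) = P(A) P(B|A) P(C|B) P(D|A,B) P(E|B,C,D)
# Hypothetical CPT numbers (each entry is P(var = 1 | parents)).
P_A1 = 0.3
P_B1 = {0: 0.2, 1: 0.9}                                       # P(B=1 | A)
P_C1 = {0: 0.1, 1: 0.6}                                       # P(C=1 | B)
P_D1 = {(0, 0): 0.25, (0, 1): 0.5, (1, 0): 0.5, (1, 1): 0.75} # P(D=1 | A,B)
P_E1 = {bcd: (sum(bcd) + 1) / 5 for bcd in product((0, 1), repeat=3)}  # P(E=1 | B,C,D)

def bern(p1, x):
    # probability of a binary outcome x given P(x=1) = p1
    return p1 if x else 1 - p1

def joint(a, b, c, d, e):
    return (bern(P_A1, a) * bern(P_B1[a], b) * bern(P_C1[b], c)
            * bern(P_D1[(a, b)], d) * bern(P_E1[(b, c, d)], e))

# The product of valid conditionals always sums to 1 over all assignments.
total = sum(joint(*x) for x in product((0, 1), repeat=5))
```

The same enumeration answers any marginal or conditional query on a small network, which is the brute-force baseline the later inference slides improve on.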

Page 19: Representing Probability Distributions

• Probability distribution = a probability for each combination of values of the attributes
• Naïve representations (such as tables) run into trouble:
– 20 binary attributes already require more than 2^20 ≈ 10^6 parameters
– Real applications usually involve hundreds of attributes

Example: hospital patients described by
• Background: age, gender, history of diseases, …
• Symptoms: fever, blood pressure, headache, …
• Diseases: pneumonia, heart attack, …

Page 20: Bayesian Networks: Key Idea

• Bayesian networks utilize conditional independence
• Graphical representation of conditional-independence ("causal") dependencies
• Exploit regularities!

Page 21: Bayesian Networks

1. Finite, directed acyclic graph
2. Nodes: (discrete) random variables
3. Edges: direct influences
4. Associated with each node: a table representing a conditional probability distribution (CPD), quantifying the effect the parents have on the node

(Figure: example network over nodes B, E, A, M, J.)

Page 22: Bayesian Networks

(Figure: network with X1 and X2 as parents of X3; priors P(X1) = (0.2, 0.8) and P(X2) = (0.6, 0.4).)

CPT for X3:

X1     X2   P(X3 | X1, X2)
true   1    (0.2, 0.8)
true   2    (0.5, 0.5)
false  1    (0.23, 0.77)
false  2    (0.53, 0.47)

Page 23: Example

Use a DAG to model the causality.

(Figure: DAG over Train Strike, Martin Oversleep, Norman Oversleep, Martin Late, Norman Late, Project Delay, Office Dirty, Boss Failure-in-Love, Boss Angry.)

Page 24: Example (continued)

Attach prior probabilities to all root nodes:

Norman oversleep:      P(T) = 0.2,  P(F) = 0.8
Train strike:          P(T) = 0.1,  P(F) = 0.9
Martin oversleep:      P(T) = 0.01, P(F) = 0.99
Boss failure-in-love:  P(T) = 0.01, P(F) = 0.99

Page 25: Example (continued)

Attach conditional probability tables to non-root nodes (each column sums to 1):

P(Norman untidy | Norman oversleep):
              oversleep=T   oversleep=F
  untidy=T    0.6           0.2
  untidy=F    0.4           0.8

P(Martin late | Train strike, Martin oversleep), columns ordered (strike, oversleep):
              T,T    T,F    F,T    F,F
  late=T      0.95   0.8    0.7    0.05
  late=F      0.05   0.2    0.3    0.95

Page 26: Example (continued)

Attach conditional probability tables to non-root nodes (each column sums to 1):

P(Boss angry | Boss failure-in-love, Project delay, Office dirty), columns ordered (love, delay, dirty):

           TTT    TTF    TFT    TFF    FTT    FTF    FFT    FFF
  very     0.98   0.85   0.6    0.5    0.3    0.2    0      0.01
  mid      0.02   0.15   0.3    0.25   0.5    0.5    0.2    0.02
  little   0      0      0.1    0.25   0.2    0.3    0.7    0.07
  no       0      0      0      0      0      0      0.1    0.9
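The tables on the preceding pages can be queried by simple enumeration. A minimal sketch using the slides' numbers (variable names are ours), computing P(Norman untidy = T) and P(Martin late = T):

```python
# Root-node priors from the slides:
P_norman_oversleep = {True: 0.2, False: 0.8}
P_train_strike = {True: 0.1, False: 0.9}
P_martin_oversleep = {True: 0.01, False: 0.99}

# CPT entries P(untidy=T | oversleep) and P(late=T | strike, oversleep):
P_untidy_T = {True: 0.6, False: 0.2}
P_late_T = {(True, True): 0.95, (True, False): 0.8,
            (False, True): 0.7, (False, False): 0.05}

# Marginalize out the parents:
p_untidy = sum(P_norman_oversleep[o] * P_untidy_T[o] for o in (True, False))

p_late = sum(P_train_strike[s] * P_martin_oversleep[o] * P_late_T[(s, o)]
             for s in (True, False) for o in (True, False))

print(round(p_untidy, 4), round(p_late, 4))  # 0.28 0.131
```

Marginalizing this way costs time exponential in the number of parents, which is why the larger networks later in the deck need smarter inference.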

Page 27: Inference

Page 28: A Bayesian Network

The "ICU alarm" network: 37 binary random variables, 509 parameters instead of the 2^37 ≈ 10^11 entries of the full joint table.

(Figure: the alarm network over variables PCWP, CO, HRBP, HREKG, HRSAT, ERRCAUTER, HR, HISTORY, CATECHOL, SAO2, EXPCO2, ARTCO2, VENTALV, VENTLUNG, VENTTUBE, DISCONNECT, MINVOLSET, VENTMACH, KINKEDTUBE, INTUBATION, PULMEMBOLUS, PAP, SHUNT, ANAPHYLAXIS, MINVOL, PVSAT, FIO2, PRESS, INSUFFANESTH, TPR, LVFAILURE, ERRLOWOUTPUT, STROKEVOLUME, LVEDVOLUME, HYPOVOLEMIA, CVP, BP.)

Page 29: Cf. Markov Networks

- Undirected graphs
- Nodes = random variables
- Cliques = potentials (~ local joint probability distributions)

Page 30: Fielded Applications

• Expert systems: medical diagnosis (mammography); fault diagnosis (jet engines, Windows 98)
• Monitoring: space shuttle engines (Vista project); freeway traffic, activity recognition
• Sequence analysis and classification: speech recognition (translation, paraphrasing); biological sequences (DNA, proteins, RNA, …)
• Information access: collaborative filtering; information retrieval & extraction
… among others.

Page 31: Web Mining: e-Commerce

Page 32: Web Mining: Customer Analysis

KDD-2000 Web Mining Competition
- Data: 465 features over 1,700 customers
- Features include friend promotion rate, date visited, weight of items, price of house, discount rate, …
- Data was collected during Jan. 30 – Mar. 30, 2000
- Friend promotion started on Feb. 29 with a TV advertisement
- Aim: description of heavy/low spenders

Page 33: Web Mining: Customer Analysis (figure)

Page 34: (figure)

Page 35: Web Mining: Results

A Bayesian net learned from the KDD web data:
- V229 (Order-Average) and V240 (Friend) directly influence V312 (Target)
- V19 (Date) was influenced by V240 (Friend), reflecting the TV advertisement

[Chang et al., 2002]

Page 36: Markov Random Fields (Markov Networks)

Page 37: Graphical Models

- Directed graph (e.g. Bayesian network)
- Undirected graph (e.g. Markov random field)

Page 38: Bayesian Image Analysis

Pr(Original Image | Degraded Image) = Pr(Degraded Image | Original Image) · Pr(Original Image) / Pr(Degraded Image)

That is: a posteriori probability = likelihood (degradation process) × a priori probability / marginal.

(Figure: Original Image → transmission noise → Degraded (observed) Image.)

Page 39: Image Analysis

X: observed image; Y: true image.

We can represent both the observed image (X) and the true image (Y) as Markov random fields, and invoke the Bayesian framework to find P(Y | X).

Page 40: Details

Remember:

P(Y | X) = P(X | Y) P(Y) / P(X) ∝ P(X | Y) P(Y)

- P(X | Y) is the data model.
- P(Y) models the label interaction.

Next we need to compute the prior P(Y = y) and the likelihood P(X | Y).

Page 41: Back to Image Analysis

The likelihood can be modeled as a mixture of Gaussians. The potential is modeled to capture the domain knowledge; one common choice is the Ising model, with pairwise potentials of the form β y_i y_j.

Page 42: Bayesian Image Analysis

Let X be the observed image = {x_1, x_2, …, x_mn} and Y the true image = {y_1, y_2, …, y_mn}.

Goal: find Y = y* = {y_1*, y_2*, …} such that P(Y = y* | X) is maximum.

This is a labeling problem with a search space of size |L|^{mn}, where L is the set of labels and m×n is the number of observations.

Page 43: Unfortunately…

(Figure: comparison of results on an observed image: SVM vs. MRF.)

Page 44: Markov Random Fields (MRFs)

- Introduced in the 1960s; a principled approach for incorporating context information and domain knowledge.
- Works within the Bayesian framework.
- Widely worked on in the '70s, faded during the '80s, and made a big comeback in the late '90s.

Page 45: Markov Random Field

Random field: Let F = {F_1, F_2, …, F_M} be a family of random variables defined on the set S, in which each random variable F_i takes a value f_i in a label set L. The family F is called a random field.

Markov random field: F is said to be a Markov random field on S with respect to a neighborhood system N if and only if the following two conditions are satisfied:

- Positivity: P(f) > 0, ∀f ∈ F
- Markovianity: P(f_i | f_{S∖{i}}) = P(f_i | f_{N_i})

Page 46: Inference

Finding the optimal y* such that P(Y = y* | X) is maximum. The search space is exponential.

- Exponential algorithm: simulated annealing (SA)
- Greedy algorithm: iterated conditional modes (ICM)
- There are other, more advanced graph-cut-based strategies.

Page 47: Sampling and Simulated Annealing

Sampling
- A way to generate random samples from a (potentially very complicated) probability distribution.
- Gibbs / Metropolis.

Simulated annealing
- A schedule for modifying the probability distribution so that, at "zero temperature", you draw samples only from the MAP solution.
- If you can find the right cooling schedule, the algorithm will converge to a global MAP solution.
- Flip side: SLOW, since finding the correct schedule is nontrivial.
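A minimal Metropolis sampler sketch for an Ising-style chain prior P(y) ∝ exp(beta · Σ_i y_i y_{i+1}), of the kind used as a label prior above. This is a generic illustration, not the slides' implementation; all parameter values are made up.

```python
import math
import random

def energy(y, beta):
    # Ising chain energy: aligned neighbors lower the energy.
    return -beta * sum(y[i] * y[i + 1] for i in range(len(y) - 1))

def metropolis(n=8, steps=5000, beta=1.0, seed=0):
    rng = random.Random(seed)
    y = [rng.choice((-1, 1)) for _ in range(n)]   # random initial labeling
    for _ in range(steps):
        i = rng.randrange(n)
        prop = list(y)
        prop[i] = -prop[i]                        # single-site flip proposal
        dE = energy(prop, beta) - energy(y, beta)
        # Accept if energy drops, else accept with probability exp(-dE).
        if dE <= 0 or rng.random() < math.exp(-dE):
            y = prop
    return y

sample = metropolis()
```

Annealing amounts to dividing dE by a temperature T that is slowly lowered toward zero, so that late in the run only energy-decreasing moves survive.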

Page 48: Iterated Conditional Modes

- Greedy strategy, fast convergence.
- The idea is to maximize the local conditional probabilities iteratively, given an initial solution.
- Equivalent to simulated annealing with T = 0.

Page 49: Parameter Learning

Supervised learning (easiest case), maximum likelihood:

θ* = argmax_θ P(f | θ)

For an MRF:

P(f | θ) = (1/Z(θ)) exp[ −U(f | θ) / T ]

Page 50: Pseudo-Likelihood

The partition function makes exact ML intractable, so we approximate the likelihood by the pseudo-likelihood:

PL(f) = ∏_i P(f_i | f_{N_i}) = ∏_i exp[−U(f_i, f_{N_i})] / Σ_{f_i' ∈ L} exp[−U(f_i', f_{N_i})]

where U(f) = Σ_i U(f_i, f_{N_i}).

- Large lattice theorem: in the large-lattice limit M → ∞, the PL estimate converges to the ML estimate.
- It turns out that a local learning method like pseudo-likelihood, combined with a local inference method such as ICM, does quite well: close to optimal results.
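The pseudo-likelihood above is cheap to evaluate because each factor normalizes over a single site. A sketch for an Ising chain with labels f_i ∈ {−1, +1} and U(f_i, f_{N_i}) = −β f_i Σ_{j ∈ N_i} f_j (this energy choice is our assumption, not from the slides):

```python
import math

def pseudo_likelihood(f, beta=0.5):
    """PL(f) = prod_i P(f_i | neighbors), neighbors = adjacent chain sites."""
    pl = 1.0
    for i, fi in enumerate(f):
        s = 0                      # sum of neighboring labels
        if i > 0:
            s += f[i - 1]
        if i < len(f) - 1:
            s += f[i + 1]
        num = math.exp(beta * fi * s)
        den = math.exp(beta * s) + math.exp(-beta * s)  # over f_i' in {-1,+1}
        pl *= num / den
    return pl

aligned = pseudo_likelihood([1, 1, 1, 1])
mixed = pseudo_likelihood([1, -1, 1, -1])
```

As expected for a smoothness prior, the aligned labeling scores higher than the alternating one, and no global partition function is ever computed.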

Page 51: Hypernetworks

Page 52: Graphical Models

- Directed graph (e.g. Bayesian network)
- Undirected graph (e.g. Markov random field)

Page 53: From Simple Graphs to Higher-Order Graphs

(1) Naïve Bayes, over variables F, J, G, S, A:

P(F | J, G, S, A) = P(J, G, S, A | F) P(F) / P(J, G, S, A)
P(J, G, S, A | F) = P(J|F) P(G|F) P(S|F) P(A|F) = ∏_{x ∈ {J,G,S,A}} P(x | F)

(2) Bayesian network:

P(F, J, G, S, A) = ∏_{x ∈ {F,J,G,S,A}} P(x | pa(x))

with the parents pa(x) given by the DAG.

(3) Higher-order PGM, with hyperedges he(x, y) over pairs {(x, y) | x, y ∈ {J,G,S,A}, x ≠ y}:

P(F, J, G, S, A) ∝ ∏_{(x,y)} P(he(x, y) | F)
= P(J,G | F) P(J,S | F) P(J,A | F) P(G,S | F) P(G,A | F) P(S,A | F)

Page 54: Hypernetwork as a Probabilistic Distributed Associative Memory

(Figure: hyperedges over variables x_1, …, x_15.)

Data: D = {x^(n)}_{n=1}^N

Model (probability distribution):

P(x | W) = (1/Z(W)) exp[ −E(x; W) ]
         = (1/Z(W)) exp[ Σ_{k=1}^{K} Σ_{(i_1, …, i_k)} w^(k)_{i_1 i_2 … i_k} x_{i_1} x_{i_2} ⋯ x_{i_k} ]

[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Page 55: Hypernetwork Coding: Population of Hyperedges

H = (V, E, W)
V = {v1, v2, v3, …, v7}
E = {E1, E2, E3, E4, E5}
W = {w1, w2, w3, w4, w5}

E1 = {v1, v3, v4}
E2 = {v1, v4}
E3 = {v2, v3, v6}
E4 = {v3, v4, v6, v7}
E5 = {v4, v5, v7}

[Zhang, 2008]

Page 56: Hypernetwork Coding: Sampling Hyperedges from Data

4 data items over x1 … x15 with label y (only the variables set to 1 are listed; all others are 0):

Item 1: x1=1, x4=1, x10=1, x12=1; y=1
Item 2: x2=1, x3=1, x9=1, x14=1; y=0
Item 3: x3=1, x6=1, x8=1, x13=1; y=1
Item 4: x5=1, x6=1, x8=1, x11=1, x15=1; y=0

Order-3 hyperedges sampled over rounds 1–3:

From item 1: {x1, x4, x10 | y=1}, {x1, x4, x12 | y=1}, {x4, x10, x12 | y=1}
From item 2: {x2, x3, x9 | y=0}, {x2, x3, x14 | y=0}, {x3, x9, x14 | y=0}
From item 3: {x3, x6, x8 | y=1}, {x3, x6, x13 | y=1}, {x6, x8, x13 | y=1}
From item 4: {x5, x6, x8 | y=0}, {x6, x8, x11 | y=0}, {x8, x11, x15 | y=0}
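The extraction step above can be sketched in a few lines: each data item contributes its "on" variables, and order-3 hyperedges are drawn from them together with the label. Here we simply enumerate all 3-combinations (the slide samples three of them per round); the helper name `hyperedges` is ours.

```python
from itertools import combinations

# (on-variables, label) pairs for the first items on this page:
items = [
    ({"x1", "x4", "x10", "x12"}, 1),
    ({"x2", "x3", "x9", "x14"}, 0),
    ({"x3", "x6", "x8", "x13"}, 1),
]

def hyperedges(on_vars, y, k=3):
    """All order-k hyperedges over the active variables, tagged with y."""
    return [(frozenset(c), y) for c in combinations(sorted(on_vars), k)]

edges = hyperedges(*items[0])   # 4 choose 3 = 4 candidate hyperedges
```

A hypernetwork is then just the weighted multiset of such hyperedges accumulated over all items, which is what the memory-based inference on the following pages matches new inputs against.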

Page 57: Hypernetwork as a Probabilistic Model of Distributed Parallel Associative Memory

The hypernetwork is defined as H = (X, S, W), where X = (x_1, x_2, …, x_I) are the variables, S = {S^(2), S^(3), …, S^(K)} is the set of hyperedges of each order k, and W = {W^(2), W^(3), …, W^(K)} are their weights. Training set: D = {x^(n)}_{n=1}^N.

The energy of the hypernetwork:

E(x; W) = −[ (1/2) Σ_{i_1, i_2} w^(2)_{i_1 i_2} x_{i_1} x_{i_2} + (1/6) Σ_{i_1, i_2, i_3} w^(3)_{i_1 i_2 i_3} x_{i_1} x_{i_2} x_{i_3} + … ]

The probability distribution:

P(x | W) = (1/Z(W)) exp[ −E(x; W) ]

where the partition function is Z(W) = Σ_x exp[ −E(x; W) ].

[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Page 58: Deriving the Learning Rule

The log-likelihood of the data:

ln P({x^(n)}_{n=1}^N | W) = ln ∏_{n=1}^N P(x^(n) | W)
= Σ_{n=1}^N [ Σ_{k=1}^K Σ_{(i_1, …, i_k)} w^(k)_{i_1 … i_k} x^(n)_{i_1} ⋯ x^(n)_{i_k} − ln Z(W) ]

The learning rule follows by taking the gradient with respect to each weight w^(s)_{i_1 i_2 … i_s}:

(∂/∂w^(s)_{i_1 … i_s}) ln P({x^(n)}_{n=1}^N | W)

[Zhang, DNA-2006] [Zhang, IEEE CIM, 2008]

Page 59: Derivation of the Learning Rule

(∂/∂w^(s)_{i_1 i_2 … i_s}) ln P({x^(n)}_{n=1}^N | W)
= Σ_{n=1}^N [ x^(n)_{i_1} x^(n)_{i_2} ⋯ x^(n)_{i_s} − (∂/∂w^(s)_{i_1 … i_s}) ln Z(W) ]
= N [ ⟨x_{i_1} x_{i_2} ⋯ x_{i_s}⟩_Data − ⟨x_{i_1} x_{i_2} ⋯ x_{i_s}⟩_{P(x|W)} ]

where

⟨x_{i_1} ⋯ x_{i_s}⟩_Data = (1/N) Σ_{n=1}^N x^(n)_{i_1} ⋯ x^(n)_{i_s}

and ⟨·⟩_{P(x|W)} denotes the expectation under the model distribution P(x | W).
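The gradient above is the familiar "data expectation minus model expectation" rule, and for tiny models it can be computed exactly. A numeric sketch (made-up data, a single order-2 weight; not from the slides) for P(x | w) ∝ exp(w · x_1 x_2) over x ∈ {0,1}^3:

```python
import math
from itertools import product

data = [(1, 1, 0), (1, 1, 1), (0, 1, 0), (1, 0, 1)]   # hypothetical data
w = 0.5                                               # current weight

def unnorm(x):
    # unnormalized probability exp(w * x1 * x2)
    return math.exp(w * x[0] * x[1])

states = list(product((0, 1), repeat=3))
Z = sum(unnorm(x) for x in states)

# <x1 x2> under the model and under the data:
model_exp = sum(x[0] * x[1] * unnorm(x) / Z for x in states)
data_exp = sum(x[0] * x[1] for x in data) / len(data)

# Gradient of the log-likelihood w.r.t. w:
grad = len(data) * (data_exp - model_exp)
```

Here the data activate (x1, x2) more often than the model does, so the gradient is positive and learning increases w, exactly as the rule prescribes.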

Page 60: Features of Hypernetworks

- Higher-order terms: explicit representation, fast learning (cf. Bayesian networks)
- Structural learning: evolving complex networks, discovery of modules (cf. Markov random fields)
- Population coding: collection of modules, incremental learning (cf. numerical CPTs)
- Compositionality: creation of new modules, symbolic computation (cf. connectionist models)
- Self-supervised: can learn from unlabeled data, no need for labeling (cf. supervised)
- Reconfigurable architecture: run-time self-assembly, anytime inference (cf. fixed architecture)

Page 61: Difference from Markov Networks

In Markov networks, the joint distribution is written as a product of potential functions over the maximal cliques of the graph:

P(x) = (1/Z) ∏_C ψ_C(x_C),   Z = Σ_x ∏_C ψ_C(x_C),   ψ_C(x_C) = exp{−E_C(x_C)} (Boltzmann distribution)

where Z is the partition function, ψ_C a potential function, and E_C an energy function.

Similarity:
- Hyperedges define potential functions (components) like cliques
- The distribution is represented as a product of potential functions

Difference:
- Novel hyperedges are constructed from data (cliques are given)
- Both model structure and parameters are evolved (cliques are fixed)
- Hyperedges can be ordered (cliques are not)
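The clique-potential factorization above can be made concrete on a 3-variable chain with maximal cliques C1 = {x1, x2} and C2 = {x2, x3}. A minimal sketch with a made-up pairwise energy E_C(a, b) = −a·b:

```python
import math
from itertools import product

def psi(a, b):
    # potential psi_C = exp(-E_C) with E_C(a, b) = -a * b (assumed energy)
    E = -a * b
    return math.exp(-E)

states = list(product((-1, 1), repeat=3))
Z = sum(psi(x1, x2) * psi(x2, x3) for x1, x2, x3 in states)  # partition function

def P(x1, x2, x3):
    return psi(x1, x2) * psi(x2, x3) / Z

total = sum(P(*s) for s in states)   # the distribution is normalized
```

A hypernetwork replaces the fixed cliques C with hyperedges constructed from data, but the product-of-potentials form of the distribution is the same.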

Page 62: Mobile Phone Applications of Hypernetworks

• mLife
• eHealth

Page 63: Mobile Sensors on Smartphones

Recognizing user behavior patterns, and making recommendations, through everyday-life logging. Real-life logging data is collected on Android smartphones (Samsung Galaxy S series, HTC Desire) with the Action Logger (MDS).

Sensor information:
- GPS: absolute position
- Accelerometer: magnitude and direction of acceleration along the 3D axes
- Proximity: presence of an object near the device
- Orientation: roll and pitch relative to the device front
- Magnetic fields: magnetometer
- Illuminometer: ambient light
- Sound Noise: noise-sound level
- Bluetooth: Bluetooth device address
- WIFI: SSID name, signal strength

Page 64: Mobile Sensor Data (figure)

Page 65: Inference Example: inference using the learned model in the general case

Page 66: User Scenario

DietAdvisor: A Personalized eHealth Agent in a Mobile Computing Environment. John's daily life and DietAdvisor.

Page 67: Experimental Results: Activity Recognition

Page 68: Personalized Recommendation Module

Hypernetwork-based learning for menu recommendation. Hyperedge weights before and after an update (learning rate: 50%):

Before:
Weight  Item1     Item2
15      쌀밥      배추김치
3       김        깍두기
2       현미밥    버섯볶음
4       계란찜    숙주나물
4       쌀밥      두부조림
1       배추김치  깍두기
3       부추김치  장조림
5       마른김    양념간장
6       현미밥    배추김치
3       탕수육    군만두
4       현미밥    북어국
0       X         X

After:
Weight  Item1     Item2
15      쌀밥      배추김치
3       김        깍두기
3       현미밥    버섯볶음
4       계란찜    숙주나물
4       쌀밥      두부조림
1       배추김치  깍두기
3       부추김치  장조림
5       마른김    양념간장
7       현미밥    배추김치
3       탕수육    군만두
4       현미밥    북어국
1       북어국    버섯볶음

Page 69: Experimental Results: Menu Recommendation

Page 70: Other Applications of Hypernetworks

• Language, Music, and Videos
• CogTV Recommendations

Page 71

© 2010, SNU Biointelligence Lab, http://bi.snu.ac.kr/


Text Corpus: TV Drama Series

Friends, 24, House, Grey Anatomy, Gilmore Girls, Sex and the City

289,468 sentences (training data)

700 sentences with blanks (test data)

I don't know what happened.

Take a look at this. …

What ? ? ? here. ? have ? visit the ? room.

Page 72


Sentence Completion Task

Why ? you ? come ? down ?

Why are you go come on down here

? appreciate it if ? call her by ? ?

I appreciate it if you call her by the way

Would you ? to meet ? ? Tuesday ?

Would you nice to meet you in Tuesday and

? gonna ? upstairs ? ? a shower

I'm gonna go upstairs and take a shower

? have ? visit the ? room

I have to visit the ladies' room

? ? ? decision

to make a decision

? still ? believe ? did this

I still can't believe you did this
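Completions like the ones above can be produced by matching small word hyperedges learned from the corpus against the known words around each blank. A toy sketch using weighted trigrams as the hyperedges (the function names and the greedy matching rule are simplifying assumptions, not the exact model):

```python
from collections import Counter

def learn_trigrams(corpus):
    """Collect 3-word hyperedges (trigrams) with frequency weights."""
    tri = Counter()
    for sent in corpus:
        words = sent.lower().split()
        for i in range(len(words) - 2):
            tri[tuple(words[i:i + 3])] += 1
    return tri

def complete(sentence, tri):
    """Fill each '?' with the middle word of the highest-weight trigram
    whose outer words match the blank's known neighbours (a greedy
    simplification of hypernetwork matching-and-completion)."""
    words = sentence.lower().split()
    for i, w in enumerate(words):
        if w != "?" or i == 0 or i + 1 >= len(words):
            continue  # only fill blanks with a word on both sides
        best, best_wt = None, 0
        for (a, b, c), wt in tri.items():
            if a == words[i - 1] and c == words[i + 1] and wt > best_wt:
                best, best_wt = b, wt
        if best is not None:
            words[i] = best
    return " ".join(words)

corpus = ["I still can't believe you did this",
          "I have to visit the ladies' room"]
tri = learn_trigrams(corpus)
print(complete("? still ? believe you did this", tri))
# → "? still can't believe you did this"
```

Blanks whose neighbours are themselves unknown stay unfilled in this greedy pass, which is one reason the real system's completions above are sometimes only partially correct.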

Page 73

(c) 2005-2012 SNU Biointelligence Laboratory, http://bi.snu.ac.kr/


Corpus: Friends — keyword "mother":

you're mother killed herself it's my mother was shot by a woman at eight we're just gonna go to your mother that i love it feeling that something's wrong with my mother and father she's the single mother i put this on my friend's mother apparently phoebe's mother killed herself thanks for pleasing my mother killed herself i'm your mother told you this is an incredible mother that's not his mother or his hunger strike holy mother of god woman i like your mother and father on their honeymoon suite with her and never called your mother really did like us is my mother was shot by a drug dealer

Corpus: Prison Break — keyword "mother":

tells his mother and his family she's the mother of my eyes speak to your mother used to be tells his mother made it pretty clear on the floor has speak to your mother never had life insurance she's the mother of lincoln's child she's the mother of my own crap to deal with you just lost his mother is fine just lost his mother and his god tells his mother and his stepfather she's the mother of my time his mother made it clear you couldn't deliver fibonacci she's the mother of my brother is facing the electric chair same guy who was it your mother before you do it they gunned my mother down

Concept Maps for Friends and Prison Break

[J.-H. Lee et al., 2009]

Page 74

Music Generation Result: Cross-Corpus

Scores generated by Evolutionary Hypernetworks that learned American (A), Scottish (B), Korean Singer Kim (C), and Korean Singer Shin (D) corpora, each with the cue (left side of the bar in the middle) from "Swanee River", the famous American folk song

[H.-W. Kim and B.-H. Kim]


Page 75

Digital Videos for Teaching Machines

(c) 2010-2012 SNU Biointelligence Laboratory, http://bi.snu.ac.kr/


• Multimodal: language, vision, audio
• "Situated": contexts
• "Naturalistic": dynamic
• "Quasi-real": continuous
• Educational

Page 76

LEARNING BY PLAYING

Learning the image from the given text: the game shows a text query and the player clicks the right image option.

Page 77

LEARNING BY PLAYING

Learning the text from the given image: the game shows an image query and the player clicks the right text option.

Page 78

Result 1: Humans for T2I Learning

[Figure: accuracy (0.5–1.0) of Player 1 and Player 2 over 15 game sessions]

Page 79

Result 2: Humans for I2T Learning

[Figure: accuracy (0.5–1.0) of Player 1 and Player 2 over 15 game sessions]

Page 80

Result 3: Machines for I2T Learning

[Figure: accuracy (0.4–1.0) over about 105 training epochs]

[Fareed et al., 2009]

Page 81

Image-to-Text Recall Examples

Given an image query, matching and completion over the learned text hyperedges recalls the paired sentence:
• "I don't know what happened" (via hyperedges: I don't know / don't know what / know what happened)
• "There's a kitty in my guitar case" (via: There's a / a kitty in / in my guitar case)
• "Maybe there's something I can do to make sure I get pregnant" (via: Maybe there's something / there's something I / I get pregnant)

© 2009, SNU Biointelligence Lab, http://bi.snu.ac.kr/

Page 82

Text-to-Image Recall Examples

Given a text query, matching and completion recalls the paired image, for queries such as:
• "I don't know what happened"
• "Take a look at this"
• "There's a kitty in my guitar case"
• "Maybe there's something I can do to make sure I get pregnant"

Page 83

Word Learning from Video: Utterance-Scene Representation

Original sentence–scene pairs and the words extracted from them:
• "Oh, the rabbit's followed you home, Maisy." → rabbit, followed, home, maisy
• "Oh, and don't forget panda." → forget, panda
• "Good night, bird. See you in the morning." → good, night, bird, see, morning

Each pair yields visual words (from the scene) and textual words (from the utterance).
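Cross-situational learning over such pairs can be sketched as co-occurrence counting between visual and textual words: across episodes, the correct word-referent links accumulate the largest weights. (The function and the toy data below are illustrative assumptions, not the actual model.)

```python
from collections import Counter

def cross_situational(pairs):
    """Accumulate visual-word / textual-word co-occurrence counts over
    utterance-scene pairs (a stand-in for cross-modal hyperedges)."""
    co = Counter()
    for visual_words, textual_words in pairs:
        for v in visual_words:
            for t in textual_words:
                co[(v, t)] += 1
    return co

# Scene/utterance pairs modeled on the examples above.
pairs = [
    ({"rabbit", "maisy"}, {"rabbit", "followed", "home", "maisy"}),
    ({"rabbit"}, {"rabbit", "ride"}),
    ({"panda"}, {"forget", "panda"}),
]
co = cross_situational(pairs)

# The textual word most strongly linked to the visual word "rabbit":
best = max((t for v, t in co if v == "rabbit"), key=lambda t: co[("rabbit", t)])
print(best)  # → rabbit (it appears in both rabbit scenes)
```

A single pair is ambiguous (every word co-occurs with every object), but words that recur with the same referent across scenes pull ahead, which is the core of the word-learning result.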

Page 84

Concept Representation: Sparse Population Code Models

[Figure: concept maps for MOUSE and RABBIT — sparse networks of word units (w: mouse, eye, ear, hop, run) and visual units (v: tail, yellow, dark, long, red) in which related concepts share units]

[Zhang et al., CogSci-2012]
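In this sparse population coding view, a concept is a small set of active units, and relatedness shows up as shared units between concept maps. A toy illustration (the feature sets below are assumptions read off the figure, not the learned model):

```python
# Concepts as sparse feature sets; MOUSE and RABBIT share body-part units.
mouse = {"tail", "eye", "ear", "dark", "run"}
rabbit = {"tail", "eye", "ear", "long", "red", "hop"}

def overlap(a, b):
    """Jaccard overlap between two sparse codes."""
    return len(a & b) / len(a | b)

print(overlap(mouse, rabbit))  # → 0.375 (3 shared units out of 8)
```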

Page 85

Experimental Results: Concept Generalization and Specialization (cont'd)

[Figure: concept graphs learned after Episodes 1-4 vs. Episodes 1-6 — the vocabulary grows from words such as bird, rabbit, little, favorite, onion, excited, today, doing, farm to also include water, idea, night, morning, hole, ride, maisy, good, want, need, look, helping, new, tree, digging, penguin]

[Zhang et al., CogSci-2012]

Page 86

CogTV: A Multimodal Interactive Recommendation Service Platform

[Architecture diagram: user logs and environment data feed a User Descriptor (user data); image/audio/text multimodal stream data feed a learning/inference engine; a user learning and modeling engine and a cognition-based inference engine support cognition-based associative search and content-based recommendation, whose results (recommendation service data) reach the user through the User Interface]

Page 87

Cognitive TV

• Built a pilot version of the interactive home theater (set up a virtual living-room environment and collected digital content)

• Designed and ran experiments, and collected data, for building cognitive models of viewers in the interactive home-theater environment

Page 88

MMG + EEG Experiments

Page 89

MMG + Eye-Tracking Experiments

In collaboration with Bielefeld University, Germany

• Projection system: TRI-SPACE, an immersive VR display system providing 3 stereoscopic screens using 6 digital JVC D-ILA projectors, by 3Dims
• Eye tracker: ViewPoint PC-60, BS007, from Arrington Research
• Head-movement tracking: an optical (cable-less) tracking system by ART
• Visualization software: InstantReality, OpenSG

Page 90

Acknowledgements

Sponsors: National Research Foundation (NRF), Ministry of Education, Science, and Technology (MEST), Ministry of Knowledge Economy (MKE), Samsung Electronics, and Microsoft Research (MSR)

(c) 2005-2012 SNU Biointelligence Laboratory, http://bi.snu.ac.kr/