
Page 1: Xiaolong  Wang and Daniel  Khashabi

Xiaolong Wang and Daniel Khashabi

UIUC, 2013

Indian Buffet Process

Page 2: Xiaolong  Wang and Daniel  Khashabi

Contents
• Mixture Model
  ▫ Finite mixture
  ▫ Infinite mixture
• Matrix Feature Model
  ▫ Finite features
  ▫ Infinite features (Indian Buffet Process)

Page 3: Xiaolong  Wang and Daniel  Khashabi

Bayesian Nonparametrics
• Models with an unbounded number of elements
  ▫ Dirichlet Process for infinite mixture models
  ▫ With various applications: hierarchies, topics and syntactic classes, objects appearing in an image
  ▫ Cons: the models are limited to cases that can be modeled with a DP, i.e. each set of observations is generated by only one latent component

Page 4: Xiaolong  Wang and Daniel  Khashabi

Bayesian Nonparametrics contd.
• In practice there may be more complicated interactions between latent variables and observations
• Solution
  ▫ Look for more flexible nonparametric models
  ▫ Such interactions can be captured via a binary matrix
  ▫ Infinite features means an infinite number of columns

Page 5: Xiaolong  Wang and Daniel  Khashabi

Finite Mixture Model
• Set of observations: $\{x_i\}_{i=1}^{N}$
• A constant number of clusters $K$
• The cluster assignment for $x_i$ is $c_i \in \{1,\dots,K\}$
• Cluster assignment vector: $\mathbf{c} = [c_1, c_2, \dots, c_N]^{T}$
• The probability of each sample under the model:
$$p(x_i) = \sum_{k=1}^{K} p(x_i \mid c_i = k)\, p(c_i = k)$$
• The likelihood of the samples:
$$p(X \mid \mathbf{c}) = \prod_{i=1}^{N} p(x_i \mid c_i)$$
• The prior on the component probabilities (symmetric Dirichlet distribution):
$$\boldsymbol{\theta} \mid \alpha \sim \mathrm{Dirichlet}\!\left(\tfrac{\alpha}{K}, \dots, \tfrac{\alpha}{K}\right)$$

Page 6: Xiaolong  Wang and Daniel  Khashabi

Finite Mixture Model
• Since we want the mixture model to be valid for any general component $p(x_j \mid c_j = i)$, we take only the cluster assignments to be the goal of learning in this mixture model.
• Cluster assignments: $\mathbf{c} = [c_1, c_2, \dots, c_N]^{T}$
• The model can be summarized as:
$$\boldsymbol{\theta} \mid \alpha \sim \mathrm{Dirichlet}\!\left(\tfrac{\alpha}{K}, \dots, \tfrac{\alpha}{K}\right), \qquad c_i \mid \boldsymbol{\theta} \sim \mathrm{Discrete}(\boldsymbol{\theta})$$
• Bayes' rule ties together the prior, the likelihood, the marginal likelihood (evidence), and the posterior:
$$\underbrace{p(\boldsymbol{\theta} \mid \mathbf{c})}_{\text{posterior}} \;=\; \frac{\overbrace{p(\mathbf{c} \mid \boldsymbol{\theta})}^{\text{likelihood}} \; \overbrace{p(\boldsymbol{\theta})}^{\text{prior}}}{\underbrace{p(\mathbf{c})}_{\text{marginal likelihood (evidence)}}}$$
• To have a valid model, all of these distributions must be valid!

Page 7: Xiaolong  Wang and Daniel  Khashabi

Finite Mixture Model contd.
• Let $m_k = \sum_{i=1}^{N} \delta(c_i = k)$ be the number of samples assigned to cluster $k$.
• Likelihood of the assignments:
$$p(\mathbf{c} \mid \boldsymbol{\theta}) = \prod_{k=1}^{K} \theta_k^{m_k}$$
• Symmetric Dirichlet prior:
$$p(\boldsymbol{\theta}) = \mathrm{Dirichlet}\!\left(\tfrac{\alpha}{K}, \dots, \tfrac{\alpha}{K}\right) = \frac{\Gamma(\alpha)}{\Gamma(\alpha/K)^{K}} \prod_{k=1}^{K} \theta_k^{\alpha/K - 1}$$
• Marginal likelihood of the assignments (integrating out $\boldsymbol{\theta}$):
$$p(\mathbf{c}) = \int p(\mathbf{c} \mid \boldsymbol{\theta})\, p(\boldsymbol{\theta})\, d\boldsymbol{\theta} = \frac{\prod_{k=1}^{K} \Gamma\!\left(m_k + \tfrac{\alpha}{K}\right)}{\Gamma\!\left(\tfrac{\alpha}{K}\right)^{K}} \cdot \frac{\Gamma(\alpha)}{\Gamma(N + \alpha)}$$
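This marginal can be evaluated numerically with log-gamma functions. A minimal sketch (the function name and example values are ours, not from the slides):

```python
import numpy as np
from scipy.special import gammaln

def log_marginal_assignments(c, K, alpha):
    """log p(c) for the finite mixture with a symmetric Dirichlet(alpha/K) prior,
    obtained by integrating out the mixing weights theta."""
    c = np.asarray(c)
    N = len(c)
    m = np.array([np.sum(c == k) for k in range(K)])   # cluster counts m_k
    return (np.sum(gammaln(m + alpha / K)) - K * gammaln(alpha / K)
            + gammaln(alpha) - gammaln(N + alpha))

# Example: N = 5 points assigned among K = 3 clusters
print(np.exp(log_marginal_assignments([0, 0, 1, 1, 2], K=3, alpha=1.0)))
```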

Page 8: Xiaolong  Wang and Daniel  Khashabi

Infinite mixture model
• Infinite-cluster likelihood: it is like saying that we have $K \to \infty$, i.e. an infinite-dimensional multinomial cluster distribution.
• Since we always have a limited number of samples in reality, only a limited number of clusters will be used; so we define two sets of clusters:
  ▫ $K_+$: the number of classes for which $m_k > 0$
  ▫ $K_0$: the number of classes for which $m_k = 0$, with $K = K_+ + K_0$
• Assume a reordering such that $m_k > 0$ for $k \le K_+$ and $m_k = 0$ for $K_+ < k \le K$.

Page 9: Xiaolong  Wang and Daniel  Khashabi

Infinite mixture model
• Now we return to the previous slides and let $K \to \infty$ in the formulas. Using $\Gamma(x + 1) = x\,\Gamma(x)$,
$$\Gamma\!\left(m_k + \tfrac{\alpha}{K}\right) = \left(m_k - 1 + \tfrac{\alpha}{K}\right)\left(m_k - 2 + \tfrac{\alpha}{K}\right)\cdots\tfrac{\alpha}{K}\,\Gamma\!\left(\tfrac{\alpha}{K}\right),$$
so, with $m_k = \sum_{i=1}^{N} \delta(c_i = k)$,
$$p(\mathbf{c}) = \left(\tfrac{\alpha}{K}\right)^{K_+} \left(\prod_{k=1}^{K_+} \prod_{j=1}^{m_k - 1} \left(j + \tfrac{\alpha}{K}\right)\right) \frac{\Gamma(\alpha)}{\Gamma(N + \alpha)}$$
• If we let $K \to \infty$, the marginal likelihood vanishes: $p(\mathbf{c}) \to 0$.
• Instead we can model this problem by defining probabilities on partitions of the samples, instead of class labels for each sample.

Page 10: Xiaolong  Wang and Daniel  Khashabi

Infinite mixture model contd.
• We want to partition the $N$ objects into $K_+$ classes.
• Equivalence class of object partitions: $[\mathbf{c}] = \{\mathbf{c}' : \mathbf{c}' \sim \mathbf{c}\}$, i.e. all assignment vectors that induce the same partition.
• Summing over the $K!/K_0!$ assignment vectors in the equivalence class,
$$p([\mathbf{c}]) = \sum_{\mathbf{c} \in [\mathbf{c}]} p(\mathbf{c}) = \frac{K!}{K_0!} \left(\frac{\alpha}{K}\right)^{K_+} \left(\prod_{k=1}^{K_+} \prod_{j=1}^{m_k - 1} \left(j + \frac{\alpha}{K}\right)\right) \frac{\Gamma(\alpha)}{\Gamma(N + \alpha)},$$
and taking the limit,
$$\lim_{K \to \infty} p([\mathbf{c}]) = \alpha^{K_+} \left(\prod_{k=1}^{K_+} (m_k - 1)!\right) \frac{\Gamma(\alpha)}{\Gamma(N + \alpha)}.$$
• This is a valid probability distribution for an infinite mixture model.
• It is exchangeable with respect to the cluster assignments!
  ▫ Important for Gibbs sampling (and the Chinese restaurant process)
  ▫ De Finetti's theorem explains why exchangeable observations are conditionally independent given some probability distribution
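A tiny numeric sketch of the limiting partition probability (the function name and example sizes are ours):

```python
import numpy as np
from scipy.special import gammaln

def log_p_partition(counts, alpha):
    """log of lim_{K->inf} p([c]) = alpha^{K+} * prod_k (m_k - 1)! * Gamma(alpha) / Gamma(N + alpha)."""
    m = np.asarray(counts, dtype=float)                   # sizes of the occupied clusters (m_k > 0)
    N, K_plus = m.sum(), len(m)
    return (K_plus * np.log(alpha) + np.sum(gammaln(m))   # gammaln(m) = log((m - 1)!)
            + gammaln(alpha) - gammaln(N + alpha))

# Example: a partition of N = 5 points into clusters of sizes 3 and 2
print(np.exp(log_p_partition([3, 2], alpha=1.0)))
```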

Page 11: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

$$p(c_i = k \mid c_1, \dots, c_{i-1}) = \begin{cases} \dfrac{m_k}{i - 1 + \alpha} & k \le K_+ \ \text{(existing cluster)} \\[6pt] \dfrac{\alpha}{i - 1 + \alpha} & k = K_+ + 1 \ \text{(new cluster)} \end{cases}$$

Parameter: $\alpha$. Initially $m_1 = 0$, so the first customer always opens a new table:
$$p(c_1 = 1) = \frac{\alpha}{\alpha} = 1$$

Page 12: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

The same conditional rule applies. Running example (parameter $\alpha$) with $m_1 = 1$, $m_2 = 0$:
$$p(c_2 = 1 \mid c_1) = \frac{1}{1 + \alpha}, \qquad p(c_2 = 2 \mid c_1) = \frac{\alpha}{1 + \alpha}$$

Page 13: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 1$, $m_2 = 1$, $m_3 = 0$:
$$p(c_3 = 1 \mid c_{1:2}) = \frac{1}{2 + \alpha}, \qquad p(c_3 = 2 \mid c_{1:2}) = \frac{1}{2 + \alpha}, \qquad p(c_3 = 3 \mid c_{1:2}) = \frac{\alpha}{2 + \alpha}$$

Page 14: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 2$, $m_2 = 1$, $m_3 = 0$:
$$p(c_4 = 1 \mid c_{1:3}) = \frac{2}{3 + \alpha}, \qquad p(c_4 = 2 \mid c_{1:3}) = \frac{1}{3 + \alpha}, \qquad p(c_4 = 3 \mid c_{1:3}) = \frac{\alpha}{3 + \alpha}$$

Page 15: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 2$, $m_2 = 2$, $m_3 = 0$:
$$p(c_5 = 1 \mid c_{1:4}) = \frac{2}{4 + \alpha}, \qquad p(c_5 = 2 \mid c_{1:4}) = \frac{2}{4 + \alpha}, \qquad p(c_5 = 3 \mid c_{1:4}) = \frac{\alpha}{4 + \alpha}$$

Page 16: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 2$, $m_2 = 2$, $m_3 = 1$, $m_4 = 0$:
$$p(c_6 = 1 \mid c_{1:5}) = \frac{2}{5 + \alpha}, \quad p(c_6 = 2 \mid c_{1:5}) = \frac{2}{5 + \alpha}, \quad p(c_6 = 3 \mid c_{1:5}) = \frac{1}{5 + \alpha}, \quad p(c_6 = 4 \mid c_{1:5}) = \frac{\alpha}{5 + \alpha}$$

Page 17: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 2$, $m_2 = 3$, $m_3 = 1$, $m_4 = 0$:
$$p(c_7 = 1 \mid c_{1:6}) = \frac{2}{6 + \alpha}, \quad p(c_7 = 2 \mid c_{1:6}) = \frac{3}{6 + \alpha}, \quad p(c_7 = 3 \mid c_{1:6}) = \frac{1}{6 + \alpha}, \quad p(c_7 = 4 \mid c_{1:6}) = \frac{\alpha}{6 + \alpha}$$

Page 18: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 2$, $m_2 = 3$, $m_3 = 2$, $m_4 = 0$:
$$p(c_8 = 1 \mid c_{1:7}) = \frac{2}{7 + \alpha}, \quad p(c_8 = 2 \mid c_{1:7}) = \frac{3}{7 + \alpha}, \quad p(c_8 = 3 \mid c_{1:7}) = \frac{2}{7 + \alpha}, \quad p(c_8 = 4 \mid c_{1:7}) = \frac{\alpha}{7 + \alpha}$$

Page 19: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 3$, $m_2 = 3$, $m_3 = 2$, $m_4 = 0$:
$$p(c_9 = 1 \mid c_{1:8}) = \frac{3}{8 + \alpha}, \quad p(c_9 = 2 \mid c_{1:8}) = \frac{3}{8 + \alpha}, \quad p(c_9 = 3 \mid c_{1:8}) = \frac{2}{8 + \alpha}, \quad p(c_9 = 4 \mid c_{1:8}) = \frac{\alpha}{8 + \alpha}$$

Page 20: Xiaolong  Wang and Daniel  Khashabi

Chinese Restaurant Process (CRP)

Running example continued, with $m_1 = 3$, $m_2 = 3$, $m_3 = 2$, $m_4 = 1$, $m_5 = 0$:
$$p(c_{10} = 1 \mid c_{1:9}) = \frac{3}{9 + \alpha}, \quad p(c_{10} = 2 \mid c_{1:9}) = \frac{3}{9 + \alpha}, \quad p(c_{10} = 3 \mid c_{1:9}) = \frac{2}{9 + \alpha}, \quad p(c_{10} = 4 \mid c_{1:9}) = \frac{1}{9 + \alpha}, \quad p(c_{10} = 5 \mid c_{1:9}) = \frac{\alpha}{9 + \alpha}$$
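The sequential rule above translates directly into a generative sampler. A minimal sketch (the function name, seed, and example values are ours):

```python
import numpy as np

def sample_crp(N, alpha, rng=None):
    """Draw cluster assignments c_1..c_N from a CRP with concentration alpha."""
    rng = np.random.default_rng(rng)
    counts = []                       # m_k for each existing table
    assignments = []
    for i in range(1, N + 1):
        probs = np.array(counts + [alpha], dtype=float) / (i - 1 + alpha)
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):          # new table opened with prob alpha / (i - 1 + alpha)
            counts.append(1)
        else:                         # existing table k joined with prob m_k / (i - 1 + alpha)
            counts[k] += 1
        assignments.append(k)
    return assignments, counts

print(sample_crp(10, alpha=1.0, rng=0))
```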

Page 21: Xiaolong  Wang and Daniel  Khashabi

CRP: Gibbs sampling
• The Gibbs sampler requires the full conditional
$$p(c_i = k \mid \mathbf{c}_{-i}, X) \propto p(X \mid \mathbf{c})\; p(c_i = k \mid \mathbf{c}_{-i})$$
• Finite Mixture Model:
$$p(c_i = k \mid \mathbf{c}_{-i}) = \frac{m_{-i,k} + \alpha/K}{N - 1 + \alpha}$$
• Infinite Mixture Model:
$$p(c_i = k \mid \mathbf{c}_{-i}) = \begin{cases} \dfrac{m_{-i,k}}{N - 1 + \alpha} & m_{-i,k} > 0 \\[4pt] \dfrac{\alpha}{N - 1 + \alpha} & k \ \text{is a new class} \\[4pt] 0 & \text{otherwise} \end{cases}$$
where $m_{-i,k}$ is the number of objects other than $i$ assigned to class $k$.
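One resampling step of this collapsed Gibbs sampler might look like the sketch below; `log_lik_fn` stands in for whatever observation model $p(X \mid \mathbf{c})$ is chosen, which the slides leave unspecified (the names and structure are ours):

```python
import numpy as np

def gibbs_step_assignment(i, c, alpha, log_lik_fn, rng=None):
    """Resample c[i] given all other assignments, using the infinite-mixture (CRP) prior.
    log_lik_fn(k, i, c) is a placeholder returning log p(X | c with c[i] = k)."""
    rng = np.random.default_rng(rng)
    N = len(c)
    others = np.delete(np.asarray(c), i)
    labels = sorted(set(others.tolist()))                  # classes with m_{-i,k} > 0
    counts = [int(np.sum(others == k)) for k in labels]
    new_label = (max(labels) + 1) if labels else 0
    candidates = labels + [new_label]
    log_prior = np.log(np.array(counts + [alpha], dtype=float) / (N - 1 + alpha))
    log_post = log_prior + np.array([log_lik_fn(k, i, c) for k in candidates])
    probs = np.exp(log_post - log_post.max())              # normalize in log space
    probs /= probs.sum()
    c[i] = candidates[rng.choice(len(candidates), p=probs)]
    return c
```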

Page 22: Xiaolong  Wang and Daniel  Khashabi

Beyond the limit of a single label
• In Latent Class Models:
  ▫ Each object (word) has only one latent label (topic)
  ▫ Finite number of latent labels: LDA
  ▫ Infinite number of latent labels: DPM
• In Latent Feature (latent structure) Models:
  ▫ Each object (graph) has multiple latent features (entities)
  ▫ Finite number of latent features: Finite Feature Model (FFM)
  ▫ Infinite number of latent features: Indian Buffet Process (IBP)
• In the feature matrix:
  ▫ Rows are data points
  ▫ Columns are latent features
• Movie preference example:
  ▫ Rows are movies, e.g. Rise of the Planet of the Apes
  ▫ Columns are latent features: made in the U.S.; is science fiction; has apes in it; ...

Page 23: Xiaolong  Wang and Daniel  Khashabi

Latent Feature Model
• F: latent feature matrix
• Z: binary matrix indicating which features each object possesses
• V: value matrix holding the feature values
• With p(F) = p(Z) · p(V)
• (Figure: F is obtained by combining Z and V elementwise; depending on V, the resulting features F are discrete or continuous.)

Page 24: Xiaolong  Wang and Daniel  Khashabi

Finite Feature Model
• Generating Z, an N × K binary matrix:
  ▫ For each column (feature) k, draw $\pi_k$ from a beta distribution, $\pi_k \sim \mathrm{Beta}\!\left(\tfrac{\alpha}{K}, 1\right)$
  ▫ For each object i, flip a coin with bias $\pi_k$: $z_{ik} \sim \mathrm{Bernoulli}(\pi_k)$
• Distribution of Z given $\boldsymbol{\pi}$, with $m_k = \sum_{i=1}^{N} z_{ik}$:
$$p(Z \mid \boldsymbol{\pi}) = \prod_{k=1}^{K} \prod_{i=1}^{N} p(z_{ik} \mid \pi_k) = \prod_{k=1}^{K} \pi_k^{m_k} (1 - \pi_k)^{N - m_k}$$
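Sampling such a matrix takes a few lines; the sketch below is ours, assuming the Beta(α/K, 1) prior named above:

```python
import numpy as np

def sample_finite_feature_matrix(N, K, alpha, rng=None):
    """Generate an N x K binary matrix Z from the finite beta-Bernoulli feature model:
    pi_k ~ Beta(alpha/K, 1), z_ik ~ Bernoulli(pi_k)."""
    rng = np.random.default_rng(rng)
    pi = rng.beta(alpha / K, 1.0, size=K)         # per-feature inclusion probabilities
    Z = (rng.random((N, K)) < pi).astype(int)     # each row flips K independent coins
    return Z

Z = sample_finite_feature_matrix(N=10, K=50, alpha=2.0, rng=0)
print(Z.sum(), "non-zero entries; roughly N * alpha =", 10 * 2.0, "is expected")
```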

Page 25: Xiaolong  Wang and Daniel  Khashabi

Finite Feature Model
• Generating Z, an N × K binary matrix:
  ▫ For each column k, draw $\pi_k$ from a beta distribution
  ▫ For each object, flip a coin with bias $\pi_k$
• Z is sparse:
  ▫ Even as $K \to \infty$, the expected number of non-zero entries stays bounded (it approaches $N\alpha$)

Page 26: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
• Difficulty:
  ▫ $P(Z) \to 0$ as $K \to \infty$
  ▫ Solution: define equivalence classes on random binary feature matrices
• Left-ordered-form function of binary matrices, lof(Z):
  ▫ Compute the history h of each feature (column) k: the column read as a binary number, with the first row as the most significant bit
  ▫ Order the features by h, decreasing
• (Figure: an example binary matrix shown in left-ordered form, columns labeled by their histories.)
• Z1 and Z2 are lof-equivalent iff lof(Z1) = lof(Z2); the equivalence class is denoted [Z].
• Describing [Z]: it is characterized by the counts $K_h$, the number of columns with history h.
• Cardinality of [Z]:
$$\binom{K}{K_0\; K_1\; \cdots\; K_{2^N - 1}} = \frac{K!}{\prod_{h=0}^{2^N - 1} K_h!}$$
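Computing the left-ordered form is mechanical; a small sketch (the most-significant-bit convention follows the definition above, the function name is ours):

```python
import numpy as np

def lof(Z):
    """Left-ordered form: sort the columns of a binary matrix Z by their history h
    (column read as a binary number, first row = most significant bit), decreasing."""
    Z = np.asarray(Z)
    N = Z.shape[0]
    weights = 2 ** np.arange(N - 1, -1, -1)    # first row carries the highest bit
    histories = weights @ Z                    # history h of every column
    order = np.argsort(-histories, kind="stable")
    return Z[:, order]

Z = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])
print(lof(Z))   # columns reordered by decreasing history
```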

Page 27: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
Given: the finite-K distribution over equivalence classes [Z].
Derive the limit when $K \to \infty$ (well defined almost surely).

Page 28: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
Given: the finite-K distribution over equivalence classes [Z].
Derive the limit when $K \to \infty$ (well defined almost surely).

Page 29: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
Given: the finite-K distribution over equivalence classes [Z].
Derive the limit when $K \to \infty$ (well defined almost surely).

Page 30: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
Given: the finite-K distribution over equivalence classes [Z].
Derive the limit when $K \to \infty$ (well defined almost surely); the harmonic sum $H_N = \sum_{j=1}^{N} \frac{1}{j}$ appears in the limit:
$$\lim_{K \to \infty} P([Z]) = \frac{\alpha^{K_+}}{\prod_{h > 0} K_h!}\, \exp\{-\alpha H_N\} \prod_{k=1}^{K_+} \frac{(N - m_k)!\,(m_k - 1)!}{N!}$$

Page 31: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 1st Representation
Given: the finite-K distribution over equivalence classes [Z], derive the limit when $K \to \infty$ (well defined almost surely).
Note:
• The limit is well defined because $K_+$ is finite almost surely.
• It depends on Z only through the counts $K_h$:
  ▫ the number of features (columns) with history h
• Permuting the rows (data points) does not change the $K_h$ (exchangeability).

Page 32: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 2nd Representation: Customers & Dishes
• An Indian restaurant with infinitely many dishes (columns).
• The first customer tastes the first $K_1^{(1)}$ dishes, where $K_1^{(1)} \sim \mathrm{Poisson}(\alpha)$.
• The i-th customer:
  ▫ Tastes each previously sampled dish k with probability $m_k / i$, where $m_k$ is the number of previous customers who took dish k
  ▫ Then tastes $K_1^{(i)}$ new dishes, sampled from the Poisson distribution $K_1^{(i)} \sim \mathrm{Poisson}(\alpha / i)$
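This generative description maps directly to code. A minimal sketch (the function name, seed, and parameter values are ours):

```python
import numpy as np

def sample_ibp(N, alpha, rng=None):
    """Generate a binary feature matrix Z (customers x dishes) from the Indian Buffet Process."""
    rng = np.random.default_rng(rng)
    dish_counts = []                 # m_k: number of customers who have taken dish k
    rows = []
    for i in range(1, N + 1):
        # taste each previously sampled dish k with probability m_k / i
        row = [int(rng.random() < m / i) for m in dish_counts]
        # then taste a Poisson(alpha / i) number of brand-new dishes
        new = rng.poisson(alpha / i)
        row.extend([1] * new)
        dish_counts = [m + z for m, z in zip(dish_counts, row)] + [1] * new
        rows.append(row)
    Z = np.zeros((N, len(dish_counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

print(sample_ibp(N=10, alpha=2.0, rng=0))
```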

Page 33: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 2nd Representation: Customers & Dishes
(Figure: a worked example with 10 customers, each dish choice annotated by its probability, e.g. 1/1, 1/2, 1/3, 2/4, 2/5, 3/6, 4/7, 3/8, 5/9, 6/10 and 1/1, 2/2, 3/3, 1/4, 4/5, 5/6, 6/7, 1/8, 7/9, 8/10.)

Page 34: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 2nd Representation: Customers & Dishes
From the per-customer choice probabilities we have
$$P(Z) = \frac{\alpha^{K_+}}{\prod_{i=1}^{N} K_1^{(i)}!}\, \exp\{-\alpha H_N\} \prod_{k=1}^{K_+} \frac{(N - m_k)!\,(m_k - 1)!}{N!}$$
• Permuting the first $K_1^{(1)}$ dishes, the next $K_1^{(2)}$, the next $K_1^{(3)}$, ..., and the last $K_1^{(N)}$ dishes (columns) does not change P(Z).
• The permuted matrices all belong to the same lof equivalence class [Z].
• Number of such permutations: $\prod_{i=1}^{N} K_1^{(i)}!$
Note: the 2nd representation is equivalent to the 1st representation; summing P(Z) over the matrices in [Z] recovers the P([Z]) of the 1st representation.

Page 35: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process, 3rd Representation: Distribution over Collections of Histories
• Directly generate the left-ordered-form (lof) matrix Z.
• For each history h:
  ▫ $m_h$: the number of non-zero elements in h
  ▫ Generate $K_h$ columns of history h
• The distribution over collections of histories:
$$K_h \sim \mathrm{Poisson}\!\left(\alpha\, \frac{(m_h - 1)!\,(N - m_h)!}{N!}\right), \qquad h = 1, \dots, 2^N - 1$$
• Note:
  ▫ Permuting the digits within h does not change $m_h$ (nor P(K))
  ▫ Permuting rows means the customers are exchangeable

Page 36: Xiaolong  Wang and Daniel  Khashabi

Indian Buffet Process
• Effective dimension of the model, $K_+$:
  ▫ Follows a Poisson distribution: $K_+ \sim \mathrm{Poisson}(\alpha H_N)$
  ▫ Derived from the 2nd representation by summing the independent Poisson components $\mathrm{Poisson}(\alpha / i)$
• Number of features possessed by each object:
  ▫ Follows a Poisson distribution: $\mathrm{Poisson}(\alpha)$
  ▫ Derived from the 2nd representation: the first customer chooses $\mathrm{Poisson}(\alpha)$ dishes, and the customers are exchangeable and can therefore be permuted
• Z is sparse:
  ▫ The expected number of non-zero entries in each row is $\alpha$, so the expected number of entries in Z is $N\alpha$
  ▫ Derived from the 2nd (or 1st) representation

Page 37: Xiaolong  Wang and Daniel  Khashabi

IBP: Gibbs sampling
• We need the full conditional
$$p(z_{ik} = 1 \mid Z_{-(ik)}, X) \propto p(X \mid Z)\; p(z_{ik} = 1 \mid Z_{-(ik)})$$
  ▫ $Z_{-(ik)}$ denotes the entries of Z other than $z_{ik}$
  ▫ $p(X \mid Z)$ depends on the model chosen for the observed data
• By exchangeability, consider generating row i as the last customer:
$$p(z_{ik} = 1 \mid \mathbf{z}_{-i,k}) = \frac{m_{-i,k}}{N},$$
where $m_{-i,k}$ is the number of other objects possessing feature k.
• By the IBP:
  ▫ If we sample $z_{ik} = 0$ and $m_{-i,k} = 0$: delete the column (feature) k
  ▫ At the end of the row, draw new dishes for object i from $\mathrm{Poisson}(\alpha / N)$, taking $p(X \mid Z)$ into account
  ▫ Approximated by truncation: compute probabilities for a range of numbers of new dishes up to an upper bound
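A sketch of the per-entry update, with `log_lik_fn` as a placeholder for log p(X | Z); the new-dish step with the truncated Poisson(α/N) proposal is omitted, and all names are ours:

```python
import numpy as np

def gibbs_update_zik(Z, i, k, log_lik_fn, rng=None):
    """One collapsed-Gibbs update for entry z_ik of the IBP feature matrix Z.
    log_lik_fn(Z) is a placeholder for log p(X | Z) under the chosen observation model."""
    rng = np.random.default_rng(rng)
    N = Z.shape[0]
    m_minus = Z[:, k].sum() - Z[i, k]        # m_{-i,k}: other rows possessing feature k
    if m_minus == 0:
        Z[i, k] = 0                          # singleton feature: left to the new-dish step
        return Z
    log_p = np.empty(2)
    for v in (0, 1):
        Z[i, k] = v
        prior = m_minus / N if v == 1 else 1.0 - m_minus / N
        log_p[v] = np.log(prior) + log_lik_fn(Z)
    p1 = 1.0 / (1.0 + np.exp(log_p[0] - log_p[1]))   # normalized P(z_ik = 1 | rest)
    Z[i, k] = int(rng.random() < p1)
    return Z
```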

Page 38: Xiaolong  Wang and Daniel  Khashabi

More on applications
• Applications
  ▫ As a prior distribution in models with an infinite number of features
  ▫ Modeling protein interactions
  ▫ Models of bipartite graphs in which one side has an undefined number of elements
  ▫ Binary matrix factorization for modeling dyadic data
  ▫ Extracting features from similarity judgments
  ▫ Latent features in link prediction
  ▫ Independent component analysis and sparse factor analysis
• More on inference
  ▫ Stick-breaking representation (Yee Whye Teh et al., 2007)
  ▫ Variational inference (Finale Doshi-Velez et al., 2009)
  ▫ Accelerated inference (Finale Doshi-Velez et al., 2009)

Page 39: Xiaolong  Wang and Daniel  Khashabi

References
• Griffiths, T. L., & Ghahramani, Z. (2011). The Indian buffet process: An introduction and review. Journal of Machine Learning Research, 12, 1185–1224.
• Slides: Tom Griffiths, "The Indian buffet process".