A Trainable Graph Combination Scheme for Belief Propagation
Kai Ju Liu
New York University



Images

Pairwise Markov Random Field

• Basic structure: vertices, edges
• Vertex i has a set of possible states X_i and an observed value y_i
• Compatibility between a state and its observed value: φ_i(x_i, y_i)
• Compatibility between neighboring vertices i and j: ψ_ij(x_i, x_j)

(Figure: example graph with vertices 1–5, states X_1–X_5, observations y_1–y_5, and edge potentials ψ_12, ψ_23, ψ_34, ψ_35, ψ_45.)

Pairwise MRF: Probabilities

• Joint probability:

  p(x_1, …, x_5) = (1/Z) ∏_i φ_i(x_i, y_i) ∏_(i,j) ψ_ij(x_i, x_j)

• Marginal probability:

  p_i(x_i) = ∑_{x_j, j ≠ i} p(x_1, …, x_5)

  – Advantage: allows averaging over ambiguous states
  – Disadvantage: complexity exponential in the number of vertices
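To make the exponential cost of exact marginalization concrete, here is a minimal brute-force sketch (not from the talk) on a 3-vertex chain with made-up binary potentials; it enumerates every joint configuration, which is exactly what becomes infeasible as the vertex count grows.

```python
# Brute-force joint and marginal probabilities for a tiny pairwise MRF.
# Hypothetical setup: chain 1-2-3, binary states, made-up potentials.
import itertools
import numpy as np

S = 2                                   # states per vertex
phi = [np.array([0.9, 0.1]),            # phi_i(x_i, y_i) with y_i held fixed
       np.array([0.5, 0.5]),
       np.array([0.2, 0.8])]
psi = np.array([[1.0, 0.3],             # psi_ij(x_i, x_j), shared by both edges
                [0.3, 1.0]])
edges = [(0, 1), (1, 2)]

def joint(x):
    """Unnormalized joint: prod_i phi_i(x_i) * prod_(i,j) psi(x_i, x_j)."""
    p = np.prod([phi[i][x[i]] for i in range(3)])
    for i, j in edges:
        p *= psi[x[i], x[j]]
    return p

# Partition function Z sums over all S**3 configurations.
Z = sum(joint(x) for x in itertools.product(range(S), repeat=3))

def marginal(i):
    """p_i(x_i): sum the joint over all other vertices (exponential cost)."""
    m = np.zeros(S)
    for x in itertools.product(range(S), repeat=3):
        m[x[i]] += joint(x) / Z
    return m

print(marginal(0))  # sums to 1
```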

Belief Propagation

• Beliefs replace probabilities:

  b_i(x_i) = (1/z_i) φ_i(x_i, y_i) ∏_{j ∈ N(i)} m_ji(x_i)

• Messages propagate information:

  m_ji(x_i) = ∑_{x_j} ψ_ij(x_i, x_j) φ_j(x_j, y_j) ∏_{k ∈ N(j)\i} m_kj(x_j)

(Figure: the example graph with messages m_12, m_21, m_23, m_32, m_34, m_43, m_35, m_53 passed along its edges.)
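The belief and message equations above can be sketched in Python on the talk's five-vertex tree (edges 1–2, 2–3, 3–4, 3–5, here 0-indexed); the potentials are made up for illustration. Because the graph is singly connected, the computed beliefs match the brute-force marginals exactly.

```python
# Belief propagation on a 5-vertex tree with hypothetical random potentials.
import itertools
import numpy as np

S = 2
rng = np.random.default_rng(0)
phi = {i: rng.random(S) + 0.1 for i in range(5)}      # phi_i(x_i, y_i), y fixed
edges = [(0, 1), (1, 2), (2, 3), (2, 4)]
psi = {e: rng.random((S, S)) + 0.1 for e in edges}
nbrs = {i: [] for i in range(5)}
for i, j in edges:
    nbrs[i].append(j); nbrs[j].append(i)

def edge_pot(i, j, xi, xj):
    return psi[(i, j)][xi, xj] if (i, j) in psi else psi[(j, i)][xj, xi]

msg = {}
def message(j, i):
    """m_ji(x_i) = sum_{x_j} psi_ij(x_i,x_j) phi_j(x_j) prod_{k in N(j)\\i} m_kj(x_j)."""
    if (j, i) not in msg:
        m = np.zeros(S)
        for xi in range(S):
            for xj in range(S):
                prod = phi[j][xj] * edge_pot(i, j, xi, xj)
                for k in nbrs[j]:
                    if k != i:
                        prod *= message(k, j)[xj]   # recursion terminates on trees
                m[xi] += prod
        msg[(j, i)] = m
    return msg[(j, i)]

def belief(i):
    """b_i(x_i) = (1/z_i) phi_i(x_i) prod_{j in N(i)} m_ji(x_i)."""
    b = phi[i].copy()
    for j in nbrs[i]:
        b *= message(j, i)
    return b / b.sum()

# On a singly-connected graph, beliefs equal the exact marginals.
def joint(x):
    p = np.prod([phi[i][x[i]] for i in range(5)])
    for i, j in edges:
        p *= psi[(i, j)][x[i], x[j]]
    return p

Z = sum(joint(x) for x in itertools.product(range(S), repeat=5))
for i in range(5):
    exact = np.zeros(S)
    for x in itertools.product(range(S), repeat=5):
        exact[x[i]] += joint(x) / Z
    assert np.allclose(belief(i), exact)
```

Each directed message is computed once and cached, which is what makes BP on trees cheap compared to the brute-force sum.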

BP: Questions

• When can we calculate beliefs exactly?
• When do beliefs equal probabilities?
• When is belief propagation efficient?

Answer: Singly-Connected Graphs (SCGs)

• Graphs without loops
• Messages terminate at leaf vertices
• Beliefs equal probabilities
• Complexity in the previous example reduced from 13S^5 to 24S^2 (for S states per vertex)

BP on Loopy Graphs

• Messages do not terminate
• Energy approximation schemes [Freeman et al.]
  – Standard belief propagation
  – Generalized belief propagation
• Standard belief propagation
  – Approximates the Gibbs free energy of the system by the Bethe free energy
  – Iterates, requiring convergence criteria

(Figure: four-vertex loop 1–2–3–4 with messages m_21, m_32, m_43, m_14 cycling around it.)

BP on Loopy Graphs

• Tree-based reparameterization [Wainwright]
  – Reparameterizes distributions on singly-connected graphs
  – Convergence improved compared to standard belief propagation
  – Permits calculation of bounds on approximation errors

BP-TwoGraphs

• Eliminates iteration
• Utilizes advantages of SCGs

BP-TwoGraphs

• Consider a loopy graph with n vertices
• Select two sets of SCGs that approximate the graph: G_1, …, G_n and H_1, …, H_n
• Calculate beliefs on each set of SCGs: b_i^G(x_i) and b_i^H(x_i)
• Select the set of beliefs with minimum entropy:

  b_i(x_i) = argmin_{b ∈ {b_i^G, b_i^H}} ( −∑_{x_i} b(x_i) log b(x_i) )
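The minimum-entropy selection step can be sketched in a few lines: given two candidate beliefs for a vertex (hypothetical values below), keep the lower-entropy, i.e. more confident, one.

```python
# Minimum-entropy belief selection, the combination step of BP-TwoGraphs.
# b_G and b_H are hypothetical beliefs for one vertex from the two SCG sets.
import numpy as np

def entropy(b):
    b = np.asarray(b, dtype=float)
    return -np.sum(b * np.log(b + 1e-12))   # small epsilon guards log(0)

b_G = np.array([0.55, 0.45])   # ambiguous belief from graph set G_i
b_H = np.array([0.95, 0.05])   # confident belief from graph set H_i

chosen = min([b_G, b_H], key=entropy)
print(chosen)                   # the lower-entropy belief, here b_H
```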

BP-TwoGraphs on Images

• Rectangular grid of pixel vertices
• H_i: horizontal graphs
• G_i: vertical graphs

(Figure: original grid graph, a horizontal graph, and a vertical graph.)

Image Segmentation

(Figure: original image, after adding noise, and the segmentation result.)

Image Segmentation Results

Image Segmentation Revisited

(Figure: noisy input with ground truth, and the max-flow result compared against ground truth.)

Image Segmentation: Horizontal Graph Analysis

Image Segmentation: Vertical Graph Analysis

BP-TwoLines

• Rectangular grid of pixel vertices
• H_i: horizontal lines
• G_i: vertical lines

(Figure: original grid graph, a horizontal line, and a vertical line.)
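A compact sketch of the BP-TwoLines idea on a tiny 3×3 grid, with made-up potentials: run exact BP along every row and every column (a line is the simplest SCG, so chain BP is exact), then select per-pixel between the row and column beliefs by minimum entropy.

```python
# BP-TwoLines sketch: exact chain BP per row and per column of a 3x3 grid,
# then per-pixel minimum-entropy selection. Potentials are hypothetical.
import numpy as np

S = 2
H, W = 3, 3
rng = np.random.default_rng(1)
phi = rng.random((H, W, S)) + 0.1               # phi_i(x_i, y_i), y fixed
psi = np.array([[1.0, 0.3], [0.3, 1.0]])        # smoothing edge potential

def chain_beliefs(node_pots):
    """Exact BP (forward-backward messages) on a chain of unary potentials."""
    n = len(node_pots)
    fwd = [np.ones(S) for _ in range(n)]        # messages arriving from the left
    bwd = [np.ones(S) for _ in range(n)]        # messages arriving from the right
    for t in range(1, n):
        fwd[t] = psi.T @ (node_pots[t - 1] * fwd[t - 1])
    for t in range(n - 2, -1, -1):
        bwd[t] = psi @ (node_pots[t + 1] * bwd[t + 1])
    b = np.array([node_pots[t] * fwd[t] * bwd[t] for t in range(n)])
    return b / b.sum(axis=1, keepdims=True)     # normalized beliefs per vertex

def entropy(b):
    return -np.sum(b * np.log(b + 1e-12), axis=-1)

b_rows = np.array([chain_beliefs(phi[r]) for r in range(H)])      # (H, W, S)
b_cols = np.array([chain_beliefs(phi[:, c]) for c in range(W)])   # (W, H, S)
b_cols = b_cols.transpose(1, 0, 2)                                # -> (H, W, S)

# Per pixel, keep whichever belief (row or column) has lower entropy.
use_rows = entropy(b_rows) <= entropy(b_cols)                     # (H, W)
beliefs = np.where(use_rows[..., None], b_rows, b_cols)
labels = beliefs.argmax(axis=-1)
print(labels)
```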

Image Segmentation Results II

Image Segmentation Results III

Natural Image Segmentation

Boundary-Based Image Segmentation: Window Vertices

• Square 2-by-2 window of pixels

• Each pixel has two states

– foreground

– background
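Since each of the four pixels in a window is binary, a window vertex has 2^4 = 16 joint states; a one-line enumeration (an illustration of the state space, not the talk's code) makes this explicit.

```python
# State space of a 2x2 window vertex: four binary pixels -> 16 joint states.
import itertools

states = list(itertools.product([0, 1], repeat=4))  # (p00, p01, p10, p11)
print(len(states))  # 16
```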

Boundary-Based Image Segmentation: Overlap

Boundary-Based Image Segmentation: Graph

Real Image Segmentation: Training

Real Image Segmentation: Results

Real Image Segmentation: Gorilla Results

Conclusion

• BP-TwoGraphs
  – Accurate and efficient
  – Extensive use of beliefs
  – Trainable parameters

• Future work
  – Multiple states
  – Stereo
  – Image fusion