
Data-Driven Markov Chain Monte Carlo

Presented by Tomasz Malisiewicz for Advanced Perception

3/1/2006

Overview of Talk

• What is Image Segmentation?

• How to find a good segmentation?

• DDMCMC results

Image segmentation in a Bayesian statistical framework

Markov Chain Monte Carlo for exploring the space of all segmentations

Data-Driven methods for exploiting image data and speeding up MCMC

DDMCMC Motivation

• Iterative approach: consider many different segmentations and keep the good ones

• Few tunable parameters, e.g. the desired number of segments is encoded in the prior

• DDMCMC vs Ncuts

[Figure: image 326038 from the Berkeley Segmentation Database; panels: Berkeley, Ncuts (K=30), DDMCMC]

Why a rigorous formulation?

• Allows us to define what we want the segmentation algorithm to return

• Assigning a Score to a segmentation

Formulation #1 (and you thought you knew what image segmentation was)

• Image lattice: Λ, the set of all pixel sites (i, j)

• Image: I defined on Λ; for any point v ∈ Λ, I_v is either a gray level or a color value

• Lattice partition into K disjoint regions: Λ = R_1 ∪ … ∪ R_K, with R_i ∩ R_j = ∅ for i ≠ j

• A region R_i is a discrete label map; its boundary ∂R_i is continuous

• An image partition into disjoint regions is not an image segmentation! Region contents are key!

Formulation #2 (and you thought you knew what image segmentation was)

• Each Image Region is a realization from a probabilistic model

• Θ_i are the parameters of the model indexed by ℓ_i

• A segmentation is denoted by a vector of hidden variables W = (K, {(R_i, ℓ_i, Θ_i) : i = 1, …, K}); K is the number of regions

• Bayesian framework: over the space of all segmentations Ω,

p(W | I) ∝ p(I | W) p(W)   (posterior ∝ likelihood × prior)

Prior over segmentations (do you like exponentials?)

• The prior factorizes over regions, with exponential penalty terms that express what we want:

• want fewer regions (penalty on K)

• want round-ish regions (penalty on boundary length relative to area)

• want small regions (penalty on region area)

• want less complex models (penalty on the number of model parameters)

• the prior over model types is ~ uniform (a toy log-prior in this spirit is sketched below)
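A toy log-prior in this spirit, as a hedged sketch (the functional forms and coefficients below are illustrative placeholders, not the paper's exact energy terms):

```python
import math

def log_prior(regions, lam=1.0, gamma=1.0, mu=1.0, nu=0.1, c=0.9):
    """Illustrative log-prior over a segmentation.

    regions: list of dicts with keys 'area', 'boundary_length', 'num_params'.
    All coefficients and exponents are placeholders, not the paper's values."""
    K = len(regions)
    lp = -lam * K                                   # want fewer regions
    for r in regions:
        lp -= gamma * r['area'] ** c                # want small regions
        lp -= mu * r['boundary_length'] / math.sqrt(r['area'])  # want round-ish regions
        lp -= nu * r['num_params']                  # want less complex models
    return lp                                       # model-type prior ~ uniform (constant)
```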

Likelihood for Images

• Visual Patterns are independent stochastic processes

• ℓ_i is the model-type index, Θ_i is the model parameter vector, and I_{R_i} is the image appearance in the i-th region

• Likelihood: p(I | W) = ∏_{i=1}^K p(I_{R_i}; Θ_i, ℓ_i)

• Separate model families are used for grayscale and for color images

Four Gray-level Models

• Gray-level model space: four model types

• Uniform: Gaussian intensity model

• Clutter: intensity histogram

• Texture: filter-bank (FB) response histogram

• Shading: B-spline surface

(a toy comparison of two of these region likelihoods follows)
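As a rough, hedged illustration of two of these families (simplified stand-ins, not the paper's exact models), a region's log-likelihood under a Gaussian "uniform" model versus a histogram "clutter" model could be compared like this:

```python
import numpy as np

def loglik_uniform(pixels):
    """'Uniform' region: intensities modeled as a single Gaussian (ML fit)."""
    mu, sigma = pixels.mean(), pixels.std() + 1e-6
    return float(np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                        - (pixels - mu) ** 2 / (2.0 * sigma**2)))

def loglik_clutter(pixels, bins=32):
    """'Clutter' region: intensities modeled by their empirical histogram."""
    hist, edges = np.histogram(pixels, bins=bins, range=(0.0, 1.0), density=True)
    idx = np.clip(np.digitize(pixels, edges) - 1, 0, bins - 1)
    return float(np.sum(np.log(hist[idx] + 1e-12)))

# A smooth region scores higher under the Gaussian; a multi-modal one under the histogram.
smooth = np.clip(np.random.normal(0.5, 0.02, 500), 0.0, 1.0)
print(loglik_uniform(smooth), loglik_clutter(smooth))
```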

Three Color Models (L*,u*,v*)

• Gaussian

• Mixture of two Gaussians

• Bézier spline

• These three model types form the color model space

Calibration

• Likelihoods are calibrated using an empirical study

• Calibration is required to make likelihoods for different models comparable (necessary for model competition)

• Principled? Or hack?

What did we just do?

Def. of segmentation: W = (K, {(R_i, ℓ_i, Θ_i) : i = 1, …, K})

Score (probability) of segmentation: p(W | I) ∝ p(I | W) p(W)

Likelihood of image = product of region likelihoods: p(I | W) = ∏_{i=1}^K p(I_{R_i}; Θ_i, ℓ_i)

Regions defined by a K-partition of the lattice: Λ = R_1 ∪ … ∪ R_K, R_i ∩ R_j = ∅ for i ≠ j
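A minimal sketch of these definitions in code (hypothetical data structures, not the authors' implementation): the segmentation W holds K regions with their model types and parameters, and its score is the log-posterior.

```python
from dataclasses import dataclass
from typing import Callable, List
import numpy as np

@dataclass
class Region:
    pixels: np.ndarray        # image appearance I_{R_i} in this region
    model_type: str           # model-type index  l_i
    params: dict              # model parameter vector  Theta_i
    boundary_length: float

@dataclass
class Segmentation:           # W = (K, {(R_i, l_i, Theta_i)}); K = len(regions)
    regions: List[Region]

def log_posterior(W: Segmentation,
                  region_loglik: Callable[[Region], float],
                  log_prior: Callable[[Segmentation], float]) -> float:
    """log p(W | I) = log p(I | W) + log p(W) + const, where the likelihood
    factorizes over regions (independent stochastic processes)."""
    return sum(region_loglik(r) for r in W.regions) + log_prior(W)
```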

What do we do with scores?

Search

Search through what? Anatomy of Solution Space

• Space of all K-partitions of the lattice: the partition space for a given K

• General partition space: the union of the K-partition spaces over all K

• Space of all segmentations (the scene space): a partition combined with, for each region, a choice of model from one of the model spaces

Searching through segmentations

• Exhaustive enumeration of all segmentations: takes too long!

• Greedy search (gradient ascent): local minima!

• Stochastic search: takes too long

• MCMC-based exploration: described in the rest of this talk!

Why MCMC

• What is it?

- A clever way of searching through a high-dimensional space

- A general-purpose technique for generating samples from a probability distribution

• What does it do?

- Iteratively searches through the space of all segmentations by constructing a Markov chain which converges to its stationary distribution, the posterior p(W | I)

Designing Markov Chains

• Three Markov Chain requirements

• Ergodic: from an initial segmentation W0, any other state W can be visited in finite time (no greedy algorithms); ensured by jump-diffusion dynamics

• Aperiodic: ensured by random dynamics

• Detailed Balance: every move is reversible
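Detailed balance is what the usual Metropolis-Hastings acceptance rule provides; a generic sketch (log_posterior and propose are assumed helpers, with propose returning the forward and reverse proposal probabilities):

```python
import math
import random

def mh_step(W, log_posterior, propose):
    """One reversible move.  propose(W) -> (W_new, q_forward, q_reverse)."""
    W_new, q_fwd, q_rev = propose(W)
    log_alpha = (log_posterior(W_new) - log_posterior(W)
                 + math.log(q_rev) - math.log(q_fwd))
    if math.log(random.random()) < min(0.0, log_alpha):
        return W_new    # accept: move to the proposed segmentation
    return W            # reject: stay at the current segmentation
```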

5 Dynamics

1.) Boundary Diffusion

2.) Model Adaptation

3.) Split Region

4.) Merge Region

5.) Switch Region Model

At each iteration, we choose one of the five dynamics with probabilities q(1), q(2), q(3), q(4), q(5) (see the sketch below)
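A sketch of that outer loop (the q values and the apply_dynamic helper are illustrative assumptions):

```python
import random

DYNAMICS = ["boundary_diffusion", "model_adaptation",
            "split_region", "merge_region", "switch_model"]
Q = [0.3, 0.2, 0.2, 0.2, 0.1]   # placeholder values for q(1)..q(5)

def run_chain(W, apply_dynamic, n_iters=10000):
    """Outer MCMC loop: pick one dynamic per iteration with probability q(i)."""
    for _ in range(n_iters):
        move = random.choices(DYNAMICS, weights=Q, k=1)[0]
        W = apply_dynamic(move, W)   # each move does its own accept/reject
    return W
```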

Dynamics 1: Boundary Diffusion

• Diffusion* of the boundary between regions i and j

• Brownian motion along the curve normal

• Temperature decreases over time (annealing)

*Movement within partition space
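A toy sketch of the stochastic part of this move (the real dynamics also include a deterministic drift term driven by the competing region models, omitted here; step sizes are illustrative):

```python
import numpy as np

def diffuse_boundary(points, normals, n_steps=100, T0=1.0, decay=0.99, dt=0.1):
    """points: (N, 2) boundary coordinates; normals: (N, 2) unit curve normals.
    Each boundary point takes a Brownian step along its normal while the
    temperature T anneals toward zero."""
    T = T0
    for _ in range(n_steps):
        step = np.sqrt(2.0 * T * dt) * np.random.randn(len(points), 1)
        points = points + step * normals   # Brownian motion along the curve normal
        T *= decay                         # temperature decreases over time
    return points
```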

Dynamics 2: Model Adaptation

• Fit the parameters* of a region by steepest ascent

*Movement within cue space
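For example, a hedged sketch of steepest ascent on a Gaussian region model's log-likelihood (simple models like this actually admit closed-form fits; gradient steps are shown only to illustrate the idea):

```python
import numpy as np

def adapt_gaussian(pixels, mu=0.5, sigma=0.2, lr=1e-3, n_steps=200):
    """Steepest ascent on the Gaussian log-likelihood of one region's pixels.
    Movement stays within the cue space of this model type."""
    n = len(pixels)
    for _ in range(n_steps):
        d_mu = np.sum(pixels - mu) / sigma**2
        d_sigma = np.sum((pixels - mu) ** 2 / sigma**3 - 1.0 / sigma)
        mu += lr * d_mu / n                          # gradient step on the mean
        sigma = max(1e-3, sigma + lr * d_sigma / n)  # keep sigma positive
    return mu, sigma
```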

Dynamics 3-4: Split and Merge

• Split one region into two; the remaining variables are unchanged

• Probability of the proposed split: the conditional probability of how likely the chain proposes to move to W' from W

• Data-driven speedup

Dynamics 3-4: Split and Merge

• Merge two regions; the remaining variables are unchanged

• Probability of the proposed merge

• Data-driven speedup

(a sketch of the accept/reject step shared by split and merge follows)
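A sketch of that accept/reject step (hypothetical helper names; q_forward and q_reverse are the proposal probabilities of the jump and of undoing it, which data-driven proposals make far more informative than blind guesses):

```python
import math
import random

def try_jump(W, log_posterior, propose, q_forward, q_reverse):
    """Propose W -> W' by a split (or merge) and accept it with the usual
    Metropolis-Hastings ratio, so the move and its reverse stay in balance.

    propose(W) returns a candidate W'; q_forward(W, W') and q_reverse(W', W)
    are the proposal probabilities of the jump and of the reverse jump."""
    W_new = propose(W)
    log_alpha = (log_posterior(W_new) - log_posterior(W)
                 + math.log(q_reverse(W_new, W)) - math.log(q_forward(W, W_new)))
    return W_new if math.log(random.random()) < min(0.0, log_alpha) else W
```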

Dynamics 5: Model Switching

• Change models

• Proposal probabilities: data-driven speedup

Motivation of DD

• Region Splitting: How to decide where to split a region?

• Model Switching: Once we switch to a new model, what parameters do we jump to?

• Model adaptation requires some initial parameter vector to start from

Data Driven Methods

• Focus on boundaries and model parameters derived from data: compute these before MCMC starts

• Cue particles: clustering in model space

• K-partition particles: edge detection

• Particles encode probabilities, Parzen-window style

Cue Particles in Action: Clustering in Color Space
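A toy sketch of computing cue particles before the chain runs: cluster pixel colors and treat each cluster center as a particle with a weight (k-means here is an illustrative stand-in for whatever clustering is actually used):

```python
import numpy as np

def color_cue_particles(pixels_luv, k=5, n_iters=20, seed=0):
    """pixels_luv: (N, 3) colors in (L*, u*, v*).  Returns k cluster centers
    ('cue particles') and their weights (fraction of pixels in each cluster)."""
    rng = np.random.default_rng(seed)
    centers = pixels_luv[rng.choice(len(pixels_luv), k, replace=False)].astype(float)
    labels = np.zeros(len(pixels_luv), dtype=int)
    for _ in range(n_iters):
        dists = np.linalg.norm(pixels_luv[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels_luv[labels == j].mean(axis=0)
    weights = np.bincount(labels, minlength=k) / len(pixels_luv)
    return centers, weights
```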

K-partition Particles in Action

• Edge detection gives us a good idea of where we expect a boundary to be located

Particles or Parzen Window* Locations?

• What is this particle business about?

• A particle is just the position of a Parzen window which is used for density estimation

[Figure: 1D particles]

*Parzen windowing is also known as kernel density estimation or non-parametric density estimation
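A minimal 1D example of "particles encode probabilities Parzen-window style": each particle is the location of a kernel, and the estimated density is the weighted sum of those kernels:

```python
import numpy as np

def parzen_density(x, particles, weights=None, h=0.1):
    """Kernel density estimate at points x from 1D particle locations.
    Each particle contributes one Gaussian window of bandwidth h."""
    particles = np.asarray(particles, dtype=float)
    if weights is None:
        weights = np.full(len(particles), 1.0 / len(particles))
    diffs = (np.asarray(x, dtype=float)[:, None] - particles[None, :]) / h
    kernels = np.exp(-0.5 * diffs**2) / (h * np.sqrt(2.0 * np.pi))
    return kernels @ weights

# e.g. evaluate the density encoded by three 1D particles
print(parzen_density(np.linspace(0.0, 1.0, 5), particles=[0.2, 0.5, 0.8]))
```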

Are you awake: What did we just do?

• Scores (probability of a segmentation) and how to search with them

• 5 MCMC dynamics

• Data-Driven Speedup (key to making MCMC work in finite time)

So what type of answer does the Markov Chain return?

What can we do with this answer? How many answers do we want?

Multiple Solutions

• MAP gives us one solution

• Output of MCMC sampling

How do we get multiple solutions?

Parzen Windows Again: Scene Particles

Why multiple solutions?

• Segmentation is often not the final stage of computation

• A higher level task such as recognition can utilize a segmentation

• We don’t want to make any hard decision before recognition

• multiple segmentations = good idea

K-adventurers

• We want to keep a fixed number K of segmentations but we don’t want to keep trivially different segmentations

• Goal: keep the K segmentations that best preserve the posterior probability in a KL-divergence sense (see the sketch after the greedy algorithm)

• Greedy Algorithm:

- Add new particle, remove worst particle
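A hedged sketch of that greedy loop (the "unique mass" criterion below is a simple placeholder for the paper's KL-based selection, and distance is an assumed dissimilarity between segmentations):

```python
import numpy as np

def k_adventurers_update(particles, weights, new_particle, new_weight,
                         distance, K=10):
    """Maintain K diverse, high-probability segmentations.

    particles, weights: current segmentations and their (unnormalized)
    posterior weights.  distance(a, b) is an assumed dissimilarity between
    two segmentations.  A particle that has low weight and sits close to
    another kept particle contributes little and is dropped first."""
    particles = list(particles) + [new_particle]
    weights = list(weights) + [new_weight]
    if len(particles) <= K:
        return particles, weights
    losses = []
    for i, (p, w) in enumerate(zip(particles, weights)):
        nearest = min(distance(p, q) for j, q in enumerate(particles) if j != i)
        losses.append(w * nearest)      # approximate 'unique posterior mass'
    worst = int(np.argmin(losses))      # remove the least useful particle
    return ([p for i, p in enumerate(particles) if i != worst],
            [w for i, w in enumerate(weights) if i != worst])
```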

Results (Multiple Solutions)

Results

Results (Color Images): http://www.stat.ucla.edu/~ztu/DDMCMC/benchmark_color/benchmark_color.htm

Conclusions

• DDMCMC: Combines Generative (top-down) and Discriminative (bottom-up) approaches

• Traverse the space of all segmentations via Markov Chains

• Does your head hurt?

• Questions?

References

• DDMCMC Paper: http://www.cs.cmu.edu/~efros/courses/AP06/Papers/tu-pami-02.pdf

• DDMCMC Website: http://www.stat.ucla.edu/%7Eztu/DDMCMC/DDMCMC_segmentation.htm

• MCMC Tutorial by Authors: http://civs.stat.ucla.edu/MCMC/MCMC_tutorial.htm
