Probabilistic Models for Images
Markov Random Fields: Applications in Image Segmentation and Texture Modeling
Ying Nian Wu, UCLA Department of Statistics
IPAM, July 22, 2013

Outline
• Basic concepts, properties, examples
• Markov chain Monte Carlo sampling
• Modeling textures and objects
• Application in image segmentation
Markov Chains
Markov property: Pr(future | present, past) = Pr(future | present), i.e., future ⊥ past | present.
This conditional independence (limited dependence) makes modeling and learning possible.
Markov Chains (higher order)
Temporal: a natural ordering
Spatial: 2D image, no natural ordering
Markov Random Fields
Pr(pixel | all the other pixels) = Pr(pixel | its neighbors)
Nearest neighborhood (first-order neighborhood): the four pixels above, below, left, and right.
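As a concrete sketch of the first-order neighborhood system (the function name and boundary handling are illustrative choices, not from the slides):

```python
def neighbors_first_order(i, j, h, w):
    """4-nearest neighbors (first-order neighborhood) of pixel (i, j)
    on an h x w grid, clipped at the image boundary."""
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(r, c) for r, c in cand if 0 <= r < h and 0 <= c < w]

# Interior pixels have 4 neighbors; corners have 2; other border pixels have 3.
print(neighbors_first_order(1, 1, 3, 3))  # [(0, 1), (2, 1), (1, 0), (1, 2)]
print(len(neighbors_first_order(0, 0, 3, 3)))  # 2
```

Note that the relation is reciprocal: if (r, c) is a neighbor of (i, j), then (i, j) is a neighbor of (r, c).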
Markov Property
From Slides by S. Seitz - University of Washington
Markov Random Fields
Second-order neighborhood: the eight surrounding pixels.
Markov Random Fields
Can be generalized to any undirected graph (nodes, edges).
Neighborhood system: each node is connected to its neighbors; the neighbor relation is reciprocal.
Markov property: each node depends only on its neighbors.
Note: the black lines in the left graph illustrate the 2D grid of image pixels; they are not edges of the graph, unlike the blue lines on the right.
Markov Random Fields
What is the joint distribution Pr(I)?
Cliques for this neighborhood
Hammersley-Clifford Theorem
p(I) = (1/Z) exp{ −Σ_{c ∈ C} V_c(I) }
Z: normalizing constant (partition function)
V_c: potential functions of cliques
Cliques for this neighborhood
Hammersley-Clifford Theorem
a clique: a set of pixels in which each member is a neighbor of every other member
Gibbs distribution
Cliques for this neighborhood: single pixels, neighboring pairs, etc.
Note: the black lines illustrate the 2D grid; they are not edges in the graph.
Ising model
p(I) = (1/Z) exp{ β Σ_{(i,j) neighbors} I_i I_j },  with I_i ∈ {−1, +1}
pair potential: β I_i I_j for each neighboring pair (i, j)
Challenge: show that the conditional distribution Pr(I_i | all other pixels) is an auto-logistic regression, i.e., a logistic function of the sum of the neighbors.
Gaussian MRF model
Continuous version, with pair potential (I_i − I_j)^2:
p(I) ∝ exp{ −β Σ_{(i,j) neighbors} (I_i − I_j)^2 }
Challenge: show that the conditional distribution of each pixel given its neighbors is an auto-regression, i.e., a normal distribution centered at the average of the neighbors.
Sampling from MRF Models
Markov chain Monte Carlo (MCMC)
• Gibbs sampler (Geman & Geman 1984)
• Metropolis algorithm (Metropolis et al. 1953)
• Swendsen & Wang (1987)
• Hybrid (Hamiltonian) Monte Carlo
Gibbs Sampler
Each update only requires sampling from a simple one-dimensional distribution.
Repeat:
• Randomly pick a pixel i
• Sample I_i given the current values of all other pixels (by the Markov property, this conditional depends only on the neighbors of i)
Gibbs sampler for Ising model
Challenge: sample from Ising model
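A minimal Gibbs sampler for the Ising model might look like the following sketch (grid size, β, and the number of sweeps are arbitrary illustrative choices):

```python
import numpy as np

def gibbs_ising(h=16, w=16, beta=0.5, sweeps=50, rng=None):
    """Gibbs sampler for the Ising model p(I) ~ exp(beta * sum_{i~j} I_i I_j).
    Each step resamples one pixel from its conditional given its 4 neighbors;
    that conditional is a logistic function of the neighbor sum
    (the auto-logistic regression from the earlier challenge)."""
    rng = np.random.default_rng(rng)
    I = rng.choice([-1, 1], size=(h, w))
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                s = 0  # sum of the values of the neighbors of (i, j)
                for r, c in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                    if 0 <= r < h and 0 <= c < w:
                        s += I[r, c]
                # Pr(I_ij = +1 | neighbors) = e^{beta s} / (e^{beta s} + e^{-beta s})
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
                I[i, j] = 1 if rng.random() < p_plus else -1
    return I

sample = gibbs_ising(beta=0.8, sweeps=30, rng=0)
print(sample.shape)
```

For β above the critical value, samples develop large aligned regions; for small β they stay close to noise.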
Metropolis Algorithm
Repeat:
• Proposal: perturb I to J by sampling from a symmetric kernel K(I, J) = K(J, I)
• If E(J) ≤ E(I), change I to J; otherwise change I to J with probability exp{−(E(J) − E(I))}
where E is the energy function: p(I) ∝ exp{−E(I)}
Metropolis for Ising model
Challenge: sample from Ising model
Ising model: proposal --- randomly pick a pixel and flip it
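The single-flip Metropolis sampler for the Ising model can be sketched as follows (parameters are again illustrative; for a single-pixel flip, the energy change has the simple closed form used below):

```python
import numpy as np

def metropolis_ising(h=16, w=16, beta=0.5, steps=20000, rng=None):
    """Metropolis sampler for the Ising model.
    Proposal: flip one randomly chosen pixel (a symmetric kernel).
    Accept if the energy decreases; otherwise accept with
    probability exp(-(E(J) - E(I)))."""
    rng = np.random.default_rng(rng)
    I = rng.choice([-1, 1], size=(h, w))
    for _ in range(steps):
        i, j = rng.integers(h), rng.integers(w)
        s = 0  # neighbor sum of the proposed flip site
        for r, c in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= r < h and 0 <= c < w:
                s += I[r, c]
        # Flipping I_ij changes the energy by 2 * beta * I_ij * s.
        delta_E = 2.0 * beta * I[i, j] * s
        if delta_E <= 0 or rng.random() < np.exp(-delta_E):
            I[i, j] = -I[i, j]
    return I

sample = metropolis_ising(beta=0.8, steps=5000, rng=0)
print(sample.shape)
```

Because the proposal is symmetric, the acceptance ratio reduces to p(J)/p(I) = exp{−(E(J) − E(I))}, so no partition function is needed.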
Modeling Images by MRF
Generalizing the Ising model:
p(I; λ) = (1/Z(λ)) exp{ Σ_k λ_k H_k(I) } q(I)
Exponential family model, log-linear model, maximum entropy model
λ_k: unknown parameters
H_k: features (may also need to be learned)
q(I): reference distribution
Adding hidden variables and layers leads to models such as the restricted Boltzmann machine (RBM).
Modeling Images by MRF
Given training images, how to estimate the parameters λ?
• Maximum likelihood
• Pseudo-likelihood (Besag 1975)
• Contrastive divergence (Hinton)
Maximum likelihood
Given an observed image I_obs, the log-likelihood gradient is
∂/∂λ_k log p(I_obs; λ) = H_k(I_obs) − E_λ[H_k(I)]
Challenge: prove it.
Stochastic Gradient
Given the observed image I_obs, generate synthesized images I_syn^(1), …, I_syn^(M) from the current model by MCMC, then update
λ_k ← λ_k + η [ H_k(I_obs) − (1/M) Σ_m H_k(I_syn^(m)) ]
Analysis by synthesis: adjust λ until the feature statistics of the synthesized images match those of the observed image.
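A toy sketch of this analysis-by-synthesis loop, for a one-parameter exponential family over a small finite state space (the state space, feature, and observed statistic are made up for illustration; with a finite space we can synthesize by exact sampling instead of MCMC):

```python
import numpy as np

def fit_exp_family(h_obs, states, feature, eta=0.1, iters=500, n_syn=200, rng=0):
    """Stochastic-gradient fit of p(s; lam) ~ exp(lam * H(s)).
    Update: lam += eta * (observed feature - average feature over
    synthesized samples), the Monte Carlo estimate of the
    log-likelihood gradient H(I_obs) - E_lam[H]."""
    rng = np.random.default_rng(rng)
    H = np.array([feature(s) for s in states], dtype=float)
    lam = 0.0
    for _ in range(iters):
        p = np.exp(lam * H)
        p /= p.sum()
        syn = rng.choice(len(H), size=n_syn, p=p)   # "synthesis" step
        lam += eta * (h_obs - H[syn].mean())        # "analysis" (gradient) step
    return lam

# Toy example: states 0..5, feature H(s) = s, observed feature value 3.5.
lam_hat = fit_exp_family(h_obs=3.5, states=range(6), feature=lambda s: s)
```

At convergence the model's expected feature E_λ[H] matches the observed value h_obs, which is the maximum-likelihood condition from the previous slide.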
Texture Modeling
Modeling image pixel labels as MRF (Ising)
P(x, y) = (1/Z) Π_{(i,j)} ψ(x_i, x_j) Π_i φ(x_i, y_i)
x: label image; y: real image
ψ(x_i, x_j): compatibility between neighboring labels
φ(x_i, y_i): compatibility between a label and the observed pixel
Slides by R. Huang – Rutgers University
MRF for Image Segmentation
Bayesian posterior: model the joint probability

P(x, y) = (1/Z) Π_{(i,j)} ψ(x_i, x_j) Π_i φ(x_i, y_i)

x: region labels; y: image pixels; θ: model parameters
ψ(x_i, x_j): label-label compatibility function over neighboring label nodes, enforcing a smoothness constraint
φ(x_i, y_i): image-label compatibility function at the local observations, enforcing the data constraint

(x*, θ*) = arg max_{(x, θ)} P(x, θ | y)
x* = arg max_x P(x | y, θ)
   = arg max_x P(x, y | θ) / P(y | θ)
   = arg max_x (1/Z) Π_{(i,j)} ψ(x_i, x_j) Π_i φ(x_i, y_i)

with
φ(x_i, y_i) = G(y_i; μ_{x_i}, σ_{x_i})   (Gaussian model of pixel values within region x_i)
ψ(x_i, x_j) = exp{ δ(x_i = x_j) / T }    (rewards equal neighboring labels)
θ = [μ, σ, T]
MRF for Image Segmentation
Inference in MRFs
– Classical
• Gibbs sampling, simulated annealing
• Iterated conditional modes (ICM)
– State of the art
• Graph cuts
• Belief propagation
• Linear programming
• Tree-reweighted message passing
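As a sketch of the simplest of these methods, iterated conditional modes, applied to the segmentation model above (the Gaussian means, σ, β, and the test image are hypothetical choices for illustration):

```python
import numpy as np

def icm_segment(y, mus, sigma=0.5, beta=1.0, sweeps=5):
    """Iterated conditional modes for MAP labeling under
    P(x | y) ~ prod_i G(y_i; mu_{x_i}, sigma) * prod_{i~j} exp(beta*[x_i == x_j]).
    Each pixel greedily takes the label that maximizes its local conditional;
    this converges to a local mode of the posterior."""
    h, w = y.shape
    # Initialize each pixel at the label with the nearest mean.
    x = np.argmin(np.abs(y[..., None] - np.array(mus)), axis=-1)
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                best, best_score = x[i, j], -np.inf
                for k in range(len(mus)):
                    score = -(y[i, j] - mus[k]) ** 2 / (2 * sigma ** 2)  # data term
                    for r, c in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                        if 0 <= r < h and 0 <= c < w:
                            score += beta * (x[r, c] == k)               # smoothness term
                    if score > best_score:
                        best, best_score = k, score
                x[i, j] = best
    return x

# Noisy two-region image: left half near 0, right half near 1.
rng = np.random.default_rng(0)
y = np.hstack([np.zeros((8, 4)), np.ones((8, 4))]) + 0.3 * rng.standard_normal((8, 8))
labels = icm_segment(y, mus=[0.0, 1.0], sigma=0.5, beta=1.0)
```

Unlike Gibbs sampling with simulated annealing, ICM is deterministic and fast but can get stuck in local modes; graph cuts or message passing give stronger guarantees for this kind of pairwise model.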
Summary
• MRF, Gibbs distribution
• Gibbs sampler, Metropolis algorithm
• Exponential family model