
Page 1

Introduction to recursive Bayesian filtering

Michael Rubinstein

IDC

Problem overview

• Input
– (Noisy) sensor measurements

• Goal
– Estimate the most probable state at time k using measurements up to time k':
k'<k: prediction
k'>k: smoothing
k'=k: filtering

• Many problems require estimating the state of a system that changes over time, using noisy measurements of the system

© Michael Rubinstein

Page 2

Applications

• Ballistics
• Robotics
– Robot localization
• Tracking hands/cars/…
• Econometrics
– Stock prediction
• Navigation
• Many more…

Challenges (why is it a hard problem?)

• Measurements
– Noise
– Errors

• Detection-specific
– Full/partial occlusions
– Deformable objects
– Entering/leaving the scene
– Lighting variations

• Efficiency
• Multiple models and switching dynamics
• Multiple targets
• …

Page 3

Talk overview

• Background
– Model setup
• Markovian stochastic processes
• The state-space model
• Dynamic systems
– The Bayesian approach
– Recursive filters
– Restrictive cases + pros and cons

• The Kalman filter
• The Grid-based filter

• Particle filters
– Monte Carlo integration
– Importance sampling

• Multiple target tracking – BraMBLe (ICCV 2001)

Stochastic Processes

• Deterministic process
– Only one possible ‘reality’

• Random process
– Several possible evolutions (starting point might be known)
– Characterized by probability distributions

• Time series modeling
– Sequence of random states/variables
– Measurements available at discrete times

Page 4

State space

• The state vector contains all available information to describe the investigated system
– Usually multidimensional: $X(k) \in \mathbb{R}^{N_x}$

• The measurement vector represents observations related to the state vector: $Z(k) \in \mathbb{R}^{N_z}$
– Generally (but not necessarily) of lower dimension than the state vector

State space

• Tracking, e.g.:
$X = [x, y, \theta]^T$, $N_x = 3$, or $X = [x, y, v_x, v_y]^T$, $N_x = 4$

• Econometrics, e.g.:
– Monetary flow
– Interest rates
– Inflation
– …

Page 5

(First‐order) Markov process 

• The Markov property – the likelihood of a future state depends on the present state only

• Markov chain – a stochastic process with the Markov property:

$\Pr[X(k+h) = y \mid X(s) = x(s), \forall s \le k] = \Pr[X(k+h) = y \mid X(k) = x(k)], \quad \forall h > 0$

(k-1, k, k+1: time; $x_{k-1}, x_k, x_{k+1}$: states)

Hidden Markov Model (HMM)

• The state is not directly visible, but an output dependent on the state is visible

($x_{k-1}, x_k, x_{k+1}$: states, hidden; $z_{k-1}, z_k, z_{k+1}$: measurements, observed)

Page 6

Dynamic system

State equation: $x_k = f_k(x_{k-1}, v_k)$
– $x_k$: state vector at time instant k
– $f_k: \mathbb{R}^{N_x} \times \mathbb{R}^{N_v} \to \mathbb{R}^{N_x}$: state transition function
– $v_k$: i.i.d. process noise
(stochastic diffusion)

Observation equation: $z_k = h_k(x_k, w_k)$
– $z_k$: observations at time instant k
– $h_k: \mathbb{R}^{N_x} \times \mathbb{R}^{N_w} \to \mathbb{R}^{N_z}$: observation function
– $w_k$: i.i.d. measurement noise

A simple dynamic system

• $X = [x, y, v_x, v_y]$ (4-dimensional state space)

• Constant-velocity motion:

$f(X, v) = [x + v_x \Delta t,\; y + v_y \Delta t,\; v_x,\; v_y] + v, \quad v \sim N(0, Q), \quad Q = \mathrm{diag}(0, 0, q^2, q^2)$

• Only position is observed:

$z = h(X, w) = [x, y] + w, \quad w \sim N(0, R), \quad R = \mathrm{diag}(r^2, r^2)$
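This model is easy to simulate; a minimal sketch in pure Python (the time step and the noise levels q, r below are illustrative choices, not values from the slides):

```python
import random

random.seed(0)
dt, q, r = 1.0, 0.05, 0.5   # time step and noise stds (illustrative values)

def step(state):
    """Constant-velocity transition x_k = f(x_{k-1}, v): noise enters the velocities."""
    x, y, vx, vy = state
    return [x + dt * vx, y + dt * vy,
            vx + random.gauss(0, q), vy + random.gauss(0, q)]

def observe(state):
    """Only position is observed: z = [x, y] + w, w ~ N(0, R)."""
    x, y, _, _ = state
    return [x + random.gauss(0, r), y + random.gauss(0, r)]

state = [0.0, 0.0, 1.0, 0.5]   # start at the origin with velocity (1, 0.5)
track = []
for _ in range(10):
    state = step(state)
    track.append(observe(state))
```

Running the loop yields ten noisy 2D position measurements drifting along the velocity direction, which is exactly the input a filter would receive.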

Page 7

The Bayesian approach

• Construct the posterior probability density function $p(x_k \mid z_{1:k})$ of the state, based on all available information (Thomas Bayes)

• By knowing the posterior, many kinds of estimates for $x_k$ can be derived
– mean (expectation), mode, median, …
– Can also give an estimate of the accuracy (e.g. covariance)

Recursive filters

• For many problems, an estimate is required each time a new measurement arrives

• Batch processing
– Requires all available data

• Sequential processing
– New data is processed upon arrival
– No need to store the complete dataset
– No need to reprocess all data for each new measurement
– Assume no out-of-sequence measurements (solutions for this exist as well…)

Page 8

Recursive Bayes filters

• Given:
– System models in probabilistic form:

$x_k = f_k(x_{k-1}, v_k) \;\leftrightarrow\; p(x_k \mid x_{k-1})$ (Markovian process)

$z_k = h_k(x_k, w_k) \;\leftrightarrow\; p(z_k \mid x_k)$ (measurements are conditionally independent given the state)

(known statistics of $v_k$, $w_k$)

– Initial state $p(x_0 \mid z_0) = p(x_0)$, also known as the prior

– Measurements $z_1, \ldots, z_k$

Recursive Bayes filters

• Prediction step (a priori): $p(x_{k-1} \mid z_{1:k-1}) \to p(x_k \mid z_{1:k-1})$
– Uses the system model to predict forward
– Deforms/translates/spreads the state pdf due to random noise

• Update step (a posteriori): $p(x_k \mid z_{1:k-1}) \to p(x_k \mid z_{1:k})$
– Updates the prediction in light of new data
– Tightens the state pdf

Page 9

General prediction‐update framework

• Assume $p(x_{k-1} \mid z_{1:k-1})$ is given at time k-1

• Prediction, using the Chapman-Kolmogorov identity + the Markov property:

$p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1}$   (1)

(system model × previous posterior)

General prediction‐update framework

• Update step:

$p(x_k \mid z_{1:k}) = p(x_k \mid z_k, z_{1:k-1}) = \dfrac{p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}$   (2)

(using $p(A \mid B, C) = \dfrac{p(B \mid A, C)\, p(A \mid C)}{p(B \mid C)}$; likelihood × prior, i.e. measurement model × current prior)

where the normalization constant (evidence) is

$p(z_k \mid z_{1:k-1}) = \int p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})\, dx_k$

Page 10

Generating estimates

• Knowledge of $p(x_k \mid z_{1:k})$ enables computing an optimal estimate with respect to any criterion, e.g.

– Minimum mean-square error (MMSE):

$\hat{x}_{k|k}^{MMSE} \equiv E[x_k \mid z_{1:k}] = \int x_k\, p(x_k \mid z_{1:k})\, dx_k$

– Maximum a posteriori (MAP):

$\hat{x}_{k|k}^{MAP} \equiv \arg\max_{x_k} p(x_k \mid z_{1:k})$

General prediction‐update framework

So (1) and (2) give the optimal solution for the recursive estimation problem!

• Unfortunately no… this is only a conceptual solution
– The integrals are intractable…
– The pdf can only be implemented in a finite representation!

• However, an optimal solution does exist for several restrictive cases

Page 11

Restrictive case #1

• The posterior at each time step is Gaussian
– Completely described by mean and covariance

• If $p(x_{k-1} \mid z_{1:k-1})$ is Gaussian, it can be shown that $p(x_k \mid z_{1:k})$ is also Gaussian, provided that:
– $v_k, w_k$ are Gaussian
– $f_k, h_k$ are linear

Restrictive case #1

• Why linear? A linear function of a Gaussian variable stays Gaussian:

$y = Ax + B \;\Rightarrow\; p(y) \sim N(A\mu + B,\; A\Sigma A^T)$ (Yacov Hel-Or)

Page 12

Restrictive case #1

• Why linear? A nonlinear function destroys Gaussianity:

$y = g(x) \;\Rightarrow\; p(y) \not\sim N$ (Yacov Hel-Or)

Restrictive case #1

• Linear system with additive noise:

$x_k = f_k(x_{k-1}, v_k) \;\to\; x_k = F_k x_{k-1} + v_k, \quad v_k \sim N(0, Q_k)$

$z_k = h_k(x_k, w_k) \;\to\; z_k = H_k x_k + w_k, \quad w_k \sim N(0, R_k)$

• The simple example again ($f(X, v) = [x + v_x \Delta t,\; y + v_y \Delta t,\; v_x,\; v_y] + v$, $z = h(X, w) = [x, y] + w$):

$\begin{pmatrix} x_k \\ y_k \\ v_{x,k} \\ v_{y,k} \end{pmatrix} = \underbrace{\begin{pmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}}_{F} \begin{pmatrix} x_{k-1} \\ y_{k-1} \\ v_{x,k-1} \\ v_{y,k-1} \end{pmatrix} + N(0, Q)$

$\begin{pmatrix} x_{obs} \\ y_{obs} \end{pmatrix} = \underbrace{\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}}_{H} \begin{pmatrix} x_k \\ y_k \\ v_{x,k} \\ v_{y,k} \end{pmatrix} + N(0, R)$

Page 13

The Kalman filter

(Rudolf E. Kalman)

$p(x_{k-1} \mid z_{1:k-1}) = N(x_{k-1}; \hat{x}_{k-1|k-1}, P_{k-1|k-1})$
$p(x_k \mid z_{1:k-1}) = N(x_k; \hat{x}_{k|k-1}, P_{k|k-1})$
$p(x_k \mid z_{1:k}) = N(x_k; \hat{x}_{k|k}, P_{k|k})$

where $N(x; \mu, \Sigma) = |2\pi\Sigma|^{-1/2} \exp\left(-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)$

• Substituting into (1) and (2) yields the predict and update equations

The Kalman filter

Predict:

$\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}$
$P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k$

Update:

$S_k = H_k P_{k|k-1} H_k^T + R_k$
$K_k = P_{k|k-1} H_k^T S_k^{-1}$
$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})$
$P_{k|k} = (I - K_k H_k) P_{k|k-1}$
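In the scalar case the recursions collapse to a few lines; a sketch (F = H = 1 and the noise variances below are illustrative, tracking a roughly constant state):

```python
def kalman_step(x, P, z, F=1.0, H=1.0, Q=0.01, R=0.25):
    """One predict/update cycle of the Kalman recursions, scalar case
    (matrices reduce to numbers; parameter values are illustrative)."""
    # Predict
    x_pred = F * x
    P_pred = F * P * F + Q
    # Update
    S = H * P_pred * H + R            # innovation covariance S_k
    K = P_pred * H / S                # Kalman gain K_k
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1 - K * H) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0                       # diffuse initial estimate
for z in [1.1, 0.9, 1.05, 1.0]:       # noisy measurements of a constant state
    x, P = kalman_step(x, P, z)
```

After four measurements the estimate has moved close to 1 and the posterior variance P has shrunk well below both the prior variance and R.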

Page 14

Intuition via 1D example

• Lost at sea
– Night
– No idea of location
– For simplicity – let’s assume 1D

* Example and plots by Maybeck, “Stochastic models, estimation and control, volume 1”

Example – cont’d

• Time t1: star sighting
– Denote x(t1) = z1

• Uncertainty (inaccuracies, human error, etc.)
– Denote σ1 (normal)

• Can establish the conditional probability of x(t1) given measurement z1

Page 15

Example – cont’d

• Probability for any location, based on the measurement
• For a Gaussian density – 68.3% lies within ±σ1
• Best estimate of position: mean/mode/median

Example – cont’d

• Time t2 ≅ t1: a friend (more trained)
– x(t2) = z2, σ(t2) = σ2
– Since she has higher skill: σ2 < σ1

Page 16

Example – cont’d

• f(x(t2)|z1,z2) also Gaussian


Example – cont’d

• The fused σ is less than both σ1 and σ2
• σ1 = σ2: average
• σ1 > σ2: more weight to z2
• Rewrite:

Page 17

Example – cont’d

• The Kalman update rule:

best estimate given z2 (a posteriori) = best prediction prior to z2 (a priori) + optimal weighting (Kalman gain) × residual

The Kalman filter

Predict:

$\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}$
$P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k$

Update:

$S_k = H_k P_{k|k-1} H_k^T + R_k$
$K_k = P_{k|k-1} H_k^T S_k^{-1}$
$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})$
$P_{k|k} = (I - K_k H_k) P_{k|k-1}$

In the 1D example:

$\mu = \dfrac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\, z_1 + \dfrac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\, z_2 = z_1 + \dfrac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\, (z_2 - z_1)$

Page 18

Kalman gain

$S_k = H_k P_{k|k-1} H_k^T + R_k, \quad K_k = P_{k|k-1} H_k^T S_k^{-1}$

$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1}), \quad P_{k|k} = (I - K_k H_k) P_{k|k-1}$

• Small measurement error:

$\lim_{R_k \to 0} K_k = H_k^{-1} \;\Rightarrow\; \lim_{R_k \to 0} \hat{x}_{k|k} = H_k^{-1} z_k$

• Small prediction error:

$\lim_{P_{k|k-1} \to 0} K_k = 0 \;\Rightarrow\; \lim_{P_{k|k-1} \to 0} \hat{x}_{k|k} = \hat{x}_{k|k-1}$

The Kalman filter

• Pros
– Optimal closed-form solution to the tracking problem (under the assumptions)
• No algorithm can do better in a linear-Gaussian environment!
– All ‘logical’ estimations collapse to a unique solution
– Simple to implement
– Fast to execute

• Cons
– If either the system or the measurement model is nonlinear, the posterior will be non-Gaussian

Page 19

Restrictive case #2

• The state space (domain) is discrete and finite

• Assume the state space at time k-1 consists of states $x_{k-1}^i$, $i = 1 \ldots N_s$

• Let $w_{k-1|k-1}^i = \Pr(x_{k-1} = x_{k-1}^i \mid z_{1:k-1})$ be the conditional probability of each state at time k-1, given measurements up to k-1

The Grid‐based filter

• The posterior pdf at k-1 can be expressed as a sum of delta functions:

$p(x_{k-1} \mid z_{1:k-1}) = \sum_{i=1}^{N_s} w_{k-1|k-1}^i\, \delta(x_{k-1} - x_{k-1}^i)$

• Again, substitution into (1) and (2) yields the predict and update equations

Page 20

The Grid-based filter

• Prediction (substituting into (1)):

$p(x_k \mid z_{1:k-1}) = \sum_{i=1}^{N_s} \sum_{j=1}^{N_s} w_{k-1|k-1}^j\, p(x_k^i \mid x_{k-1}^j)\, \delta(x_k - x_k^i) = \sum_{i=1}^{N_s} w_{k|k-1}^i\, \delta(x_k - x_k^i)$

where $w_{k|k-1}^i = \sum_{j=1}^{N_s} p(x_k^i \mid x_{k-1}^j)\, w_{k-1|k-1}^j$

• The new prior is also a weighted sum of delta functions

• The new prior weights are a reweighting of the old posterior weights using the state transition probabilities

The Grid-based filter

• Update (substituting into (2)):

$p(x_k \mid z_{1:k}) = \sum_{i=1}^{N_s} w_{k|k}^i\, \delta(x_k - x_k^i)$

where $w_{k|k}^i = \dfrac{w_{k|k-1}^i\, p(z_k \mid x_k^i)}{\sum_{j=1}^{N_s} w_{k|k-1}^j\, p(z_k \mid x_k^j)}$

• The posterior weights are a reweighting of the prior weights using the likelihoods (+ normalization)
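The two grid-filter reweighting steps can be sketched directly (a toy two-state example; the transition matrix and likelihood values are invented for illustration):

```python
def grid_predict(post, trans):
    """New prior weights: w_pred[i] = sum_j p(x_i | x_j) * post[j]."""
    n = len(post)
    return [sum(trans[i][j] * post[j] for j in range(n)) for i in range(n)]

def grid_update(pred, lik):
    """Posterior weights: reweight the prior by the likelihoods, then normalize."""
    w = [p * l for p, l in zip(pred, lik)]
    s = sum(w)
    return [x / s for x in w]

# Two-state toy: sticky transitions, observation strongly favoring state 1
post = [0.5, 0.5]
trans = [[0.9, 0.1], [0.1, 0.9]]       # trans[i][j] = p(x_k = i | x_{k-1} = j)
pred = grid_predict(post, trans)       # symmetric prior stays [0.5, 0.5]
new_post = grid_update(pred, [0.2, 0.8])
```

The updated weights are [0.2, 0.8]: the likelihood reweights the flat prior toward the better-matching state, and normalization keeps the weights summing to one.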

Page 21

The Grid‐based filter

• Pros:
– $p(x_k \mid x_{k-1})$ and $p(z_k \mid x_k)$ are assumed known, but there is no constraint on their (discrete) shapes
– Easy extension to a varying number of states
– Optimal solution for the discrete-finite environment!

• Cons:
– Curse of dimensionality
• Inefficient if the state space is large
– Statically considers all possible hypotheses

Suboptimal solutions

• In many cases these assumptions do not hold
– Practical environments are nonlinear, non-Gaussian, continuous

Approximations are necessary…

– Extended Kalman filter (EKF) (analytic approximation)
– Approximate grid-based methods (numerical methods)
– Multiple-model estimators
– Unscented Kalman filter (UKF)
– Particle filters (PF) (sampling approaches)
– Gaussian-sum filters
– …

Page 22

The extended Kalman filter

• The idea: a local linearization of the dynamic system might be a sufficient description of the nonlinearity

• The model: a nonlinear system with additive noise:

$x_k = f_k(x_{k-1}) + v_k, \quad v_k \sim N(0, Q_k)$
$z_k = h_k(x_k) + w_k, \quad w_k \sim N(0, R_k)$

(compare the linear case: $x_k = F_k x_{k-1} + v_k$, $z_k = H_k x_k + w_k$)

The extended Kalman filter

• f, h are approximated using a first-order Taylor series expansion (evaluated at the state estimates)

Predict:

$\hat{x}_{k|k-1} = f_k(\hat{x}_{k-1|k-1})$
$P_{k|k-1} = \hat{F}_k P_{k-1|k-1} \hat{F}_k^T + Q_k, \quad [\hat{F}_k]_{ij} = \left.\dfrac{\partial f_k[i]}{\partial x[j]}\right|_{\hat{x}_{k-1|k-1}}$

Update:

$\hat{S}_k = \hat{H}_k P_{k|k-1} \hat{H}_k^T + R_k$
$K_k = P_{k|k-1} \hat{H}_k^T \hat{S}_k^{-1}$
$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - h_k(\hat{x}_{k|k-1}))$
$P_{k|k} = (I - K_k \hat{H}_k) P_{k|k-1}, \quad [\hat{H}_k]_{ij} = \left.\dfrac{\partial h_k[i]}{\partial x[j]}\right|_{\hat{x}_{k|k-1}}$

Page 23

The extended Kalman filter

(illustration: Yacov Hel-Or)

The extended Kalman filter

• Pros
– Good approximation when the models are near-linear
– Efficient to calculate (the de facto method for navigation systems and GPS)

• Cons
– Only an approximation (optimality not proven)
– Still a single-Gaussian approximation
• Nonlinearity leads to non-Gaussianity (e.g. bimodal)
– If we have a multimodal hypothesis and choose incorrectly – can be difficult to recover
– Inapplicable when f, h are discontinuous

Page 24

Intermission

Questions?


Particle filtering

• Family of techniques
– Condensation algorithms (MacCormick & Blake, ‘99)
– Bootstrap filtering (Gordon et al., ‘93)
– Particle filtering (Carpenter et al., ‘99)
– Interacting particle approximations (Moral, ‘98)
– Survival of the fittest (Kanazawa et al., ‘95)
– Sequential Monte Carlo methods (SMC, SMCM)
– SIS, SIR, ASIR, RPF, …

• The statistics were introduced in the 1950s, and incorporated in vision in the last decade

Page 25

Particle filtering

• Many variations, one general concept:

Represent the posterior pdf by a set of randomly chosen weighted samples (particles)

• Randomly chosen = Monte Carlo (MC)
• As the number of samples becomes very large, the characterization becomes an equivalent representation of the true pdf

Particle filtering

• Compared to the previous methods
– Can represent any arbitrary distribution
– Multimodal support
– Keeps track of several hypotheses simultaneously
– An approximate representation of a complex model, rather than an exact representation of a simplified model

• The basic building block: importance sampling

Page 26

Monte Carlo integration

• Evaluate complex integrals using probabilistic techniques

• Assume we are trying to estimate a complicated integral of a function f over some domain D:

$F = \int_D f(\vec{x})\, d\vec{x}$

• Also assume there exists some PDF p defined over D

Monte Carlo integration

• Then

$F = \int_D f(\vec{x})\, d\vec{x} = \int_D \dfrac{f(\vec{x})}{p(\vec{x})}\, p(\vec{x})\, d\vec{x}$

• But

$\int_D \dfrac{f(\vec{x})}{p(\vec{x})}\, p(\vec{x})\, d\vec{x} = E\left[\dfrac{f(\vec{x})}{p(\vec{x})}\right], \quad \vec{x} \sim p$

• This is true for any PDF p over D!

Page 27

Monte Carlo integration

• Now, if we have i.i.d. random samples $\vec{x}_1, \ldots, \vec{x}_N$ sampled from p, we can approximate $E\left[\dfrac{f(\vec{x})}{p(\vec{x})}\right]$ by

$F_N = \dfrac{1}{N} \sum_{i=1}^N \dfrac{f(\vec{x}_i)}{p(\vec{x}_i)}$

• Guaranteed by the law of large numbers:

$F_N \xrightarrow{a.s.}\; E\left[\dfrac{f(\vec{x})}{p(\vec{x})}\right] = F \quad \text{as } N \to \infty$

Importance Sampling (IS)

• What about $p(\vec{x}) = 0$?

• If p is very small, $f/p$ can be arbitrarily large, ‘damaging’ the average
• Design p (the importance or proposal density) such that the importance weights $f/p$ are bounded
• Rule of thumb: take p as similar to f as possible

• The effect: get more samples in ‘important’ areas of f, i.e. where f is large
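A sketch of importance sampling with a proposal chosen to resemble f (the integrand and the Exp(1) proposal are illustrative choices, not from the slides):

```python
import math
import random

def is_estimate(f, p_pdf, p_sample, n):
    """MC estimate of ∫ f: average the importance weights f(x)/p(x), x ~ p."""
    return sum(f(x) / p_pdf(x) for x in (p_sample() for _ in range(n))) / n

# Estimate ∫_0^∞ x e^{-x} dx = 1 using an Exp(1) proposal,
# whose density e^{-x} has a shape similar to the integrand
random.seed(1)
est = is_estimate(lambda x: x * math.exp(-x),
                  lambda x: math.exp(-x),
                  lambda: random.expovariate(1.0),
                  50000)
```

Here the weight f(x)/p(x) reduces to x, which is bounded in probability under the proposal, so the average converges quickly to the true value 1.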

Page 28

Convergence of MC integration

• Chebyshev’s inequality (Pafnuty Lvovich Chebyshev): let X be a random variable with expected value μ and std σ. For any real number k>0,

$\Pr\{|X - \mu| \ge k\sigma\} \le \dfrac{1}{k^2}$

• For example, for $k = \sqrt{2}$, it shows that at least half the values lie in the interval $(\mu - \sqrt{2}\sigma,\; \mu + \sqrt{2}\sigma)$

• Let $y_i = \dfrac{f(\vec{x}_i)}{p(\vec{x}_i)}$; then the MC estimator is $F_N = \dfrac{1}{N} \sum_{i=1}^N y_i$

Convergence of MC integration

• By Chebyshev’s inequality (with $k = (1/\delta)^{1/2}$):

$\Pr\left\{|F_N - E[F_N]| \ge \left(\dfrac{V[F_N]}{\delta}\right)^{1/2}\right\} \le \delta$

$V[F_N] = V\left[\dfrac{1}{N} \sum_{i=1}^N y_i\right] = \dfrac{1}{N^2} \sum_{i=1}^N V[y_i] = \dfrac{1}{N}\, V[y]$

$\Pr\left\{|F_N - F| \ge \left(\dfrac{V[y]}{N\delta}\right)^{1/2}\right\} \le \delta$

• Hence, for a fixed threshold, the error decreases at rate $1/\sqrt{N}$

Page 29

Convergence of MC integration

• Meaning:
1. To cut the error in half, it is necessary to evaluate 4 times as many samples
2. The convergence rate is independent of the integrand dimension!

• In contrast, the convergence rate of grid-based approximations decreases as $N_x$ increases
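A runnable sketch of the estimator $F_N$ with a uniform proposal over [a, b], so $p(x) = 1/(b-a)$ (the integrand and domain are arbitrary examples):

```python
import random

def mc_integrate(f, a, b, n):
    """F_N with a uniform proposal: F ≈ (1/N) Σ f(x_i)/p(x_i) = (b-a)/N Σ f(x_i)."""
    total = sum(f(random.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

random.seed(0)
est = mc_integrate(lambda x: x * x, 0.0, 1.0, 100000)   # true value: 1/3
```

Quadrupling n roughly halves the typical error, matching the $1/\sqrt{N}$ rate above.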

IS for Bayesian estimation 

$E(f(x_{0:k})) = \int f(x_{0:k})\, p(x_{0:k} \mid z_{1:k})\, dx_{0:k} = \int f(x_{0:k})\, \dfrac{p(x_{0:k} \mid z_{1:k})}{q(x_{0:k} \mid z_{1:k})}\, q(x_{0:k} \mid z_{1:k})\, dx_{0:k}$

• We characterize the posterior pdf using a set of samples (particles) and their weights: $\{x_{0:k}^i, w_k^i\}_{i=1}^N$

• Then the joint posterior density at time k is approximated by

$p(x_{0:k} \mid z_{1:k}) \approx \sum_{i=1}^N w_k^i\, \delta(x_{0:k} - x_{0:k}^i)$

Page 30

IS for Bayesian estimation

• We draw the samples from the importance density $q(x_{0:k} \mid z_{1:k})$ with importance weights

$w_k^i \propto \dfrac{p(x_{0:k}^i \mid z_{1:k})}{q(x_{0:k}^i \mid z_{1:k})}$

• Sequential update (after some calculation…):

Particle update: $x_k^i \sim q(x_k \mid x_{k-1}^i, z_k)$

Weight update: $w_k^i = w_{k-1}^i\, \dfrac{p(z_k \mid x_k^i)\, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{k-1}^i, z_k)}$

Sequential Importance Sampling (SIS)

$[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{SIS}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$

• FOR i = 1:N
– Draw $x_k^i \sim q(x_k \mid x_{k-1}^i, z_k)$
– Update weights $w_k^i = w_{k-1}^i\, \dfrac{p(z_k \mid x_k^i)\, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{k-1}^i, z_k)}$
• END
• Normalize weights

Page 31

State estimates

• Any function $f(x_k)$ can be calculated from the discrete pdf approximation:

$E[f(x_k)] \approx \sum_{i=1}^N w_k^i\, f(x_k^i)$

• Examples:
– Mean (simple average)
– MAP estimate: the particle with the largest weight
– Robust mean: the mean within a window around the MAP estimate
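The weighted-mean and MAP estimates from a particle set, as a sketch (the three-particle example is invented for illustration):

```python
def mmse_estimate(particles, weights):
    """Weighted mean E[x] ≈ Σ w_i x_i (normalized weights assumed)."""
    return sum(w * x for w, x in zip(weights, particles))

def map_estimate(particles, weights):
    """The particle carrying the largest weight."""
    return max(zip(weights, particles))[1]

particles = [0.0, 1.0, 2.0]
weights = [0.2, 0.5, 0.3]
mean_est = mmse_estimate(particles, weights)   # 0.2*0 + 0.5*1 + 0.3*2 = 1.1
map_est = map_estimate(particles, weights)     # particle with weight 0.5
```

The two estimates can disagree for multimodal weight distributions, which is exactly when the robust mean (a mean restricted to a window around the MAP particle) is useful.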

Choice of importance density

(illustration: Hsiao et al.)

Page 32

Choice of importance density

• Most common (suboptimal) choice: the transitional prior

$q(x_k \mid x_{k-1}^i, z_k) = p(x_k \mid x_{k-1}^i) \;\Rightarrow\; w_k^i = w_{k-1}^i\, \dfrac{p(z_k \mid x_k^i)\, p(x_k^i \mid x_{k-1}^i)}{p(x_k^i \mid x_{k-1}^i)} = w_{k-1}^i\, p(z_k \mid x_k^i)$

Compare the grid filter weight update: $w_{k|k}^i = \dfrac{w_{k|k-1}^i\, p(z_k \mid x_k^i)}{\sum_{j=1}^{N_s} w_{k|k-1}^j\, p(z_k \mid x_k^j)}$

The degeneracy phenomenon

• An unavoidable problem with SIS: after a few iterations most particles have negligible weights
– Large computational effort for updating particles with a very small contribution to $p(x_k \mid z_{1:k})$

• A measure of degeneracy – the effective sample size:

$N_{eff} = \dfrac{1}{\sum_{i=1}^N (w_k^i)^2}$

– Uniform weights: $N_{eff} = N$; severe degeneracy: $N_{eff} = 1$
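$N_{eff}$ is a one-liner; a sketch covering the two extreme cases from the slide:

```python
def effective_sample_size(weights):
    """N_eff = 1 / Σ (w_i)^2 for normalized weights."""
    return 1.0 / sum(w * w for w in weights)

n_uniform = effective_sample_size([0.25] * 4)                 # uniform: N_eff = N
n_degenerate = effective_sample_size([1.0, 0.0, 0.0, 0.0])    # severe: N_eff = 1
```

In a generic particle filter this quantity is compared against a threshold to decide when to resample.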

Page 33

Resampling

• The idea: when degeneracy is above some threshold, eliminate particles with low importance weights and multiply particles with high importance weights:

$\{x_k^i, w_k^i\}_{i=1}^N \;\to\; \{x_k^{i*}, \tfrac{1}{N}\}_{i=1}^N$

• The new set is generated by sampling with replacement from the discrete representation of $p(x_k \mid z_{1:k})$, such that $\Pr\{x_k^{i*} = x_k^j\} = w_k^j$

Resampling

$[\{x_k^{i*}, w_k^i\}_{i=1}^N] = \mathrm{RESAMPLE}[\{x_k^i, w_k^i\}_{i=1}^N]$

• Generate N i.i.d. variables $u_i \sim U[0, 1]$
• Sort them in ascending order
• Compare them with the cumulative sum of the normalized weights

(illustration: Ristic et al.)
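The cumulative-sum comparison above amounts to inverse-CDF sampling; a minimal sketch (multinomial resampling using Python's bisect, O(N log N) per draw set):

```python
import bisect
import random

def resample(particles, weights):
    """Sample N new particles with replacement, Pr{pick i} = w_i,
    by inverting the cumulative sum of the weights."""
    cdf = []
    total = 0.0
    for w in weights:
        total += w
        cdf.append(total)
    return [particles[bisect.bisect_left(cdf, random.uniform(0.0, total))]
            for _ in particles]

random.seed(0)
new = resample(['a', 'b', 'c', 'd'], [0.7, 0.1, 0.1, 0.1])   # 'a' tends to dominate
```

After resampling, all particles carry weight 1/N; the O(N) variants mentioned on the next slide (e.g. systematic resampling) replace the independent uniforms with a single stratified sweep.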

Page 34

Resampling

• Complexity: O(N log N)
– O(N) sampling algorithms exist

(illustration: Hsiao et al.)

Generic PF

$[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{PF}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$

• Apply SIS filtering: $[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{SIS}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$
• Calculate $N_{eff}$
• IF $N_{eff} < N_{thr}$
– $[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{RESAMPLE}[\{x_k^i, w_k^i\}_{i=1}^N]$
• END

Page 35

Generic PF

• $\{x_k^i, 1/N\}$: a uniformly weighted measure, approximates $p(x_k \mid z_{1:k-1})$
• Compute for each particle its importance weight, to approximate $p(x_k \mid z_{1:k})$: $\{x_k^i, w_k^i\}$
• (Resample if needed): $\{x_k^{i*}, 1/N\}$
• Project ahead to approximate $p(x_{k+1} \mid z_{1:k})$ with $\{x_{k+1}^i, 1/N\}$, then $p(x_{k+1} \mid z_{1:k+1})$ with $\{x_{k+1}^i, w_{k+1}^i\}$

(illustration: Van der Merwe et al.)

PF variants

• Sampling Importance Resampling (SIR)

• Auxiliary Sampling Importance Resampling (ASIR)

• Regularized Particle Filter (RPF)

• Local‐linearization particle filters

• Multiple models particle filters (maneuvering targets)

• …


Page 36

Sampling Importance Resampling (SIR)

• A.K.A. bootstrap filter, Condensation

• Initialize $\{x_0^i, w_0^i\}_{i=1}^N$ from the prior distribution $X_0$
• For k > 0 do
– Resample $\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N$ into $\{x_{k-1}^{i*}, \tfrac{1}{N}\}_{i=1}^N$
– Predict $x_k^i \sim p(x_k \mid x_{k-1} = x_{k-1}^{i*})$
– Reweight $w_k^i = p(z_k \mid x_k = x_k^i)$
– Normalize weights
– Estimate $\hat{x}_k$ (for display)
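The SIR loop above can be sketched end to end for a toy 1D model (the random-walk dynamics and all noise levels here are illustrative assumptions, not from the slides; random.choices performs the weighted resampling):

```python
import math
import random

def sir_filter(observations, n=500, q=0.3, r=0.5):
    """Bootstrap/SIR sketch for a 1D random-walk state with Gaussian noise."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]   # sample the prior
    estimates = []
    for z in observations:
        # Resample + predict: propagate particles through the dynamics prior
        particles = [x + random.gauss(0.0, q) for x in particles]
        # Reweight by the observation likelihood p(z | x) ∝ exp(-(z-x)^2 / 2r^2)
        weights = [math.exp(-0.5 * ((z - x) / r) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate (for display): the weighted mean
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Resample with replacement proportionally to the weights
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

random.seed(0)
est = sir_filter([0.5, 0.6, 0.7, 0.8])
```

With the transitional prior as proposal, the weight update reduces to the likelihood alone, which is why this sketch never needs to evaluate q explicitly.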

Red pill or blue pill?

1. We had enough – show us some videos!

2. A 15-minute walk through a multiple-target-tracking system

Page 37

Multiple Targets (MTT/MOT)

• Previous challenges
– Full/partial occlusions
– Entering/leaving the scene
– …

• And in addition
– Estimating the number of objects
– Remaining computationally tractable for multiple simultaneous targets
– Interaction between objects
– Many works use multiple single-target filters

BraMBLe: A Bayesian Multiple‐Blob Tracker

M. Isard and J. MacCormick
Compaq Systems Research Center
ICCV 2001

Some slides taken from Qi Zhao; some images taken from Isard and MacCormick

Page 38

BraMBLe

• First rigorous particle filter implementation with a variable number of targets
• The posterior distribution is constructed over possible object configurations and number
• Sensor: single static camera
• Tracking: SIR particle filter
• Performance: real-time for 1-2 simultaneous objects

The BraMBLe posterior

$p(X_k \mid Z_{1:k})$

– $X_k$: state at frame k (number, positions, shapes, velocities, …)
– $Z_{1:k}$: image sequence

Page 39

State space

• Hypothesis configuration:

$X_k = (m_k, x_k^1, x_k^2, \ldots, x_k^{m_k})$

• Object configuration:

$x_k^i = (\phi_k^i, \mathbf{X}_k^i, \mathbf{V}_k^i, \mathbf{S}_k^i)$

– identifier $\phi$
– position $\mathbf{X} = (x, z)$
– velocity $\mathbf{V} = (v_x, v_z)$
– shape $\mathbf{S} = (\theta, w_f, w_s, w_h, \alpha_s, \alpha_h, w, h)$

$N_x = 1 + 13\, M_{max}$

Object model

• A person is modeled as a generalized cylinder with a vertical axis in world coords (calibrated camera)

• The radius profile $r_i(y)$ of each cylinder is piecewise-defined by (height, width) control points built from the shape parameters $(w, w_f, w_s, w_h)$ and $(\theta, \alpha_s, \alpha_h)$

Page 40

Observation likelihood

$p(Z_t \mid X_t)$

• The image is overlaid with a rectangular grid (e.g. every 5 pixels)

• At each grid point, the Y, Cr and Cb channels are filtered with a Gaussian and a Mexican hat (second derivative of a Gaussian)

Observation likelihood p(Z_t | X_t)

• The response values are assumed conditionally independent given X:

p(Z | X) = ∏_g p(z_g | X) = ∏_g p(z_g | l_g)

where the label l_g ∈ {0, 1, 2, 3, …} counts the objects covering grid point g (l_g = 0: background)
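The factored likelihood above is conveniently evaluated as a sum of per-grid-point log densities. A minimal sketch, where the per-label response model is a placeholder scalar Gaussian rather than the paper's learned filter-bank model:

```python
# Factored likelihood p(Z|X) = prod_g p(z_g | l_g), computed in log space.
import math

def log_likelihood(responses, labels, means, var=1.0):
    """responses[g]: filter response at grid point g (scalar here);
    labels[g]: number of objects covering grid point g (0 = background)."""
    total = 0.0
    for z, l in zip(responses, labels):
        mu = means[min(l, len(means) - 1)]   # one mean per occupancy label
        total += -0.5 * math.log(2 * math.pi * var) - (z - mu) ** 2 / (2 * var)
    return total

# toy example: 4 grid points, background mean 0.0, foreground mean 2.0
ll = log_likelihood([0.1, 2.2, 1.9, -0.2], [0, 1, 1, 0], means=[0.0, 2.0])
```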


Appearance models

• GMMs for background and foreground are trained using k‐means:

p(z_g | l_g = 0) = (1/K) Σ_k N(z_g; μ_gk^B, Σ_gk^B + Δ) + τ

p(z_g | l_g ≠ 0) = (1/K) Σ_k N(z_g; μ_k^F, Σ_k^F) + τ

(the slide shows mixtures with K = 4 and K = 16 components)
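The mixture-plus-floor form above can be sketched directly; the component means, variances and floor value τ below are illustrative, not trained values:

```python
# K-component Gaussian mixture with a uniform floor tau:
# p(z|l) = (1/K) * sum_k N(z; mu_k, var_k) + tau
import math

def gmm_density(z, mus, variances, tau=1e-4):
    K = len(mus)
    s = sum(math.exp(-(z - mu) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
            for mu, v in zip(mus, variances))
    return s / K + tau

p_bg = gmm_density(0.1, mus=[0.0, 0.5], variances=[1.0, 1.0])  # background-like
p_fg = gmm_density(0.1, mus=[3.0, 4.0], variances=[1.0, 1.0])  # foreground-like
```

The floor τ keeps the density bounded away from zero, so a single outlier response cannot drive a particle's weight to zero.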

Observation likelihood

• The figure visualizes the log‐likelihood ratio log[ p(z_g | l_g ≠ 0) / p(z_g | l_g = 0) ] at each grid point


System (prediction) model p(X_t | X_{t-1})

• The number of objects can change:
– Each object has a constant probability to remain in the scene
– At each time step, there is a constant probability that a new object will enter the scene

X_{t-1} = (m_{t-1}, x̃_{t-1}^1, …) → X'_t = (m'_t, x̃'_t^1, …)

• Surviving objects evolve through a prediction function; new objects are drawn from an initialization function
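The birth/death step above can be sketched as follows; the survival and entry probabilities are placeholders, not the paper's values:

```python
# Evolve the object count: each object survives with probability p_survive,
# and with probability p_birth a new object enters the scene.
import random

def evolve_count(objects, p_survive=0.95, p_birth=0.05, rng=random):
    survivors = [o for o in objects if rng.random() < p_survive]
    if rng.random() < p_birth:
        survivors.append("new")      # placeholder for an initialization draw
    return survivors

out = evolve_count(["a", "b", "c"], p_survive=1.0, p_birth=0.0)
```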

Prediction function

• Motion evolution: damped constant velocity, e.g. X_t = X_{t-1} + 0.8 V_{t-1}

• Shape evolution: 1st order auto‐regressive process model (ARP)
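A sketch of the damped-constant-velocity prediction, using the 0.8 damping factor from the slide; damping the velocity itself and the optional Gaussian process noise are assumptions for illustration:

```python
# Damped constant-velocity prediction: X_t = X_{t-1} + 0.8 * V_{t-1} (+ noise).
import random

def predict(pos, vel, damping=0.8, noise_std=0.0, rng=random):
    new_pos = tuple(p + damping * v + (rng.gauss(0.0, noise_std) if noise_std else 0.0)
                    for p, v in zip(pos, vel))
    new_vel = tuple(damping * v for v in vel)   # assumption: velocity damped too
    return new_pos, new_vel

pos, vel = predict((0.0, 0.0), (1.0, -1.0))
```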


Particles

N points: X_t^1, X_t^2, …, X_t^{N-1}, X_t^N
N weights: π_t^1, π_t^2, …, π_t^{N-1}, π_t^N

Estimate X̂_t

• Denote by M_t = {Φ_1, …, Φ_M} the set of existing unique identifiers

• For each identifier Φ_i, sum the weights over all (particle, target) pairs containing Φ_i: the total probability that object Φ_i exists
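The existence computation above can be sketched directly, assuming each particle carries a weight and the set of identifiers present in its configuration (this data layout is illustrative, not the paper's):

```python
# Total probability that object phi exists = sum of weights of particles
# whose hypothesis contains phi.
def existence_probability(particles, phi):
    return sum(w for w, idents in particles if phi in idents)

particles = [(0.5, {0, 1}), (0.3, {0}), (0.2, {2})]
p0 = existence_probability(particles, 0)   # weight mass containing object 0
```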


Results

• N = 1000 particles
• Initialization samples are always generated

Results

• A single foreground model cannot distinguish between overlapping objects, which causes ID switches


Parameters


Summary

• Particle filters were shown to produce good approximations under relatively weak assumptions:
– Can deal with nonlinearities
– Can deal with non‐Gaussian noise
– Multiple hypotheses
– Can be implemented in O(N)
– Relatively “simple”
– Adaptive focus on more probable regions of the state‐space


In practice

1. State (object) model
2. System (evolution) model
3. Measurement (likelihood) model
4. Initial (prior) state
5. State estimate (given the pdf)
6. PF specifics
   1. Proposal density
   2. Resampling method

• Configurations for specific problems can be found in the literature

Isard & Blake, CONDENSATION – conditional density propagation for visual tracking, IJCV 98

“girl dancing vigorously to a Scottish reel” – 100 particles
“bush blowing in the wind” – 1200 particles


Okuma et al., Boosted Particle Filter, ECCV 2004

• Goal: track hockey players

• Idea: AdaBoost + PF


Bibby & Reid, Tracking using Pixel‐Wise Posteriors, ECCV 2008


Thank you!


References

• Beyond the Kalman Filter / Ristic, Arulampalam, Gordon
– Online tutorial: A Tutorial on Particle Filters for Online Nonlinear/Non‐Gaussian Bayesian Tracking / Arulampalam et al., 2002

• Stochastic Models, Estimation and Control / Peter S. Maybeck

• An Introduction to the Kalman Filter / Greg Welch, Gary Bishop

• Particle Filters: An Overview / Matthias Muhlich


Sequential derivation 1

• Suppose at time k-1, {x_{0:k-1}^i, w_{k-1}^i}_{i=1}^N characterize p(x_{0:k-1} | z_{1:k-1})

• We receive a new measurement z_k and need to approximate p(x_{0:k} | z_{1:k}) using a new set of samples

• We choose q such that

q(x_{0:k} | z_{1:k}) = q(x_k | x_{0:k-1}, z_{1:k}) q(x_{0:k-1} | z_{1:k-1})

and we can generate new particles

x_k^i ~ q(x_k | x_{0:k-1}^i, z_{1:k})

Sequential derivation 2

• For the weight update equation, it can be shown that

p(x_{0:k} | z_{1:k}) = [ p(z_k | x_k) p(x_k | x_{k-1}) / p(z_k | z_{1:k-1}) ] p(x_{0:k-1} | z_{1:k-1})
∝ p(z_k | x_k) p(x_k | x_{k-1}) p(x_{0:k-1} | z_{1:k-1})

• And so

w_k^i ∝ p(x_{0:k}^i | z_{1:k}) / q(x_{0:k}^i | z_{1:k})
= [ p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) p(x_{0:k-1}^i | z_{1:k-1}) ] / [ q(x_k^i | x_{0:k-1}^i, z_{1:k}) q(x_{0:k-1}^i | z_{1:k-1}) ]
= w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{0:k-1}^i, z_{1:k})


Sequential derivation 3

• Further, if q(x_k | x_{0:k-1}, z_{1:k}) = q(x_k | x_{k-1}, z_k)

• Then the weight update rule becomes

w_k^i = w_{k-1}^i p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k)    (3)

(and we need not store the entire particle paths and the full history of observations)

• Finally, the (filtered) posterior density is approximated by

p(x_k | z_{1:k}) ≈ Σ_{i=1}^N w_k^i δ(x_k - x_k^i)
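The weight update and the delta-mixture approximation above translate directly into code; a small sketch with illustrative variable names:

```python
# w_k^i = w_{k-1}^i * p(z_k|x_k^i) * p(x_k^i|x_{k-1}^i) / q(x_k^i|x_{k-1}^i, z_k)
def weight_update(w_prev, lik, trans, prop):
    return w_prev * lik * trans / prop

# Under the delta-mixture approximation, any posterior expectation becomes a
# weighted sum over particles; e.g. the posterior mean:
def posterior_mean(xs, ws):
    s = sum(ws)
    return sum(x * w for x, w in zip(xs, ws)) / s

m = posterior_mean([0.0, 1.0, 2.0], [0.2, 0.6, 0.2])
```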

Choice of importance density

• Choose q to minimize the variance of the weights

• Optimal choice:

q_opt(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i, z_k)  ⇒  w_k^i ∝ w_{k-1}^i p(z_k | x_{k-1}^i)

– Usually we cannot sample from q_opt or solve for w_k^i (in some specific cases it works)

• Most commonly used (suboptimal) alternative:

q(x_k | x_{k-1}^i, z_k) = p(x_k | x_{k-1}^i)  ⇒  w_k^i ∝ w_{k-1}^i p(z_k | x_k^i)

– i.e. the transitional prior
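With the transitional prior as proposal, the update simplifies to w_k ∝ w_{k-1} p(z_k | x_k): the bootstrap (SIR) filter. A minimal sketch on a hypothetical 1-D random-walk model with Gaussian measurement noise; all model parameters and noise scales are illustrative, not from the slides:

```python
# Bootstrap particle filter step: propagate through p(x_k|x_{k-1}),
# then reweight by the likelihood p(z_k|x_k) and normalize.
import math
import random

def bootstrap_step(particles, weights, z, q_std=1.0, r_std=1.0, rng=random):
    # 1. sample from the transitional prior (random walk with std q_std)
    particles = [x + rng.gauss(0.0, q_std) for x in particles]
    # 2. reweight by the Gaussian measurement likelihood
    weights = [w * math.exp(-(z - x) ** 2 / (2 * r_std ** 2))
               for x, w in zip(particles, weights)]
    s = sum(weights)
    weights = [w / s for w in weights]
    return particles, weights

random.seed(1)
N = 500
xs = [random.gauss(0.0, 5.0) for _ in range(N)]   # diffuse prior
ws = [1.0 / N] * N
for z in [1.0, 1.2, 0.9, 1.1]:                    # measurements near 1.0
    xs, ws = bootstrap_step(xs, ws, z)
est = sum(x * w for x, w in zip(xs, ws))          # weighted posterior mean
```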


Generic PF

• Resampling reduces degeneracy, but new problems arise…

1. Limits parallelization
2. Sample impoverishment: particles with high weights are selected many times, which leads to loss of diversity
– If the process noise is small, all particles tend to collapse to a single point within a few iterations
– Methods exist to counter this as well…
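One common compromise between degeneracy and impoverishment is to resample only when the effective sample size drops below a threshold (e.g. N/2), using a low-variance scheme such as systematic resampling. A minimal sketch; the threshold and toy weights are illustrative:

```python
# Effective sample size and systematic resampling.
import random

def effective_sample_size(weights):
    # N_eff = 1 / sum(w_i^2); small N_eff signals weight degeneracy
    return 1.0 / sum(w * w for w in weights)

def systematic_resample(particles, weights, rng=random):
    # One uniform offset, then evenly spaced pointers through the CDF:
    # lower-variance than multinomial resampling.
    N = len(particles)
    u0 = rng.random() / N
    cumsum, c = [], 0.0
    for w in weights:
        c += w
        cumsum.append(c)
    out, j = [], 0
    for i in range(N):
        u = u0 + i / N
        while j < N - 1 and cumsum[j] < u:
            j += 1
        out.append(particles[j])
    return out, [1.0 / N] * N          # uniform weights after resampling

ess = effective_sample_size([0.7, 0.1, 0.1, 0.1])
random.seed(0)
res, new_w = systematic_resample([10, 20, 30, 40], [0.7, 0.1, 0.1, 0.1])
```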
