
Multilevel sequential Monte Carlo samplers and other MLMC methods

Kody Law & A. Jasra (NUS) & others

Bayes Comp IMS 2018, Singapore

September 3, 2018


Outline

1 Introduction

2 Multilevel Monte Carlo sampling

3 Bayesian inference problem

4 Sequential Monte Carlo samplers

5 Multilevel Sequential Monte Carlo (MLSMC) samplers

6 Other MLMC algorithms for inference
   Importance sampling (IS) strategy
   Coupled algorithm (CA) strategy
   Approximate coupling (AC) strategy

7 Summary


Inverse Problems

$$y = G(u) + e$$

where $y \in \mathbb{R}^M$ is the data, $u \in E$ is the parameter, $G : E \to \mathbb{R}^M$ is the forward model (a PDE), and $e$ represents observation/model errors.

The data $y$ may be limited in number, noisy, and indirect. The parameter $u$ is often a function, which is discretized and so needs to be approximated.


Bayesian inversion

Goal of Bayesian inference: given observed data $y$, find the posterior distribution

$$\mathbb{P}(du \mid y) = \frac{\mathbb{P}(y \mid u)\,\mathbb{P}_0(du)}{\int_E \mathbb{P}(y \mid u)\,\mathbb{P}_0(du)},$$

and estimate quantities of interest, such as, for $g : E \to \mathbb{R}$:

the expected value, $\int_E g(u)\,\mathbb{P}(du \mid y)$;

the variance, $\int_E g(u)^2\,\mathbb{P}(du \mid y) - \big(\int_E g(u)\,\mathbb{P}(du \mid y)\big)^2$;

the probability of exceeding some value $R$, $\int_{\{u \in E :\, g(u) > R\}} \mathbb{P}(du \mid y)$.

There is a connection with classical inverse problems: identify the most probable value, $\sup_{u \in E} \lim_{\delta \to 0} \mathbb{P}(B_\delta(u) \mid y)/\mathbb{P}(B_\delta(v) \mid y)$.


Orientation

Aim: approximate expectations with respect to a probability distribution $\eta_\infty$, which needs to be approximated by some $\eta_L$, and which can only be evaluated up to a normalizing constant.

Solution: the multilevel Monte Carlo (MLMC) method is utilised with sequential Monte Carlo (SMC) samplers, yielding the MLSMC sampler for Bayesian inference problems.

MLMC methods reduce the cost to achieve error $= O(\varepsilon)$, and can be used in the case that $\eta_L$ can be sampled from directly [H00, G08]. Here it is assumed that $\eta_L$ cannot be sampled from directly, but can be evaluated up to a normalizing constant (e.g. Bayesian inference problems) [HSS13, DKST15, HTL16, BJLTZ17]. SMC samplers are a general class of algorithms which are effective for sampling from such distributions [N01, C02, DDJ06].


Example: expectation for SDE [G08]

Estimation of the expectation of the solution of an intractable stochastic differential equation (SDE):

$$dX = f(X)\,dt + \sigma(X)\,dW, \qquad X_0 = x_0.$$

Aim: estimate $\mathbb{E}(g(X_T))$. We need to:

(1) Approximate, e.g. by the Euler-Maruyama method with resolution $h$:

$$X_{n+1} = X_n + h\,f(X_n) + \sqrt{h}\,\sigma(X_n)\,\xi_n, \qquad \xi_n \sim N(0,1).$$

(2) Sample $\{X_{N_T}^{(i)}\}_{i=1}^N$, $N_T = T/h$.
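A minimal NumPy sketch of steps (1)-(2), estimating $\mathbb{E}[g(X_T)]$ by plain Monte Carlo. The drift $f$, diffusion $\sigma$, payoff $g$, and all numerical parameters here are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def euler_maruyama(x0, f, sigma, T, h, rng):
    """Simulate one path of dX = f(X)dt + sigma(X)dW on [0, T] with step h."""
    x = x0
    for _ in range(int(T / h)):
        xi = rng.standard_normal()            # xi_n ~ N(0, 1)
        x = x + h * f(x) + np.sqrt(h) * sigma(x) * xi
    return x

# plain Monte Carlo estimate of E[g(X_T)]
rng = np.random.default_rng(0)
f = lambda x: -x                              # illustrative drift
sigma = lambda x: 0.5                         # illustrative diffusion
g = lambda x: x ** 2                          # illustrative quantity of interest
N, T, h = 10_000, 1.0, 2.0 ** -6
est = np.mean([g(euler_maruyama(1.0, f, sigma, T, h, rng)) for _ in range(N)])
print("E[g(X_T)] ~", est)
```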


Single level Monte Carlo

Aim: approximate $\eta_\infty(g) := \mathbb{E}_{\eta_\infty}(g)$ for $g : E \to \mathbb{R}$.

Monte Carlo approach: discretize the space $\Rightarrow$ approximate distribution $\eta_L$. Sample $U_L^{(i)} \sim \eta_L$ i.i.d., and approximate

$$\eta_L(g) := \mathbb{E}_{\eta_L}(g) \approx Y_L^{N_L} := \frac{1}{N_L}\sum_{i=1}^{N_L} g(U_L^{(i)}).$$

The mean square error (MSE) $\mathbb{E}\big[(Y_L^{N_L} - \mathbb{E}_{\eta_\infty}[g(U)])^2\big]$ splits into

$$\underbrace{\mathbb{E}\big[(Y_L^{N_L} - \mathbb{E}_{\eta_L}[g(U)])^2\big]}_{\text{variance}\,=\,O(N_L^{-1})} + \underbrace{\big(\mathbb{E}_{\eta_L}[g(U)] - \mathbb{E}_{\eta_\infty}[g(U)]\big)^2}_{\text{bias}^2}.$$

The cost to achieve MSE $= O(\varepsilon^2)$ is $\mathrm{Cost}(U_L^{(i)}) \times \varepsilon^{-2}$.


Multilevel Monte Carlo I

Introduce a hierarchy of discretization levels $\{\eta_l\}_{l=0}^L$ and define $Y_l = \mathbb{E}_{\eta_l}[g(U)] - \mathbb{E}_{\eta_{l-1}}[g(U)]$, with $\eta_{-1} := 0$. Observe the telescoping sum

$$\mathbb{E}_{\eta_L}[g(U)] = \sum_{l=0}^{L} Y_l.$$

Each term can be unbiasedly approximated by

$$Y_l^{N_l} = \frac{1}{N_l}\sum_{i=1}^{N_l} \big[g(U_l^{(i)}) - g(U_{l-1}^{(i)})\big],$$

where $g(U_{-1}^{(i)}) := 0$.


Multilevel Monte Carlo II

Multilevel Monte Carlo approach: sample i.i.d. $(U_l, U_{l-1})^{(i)} \sim \check\eta_l$, where the coupling $\check\eta_l$ has the correct marginals ($\int \check\eta_l\, du_{l-1} = \eta_l$ and $\int \check\eta_l\, du_l = \eta_{l-1}$), and approximate

$$\eta_L(g) \approx Y_{L,\mathrm{Multi}} := \sum_{l=0}^{L} Y_l^{N_l}.$$

The mean square error (MSE) is given by

$$\mathbb{E}\big[(Y_{L,\mathrm{Multi}} - \mathbb{E}_{\eta_\infty}[g(U)])^2\big] = \underbrace{\mathbb{E}\big[(Y_{L,\mathrm{Multi}} - \mathbb{E}_{\eta_L}[g(U)])^2\big]}_{\text{variance}\,=\,\sum_{l=0}^{L} V_l/N_l} + \underbrace{\big(\mathbb{E}_{\eta_L}[g(U)] - \mathbb{E}_{\eta_\infty}[g(U)]\big)^2}_{\text{bias}^2}.$$

Fix the bias by choosing $L$. Minimize the cost $C = \sum_{l=0}^{L} C_l N_l$ as a function of $\{N_l\}_{l=0}^L$ for fixed variance $\Rightarrow N_l \propto \sqrt{V_l/C_l}$.
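A minimal sketch of the telescoping estimator $Y_{L,\mathrm{Multi}}$, written against an abstract coupled sampler; the toy `sample_pair` below (a level-$l$ approximation of a Gaussian mean, with shared randomness as the coupling) is an illustrative assumption, not the talk's example.

```python
import numpy as np

def mlmc_estimate(sample_pair, N):
    """Y_Multi = sum_l (1/N_l) sum_i [g(U_l^(i)) - g(U_{l-1}^(i))].

    sample_pair(l) must return one coupled draw (g(U_l), g(U_{l-1})),
    with the convention g(U_{-1}) = 0 at l = 0.
    """
    total = 0.0
    for l, n_l in enumerate(N):
        acc = 0.0
        for _ in range(n_l):
            g_fine, g_coarse = sample_pair(l)
            acc += g_fine - g_coarse
        total += acc / n_l
    return total

# toy coupled sampler: "level-l approximation" with bias 2^-l, coupled by a shared z
rng = np.random.default_rng(0)
def sample_pair(l):
    z = rng.standard_normal()                 # shared randomness = the coupling
    fine = z + 2.0 ** -l
    coarse = 0.0 if l == 0 else z + 2.0 ** -(l - 1)
    return fine, coarse

print(mlmc_estimate(sample_pair, N=[4000, 2000, 1000, 500]))   # ~ 2^-3, the level-3 mean
```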


Illustration of pairwise coupling

Pairwise coupling of trajectories of an SDE:

$$X^1_{n+1} = X^1_n + h\,f(X^1_n) + \sqrt{h}\,\sigma(X^1_n)\,\xi_n, \quad \xi_n \sim N(0,1), \quad n = 0, \ldots, N_1,$$

$$X^0_{n+1} = X^0_n + (2h)\,f(X^0_n) + \sqrt{h}\,\sigma(X^0_n)\,(\xi_{2n} + \xi_{2n+1}), \quad n = 0, \ldots, (N_1-1)/2,$$

so that the coarse increment $\sqrt{h}\,(\xi_{2n} + \xi_{2n+1})$ has variance $2h$, consistent with $W^0_n = W^1_{2n}$ below.

[Figure: (a) the coupled Wiener processes, $W^1_n = \sqrt{h}\sum_{i=0}^{n}\xi_i$, $W^0_n = W^1_{2n}$; (b) the stochastic processes $X^0_t$, $X^1_t$ driven by them.]
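A sketch of this pairwise coupling in NumPy: the coarse path (step $2h$) is driven by the sum of the two fine-level $N(0,1)$ increments, so both paths see the same underlying Wiener path. The drift and diffusion are illustrative assumptions.

```python
import numpy as np

def coupled_paths(x0, f, sigma, h, n_fine, rng):
    """Return (X^1 after n_fine steps of size h, X^0 after n_fine/2 steps of size 2h)."""
    x_fine, x_coarse = x0, x0
    for _ in range(n_fine // 2):
        xi1, xi2 = rng.standard_normal(2)
        # two fine steps, each with increment sqrt(h) * xi ~ N(0, h)
        x_fine = x_fine + h * f(x_fine) + np.sqrt(h) * sigma(x_fine) * xi1
        x_fine = x_fine + h * f(x_fine) + np.sqrt(h) * sigma(x_fine) * xi2
        # one coarse step with increment sqrt(h) * (xi1 + xi2) ~ N(0, 2h)
        x_coarse = (x_coarse + 2 * h * f(x_coarse)
                    + np.sqrt(h) * sigma(x_coarse) * (xi1 + xi2))
    return x_fine, x_coarse

rng = np.random.default_rng(1)
xf, xc = coupled_paths(1.0, lambda x: -x, lambda x: 0.3, 2.0 ** -7, 2 ** 7, rng)
print(xf, xc)   # close to each other: small strong error between adjacent levels
```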


Multilevel vs. Single level

Assume $h_l = 2^{-l}$ and that there are $\alpha$, $\beta$, $\zeta$ with $\beta > \zeta$ such that:

(i) weak error: $|\mathbb{E}[g(U_l) - g(U)]| = O(h_l^\alpha)$;

(ii) strong error: $\mathbb{E}|g(U_l) - g(U)|^2 = O(h_l^\beta) \Rightarrow V_l = O(h_l^\beta)$;

(iii) computational cost for one realization of $g(U_l) - g(U_{l-1})$: $C_l \propto h_l^{-\zeta}$.

Both cases require $h_L^\alpha = O(\varepsilon) \Rightarrow L \propto |\log \varepsilon|$.

Single level cost $C = O(\varepsilon^{-\zeta/\alpha - 2})$: the cost per sample is $C_L \propto \varepsilon^{-\zeta/\alpha}$, and fixed $V \propto \varepsilon^2 \Rightarrow N_L \propto \varepsilon^{-2}$.

Multilevel cost $C_{ML} = O(\varepsilon^{-2})$: $N_l \propto \varepsilon^{-2} K_L h_l^{(\beta+\zeta)/2}$, so $V \propto \varepsilon^2$ and $C \propto \varepsilon^{-2} K_L^2$ for $K_L = \sum_{l=0}^{L} h_l^{(\beta-\zeta)/2} = O(1)$ [G08]: the cost of simulating a scalar random variable.

Example: Milstein solution of an SDE: $C = O(\varepsilon^{-3})$ vs. $C_{ML} = O(\varepsilon^{-2})$.
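The [G08] allocation $N_l \propto \sqrt{V_l/C_l}$ can be made concrete with a Lagrange-multiplier computation; a small sketch, where the rates and the target variance are illustrative assumptions.

```python
import numpy as np

def optimal_allocation(V, C, eps2):
    """Minimize sum(C_l N_l) subject to sum(V_l / N_l) = eps2 (Lagrange)."""
    V, C = np.asarray(V), np.asarray(C)
    lam = np.sum(np.sqrt(V * C)) / eps2      # scaling from the variance constraint
    return np.ceil(lam * np.sqrt(V / C)).astype(int)

beta, zeta, L = 2.0, 1.0, 8                  # example rates with beta > zeta
h = 2.0 ** -np.arange(L + 1)
N = optimal_allocation(V=h ** beta, C=h ** -zeta, eps2=1e-6)
print(N)                                     # N_l decreases geometrically across levels
```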


Example forward problem

Let $V := H^1(\Omega) \subset L^2(\Omega) \subset H^{-1}(\Omega) =: V^*$, $\Omega \subset \mathbb{R}^d$ with $\partial\Omega$ convex, and $f \in V^*$. Consider

$$-\nabla \cdot (\hat{u}\,\nabla p) = f \quad \text{on } \Omega, \qquad p = 0 \quad \text{on } \partial\Omega,$$

where

$$\hat{u}(x) = \bar{u}(x) + \sum_{k=1}^{K} u_k\,\sigma_k\,\Phi_k(x).$$

Define $u = \{u_k\}_{k=1}^K \in E := \prod_{k=1}^{K} [-1,1]$, with $u_k \sim U[-1,1]$ i.i.d. This determines the prior distribution for $u$. Assume $\bar{u}, \Phi_k \in C^\infty$ and $\|\Phi_k\|_\infty = 1$ for all $k$, and require

$$\inf_x \hat{u}(x) \geq \inf_x \bar{u}(x) - \sum_{k=1}^{K} \sigma_k \geq u_* > 0.$$


Bayesian inverse problem

Let $g_m \in V^*$ for $m = 1, \ldots, M$ and define

$$G(p) = [g_1(p), \cdots, g_M(p)]^\top.$$

Let $p(\cdot\,; u)$ denote the weak solution for parameter value $u$.

DATA: $y = G(p(\cdot\,; u)) + e$, $e \sim N(0, \Gamma)$, $e \perp u$.

The unnormalized density of $u \mid y$ over $u \in E$ is given by

$$\kappa(u) = e^{-\Phi[G(p(\cdot;u))]}, \qquad \Phi(G) = \tfrac{1}{2}\,|G - y|_\Gamma^2.$$

TARGET: $\eta(u) = \dfrac{\kappa(u)}{Z}$, $\quad Z = \displaystyle\int_E \kappa(u)\,du$.
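A minimal sketch of evaluating the log of the unnormalized density $\kappa(u)$; the linear `forward_G` below is a stand-in assumption for the PDE solve $u \mapsto G(p(\cdot\,;u))$, not the talk's solver.

```python
import numpy as np

def log_kappa(u, y, Gamma, forward_G):
    """log kappa(u) = -0.5 |G(p(.;u)) - y|_Gamma^2."""
    r = forward_G(u) - y                      # data misfit
    return -0.5 * r @ np.linalg.solve(Gamma, r)   # r^T Gamma^{-1} r

# toy example: a linear "forward model" and unit noise covariance
M, K = 3, 2
A = np.ones((M, K))
forward_G = lambda u: A @ u
y = np.array([0.5, 0.2, -0.1])
print(log_kappa(np.array([0.1, -0.3]), y, np.eye(M), forward_G))
```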


SMC sampler algorithm

Distributions $\eta_l$ are dictated by an accuracy parameter $h_l$ (here the FEM mesh diameter), $\infty > h_0 > h_1 > \cdots > h_\infty = 0$. Approximate $\mathbb{E}_{\eta_L}[g(U)] = \eta_L(g) = \int_E g(u)\,\eta_L(u)\,du$.

Idea: interlace sequential importance resampling (selection) along the hierarchy with mutation by MCMC kernels.

Initialize i.i.d. $U_0^i \sim \eta_0$, $i = 1, \ldots, N$. For $l \in \{0, \ldots, L-1\}$:

Resample $\{U_l^i\}_{i=1}^N$ according to the weights $\{w_l^i\}_{i=1}^N$, $w_l^i = G_l^i / \sum_{j=1}^N G_l^j$, $G_l^i = (\kappa_{l+1}/\kappa_l)(U_l^i)$.

Draw $U_{l+1}^i \sim M_{l+1}(U_l^i, \cdot)$, where $M_{l+1}$ is an MCMC kernel such that $\eta_{l+1} M_{l+1} = \eta_{l+1}$.

For $g : E \to \mathbb{R}$, $l \in \{0, \ldots, L\}$, we have the following estimators

$$\mathbb{E}_{\eta_l}[g(U)] \approx \eta_l^N(g) := \frac{1}{N}\sum_{i=1}^{N} g(U_l^i).$$
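A compact sketch of this selection/mutation loop. The Gaussian targets standing in for the discretized posteriors $\eta_l$, the $N(0,1)$ initialization, and the random-walk Metropolis choice of $M_{l+1}$ are all illustrative assumptions.

```python
import numpy as np

def smc_sampler(log_kappas, N, n_mcmc, step, rng):
    """log_kappas[l](u): log kappa_l(u), vectorized over a particle array u."""
    L = len(log_kappas) - 1
    U = rng.standard_normal(N)                 # i.i.d. draws from eta_0 (toy: N(0,1))
    for l in range(L):
        # selection: weights G_l^i = (kappa_{l+1}/kappa_l)(U_l^i)
        logw = log_kappas[l + 1](U) - log_kappas[l](U)
        w = np.exp(logw - logw.max()); w /= w.sum()
        U = U[rng.choice(N, size=N, p=w)]      # multinomial resampling
        # mutation: random-walk Metropolis kernel leaving eta_{l+1} invariant
        for _ in range(n_mcmc):
            prop = U + step * rng.standard_normal(N)
            log_acc = log_kappas[l + 1](prop) - log_kappas[l + 1](U)
            U = np.where(np.log(rng.uniform(size=N)) < log_acc, prop, U)
    return U

# toy hierarchy: increasingly precise Gaussian targets kappa_l = N(0, 1/t_l)
rng = np.random.default_rng(0)
temps = [1.0, 2.0, 4.0, 8.0]
log_kappas = [lambda u, t=t: -0.5 * t * u ** 2 for t in temps]
U_L = smc_sampler(log_kappas, N=2000, n_mcmc=5, step=0.5, rng=rng)
print(U_L.std())                               # ~ 1/sqrt(8) ~ 0.354
```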


MLSMC sampler

Notice

$$\mathbb{E}_{\eta_L}[g(U)] = \mathbb{E}_{\eta_0}[g(U)] + \sum_{l=1}^{L} \big\{\mathbb{E}_{\eta_l}[g(U)] - \mathbb{E}_{\eta_{l-1}}[g(U)]\big\} = \mathbb{E}_{\eta_0}[g(U)] + \sum_{l=1}^{L} \mathbb{E}_{\eta_{l-1}}\!\left[\left(\frac{\kappa_l(U)\,Z_{l-1}}{\kappa_{l-1}(U)\,Z_l} - 1\right) g(U)\right]. \qquad (\dagger)$$

Idea: approximate $(\dagger)$ using the SMC sample hierarchy.

Key: subsample $(U_0^{1:N_0}, \ldots, U_{L-1}^{1:N_{L-1}})$ as in single level SMC, but with $+\infty > N_0 \geq N_1 \geq \cdots \geq N_{L-1} \geq 1$ appropriately chosen.


MLSMC estimator

The MLSMC consistent estimator of $\eta_L(g)$ is given by

$$Y := \eta_0^{N_0}(g) + \sum_{l=1}^{L} \left\{ \frac{\eta_{l-1}^{N_{l-1}}(g\,G_{l-1})}{\eta_{l-1}^{N_{l-1}}(G_{l-1})} - \eta_{l-1}^{N_{l-1}}(g) \right\}.$$

(i) The $L+1$ terms above are not unbiased estimates of $\mathbb{E}_{\eta_l}[g(U)] - \mathbb{E}_{\eta_{l-1}}[g(U)]$, so decompose the MSE as

$$\mathbb{E}\big[(Y - \mathbb{E}_{\eta_\infty}[g(U)])^2\big] \leq 2\,\mathbb{E}\big[(Y - \mathbb{E}_{\eta_L}[g(U)])^2\big] + 2\,\big(\mathbb{E}_{\eta_L}[g(U)] - \mathbb{E}_{\eta_\infty}[g(U)]\big)^2.$$

(ii) The same $L+1$ estimates are not independent, so a more complex error analysis is required to characterize $\mathbb{E}[(Y - \mathbb{E}_{\eta_L}[g(U)])^2]$.
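A sketch of the estimator $Y$, assuming particle sets produced by one SMC sweep (e.g. the sampler sketched earlier) with decreasing particle counts; the argument names are illustrative.

```python
import numpy as np

def mlsmc_estimator(U, g, G):
    """MLSMC estimate of eta_L(g).

    U[l] : array of N_l particles approximating eta_l, l = 0, ..., L-1.
    G[l] : vectorized u -> (kappa_{l+1}/kappa_l)(u), l = 0, ..., L-1.
    """
    Y = np.mean(g(U[0]))                       # eta_0^{N_0}(g)
    for l in range(1, len(U) + 1):
        u = U[l - 1]                           # the N_{l-1} particles ~ eta_{l-1}
        Gl = G[l - 1](u)
        # self-normalized increment eta_{l-1}(g G_{l-1}) / eta_{l-1}(G_{l-1}) - eta_{l-1}(g)
        Y += np.mean(g(u) * Gl) / np.mean(Gl) - np.mean(g(u))
    return Y
```

Note that both terms of each increment reuse the same $N_{l-1}$ particles, which is precisely why the $L+1$ estimates are neither unbiased nor independent, as remarked in (i) and (ii) above.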


Assumptions

(A1) There exist $0 < \underline{C} < \overline{C} < +\infty$ such that

$$\sup_{1 \leq l \leq L}\,\sup_{u \in E}\, G_l(u) \leq \overline{C}, \qquad \inf_{1 \leq l \leq L}\,\inf_{u \in E}\, G_l(u) \geq \underline{C}.$$

(A2) There exists a $\rho \in (0,1)$ such that for any $1 \leq p \leq L-1$, $(u,v) \in E^2$, $A \in \sigma(E)$,

$$\int_A M_p(u, du') \geq \rho \int_A M_p(v, du').$$

(A3) There is a $\beta > 0$ such that

$$V_l := \Big\| \frac{Z_{l-1}}{Z_l}\,G_{l-1} - 1 \Big\|_\infty^2 = O(h_l^\beta).$$


Main result

Theorem (BJLTZ16). Assume (A1-3). For any $g : E \to \mathbb{R}$ bounded,

$$\mathbb{E}\big[(Y - \mathbb{E}_{\eta_L}[g(U)])^2\big] \lesssim \frac{1}{N_0} + \sum_{l=1}^{L}\left(\frac{V_l}{N_l} + \Big(\frac{V_l}{N_l}\Big)^{1/2} \sum_{q=l+1}^{L} \frac{V_q^{1/2}}{N_q}\right).$$

In particular, for $\beta > \zeta$, $L$ and $\{N_l\}_{l=0}^L$ can be chosen such that MSE $= O(\varepsilon^2)$ for computational cost $= O(\varepsilon^{-2})$, the optimal case.


Runtime cost as a function of error

[Figure: runtime cost $\propto \sum_{l=0}^{L} N_l h_l^{-1}$ versus MSE ($\varepsilon^2$), comparing the MLSMC and SMC algorithms.]


(IS) MLSMC sampler for normalizing constants

Two unbiased estimators were proposed which provide the optimal rate with a logarithmic penalty on the cost: MSE $O(\varepsilon^2)$ for cost $O(|\log \varepsilon|\,\varepsilon^{-2})$ [DJLZ16]. Here one can also construct rMLMC estimators of Rhee & Glynn type.

[Figure: runtime cost $\propto \sum_{l=0}^{L} N_l h_l^{-1}$ versus MSE ($\varepsilon^2$) for MLSMC with the estimators $\gamma_l^{N_{0:l-2}}(1)$ and $\gamma_l^{N_{0:l-1}}(1)$, and SMC with $\gamma_l^{N_{0:l-1}}(1)$.]


(IS) MLSMC samplers with DILI mutations

Posterior over function space; levels include refinement in both parameter and model, $\eta_l(u_{0:l})$. A covariance-based LIS (cLIS) is introduced and incorporated in DILI proposals [CLM16, BJLMZ17], giving a substantial reduction in cost.

[Figure: cost versus RMSE for mlsmc (dili), mlsmc (pcn), and smc, with reference slopes $\propto \varepsilon^{-2}$ and $\propto \varepsilon^{-3}$.]


(CA) ML particle filter (MLPF) for SDE

Filtering involves a sequence of Bayesian inversions, separated by propagation in time (through an SDE).

Let $\eta_{0,m}, \ldots, \eta_{L,m}, \ldots, \eta_{\infty,m}$ denote the time-$m$ filtering distribution at a hierarchy of levels (time discretization).

Coupled traditional SMC algorithms (particle filters) can be used for each level.

The mutation $M_\ell$ is now coupled propagation of a pair of initial conditions through an SDE discretized at two successive mesh refinements, for $\ell = 0, \ldots, L$.

Selection is performed by a novel pairwise coupled resampling which preserves the marginals (maximal coupling); see the sketch below.

MLMC results carry over with a somewhat weaker rate, $\beta \to \beta/2$ [JKLZ15].

A new version achieves an improved rate using the $L^2$-optimal coupling instead.
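A sketch of maximally coupled resampling: given fine- and coarse-level weights over shared ancestor indices, draw index pairs so that each marginal is correct while the two indices agree as often as possible. The weights and implementation details are illustrative assumptions, not the MLPF code.

```python
import numpy as np

def coupled_resample(w_fine, w_coarse, N, rng):
    """Draw N index pairs, marginally ~ w_fine and ~ w_coarse, equal as often as possible."""
    overlap = np.minimum(w_fine, w_coarse)
    p_same = overlap.sum()                      # prob. of picking a common ancestor
    idx_f = np.empty(N, dtype=int)
    idx_c = np.empty(N, dtype=int)
    same = rng.uniform(size=N) < p_same
    if same.any():
        common = rng.choice(len(w_fine), size=same.sum(), p=overlap / p_same)
        idx_f[same] = idx_c[same] = common
    if (~same).any():
        r_f = (w_fine - overlap) / (1.0 - p_same)    # residual fine marginal
        r_c = (w_coarse - overlap) / (1.0 - p_same)  # residual coarse marginal
        idx_f[~same] = rng.choice(len(w_fine), size=(~same).sum(), p=r_f)
        idx_c[~same] = rng.choice(len(w_coarse), size=(~same).sum(), p=r_c)
    return idx_f, idx_c

rng = np.random.default_rng(0)
w1 = np.array([0.5, 0.3, 0.2]); w2 = np.array([0.4, 0.4, 0.2])
i_f, i_c = coupled_resample(w1, w2, 10_000, rng)
print((i_f == i_c).mean())                      # ~ sum(min(w1, w2)) = 0.9
```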


(CA) MLPF numerical experiments

[Figure: MSE, bias$^2$, and variance versus cost for the MLPF and the standard PF, on OU, GBM, Langevin, and NLM examples.]


(CA) ML ensemble Kalman filter (MLEnKF) for S(P)DE

EnKF uses the sample covariance of an ensemble of particles to approximate a linear Gaussian Bayesian update, given by an affine transformation of the particles.

Multilevel approximation of the covariance improves the cost for MSE $O(\varepsilon^2)$ [HLT16]:

Best theoretical bound (at step $n$): $O(|\log \varepsilon|^{2n}\,\varepsilon^{-2})$.

Numerically (uniformly in $n$): $O(\varepsilon^{-2})$.

New SPDE extension: same $n$-dependence, much more broadly applicable [CHLNT17].

[Figure: RMSE versus runtime [s] for EnKF and MLEnKF, with reference slopes $c\,s^{-1/3}$ and $c\,s^{-1/2}$.]


(AC) Parameter estimation for SDE with pMCMC

Aim: estimate $\mathbb{E}[\varphi(\theta) \mid y]$, where $y$ is a finite set of partial observations of the SDE $X_t^\theta$ on $[0,T]$, parameterized by $\theta$.

Particle MCMC: iterate

propose $\theta' \sim q(\theta, \cdot)$;

simulate $\{X^{\theta',i}\}_{i=1}^M \approx \pi(X \mid \theta', y)$ with a particle filter;

compute the non-negative and unbiased estimator $p^M(y \mid \theta') = \prod_{p=1}^{n} \Big( \frac{1}{M} \sum_{i=1}^{M} g_p(X_p^{\theta',i}) \Big)$;

accept/reject according to

$$1 \wedge \frac{p^M(y \mid \theta')\,\pi(\theta')\,q(\theta', \theta)}{p^M(y \mid \theta)\,\pi(\theta)\,q(\theta, \theta')}.$$
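A compact sketch of this accept/reject loop. The particle filter is abstracted behind `estimate_likelihood`, a stand-in that must return the non-negative, unbiased estimate $p^M(y \mid \theta)$; a symmetric proposal $q$ is assumed so its ratio cancels. All names are illustrative, not the talk's implementation.

```python
import numpy as np

def pmcmc(theta0, y, log_prior, propose, estimate_likelihood, M, n_iter, rng):
    theta = theta0
    like = estimate_likelihood(theta, y, M, rng)       # p^M(y|theta) > 0
    chain = []
    for _ in range(n_iter):
        theta_prop = propose(theta, rng)               # theta' ~ q(theta, .)
        like_prop = estimate_likelihood(theta_prop, y, M, rng)
        # accept with prob 1 ^ [p^M(y|theta') pi(theta')] / [p^M(y|theta) pi(theta)]
        log_acc = (np.log(like_prop) + log_prior(theta_prop)
                   - np.log(like) - log_prior(theta))
        if np.log(rng.uniform()) < log_acc:
            theta, like = theta_prop, like_prop        # accept
        chain.append(theta)
    return np.array(chain)
```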


MLMC version [JKLZ16]: construct an approximate coupling $\pi_{l,l-1}(\theta, X^l, X^{l-1})$: the usual coupled forward kernel, and the coupled selection function $G_{p,\theta}(X^l, X^{l-1}) = \max\{g_{p,\theta}(X^l),\, g_{p,\theta}(X^{l-1})\}$. Let $H_l(\theta, X^l, X^{l-1}) = \prod_{p=1}^{n} g_{p,\theta}(X^l) / G_{p,\theta}(X^l, X^{l-1})$. Then

$$\mathbb{E}_{\pi_l}[\varphi(\theta)] - \mathbb{E}_{\pi_{l-1}}[\varphi(\theta)] = \frac{\mathbb{E}_{\pi_{l,l-1}}[\varphi(\theta)\,H_l(\theta, X^l, X^{l-1})]}{\mathbb{E}_{\pi_{l,l-1}}[H_l(\theta, X^l, X^{l-1})]} - \frac{\mathbb{E}_{\pi_{l,l-1}}[\varphi(\theta)\,H_{l-1}(\theta, X^l, X^{l-1})]}{\mathbb{E}_{\pi_{l,l-1}}[H_{l-1}(\theta, X^l, X^{l-1})]}.$$

Optimal results hold with the same rate as the forward problem.

[Figure: cost versus error for ML-PMCMC and PMCMC, estimating $\sigma$ and $\theta$ for a Langevin SDE and an Ornstein-Uhlenbeck process.]


(AC) Multi-index Markov chain Monte Carlo (MIMCMC)

If the spatio-temporal approximation dimension is $d > 1$, then MIMC is preferable to MLMC [HNT15]. For a multi-index $\alpha \in \mathbb{N}^d$,

$$\Delta_i E_\alpha(\varphi(u)) = E_\alpha(\varphi(u)) - E_{\alpha - e_i}(\varphi(u)), \qquad \Delta = \Delta_d \circ \cdots \circ \Delta_1,$$

$$E(\varphi(u)) = \sum_{\alpha} \Delta E_\alpha(\varphi(u)) \approx \sum_{\alpha \in \mathcal{I}} \Delta E_\alpha(\varphi(u)).$$

Approximate coupling can be applied to the $2^d$ probability measures in each summand.
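A small sketch of the mixed difference $\Delta = \Delta_d \circ \cdots \circ \Delta_1$: in $d$ dimensions it expands into $2^d$ signed terms $E_{\alpha - j}$, $j \in \{0,1\}^d$, with the convention that $E$ at a negative index is $0$. Here `E` is a stand-in for the level-$\alpha$ approximation $E_\alpha(\varphi(u))$, not an actual PDE solve.

```python
from itertools import product

def mixed_difference(E, alpha):
    """Delta E_alpha = sum over j in {0,1}^d of (-1)^{|j|} E_{alpha - j}."""
    total = 0.0
    for j in product((0, 1), repeat=len(alpha)):
        shifted = tuple(a - jj for a, jj in zip(alpha, j))
        if min(shifted) < 0:        # boundary convention: E vanishes off the grid
            continue
        total += (-1) ** sum(j) * E(shifted)
    return total

# toy check on a rectangular index set: the mixed differences telescope exactly
E = lambda a: 1.0 - 0.5 ** a[0] - 0.25 ** a[1]
I = list(product(range(6), repeat=2))
print(sum(mixed_difference(E, a) for a in I), E((5, 5)))   # equal by telescoping
```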


Optimal results hold for appropriate regularity [JKLZ17]

[Figure: cost versus error for mcmc and mimcmc.]


Summary

The MLSMC sampler can perform as well as MLMC.

For our example $\beta > \zeta$. If $\beta \leq \zeta$, the cost is somewhat higher, analogous to standard MLMC.

If $\zeta > 2\alpha$ then the optimal cost is $\varepsilon^{-\zeta/\alpha}$, the cost of a single simulation at the finest level.

New importance sampling: MLSMC with DILI mutations.

Coupled algorithms: the MLPF strong error rate is effectively reduced by coupled resampling, $\beta \to \beta/2$.

Coupled algorithms: MLEnKF has a spurious $n$-dependent logarithmic penalty $|\log \varepsilon|^{2n}$ on cost.

New coupled algorithms: MLMCMC.

Approximate couplings (Friday's talk): ML-PMCMC for SDE parameter estimation preserves the strong error rate $\beta$.

New approximate couplings: MIMCMC and MISMC2.

Looking for students/postdocs with similar interests.


References

[BJLTZ15]: Beskos, Jasra, Law, Tempone, Zhou. "Multilevel Sequential Monte Carlo samplers." SPA 127:5, 1417-1440 (2017).

[DJLZ16]: Del Moral, Jasra, Law, Zhou. "Multilevel Sequential Monte Carlo samplers for normalizing constants." ToMACS 27(3), 20 (2017).

[JKLZ15]: Jasra, Kamatani, Law, Zhou. "Multilevel particle filter." SINUM 55(6), 3068-3096 (2017).

[JLZ16]: Jasra, Law, Zhou. "Forward and Inverse Uncertainty Quantification using Multilevel Monte Carlo Algorithms for an Elliptic Nonlocal Equation." Int. J. Unc. Quant. 6(6), 501-514 (2016).

[HLT15]: Hoel, Law, Tempone. "Multilevel ensemble Kalman filter." SINUM 54(3), 1813-1839 (2016).


References

[G08]: Giles. Op. Res. 56, 607-617 (2008).

[H00]: Heinrich. LSSC proceedings (2001).

[HNT15]: Haji-Ali, Nobile, Tempone. Numerische Mathematik 132, 767-806 (2016).

[DDJ06]: Del Moral, Doucet, Jasra. J. R. Statist. Soc. B 68, 411-436 (2006).

[C02]: Chopin. Biometrika 89:3, 539-552 (2002).

[D04]: Del Moral. "Feynman-Kac Formulae." Springer: New York (2004).

[CLM16]: Cui, Law, Marzouk. J. Comp. Phys. 304, 109-137 (2016).

[HSS13]: Hoang, Schwab, Stuart. Inverse Prob. 29, 085010 (2013).

[KST13]: Ketelsen, Scheichl, Teckentrup. SIAM/ASA JUQ 3(1), 1075-1108 (2015).


References

[CHLNT17]: Chernov, Hoel, Law, Nobile, Tempone. "Multilevel ensemble Kalman filtering for spatio-temporal processes." arXiv:1608.08558 (2017).

[JKLZ17]: Jasra, Kamatani, Law, Zhou. "MLMC for static Bayesian parameter estimation." SISC 40(2), A887-A902 (2018). See talk on Friday.

[JKLZ17]: Jasra, Kamatani, Law, Zhou. "A Multi-Index Markov Chain Monte Carlo Method." IJUQ 8(1) (2018).

[JLS17]: Jasra, Law, Suciu. "Advanced Multilevel Monte Carlo Methods." arXiv:1704.07272 (2017).

[BJLMZ17]: Beskos, Jasra, Law, Marzouk, Zhou. "MLSMC samplers with DILI mutations." SIAM/ASA JUQ 6(2), 762-786 (2018).


References

[JLX18.i]: Jasra, Law, Xu. "Markov chain Simulation for Multilevel Monte Carlo." arXiv:1806.09754 (2018).

[JLX18.ii]: Jasra, Law, Xu. "MISMC2 for partially observed SPDE." arXiv:1805.00415 (2018).


Thank you
