Intractable Likelihood / The Bernoulli Factory problem
Barkers and more / The Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
Bayesian Inference in Intractable Likelihood Models
Krzysztof Łatuszynski (University of Warwick, UK)
(The Alan Turing Institute, London)
WISŁA 2018
Outline

Intractable Likelihood
- Monte Carlo based inference

The Bernoulli Factory problem
- The Bernoulli Factory
- Motivation
- Bernoulli Factory - what is known?
- Reverse time martingale approach to sampling

Barkers and more
- Barkers Algorithm
- The two coin algorithm
- The s-poly-Barkers
- The Dice Enterprise!

The Markov switching diffusion model & exact Bayesian inference
- The model and inference
- Designing an exact MCMC algorithm
- Example: the SINE model

Pseudo-marginal MCMC
Monte Carlo based inference
- A common goal in parametric Bayesian inference is to estimate posterior expectations.
- Given data $y$, the likelihood $l_\theta(y)$ and the prior $p(\theta)$, the posterior is

  $$\pi(\theta) = \frac{p(\theta)\, l_\theta(y)}{\int p(\theta)\, l_\theta(y)\, d\theta}$$

- We are interested in computing expectations of the form

  $$\pi(\varphi) = \int \varphi(\theta)\, \pi(\theta)\, d\theta$$

- The integral cannot be computed analytically.
- Monte Carlo methods approximate $\pi(\varphi)$ with random variables.
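The expectation above can be approximated by averaging posterior draws. A minimal sketch, using a toy conjugate Beta-Bernoulli model (my own illustration, not a model from the talk) in which the posterior can be sampled directly:

```python
import random

# Toy conjugate example: a Beta(1, 1) prior on theta with a Bernoulli
# likelihood and 7 successes in 10 trials gives a Beta(8, 4) posterior.
# We approximate pi(phi) = E[phi(theta)] with phi(theta) = theta by
# averaging N posterior draws.

random.seed(42)

def posterior_sample():
    # random.betavariate draws from a Beta(alpha, beta) distribution
    return random.betavariate(8, 4)

N = 100_000
estimate = sum(posterior_sample() for _ in range(N)) / N
exact = 8 / (8 + 4)  # posterior mean of Beta(8, 4), i.e. 2/3

print(estimate, exact)  # the Monte Carlo estimate is close to 2/3
```

By the law of large numbers the estimate converges to $\pi(\varphi)$ at rate $O(N^{-1/2})$, independently of the dimension of $\theta$.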
Metropolis-Hastings
- The Metropolis-Hastings algorithm generates a Markov chain that is reversible with respect to $\pi$. Its transition kernel $P(\cdot, \cdot)$ satisfies

  $$\pi(\theta)\, P(\theta, \theta') = \pi(\theta')\, P(\theta', \theta)$$

- The algorithm: given $\theta_n$,
  1. sample $\theta' \sim Q(\theta_n, \cdot)$;
  2. with probability $\alpha(\theta_n, \theta')$ set $\theta_{n+1} := \theta'$; otherwise set $\theta_{n+1} := \theta_n$;

  where

  $$\alpha(\theta_n, \theta') = \min\left\{1,\ \frac{\pi(\theta')\, q(\theta', \theta_n)}{\pi(\theta_n)\, q(\theta_n, \theta')}\right\} = \min\left\{1,\ \frac{p(\theta')\, l_{\theta'}(y)\, q(\theta', \theta_n)}{p(\theta_n)\, l_{\theta_n}(y)\, q(\theta_n, \theta')}\right\}$$

- Tractable model: $l_\theta(y)$ can be computed pointwise.
- Intractable model: $l_\theta(y)$ cannot be computed pointwise.
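The accept/reject step above can be sketched in a few lines. This is a minimal random-walk Metropolis-Hastings sampler targeting a standard normal, a tractable stand-in for $\pi$ chosen purely for illustration; the symmetric Gaussian proposal makes $q$ cancel in $\alpha$:

```python
import math
import random

# Random-walk Metropolis-Hastings targeting pi(theta) ∝ exp(-theta^2/2).
# The proposal theta' = theta + N(0, step^2) is symmetric, so the
# acceptance probability reduces to min(1, pi(theta')/pi(theta)).

random.seed(0)

def log_pi(theta):
    return -0.5 * theta * theta  # unnormalised log target

def mh_chain(n_steps, step=1.0, theta0=0.0):
    theta = theta0
    chain = []
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)       # theta' ~ Q(theta_n, .)
        log_alpha = log_pi(prop) - log_pi(theta)     # log of pi'/pi
        if random.random() < math.exp(min(0.0, log_alpha)):
            theta = prop                             # accept: theta_{n+1} = theta'
        chain.append(theta)                          # else keep theta_{n+1} = theta_n
    return chain

chain = mh_chain(50_000)
mean = sum(chain) / len(chain)
var = sum((t - mean) ** 2 for t in chain) / len(chain)
print(mean, var)  # ergodic averages: close to 0 and 1 respectively
```

Note that only the *ratio* $\pi(\theta')/\pi(\theta_n)$ is ever needed, so the normalising constant of $\pi$ may be unknown; the intractability discussed in this talk is of a different kind, namely that the likelihood factor itself cannot be evaluated.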
Intractable Likelihood
- The acceptance probability

  $$\alpha(\theta_n, \theta') = \min\left\{1,\ \frac{\pi(\theta')\, q(\theta', \theta_n)}{\pi(\theta_n)\, q(\theta_n, \theta')}\right\} = \min\left\{1,\ \frac{p(\theta')\, l_{\theta'}(y)\, q(\theta', \theta_n)}{p(\theta_n)\, l_{\theta_n}(y)\, q(\theta_n, \theta')}\right\}$$

- Tractable model: $l_\theta(y)$ can be computed pointwise.
- Intractable model: $l_\theta(y)$ cannot be computed pointwise.
- Intractable models are found in all application areas:
  - physics, biology, chemistry, epidemiology, etc.
  - finance, marketing, manufacturing, etc.
Intractable Likelihood
- There are several types of intractability:
  - latent variable models: $l_\theta(y) = \int l_\theta(y, x)\, dx$, where $l_\theta(y, x)$ can be computed pointwise;
  - big data: $l_\theta(y) = \prod_i l_\theta(y_i)$;
  - ABC: one can only sample $z \sim l_\theta(\cdot)$.
- Consider a diffusion

  $$dY_t = \mu_\theta(Y_t)\, dt + \sigma_\theta(Y_t)\, dB_t$$

  and discretely observed data $y_{t_1}, \ldots, y_{t_n}$.
- Then

  $$l_\theta(y_{t_1}, \ldots, y_{t_n}) = \prod_i p_\theta(y_{t_i}, y_{t_{i+1}})$$

  where $p_\theta(y_{t_i}, y_{t_{i+1}})$ is the transition density of the diffusion, which is typically not available in closed form.
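To see why diffusions are a natural example of intractability: the process is easy to *simulate* approximately, yet the transition density $p_\theta$ it implies generally has no closed form. A minimal Euler-Maruyama sketch (the drift and diffusion coefficients below are illustrative choices of mine, not the talk's model):

```python
import math
import random

# Euler-Maruyama simulation of dY = mu(Y) dt + sigma(Y) dB.
# This produces approximate sample paths of the diffusion, but it does
# NOT let us evaluate the transition density p_theta(y_{t_i}, y_{t_{i+1}})
# pointwise -- which is exactly what the likelihood product requires.

random.seed(1)

def euler_maruyama(y0, mu, sigma, T, n_steps):
    dt = T / n_steps
    y = y0
    path = [y0]
    for _ in range(n_steps):
        dB = random.gauss(0.0, math.sqrt(dt))   # Brownian increment over dt
        y = y + mu(y) * dt + sigma(y) * dB      # one Euler step
        path.append(y)
    return path

# Illustrative mean-reverting drift and constant diffusion coefficient
mu = lambda y: -0.5 * y
sigma = lambda y: 1.0

path = euler_maruyama(y0=0.0, mu=mu, sigma=sigma, T=10.0, n_steps=1000)
print(len(path), path[-1])
```

The Euler scheme introduces a discretisation bias; the "exact" methods discussed later in the talk are designed precisely to avoid such bias.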
The Bernoulli Factory problem
- Let $p \in (0, 1)$ be unknown.
- Given a black box that generates a sequence $X_1, X_2, \ldots$ of $p$-coins,
- is it possible to generate an $f(p)$-coin for a known $f$?
- For example, $f(p) = \min\{1, 2p\}$.
Some history
- von Neumann posed and solved the case $f(p) = 1/2$ (see e.g. Peres 1992).
- Algorithm:
  1. Set $n = 1$.
  2. Use the black box to sample $X_n, X_{n+1}$.
  3. If $(X_n, X_{n+1}) = (0, 1)$, output 1 and STOP.
  4. If $(X_n, X_{n+1}) = (1, 0)$, output 0 and STOP.
  5. Set $n := n + 2$ and GOTO 2.
- Asmussen posed an open problem for $f(p) = 2p$,
- but it turned out to be difficult.
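The von Neumann scheme above is short to implement: the two informative pairs $(0,1)$ and $(1,0)$ each occur with probability $p(1-p)$, so conditioning on seeing one of them yields an exactly fair coin for any unknown $p \in (0,1)$. A sketch:

```python
import random

# von Neumann's Bernoulli factory for f(p) = 1/2: toss the p-coin in
# pairs; output 1 on (0, 1), output 0 on (1, 0), and retry otherwise.
# The output is exactly Bernoulli(1/2) whatever the unknown p is.

random.seed(7)

def p_coin(p):
    """The black box: one toss of a p-coin."""
    return 1 if random.random() < p else 0

def fair_coin(p):
    while True:
        x, y = p_coin(p), p_coin(p)
        if (x, y) == (0, 1):
            return 1
        if (x, y) == (1, 0):
            return 0
        # (0, 0) or (1, 1): uninformative, discard and retry

flips = [fair_coin(0.9) for _ in range(20_000)]
freq = sum(flips) / len(flips)
print(freq)  # close to 0.5 even though p = 0.9
```

The expected number of p-coin tosses per output is $1/(p(1-p))$, which blows up as $p$ approaches 0 or 1; efficiency questions of this kind recur throughout the Bernoulli factory literature.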
Motivation I - MCMC for jump diffusions

- MCMC for jump diffusions with stochastic jump rate (F. Goncalves, G.O. Roberts, KL)
- Consider the model, for t ∈ [0, T]:

    γ_t ~ Ornstein-Uhlenbeck(θ_1)
    λ_t = exp(γ_t)
    J_t ~ JumpProcess(λ_t, dΔ)
    dV_t = μ(V_{t-}, θ_2) dt + σ(V_{t-}, θ_2) dB_t + dJ_t

- Gibbs sampling from the full posterior will alternate between

    (J_t, V_t | ·);  (λ_t | ·);  (θ_1 | ·);  (θ_2 | ·)

- let's have a look at updating (λ_t | ·).
Motivation I - MCMC for jump diffusions

- for updating (λ_t | ·) compute

    p(γ_t | ·) = p(γ_t | J_t) ∝ p(γ_t) p(J_t | γ_t)
               = p(γ_t) exp{ -∫_0^T e^{γ_t} dt + Σ_{j=1}^{N_J} γ_{t_j} }
               = p(γ_t) K_γ exp{ -∫_0^T e^{γ_t} dt }
               = p(γ) K_γ I(γ)

- If the proposal is p(γ_t), then the Metropolis acceptance probability is of the form

    α(γ^(i), γ^(i+1)) = min{ 1, K(γ^(i), γ^(i+1)) I(γ^(i), γ^(i+1)) },  where

- K(γ^(i), γ^(i+1)) is a known constant
- we have a mechanism to generate events of probability I(γ^(i), γ^(i+1))
- so we have the Bernoulli factory problem with

    f(p) = min{1, Kp}.
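The "mechanism to generate events of probability I(γ)" can be illustrated by Poisson thinning (my own sketch, under simplifying assumptions): if the path's intensity h(t) is bounded by Λ on [0, T], then the event that a unit-rate Poisson process on [0, T] × [0, Λ] places no points under the graph of h occurs with probability exactly exp(−∫_0^T h(t) dt), even though the integral is never evaluated. The function name and the constant test intensity are hypothetical:

```python
import math
import random

def event_exp_minus_integral(h, T, lam, rng):
    """Return 1 with probability exp(-integral_0^T h(t) dt), given 0 <= h(t) <= lam.

    Throw Poisson(lam * T) uniform points into [0, T] x [0, lam]; the event
    'no point falls below the graph of h' has exactly the stated probability.
    """
    # sample N ~ Poisson(lam * T) by CDF inversion (fine for small means)
    mean = lam * T
    n, term, u = 0, math.exp(-mean), rng.random()
    cdf = term
    while u > cdf:
        n += 1
        term *= mean / n
        cdf += term
    for _ in range(n):
        t, y = rng.uniform(0, T), rng.uniform(0, lam)
        if y < h(t):
            return 0
    return 1

# check against a case where the integral is known: h = 0.7 on [0, 2]
rng = random.Random(7)
est = sum(event_exp_minus_integral(lambda t: 0.7, 2.0, 1.0, rng)
          for _ in range(50000)) / 50000
print(est, math.exp(-1.4))
```

This is exactly the shape of problem the slide describes: the coin with success probability I(γ) is cheap to toss, but I(γ) itself is never available as a number.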
Motivation II - perfect sampling for Markov chains

- Consider {X_n}_{n≥0}, an ergodic Markov chain with transition kernel P and limiting distribution π.
- Under mild assumptions P can be decomposed as

    P(x, ·) = s(x) ν(·) + (1 - s(x)) Q(x, ·)

- and every time we sample from P we flip a coin with probability s(x) to decide between sampling from ν(·) and Q(x, ·)
- Let τ be the first time the coin points at ν(·)
- then π(·) admits the decomposition

    π(·) = Σ_{n=0}^∞ p_n R^n(ν, ·),  where  p_n := Pr(τ ≥ n) / E(τ),  R(x, ·) = (P(x, ·) - s(x) ν(·)) / (1 - s(x)).

- it looks like perfect sampling from π is possible using rejection sampling.
  (S. Asmussen, P.W. Glynn, H. Thorisson 1992; J.P. Hobert, C.P. Robert 2004; J. Blanchet, X.-L. Meng 2005; J. Blanchet, A.C. Thomas 2007)
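For a finite state space the decomposition can be checked numerically. The sketch below is my own illustration (not from the talk), taking s(x) ≡ s constant for simplicity, so that the residual kernel is R = Q, τ is geometric, and p_n = s(1 − s)^n; the truncated series Σ_n p_n νR^n then recovers the stationary distribution of P:

```python
import numpy as np

# a 3-state residual kernel Q (rows sum to 1), regeneration measure nu, constant s
Q = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.3, 0.3, 0.4]])
nu = np.array([0.2, 0.5, 0.3])
s = 0.4

# the mixture kernel P(x, .) = s * nu(.) + (1 - s) * Q(x, .)
P = s * np.outer(np.ones(3), nu) + (1 - s) * Q

# stationary distribution of P: left eigenvector at eigenvalue 1, normalised
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# with constant s: R = Q and p_n = s (1 - s)^n; sum the (truncated) series
series = sum(s * (1 - s) ** n * nu @ np.linalg.matrix_power(Q, n)
             for n in range(200))
print(np.max(np.abs(series - pi)))  # tiny truncation error
```

The identity is easy to verify by hand in this constant-s case: π = Σ_n s(1 − s)^n νQ^n solves π = sν + (1 − s)πQ = πP.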
Motivation III - perfect sampling for Markov chains

- π(·) admits the decomposition

    π(·) = Σ_{n=0}^∞ p_n R^n(ν, ·),  where  p_n := Pr(τ ≥ n) / E(τ).

- find a probability distribution d(n) s.t. Pr(τ > n) ≤ M d(n)
  (e.g. using drift conditions for geometrically ergodic chains)
- Now we can write

    Pr(τ > n) / E(τ) = [ Pr(τ > n) / (E(τ) d(n)) ] · d(n)

- Goal: accept the d(n) proposal with probability proportional to Pr(τ > n) / (E(τ) d(n))
- so we can use

    Pr(τ > n) / (M d(n)) =: K Pr(τ > n) < 1

- where K is known and we can generate events of probability Pr(τ > n).
- The above was successfully implemented by J. Flegal, R. Herbei 2012.
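A toy version of this rejection scheme (my own sketch; all distributions here are assumptions chosen so everything is in closed form): take τ ~ Geometric(s), so Pr(τ > n) = (1 − s)^n, E(τ) = 1/s and the target is p_n = s(1 − s)^n. Propose n from d(n) = r(1 − r)^n with r < s, so Pr(τ > n) ≤ M d(n) with M = 1/r, and accept with probability Pr(τ > n)/(M d(n)) = ((1 − s)/(1 − r))^n — an event that could equally be realised by coin flips when Pr(τ > n) is not known analytically:

```python
import random

s, r = 0.5, 0.3          # tau ~ Geometric(s); proposal d(n) = r (1 - r)^n, r < s
rng = random.Random(42)

def sample_pn():
    """Rejection sampling from p_n = s (1 - s)^n using the proposal d(n)."""
    while True:
        # propose n ~ d(n) = r (1 - r)^n, a geometric on {0, 1, 2, ...}
        n = 0
        while rng.random() >= r:
            n += 1
        # accept with probability Pr(tau > n) / (M d(n)) = ((1 - s)/(1 - r))^n
        if rng.random() < ((1 - s) / (1 - r)) ** n:
            return n

samples = [sample_pn() for _ in range(100000)]
freq0 = samples.count(0) / len(samples)
print(freq0)  # should be close to p_0 = s = 0.5
```

Accepted proposals are distributed proportionally to d(n) · ((1 − s)/(1 − r))^n = r(1 − s)^n, which normalises to exactly p_n.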
Keane and O’Brien - existence result

- Keane and O’Brien (1994):

  Let f : P ⊆ (0, 1) → [0, 1]. Then it is possible to simulate an f(p)-coin ⇐⇒
  - f is constant, or
  - f is continuous and, for some n ∈ N and all p ∈ P, satisfies

      min{ f(p), 1 - f(p) } ≥ min{ p, 1 - p }^n

- however, their proof is not constructive
- note that the result rules out min{1, 2p}, but not min{1 - ε, 2p}.
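The polynomial lower bound in the theorem can be checked numerically. A small sketch (my own, with an assumed ε = 0.05 and a finite grid standing in for P): the capped function min{1 − ε, 2p} satisfies the bound for n = 8, while min{1, 2p} fails at p = 1/2, where f = 1 and the left-hand side vanishes:

```python
def satisfies_kob(f, n, grid):
    """Check min{f(p), 1 - f(p)} >= min{p, 1 - p}^n on a grid of p values."""
    return all(min(f(p), 1 - f(p)) >= min(p, 1 - p) ** n for p in grid)

grid = [i / 1000 for i in range(1, 1000)]   # p ranging over (0, 1)
eps = 0.05

def f_capped(p):            # min{1 - eps, 2p}: a valid Bernoulli factory target
    return min(1 - eps, 2 * p)

def f_exact(p):             # min{1, 2p}: ruled out by the theorem
    return min(1.0, 2 * p)

print(satisfies_kob(f_capped, 8, grid))   # True: the bound holds with n = 8
print(satisfies_kob(f_exact, 8, grid))    # False: fails at p = 1/2
```

Since min{1, 2p} touches 1 at an interior point of (0, 1), no exponent n can rescue it — which is exactly why the acceptance probability min{1, Kp} from Motivation I is problematic.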
Nacu-Peres Theorem - Bernstein polynomial approach

- There exists an algorithm which simulates f ⇐⇒ there exist polynomials

    g_n(x, y) = Σ_{k=0}^n C(n,k) a(n,k) x^k y^{n-k},   h_n(x, y) = Σ_{k=0}^n C(n,k) b(n,k) x^k y^{n-k}

  such that
  - 0 ≤ a(n, k) ≤ b(n, k) ≤ 1
  - C(n,k) a(n,k) and C(n,k) b(n,k) are integers
  - lim_{n→∞} g_n(p, 1 - p) = f(p) = lim_{n→∞} h_n(p, 1 - p)
  - for all m < n

      a(n, k) ≥ Σ_{i=0}^k [ C(n-m, k-i) C(m, i) / C(n, k) ] a(m, i),
      b(n, k) ≤ Σ_{i=0}^k [ C(n-m, k-i) C(m, i) / C(n, k) ] b(m, i).        (1)

  (here C(n, k) denotes the binomial coefficient)
- Nacu & Peres provide the coefficients for f(p) = min{1 - ε, 2p} explicitly.
- Given an algorithm for f(p) = min{1 - ε, 2p}, Nacu & Peres develop a calculus that reduces simulating any real analytic g to nesting the algorithm for f.
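The role of the Bernstein basis can be illustrated with the simpler choice a(n, k) = f(k/n) (my own sketch — this is the classical Bernstein approximation, not the actual monotone Nacu-Peres envelopes): toss n p-coins, let k be the number of heads, and output 1 with probability f(k/n); the output is then a Σ_k C(n,k) f(k/n) p^k (1 − p)^{n−k}-coin, which converges to f(p) as n grows:

```python
from math import comb

def bernstein(f, n, p):
    """Evaluate B_n f(p) = sum_k C(n,k) f(k/n) p^k (1-p)^(n-k)."""
    return sum(comb(n, k) * f(k / n) * p ** k * (1 - p) ** (n - k)
               for k in range(n + 1))

def f(p):                        # the target min{1 - eps, 2p} with eps = 0.05
    return min(0.95, 2 * p)

errs = []
for n in (10, 100, 1000):
    err = max(abs(bernstein(f, n, p) - f(p)) for p in [i / 50 for i in range(1, 50)])
    errs.append(err)
    print(n, err)                # the maximal grid error shrinks as n grows
```

The extra conditions of the theorem are what make this usable as an exact algorithm: integrality lets the coefficients be realised by discrete randomisation, and the consistency inequalities in (1) let the upper and lower envelopes be refined across n without ever contradicting an earlier decision.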
Summary of theoretical results
- Nacu and Peres show that the random running time of their algorithm has exponentially decaying tails for every real analytic function f.
- There are further interesting theoretical results relating the smoothness of f to the existence of Bernoulli Factory algorithms with a given running time (see O. Holtz, F. Nazarov, Y. Peres, 2011).
- Other results (E. Mossel, Y. Peres, C. Hillar, 2005) concern constructing a Bernoulli Factory for f rational over Q by a finite automaton.
Bernstein polynomial approach - too nice to be true?

- at time n the N-P algorithm computes sets $A_n$ and $B_n$; $A_n$ and $B_n$ are subsets of all 0-1 strings of length n
- the cardinalities of $A_n$ and $B_n$ are precisely $\binom{n}{k} a(n,k)$ and $\binom{n}{k} b(n,k)$
- the upper polynomial approximation converges slowly to f
- the required length of 0-1 strings is $2^{15} = 32768$ and above, e.g. $2^{25}$
- one has to deal efficiently with the set of $2^{2^{25}}$ strings, of length $2^{25}$ each.
- we shall develop a reverse time martingale approach to the problem
- we will construct reverse time super- and submartingales that perform a random walk on the Nacu-Peres polynomial coefficients $a(n,k)$, $b(n,k)$, and result in a black box with algorithmic cost linear in the number of original p-coins
Reverse time martingale approach to sampling
- Reverse time martingale approach to sampling events of unknown probability (KL, I. Kosmidis, O. Papaspiliopoulos, G.O. Roberts, RSA 2011)
- We shall progress gradually from a simple to a general algorithm for sampling events of unknown probability constructively
- s is the unknown "target" probability ("s = f(p)")
- It is determined uniquely but cannot be computed, and increasing knowledge/precision about s is algorithmically expensive.
Algorithm 0 - randomization
- Lemma: Sampling events of probability $s \in [0,1]$ is equivalent to constructing an unbiased estimator of s taking values in $[0,1]$ with probability 1.
- Proof: Let S, s.t. $\mathbb{E}S = s$ and $P(S \in [0,1]) = 1$, be the estimator. Then draw $G_0 \sim U(0,1)$, obtain S and define a coin $C_s := \mathbb{I}\{G_0 \le S\}$:

$$P(C_s = 1) = \mathbb{E}\,\mathbb{I}(G_0 \le S) = \mathbb{E}\big(\mathbb{E}(\mathbb{I}(G_0 \le S) \mid S)\big) = \mathbb{E}S = s.$$

  The converse is straightforward, since an s-coin is an unbiased estimator of s with values in $[0,1]$.
- Algorithm 0
  1. simulate $G_0 \sim U(0,1)$;
  2. obtain S;
  3. if $G_0 \le S$ set $C_s := 1$, otherwise set $C_s := 0$;
  4. output $C_s$.
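Algorithm 0 is a one-liner in code. A minimal sketch, where `sample_S` is a hypothetical stand-in for whatever unbiased $[0,1]$-valued estimator of s is available:

```python
import random

def algorithm_0(sample_S):
    """Algorithm 0: turn one draw of an unbiased [0,1]-valued
    estimator S of s into an exact s-coin."""
    g0 = random.uniform(0.0, 1.0)   # 1. simulate G0 ~ U(0,1)
    s_hat = sample_S()              # 2. obtain S
    return 1 if g0 <= s_hat else 0  # 3.-4. Cs = I{G0 <= S}

# toy check: S ~ U(0.1, 0.5) has E[S] = 0.3, so P(Cs = 1) = 0.3
coin = lambda: algorithm_0(lambda: random.uniform(0.1, 0.5))
```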
Algorithm 1 - monotone deterministic bounds

- let $l_1, l_2, \ldots$ and $u_1, u_2, \ldots$ be sequences of monotone lower and upper bounds for s converging to s, i.e. $l_i \nearrow s$ and $u_i \searrow s$.
- Algorithm 1
  1. simulate $G_0 \sim U(0,1)$; set $n = 1$;
  2. compute $l_n$ and $u_n$;
  3. if $G_0 \le l_n$ set $C_s := 1$;
  4. if $G_0 > u_n$ set $C_s := 0$;
  5. if $l_n < G_0 \le u_n$ set $n := n + 1$ and GOTO 2;
  6. output $C_s$.
- Remark: $P(N > n) = u_n - l_n$.
- If $\{C_{l_n}\}_{n\ge1}$ and $\{C_{u_n}\}_{n\ge1}$ are sequences of coins s.t. $P(C_{l_n} = 1) = l_n$ and $P(C_{u_n} = 1) = u_n$ respectively,
- then Algorithm 1 corresponds to a coupling of $\{C_{l_n}\}_{n\ge1}$ and $\{C_{u_n}\}_{n\ge1}$ s.t. $C_{l_n} = C_{u_n}$ for all $n \ge N$, where N is the random number of iterations needed.
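A sketch of Algorithm 1 with a toy target $s = 1/3$, approached through the partial sums of $\sum_{k\ge1} 4^{-k}$. The bound functions below are illustrative choices, not from the talk: the upper bound pads the partial sum with $4^{-n}/2$, which dominates the true tail $4^{-n}/3$ and still decreases to s.

```python
import random

def algorithm_1(lower, upper):
    """Algorithm 1: sample an s-coin from monotone deterministic
    bounds lower(n) increasing to s and upper(n) decreasing to s."""
    g0 = random.uniform(0.0, 1.0)       # 1. G0 ~ U(0,1)
    n = 1
    while True:
        ln, un = lower(n), upper(n)     # 2. compute l_n, u_n
        if g0 <= ln:
            return 1                    # 3. G0 <= l_n  =>  Cs = 1
        if g0 > un:
            return 0                    # 4. G0 > u_n   =>  Cs = 0
        n += 1                          # 5. l_n < G0 <= u_n: refine

# toy target s = 1/3 = sum_{k>=1} 4^{-k}
lower = lambda n: (1 - 4.0 ** -n) / 3                   # partial sums, l_n -> s from below
upper = lambda n: (1 - 4.0 ** -n) / 3 + 4.0 ** -n / 2   # padded tail, u_n -> s from above
coin = lambda: algorithm_1(lower, upper)
```

Here $u_n - l_n = 4^{-n}/2$, so by the remark above $P(N > n)$ decays geometrically and the loop terminates quickly.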
Algorithm 2 - monotone stochastic bounds

▸ Now let {Ln}n≥1 and {Un}n≥1 be random bounds satisfying, almost surely,
  Ln ≤ Un,
  Ln ∈ [0, 1] and Un ∈ [0, 1],
  Ln−1 ≤ Ln and Un−1 ≥ Un,
  E Ln = ln ↑ s and E Un = un ↓ s.
▸ Let F0 = {∅, Ω}, Fn = σ(Ln, Un) and Fk,n = σ(Fk, Fk+1, ..., Fn) for k ≤ n.
▸ Algorithm 2
  1. simulate G0 ∼ U(0, 1); set n = 1;
  2. obtain Ln and Un conditionally on F1,n−1;
  3. if G0 ≤ Ln set Cs := 1;
  4. if G0 > Un set Cs := 0;
  5. if Ln < G0 ≤ Un set n := n + 1 and GOTO 2;
  6. output Cs.
▸ Thm: In the above algorithm, E Cs = s.
Algorithm 3 - reverse time martingales

▸ Recall the conditions on the stochastic bounds:
  Ln ≤ Un, (2)
  Ln ∈ [0, 1] and Un ∈ [0, 1], (3)
  Ln−1 ≤ Ln and Un−1 ≥ Un, (4)
  E Ln = ln ↑ s and E Un = un ↓ s, (5)
  with F0 = {∅, Ω}, Fn = σ(Ln, Un) and Fk,n = σ(Fk, Fk+1, ..., Fn) for k ≤ n.
▸ The final step is to weaken condition (4) and let Ln be a reverse time supermartingale and Un a reverse time submartingale with respect to Fn,∞. Precisely, assume that for every n = 1, 2, ... we have
  E (Ln−1 | Fn,∞) = E (Ln−1 | Fn) ≤ Ln a.s. and (6)
  E (Un−1 | Fn,∞) = E (Un−1 | Fn) ≥ Un a.s. (7)
Algorithm 3 - reverse time martingales

▸ Algorithm 3
  1. simulate G0 ∼ U(0, 1); set n = 1; set L̃0 ≡ L0 ≡ 0 and Ũ0 ≡ U0 ≡ 1;
  2. obtain Ln and Un given F0,n−1;
  3. compute L∗n = E (Ln−1 | Fn) and U∗n = E (Un−1 | Fn);
  4. compute
     L̃n = L̃n−1 + [(Ln − L∗n) / (U∗n − L∗n)] (Ũn−1 − L̃n−1),
     Ũn = Ũn−1 − [(U∗n − Un) / (U∗n − L∗n)] (Ũn−1 − L̃n−1);
  5. if G0 ≤ L̃n set Cs := 1;
  6. if G0 > Ũn set Cs := 0;
  7. if L̃n < G0 ≤ Ũn set n := n + 1 and GOTO 2;
  8. output Cs.
▸ The auxiliary bounds L̃n and Ũn satisfy the assumptions of Algorithm 2.
Application to the Bernoulli Factory problem

▸ Let X1, X2, ... be iid tosses of a p-coin.
▸ Define {Ln, Un}n≥1 as follows: if ∑_{i=1}^{n} Xi = k, let Ln = a(n, k) and Un = b(n, k).
▸ Verify the assumptions of Algorithm 3.
▸ Here {Ln, Un}n≥1 are random walks on the coefficients of the Nacu-Peres polynomials, with dynamics driven by the original p-coins.
Application to the Bernoulli Factory problem

▸ The reverse time martingale approach is the first constructive and practical implementation of a general Bernoulli Factory.
▸ In particular, the Nacu-Peres polynomials can be utilised for f(p) = min{1 − ε, Kp}, yielding a practical algorithm for the Metropolis accept-reject step in the discussed scenarios (and many others; see e.g. work by R. Herbei and M. Berliner).
▸ J. Flegal and R. Herbei use the reverse martingale approach to implement the Markov chain perfect sampling algorithm discussed above.
Application to Metropolis-Hastings

▸ Recall that in MH (say with proposal Q) we needed a Bernoulli Factory for f(p) = min{1, Kp}.
▸ f(p) = min{1, Kp} is impossible; f(p) = min{1 − ε, Kp} is possible.
▸ So consider a lazy version of the chain,
  εI + (1 − ε)P.
▸ It turns out this is an accept-reject algorithm with proposal Q and acceptance probability
  min{1 − ε, (1 − ε)Kp}.
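The laziness trick rests on the identity min{1 − ε, (1 − ε)Kp} = (1 − ε) · min{1, Kp}: staying put with probability ε and otherwise running Metropolis is the same as accepting with the factory-friendly rate. A quick numerical check of this algebra (illustrative, with randomly drawn ε, K, p):

```python
import random

# Verify min{1-eps, (1-eps)Kp} = (1-eps) * min{1, Kp} over random inputs.
random.seed(1)
for _ in range(1000):
    eps = random.random()
    K = random.uniform(0.0, 5.0)
    p = random.random()
    lazy_rate = (1 - eps) * min(1.0, K * p)      # accept prob of eps*I + (1-eps)*P
    factory_rate = min(1 - eps, (1 - eps) * K * p)  # form the factory can simulate
    assert abs(lazy_rate - factory_rate) < 1e-12
print("identity verified")
```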
Barkers Algorithm

▸ Recall the Metropolis algorithm: to sample from π we propose from q(x, y) and accept with probability
  αM(x, y) = min{1, π(y)q(y, x) / (π(x)q(x, y))} =: 1 ∧ R(x, y),
▸ in order to satisfy detailed balance π(x)P(x, y) = π(y)P(y, x).
▸ Other choices of the acceptance function can yield detailed balance too! Any acceptance rate of the form g(R(x, y)) will do, if
  g(R) = R g(1/R).
▸ The Barkers acceptance rate is
  αB(x, y) = π(y)q(y, x) / (π(y)q(y, x) + π(x)q(x, y)),
  so for Barkers g(R) = R / (1 + R).
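Since g(R) = R/(1 + R) satisfies g(R) = R g(1/R), the Barker chain is π-reversible. A small sketch on a hypothetical 3-state target (the weights are illustrative, not from the slides) that builds the Barker transition matrix and checks detailed balance:

```python
# Detailed-balance check for the Barker acceptance on a toy 3-state target.
pi = [0.5, 0.3, 0.2]          # target weights (unnormalised weights also work)
n = len(pi)
q = 1.0 / (n - 1)             # symmetric proposal: pick one of the other states

# Barker transition matrix: P(x,y) = q * pi[y] / (pi[x] + pi[y]) for y != x.
P = [[0.0] * n for _ in range(n)]
for x in range(n):
    for y in range(n):
        if y != x:
            P[x][y] = q * pi[y] / (pi[x] + pi[y])
    P[x][x] = 1.0 - sum(P[x])  # rejection mass stays put

# pi(x) q pi(y)/(pi(x)+pi(y)) is symmetric in (x, y), hence detailed balance:
for x in range(n):
    for y in range(n):
        assert abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < 1e-12
print("pi(x) P(x,y) = pi(y) P(y,x) holds for the Barker chain")
```

The same check fails for a generic acceptance function that violates g(R) = R g(1/R).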
Barkers Algorithm - efficiency

▸ The Metropolis acceptance function is optimal with respect to the Peskun ordering.
▸ Suppose we estimate πf := ∫ f(x)π(dx) by π̂f := (1/n) ∑_{i=1}^{n} f(Xi).
▸ Then, under mild assumptions, the Markov chain CLT holds:
  √n (π̂f − πf) → N(0, σ²as(f, P)).
▸ By the Peskun ordering,
  σ²as(f, PBarker) ≥ σ²as(f, PMetropolis).
▸ However:
  σ²as(f, PBarker) ≤ 2σ²as(f, PMetropolis) + σ²as(f, π)
  (KL, GO Roberts, 2013).
▸ So Barkers is not that much worse than Metropolis!
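Both inequalities can be seen exactly on a toy two-state chain (my illustrative numbers, not from the slides): with a reversible two-state kernel every mean-zero f is a multiple of the second eigenfunction, so σ²as(f, P) = Var_π(f) · (1 + λ)/(1 − λ), where λ is the second eigenvalue.

```python
# Two-state sanity check of the Peskun ordering and the 2013 bound.
def asym_var(p01, p10, var_pi):
    """sigma^2_as(f, P) for a reversible 2-state chain with flip probs p01, p10.

    Second eigenvalue lam = 1 - p01 - p10; every mean-zero f spans its
    eigenspace, so sigma^2_as = Var_pi(f) * (1 + lam) / (1 - lam).
    """
    lam = 1.0 - p01 - p10
    return var_pi * (1.0 + lam) / (1.0 - lam)

pi0, pi1 = 0.7, 0.3            # target on {0, 1}
var_pi = pi0 * pi1             # Var_pi(f) for f = indicator of state 1

# Deterministic "flip" proposal; Metropolis vs Barker acceptance rates:
metro = asym_var(min(1.0, pi1 / pi0), min(1.0, pi0 / pi1), var_pi)
barker = asym_var(pi1 / (pi0 + pi1), pi0 / (pi0 + pi1), var_pi)

print(metro, barker)                      # 0.084 vs 0.21 here
assert metro <= barker                    # Peskun: Metropolis is optimal
assert barker <= 2.0 * metro + var_pi     # the bound from the slide
```

Here Barker is 2.5× worse than Metropolis, comfortably inside the bound 2σ²M + Var_π(f) = 0.378.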
![Page 110: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/110.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
Barkers AlgorithmThe two coin algorithmThe s-poly-BarkersThe Dice Enterprise!
Barkers Algorithm - efficiency
I The Metropolis acceptance function is optimal with respect to Peskunordering (explain)
I Suppose we estimate πf :=∫
f (x)π(dx) by πf := 1n
∑ni=1 f (Xi)
I Then, under mild assumptions the Markov chain CLT holds:√
n(πf − πf ) → N(0, σas(f ,P)).
I By Peskun ordering
σas(f ,PBarker) ≥ σas(f ,PMetropolis)
I However:σ2
as(f ,PBarker) ≤ 2σ2as(f ,PMetropolis) + σ2
as(f , π)
(KL, GO Roberts, 2013)I So Barkers is not that much worse than Metropolis!
recall: The Bernoulli Factory for Metropolis-Hastings

- In the intractable likelihood setting the Metropolis-Hastings acceptance rate takes the form
$$f(p_1, p_2) = 1 \wedge \frac{c_1 p_1}{c_2 p_2}$$
and can usually be rewritten as
$$f(p_3) = 1 \wedge c_3 p_3, \qquad \text{and then} \qquad f(p_3) = (1 - \varepsilon) \wedge (1 - \varepsilon)\, c_3 p_3.$$
- But this is still a difficult Bernoulli Factory problem, not suitable for many applications.
Barkers and the Bernoulli Factory

- In the scenarios where we need a Bernoulli Factory to execute the Metropolis acceptance rate, we can typically also write the Barkers acceptance rate in the form
$$\alpha_B(x, y) = \frac{Kq}{Mp + Kq},$$
- where $K$ and $M$ are known constants and $p$ and $q$ are probabilities that we can sample.
- Obtaining an event of probability $\alpha_B(x, y)$ may be more efficient via the following algorithm:
The two coin algorithm

- Assume there is a black box generating $p$-coins and another black box generating $q$-coins.
- Assume $p$ and $q$ are unknown and, for known $K, M$, we want to obtain an event of probability
$$\frac{Kq}{Mp + Kq} = \frac{\frac{K}{K+M}\, q}{\frac{M}{K+M}\, p + \frac{K}{K+M}\, q}.$$
- Two coin algorithm:
  (1) draw $C \sim \frac{K}{K+M}$-coin;
  (2) if $C = 1$, draw $X \sim q$-coin: if $X = 1$, output 1 and STOP; if $X = 0$, GOTO (1);
  (3) if $C = 0$, draw $X \sim p$-coin: if $X = 1$, output 0 and STOP; if $X = 0$, GOTO (1).
- The number of iterations $N$ needed by the algorithm for a single output has a geometric distribution with parameter $\frac{M}{K+M}\, p + \frac{K}{K+M}\, q$.
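The three steps above translate directly into a loop. A minimal sketch (function names are mine); the coins are given known biases here only so the output frequency can be checked against the target probability, while the algorithm itself only ever flips them:

```python
import random

def two_coin(p_coin, q_coin, K, M, rng=random):
    """One draw of the two coin algorithm: returns 1 with probability
    K*q / (M*p + K*q), using only flips of the p- and q-coins."""
    c1 = K / (K + M)
    while True:
        if rng.random() < c1:    # step (1): C ~ K/(K+M)-coin came up 1
            if q_coin():         # step (2): flip a q-coin
                return 1         # X = 1 -> output 1 and stop
        else:
            if p_coin():         # step (3): flip a p-coin
                return 0         # X = 1 -> output 0 and stop
        # X = 0 in either branch: back to step (1)

# Check against the target probability with hypothetical biases.
p, q, K, M = 0.3, 0.6, 2.0, 1.0
rng = random.Random(1)
draws = [two_coin(lambda: rng.random() < p,
                  lambda: rng.random() < q, K, M, rng)
         for _ in range(200_000)]
target = K * q / (M * p + K * q)  # = 0.8
assert abs(sum(draws) / len(draws) - target) < 0.01
```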
The s-poly-Barkers acceptance rate

- Recall Barkers:
$$\alpha_B = g(R) = \frac{R}{1 + R}, \qquad \text{where } R(x, y) = \frac{\pi(y)\, q(y, x)}{\pi(x)\, q(x, y)}.$$
- Consider [D Vats, GO Roberts, KL, 2018]:
$$\alpha_{spB} = g(R) = \frac{\sum_{i=0}^{s} R^i - 1}{\sum_{i=0}^{s} R^i}.$$
- $\alpha_{spB}(x, y) \to \alpha_{MH}(x, y)$ as $s \to \infty$.
- The asymptotic variances satisfy
$$\sigma_{MH}(f) \le \sigma_{spB}(f) \le \frac{s+1}{s}\, \sigma_{MH}(f) + \frac{1}{s}\, \sigma_\pi(f).$$
- We can extend the two coin algorithm to s-poly-Barkers.
- We can do even better.
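A quick numerical sketch (mine, not from the talk) of the two limiting claims: at $s = 1$ the function $\alpha_{spB}$ reduces to Barker's rule $R/(1+R)$, and as $s$ grows it converges, from below, to the Metropolis-Hastings rule $1 \wedge R$:

```python
import numpy as np

def alpha_mh(R):
    return min(1.0, R)

def alpha_spB(R, s):
    # (sum_{i=0}^s R^i - 1) / (sum_{i=0}^s R^i)
    total = float(np.sum(R ** np.arange(s + 1)))
    return (total - 1.0) / total

# s = 1 recovers Barker's rule R / (1 + R):
assert abs(alpha_spB(3.0, 1) - 3.0 / 4.0) < 1e-12
# larger s approaches Metropolis-Hastings, from below:
for R in (0.2, 0.9, 1.5, 4.0):
    assert alpha_spB(R, 200) <= alpha_mh(R) + 1e-12
    assert abs(alpha_spB(R, 200) - alpha_mh(R)) < 1e-6
```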
The Dice Enterprise [KL, G Molina, A Wendland, 2018]

- $\Delta_m = \big\{ p = (p_1, \ldots, p_m) \in (0, 1)^m : \sum_{i=1}^{m} p_i = 1 \big\}$.
- $f : \Delta_m \to \Delta_v$ a rational function. We have the mapping $p \to f(p)$.
- The strategy:
  - Design a Markov chain that admits $f(p)$ as its stationary distribution;
  - using samples from $p$ is enough to simulate the dynamics of the Markov chain;
  - apply a Markov chain perfect sampling algorithm to sample from the stationary distribution exactly, e.g. Coupling From the Past (CFTP) - Propp and Wilson 1995.
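The perfect-sampling step can be illustrated on its own. A minimal monotone CFTP sketch in the spirit of Propp and Wilson (the toy random-walk chain and its uniform stationary law are my choice, picked so the output is easy to check; this is not the Dice Enterprise construction itself):

```python
import random

def update(x, u, n):
    # Monotone random-walk update on {0, ..., n}: the kernel is doubly
    # stochastic, so its stationary distribution is uniform.
    if u < 1/3:
        return max(x - 1, 0)
    if u < 2/3:
        return min(x + 1, n)
    return x

def cftp(n, rng):
    """Propp-Wilson coupling from the past: an exact stationary draw."""
    us = []          # us[k] drives the step from time -(k+1) to time -k
    T = 1
    while True:
        while len(us) < T:
            us.append(rng.random())   # fresh randomness for earlier times
        lo, hi = 0, n                 # bottom and top chains, started at -T
        for k in range(T - 1, -1, -1):
            lo = update(lo, us[k], n)
            hi = update(hi, us[k], n)
        if lo == hi:                  # coalesced: exact stationary sample
            return lo
        T *= 2                        # not coalesced: restart further back

rng = random.Random(0)
samples = [cftp(4, rng) for _ in range(20_000)]
# Uniform on {0, ..., 4}: each state should appear with frequency ~0.2.
for s in range(5):
    assert abs(samples.count(s) / len(samples) - 0.2) < 0.02
```

Monotonicity of `update` is what lets the bottom and top chains sandwich every other trajectory, so checking them alone certifies coalescence.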
2 Examples

- ∆m = { p = (p1, . . . , pm) ∈ (0, 1)^m : ∑_{i=1}^{m} pi = 1 }
- f : ∆m → ∆v a rational function. We have the mapping p → f (p).
- Design a Markov chain that admits f (p) as its stationary distribution; using samples from p is enough to simulate the dynamics of the Markov chain.
- Example 1 (m = v = 3):
  f (p) = ( p1^30, p2^50, p3^40 ) / ( p1^30 + p2^50 + p3^40 )
- Example 2 (m = 3, v = 2):
  f (p) = ( p1^30, (p2 − p3)^50 ) / ( p1^30 + (p2 − p3)^50 )
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
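As a quick sanity check, Example 1 does map a point of ∆3 to a point of ∆3. The exponents 30, 50, 40 are taken from the slide; normalising by the sum of the numerators is my reading of the formula, not something the slide states explicitly:

```python
def f(p):
    """Example 1: a rational map from the simplex ∆3 to itself.
    (Exponents from the slide; the normalisation is an assumption.)"""
    p1, p2, p3 = p
    num = [p1**30, p2**50, p3**40]
    denom = sum(num)
    return [x / denom for x in num]

q = f([0.5, 0.3, 0.2])
print(q, sum(q))  # components are positive and sum to 1
```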
![Page 159: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/159.jpg)
Disaggregation
- We will be doing a lot of this:
- f : (1/3, 1/2, 1/6)   ↔   π : (1/5, 1/20, 1/5, 1/4, 1/6, 2/15)
- Given a rational function f : ∆m → ∆v, we will construct a new discrete probability distribution π : ∆m → ∆k, k > v, called a ladder, such that a sample from f (p) can be transformed into a sample from π(p) and vice-versa.
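A minimal sketch of the back-and-forth between the two distributions on the slide. The particular grouping of the six π-masses into the three f-masses is my reconstruction (1/3 = 1/5 + 2/15, 1/2 = 1/20 + 1/5 + 1/4, 1/6 = 1/6); the slide only lists the masses:

```python
import random

f_probs  = [1/3, 1/2, 1/6]
pi_probs = [1/5, 1/20, 1/5, 1/4, 1/6, 2/15]
# Assumed grouping of the six pi-atoms into the three f-atoms:
# 1/3 = 1/5 + 2/15, 1/2 = 1/20 + 1/5 + 1/4, 1/6 = 1/6.
groups = [[0, 5], [1, 2, 3], [4]]

def pi_to_f(j):
    """Collapse a pi-sample to an f-sample: look up which group it belongs to."""
    return next(i for i, g in enumerate(groups) if j in g)

def f_to_pi(i, rng):
    """Refine an f-sample to a pi-sample: draw within the group with
    probabilities proportional to the pi-masses."""
    g = groups[i]
    return rng.choices(g, weights=[pi_probs[j] for j in g], k=1)[0]

rng = random.Random(1)
j = f_to_pi(1, rng)      # refine an f-sample of atom i = 1 (mass 1/2) ...
print(j, pi_to_f(j))     # ... collapsing it recovers i = 1
```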
![Page 162: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/162.jpg)
Multivariate ladder over R

- Let p = (p1, . . . , pm) and π(p) = (π1(p), . . . , πk(p)) be a probability distribution on {1, . . . , k} for every p ∈ ∆m. We say that π(p) is a ladder over R if every πi is of the form

  πi(p) = Ri ∏_{j=1}^{m} pj^{n_{i,j}} / C(p)        (8)

  where
  - C(p) is a polynomial with real coefficients that does not admit any root in ∆m;
  - for all i, j, Ri is a strictly positive real constant and n_{i,j} ∈ N≥0;
  - denoting n_i = (n_{i,1}, n_{i,2}, . . . , n_{i,m}), there exists an integer d such that ‖n_i‖1 = d for all i, where the 1-norm of a vector a = (a1, . . . , an) is ‖a‖1 = ∑_{j=1}^{n} |aj|. We will refer to n_i as the degree of πi(p) and to d as the degree of π(p).
- Moreover, we say that π(p) is a connected ladder if
  - for each i, j ∈ {1, . . . , k}, states i and j are connected, meaning that there exists a sequence (n^(1) = n_i, n^(2), . . . , n^(t−1), n^(t) = n_j) such that ‖n^(h) − n^(h−1)‖1 ≤ 2 for all h ∈ {2, . . . , t}.
- Finally, we say that π(p) is a fine ladder if
  - n_i = n_j implies i = j.
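Both properties are easy to check mechanically. A small sketch (my own helper, not from the talk) representing a ladder by its list of exponent vectors n_i:

```python
from collections import deque

def is_fine(ns):
    """Fine: all exponent vectors are distinct."""
    return len(set(ns)) == len(ns)

def is_connected(ns):
    """Connected: the graph joining states whose exponent vectors differ
    by at most 2 in 1-norm has a single component (checked by BFS)."""
    k = len(ns)
    adj = [[j for j in range(k) if j != i
            and sum(abs(a - b) for a, b in zip(ns[i], ns[j])) <= 2]
           for i in range(k)]
    seen, queue = {0}, deque([0])
    while queue:
        i = queue.popleft()
        for j in adj[i]:
            if j not in seen:
                seen.add(j)
                queue.append(j)
    return len(seen) == k

# The degree-2 ladder over ∆3 with monomials p1^2, p1p2, p1p3, p2^2, p3^2:
ns = [(2, 0, 0), (1, 1, 0), (1, 0, 1), (0, 2, 0), (0, 0, 2)]
print(is_fine(ns), is_connected(ns))  # → True True
```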
![Page 163: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/163.jpg)
Fine and connected ladder π : ∆3 → ∆5
[Diagram: a graph on 5 states with probabilities R1 p1^2/C(p), R2 p1p2/C(p), R3 p1p3/C(p), R4 p2^2/C(p), R5 p3^2/C(p); the monomial p2p3 carries coefficient 0 and is absent from the ladder.]
![Page 164: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/164.jpg)
Fine, but not connected ladder π : ∆3 → ∆4
[Diagram: a graph on 4 states with probabilities R1 p1^2/C(p), R2 p1p2/C(p), R3 p2^2/C(p), R4 p3^2/C(p); the monomials p1p3 and p2p3 carry coefficient 0, so the state p3^2 cannot be reached in steps of 1-norm distance at most 2.]
![Page 165: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/165.jpg)
Connected, but not fine ladder π : ∆3 → ∆6
[Diagram: a graph on 6 states with probabilities R1 p1^2/C(p), R2 p1p2/C(p), R3 p1p3/C(p), R4 p1p3/C(p), R5 p2^2/C(p), R6 p3^2/C(p); the monomial p2p3 carries coefficient 0, and p1p3 appears twice, so the ladder is not fine.]
![Page 166: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/166.jpg)
Main Theorem
- Let f : ∆m → ∆v be a probability distribution such that every fi(p) is a rational function with real coefficients. Then one can explicitly construct a fine and connected ladder π : ∆m → ∆k such that sampling from π is equivalent to sampling from f .
- The construction utilizes the following theorem of Pólya:
  - Let g : ∆m → R be a homogeneous polynomial in the variables p1, . . . , pm (i.e. all monomials have the same degree) that is positive on ∆m. Then, for all sufficiently large n, all the coefficients of (p1 + . . . + pm)^n g(p1, . . . , pm) are positive.
- plus some trickery
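Pólya's theorem can be watched in action on a toy polynomial of my own choosing: g(p1, p2) = p1^2 − p1p2 + p2^2 has a negative coefficient but is positive on the simplex, and multiplying by powers of (p1 + p2) eventually makes every coefficient positive:

```python
from collections import defaultdict

def poly_mul(a, b):
    """Multiply two polynomials stored as {exponent tuple: coefficient}."""
    out = defaultdict(float)
    for ea, ca in a.items():
        for eb, cb in b.items():
            out[tuple(x + y for x, y in zip(ea, eb))] += ca * cb
    return dict(out)

# g(p1, p2) = p1^2 - p1 p2 + p2^2: positive on the simplex, one negative coefficient.
g = {(2, 0): 1.0, (1, 1): -1.0, (0, 2): 1.0}
s = {(1, 0): 1.0, (0, 1): 1.0}            # p1 + p2

h, n = dict(g), 0
while not all(c > 0 for c in h.values()):
    h, n = poly_mul(h, s), n + 1
print(n)  # → 3: the smallest n making every coefficient of (p1+p2)^n g positive
```

(For n = 1 and n = 2 some mixed coefficients are exactly zero, which is why the strict inequality first holds at n = 3.)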
![Page 170: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/170.jpg)
The Markov chain
[Diagram: the degree-3 ladder over ∆3 with k = 10 states
R1 p1^3/C(p), R2 p1^2 p2/C(p), R3 p1^2 p3/C(p), R4 p1 p2^2/C(p), R5 p1 p2 p3/C(p), R6 p1 p3^2/C(p), R7 p2^3/C(p), R8 p2^2 p3/C(p), R9 p2 p3^2/C(p), R10 p3^3/C(p),
arranged as a triangular array with edges between neighbouring states.]
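The ten states are exactly the exponent vectors of degree d = 3 in m = 3 variables; a short enumeration (my own check, not from the talk) confirms the count:

```python
from itertools import product

def degree_vectors(m, d):
    """All exponent vectors n = (n_1, ..., n_m) with nonnegative entries summing to d."""
    return [n for n in product(range(d + 1), repeat=m) if sum(n) == d]

vecs = degree_vectors(3, 3)
print(len(vecs))  # → 10, i.e. binomial(3 + 3 - 1, 3)
```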
![Page 171: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/171.jpg)
The Markov chain
[Diagram: from the current state πi, moves to neighbouring states are proposed using draws of p1, p2 and p3.]

The second term of the dynamics accounts for the Ri's, in such a way that the chain is optimal in the Peskun ordering within the class of chains whose dynamics operate with the same neighbourhood structure and use the same p-driven proposals.
![Page 172: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/172.jpg)
From coins to dice: monotone CFTP
[Diagram: a birth-death ladder on five states with probabilities
R1 (1−p)^4/C(p), R2 p(1−p)^3/C(p), R3 p^2(1−p)^2/C(p), R4 p^3(1−p)/C(p), R5 p^4/C(p);
arrows labelled P1,1, P1,2, P2,1, P2,2, P2,3, P3,2, P3,3, P3,4, P4,3, P4,4, P4,5, P5,4, P5,5 mark the transition probabilities between neighbouring states and the holding probabilities.]
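For a birth-death chain like the one above, monotone coupling from the past gives perfect samples: run coupled chains from the top and bottom states from further and further in the past, reusing the same randomness, until they coalesce. A generic sketch with placeholder transition probabilities (the `up`/`down` values below are illustrative, not the P_{i,j} of the talk):

```python
import random

# Placeholder birth-death transition probabilities on states 0..4:
# up[x] = P(x -> x+1), down[x] = P(x -> x-1).
up   = [0.5, 0.4, 0.3, 0.2, 0.0]
down = [0.0, 0.2, 0.3, 0.4, 0.5]

def step(x, u):
    """Monotone update: the same uniform u drives every copy of the chain."""
    if u < up[x]:
        return x + 1
    if u > 1.0 - down[x]:
        return x - 1
    return x

def cftp(rng):
    """Coupling from the past: extend the randomness backwards in time,
    doubling the horizon, until the chains started at 0 and 4 coalesce."""
    u, T = {}, 1
    while True:
        for t in range(-T, 0):
            if t not in u:
                u[t] = rng.random()
        lo, hi = 0, 4
        for t in range(-T, 0):
            lo, hi = step(lo, u[t]), step(hi, u[t])
        if lo == hi:
            return lo
        T *= 2

rng = random.Random(0)
samples = [cftp(rng) for _ in range(10000)]
```

With these placeholder rates, detailed balance π_x up[x] = π_{x+1} down[x+1] gives unnormalised stationary weights (1, 2.5, 10/3, 2.5, 1), and the empirical frequencies of `samples` match them.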
![Page 200: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/200.jpg)
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
- V = {Vt, t ∈ [0, T]} follows dynamics described by the stochastic differential equation

  dVt = b(Vt, Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt,        (9)

  where
  - Y = {Yt, t ∈ [0, T]} is a continuous-time jump process on Y = {1, . . . , m}, m ∈ N ∪ {∞},
  - Bt is a Brownian motion,
  - θ ∈ Θ is an unknown parameter,
  - moreover L ∋ Λ = {λi,j} is the intensity matrix for the dynamics of Y.
- Denote by (Ω, F, P) the probability space, and assume that Bt and Yt are independent under P.
- We observe V = Vt at discrete time instances.
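The talk develops exact methods for this model; as a baseline intuition, the dynamics (9) can be simulated approximately by an Euler scheme, alternating jumps of the switching process Y with Gaussian increments for V. All concrete choices below (two regimes, drift b(v, y) = −θv, constant σ, regime-dependent γ) are illustrative assumptions, not the talk's model:

```python
import math
import random

def simulate(theta=1.0, T=1.0, dt=1e-3, seed=0):
    """Euler-Maruyama for dV = b(V,Y)dt + sigma(V)*gamma(Y) dB with a
    two-state switching process Y (illustrative rates, not from the talk)."""
    rng = random.Random(seed)
    lam = {1: 2.0, 2: 3.0}          # jump intensity of Y out of each state
    gamma = {1: 0.5, 2: 1.5}        # regime-dependent volatility multiplier
    b = lambda v, y: -theta * v     # mean-reverting drift (assumed)
    sigma = lambda v: 1.0           # constant diffusion coefficient (assumed)

    v, y, t = 0.0, 1, 0.0
    next_jump = rng.expovariate(lam[y])
    path = [v]
    while t < T:
        if next_jump <= t:          # switch regime at the jump time
            y = 2 if y == 1 else 1
            next_jump = t + rng.expovariate(lam[y])
        dB = rng.gauss(0.0, math.sqrt(dt))
        v += b(v, y) * dt + sigma(v) * gamma[y] * dB
        t += dt
        path.append(v)
    return path

path = simulate()
```

Unlike the exact algorithms of this section, the Euler scheme introduces discretisation bias; it is shown only to make the model concrete.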
![Page 203: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/203.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
I V = Vt, t ∈ [0,T] follows dynamics described by the stochastic differentialequation
dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt, (9)
whereI Y = Yt, t ∈ [0,T] is a continuous time jump process onY = 1, . . . ,m, m ∈ N ∪ ∞,
I Bt is the Brownian motionI θ ∈ Θ is an unknown parameterI moreover L 3 Λ = λi,j is the intensity matrix for the dynamics of Y
I denote by Ω, F , P the probability spaceI and assume Bt and Yt are independent under P.I we observe V = Vt at discrete time instances
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 204: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/204.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
I V = Vt, t ∈ [0,T] follows dynamics described by the stochastic differentialequation
dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt, (9)
whereI Y = Yt, t ∈ [0,T] is a continuous time jump process onY = 1, . . . ,m, m ∈ N ∪ ∞,
I Bt is the Brownian motionI θ ∈ Θ is an unknown parameterI moreover L 3 Λ = λi,j is the intensity matrix for the dynamics of Y
I denote by Ω, F , P the probability spaceI and assume Bt and Yt are independent under P.I we observe V = Vt at discrete time instances
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 205: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/205.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
I V = Vt, t ∈ [0,T] follows dynamics described by the stochastic differentialequation
dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt, (9)
whereI Y = Yt, t ∈ [0,T] is a continuous time jump process onY = 1, . . . ,m, m ∈ N ∪ ∞,
I Bt is the Brownian motionI θ ∈ Θ is an unknown parameterI moreover L 3 Λ = λi,j is the intensity matrix for the dynamics of Y
I denote by Ω, F , P the probability spaceI and assume Bt and Yt are independent under P.I we observe V = Vt at discrete time instances
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 206: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/206.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
I V = Vt, t ∈ [0,T] follows dynamics described by the stochastic differentialequation
dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt, (9)
whereI Y = Yt, t ∈ [0,T] is a continuous time jump process onY = 1, . . . ,m, m ∈ N ∪ ∞,
I Bt is the Brownian motionI θ ∈ Θ is an unknown parameterI moreover L 3 Λ = λi,j is the intensity matrix for the dynamics of Y
I denote by Ω, F , P the probability spaceI and assume Bt and Yt are independent under P.I we observe V = Vt at discrete time instances
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 207: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/207.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
The Markov switching diffusion model
I V = Vt, t ∈ [0,T] follows dynamics described by the stochastic differentialequation
dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBt, (9)
whereI Y = Yt, t ∈ [0,T] is a continuous time jump process onY = 1, . . . ,m, m ∈ N ∪ ∞,
I Bt is the Brownian motionI θ ∈ Θ is an unknown parameterI moreover L 3 Λ = λi,j is the intensity matrix for the dynamics of Y
I denote by Ω, F , P the probability spaceI and assume Bt and Yt are independent under P.I we observe V = Vt at discrete time instances
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 208: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/208.jpg)
exact Bayesian inference (L, Palczewski, Roberts)
- the diffusion V satisfies

  dVt = b(Vt, Yt, θ) dt + σ(Vt, θ) γ(Yt, θ) dBt,   t ∈ [0,T],

- let VD be the observed discrete data from V and VM the missing parts of the trajectory, i.e. V = (VD, VM),
- Bayesian setting: we assume prior distributions on the unknown parameters, θ ∼ πθ on Θ and Λ ∼ πΛ on L,
- the goal is to explore, via MCMC, the posterior distribution

  π(VM, Y, Λ, θ | VD) ∝ πθ(θ) πΛ(Λ) π(Y | Λ) π(VM, VD | Y, θ),   (10)

- note that the state space of the target distribution is infinite-dimensional,
- in particular VM is a continuous-time diffusion path,
- nevertheless, the limiting distribution of our MCMC algorithm is the exact full posterior (10).
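The π(Y | Λ) factor in (10) is fully tractable: for a piecewise-constant path Y it is a product of exponential holding-time terms and jump intensities. A small illustrative helper (the function name and path encoding are ours, not the talk's):

```python
import numpy as np

# Log-density of a continuous-time Markov chain path Y under intensity matrix
# Lam: the pi(Y | Lambda) factor of the posterior (10). The path is encoded as
# the states held on each interval and the jump times 0 < t_1 < ... < t_k <= T.
def log_ctmc_path_density(states, jump_times, T, Lam):
    times = [0.0] + list(jump_times) + [T]
    logp = 0.0
    for i, s in enumerate(states):
        hold = times[i + 1] - times[i]
        logp += Lam[s, s] * hold          # Lam[s, s] = -(total exit rate)
        if i + 1 < len(states):           # a jump occurs at times[i + 1]
            logp += np.log(Lam[s, states[i + 1]])
    return logp

Lam = np.array([[-1.0, 1.0], [2.0, -2.0]])
# path: state 0 on [0, 0.7), state 1 on [0.7, 2.0]
lp = log_ctmc_path_density([0, 1], [0.7], T=2.0, Lam=Lam)
```

The hard factor is π(VM, VD | Y, θ), the law of the diffusion path itself; handling it without discretization is what the construction below is for.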
properties of our exact MCMC algorithm
- the limiting distribution of our MCMC algorithm is the exact full posterior

  π(VM, Y, Λ, θ | VD) ∝ πθ(θ) πΛ(Λ) π(Y | Λ) π(VM, VD | Y, θ),

- we also avoid any discrete-time approximation of the diffusion V,
- we employ the Exact Algorithm methodology of Beskos et al. 06, Beskos & Roberts 04, Beskos et al. 05, Beskos et al. 08,
- we work with a random, finite-dimensional representation of VM and store it in computer memory while the simulation progresses,
- we can evaluate averages of any finite-dimensional functional of (VM, Y, Λ, θ) with Monte Carlo error only. In particular, the exact posterior distribution of any individual variable VM, Y, Λ or θ can be explored by marginalising the full posterior.
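In practice "marginalising the full posterior" just means taking ergodic averages over the stored draws of the coordinate of interest. A toy sketch, with placeholder draws standing in for real MCMC output:

```python
import numpy as np

# Placeholder MCMC output for theta (in a real run these would be the theta
# coordinates of the (V_M, Y, Lambda, theta) chain).
rng = np.random.default_rng(1)
theta_draws = rng.normal(2.0, 0.3, size=10_000)

# Posterior mean via ergodic average; batch means give a rough Monte Carlo
# standard error for that average.
post_mean = theta_draws.mean()
batches = theta_draws.reshape(100, 100).mean(axis=1)
mc_se = batches.std(ddof=1) / np.sqrt(len(batches))
```

The same pattern applies to any finite-dimensional functional, e.g. the value of the path VM at a fixed time point.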
Designing an exact MCMC algorithm

- recall the SDE for V: dVt = b(Vt, Yt, θ) dt + σ(Vt, θ) γ(Yt, θ) dBt,
- we aim at Gibbs sampling from the full posterior

  π(VM, Y, Λ, θ | VD) ∝ πθ(θ) πΛ(Λ) π(Y | Λ) π(VM, VD | Y, θ),

- problem: for different (Y, θ) the measures π(VM, VD | Y, θ) are mutually singular (quadratic variation issue), so a naive Gibbs sampler won't mix at all,
- finding a dominating measure of product form for π(VM, Y, Λ, θ | VD) is an essential step,
- we find a sequence of transformations of the diffusion path V and, correspondingly, of the diffusion equation for V,
- let ΩT = C[0,T] and Ω* = C[0,1],
- given fixed Y, θ, v0, vT we define a one-to-one transformation

  H_{Y,θ,v0,vT} : ΩT → Ω*,   (11)

  such that the law of H_{Y,θ,v0,vT}(V) is absolutely continuous with respect to the law of a Brownian bridge on Ω*.
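To illustrate the flavour of the transformation in (11), here is a sketch under the strong simplifying assumption that σ(v, θ) is a constant σ and γ ≡ 1; the actual H in the talk also depends on Y and θ and is more involved. The idea: rescale to unit diffusion coefficient, map time [0,T] to [0,1], and subtract the chord between the endpoints so the output starts and ends at 0, like a Brownian bridge.

```python
import numpy as np

# Simplified stand-in for H_{Y,theta,v0,vT}: constant sigma, gamma = 1.
# Input: a path V on an equally spaced grid over [0, T].
def H(V, T, sigma):
    X = V / sigma                          # unit-diffusion path on [0, T]
    s = np.linspace(0.0, 1.0, len(X))      # rescaled time grid in [0, 1]
    chord = (1.0 - s) * X[0] + s * X[-1]   # linear interpolation v0 -> vT
    return (X - chord) / np.sqrt(T)        # Brownian scaling to [0, 1]

# Example path: a random walk approximation of a diffusion on [0, T]
T = 4.0
t = np.linspace(0.0, T, 401)
rng = np.random.default_rng(2)
V = 1.0 + np.cumsum(np.r_[0.0, 0.5 * np.sqrt(np.diff(t))
                          * rng.standard_normal(400)])
Z = H(V, T, sigma=0.5)
```

Because Z is pinned to 0 at both ends, its law can be dominated by the Brownian bridge law on Ω*, which is exactly the product-form dominating-measure structure the Gibbs sampler needs.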
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 222: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/222.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
Designing an exact MCMC algorithmI Recall the SDE for V : dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBtI We aim at Gibbs sampling from the full posterior
π(VM,Y,Λ, θ |VD) ∝ πθ(θ)πΛ(Λ)π(Y |Λ)π(VM,VD | Y, θ)I Problem: for different (Y, θ) the measures π(VM,VD | Y, θ) are mutually
singular (quadratic variation issue). A naive Gibbs sampler won’t mix at all.I Finding a dominating measure of a product form for π(VM,Y,Λ, θ |VD) is an
essential stepI We find a sequence of transformations of the diffusion path V and,
respectively, of the diffusion equation for VI Let ΩT = C[0,T] and Ω∗ = C[0, 1] .I Given fixed Y, θ, v0, vT we define a 1-1 transformation
HY,θ,v0,vT : ΩT → Ω∗, (11)
such that the law of HY,θ,v0,vT (V) is absolutely continuous with respect to thelaw of a Brownian bridge on Ω∗
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 223: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/223.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
Designing an exact MCMC algorithmI Recall the SDE for V : dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBtI We aim at Gibbs sampling from the full posterior
π(VM,Y,Λ, θ |VD) ∝ πθ(θ)πΛ(Λ)π(Y |Λ)π(VM,VD | Y, θ)I Problem: for different (Y, θ) the measures π(VM,VD | Y, θ) are mutually
singular (quadratic variation issue). A naive Gibbs sampler won’t mix at all.I Finding a dominating measure of a product form for π(VM,Y,Λ, θ |VD) is an
essential stepI We find a sequence of transformations of the diffusion path V and,
respectively, of the diffusion equation for VI Let ΩT = C[0,T] and Ω∗ = C[0, 1] .I Given fixed Y, θ, v0, vT we define a 1-1 transformation
HY,θ,v0,vT : ΩT → Ω∗, (11)
such that the law of HY,θ,v0,vT (V) is absolutely continuous with respect to thelaw of a Brownian bridge on Ω∗
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 224: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/224.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
The model and inferenceDesigning an exact MCMC algorithm
Example: the SINE model
Designing an exact MCMC algorithmI Recall the SDE for V : dVt = b(Vt,Yt, θ)dt + σ(Vt, θ)γ(Yt, θ)dBtI We aim at Gibbs sampling from the full posterior
π(VM,Y,Λ, θ |VD) ∝ πθ(θ)πΛ(Λ)π(Y |Λ)π(VM,VD | Y, θ)I Problem: for different (Y, θ) the measures π(VM,VD | Y, θ) are mutually
singular (quadratic variation issue). A naive Gibbs sampler won’t mix at all.I Finding a dominating measure of a product form for π(VM,Y,Λ, θ |VD) is an
essential stepI We find a sequence of transformations of the diffusion path V and,
respectively, of the diffusion equation for VI Let ΩT = C[0,T] and Ω∗ = C[0, 1] .I Given fixed Y, θ, v0, vT we define a 1-1 transformation
HY,θ,v0,vT : ΩT → Ω∗, (11)
such that the law of HY,θ,v0,vT (V) is absolutely continuous with respect to thelaw of a Brownian bridge on Ω∗
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
Designing an exact MCMC algorithm ... continued
▶ Recall the SDE for V: dV_t = b(V_t, Y_t, θ) dt + σ(V_t, θ) γ(Y_t, θ) dB_t
▶ We aim at Gibbs sampling from the full posterior
  π(V^M, Y, Λ, θ | V^D) ∝ π_θ(θ) π_Λ(Λ) π(Y | Λ) π(V^M, V^D | Y, θ)  on  Ω_T × Y_T × L × Θ
▶ The Gibbs sampler we design targets a measure π*_{v_0,v_T}(ω*, y, Λ, θ) on Ω* × Y_T × L × Θ
▶ Let the simulation output be
  (ω*(n), y(n), Λ(n), θ(n)),   n = 0, 1, ...
▶ Then
  (H^{-1}_{y(n),θ(n),v_0,v_T}(ω*(n)), y(n), Λ(n), θ(n)),   n = 0, 1, ...
  targets π(V^M, Y, Λ, θ | V^D) on Ω_T × Y_T × L × Θ.
Designing an exact MCMC algorithm ... some details
▶ We now identify H_{Y,θ,v_0,v_T}.
▶ Start with V: dV_t = b(V_t, Y_t, θ) dt + σ(V_t, θ) γ(Y_t, θ) dB_t
▶ and use the 1-1 Lamperti transformation
  η(v, θ) = ∫^v du / σ(u, θ),  and define  X_t := η(V_t, θ).   (12)
  The transformed process solves
  dX_t = α(X_t, Y_t, θ) dt + γ(Y_t, θ) dB_t.
▶ For X_t assume the setting of the EA3 algorithm of Beskos et al. (2008).
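As a concrete illustration of (12), the sketch below uses the hypothetical choice σ(v, θ) = θv, for which η(v, θ) = log(v)/θ, with γ ≡ 1 and zero drift for simplicity, and checks numerically that the Lamperti-transformed path has unit diffusion coefficient via its realized quadratic variation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration (not from the talk): sigma(v, theta) = theta * v,
# so eta(v, theta) = int^v du / (theta * u) = log(v) / theta, and with
# gamma == 1 the process X = eta(V, theta) has unit diffusion coefficient.
theta, T, n = 0.5, 1.0, 100_000
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), size=n)
# Euler scheme for dV = sigma(V) dB (zero drift, V_0 = 1 for simplicity)
V = np.concatenate(([1.0], np.cumprod(1.0 + theta * dB)))
X = np.log(V) / theta                    # Lamperti-transformed path
qv = np.sum(np.diff(X) ** 2)             # realized quadratic variation
print(qv)                                # ~ T = 1.0, as for a unit-coefficient path
```

The realized quadratic variation of V itself would instead track ∫ θ²V_t² dt, i.e. depend on the path and on θ; the Lamperti step removes exactly that dependence.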
Designing an exact MCMC algorithm ... some details
▶ We now work with dX_t = α(X_t, Y_t, θ) dt + γ(Y_t, θ) dB_t,
▶ define a speed-adjusted Brownian motion by
  dB^y_t = γ(y_t, θ) dB_t,  and denote  h_{y,θ}(t) = ∫_0^t γ²(y_s, θ) ds.
▶ Let BB^y_t be a speed-adjusted Brownian bridge with endpoints x_0 and x_T.
▶ To obtain a Brownian bridge on [0, h_{y,θ}(T)] with endpoints x_0, x_T, put
  BB¹_t = BB^y_{h^{-1}_{y,θ}(t)},   t ∈ [0, h_{y,θ}(T)].   (13)
▶ Next, define its centred version starting and ending at 0,
  BB^{1,c}_t = BB¹_t − (1 − t/h_{y,θ}(T)) x_0 − (t/h_{y,θ}(T)) x_T,   t ∈ [0, h_{y,θ}(T)].   (14)
▶ The process BB_t below is a standard centred Brownian bridge on [0, 1]:
  BB_t = (1/√(h_{y,θ}(T))) BB^{1,c}_{t·h_{y,θ}(T)},   t ∈ [0, 1].   (15)
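Transformations (14) and (15) can be sketched on a discretized path. In this hedged illustration h_{y,θ}(T) is replaced by an arbitrary fixed constant H (the time change (13) is assumed already applied), and the endpoints x_0, x_T are arbitrary; the check is that the result starts and ends at 0 and has the variance of a standard bridge at its midpoint:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical constants standing in for h_{y,theta}(T), x_0, x_T:
H, x0, xT, n, reps = 2.5, 1.0, -0.5, 400, 20_000
t = np.linspace(0.0, H, n + 1)

mids = []
for _ in range(reps):
    # Brownian bridge on [0, H] from x0 to xT, built from a BM path W
    dW = rng.normal(0.0, np.sqrt(H / n), size=n)
    W = np.concatenate(([0.0], np.cumsum(dW)))
    BB1 = x0 + W - (t / H) * (W[-1] - (xT - x0))
    BB1c = BB1 - (1 - t / H) * x0 - (t / H) * xT   # (14): centre at 0
    BB = BB1c / np.sqrt(H)                          # (15): rescale; time s = t/H
    mids.append(BB[n // 2])

mids = np.array(mids)
print(BB[0], BB[-1])   # endpoints are exactly 0
print(mids.var())      # ~ variance of a standard bridge at s = 1/2, i.e. 0.25
```

The deterministic shift in (14) changes no variances, and the 1/√H rescaling in (15) turns the bridge variance t(H−t)/H into s(1−s) on the unit interval.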
Designing an exact MCMC algorithm ... some details
▶ By H_{y,θ,x_0,x_T} denote the operator that maps Ω_T into Ω* by applying transformations (13), (14), (15), i.e.
  BB_t = (H_{y,θ,x_0,x_T}(BB^y))_t.
▶ Define
  H_{y,θ,v_0,v_T}(·) := H_{y,θ,x_0,x_T}(η(·, θ)).
▶ Let Q^y and P^y be the measures induced by X and B^y, respectively, on Ω_T. For the conditional measures write Q^{(y;x_0,x_T)} and P^{(y;x_0,x_T)}, respectively.
▶ Let P^{(y;x_0,x_T)}_H be the push-forward measure of P^{(y;x_0,x_T)} under the mapping H_{y,θ,x_0,x_T}.
▶ Then P^{(y;x_0,x_T)}_H = P*, the Wiener measure of the standard Brownian bridge.
▶ In order to identify π*_{v_0,v_T}(ω*, y, Λ, θ) on Ω* × Y_T × L × Θ,
▶ we shall find the Radon–Nikodym derivative of Q^{(y;x_0,x_T)} with respect to P^{(y;x_0,x_T)}, and consequently of Q^{(y;x_0,x_T)}_H with respect to P*.
Designing an exact MCMC algorithm ... some details
▶ From Girsanov, and applying a trick from the EA papers,

  dQ^{(y;x_0,x_T)}/dP^{(y;x_0,x_T)}(ω)
    ∝ exp{ Σ_{k=1}^{τ(T)+1} [A(ω_{t_k}, y_{t_{k−1}}, θ) − A(ω_{t_{k−1}}, y_{t_{k−1}}, θ)]
           − (1/2) ∫_0^T [α'_x(ω_s, y_s, θ) + α²(ω_s, y_s, θ)/γ²(y_s, θ)] ds }
    =: G(ω, y, θ; x_0, x_T),

▶ where
  ▶ 0 = t_0 < t_1 < ... < t_{τ(T)} < t_{τ(T)+1} = T are the jump times of y_t, and y_t = y_{t_k} for t ∈ [t_k, t_{k+1}),
  ▶ A(x, y, θ) = (1/γ²(y, θ)) ∫^x α(u, y, θ) du.
▶ As a consequence we can set
  π*_{v_0,v_T}(ω*, y, Λ, θ) := π_θ(θ) π_Λ(Λ) π(y | Λ) G(H^{-1}_{y,θ,x_0,x_T}(ω*), y, θ; x_0, x_T).   (16)
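A hedged sketch of evaluating log G on a discretized path, using illustrative regime-switching placeholders α(x, y, θ) = −θ_y x with constant γ per regime (these choices, and the function names, are hypothetical, not the talk's SINE model): the A-increment sum runs over the jump intervals of y with the regime frozen at y_{t_{k−1}}, and the time integral is discretized by the trapezoid rule.

```python
import numpy as np

theta = np.array([0.5, 2.0])   # hypothetical drift parameter per regime
gam   = np.array([1.0, 1.5])   # hypothetical diffusion scale per regime

def alpha(x, y):   return -theta[y] * x                       # drift of X
def dalpha(x, y):  return -theta[y] * np.ones_like(x)         # alpha'_x
def A(x, y):       return -theta[y] * x**2 / (2.0 * gam[y]**2)

def log_G(s, omega, jumps, regimes):
    """s: uniform time grid on [0, T]; omega: path values on s;
    jumps: jump times of y in (0, T); regimes: labels on the
    tau(T)+1 inter-jump intervals."""
    tk = np.concatenate(([0.0], jumps, [s[-1]]))   # t_0 < ... < t_{tau(T)+1}
    idx = np.searchsorted(s, tk)                   # grid indices of the t_k
    # sum_k  A(omega_{t_k}, y_{t_{k-1}}) - A(omega_{t_{k-1}}, y_{t_{k-1}})
    total = sum(A(omega[idx[k]], regimes[k - 1]) - A(omega[idx[k - 1]], regimes[k - 1])
                for k in range(1, len(tk)))
    # regime in force at each grid time (piecewise constant between jumps)
    y = regimes[np.clip(np.searchsorted(tk, s, side="right") - 1, 0, len(regimes) - 1)]
    # minus (1/2) int_0^T ( alpha'_x + alpha^2 / gamma^2 ) ds, trapezoid rule
    f = dalpha(omega, y) + alpha(omega, y) ** 2 / gam[y] ** 2
    ds = s[1] - s[0]
    return total - 0.5 * np.sum(0.5 * (f[1:] + f[:-1])) * ds

s = np.linspace(0.0, 1.0, 1001)
omega = np.sin(2.0 * np.pi * s)                    # any continuous test path
val = log_G(s, omega, jumps=np.array([]), regimes=np.array([0]))
print(val)                                         # ~ 0.1875 for this path
print(log_G(s, omega, jumps=np.array([0.4]), regimes=np.array([0, 1])))
```

For the single-regime call the A-increments cancel (ω starts and ends at 0) and the integral is −0.5 + 0.25·∫sin² = −0.375, so log G ≈ 0.1875, which the discretization reproduces.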
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 253: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/253.jpg)
Conditional distributions for the Gibbs sampler

- The conditional distributions are as follows:

$$
\begin{aligned}
\omega^* &\propto G\big(H^{-1}_{y,\theta,x_0,x_T}(\omega^*), y, \theta;\, x_0, x_T\big), \\
y &\propto \pi(y \mid \Lambda)\, G\big(H^{-1}_{y,\theta,x_0,x_T}(\omega^*), y, \theta;\, x_0, x_T\big), \\
\Lambda &\propto \pi_\Lambda(\Lambda)\,\pi(y \mid \Lambda), \\
\theta &\propto \pi_\theta(\theta)\, G\big(H^{-1}_{y,\theta,x_0,x_T}(\omega^*), y, \theta;\, x_0, x_T\big).
\end{aligned}
$$

- For $\Lambda$ we can use a conjugate prior $\lambda_{ij} \sim \mathrm{Exp}(\beta_{ij})$ and compute the full conditional $\mathrm{Gamma}(\text{formula1}, \text{formula2})$.
- For $\omega^*$ we use rejection sampling with reweighted Brownian bridge proposals, using ideas from the Exact Algorithms of Beskos et al. '08.
- A reweighted Brownian bridge proposal $BB$ is accepted as $\omega^*$ with probability obtained from $G\big(H^{-1}_{y,\theta,x_0,x_T}(BB), y, \theta;\, x_0, x_T\big)$.
- The decision on accepting a Brownian bridge proposal $BB$ as $\omega^*$ is made after evaluating $BB$ at a finite number of randomly chosen points.
- For $y$ and $\theta$ we use Barker's within Gibbs. (Metropolis within Gibbs is also possible, based on Sermaidis et al. 2011.)
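The $\Lambda$-update above can be sketched in code. The slide leaves the two Gamma arguments as placeholders; under the standard conjugacy for continuous-time Markov chain rates (an assumption here, not the talk's stated formulas), an $\mathrm{Exp}(\beta_{ij})$ prior on $\lambda_{ij}$ combined with the observed path $y$ yields a Gamma full conditional whose shape counts the $i \to j$ jumps and whose rate accumulates the holding time in state $i$:

```python
import random

def sample_rate(n_ij, T_i, beta_ij, rng=random):
    """Draw lambda_ij from its full conditional under lambda_ij ~ Exp(beta_ij).

    Assumed standard CTMC conjugacy (not the slide's elided formulas):
    shape = 1 + n_ij (observed i -> j jumps of y),
    rate  = beta_ij + T_i (holding time of y in state i).
    random.gammavariate takes (shape, scale), so scale = 1 / rate.
    """
    return rng.gammavariate(1 + n_ij, 1.0 / (beta_ij + T_i))

# e.g. 5 observed jumps, 100 time units spent in state i, prior rate 1;
# the full-conditional mean is then 6 / 101
rng = random.Random(0)
draws = [sample_rate(5, 100.0, 1.0, rng) for _ in range(5000)]
mean_est = sum(draws) / len(draws)
```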
![Page 259: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/259.jpg)
Barker's within Gibbs step for $y$ (and $\theta$)

- Recall that Barker's acceptance probability for a move from $y$ to $y'$, for a stationary distribution $\pi$, is

$$
a(y, y') = \frac{\pi(y')\,q(y', y)}{\pi(y')\,q(y', y) + \pi(y)\,q(y, y')}.
$$

- In our context, a Barker's step is applied within the Gibbs sampler step for $y$, and the conditional target distribution is proportional to

$$
\pi(y \mid \Lambda)\, G\big(H^{-1}_{y,\theta,x_0,x_T}(\omega^*), y, \theta;\, x_0, x_T\big).
$$

- If $q(y, y') = \pi(y' \mid \Lambda)$, the acceptance probability $a(y, y')$ simplifies to

$$
a(y, y') = \frac{G\big(H^{-1}_{y',\theta,x_0,x_T}(\omega^*), y', \theta;\, x_0, x_T\big)}{G\big(H^{-1}_{y',\theta,x_0,x_T}(\omega^*), y', \theta;\, x_0, x_T\big) + G\big(H^{-1}_{y,\theta,x_0,x_T}(\omega^*), y, \theta;\, x_0, x_T\big)}.
$$

- And the two coin algorithm can be readily applied!
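The two coin algorithm referred to here can be sketched as follows: given only a $p_1$-coin and a $p_2$-coin (in the talk's setting, events of the probabilities produced by the exact-algorithm machinery at $y'$ and $y$), it returns an exact Bernoulli draw with the Barker probability $c_1 p_1 / (c_1 p_1 + c_2 p_2)$, without ever evaluating $p_1$ or $p_2$. A minimal sketch with simulated coins:

```python
import random

def two_coin(coin1, coin2, c1=1.0, c2=1.0, rng=random):
    """Return 1 with probability c1*p1 / (c1*p1 + c2*p2), where coin1 and
    coin2 are callables producing Bernoulli(p1) and Bernoulli(p2) events.
    Neither p1 nor p2 is ever evaluated numerically."""
    while True:
        if rng.random() < c1 / (c1 + c2):   # propose branch 1
            if coin1():
                return 1                    # heads: accept the move
        else:                               # propose branch 2
            if coin2():
                return 0                    # heads: reject the move
        # tails on either branch: discard and start over

# sanity check: p1 = 0.3, p2 = 0.6 gives acceptance probability 0.3/0.9 = 1/3
rng = random.Random(1)
draws = [two_coin(lambda: rng.random() < 0.3,
                  lambda: rng.random() < 0.6, rng=rng)
         for _ in range(20000)]
accept_rate = sum(draws) / len(draws)
```

Each round terminates with probability $(c_1 p_1 + c_2 p_2)/(c_1 + c_2)$, so the expected number of coin flips grows as the acceptance probabilities shrink; this is the per-iteration cost the exactness buys.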
![Page 263: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/263.jpg)
Example: the SINE model

$Y_t$ — a 2-state Markov process, $Y_t \in \{1, 2\}$:

$$
dV_t = \sin\big(V_t - \mu(Y_t)\big)\,dt + \gamma(Y_t)\,dB_t
$$

Parameter: $\theta = \big(\mu(1), \mu(2), \gamma(1), \gamma(2)\big)$

Priors:
- $\mu(1), \mu(2) \sim U(0, 2\pi)$, independent
- $\gamma^2(1), \gamma^2(2) \sim \mathrm{InvGamma}(1, 1)$, independent

Data:
- 1000 samples of $V_t$ at $t = 0, 1, 2, \ldots, 999$
- $\mu = [3, 1]$, $\gamma = [1, 2]$
- $Y_t = 1$ for $t \in [0, 250] \cup (750, 1000]$, and $Y_t = 2$ for $t \in (250, 750]$

Stats of MCMC:
- Acceptance probabilities: $BB$ 0.65, $Y$ 0.50, $\theta$ 0.36
- Average number of imputed points (per interval): 1.49
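As a rough illustration of the data-generating setup above, the switching SINE diffusion can be simulated on a fine grid with Euler–Maruyama under the fixed regime path $Y$. This is only a crude discretized sketch for producing synthetic data; it is emphatically not the exact algorithm the talk constructs, whose whole point is to avoid discretization error.

```python
import math
import random

def simulate_sine(T=1000, dt=0.01, mu=(3.0, 1.0), gamma=(1.0, 2.0), seed=0):
    """Euler-Maruyama path of dV = sin(V - mu(Y)) dt + gamma(Y) dB with the
    regime path used above: Y = 1 on [0, 250] and (750, 1000], Y = 2 on
    (250, 750]. Records V at integer times, mimicking the observations."""
    rng = random.Random(seed)
    n = round(T / dt)
    v, obs = 0.0, []
    for i in range(n + 1):
        t = i * dt
        y = 2 if 250 < t <= 750 else 1        # current regime
        if abs(t - round(t)) < 1e-9:          # observation times 0, 1, 2, ...
            obs.append(v)
        v += math.sin(v - mu[y - 1]) * dt \
             + gamma[y - 1] * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return obs

obs = simulate_sine(T=10, dt=0.01)            # short demo path: 11 observations
```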
![Page 264: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/264.jpg)
[Figure: posterior densities of $\mu_1$ (true value 3.0), $\mu_2$ (1.0), $\gamma_1$ (1.0) and $\gamma_2$ (2.0).]
![Page 265: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/265.jpg)
Posterior distribution for $Y$

[Figure: posterior probability of state 1 as a function of time over $[0, 1000]$, with a zoom on the interval $[200, 300]$.]
![Page 266: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/266.jpg)
Autocorrelation

[Figure: ACF of the MCMC chains for $\mu_2$ and $\gamma_2$, up to lag 80.]
![Page 267: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/267.jpg)
The pseudo-marginal approach

- Recall the Metropolis–Hastings acceptance probability

$$
\alpha_{MH}(x, y) = 1 \wedge \frac{\pi(y)\,q(y, x)}{\pi(x)\,q(x, y)}.
$$

- If the likelihood component of $\pi(\cdot)$ is not available in closed form, $\alpha_{MH}(x, y)$ cannot be evaluated.
- However, it might be possible to design an unbiased estimator of $\pi(x)$.
- The pseudo-marginal approach exploits this: it designs an extended state space algorithm that targets $\pi$ as its marginal.
- Pseudo-marginal methods suffer a loss of efficiency through the MCMC convergence slow-down typical of extended state space algorithms. This may be drastic, depending on the properties of the unbiased estimator.
- This is in contrast with the Bernoulli Factory based methods, which retain the exact MCMC convergence speed, but at the cost of the execution time of a single iteration.
- Pseudo-marginal methods are more general but more difficult to diagnose.
![Page 274: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/274.jpg)
The pseudo-marginal approach

$$
\alpha_{MH} = 1 \wedge \frac{\pi(y)\,q(y, x)}{\pi(x)\,q(x, y)}.
$$

- Assume that we have access to an unbiased estimator of $\pi(x)$:

$$
\hat\pi(x) = W_x\,\pi(x), \qquad W_x \sim Q_x(\cdot) > 0, \qquad \mathbb{E}(W_x) = 1.
$$

- Then the method can be seen as targeting the distribution

$$
\tilde\pi(x, w) = \pi(x)\,Q_x(w)\,w \quad \text{on } \mathcal{X} \times \mathbb{R}_+,
$$

- and using the proposal

$$
q(x, w;\, y, u) = q(x, y)\,Q_y(u).
$$

- Convergence results are obtained by integrating out $W$.
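A minimal sketch of the resulting algorithm, under illustrative assumptions (a symmetric random-walk proposal so that $q$ cancels, a standard-normal target, and log-normal multiplicative noise with $\mathbb{E}[W] = 1$): the only departure from plain Metropolis–Hastings is that the noisy estimate at the current state is stored and reused, never refreshed.

```python
import math
import random

def pseudo_marginal_rw(log_pi_hat, x0, n_iter=20000, step=1.0, seed=0):
    """Pseudo-marginal random-walk Metropolis. log_pi_hat(x, rng) returns the
    log of a non-negative unbiased estimate of pi(x); the estimate at the
    current state is recycled between iterations, which is what makes the
    x-marginal of the chain target pi exactly."""
    rng = random.Random(seed)
    x, lp = x0, log_pi_hat(x0, rng)
    chain = []
    for _ in range(n_iter):
        y = x + step * rng.gauss(0.0, 1.0)   # symmetric proposal: q cancels
        lq = log_pi_hat(y, rng)              # fresh estimate at the proposal
        if math.log(rng.random()) < lq - lp:
            x, lp = y, lq                    # accept: estimate moves with x
        chain.append(x)
    return chain

def log_pi_hat(x, rng, sigma=0.5):
    """Noisy N(0,1) log-density: pi_hat = W * pi with W log-normal, E[W] = 1."""
    log_w = sigma * rng.gauss(0.0, 1.0) - 0.5 * sigma ** 2
    return -0.5 * x * x + log_w

chain = pseudo_marginal_rw(log_pi_hat, x0=0.0)
mean_est = sum(chain) / len(chain)
```

Increasing `sigma` makes the estimator noisier and the chain stickier (long runs at states with an unluckily large $W$), which is the efficiency loss described above.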
![Page 278: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/278.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
Pseudo-marginal MCMC
The pseudo-marginal approach
I
αMH = 1 ∧ π(y)q(y, x)
π(x)q(x, y).
I Assume that we have access to an unbiased estimator of π(x)
I
π(x) = Wxπ(x) Wx ∼ Qx(·) > 0 E(Wx) = 1
I Then the method can be seen as targeting the distribution
π(x,w) = π(x)Qx(w)w on X × R+
I And using the proposal
q(x,w, y, u) = q(x, y)Qy(u)
I Convergence is obtained by integrating out W
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood
![Page 279: Bayesian Inference in Intractable Likelihood Models...Intractable Likelihood The Bernoulli Factory problem Barkers and more The Markov switching diffusion model & exact Bayesian inference](https://reader030.vdocuments.net/reader030/viewer/2022040608/5ec5a33169d7b460ea09afd4/html5/thumbnails/279.jpg)
Intractable LikelihoodThe Bernoulli Factory problem
Barkers and moreThe Markov switching diffusion model & exact Bayesian inference
Pseudo-marginal MCMC
Pseudo-marginal MCMC
The pseudo-marginal approach
I
αMH = 1 ∧ π(y)q(y, x)
π(x)q(x, y).
I Assume that we have access to an unbiased estimator of π(x)
I
π(x) = Wxπ(x) Wx ∼ Qx(·) > 0 E(Wx) = 1
I Then the method can be seen as targeting the distribution
π(x,w) = π(x)Qx(w)w on X × R+
I And using the proposal
q(x,w, y, u) = q(x, y)Qy(u)
I Convergence is obtained by integrating out W
Krzysztof Łatuszynski(University of Warwick, UK) (The Alan Turing Institute, London)Intractable Likelihood