
At the Boundary Between Light Transport Simulation and Computational Statistics

Toshiya Hachisuka

Aarhus University

MCQMC2014

At the boundary

2

$$I_j = E\!\left[\frac{f_j(X)}{p(X)}\right]$$

What’s often considered

• Graphics communities “just use” statistics

3

Statistics → Graphics

What I hope to see

• Statistics communities take something from graphics

4

Statistics ← Graphics

Aim of this talk

• Motivate you to facilitate cross-pollination

• Both ways (graphics ↔ statistics)

• Using two examples

• Progressive density estimation

• Primary space serial tempering

5

“Progressive Photon Mapping” Hachisuka et al. ACM SIGGRAPH Asia 2008

“Multiplexed Metropolis Light Transport” Hachisuka et al. ACM SIGGRAPH 2014 (conditionally accepted)

6

Progressive Density Estimation

7

Manifold of non-zero values: $\Omega_\infty$

$$L = \int_{\Omega_\infty} f_j(X)\, d\mu(X)$$

8

$$L = \int_{\Omega_\infty} f_j(X)\, d\mu(X) \approx \frac{1}{N}\sum_i \frac{f_j(x_i)}{p(x_i)}, \qquad x_i \sim p(x_i)$$

9
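To make the estimator concrete, here is a minimal sketch in Python; the 1D integrand and PDF are made up for illustration and only stand in for the path-space quantities $f_j$ and $p$:

```python
import random

# Minimal sketch of the estimator above: a made-up 1D integrand and PDF
# stand in for the path-space quantities f_j and p.

def f(x):     return x * x                 # toy integrand on (0, 1)
def p(x):     return 2.0 * x               # PDF of the sampling distribution
def sample(): return (1.0 - random.random()) ** 0.5   # x ~ p via inverse CDF, in (0, 1]

def estimate(n):
    return sum(f(x) / p(x) for x in (sample() for _ in range(n))) / n

print(estimate(100000))   # converges to the integral of x^2 over (0, 1), i.e. 1/3
```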

[Figure: a scene with a light source, a transparent layer, a matte surface, and the eye]

[Slides 10–21: step-by-step figure of a path through the transparent layer, with refraction angles θ1 and θ2]

Probability of sampling this path is nearly zero

Probability of hitting at the same point is zero

Manifold of non-samplable paths

[Figure: light source, glass, matte surfaces, and the eye]

22

How can we capture non-samplable paths?

23

Progressive Density Estimation

24

[Slides 24–26: the same figure, angles θ1 and θ2, with a disc of radius r around the connection point]

Allow approximate connection within distance r, and progressively reduce r

Progressive density estimation

• Algorithmically convergent density estimation

• Add gradually reducing bias to relax the problem

• First to solve the issue of non-samplable paths

27

[Hachisuka et al. 2008]

28

Initial pass

[Figure: for the j-th pixel of the image, a visible point x_j is found and a kernel with initial bandwidth r_0 is placed around it]

32

1st pass: Monte Carlo photon tracing

[Figure: photons are traced, a range query with radius r_0 collects those around the visible point, and the radius is then reduced to r_1]

Radius reduction

• Reduce radius based on sample statistics

• Number of photons found so far

• Number of newly found photons

• Reduction rate (user-defined)

36

$$r_{n+1} = \frac{N_n + \alpha M_n}{N_n + M_n}\, r_n, \qquad N_{n+1} = N_n + \alpha M_n$$

where $N_n$ is the number of photons found so far, $M_n$ is the number of newly found photons, and $\alpha \in \left]0, 1\right[$ is the user-defined reduction rate.
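A minimal sketch of this update, assuming it is applied once per pass for each visible point; the function name is illustrative, and the default reduction rate anticipates the $\alpha = 2/3$ balance discussed later:

```python
# Minimal sketch of the radius/statistics update above. The function name and
# its per-visible-point use are assumptions; only the two formulas come from
# the slides.

def progressive_update(r_n, N_n, M_n, alpha=2.0 / 3.0):
    """Keep a fraction alpha of the M_n newly found photons and shrink the
    gather radius accordingly."""
    if M_n == 0:
        return r_n, N_n                       # nothing gathered this pass
    ratio = (N_n + alpha * M_n) / (N_n + M_n)
    return ratio * r_n, N_n + alpha * M_n

# Per pass, for each visible point: r, N = progressive_update(r, N, photons_found)
```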

37

1st pass

$$r_1 = \frac{N_0 + \alpha M_0}{N_0 + M_0}\, r_0 = \frac{0 + 4\alpha}{0 + 4}\, r_0$$

38

1st pass

$$N_1 = N_0 + \alpha M_0 = 0 + 4\alpha$$

39

2nd pass

[Figure: a new set of photons is traced and gathered with a range query of radius r_1]

$$r_2 = \frac{N_1 + \alpha M_1}{N_1 + M_1}\, r_1, \qquad N_2 = N_1 + \alpha M_1$$

43

3rd and n-th passes

$$r_{n+1} = \frac{N_n + \alpha M_n}{N_n + M_n}\, r_n, \qquad N_{n+1} = N_n + \alpha M_n$$

44

[Figure: equal-time comparison of MC integration and progressive density estimation]

Convergence

45

• Converges to the correct solution

• Infinite number of neighboring samples

• Infinitely small radius

$$L = \lim_{n\to\infty} \frac{N_n}{\pi r_n^2}$$

46

Convergence

$$\lim_{i\to\infty} N_i(R_i) = \infty, \qquad \lim_{i\to\infty} R_i(N_i) = 0$$

$$L = \lim_{n\to\infty} \frac{N_n}{\pi r_n^2}$$

• Converges to the correct solution

• Convergence rate

• Variance

• Bias

47

Convergence

$$\lim_{i\to\infty} N_i(R_i) = \infty, \qquad \lim_{i\to\infty} R_i(N_i) = 0$$

$$L = \lim_{n\to\infty} \frac{N_n}{\pi r_n^2}$$

Variance $O(n^{-\alpha})$ and bias $O(n^{\alpha-1})$, balanced with $\alpha = \tfrac{2}{3}$
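The value $\alpha = 2/3$ follows from balancing the two terms, assuming the mean squared error decomposes into variance plus squared bias:

$$\mathrm{MSE}(n) \approx \underbrace{O(n^{-\alpha})}_{\text{variance}} + \underbrace{O\!\left(n^{\alpha-1}\right)^2}_{\text{squared bias}} = O(n^{-\alpha}) + O(n^{2\alpha-2}), \qquad -\alpha = 2\alpha - 2 \;\Rightarrow\; \alpha = \tfrac{2}{3}$$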

Relationship to recursive density estimation

• Recursive density estimation

• Predefined sequence of radii

• Assume stationary density distribution

• Progressive density estimation

• Adaptive sequence from sample statistics

• No assumption on stationarity

48

[Wolverton and Wagner 1969, Yamato 1971, Knaus and Zwicker 2011]

[Hachisuka et al. 2008, Hachisuka and Jensen 2009]

Progressive vs Ordinary

49

              Progressive    Ordinary
Computation   Unbounded      Unbounded
Storage       Bounded        Unbounded
Convergence   Single run     Multiple runs

50

Primary Space Serial Tempering

Multiple sampling methods

[Slides 51–55: figure building up four PDFs p_0(X), p_1(X), p_2(X), p_3(X) for the same integrand]

Given the same integrand, there are many known PDFs

Multiple importance sampling

• Utilizes samples from multiple PDFs by weighting

56

[Veach and Guibas 1995]

[Figure, slides 56–57: the integrand f(X), then two PDFs p_0(X) and p_1(X) overlaid]

Multiple importance sampling

• Importance sampling

• Multiple importance sampling

58

[Veach and Guibas 1995]

Importance sampling (one PDF):

$$\int f(X)\,d\mu(X) \approx \frac{1}{N}\sum_i \frac{f(x_i)}{p_0(x_i)},\quad x_i \sim p_0(x_i) \qquad\text{or}\qquad \frac{1}{N}\sum_i \frac{f(x_i)}{p_1(x_i)},\quad x_i \sim p_1(x_i)$$

Multiple importance sampling (uses samples from both $p_0$ and $p_1$):

$$\int f(X)\,d\mu(X) \approx \frac{1}{N}\sum_i \left[\, w_0(x_i)\,\frac{f(x_i)}{p_0(x_i)} + w_1(x_i)\,\frac{f(x_i)}{p_1(x_i)} \right]$$
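A minimal sketch of the two-PDF case above, assuming the balance heuristic for the weights (the slides do not name a specific heuristic); the integrand and PDFs are made up for illustration:

```python
import random

# Minimal MIS sketch with two sampling techniques and balance-heuristic
# weights. The integrand and PDFs are made up; only the weighted estimator
# follows the formula above.

def f(x):        return x * x          # toy integrand on (0, 1)

def p0(x):       return 1.0            # technique 0: uniform PDF
def sample_p0(): return random.random()

def p1(x):       return 2.0 * x        # technique 1: linear PDF
def sample_p1(): return (1.0 - random.random()) ** 0.5   # inverse-CDF sampling, in (0, 1]

def w(pa, pb, x):                      # balance heuristic: p_a / (p_a + p_b)
    return pa(x) / (pa(x) + pb(x))

def mis_estimate(n):
    total = 0.0
    for _ in range(n):                 # one sample from each technique per iteration
        x0 = sample_p0()
        total += w(p0, p1, x0) * f(x0) / p0(x0)
        x1 = sample_p1()
        total += w(p1, p0, x1) * f(x1) / p1(x1)
    return total / n

print(mis_estimate(100000))            # converges to the integral of x^2 over (0, 1), i.e. 1/3
```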

Markov chain Monte Carlo sampling

• Requires no knowledge of PDFs

59

[Figure: MCMC samples distributed proportionally to f(X)]
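A minimal Metropolis-Hastings sketch of that point, with a made-up unnormalized target: the chain evaluates only $f$ itself, never a PDF or a normalization constant.

```python
import random

# Metropolis-Hastings needs only the unnormalized target f(X): the samples
# end up distributed proportionally to f without ever evaluating a PDF.

def f(x):                                   # made-up unnormalized target on (0, 1)
    return x * x if 0.0 < x < 1.0 else 0.0

def mh_samples(n, x=0.5, step=0.1):
    out = []
    for _ in range(n):
        y = x + random.uniform(-step, step)                  # symmetric random-walk proposal
        if f(y) > 0.0 and random.random() * f(x) <= f(y):    # accept w.p. min(1, f(y)/f(x))
            x = y
        out.append(x)
    return out

samples = mh_samples(100000)
print(sum(samples) / len(samples))   # ≈ 3/4, the mean of the density proportional to x^2 on (0, 1)
```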

60

How can we combine MIS and MCMC?

Combination

• Multiple importance sampling

• Ordinary Monte Carlo method

• Utilizes the knowledge of all the PDFs

• Markov chain Monte Carlo sampling

• Markov chain Monte Carlo method

• No usage (or requirement) of PDFs

61

62

Primary Space Serial Tempering

Primary sample space

• Hypercube of random numbers

• Mapping from a point to a sample

• Inverse CDF = mapping function

63

[Kelemen et al. 2002]

$$U \in\, ]0,1[^N, \qquad C(U) = \frac{f(P^{-1}(U))}{p(P^{-1}(U))}$$

$$\int f(X)\,d\mu(X) \approx \frac{1}{N}\sum_i \frac{f(x_i)}{p(x_i)} = \frac{1}{N}\sum_i C(u_i)$$
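A minimal sketch of this reformulation with a made-up 1D technique; the names `inv_P` and `C` are illustrative, not from the slides:

```python
import random

# Toy primary-sample-space view: a sample is a point u in the unit interval
# (the hypercube in 1D), the mapping P^{-1} is an inverse CDF, and the
# integrand over the hypercube is C(u) = f(P^{-1}(u)) / p(P^{-1}(u)).

def f(x):      return x * x        # made-up integrand
def p(x):      return 2.0 * x      # PDF realized by the mapping below
def inv_P(u):  return u ** 0.5     # inverse CDF of p

def C(u):
    x = inv_P(u)
    return f(x) / p(x)

n = 100000
print(sum(C(1.0 - random.random()) for _ in range(n)) / n)   # ≈ 1/3, the integral of x^2 over (0, 1)
```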

Primary sample space

• Hypercube of random numbers

• Mapping from a point to a sample

• Inverse CDF = mapping function

64

[Kelemen et al. 2002]

Point in the hypercube: $U \in\, ]0,1[^N$. Path in the sample space: $X = P^{-1}(U)$.

Serial tempering

• MCMC sampling of a sum of multiple distributions

• Extended states of the Markov chain

• Index j is updated via MCMC

• Extended state is originally temperature

• Still does not use any PDFs

65

$$x \sim \sum_j f_j(x), \qquad \text{extended state: } (x, j)$$

[Marinari and Parisi 1992, Geyer and Thompson 1995]

Primary space serial tempering

• Use a weighted sum of primary space distributions

• Extended state is now $(u, j)$

• Index j is the index to the j-th PDF/CDF

66

$$u \sim \sum_j w_j(u)\, C_j(u), \qquad \text{extended state: } (u, j)$$

$$w_j(U) = w(P_j^{-1}(U)) \ \text{(MIS weight)}, \qquad C_j(U) = \frac{f(P_j^{-1}(U))}{p_j(P_j^{-1}(U))} \ \text{(primary space distribution)}$$

[Hachisuka et al. 2014] (TBA)
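A toy sketch of the extended-state chain, assuming balance-heuristic MIS weights and an independent ("large step" style) proposal, neither of which is specified on this slide; the two 1D techniques and all names are made up:

```python
import random

# Toy sketch of primary space serial tempering. Only the extended state (u, j)
# and acceptance by the ratio of w_j(u) * C_j(u) follow the slides; everything
# else here is an illustrative assumption.

def f(x): return x * x                    # made-up integrand on (0, 1)

def inv_P0(u): return u                   # technique 0: uniform, p0(x) = 1
def p0(x):     return 1.0
def inv_P1(u): return u ** 0.5            # technique 1: linear, p1(x) = 2x
def p1(x):     return 2.0 * x

techniques = [(inv_P0, p0), (inv_P1, p1)]

def C(j, u):                              # primary space distribution C_j(u)
    inv_P, p = techniques[j]
    x = inv_P(u)
    return f(x) / p(x)

def w(j, u):                              # MIS weight (balance heuristic) at x = P_j^{-1}(u)
    x = techniques[j][0](u)
    return techniques[j][1](x) / (p0(x) + p1(x))

def target(j, u):                         # stationary density of the extended state (u, j)
    return w(j, u) * C(j, u)

def step(j, u):
    """Jointly propose a fresh technique index and primary sample, then
    accept with probability min(1, target_new / target_old)."""
    j_new, u_new = random.randrange(len(techniques)), 1.0 - random.random()
    if random.random() * target(j, u) <= target(j_new, u_new):
        return j_new, u_new
    return j, u

j, u = 0, 0.5
for _ in range(10000):
    j, u = step(j, u)
    # In a renderer, x = P_j^{-1}(u) would now be accumulated into the image,
    # with the overall brightness normalized by an independent MC estimate.
```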

Original primary space

67

[Figure: a single primary space with distribution $C(U)$, mapped to path space by $x = P^{-1}(U)$]

Multiplexed primary space

68

[Figure: four stacked primary spaces with distributions $w_0(U)C_0(U)$, $w_1(U)C_1(U)$, $w_2(U)C_2(U)$, $w_3(U)C_3(U)$, mapped to path space by $x = P_j^{-1}(U)$]

Primary space serial tempering

[Slides 69–73: the chain moves over the four distributions $w_0(U)C_0(U), \dots, w_3(U)C_3(U)$; the active mapping switches from $x = P_0^{-1}(U)$ to $x = P_1^{-1}(U)$ to $x = P_3^{-1}(U)$]

[Slides 74–78: equal-time result comparisons of Metropolis-Hastings [Veach and Guibas 1997], primary space only [Kelemen et al. 2002], and primary space serial tempering]

Summary

Statistics / Graphics

• Progressive density estimation

• Statistics: new density estimator

• Graphics: simulation of complex light paths

• Primary space serial tempering

• Statistics: MCMC-driven MIS

• Graphics: robust and efficient path sampling

79

What I hope to see

• Statistics communities take something from graphics

80

Statistics ← Graphics

What I really hope to see

• We collaborate to develop something interesting!

81
