TRANSCRIPT
CS348b Lecture 9 Pat Hanrahan, Spring 2016
Overview
Earlier lectures
■Monte Carlo I: Integration
■ Signal processing and sampling
Today: Monte Carlo II
■Noise and variance reduction
■ Importance sampling
■ Stratified sampling
■Multidimensional sampling patterns
Undersampling: Aliasing
[Figure: the spectrum of a band-limited signal convolved (⊗) with the sampling pattern's spectrum, producing replicas of the signal spectrum]
Aliasing occurs if the sampling rate is less than twice the maximum frequency in the signal (the Nyquist rate)
Note that high frequencies can appear as low frequencies
Frequency Space
Sampling a “Zone Plate”

Zone plate: $\sin(x^2 + y^2)$

Left rings: signal. Right rings: aliases. Middle rings: sum of the signal and the aliases.
Jittered Sampling
Add uniform random jitter to each sample
Jittered vs. Uniform Supersampling
4x4 Jittered Sampling 4x4 Uniform
Regular pattern of aliases replaced by noise
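The jittered pattern above can be sketched in a few lines of Python (an illustration; the function name and interface are mine, not from the lecture):

```python
import random

def jittered_samples(n, rng=None):
    """n x n jittered pattern in [0,1)^2: one uniformly random
    sample inside each grid cell."""
    rng = rng or random.Random(0)
    samples = []
    for i in range(n):
        for j in range(n):
            samples.append(((i + rng.random()) / n,
                            (j + rng.random()) / n))
    return samples
```

Each sample stays inside its own cell, so the pattern keeps the grid's uniform coverage while the random offsets break up the regular structure that causes coherent aliases.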
Theory: Analysis of Jitter
$$s(x) = \sum_{n=-\infty}^{\infty} \delta(x - x_n), \qquad x_n = nT + j_n, \qquad j_n \sim j(x)$$

$$j(x) = \begin{cases} 1 & |x| \le 1/2 \\ 0 & |x| > 1/2 \end{cases} \qquad J(\omega) = \operatorname{sinc}\,\omega$$

$$S(\omega) = \frac{1}{T}\left(1 - |J(\omega)|^2\right) + \frac{2\pi}{T^2}\,|J(\omega)|^2 \sum_{n=-\infty}^{\infty} \delta\!\left(\omega - \frac{2\pi n}{T}\right)
= \frac{1}{T}\left(1 - \operatorname{sinc}^2 \omega\right) + \frac{2\pi}{T^2}\,\delta(\omega)$$
Non-uniform sampling Jittered sampling
Poisson Disk Sampling
Dart-throwing algorithm. The resulting spectrum has less energy near the origin (blue noise).
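The dart-throwing idea can be sketched directly (a naive O(n²) version; the name and parameters are my illustration):

```python
import math
import random

def dart_throwing(r, max_points=500, max_failures=2000, rng=None):
    """Naive Poisson-disk sampling in [0,1)^2: accept a random
    candidate only if it lies at least r from every accepted point;
    give up after max_failures consecutive rejections."""
    rng = rng or random.Random(0)
    points = []
    failures = 0
    while len(points) < max_points and failures < max_failures:
        c = (rng.random(), rng.random())
        if all(math.dist(c, p) >= r for p in points):
            points.append(c)
            failures = 0
        else:
            failures += 1
    return points
```

Every accepted pair is at least r apart, which is exactly the minimum-distance property that pushes spectral energy away from the origin.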
Distribution of Extrafoveal Cones
Monkey eye cone distribution and its Fourier transform (Yellott)
Intuition: Uniform Sampling
Uniform sampling
■ The spectrum of uniformly spaced samples is also a set of uniformly spaced spikes
■ Multiplying the signal by the sampling pattern in the spatial domain corresponds to placing a copy of the spectrum at each spike in the frequency domain
■ Aliases are coherent, and very noticeable
Intuition: Non-uniform Sampling
Non-uniform sampling
■ Samples at non-uniform locations have a different spectrum; a single spike at the center plus noise
■ Sampling a signal in this way converts aliases into broadband noise
■Noise is incoherent, and much less objectionable
Random Sampling Introduces Noise
1 shadow ray
Center Random
Less Noise with More Rays
1 shadow ray
16 shadow rays
Variance
Definition:

$$V[Y] \equiv E[(Y - E[Y])^2] = E[Y^2] - E[Y]^2$$

$$V[aY] = a^2\,V[Y]$$

Variance decreases linearly with sample size:

$$V\!\left[\frac{1}{N}\sum_{i=1}^{N} Y_i\right] = \frac{1}{N^2}\sum_{i=1}^{N} V[Y_i] = \frac{1}{N^2}\,N\,V[Y] = \frac{1}{N}\,V[Y]$$
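The 1/N behavior is easy to check empirically (a small sketch; the numbers and helper names are mine): averaging N = 16 uniform draws should shrink the variance from V[Y] = 1/12 to 1/192.

```python
import random

def sample_variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

rng = random.Random(1)
N = 16
# Y ~ Uniform[0,1), so V[Y] = 1/12; the mean of N draws
# should have variance V[Y] / N = 1/192.
means = [sum(rng.random() for _ in range(N)) / N
         for _ in range(20000)]
v = sample_variance(means)
```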
Why is Area Better than Solid Angle?
Solid Angle
100 shadow rays
Area
100 shadow rays
Uniform Random Sampling Works
Assume a uniform probability distribution for now: $X_i \sim p(x) = 1$ on $[0,1]$, with $Y_i = f(X_i)$.

$$E[F_N] = E\!\left[\frac{1}{N}\sum_{i=1}^{N} Y_i\right]
= \frac{1}{N}\sum_{i=1}^{N} E[Y_i]
= \frac{1}{N}\sum_{i=1}^{N} E[f(X_i)]
= \frac{1}{N}\sum_{i=1}^{N} \int_0^1 f(x)\,p(x)\,dx
= \frac{1}{N}\sum_{i=1}^{N} \int_0^1 f(x)\,dx
= \int_0^1 f(x)\,dx$$

$$E[F_N] = I(f)$$

Properties used:

$$E[aY] = a\,E[Y] \qquad E\!\left[\sum_i Y_i\right] = \sum_i E[Y_i]$$
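The estimator $F_N$ is a one-liner; here is a sketch (names and the test integrand are my choices) that estimates $\int_0^1 x^2\,dx = 1/3$:

```python
import random

def mc_estimate(f, n, rng):
    """F_N = (1/N) * sum of f(X_i), with X_i ~ Uniform[0,1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

rng = random.Random(0)
est = mc_estimate(lambda x: x * x, 100_000, rng)
# the true integral of x^2 over [0,1] is 1/3
```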
“Biased” Sampling is “Unbiased”
Probability: $X_i \sim p(x)$

Estimator:

$$Y_i = \frac{f(X_i)}{p(X_i)}$$

Proof:

$$E[Y_i] = \int \frac{f(x)}{p(x)}\,p(x)\,dx = \int f(x)\,dx$$
Importance Sampling

Sample according to $f$: choose the density

$$p(x) = \frac{f(x)}{E[f]}$$

so that the estimator is constant:

$$\tilde{f}(x) = \frac{f(x)}{p(x)} = E[f]$$

Variance:

$$V[\tilde{f}] = E[\tilde{f}^2] - E[\tilde{f}]^2$$

$$E[\tilde{f}^2] = \int \left(\frac{f(x)}{p(x)}\right)^2 p(x)\,dx
= \int f(x)\,\frac{f(x)}{p(x)}\,dx
= E[f]\int f(x)\,dx
= E[f]^2$$

$$V[\tilde{f}] = E[f]^2 - E[f]^2 = 0$$

Zero variance!

Gotcha? The ideal density $p(x) = f(x)/E[f]$ requires knowing $E[f]$, which is exactly the integral we are trying to estimate.
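Even without the ideal density, a $p$ that is roughly proportional to $f$ already reduces variance. A small numerical sketch (the integrand $f(x)=3x^2$, the density $p(x)=2x$, and all names are my illustration, not from the slides):

```python
import math
import random

def f(x):
    return 3.0 * x * x  # integrates to exactly 1 on [0,1]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((v - m) ** 2 for v in xs) / (len(xs) - 1)

rng = random.Random(0)
N = 20000

# Uniform sampling: Y = f(X), X ~ Uniform[0,1)
uniform_y = [f(rng.random()) for _ in range(N)]

# Importance sampling with p(x) = 2x, roughly proportional to f.
# Inverse-CDF sampling: X = sqrt(U); estimator Y = f(X) / p(X).
importance_y = []
for _ in range(N):
    u = 1.0 - rng.random()  # in (0, 1], avoids x == 0
    x = math.sqrt(u)
    importance_y.append(f(x) / (2.0 * x))
```

Both estimators have expectation 1, but the importance-sampled one has much lower variance (analytically 0.125 vs. 0.8 here).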
Importance Sampling: Area
Solid Angle
100 shadow rays
Area
100 shadow rays
Comparing Different Techniques
Efficiency measure
Comparing two sampling techniques A and B
If A has twice the variance of B, then it takes twice as many samples from A to achieve the same variance as B
If A has twice the cost of B, then it takes twice as much time to reduce the variance using A as using B
The product of variance and cost is a constant independent of the number of samples
Recall: Variance goes as 1/N, time goes as C*N
$$\text{Efficiency} \propto \frac{1}{\text{Variance} \times \text{Cost}}$$
Jittered = Stratified Sampling
Allocate samples per region; estimate each region separately:

$$F_N = \frac{1}{N}\sum_{i=1}^{N} F_i$$

New variance:

$$V[F_N] = \frac{1}{N^2}\sum_{i=1}^{N} V[F_i]$$

If the variance in each region is the same, then the total variance goes as $1/N$.
Stratified Sampling
Sample a polygon. If the variance in some regions is smaller, the overall variance is reduced. In this example, the variance in the regions entirely inside or entirely outside the polygon is 0. The variance in the regions along the edge is greater, but only about $\sqrt{N}$ of the $N$ regions touch an edge:

$$V[F_N] = \frac{1}{N^2}\sum_{j} V[F_j] = \frac{V[F_E]}{N^{1.5}}$$
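The effect shows up in a direct experiment (an illustrative sketch; the quarter-disk test shape, grid size, and names are mine): stratified sampling of an indicator function beats uniform sampling at equal sample counts.

```python
import random

def inside(x, y):
    # indicator of the quarter disk x^2 + y^2 <= 1 (area pi/4)
    return 1.0 if x * x + y * y <= 1.0 else 0.0

def uniform_est(n, rng):
    m = n * n
    return sum(inside(rng.random(), rng.random()) for _ in range(m)) / m

def stratified_est(n, rng):
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += inside((i + rng.random()) / n,
                            (j + rng.random()) / n)
    return total / (n * n)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((v - m) ** 2 for v in xs) / (len(xs) - 1)

rng = random.Random(2)
u = [uniform_est(8, rng) for _ in range(300)]
s = [stratified_est(8, rng) for _ in range(300)]
```

Only the strata crossed by the circular edge contribute variance; interior and exterior strata are constant, exactly as the argument above predicts.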
Sampling a Circle
$$\theta = 2\pi\,U_1, \qquad r = \sqrt{U_2}$$

Equi-areal
Shirley’s Mapping: Better Strata
$$r = U_1, \qquad \theta = \frac{\pi}{4}\,\frac{U_2}{U_1}$$
Block Design
a b c d
b c d a
c d a b
d a b c
Alphabet of size n
Each symbol appears exactly once in each row and column
Rows and columns are stratified
Latin Square
Incomplete Block Design
a · · ·
· · · a
· a · ·
· · a ·
N-Rook Pattern
Replace n² samples with n samples

Permutations:

$$(\pi_1(i),\ \pi_2(i),\ \ldots,\ \pi_d(i))$$

$$(\pi_x = \{1,2,3,4\},\quad \pi_y = \{4,2,3,1\})$$

Generalizations: N-queens
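An N-rook (Latin hypercube) pattern in 2D needs only one random permutation; this sketch (names are mine) jitters each sample inside its cell:

```python
import random

def n_rook(n, rng=None):
    """n jittered samples in [0,1)^2 with exactly one sample in
    each of the n rows and n columns of the grid."""
    rng = rng or random.Random(0)
    perm = list(range(n))
    rng.shuffle(perm)  # the column permutation pi_y
    return [((i + rng.random()) / n, (perm[i] + rng.random()) / n)
            for i in range(n)]
```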
Path Tracing
4 eye rays per pixel 16 shadow rays per eye ray
64 eye rays per pixel 1 shadow ray per eye ray
Complete Incomplete
Space-Time Patterns
Distribute samples in time
■ Complete in space
■ Incomplete in time
■ Decorrelate space and time
■Nearby samples in space should differ greatly in time
15  4 10  1
 3 13  6  8
 0 11  5 14
12  2  9  7

Pan-diagonal Magic Square (each row and column sums to 30)

Cook Pattern
High-dimensional Sampling
Complete set of samples: $N = n^d$

Random sampling error (standard deviation):

$$E \sim V^{1/2} \sim \frac{1}{N^{1/2}}$$

Numerical integration error:

$$E \sim \frac{1}{n} = \frac{1}{N^{1/d}}$$

In high-dimensional space, for the same error, Monte Carlo integration requires fewer samples than numerical integration
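The $1/N^{1/2}$ rate is easy to observe (a sketch with my own test integrand and sample counts): quadrupling the sample count should roughly halve the RMS error.

```python
import math
import random

def rms_error(n, trials, rng):
    """RMS error of the n-sample uniform MC estimate of the
    integral of x^2 over [0,1) (true value 1/3)."""
    total = 0.0
    for _ in range(trials):
        est = sum(rng.random() ** 2 for _ in range(n)) / n
        total += (est - 1.0 / 3.0) ** 2
    return math.sqrt(total / trials)

rng = random.Random(3)
e_small = rms_error(100, 400, rng)
e_large = rms_error(400, 400, rng)
# 4x the samples should roughly halve the RMS error (1/sqrt(N))
```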
1. Numerical integration
■ Quadrature/Integration rules
■ Efficient for smooth functions
2. Statistical sampling (Monte Carlo integration)
■ Unbiased estimate of integral
■ Variance reduction techniques
■ High dimensional sampling: error ~ 1/N^{1/2}
3. Signal processing
■ Sampling and reconstruction
■ Aliasing and antialiasing
■ Blue noise is good
4. Quasi Monte Carlo
■ Bound error using discrepancy
■ Asymptotic efficiency in high dimensions
Views of Sampling