Random Number Generation: Old and New
WPI2019 @ Univ. Hong Kong, August 2019
Shun Watanabe
joint work with Te Sun Han
What is Random Number Generation (RNG)?

coin process: \mathbf{X} = \{X^m = (X_1, \ldots, X_m)\}_{m=1}^{\infty}
target process: \mathbf{Y} = \{Y^n = (Y_1, \ldots, Y_n)\}_{n=1}^{\infty}

We simulate the target process exactly, using the output of the coin process.
von Neumann’s Algorithm

X is i.i.d. Bernoulli: \Pr(X_i = 0) = p, \Pr(X_i = 1) = 1 - p, with p \neq 0, 1/2, 1.

1: Set i = 1.
2: If X_{2i-1} = 0, X_{2i} = 1, output 0;
   if X_{2i-1} = 1, X_{2i} = 0, output 1;
   else, set i = i + 1 and repeat Step 2.

Y is i.i.d. unbiased Bernoulli.
von Neumann’s Algorithm (cont’d)

T: stopping time (the number of coin tosses until the algorithm terminates).
T/2 follows a geometric distribution with parameter 2p(1 - p), so

E[T] = \frac{1}{p(1 - p)}
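The two slides above can be sketched in Python (a minimal illustration; the bias p = 0.3, the seed, and the sample size are arbitrary demo choices):

```python
import random

def von_neumann(coin):
    """Extract one unbiased bit: toss in pairs, map 01 -> 0 and 10 -> 1,
    and discard equal pairs.  Returns (bit, number of tosses used)."""
    t = 0
    while True:
        x1, x2 = coin(), coin()
        t += 2
        if x1 != x2:
            return x1, t  # 01 gives 0 and 10 gives 1, i.e. the first toss

random.seed(0)
p = 0.3  # Pr(X_i = 0); any p other than 0, 1/2, 1 works
coin = lambda: 0 if random.random() < p else 1

trials = [von_neumann(coin) for _ in range(20000)]
bits = [b for b, _ in trials]
mean_t = sum(t for _, t in trials) / len(trials)
print(sum(bits) / len(bits))  # close to 1/2
print(mean_t)                 # close to 1/(p*(1-p))
```

On 20000 extractions the empirical bit frequency sits near 1/2 and the average toss count near 1/(p(1-p)), matching the geometric-stopping analysis.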
Generalization by Elias

von Neumann’s algorithm was extended in various directions
(e.g., Samuelson ’68, Hoeffding-Simons ’70, Elias ’72, Peres ’92).

Let’s simulate n unbiased bits Y^n simultaneously.

Set m = \left\lceil n \left( \frac{1}{H(X)} + \delta \right) \right\rceil and let \{0, 1\}^m = \bigcup_{k=0}^{m} T_k, where T_k := \{x^m : w_H(x^m) = k\}.
For each k \in G := \{k' : |T_{k'}| \ge 2^n\}, take the largest subset C_k \subseteq T_k with |C_k| = c_k 2^n.
Let \varphi_k : C_k \to \{0, 1\}^n be a “balanced” assignment.
Upon observing X^m = (X_1, \ldots, X_m) \in C_k for some k \in G, output \varphi_k(X^m); otherwise, go to the next block.

T/m follows a geometric distribution with parameter \Pr\left( X^m \in \bigcup_{k \in G} C_k \right), so
\frac{1}{n} E[T] = \frac{m/n}{\Pr\left( X^m \in \bigcup_{k \in G} C_k \right)} \to \frac{1}{H(X)} \quad (n \to \infty, \delta \to 0)
which is optimal.
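A small Python sketch of the construction (my own toy instantiation: block length m = 8, output length n = 3, coin bias 0.7, and a balanced assignment realized by indexing within each weight class modulo 2^n):

```python
import itertools
import random

def elias_tables(m, n):
    """For each Hamming weight k with |T_k| >= 2**n, map the first
    c_k * 2**n sequences of T_k to n-bit values so that every value gets
    exactly c_k preimages (a balanced assignment phi_k)."""
    by_weight = {}
    for seq in itertools.product((0, 1), repeat=m):
        by_weight.setdefault(sum(seq), []).append(seq)
    tables = {}
    for k, seqs in by_weight.items():
        c_k = len(seqs) // 2 ** n
        if c_k > 0:  # k is in G
            tables[k] = {seq: i % 2 ** n
                         for i, seq in enumerate(seqs[: c_k * 2 ** n])}
    return tables

def elias_generate(coin, m, n, tables):
    """Observe m-blocks until one falls in some C_k; output n unbiased
    bits, encoded as an integer in [0, 2**n)."""
    while True:
        block = tuple(coin() for _ in range(m))
        table = tables.get(sum(block))
        if table is not None and block in table:
            return table[block]

random.seed(1)
coin = lambda: 1 if random.random() < 0.7 else 0
m, n = 8, 3
tables = elias_tables(m, n)
samples = [elias_generate(coin, m, n, tables) for _ in range(24000)]
freqs = [samples.count(v) / len(samples) for v in range(2 ** n)]
print(freqs)  # each close to 1/8
```

All sequences in a weight class T_k are equiprobable under an i.i.d. coin, so the balanced assignment makes the output exactly uniform conditioned on acceptance.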
Knuth-Yao’s Algorithm

X is i.i.d. unbiased Bernoulli.

The case with a dyadic target distribution, e.g., P_Y = (1/2, 1/4, 1/8, 1/8).
Use the Huffman code tree: symbol 1 at depth 1, symbol 2 at depth 2, symbols 3 and 4 at depth 3; flip fair coins and walk down the tree until a leaf is reached.

E[T] = H(Y), since \ell(\varphi^{-1}(y)) = \log \frac{1}{P_Y(y)}
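For the dyadic example above, the tree walk is only a few lines of Python (the leaf dictionary encodes the Huffman tree for P_Y = (1/2, 1/4, 1/8, 1/8); seed and sample size are demo choices):

```python
import random

# Leaves of the Huffman tree for P_Y = (1/2, 1/4, 1/8, 1/8):
# symbol 1 at depth 1, symbol 2 at depth 2, symbols 3 and 4 at depth 3.
LEAVES = {"0": 1, "10": 2, "110": 3, "111": 4}

def knuth_yao_dyadic(coin):
    """Flip fair coins, descending the tree until a leaf is reached;
    return the symbol and the number of flips used."""
    path = ""
    while path not in LEAVES:
        path += str(coin())
    return LEAVES[path], len(path)

random.seed(2)
coin = lambda: random.getrandbits(1)
draws = [knuth_yao_dyadic(coin) for _ in range(40000)]
freq = {y: sum(1 for s, _ in draws if s == y) / len(draws)
        for y in (1, 2, 3, 4)}
avg_flips = sum(t for _, t in draws) / len(draws)
print(freq)       # close to {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}
print(avg_flips)  # close to H(Y) = 7/4
```

The average number of flips matches E[T] = H(Y) = 7/4, since each leaf for y sits at depth log(1/P_Y(y)).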
Knuth-Yao’s Algorithm (cont’d)

General case, e.g., P_Y = (2/3, 1/3). Consider the binary expansions

\frac{2}{3} = \frac{1}{2} + \frac{1}{2^3} + \frac{1}{2^5} + \cdots
\frac{1}{3} = \frac{1}{2^2} + \frac{1}{2^4} + \frac{1}{2^6} + \cdots

Then, use the (now infinite) Huffman code tree: each term 2^{-d} in the expansion of P_Y(y) contributes a leaf for y at depth d.

Theorem (Knuth-Yao)
The Knuth-Yao algorithm satisfies
E[T] \le H(Y) + 2
Any RNG algorithm must satisfy
E[T] \ge H(Y)
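The general case can be sketched by walking the infinite tree directly from the binary digits of the target probabilities; in this sketch `bits(y, d)` (my own naming) returns the d-th binary digit of P_Y(y), written out by hand for P_Y = (2/3, 1/3):

```python
import random

def knuth_yao(bits, num_symbols, coin):
    """Walk the infinite tree: at depth d, symbol y owns bits(y, d) of the
    current leaves; the remaining nodes branch one level further."""
    k, d = 0, 0  # k = index of the current node among depth-d nodes
    while True:
        k = 2 * k + coin()  # one fair flip descends one level
        d += 1
        for y in range(num_symbols):
            b = bits(y, d)
            if k < b:       # landed on a leaf belonging to y
                return y
            k -= b          # skip past y's leaves at this depth

# P_Y = (2/3, 1/3): 2/3 = .101010..._2 and 1/3 = .010101..._2
bits = lambda y, d: int((d % 2 == 1) == (y == 0))

random.seed(3)
coin = lambda: random.getrandbits(1)
samples = [knuth_yao(bits, 2, coin) for _ in range(30000)]
print(samples.count(0) / len(samples))  # close to 2/3
```

Symbol y is returned with probability \sum_d bits(y, d) 2^{-d} = P_Y(y): for symbol 0 this is 1/2 + 1/8 + 1/32 + \cdots = 2/3.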
RNG from Biased Coin to Biased Target

Roche ’91: an algorithm based on arithmetic coding
Abrahams ’96: an extension of the Knuth-Yao algorithm
Han-Hoshi ’97: the “interval algorithm”
etc.
A Converse Bound

Proposition (Han-Hoshi)
When the coin process is i.i.d., any RNG algorithm simulating Y^n exactly must satisfy
E[T] \ge \frac{H(Y^n)}{H(X)}

Proof sketch) Let Z be the random variable describing the leaf, in the tree of coin sequences, at which the algorithm stops. For an i.i.d. coin process,
H(Z) = E[T] \cdot H(X)
Since Y^n is a function of Z,
H(Y^n) \le H(Z) = E[T] \cdot H(X)
Interval Algorithm

Basic Idea
For a sequence x^m \in \mathcal{X}^m, assign I_{x^m} = [\alpha_{x^m}, \overline{\alpha}_{x^m}) \subseteq [0, 1) with |I_{x^m}| = P_{X^m}(x^m).
For a sequence y^n \in \mathcal{Y}^n, assign J_{y^n} = [\beta_{y^n}, \overline{\beta}_{y^n}) \subseteq [0, 1) with |J_{y^n}| = P_{Y^n}(y^n).
Upon observing x^m, if I_{x^m} \subseteq J_{y^n} for some y^n, output y^n.

More precisely, \alpha_s = \beta_t = 0 and \overline{\alpha}_s = \overline{\beta}_t = 1 for the empty strings s = t = \lambda, and for s \in \mathcal{X}^i, x \in \mathcal{X},
\alpha_{sx} := \alpha_s + (\overline{\alpha}_s - \alpha_s) P_{(x-1)|s}
\overline{\alpha}_{sx} := \alpha_s + (\overline{\alpha}_s - \alpha_s) P_{x|s}
where P_{x|s} := \sum_{k=1}^{x} P_{X_{i+1}|X^i}(k|s); \beta_t and \overline{\beta}_t are defined similarly from P_{Y^n}.
Interval Algorithm (cont’d)

1: (Initialization) Set s = t = \lambda, i = 0, and j = 1.
2: If [\alpha_s, \overline{\alpha}_s) \subseteq [\beta_{ty}, \overline{\beta}_{ty}) for some y \in \mathcal{Y}, then output y_j = y and go to Step 3; otherwise, set i = i + 1, s = sx_i, and repeat Step 2.
3: If j = n, terminate; otherwise, set t = ty_j, j = j + 1, and go to Step 2.
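A Python sketch of these steps, restricted for simplicity to i.i.d. coin and target processes (the distributions below are demo choices; exact rational arithmetic via `fractions.Fraction` keeps the nested-interval test precise):

```python
import random
from fractions import Fraction

def interval_algorithm(coin_probs, target_probs, n, coin):
    """Generate n target symbols from coin symbols (Han-Hoshi sketch).

    Shrink the coin interval I_s as coin symbols arrive; whenever I_s fits
    inside a candidate target subinterval J_{ty}, emit y and descend."""
    a, a_hi = Fraction(0), Fraction(1)  # I_s = [a, a_hi)
    b, b_hi = Fraction(0), Fraction(1)  # J_t = [b, b_hi)
    out = []
    while len(out) < n:
        emitted = False
        cum = Fraction(0)
        for y, q in enumerate(target_probs):  # candidate subintervals J_{ty}
            lo = b + (b_hi - b) * cum
            hi = b + (b_hi - b) * (cum + q)
            if lo <= a and a_hi <= hi:        # I_s lies inside J_{ty}
                out.append(y)
                b, b_hi = lo, hi
                emitted = True
                break
            cum += q
        if not emitted:                        # read one more coin symbol
            x = coin()
            cum = sum(coin_probs[:x], Fraction(0))
            width = a_hi - a
            a, a_hi = a + width * cum, a + width * (cum + coin_probs[x])
    return out

random.seed(4)
coin_probs = [Fraction(1, 4), Fraction(3, 4)]
target_probs = [Fraction(1, 3), Fraction(2, 3)]
coin = lambda: 0 if random.random() < 1 / 4 else 1

first = [interval_algorithm(coin_probs, target_probs, 1, coin)[0]
         for _ in range(15000)]
print(first.count(0) / len(first))  # close to 1/3
```

Because interval lengths equal probabilities and the nesting is exact, the output distribution is exactly P_{Y^n}; the frequency check above is only a sanity test.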
Interval Algorithm (cont’d)

Example) The coin \{X_m\}_{m=1}^{\infty} is a Markov chain; the target \{Y^n\}_{n=1}^{\infty} is i.i.d. with P_Y = (1/3, 2/3).
[Figure: the nested coin intervals I_1, I_2, I_{11}, I_{12}, I_{21}, I_{22}, I_{111}, I_{112}, I_{211}, I_{212}, \ldots and target intervals J_1, J_2, J_{11}, J_{12}, J_{21}, J_{22} on [0, 1), with breakpoints such as 2/9, 1/3, 1/2, 5/9, 2/3, 11/18.]
Interval Algorithm (cont’d)

The algorithm itself is quite simple, but the performance analysis is not straightforward…

Theorem (Han-Hoshi ’97)
When the coin process is i.i.d., the stopping time of the interval algorithm satisfies
E[T] \le \frac{H(Y^n)}{H(X)} + \frac{\log(2|\mathcal{Y}| - 1)}{H(X)} + \frac{H(X)}{(1 - p_{\max}) H(X)}
where p_{\max} = \max_{x \in \mathcal{X}} P_X(x).

Asymptotically,
\limsup_{n \to \infty} \frac{1}{n} E[T] \le \frac{\overline{H}(Y)}{H(X)}
where \overline{H}(Y) = \limsup_{n \to \infty} \frac{1}{n} H(Y^n) is the sup entropy rate; this rate is optimal.
Work on the Interval Algorithm

Oohama ’11: refined analysis for i.i.d. coin processes
Uyematsu-Kanaya ’00: analysis for ergodic coin/target processes
Uyematsu-Kanaya ’99: large deviation analysis for i.i.d. coin/target processes
W.-Han ’19: analysis via the information spectrum approach
• analysis for general coin/target processes
• optimality of the interval algorithm among all RNG algorithms for a wide class of general coin/target processes
Information Spectrum Approach: Brief Review

When \{Z^n\}_{n=1}^{\infty} is ergodic, the AEP states
\Pr\left( \left| \frac{1}{n} \log \frac{1}{P_{Z^n}(Z^n)} - H(Z) \right| \le \delta \right) \to 1
i.e., the spectrum of \frac{1}{n} \log \frac{1}{P_{Z^n}(Z^n)} concentrates at H(Z).

In general, the spectrum is spreading (e.g., for a reducible Markov chain).
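A numerical illustration of a spreading spectrum, using a toy source of my own choosing (a fair mixture of two i.i.d. binary sources with biases 0.1 and 0.4, i.e., the simplest reducible chain): the normalized self-information clusters around the two component entropies rather than a single point.

```python
import math
import random

def bin_entropy(p):
    """Binary entropy in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def normalized_self_info(n):
    """Sample Z^n from the mixture (fair switch, then i.i.d. bits with
    bias 0.1 or 0.4) and return (1/n) log2 1/P(Z^n)."""
    p = random.choice([0.1, 0.4])
    k = sum(random.random() < p for _ in range(n))  # number of ones
    # P(z^n) depends only on the number of ones k:
    prob = 0.5 * (0.1 ** k * 0.9 ** (n - k)) + 0.5 * (0.4 ** k * 0.6 ** (n - k))
    return -math.log2(prob) / n

random.seed(5)
vals = [normalized_self_info(400) for _ in range(600)]
low = [v for v in vals if v < 0.7]
high = [v for v in vals if v >= 0.7]
# Two clusters instead of one point: near h(0.1) and h(0.4)
print(len(low), sum(low) / len(low))
print(len(high), sum(high) / len(high))
```

Roughly half the samples land near h(0.1) \approx 0.47 and half near h(0.4) \approx 0.97, so no single H(Z) captures the spectrum.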
Information Spectrum Approach (cont’d)

To handle a spreading spectrum, it is more convenient to define “typical sets” by
S_n(\gamma) := \left\{ z^n : \log \frac{1}{P_{Z^n}(z^n)} \ge \gamma \right\}
T_n(\gamma) := \left\{ z^n : \log \frac{1}{P_{Z^n}(z^n)} \le \gamma \right\}

If we define
\underline{H}(Z) := \sup\left\{ a : \lim_{n \to \infty} \Pr\left( \frac{1}{n} \log \frac{1}{P_{Z^n}(Z^n)} \le a \right) = 0 \right\} (spectral inf-entropy)
\overline{H}(Z) := \inf\left\{ a : \lim_{n \to \infty} \Pr\left( \frac{1}{n} \log \frac{1}{P_{Z^n}(Z^n)} \ge a \right) = 0 \right\} (spectral sup-entropy)
then
\gamma = n(\underline{H}(Z) - \delta) \Longrightarrow P_{Z^n}(S_n(\gamma)) \to 1
\gamma = n(\overline{H}(Z) + \delta) \Longrightarrow P_{Z^n}(T_n(\gamma)) \to 1
Information Spectral Analysis of the Interval Algorithm
Theorem (W.-Han ’19)
For the interval algorithm, the overflow probability of the stopping time satisfies
\Pr(T > m) \le P_{X^m}(S_m^c(\gamma)) + P_{Y^n}(T_n^c(\tau)) + 2^{-\gamma + \tau + 1}
where
S_m(\gamma) := \left\{ x^m \in \mathcal{X}^m : \log \frac{1}{P_{X^m}(x^m)} \ge \gamma \right\}
T_n(\tau) := \left\{ y^n \in \mathcal{Y}^n : \log \frac{1}{P_{Y^n}(y^n)} \le \tau \right\}
If we set \gamma \simeq m \underline{H}(X), \tau \simeq n \overline{H}(Y), and m \simeq n \frac{\overline{H}(Y)}{\underline{H}(X)}, then \Pr(T > m) \to 0 (n \to \infty).
Proof Sketch

\Pr(T > m) \le P_{X^m}(S_m^c(\gamma)) + P_{Y^n}(T_n^c(\tau)) + 2^{-\gamma + \tau + 1}
S_m(\gamma) := \left\{ x^m \in \mathcal{X}^m : \log \frac{1}{P_{X^m}(x^m)} \ge \gamma \right\}, \quad T_n(\tau) := \left\{ y^n \in \mathcal{Y}^n : \log \frac{1}{P_{Y^n}(y^n)} \le \tau \right\}

When x^m is observed, the interval algorithm stops iff I_{x^m} \subseteq J_{y^n} for some y^n \in \mathcal{Y}^n, which requires P_{X^m}(x^m) \le P_{Y^n}(y^n).
Thus a small P_{X^m}(x^m) and a large P_{Y^n}(y^n) are favorable:
x^m \in S_m(\gamma) \Longrightarrow P_{X^m}(x^m) \le 2^{-\gamma} (good)
y^n \in T_n(\tau) \Longrightarrow P_{Y^n}(y^n) \ge 2^{-\tau} (good)
The complements S_m^c(\gamma) and T_n^c(\tau) are handled as exceptions (the bad events).
A Converse Bound

Theorem (W.-Han ’19)
For any RNG algorithm, the overflow probability of the stopping time satisfies
\Pr(T > m) \ge P_{Y^n}(T_n^c(\tau)) - P_{X^m}(S_m(\gamma)) - 2^{-\tau + \gamma}
\Pr(T > m) \ge P_{X^m}(S_m^c(\gamma)) - P_{Y^n}(T_n(\tau)) - 2^{-\tau + \gamma}

Set \gamma \simeq m \overline{H}(X), \tau \simeq n \overline{H}(Y), and m = nR. Then \Pr(T > m) \to 0 only if
R \ge \frac{\overline{H}(Y)}{\overline{H}(X)}
Similarly,
R \ge \frac{\underline{H}(Y)}{\underline{H}(X)}
Optimality of Interval Algorithm

If either the coin or the target process has a one-point spectrum, i.e.,
\underline{H}(X) = \overline{H}(X) = H(X) \quad \text{or} \quad \underline{H}(Y) = \overline{H}(Y) = H(Y)
then \Pr(T > nR) \to 0 iff
R \ge \frac{\overline{H}(Y)}{H(X)} \quad \text{or} \quad R \ge \frac{H(Y)}{\underline{H}(X)}
respectively. Furthermore, this rate is attained by the interval algorithm.
Average Stopping Time

By using
E[T] = \int_0^{\infty} \Pr(T > z)\, dz \lesssim \int_0^{\infty} \Pr\left( \frac{1}{\underline{H}(X)} \log \frac{1}{P_{Y^n}(Y^n)} > z \right) dz = \frac{H(Y^n)}{\underline{H}(X)}

Corollary (W.-Han ’19)
Under some regularity condition, the interval algorithm satisfies
\limsup_{n \to \infty} \frac{1}{n} E[T] \le \frac{\overline{H}(Y)}{\underline{H}(X)}
Any RNG algorithm satisfies
\limsup_{n \to \infty} \frac{1}{n} E[T] \ge \frac{\overline{H}(Y)}{\overline{H}(X)}
If the coin process has a one-point spectrum, the interval algorithm is optimal.
Example

Let \mathbf{X} = \{X_m\}_{m=1}^{\infty} be a Markov chain induced by an irreducible transition matrix W(x|x'). For the stationary distribution \pi, let
H_W(X) = \sum_{x, x'} \pi(x') W(x|x') \log \frac{1}{W(x|x')}
Then \underline{H}(X) = \overline{H}(X) = H(X) = H_W(X).

Let \mathbf{Y} = \{Y^n\}_{n=1}^{\infty} be a Markov chain induced by a reducible V(y|y'), but assume that there is no transient class; then
V = \bigoplus_{\xi=1}^{r} V_\xi, where each V_\xi is irreducible.
For the weights w(\xi) induced by an initial distribution,
\overline{H}(Y) = \max\{ H_{V_\xi}(Y) : 1 \le \xi \le r,\ w(\xi) > 0 \} (spectral sup-entropy)
\limsup_{n \to \infty} \frac{1}{n} H(Y^n) = \sum_{\xi=1}^{r} w(\xi) H_{V_\xi}(Y) (sup entropy rate)
Example (cont’d)

The interval algorithm satisfies
\limsup_{n \to \infty} \frac{1}{n} E[T] \le \frac{1}{H_W(X)} \sum_{\xi=1}^{r} w(\xi) H_{V_\xi}(Y)
and \Pr(T > nR) \to 0 if
R \ge \frac{1}{H_W(X)} \max\{ H_{V_\xi}(Y) : 1 \le \xi \le r,\ w(\xi) > 0 \}
These performances are optimal among all RNG algorithms.
Isomorphism Problem

In ergodic theory, a basic problem is to show whether a (two-sided) random process
\mathbf{X} = (\ldots, X_{-1}, X_0, X_1, \ldots)
is isomorphic to another random process
\mathbf{Y} = (\ldots, Y_{-1}, Y_0, Y_1, \ldots).

S : \mathcal{X}^{\mathbb{Z}} \to \mathcal{X}^{\mathbb{Z}} is the shift operator: for x = (\ldots, x_{-1}, x_0, x_1, \ldots), (Sx)_i = x_{i+1}.

Definition
A measurable map \phi from \mathcal{X}^{\mathbb{Z}} to \mathcal{Y}^{\mathbb{Z}} is termed a homomorphism from (\mathcal{X}^{\mathbb{Z}}, \mathcal{B}_X, \mu, S) to (\mathcal{Y}^{\mathbb{Z}}, \mathcal{B}_Y, \nu, S) if \nu(A) = \mu(\phi^{-1}(A)) for A \in \mathcal{B}_Y and \phi(Sx) = S\phi(x) for \mu-a.e. x. Furthermore, if \phi is invertible for \mu-a.e. x, then it is termed an isomorphism.
Isomorphism Problem (cont’d)

Consider i.i.d. processes \mathbf{X} and \mathbf{Y}, which are termed Bernoulli shifts in ergodic theory.

Theorem (Ornstein ’70)
For Bernoulli shifts, an isomorphism exists iff H(X) = H(Y).

In the 1970s, the isomorphism problem was actively studied in the IT community from the viewpoint of source coding (e.g., Gray, Neuhoff).
The connection between the isomorphism problem and the RNG problem does not seem to be well understood, but there is some work (e.g., Harvey-Holroyd-Peres-Romik ’07).
Thank you very much.