
Capacity Achieving Codes: There and Back Again

Henry D. Pfister

Electrical and Computer Engineering
Information Initiative (iiD)

Duke University

2016 European School of Information Theory
Chalmers University, Gothenburg, Sweden

April 6th, 2016

pfister.ee.duke.edu/talks/gothenburg16.pdf


Capacity Achieving Codes: There and Back Again 2 / 65

Acknowledgments

- Thanks to my coauthors involved in this work
  - Krishna Narayanan
  - Phong Nguyen
  - Arvind Yedla
  - Yung-Yih Jian
  - Santhosh Kumar
  - Shrinivas Kudekar, Marco Mondelli, Eren Sasoglu, Ruediger Urbanke
- Thanks to the organizers!
  - Alexandre Graell i Amat
  - Fredrik Brannstrom
  - Giuseppe Durisi


Capacity Achieving Codes: There and Back Again 3 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation


Capacity Achieving Codes: There and Back Again 5 / 65

Capacity of Point-to-Point Communication

[Figure: block diagram X → P_{Y|X} → Y]

- Coding for Discrete-Time Memoryless Channels
  - Transition probability: P_{Y|X}(y|x) for x ∈ X and y ∈ Y
  - Transmit a length-n codeword x ∈ C ⊂ X^n
  - Decode to the most likely codeword given the received y
- Channel Capacity introduced by Shannon in 1948
  - Random code of rate R ≜ (1/n) log₂ |C| (bits per channel use)
  - As n → ∞, reliable transmission is possible if R < C, with
    C ≜ max_{p(x)} I(X;Y)

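The definition C ≜ max_{p(x)} I(X;Y) can also be evaluated numerically for any discrete memoryless channel. Below is a minimal sketch (not from the talk) of the standard Blahut-Arimoto iteration; the channel-matrix convention P[x, y] = P_{Y|X}(y|x), the tolerance, and the BEC(0.3) test case are illustrative choices. Applied to the BEC of the next slide, it returns 1 − ε, as expected.

```python
import numpy as np

def _kl_rows(P, q):
    """D(x) = sum_y P(y|x) log(P(y|x)/q(y)) in nats, with 0 log 0 = 0."""
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(P > 0, np.log(P / q), 0.0)
    return (P * log_ratio).sum(axis=1)

def blahut_arimoto(P, tol=1e-9, max_iter=10_000):
    """Capacity C = max_{p(x)} I(X;Y) in bits for channel matrix P[x, y] = P(y|x)."""
    p = np.full(P.shape[0], 1.0 / P.shape[0])   # start from the uniform input
    for _ in range(max_iter):
        D = _kl_rows(P, p @ P)                  # p @ P is the output distribution q(y)
        p_new = p * np.exp(D)                   # multiplicative Blahut-Arimoto update
        p_new /= p_new.sum()
        p, done = p_new, np.abs(p_new - p).max() < tol
        if done:
            break
    return float((p * _kl_rows(P, p @ P)).sum() / np.log(2))  # nats -> bits

# BEC(0.3) with output alphabet {0, ?, 1}: capacity should be 1 - 0.3 = 0.7
eps = 0.3
P = np.array([[1 - eps, eps, 0.0],
              [0.0,     eps, 1 - eps]])
print(blahut_arimoto(P))   # ~0.7
```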

Capacity Achieving Codes: There and Back Again 6 / 65

The Binary Erasure Channel (BEC)

[Figure: BEC transition diagram; 0 → 0 and 1 → 1 with probability 1−ε, and 0 → ? and 1 → ? with probability ε]

- Denoted BEC(ε) when the erasure probability is ε
- C = 1 − ε = expected fraction of bits not erased
- Coding with a binary linear code
  - Parity-check matrix H ∈ {0,1}^{m×n} with m = (1−R)n
  - Codebook C ≜ {x ∈ {0,1}^n | Hx = 0} has 2^{Rn} codewords
  - Let E denote the index set of erased positions, so that
    Hx = [H_E  H_{E^c}] [x_E ; y_{E^c}] = 0  ⇔  H_E x_E = −H_{E^c} y_{E^c}
  - Decoding fails iff H_E is singular (column-rank deficient) ⇔ a codeword exists with 1's only in E
  - One can achieve capacity by drawing H uniformly at random!

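A minimal sketch (not from the talk) of the decoding step above: recover the erased bits by solving H_E x_E = H_{E^c} y_{E^c} over GF(2), where the minus sign disappears in characteristic 2. Gaussian elimination is used for illustration; decoding fails exactly when H_E lacks full column rank.

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2); return x, or None when the solution
    is not unique (A is column-rank deficient) or does not exist."""
    A = (A % 2).astype(np.uint8)
    b = (b % 2).astype(np.uint8)
    m, n = A.shape
    row = 0
    for col in range(n):
        piv = next((r for r in range(row, m) if A[r, col]), None)
        if piv is None:
            return None                       # free variable -> many solutions
        A[[row, piv]] = A[[piv, row]]         # swap pivot row into place
        b[[row, piv]] = b[[piv, row]]
        for r in range(m):
            if r != row and A[r, col]:        # eliminate column above and below
                A[r] ^= A[row]
                b[r] ^= b[row]
        row += 1
    if b[row:].any():
        return None                           # inconsistent leftover equations
    return b[:n]                              # reduced system is I x = b[:n]

def bec_decode(H, y):
    """y is the BEC output with erasures marked by -1; returns a codeword or None."""
    E = (y == -1)
    rhs = H[:, ~E] @ (y[~E] % 2)              # H_{E^c} y_{E^c}; -H = H over GF(2)
    xE = gf2_solve(H[:, E], rhs)
    if xE is None:
        return None                           # decoding failure
    x = y.copy()
    x[E] = xE
    return x % 2
```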

Capacity Achieving Codes: There and Back Again 7 / 65

Some Early Milestones in Coding

- 1948: Shannon defines channel capacity and random codes
- 1950: Hamming formalizes linear codes and Hamming distance
- 1954: Reed-Muller codes (Muller gives the codes, Reed the decoder)
- 1955: Elias introduces the erasure channel and convolutional codes; also shows random parity-check codes achieve capacity on the BEC
- 1959: BCH codes (Hocquenghem '59 and Bose-Ray-Chaudhuri '60)
- 1960: Gallager introduces low-density parity-check (LDPC) codes and iterative decoding
- 1960: Reed-Solomon codes


Capacity Achieving Codes: There and Back Again 8 / 65

Achieving Capacity in Practice

But, more than 35 years passed before we could:

- Achieve capacity in practice
- Provably achieve capacity with deterministic constructions

Modern Milestones:

- 1993: Turbo codes (Berrou, Glavieux, Thitimajshima)
- 1995: Rediscovery of LDPC codes (MacKay-Neal, Spielman)
- 1997: Optimized irregular LDPC codes for the BEC (LMSSS)
- 2001: Optimized irregular LDPC codes for BMS channels (RSU)
- 2008: Polar codes: provable, low-complexity, deterministic (Arikan)
- 1999-2011: Understanding LDPC convolutional codes and coupling


Capacity Achieving Codes: There and Back Again 9 / 65

Key Tools That Made the Difference

- Factor Graph (FG)
  - Compact description of the joint distribution of a set of random variables
  - Natural setup for inference problems with partial observations
- Belief Propagation (BP)
  - Message-passing algorithm for inference on a FG
  - Probability estimates are passed along edges in the factor graph
  - Provides exact marginals if the factor graph is a tree
- Density Evolution (DE)
  - Tracks the distribution of messages passed by belief propagation
  - In some cases, allows rigorous analysis of BP-based inference
- EXtrinsic Information Transfer (EXIT) Curves

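Density evolution is especially simple on the BEC, where each message distribution reduces to a single erasure probability. A minimal sketch (assuming the textbook recursion x ← ε(1−(1−x)^(dc−1))^(dv−1) for the regular (dv, dc) LDPC ensemble, which the slide does not spell out) that locates the BP threshold by bisection:

```python
def bp_threshold(dv=3, dc=6, iters=2000, tol=1e-12):
    """BP threshold of the regular (dv, dc) LDPC ensemble on the BEC via
    the density-evolution recursion x <- eps * (1 - (1 - x)**(dc-1))**(dv-1)."""
    def de_converges(eps):
        x = eps
        for _ in range(iters):
            x = eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)
            if x < tol:
                return True          # erasure fraction -> 0: decoding succeeds
        return False
    lo, hi = 0.0, 1.0                # bisect for the largest eps where DE converges
    while hi - lo > 1e-9:
        mid = 0.5 * (lo + hi)
        if de_converges(mid):
            lo = mid
        else:
            hi = mid
    return lo

print(bp_threshold())   # ~0.429 for the (3,6) ensemble (BP threshold ~0.4294; Shannon limit 0.5)
```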

Capacity Achieving Codes: There and Back Again 10 / 65

Applications of These Tools

- Error-Correcting Codes
  - Random code defined by a random factor graph
  - Low-complexity decoding via belief propagation
  - Analysis of belief-propagation decoding via density evolution
  - Provides code constructions that provably achieve capacity!
- Boolean Satisfiability: K-SAT
  - Random instance of K-SAT defined by a random factor graph
  - Non-rigorous analysis via the cavity method
  - Predicted thresholds later proved exact!
- Compressed Sensing
  - Random measurement matrix defined by a random factor graph
  - Low-complexity reconstruction via message passing
  - Schemes provably achieve the information-theoretic limit!


Capacity Achieving Codes: There and Back Again 11 / 65

Polya's Dictum

If you can't solve a problem, then it probably contains an easier problem that you can't solve: find it.

- The solution of the simpler problem often provides insight that allows one to crack the harder problem.
- To achieve channel capacity in practice, we now know that a good "easy" problem would have been:
  - "Design a code that achieves capacity on the BEC and is encodable and decodable in quasi-linear time"



Capacity Achieving Codes: There and Back Again 12 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Capacity Achieving Codes: There and Back Again 13 / 65

Factor Graphs

- A factor graph provides a graphical representation of the local dependence structure for a set of random variables
- Bipartite graph with variables x1, ..., xn and factors f1, ..., fm
- Consider random variables (X1, X2, X3, X4) ∈ X⁴ and Y where:
  P(x1, x2, x3, x4) ≜ P(X1 = x1, X2 = x2, X3 = x3, X4 = x4 | Y = y)
                    ∝ f(x1, x2, x3, x4)
                    ≜ f1(x1, x2) f2(x2, x3) f3(x3, x4)
- Given Y = y, this describes a Markov chain whose factor graph is
  x1 - f1 - x2 - f2 - x3 - f3 - x4


Capacity Achieving Codes: There and Back Again 14 / 65

Conditional Independence for Factor Graphs

- Let A, B, S ⊂ [n] be disjoint subsets of VNs in factor graph G
- If S separates A from B (i.e., there is no path in G from A to B that avoids S), then we have X_A ⊥⊥ X_B | X_S:
  P(x_A, x_B | x_S) = P(x_A | x_S) P(x_B | x_S)
- Markov chain example: A = {x1, x2}, B = {x4}, S = {x3}
  x1 - f12 - x2 - f23 - x3 - f34 - x4
- Sketch of proof:
  - Fixing X_S = x_S separates the FG into disjoint components
  - Groups of VNs in different components are independent
  - X_A ⊥⊥ X_B because A and B are in different components

Capacity Achieving Codes: There and Back Again 14 / 65

Conditional Independence for Factor Graphs

I Let A,B, S ⊂ [n] be disjoint subsets of VNs in factor graph G

I If S separates A from B (i.e., there is no path in G from A toB that avoids S), then we have XA ⊥⊥ XB | XS

P (xA, xB |xS) = P (xA|xS)P (xB |xS)

I Markov chain example: A = {x1, x2}, B = {x4}, S = {x3}

x1 f12 x2 f23 x3 f34 x4

I Sketch of Proof:

I Fixing XS=xS separates the FG into disjoint components

I Groups of VNs in different components are independent

I XA ⊥⊥ XB because A and B are in different components

Page 48: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 14 / 65

Conditional Independence for Factor Graphs

I Let A,B, S ⊂ [n] be disjoint subsets of VNs in factor graph G

I If S separates A from B (i.e., there is no path in G from A toB that avoids S), then we have XA ⊥⊥ XB | XS

P (xA, xB |xS) = P (xA|xS)P (xB |xS)

I Markov chain example: A = {x1, x2}, B = {x4}, S = {x3}

x1 f12 x2 f23 x3 f34 x4

I Sketch of Proof:

I Fixing XS=xS separates the FG into disjoint components

I Groups of VNs in different components are independent

I XA ⊥⊥ XB because A and B are in different components

Page 49: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 14 / 65

Conditional Independence for Factor Graphs

I Let A,B, S ⊂ [n] be disjoint subsets of VNs in factor graph G

I If S separates A from B (i.e., there is no path in G from A toB that avoids S), then we have XA ⊥⊥ XB | XS

P (xA, xB |xS) = P (xA|xS)P (xB |xS)

I Markov chain example: A = {x1, x2}, B = {x4}, S = {x3}

x1 f12 x2 f ′23 f ′34 x4

I Sketch of Proof:

I Fixing XS=xS separates the FG into disjoint components

I Groups of VNs in different components are independent

I XA ⊥⊥ XB because A and B are in different components
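The separation claim is easy to sanity-check numerically. A minimal brute-force sketch (with illustrative random positive factors) verifying X_{1,2} ⊥⊥ X_4 | X_3 for the chain f1(x1,x2) f2(x2,x3) f3(x3,x4) of the previous slide:

```python
import numpy as np

q = 3                                   # |X|; small enough to enumerate
rng = np.random.default_rng(0)
f1, f2, f3 = rng.random((3, q, q))      # random positive pairwise factors

# joint P(x1,x2,x3,x4) proportional to f1(x1,x2) f2(x2,x3) f3(x3,x4)
P = f1[:, :, None, None] * f2[None, :, :, None] * f3[None, None, :, :]
P /= P.sum()

for x3 in range(q):
    cond = P[:, :, x3, :] / P[:, :, x3, :].sum()     # P(x1,x2,x4 | x3)
    pA = cond.sum(axis=2)                            # P(x1,x2 | x3)
    pB = cond.sum(axis=(0, 1))                       # P(x4 | x3)
    assert np.allclose(cond, pA[:, :, None] * pB)    # factorizes given x3
print("conditional independence verified by enumeration")
```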

Capacity Achieving Codes: There and Back Again 15 / 65

Inference via Marginalization

- Marginalizing out all variables except X1 gives
  P(X1 = x1 | Y = y) ∝ g1(x1) ≜ ∑_{(x2,x3,x4) ∈ X³} f(x1, x2, x3, x4)
- Thus, the maximum a posteriori decision for X1 given Y = y is
  x̂1 = argmax_{x1 ∈ X} ∑_{(x2,x3,x4) ∈ X³} f(x1, x2, x3, x4)
- For a general function, this requires roughly |X|⁴ operations
- Marginalization is efficient for tree-structured factor graphs
  - For the Markov chain, roughly 5|X|² operations are required:
    g1(x1) = ∑_{x2 ∈ X} f1(x1, x2) ∑_{x3 ∈ X} f2(x2, x3) ∑_{x4 ∈ X} f3(x3, x4)

Capacity Achieving Codes: There and Back Again 15 / 65

Inference via Marginalization

I Marginalizing out all variables except X1 gives

P(X1 = x1|Y = y) ∝ g1(x1) ,∑

(x2,...,x4)∈X 3

f(x1, x2, x3, x4)

I Thus, the maximum a posteriori decision for X1 given Y = y is

x1 = arg maxx1∈X

∑(x2,...,x4)∈X 3

f(x1, x2, x3, x4)

I For a general function, this requires roughly |X |4 operations

I Marginalization is efficient for tree-structured factor graphs

I For the Markov chain, roughly 5 |X |2 operations required

g1(x1) =∑x2∈X

f1(x1, x2)∑x3∈X

f2(x2, x3)∑x4∈X

f3(x3, x4)

Page 52: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 16 / 65

The Importance of Factorization (1)

I Consider a random vector (X1, X2, . . . , X6) ∈ X 6 where

P(X1 = x1, . . . , X6 = x6|Y = y) ∝ f(x1, x2, x3, x4, x5, x6)

I Brute force marginal requires |X |5 operations for each x1 ∈ X :

g1(x1) ,∑x62∈X 5

f(x1, x2, x3, x4, x5, x6)

I Thus, we need |X |6 operations

I If f factors as follows, then the marginalization can be simplified:

f(x1, x2, x3, x4, x5, x6) = f1(x1, x2, x3)f2(x1, x4, x6)f3(x4)f4(x4, x5)

Page 53: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 16 / 65

The Importance of Factorization (1)

I Consider a random vector (X1, X2, . . . , X6) ∈ X 6 where

P(X1 = x1, . . . , X6 = x6|Y = y) ∝ f(x1, x2, x3, x4, x5, x6)

I Brute force marginal requires |X |5 operations for each x1 ∈ X :

g1(x1) ,∑x62∈X 5

f(x1, x2, x3, x4, x5, x6)

I Thus, we need |X |6 operations

I If f factors as follows, then the marginalization can be simplified:

f(x1, x2, x3, x4, x5, x6) = f1(x1, x2, x3)f2(x1, x4, x6)f3(x4)f4(x4, x5)

Page 54: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 16 / 65

The Importance of Factorization (1)

I Consider a random vector (X1, X2, . . . , X6) ∈ X 6 where

P(X1 = x1, . . . , X6 = x6|Y = y) ∝ f(x1, x2, x3, x4, x5, x6)

I Brute force marginal requires |X |5 operations for each x1 ∈ X :

g1(x1) ,∑x62∈X 5

f(x1, x2, x3, x4, x5, x6)

I Thus, we need |X |6 operations

I If f factors as follows, then the marginalization can be simplified:

f(x1, x2, x3, x4, x5, x6) = f1(x1, x2, x3)f2(x1, x4, x6)f3(x4)f4(x4, x5)

Page 55: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 17 / 65

The Importance of Factorization (2)

For example, we can write g1(x1) as:

=∑x62

f1(x1, x2, x3)f2(x1, x4, x6)f3(x4)f4(x4, x5)

=∑x52

f1(x1, x2, x3)f3(x4)f4(x4, x5)

[∑x6

f2(x1, x4, x6)

]

=∑x42

f1(x1, x2, x3)f3(x4)

[∑x5

f4(x4, x5)

][∑x6

f2(x1, x4, x6)

]

=∑x32

f1(x1, x2, x3)

[∑x4

f3(x4)

[∑x5

f4(x4, x5)

][∑x6

f2(x1, x4, x6)

]]

=∑x2

[∑x3

f1(x1, x2, x3)

][∑x4

f3(x4)

[∑x5

f4(x4, x5)

][∑x6

f2(x1, x4, x6)

]]

This implementation requires roughly 2 |X |3 + 5 |X |2 operations

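The same bracketing, written out with arrays (illustrative random factors); each named intermediate matches one bracketed sum above, and np.einsum provides the |X|⁶ brute-force check:

```python
import numpy as np

q = 5                                   # |X|
rng = np.random.default_rng(2)
f1 = rng.random((q, q, q))              # f1(x1, x2, x3)
f2 = rng.random((q, q, q))              # f2(x1, x4, x6)
f3 = rng.random(q)                      # f3(x4)
f4 = rng.random((q, q))                 # f4(x4, x5)

s6 = f2.sum(axis=2)                     # [sum_{x6} f2] -> indexed by (x1, x4)
s5 = f4.sum(axis=1)                     # [sum_{x5} f4] -> indexed by x4
s4 = s6 @ (f3 * s5)                     # [sum_{x4} f3 * s5 * s6] -> indexed by x1
s23 = f1.sum(axis=(1, 2))               # sum_{x2} [sum_{x3} f1] -> indexed by x1
g1 = s23 * s4

g1_bruteforce = np.einsum('abc,ade,d,df->a', f1, f2, f3, f4)
assert np.allclose(g1, g1_bruteforce)
```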

Capacity Achieving Codes: There and Back Again 18 / 65

The Factor Graph and Leaf Removal

[Figure: factor graph with variables x1,...,x6 and factors f1(x1,x2,x3), f2(x1,x4,x6), f3(x4), f4(x4,x5); leaf nodes are removed one at a time]

g1(x1) = ∑_{x2,...,x5} f1(x1,x2,x3) f3(x4) f4(x4,x5) ∑_{x6} f2(x1,x4,x6)

Summing out the leaf x6 replaces f2 by f2′(x1,x4) ≜ ∑_{x6} f2(x1,x4,x6):

g1(x1) = ∑_{x2,x3,x4} f1(x1,x2,x3) f3(x4) [∑_{x5} f4(x4,x5)] f2′(x1,x4)

Summing out the leaf x5 replaces f4 by f4′(x4) ≜ ∑_{x5} f4(x4,x5):

g1(x1) = ∑_{x2,x3} f1(x1,x2,x3) [∑_{x4} f3(x4) f4′(x4) f2′(x1,x4)]

Summing out x4 merges f3, f4′, f2′ into f2′′(x1) ≜ ∑_{x4} f3(x4) f4′(x4) f2′(x1,x4):

g1(x1) = ∑_{x2} [∑_{x3} f1(x1,x2,x3)] f2′′(x1)

Summing out x3 gives f1′(x1,x2) ≜ ∑_{x3} f1(x1,x2,x3):

g1(x1) = [∑_{x2} f1′(x1,x2)] f2′′(x1)

Finally, summing out x2 leaves

g1(x1) = f1′′(x1) f2′′(x1)
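Leaf removal generalizes to eliminating any variable: multiply the factors that touch it and sum it out. A minimal generic sketch (the (array, variable-names) factor layout is an illustrative choice), peeling the example graph down to g1:

```python
import numpy as np

q = 4
rng = np.random.default_rng(3)
factors = [
    (rng.random((q, q, q)), ('x1', 'x2', 'x3')),    # f1
    (rng.random((q, q, q)), ('x1', 'x4', 'x6')),    # f2
    (rng.random(q),         ('x4',)),               # f3
    (rng.random((q, q)),    ('x4', 'x5')),          # f4
]

def eliminate(factors, var):
    """Multiply all factors containing `var`, then sum `var` out."""
    touching = [f for f in factors if var in f[1]]
    rest = [f for f in factors if var not in f[1]]
    names = sorted({v for _, scope in touching for v in scope})
    letter = {v: chr(ord('a') + k) for k, v in enumerate(names)}
    out = tuple(v for v in names if v != var)
    spec = (','.join(''.join(letter[v] for v in scope) for _, scope in touching)
            + '->' + ''.join(letter[v] for v in out))
    merged = np.einsum(spec, *[arr for arr, _ in touching])
    return rest + [(merged, out)]

for v in ('x6', 'x5', 'x4', 'x3', 'x2'):            # the peeling order above
    factors = eliminate(factors, v)
g1 = np.prod([arr for arr, _ in factors], axis=0)   # remaining factors all on x1
```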

Capacity Achieving Codes: There and Back Again 19 / 65

Constraint Satisfaction and Zero-One Factors

- A non-negative function f : X^n → ℝ defines a distribution on X^n:
  P(x) ≜ P(X1 = x1, ..., Xn = xn) = (1/Z) f(x) ≜ (1/Z) ∏_{a=1}^{m} f_a(x_∂a)
  - where x_∂a is the subvector of variables involved in factor a
  - and Z ≜ ∑_x f(x) is called the partition function
- For Constraint Satisfaction Problems (CSPs):
  - All factors f_a(x_∂a) take values in {0, 1}
  - The set of valid configurations is {x ∈ X^n | f(x) = 1}
  - Thus, Z equals the number of valid configurations
  - P(x) is uniform over the set of valid configurations

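A toy illustration (not from the talk) of zero-one factors: brute-force evaluation of the partition function Z for the two-clause CSP (x1 ≠ x2) ∧ (x2 ∨ x3) on binary variables.

```python
import itertools

factors = [
    (lambda x1, x2: int(x1 != x2), (0, 1)),   # f_1(x_1, x_2), a {0,1}-valued factor
    (lambda x2, x3: int(x2 or x3), (1, 2)),   # f_2(x_2, x_3)
]

Z = 0
for x in itertools.product((0, 1), repeat=3):
    f_x = 1
    for fa, scope in factors:
        f_x *= fa(*(x[i] for i in scope))     # f(x) = prod_a f_a(x_{partial a})
    Z += f_x                                  # Z = sum_x f(x)
print(Z)   # 3 valid configurations: (0,1,0), (0,1,1), (1,0,1)
```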

Capacity Achieving Codes: There and Back Again 20 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Capacity Achieving Codes: There and Back Again 21 / 65

Marginalization via Belief Propagation

- Factor graph G = (V ∪ F, E)
  - Variable nodes V, factor nodes F
  - Edges: (i, a) ∈ E ⊆ V × F
  - F(i) / V(a) = set of neighbors of variable node i / factor node a
- Messages: µ_{i→a}^{(t)}(xi) and µ_{a→i}^{(t)}(xi)
- Variable-i to factor-a message:
  µ_{i→a}^{(t+1)}(xi) = ∏_{b ∈ F(i)\a} µ_{b→i}^{(t)}(xi)
- Factor-a to variable-i message:
  µ_{a→i}^{(t)}(xi) = ∑_{x_{V(a)\i}} f_a(x_{V(a)}) ∏_{j ∈ V(a)\i} µ_{j→a}^{(t)}(xj)
- Variable-i marginal:
  µ_i^{(t+1)}(xi) = ∏_{b ∈ F(i)} µ_{b→i}^{(t)}(xi)

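A minimal generic sum-product sketch (illustrative, reusing the (array, scope) factor layout from the leaf-removal sketch, with integer variable ids) implementing the three update equations above; on a tree it reproduces the exact marginals after enough iterations.

```python
import numpy as np

def sum_product(factors, n_vars, q, iters=10):
    """Parallel sum-product BP; `factors` is a list of (array, scope) pairs.
    Returns unnormalized marginals, exact when the factor graph is a tree."""
    F = range(len(factors))
    nbr_v = {i: [a for a in F if i in factors[a][1]] for i in range(n_vars)}
    m_va = {(i, a): np.ones(q) for a in F for i in factors[a][1]}   # var -> factor
    m_av = {(a, i): np.ones(q) for a in F for i in factors[a][1]}   # factor -> var
    for _ in range(iters):
        for a in F:                                   # factor -> variable updates
            fa, scope = factors[a]
            letter = {v: chr(ord('a') + k) for k, v in enumerate(scope)}
            for i in scope:
                spec = ''.join(letter[v] for v in scope)
                ops = [fa]
                for j in scope:
                    if j != i:                        # multiply in mu_{j->a}
                        spec += ',' + letter[j]
                        ops.append(m_va[(j, a)])
                m_av[(a, i)] = np.einsum(spec + '->' + letter[i], *ops)
        for a in F:                                   # variable -> factor updates
            for i in factors[a][1]:
                msg = np.ones(q)
                for b in nbr_v[i]:
                    if b != a:
                        msg = msg * m_av[(b, i)]
                m_va[(i, a)] = msg
    return {i: np.prod([m_av[(b, i)] for b in nbr_v[i]], axis=0)
            for i in range(n_vars)}

# chain x0 - f0 - x1 - f1 - x2: BP marginal matches direct summation
rng = np.random.default_rng(4)
q = 3
factors = [(rng.random((q, q)), (0, 1)), (rng.random((q, q)), (1, 2))]
marg = sum_product(factors, n_vars=3, q=q)
print(marg[0] / marg[0].sum())                        # P(X_0 = x)
```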

Capacity Achieving Codes: There and Back Again 22 / 65

Marginalization via Belief Propagation: Example

[Figure: the factor graph with variables x1,...,x6 and factors f1, f2, f3, f4 from the leaf-removal example, annotated with the messages below]

Iteration 1, variable to factor: µ_{i→a}^{(1)}(xi) = 1 for every edge (i, a)

Iteration 1, factor to variable:
µ_{f4→x4}^{(1)}(x4) = ∑_{x5} f4(x4, x5) µ_{x5→f4}^{(1)}(x5) = ∑_{x5} f4(x4, x5)
µ_{f3→x4}^{(1)}(x4) = f3(x4)

Iteration 2, variable to factor:
µ_{x4→f2}^{(2)}(x4) = µ_{f4→x4}^{(1)}(x4) µ_{f3→x4}^{(1)}(x4) = f3(x4) ∑_{x5} f4(x4, x5)
µ_{x6→f2}^{(2)}(x6) = 1

Iteration 2, factor to variable:
µ_{f2→x1}^{(2)}(x1) = ∑_{x4,x6} f2(x1, x4, x6) µ_{x4→f2}^{(2)}(x4) µ_{x6→f2}^{(2)}(x6)
                   = ∑_{x4,x6} f2(x1, x4, x6) f3(x4) ∑_{x5} f4(x4, x5) = f2′′(x1)

Iteration 2, variable marginal (with µ_{f1→x1}^{(2)}(x1) = ∑_{x2,x3} f1(x1,x2,x3) = f1′′(x1)):
µ_1^{(3)}(x1) = µ_{f1→x1}^{(2)}(x1) µ_{f2→x1}^{(2)}(x1) = f1′′(x1) f2′′(x1)

Same answer as peeling, but from a distributed, parallel algorithm


Capacity Achieving Codes: There and Back Again 23 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Capacity Achieving Codes: There and Back Again 24 / 65

Sudoku: A Factor Graph for the Masses

[Figure: a partially completed 9×9 Sudoku grid; the given clues form the observed set O]

- Rows are permutations of {1, 2, ..., 9}
- Columns are permutations of {1, 2, ..., 9}
- Subblocks are permutations of {1, 2, ..., 9}

With one variable x_ij per cell, the implied factor graph has 81 variable and 27 factor nodes:

f(x) = (∏_{i=1}^{9} f_σ(x_{i∗})) (∏_{j=1}^{9} f_σ(x_{∗j})) (∏_{k=1}^{9} f_σ(x_{B(k)})) ∏_{(i,j)∈O} I(x_ij = y_ij)

where f_σ is the 0/1 indicator that its nine arguments form a permutation of {1, ..., 9}, B(k) is the k-th 3×3 subblock, and O is the set of observed cells.

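As a concrete illustration of this factorization, here is a sketch (not from the talk) that evaluates f(x) as a product of indicator factors; the subblock-index convention B(k) below is an assumption.

    import numpy as np

    def f_sigma(cells):
        # 1 if the nine cells are a permutation of {1,...,9}, else 0
        return int(sorted(cells) == list(range(1, 10)))

    def f(x, observed):
        # x: 9x9 numpy array of candidate values; observed: dict {(i,j): y_ij}
        val = 1
        for i in range(9):                      # row factors f_sigma(x_{i*})
            val *= f_sigma(list(x[i, :]))
        for j in range(9):                      # column factors f_sigma(x_{*j})
            val *= f_sigma(list(x[:, j]))
        for k in range(9):                      # subblock factors f_sigma(x_{B(k)})
            r, c = 3 * (k // 3), 3 * (k % 3)    # assumed block indexing
            val *= f_sigma(list(x[r:r+3, c:c+3].flatten()))
        for (i, j), y in observed.items():      # compatibility with observations
            val *= int(x[i, j] == y)
        return val                              # 1 iff x is valid and compatible

Summing this f over all x counts the valid compatible grids, exactly as described on the next slide.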

Solving Sudoku with a Factor Graph

- Consider any constraint satisfaction problem with observed entries
  - One can write f(x) as the product of indicator functions
  - Some factors force x to be valid (i.e., satisfy the constraints)
  - Other factors force x to be compatible with the observed values
  - Summing over x counts the # of valid compatible sequences
- Low-complexity peeling solution (a sketch follows below)
  - Set the elements of x one at a time
  - Each step looks for i ∈ [n] and x′ ∈ X such that, for the currently set variables, f(x) = 0 for all xi ∈ X \ {x′}
  - Sudoku's unique solution implies that xi = x′ is correct
  - Fix xi = x′ and repeat until all values are fixed

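Here is a minimal sketch of that peeling loop for Sudoku. The "forced value" test below (exactly one candidate survives the row/column/subblock constraints) is the simplest instance of the condition f(x) = 0 for all xi ≠ x′; puzzles requiring deeper deductions can stall under this rule.

    def peel(x):
        # x: 9x9 list of lists with 0 marking unset cells (givens already filled)
        def candidates(i, j):
            r, c = 3 * (i // 3), 3 * (j // 3)
            used = set(x[i]) | {x[a][j] for a in range(9)} \
                 | {x[a][b] for a in range(r, r + 3) for b in range(c, c + 3)}
            return [v for v in range(1, 10) if v not in used]

        progress = True
        while progress:
            progress = False
            for i in range(9):
                for j in range(9):
                    if x[i][j] == 0:
                        cand = candidates(i, j)
                        if len(cand) == 1:      # the value is forced
                            x[i][j] = cand[0]
                            progress = True
        return x  # fully solved iff no zeros remain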

Boolean Satisfiability: K-SAT

- One instance of 3-SAT is given, for example, by

  f(x) = (x1 ∨ x3 ∨ x7) ∧ (x1 ∨ x2 ∨ x5) ∧ (x2 ∨ x4 ∨ x6)

  In the FG, clause a ∈ [m] is enforced by the function fa
- Marginalization allows uniform sampling from the valid set (a sketch follows below)
  - For i = 1, 2, . . . , n, fix xj for j < i and compute the marginal

    gi(xi) = (1/Zi) ∑_{x_{i+1}, . . . , x_n} f(x) = P(Xi = xi | X1 = x1, . . . , X_{i−1} = x_{i−1})

  - Then, sample xi ∼ gi(·) and repeat
- This algorithm has low complexity if the factor graph forms a tree
  - If not a tree, use the approximate marginals from belief propagation
  - This is related to BP-guided decimation [MM09]

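A sketch of this sequential sampler on a tiny instance (not from the talk): exact marginals are computed by brute-force enumeration, standing in for tree marginalization or BP, and the clause signs below are illustrative, since any negations on the slide do not survive extraction.

    import random
    from itertools import product

    clauses = [(1, -3, 7), (-1, 2, -5), (2, -4, 6)]   # illustrative signs
    n = 7

    def f(x):  # literal v is satisfied when x[|v|-1] == 1 iff v > 0
        return all(any((x[abs(v) - 1] == 1) == (v > 0) for v in c)
                   for c in clauses)

    x = []
    for i in range(n):
        # g_i(x_i) is proportional to the number of valid completions
        counts = [sum(f(tuple(x) + (b,) + rest)
                      for rest in product((0, 1), repeat=n - i - 1))
                  for b in (0, 1)]
        x.append(random.choices((0, 1), weights=counts)[0])
    print(x, f(tuple(x)))  # a uniform sample from the satisfying set

Because each prefix is chosen with probability proportional to its number of valid completions, the final assignment is uniform over all satisfying assignments.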

Low-Density Parity-Check (LDPC) Codes

[Figure: Tanner graph with parity checks on top, an edge permutation in the middle, and code bits on the bottom]

- Linear codes defined by xH^T = 0 for all codewords x ∈ C ⊂ {0, 1}^n
  - H is an m × n sparse parity-check matrix for the code
  - Code bits and parity checks are associated with the columns/rows of H
- Factor graph: H is the biadjacency matrix for the variable/factor nodes
  - Ensemble defined by the configuration model for random graphs
- Checks define the factors: feven(x_1^d) = I(x1 ⊕ · · · ⊕ xd = 0)
- Let x_∂a be the subvector of variables in the a-th check and

  f(x1, . . . , xn) = ( ∏_{a=1}^{m} feven(x_∂a) ) ( ∏_{i=1}^{n} P_{Y|X}(yi | xi) )

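A minimal sketch of the codeword constraint; the tiny H below is illustrative, not an ensemble from the talk.

    import numpy as np

    # Each row of H is one parity check f_even; x is a codeword iff x H^T = 0 mod 2.
    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 1, 0, 0, 1]])           # m = 3 checks, n = 6 bits

    def is_codeword(x):
        return not np.any((x @ H.T) % 2)         # all checks satisfied

    print(is_codeword(np.array([1, 0, 0, 1, 0, 1])))  # True: every check sums to 0
    print(is_codeword(np.array([1, 0, 0, 0, 0, 0])))  # False: check 1 fails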

A Little History

Robert Gallager introduced LDPC codes in his 1962 paper.

Judea Pearl defined general belief propagation in his 1986 paper.

Simple Message-Passing Decoding for the BEC

- Constraint nodes define the valid patterns
  - Circles represent a single value shared by factors
  - Squares assert that the attached variables sum to 0 mod 2
- Iterative decoding on the binary erasure channel (BEC)
  - Messages are passed in phases: bit-to-check and check-to-bit
  - Each output message depends on the other input messages
  - Each message is either the correct value or an erasure
- Message-passing rules for the BEC (a decoder sketch follows below)
  - Bits pass an erasure only if all other inputs are erased
  - Checks pass the correct value only if all other inputs are correct

[Figure: a small code graph decoded in phases; the received values 1 and 0 combine with erasures ? until the erased bits are recovered]

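A sketch of these rules as a peeling decoder (equivalent to message passing on the BEC): any check with exactly one erased neighbor determines that bit. The small H and received word are illustrative.

    import numpy as np

    def bec_peel(H, y):
        # y entries: 0, 1, or None for an erasure
        y = list(y)
        progress = True
        while progress:
            progress = False
            for row in H:
                nbrs = list(np.flatnonzero(row))
                erased = [i for i in nbrs if y[i] is None]
                if len(erased) == 1:
                    # parity of the known neighbors determines the erased bit
                    y[erased[0]] = sum(y[i] for i in nbrs if y[i] is not None) % 2
                    progress = True
        return y  # fully decoded iff no None remains

    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 1, 0, 0, 1]])
    print(bec_peel(H, [1, None, 0, None, 0, 1]))  # recovers [1, 0, 0, 1, 0, 1]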

Computation Graph and Density Evolution

[Figure: the computation graph rooted at a single bit node; a bit node outputs ε y^2 given check inputs y (ε y^3 for the root marginal), and a check node outputs 1 − (1 − x)^3 given bit inputs x]

Successive iterations with ε = 0.600:

  x1 = ε = 0.600
  y1 = 1 − (1 − x1)^3 = 0.936
  x2 = ε y1^2 = 0.526
  y2 = 1 − (1 − x2)^3 = 0.894
  x3 = ε y2^3 = 0.429

- Computation graph for a (3,4)-regular LDPC code
  - Illustrates decoding from the perspective of a single bit node
  - For long random LDPC codes, the graph is typically a tree
- Allows density evolution to track the message erasure probability
  - x/y denote the erasure probabilities of the bit/check output messages

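A few lines reproducing the numbers above (rounding to three digits at each step, as the slide does):

    eps = 0.600
    x1 = eps
    y1 = round(1 - (1 - x1)**3, 3)   # 0.936
    x2 = round(eps * y1**2, 3)       # 0.526
    y2 = round(1 - (1 - x2)**3, 3)   # 0.894
    x3 = round(eps * y2**3, 3)       # 0.429 (root marginal uses all 3 checks)
    print(x1, y1, x2, y2, x3)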

Density Evolution (DE) for LDPC Codes

[Figure: x_{ℓ+1} vs. x_ℓ for the (3,4) LDPC code with ε = 0.6]

Density evolution for a (3,4)-regular LDPC code:

  x_{ℓ+1} = ε (1 − (1 − x_ℓ)^3)^2

Decoding thresholds:

  εBP ≈ 0.647
  εMAP ≈ 0.746
  εSh = 0.750

- Binary erasure channel (BEC) with erasure probability ε
- DE tracks the bit-to-check message erasure rate x_ℓ after ℓ iterations
- Defines the noise threshold εBP for the large-system limit (computed numerically below)
- Easily computed numerically for a given code ensemble
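For instance, a numerical sketch that locates εBP for the (3,4)-regular ensemble by bisecting on whether the DE recursion drives the erasure rate to zero:

    def de_converges(eps, iters=5000):
        x = eps
        for _ in range(iters):
            x = eps * (1 - (1 - x)**3)**2
        return x < 1e-6

    lo, hi = 0.0, 1.0
    for _ in range(20):                 # bisect to ~1e-6 resolution
        mid = (lo + hi) / 2
        if de_converges(mid):
            lo = mid
        else:
            hi = mid
    print(round(lo, 4))                 # ≈ 0.647, the BP threshold above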

EXtrinsic Information Transfer (EXIT) Curves

- Introduced by ten Brink in 1999 to understand iterative decoding
- For the BEC, the MAP EXIT curve is

  hMAP(ε) ≜ (1/n) ∑_{i=1}^{n} H(Xi | Y_{∼i}(ε))

- EXIT Area Theorem [ABK04]

  (1/n) H(X | Y(ε)) = ∫_0^ε hMAP(δ) dδ

- BP EXIT curve

  hBP(ε) ≜ (1/n) ∑_{i=1}^{n} H(Xi | Φ_i^BP(Y_{∼i}(ε)))

  - where Φ_i^BP(Z) is the BP estimate of Xi given Z
  - Data processing inequality: hBP(ε) ≥ hMAP(ε)


EXtrinsic Information Transfer (EXIT) Curves

[Figure: hBP(ε) vs. ε for the (3,4)-regular LDPC code]

- (3,4)-regular LDPC code
  - Codeword (X1, . . . , Xn), received (Y1, . . . , Yn)
- BP EXIT curve via DE
  - This code: hBP(ε) = (x∞(ε))^3
  - 0 below the BP threshold 0.647
- MAP EXIT curve is the extrinsic entropy H(Xi | Y_{∼i}) vs. the channel ε
  - 0 below the MAP threshold 0.746
  - Area under the curve equals the rate R
  - Upper bounded by the BP EXIT curve
- MAP threshold upper bound εMAP (computed in the sketch below)
  - the ε s.t. the area under the BP EXIT curve is R

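A sketch computing the BP EXIT curve from the DE fixed point and the area-based MAP threshold bound; we read x∞ on the slide as the check-to-bit erasure fixed point, so hBP = (x∞)^3 (an assumption about the slide's notation).

    import numpy as np

    def h_bp(eps, iters=2000):
        x = eps
        for _ in range(iters):
            x = eps * (1 - (1 - x)**3)**2   # bit-to-check fixed point
        y = 1 - (1 - x)**3                  # check-to-bit erasure
        return y**3                         # extrinsic estimate uses 3 checks

    R = 1 - 3 / 4                           # design rate of the (3,4) ensemble
    eps = np.linspace(0.0, 1.0, 801)
    h = np.array([h_bp(e) for e in eps])
    de = eps[1] - eps[0]
    tail = np.cumsum((h * de)[::-1])[::-1]  # tail[i] ~ area of h over [eps_i, 1]
    print(eps[np.argmin(np.abs(tail - R))]) # ≈ 0.746, the bound quoted above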


Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Properties of the MAP EXIT Curve

- For linear codes, the recovery of Xi from Y = y
  - is independent of the transmitted codeword X
  - only depends on the erasure indicators zi = 1_{?}(yi)
  - For example, H(Xi | Y = y, Z = z) = f(z) ∈ {0, 1}
- The MAP bit-erasure rate Pb(ε) satisfies

  Pb(ε) = P(Yi = ?) H(Xi | Y, Yi = ?) = ε hMAP(ε)

- A sequence of rate-R codes achieves capacity iff
  - Pb(ε) → 0 for all ε < 1 − R
  - hMAP(ε) → 0 for all ε < 1 − R
  - hMAP(ε) transitions sharply from 0 to 1


The MAP EXIT Curve of a Capacity-Achieving Code

[Figure: MAP EXIT function vs. erasure probability for n = 2^3, 2^5, 2^7, 2^9; the transition sharpens as n grows]

- For δ > 0, the transition width is the ε-range over which δ ≤ hMAP(ε) ≤ 1 − δ
- The Area Theorem implies a sharp transition iff capacity achieving


EXIT Curves and Sharp Transitions

- Consider any monotone boolean function f : {0, 1}^{n−1} → {0, 1}
- Define its symmetry group G to be

  G = { π ∈ S_{n−1} | f(π(z)) = f(z) ∀ z ∈ {0, 1}^{n−1} }

- Let Zi ∈ {0, 1} be i.i.d. with P(Zi = 1) = ε and define

  h(ε) ≜ E[f(Z1, . . . , Z_{n−1})]

- If G is transitive, then h(ε) has transition width O(1/ln n)*
  (transitive: ∀ i, j ∈ {1, 2, . . . , n − 1}, ∃ π ∈ G s.t. π(i) = j)
- When do EXIT curves have a sharp transition? [KKMPSU15]
  - If the code's permutation group is doubly transitive!
  - For example, Reed-Muller and primitive narrow-sense BCH codes

* Friedgut-Kalai'96: "Every monotone graph property has a sharp threshold"

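To see the phenomenon numerically, here is a sketch using the majority function, whose symmetry group is all of S_m and hence transitive; majority actually sharpens like 1/√m, faster than the O(1/ln n) worst-case guarantee.

    from math import comb

    def h(eps, m):
        # h(eps) = P(more than half of m i.i.d. Bernoulli(eps) bits are 1)
        return sum(comb(m, k) * eps**k * (1 - eps)**(m - k)
                   for k in range(m // 2 + 1, m + 1))

    def width(m, delta=0.05, grid=2001):
        pts = [i / (grid - 1) for i in range(grid)]
        inside = [e for e in pts if delta <= h(e, m) <= 1 - delta]
        return max(inside) - min(inside)

    for m in (9, 33, 129):
        print(m, round(width(m), 3))   # the 0-to-1 transition sharpens with m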

Summary and Open Problems

- Gallager's 1960 thesis already contains most of the tools necessary to achieve capacity in practice
  - But he focuses mainly on the BSC
  - Had he attacked the BEC, practical capacity-achieving codes might have been introduced years earlier
- The first deterministic sequence of capacity-achieving binary codes for the BEC (under MAP decoding) was defined in 1954!
  - Sequences of Reed-Muller codes achieve capacity on the BEC
  - But we didn't know this until 2015!
- Open problems
  - Generalize the Reed-Muller result to weaker conditions and/or more general channels/problems
  - Find a purely information-theoretic proof of the Reed-Muller result for the BEC


Capacity Achieving Codes: There and Back Again 39 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

What is Spatial Coupling?

[Figure: a single Sudoku puzzle next to a chain of overlapping Sudoku puzzles that share subblocks with their neighbors]

- Spatially-Coupled Factor Graphs
  - Variable nodes have a natural global orientation
  - Boundaries help the variables to be recovered in an ordered fashion


Spatially-Coupled LDPC Codes: (l, r, L, w) Ensemble

[Figure: a chain of (l = 3, r = 4) LDPC graphs at positions −L, . . . , L with permutations π_i, π′_i; each bit connects to checks within a window of w = 3 positions, and extra check positions −L−2, −L−1 and L+1, L+2 terminate the boundaries]

- Historical Notes
  - LDPC convolutional codes introduced by FZ in 1999
  - Shown to have near-optimal noise thresholds by LSZC in 2005
  - The (l, r, L, w) ensemble was proven to achieve capacity by KRU in 2011



The LDPCC Gang


The Spatial Coupling KRU

Density Evolution for the (l, r, L, w)-SC LDPC Ensemble

[Figure: message erasure probability vs. spatial position for the (3, 4, 16, 3)-SC ensemble with ε = 0.70 at iterations 1, 2, 3, 4, 10, 50, 100, 150; a decoding wave forms at the boundaries and moves inward]

  x_i^{(ℓ+1)} = (1/w) ∑_{k=0}^{w−1} ε [ (1/w) ∑_{j=0}^{w−1} ( 1 − ( 1 − x_{i+j−k}^{(ℓ)} )^{r−1} ) ]^{l−1} 1_{[−L, L+w−1]}(i−k)

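A sketch of this coupled recursion (the zero-padding and boundary convention follow the indicator as reconstructed above). At ε = 0.70, between the uncoupled BP threshold 0.647 and the MAP threshold 0.746, the coupled profile decays to zero while the uncoupled recursion stalls near 0.63.

    import numpy as np

    l, r, L, w, eps = 3, 4, 16, 3, 0.70
    off = L + 2 * w                          # zero padding beyond the chain
    x = np.ones(2 * off + 1)                 # start with everything erased

    def step(x):
        x_new = np.zeros_like(x)
        for i in range(-off + w, off - w + 1):
            acc = 0.0
            for k in range(w):
                if not (-L <= i - k <= L + w - 1):
                    continue                 # indicator: eps = 0 off the chain
                inner = sum(1 - (1 - x[i + j - k + off])**(r - 1)
                            for j in range(w)) / w
                acc += eps * inner**(l - 1)
            x_new[i + off] = acc / w
        return x_new

    it = 0
    for target in (1, 2, 3, 4, 10, 50, 100, 150, 2000):
        while it < target:
            x = step(x)
            it += 1
        # profile peak; it collapses to 0 once the two waves meet in the middle
        print(target, round(float(x.max()), 4))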

Properties of Threshold Saturation

  l    r    εBP      εMAP
  3    6    0.4294   0.4882
  4    8    0.3834   0.4977
  5   10    0.3416   0.4995
  6   12    0.3075   0.4999
  7   14    0.2798   0.5000

- Spatial coupling achieves the MAP threshold as w → ∞
  - The BP threshold typically decreases after l = 3
  - The MAP threshold is increasing in l, r for fixed rate
- Benefits and Drawbacks
  - For fixed L, the minimum distance grows linearly with block length
  - The rate loss of O(w/L) is a big obstacle in practice

Threshold Saturation via Spatial Coupling

- General phenomenon (observed by Kudekar, Richardson, Urbanke)
  - The BP threshold of the spatially-coupled system converges to the MAP threshold of the uncoupled system
  - Can be proven rigorously in many cases!
- Connection to statistical physics
  - The factor graph defines a system of coupled particles
  - Valid sequences are ordered crystalline structures
  - Between the BP and MAP thresholds, the system acts as a supercooled liquid
  - The correct answer (crystalline state) has minimum energy
  - Crystallization (i.e., decoding) does not occur without a seed
  - Ex.: ice melts at 0 °C, but freezing without a seed requires −48.3 °C
    http://www.youtube.com/watch?v=Xe8vJrIvDQM


Why is Spatial Coupling Interesting?

- Breakthroughs: the first practical constructions of
  - universal codes for binary-input memoryless channels [KRU12]
  - information-theoretically optimal compressive sensing [DJM11]
  - universal codes for Slepian-Wolf and MAC problems [YJNP11]
  - codes that approach capacity under iterative hard-decision decoding [JNP12]
  - codes that approach the rate-distortion limit under iterative decoding [AMUV12]
- It allows rigorous proofs in many cases
  - The original proofs [KRU11/12] are quite specific to LDPC codes
  - Our proof covers increasing scalar/vector recursions [YJNP12/13]
- Spatial coupling as a proof technique [GMU13]
  - For a large random factor graph, construct a coupled version
  - Use DE to analyze BP decoding of the coupled system
  - Compare uncoupled MAP with coupled BP via interpolation



Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Universality over Unknown Parameters

- The Achievable Channel Parameter Region (ACPR)
  - For a sequence of coding schemes involving one or more parameters, the parameter region where decoding succeeds in the limit
  - In contrast, a capacity region is a rate region for fixed channels
- Properties
  - For fixed encoders, the ACPR depends on the decoders
  - For example, one has BP-ACPR ⊆ MAP-ACPR
  - Often, there exists a unique maximal ACPR given by information theory
- Universality
  - A sequence of encoding/decoding schemes is called universal if its ACPR equals the optimal ACPR
  - Channel parameters are assumed unknown at the transmitter
  - At the receiver, the channel parameters are easily estimated

[Figure: the MAC-ACPR boundary for rate 1/2 in the (α1, α2) plane]


Page 156: Capacity Achieving Codes: There and Back Againpfister.ee.duke.edu/talks/gothenburg16.pdf · Capacity Achieving Codes: There and Back Again2 / 65 Acknowledgments I Thanks to my coauthors

Capacity Achieving Codes: There and Back Again 50 / 65

2-User Binary-Input Gaussian Multiple Access Channel

[Diagram: Y = h1 X1 + h2 X2 + Z with Z ∼ N(0, 1)]

I Fixed noise variance

I Real channel gains h1 and h2 not known at transmitter

I Each code has rate R

I MAC-ACPR denotes the information-theoretic optimal region
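A minimal simulation sketch of this channel model (the gains, block length, and seed below are arbitrary example values, and the rate-R codes themselves are omitted):

import numpy as np

rng = np.random.default_rng(0)

def bi_awgn_mac(n, h1, h2):
    # n uses of Y = h1*X1 + h2*X2 + Z with X1, X2 uniform on {-1, +1}, Z ~ N(0, 1)
    x1 = rng.choice([-1.0, 1.0], size=n)
    x2 = rng.choice([-1.0, 1.0], size=n)
    y = h1 * x1 + h2 * x2 + rng.standard_normal(n)
    return x1, x2, y

x1, x2, y = bi_awgn_mac(10**5, h1=1.2, h2=0.9)
# receiver-side estimate of the unknown gains: E[Y*X1] = h1 since X1, X2, Z are independent
print(np.mean(y * x1), np.mean(y * x2))   # approximately 1.2 and 0.9

This illustrates the earlier point that the gains are easily estimated at the receiver even though they are unknown at the transmitter.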

Capacity Achieving Codes: There and Back Again 51 / 65

A Little History: SC for Multiple-Access (MAC) Channels

I KK consider a binary-adder erasure channel (ISIT 2011)

I SC exhibits threshold saturation for the joint decoder

I YNPN consider the Gaussian MAC (ISIT/Allerton 2011)

I SC exhibits threshold saturation for the joint decoder

I For channel gains h1, h2 unknown at the transmitter, SC provides universality

I Others consider CDMA systems without coding

I TTK show SC improves BP demodulation of standard CDMA

I ST prove threshold saturation for an SC protograph-style CDMA system

Capacity Achieving Codes: There and Back Again 52 / 65

Spatially-Coupled Factor Graph for Joint Decoder

[Figure: spatially-coupled factor graph for the joint decoder with 2L + 1 coupled positions]

Capacity Achieving Codes: There and Back Again 53 / 65

DE Performance of the Joint Decoder

[Figure: achievable regions in the (α1, α2) = (|h1|^2, |h2|^2) plane. The MAC-ACPR boundary for rate 1/2 is shown together with the BP-ACPR of the uncoupled LDPC(3, 6) and LDPC(4, 8) ensembles and of the spatially-coupled SC(3, 6, 64, 5) and SC(4, 8, 64, 5) ensembles; the SC curves approach the MAC-ACPR boundary]

Capacity Achieving Codes: There and Back Again 54 / 65

Outline

Introduction

Factor Graphs

Message Passing

Applications of Factor Graphs

Applications of EXIT Curves

Spatially-Coupled Factor Graphs

Universality for Multiuser Scenarios

Abstract Formulation of Threshold Saturation

Capacity Achieving Codes: There and Back Again 55 / 65

Single Monotone Recursion

I Smooth increasing f : [0, 1] → [0, 1]

I Discrete-time recursion

x^{(\ell+1)} = f\bigl(x^{(\ell)}\bigr)

I "Potential energy" U_s(x), with F(x) = \int_0^x f(z)\,dz:

U_s(x) = \int_0^x \bigl(z - f(z)\bigr)\,dz = \frac{x^2}{2} - F(x)

I Continuous (small-step) dynamics

\frac{d}{dt}\,x(t) = f\bigl(x(t)\bigr) - x(t) = -\nabla U_s\bigl(x(t)\bigr)

I Lyapunov stability

\frac{d}{dt}\,U_s\bigl(x(t)\bigr) = -\bigl(x(t) - f(x(t))\bigr)^2

Both ↓ 0 iff there are no fixed points in (0, 1]

[Figure: f(x) vs. x, and the scaled potential 25 U_s(x) vs. x, on [0, 1]]
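A numerical sketch of this recursion and its potential (assuming, for concreteness, the BEC density-evolution map of a (3, 6)-regular LDPC ensemble as the increasing f; any smooth increasing f on [0, 1] would do):

import numpy as np

eps, dv, dc = 0.40, 3, 6   # assumed example: (3,6)-regular LDPC over BEC(eps)

def f(x):
    # BEC density-evolution map; smooth and increasing on [0, 1]
    return eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)

def U_s(x, m=10001):
    # U_s(x) = int_0^x (z - f(z)) dz, computed by the trapezoid rule
    z = np.linspace(0.0, x, m)
    g = z - f(z)
    return float(np.sum((g[:-1] + g[1:]) / 2.0) * (z[1] - z[0]))

x = 1.0
for _ in range(2000):   # x^(l+1) = f(x^(l)) starting from x^(0) = 1
    x = f(x)
print(x, U_s(1.0))      # here x -> 0, since this f has no fixed point in (0, 1]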

Capacity Achieving Codes: There and Back Again 56 / 65

Coupled Monotone Recursion (1)

I Coupled recursion x^{(\ell+1)} = T x^{(\ell)} with x^{(\ell)} = \bigl(x_0^{(\ell)}, x_1^{(\ell)}, \ldots\bigr) and

T x \triangleq A^\top f(A x),

where [f(x)]_i = f(x_i) and A averages w adjacent values

A = \frac{1}{w}
\begin{pmatrix}
1 & 1 & \cdots & 1 & 0 & \cdots \\
0 & 1 & 1 & \cdots & 1 & \ddots \\
\vdots & \ddots & \ddots & & \ddots & \ddots
\end{pmatrix}

I i.e., average the w positions to the right, apply f, then average the w positions to the left

I Coupled potential:

U_c(x) = \frac{1}{2} \sum_{i=0}^{\infty} x_i^2 - \sum_{i=0}^{\infty} F\Bigl(\frac{1}{w} \sum_{j=0}^{w-1} x_{i+j}\Bigr)

I Satisfies \nabla U_c(x) = x - A^\top f(A x)

I Danger: there be dragons (read: infinities; each sum diverges on its own, so only differences of U_c are used)
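A minimal sketch of T on a truncated chain (the boundary choices are ours for illustration: values are zero to the left of position 0, and the right edge is extended by repeating the last entry):

import numpy as np

def T(x, f, w):
    # Tx = A^T f(Ax): average the w positions to the right, apply f
    # componentwise, then average the w positions to the left
    n = len(x)
    xr = np.concatenate([x, np.full(w - 1, x[-1])])          # repeat the right edge
    ax = np.array([xr[i:i + w].mean() for i in range(n)])    # (Ax)_i
    yl = np.concatenate([np.zeros(w - 1), f(ax)])            # zeros to the left of 0
    return np.array([yl[i:i + w].mean() for i in range(n)])  # (A^T f(Ax))_i

The zero boundary on the left models perfectly known values and is what seeds the decoding wave; the right-edge choice matters little for long chains.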

Capacity Achieving Codes: There and Back Again 57 / 65

Coupled Monotone Recursion (2)

I Properties of T (note: x \preceq y \Leftrightarrow x_i \le y_i for all i)

I T is monotone: x \preceq y implies T x \preceq T y

I T preserves spatial order: x_{i+1} \ge x_i implies [T x]_{i+1} \ge [T x]_i

I For x^{(0)} = 1, the iterates x_i^{(\ell)} are decreasing in \ell and increasing in i

I Spatial limit exists: x_\infty^{(\ell)} = \lim_{i \to \infty} x_i^{(\ell)}

I Iteration limit exists: x_i^{(\infty)} = \lim_{\ell \to \infty} x_i^{(\ell)}

I Iteration limit satisfies the fixed point: x^{(\infty)} = T x^{(\infty)}

I Double limit satisfies the fixed point: x_\infty^{(\infty)} = f\bigl(x_\infty^{(\infty)}\bigr)
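Continuing the sketch above (same T; the example f and the tolerances below are illustrative), the monotone-iterate properties can be spot-checked numerically:

f = lambda x: 0.45 * (1.0 - (1.0 - x) ** 5) ** 2   # a smooth increasing example map
w, N = 3, 40
x = np.ones(N)                                     # x^(0) = 1
for _ in range(300):
    x_new = T(x, f, w)
    assert np.all(x_new <= x + 1e-12)              # decreasing in the iteration l
    assert np.all(np.diff(x_new) >= -1e-12)        # increasing in the position i
    x = x_new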

Capacity Achieving Codes: There and Back Again 58 / 65

Intuition Behind Threshold Saturation

I Between the BP and MAP thresholds

I the decoding trajectory looks like a right-moving wave

I we know the recursion converges pointwise to a limit

I if the limit is not 0, then compute the energy change due to a right shift

I Right-shift S satisfies [Sx]_i = x_{i-1} with x_{-1} = 0

I Relative potential: V_x(t) = U_c\bigl((1-t)x + t\,Sx\bigr) - U_c(x)

I If x_{i+1} \ge x_i for all i, then V_x(t) is well-defined for t \in [0, 1]

I For t = 1, one gets a telescoping sum that shows

V_x(1) \le -U_s(x_\infty)
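Filling in the telescoping step: with F(x) = \int_0^x f(z)\,dz \ge 0, x_{-1} = 0, and x_i non-decreasing with limit x_\infty, pairing the i-th terms of U_c(Sx) and U_c(x) gives

V_x(1) = U_c(Sx) - U_c(x)
= \frac{1}{2}\sum_{i=0}^{\infty}\bigl(x_{i-1}^2 - x_i^2\bigr)
  - \sum_{i=0}^{\infty}\Bigl[F\Bigl(\frac{1}{w}\sum_{j=0}^{w-1} x_{i-1+j}\Bigr) - F\Bigl(\frac{1}{w}\sum_{j=0}^{w-1} x_{i+j}\Bigr)\Bigr]
= -\frac{x_\infty^2}{2} - F\Bigl(\frac{1}{w}\sum_{j=0}^{w-2} x_j\Bigr) + F(x_\infty)
\le -\Bigl(\frac{x_\infty^2}{2} - F(x_\infty)\Bigr) = -U_s(x_\infty),

since the remaining F-term is non-negative.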

Capacity Achieving Codes: There and Back Again 59 / 65

Threshold Saturation

Theorem

If f(0) = 0 and f'(0) < 1 (0 is a stable fixed point), with U_s(x) > 0 for x \in (0, 1], then there exists w_0 < \infty such that x_\infty^{(\infty)} = 0 for all w > w_0.

I Define the relative potential (with x_i(t) \triangleq x_i + t(x_{i-1} - x_i))

V_x(t) \triangleq \frac{1}{2}\sum_{i=0}^{\infty}\bigl(x_i(t)^2 - x_i^2\bigr)
- \sum_{i=0}^{\infty}\Bigl[F\Bigl(\frac{1}{w}\sum_{j=0}^{w-1} x_{i+j}(t)\Bigr) - F\Bigl(\frac{1}{w}\sum_{j=0}^{w-1} x_{i+j}\Bigr)\Bigr]

I Sketch of Proof:

I For x^{(0)} = 1, let z = x^{(\infty)} be the limiting fixed point of the recursion

I If z_\infty = 0, then we're done. Suppose z_\infty > 0

I Then z_\infty = f(z_\infty) \ge the smallest non-zero fixed point > 0 (independent of w)

I Thus U_s(z_\infty) > 0 by hypothesis

I The telescoping sum for V shows V_z(1) \le -U_s(z_\infty) < 0

I A Taylor series for V shows |V_z(1)| \le K \frac{1}{w} \bigl(1 + \sup_{x \in [0,1]} |f'(x)|\bigr)

I Thus, we get a contradiction for sufficiently large w
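A self-contained numerical sketch of the theorem's content (assumptions ours: the (3, 6) BEC example map with eps = 0.46, which lies between the uncoupled BP threshold of about 0.4294 and the MAP threshold of about 0.4881; the window w, chain length N, and iteration counts are illustrative and not tuned):

import numpy as np

eps, dv, dc = 0.46, 3, 6
f = lambda x: eps * (1.0 - (1.0 - x) ** (dc - 1)) ** (dv - 1)

x = 1.0                    # uncoupled: stalls at a nonzero fixed point
for _ in range(2000):
    x = f(x)
print("uncoupled:", x)

w, N = 4, 100              # coupled: the zero boundary drives the whole chain to 0
xs = np.ones(N)
for _ in range(20000):
    xr = np.concatenate([xs, np.full(w - 1, xs[-1])])
    ax = np.array([xr[i:i + w].mean() for i in range(N)])
    yl = np.concatenate([np.zeros(w - 1), f(ax)])
    xs = np.array([yl[i:i + w].mean() for i in range(N)])
print("coupled:", xs.max())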

Capacity Achieving Codes: There and Back Again 60 / 65

History of Threshold Saturation Proofs

I the BEC in 2010 [KRU11]

I Established many properties and tools used by later approaches

I the Curie-Weiss model of physics in 2010 [HMU12]

I CDMA using a Gaussian approximation (GA) in 2011 [TTK12]

I CDMA with outer code via GA in 2011 [Tru12]

I compressive sensing using a GA in 2011 [DJM13]

I regular codes on BMS channels in 2012 [KRU13]

I increasing scalar and vector recursions in 2012 [YJNP14]

I irregular LDPC codes on BMS channels in 2012 [KYMP14]

I non-decreasing scalar recursions in 2012 [KRU15]

I non-binary LDPC codes on the BEC in 2014 [AG16]

I and more since 2014...

Capacity Achieving Codes: There and Back Again 61 / 65

Summary and Open Problems

I Factor Graphs

I Useful tool for modeling dependent random variables

I Low-complexity algorithms for approximate inference

I Density evolution can be used to analyze performance

I Spatial Coupling

I Powerful technique for designing and understanding factor graphs

I Related to the statistical physics of supercooled liquids

I Simple proof of threshold saturation for scalar recursions

I Interesting Open Problems

I Code constructions that reduce the rate-loss due to termination

I Compute the scaling exponent for SC codes

I Finding new problems where SC provides benefits

Capacity Achieving Codes: There and Back Again 62 / 65

Thanks for your attention

Capacity Achieving Codes: There and Back Again 63 / 65

References I

[AG16] Iryna Andriyanova, Alexandre Graell i Amat. Threshold saturation for nonbinary SC-LDPC codes on the binary erasure channel.

arXiv preprint arXiv:1311.2003v4, 2016.

[DJM13] D.L. Donoho, A. Javanmard, A. Montanari.

Information-theoretically optimal compressed sensing via spatial coupling and approximate message passing.

IEEE Trans. Inform. Theory, 59(11):7434–7464, Nov. 2013.

[Gal63] Robert G. Gallager.

Low-Density Parity-Check Codes.

The M.I.T. Press, Cambridge, MA, USA, 1963.

[HMU12] S. H. Hassani, N. Macris, R. Urbanke.

Chains of mean-field models.

J. Stat. Mech., page P02011, 2012.

[KFL01] Frank R. Kschischang, Brendan J. Frey, Hans-Andrea Loeliger.

Factor graphs and the sum-product algorithm.

IEEE Trans. Inform. Theory, 47(2):498–519, Feb. 2001.

[KRU11] S. Kudekar, T. J. Richardson, R. L. Urbanke.

Threshold saturation via spatial coupling: Why convolutional LDPC ensembles perform so well over the BEC.

IEEE Trans. Inform. Theory, 57(2):803–834, Feb. 2011.

Capacity Achieving Codes: There and Back Again 64 / 65

References II

[KRU13] S. Kudekar, T. Richardson, R. L. Urbanke.

Spatially coupled ensembles universally achieve capacity under belief propagation.

IEEE Trans. Inform. Theory, 59(12):7761–7813, Dec. 2013.

[KRU15] Shrinivas Kudekar, Thomas J Richardson, Rudiger L Urbanke.

Wave-like solutions of general 1-D spatially coupled systems.

IEEE Trans. Inform. Theory, 61(8):4117–4157, 2015.

[KYMP14] Santhosh Kumar, Andrew J. Young, Nicolas Macris, Henry D. Pfister.

Threshold saturation for spatially-coupled LDPC and LDGM codes on BMS channels.

IEEE Trans. Inform. Theory, 60(12):7389–7415, Dec. 2014.

[LMSS01] M. G. Luby, M. Mitzenmacher, M. A. Shokrollahi, D. A. Spielman.

Efficient erasure correcting codes.

IEEE Trans. Inform. Theory, 47(2):569–584, Feb. 2001.

[Mac99] David J. C. MacKay.

Good error-correcting codes based on very sparse matrices.

IEEE Trans. Inform. Theory, 45(2):399–431, March 1999.

[MM09] M. Mezard, A. Montanari.

Information, Physics, and Computation.

Oxford University Press, New York, NY, 2009.

Capacity Achieving Codes: There and Back Again 65 / 65

References III

[RSU01] Thomas J. Richardson, M. Amin Shokrollahi, Rudiger L. Urbanke.

Design of capacity-approaching irregular low-density parity-check codes.

IEEE Trans. Inform. Theory, 47(2):619–637, Feb. 2001.

[RU01] Thomas J. Richardson, Rudiger L. Urbanke.

The capacity of low-density parity-check codes under message-passing decoding.

IEEE Trans. Inform. Theory, 47(2):599–618, Feb. 2001.

[RU08] Thomas J. Richardson, Rudiger L. Urbanke.

Modern Coding Theory.

Cambridge University Press, New York, NY, 2008.

[Tru12] Dmitri Truhachev.

Achieving AWGN multiple access channel capacity with spatial graph coupling.

IEEE Commun. Letters, 16(5):585–588, May 2012.

[TTK12] Keigo Takeuchi, Toshiyuki Tanaka, Tsutomu Kawabata.

A phenomenological study on threshold improvement via spatial coupling.

IEICE Trans. Fundamentals, E95-A(5):974–977, 2012.

[YJNP14] A. Yedla, Y.-Y. Jian, P. S. Nguyen, H. D. Pfister.

A simple proof of Maxwell saturation for coupled scalar recursions.

IEEE Trans. Inform. Theory, 60(11):6943–6965, Nov. 2014.