Error-Correcting Codes: Progress & Challenges
Madhu Sudan, Microsoft/MIT
02/17/2010


Page 1:

Error-Correcting Codes: Progress & Challenges

Madhu Sudan, Microsoft/MIT

02/17/2010

Page 2:

Communication in presence of noise

Sender → [Noisy Channel] → Receiver

("We are now ready" sent; "We are not ready" received.)

If information is digital, reliability is critical.

Page 3:

Shannon’s Model: Probabilistic Noise

Sender → Encode (expand) → [Noisy Channel] → Decode (compress?) → Receiver

E: Σ^k → Σ^n,  D: Σ^n → Σ^k

Probabilistic noise: e.g., every letter flipped to a random other letter of Σ w.p. p.

Focus: design good Encode/Decode algorithms.

Page 4:

Hamming Model: Worst-case error

Errors: Up to t worst-case errors.

Focus: the code C = Image(E) = {E(x) | x ∈ Σ^k} (note: not the encoding/decoding algorithms).

Goal: design the code to correct every possible pattern of t errors.

Page 5:

Problems in Coding Theory, Broadly

Combinatorics: Design best possible error-correcting codes.

Probability/Algorithms: Design algorithms correcting random/worst-case errors.


Page 6:

Part I (of III):

Combinatorial Results


Page 7:

Hamming Notions

Hamming Distance: Δ(x,y) = |{i | x_i ≠ y_i}|

Distance of Code: Δ(C) = min_{x≠y ∈ C} Δ(x,y)

A code of distance 2t+1 corrects t errors.

Main question: four parameters: length n, message length k, distance d, alphabet size q = |Σ|. How do they relate? Want small n, large k, large d, small q.

Asymptotically: let R = k/n and δ = d/n. How do R, δ, q relate?
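The Hamming notions above translate directly into code. A minimal Python sketch (my own illustration, not from the talk):

```python
from itertools import combinations

def hamming_distance(x, y):
    """Delta(x, y) = number of positions where x and y differ."""
    assert len(x) == len(y)
    return sum(xi != yi for xi, yi in zip(x, y))

def code_distance(C):
    """Delta(C) = minimum distance over all pairs of distinct codewords."""
    return min(hamming_distance(x, y) for x, y in combinations(C, 2))

# Binary repetition code of length 5: distance 5 = 2*2 + 1, so it corrects t = 2 errors.
rep5 = ["00000", "11111"]
print(code_distance(rep5))                 # 5
print(hamming_distance("00000", "01010"))  # 2
```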

Page 8:

Simple results

Ball(x,r) = {y ∈ Σ^n | Δ(x,y) ≤ r}. Volume of ball: Vol(q,n,r) = |Ball(x,r)|. Entropy function: H_q(δ) = c s.t. Vol(q,n,δn) ≈ q^{cn}.

Hamming (Packing) Bound: balls of radius δn/2 around codewords are disjoint, so

q^k · q^{H_q(δ/2)·n} ≤ q^n, i.e., R + H_q(δ/2) ≤ 1.
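Vol and H_q are easy to evaluate numerically; a sketch (my own, with n = 2000 as an arbitrary large length to approximate the asymptotic entropy):

```python
from math import comb, log

def vol(q, n, r):
    """Vol(q, n, r) = |Ball(x, r)| = sum over i <= r of C(n, i) * (q-1)^i."""
    return sum(comb(n, i) * (q - 1) ** i for i in range(r + 1))

def Hq(q, delta, n=2000):
    """Numerical stand-in for the entropy function: log_q Vol(q, n, delta*n) / n."""
    return log(vol(q, n, int(delta * n))) / (n * log(q))

# Packing-bound sanity check at q = 2, delta = 0.3: R <= 1 - H_2(delta/2).
print(Hq(2, 0.15))  # close to H_2(0.15) ~ 0.61, so R <= ~0.39 here
```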

Page 9:

Simple results (contd.)

Gilbert-Varshamov (Greedy) Bound: let C: Σ^k → Σ^n be a maximal code of distance d. Then balls of radius d−1 around codewords cover Σ^n, so

q^k · q^{H_q(δ)·n} ≥ q^n, i.e., R ≥ 1 − H_q(δ).
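The greedy argument behind the GV bound can be run directly for tiny parameters; a toy sketch (n = 7, d = 3 are my own illustrative choices):

```python
from itertools import product, combinations

def hamming_distance(x, y):
    return sum(a != b for a, b in zip(x, y))

def greedy_gv_code(n, d, q=2):
    """Greedily build a code of length n and distance >= d over a q-ary alphabet:
    keep a word only if it is at distance >= d from every codeword chosen so far.
    The resulting code is maximal, so GV guarantees >= q^n / Vol(q, n, d-1) words."""
    code = []
    for w in product(range(q), repeat=n):
        if all(hamming_distance(w, c) >= d for c in code):
            code.append(w)
    return code

C = greedy_gv_code(n=7, d=3)
print(len(C))  # GV guarantees at least 2^7 / Vol(2,7,2) = 128/29 > 4 codewords
```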

Page 10:

Simple results (Summary)

For the best code:

1 − H_q(δ) ≤ R ≤ 1 − H_q(δ/2)

Which is right? After fifty years of research, we still don't know.

Page 11:

Binary case (q = 2):

Case of large distance: δ = ½ − ε, ε → 0. Here Ω(ε^2) ≤ R ≤ O*(ε^2) (lower bound: GV/Chernoff; upper bound: the LP bound).

Case of small (relative) distance: no upper bound better than R ≤ 1 − (1−o(1)) · H(δ/2) (Hamming).

Case of constant distance d: (d/2)·log n ≥ n−k ≥ (1−o(1))·(d/2)·log n (upper bound: BCH; lower bound: Hamming).

Page 12:

Binary case (Closer look):

For general n, d: # codewords ≥ 2^n / Vol(2, n, d−1).

Can we do better? Twice as many codewords? (Won't change the asymptotics of R, δ.)

Recent progress [Jiang-Vardy]: # codewords ≥ d · 2^n / Vol(2, n, d−1).

Page 13:

Major questions in binary codes:

Give an explicit construction meeting the GV bound. Specifically: codes with δ = ½ − ε and R = Ω(ε^2).

Is Hamming tight when δ → 0? Do there exist codes of distance δ with R = 1 − c·(1−o(1))·δ·log_2(1/δ) for c < 1? [Hamming: c > ½]

Is the LP bound tight?

Page 14:

Combinatorics (contd.): q-ary case

Fix δ and let q → ∞ (then fix q and let n → ∞).

Surprising result ('80s): Algebraic Geometry yields R ≥ 1 − δ − 1/(√q − 1).

(Also a negative surprise: BCH codes only yield 1 − R ≤ ((q−1)/q)·log_q n.)

Summary for large q: 1 − δ − O(1/log q) ≤ R ≤ 1 − δ − 1/q (lower bound: GV; upper bound: Plotkin, not Hamming).

Page 15:

Major questions: q-ary case

Suppose R = 1 − δ − f(q).

What is the fastest-decaying function f(·)? (Somewhere between 1/√q and 1/q.) Give a simple explanation for why f(q) ≤ 1/√q.

Fix d, and let q → ∞. How does (n−k)/(d·log_q n) grow in the limit? Is it 1 or ½? Or somewhere in between?

Page 16:

Part II (of III):

Correcting Random Errors


Page 17:

Recall Shannon 1948

Σ-symmetric channel with error prob. p: transmits σ ∈ Σ as σ w.p. 1−p, and as τ ∈ Σ∖{σ} w.p. p/(q−1).

Shannon's Coding Theorem: can transmit at rate R = 1 − H_q(p) − ε, for every ε > 0. That is, if R = 1 − H_q(p) − ε, then for every n and k = Rn, there exist E: Σ^k → Σ^n and D: Σ^n → Σ^k s.t. Pr_{Channel,x} [D(Channel(E(x))) ≠ x] ≤ exp(−n).

Converse Coding Theorem: cannot transmit at rate R = 1 − H_q(p) + ε.

So: no mysteries?
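The Σ-symmetric channel is simple to simulate; a sketch (my own, with illustrative parameters q = 4, p = 0.1):

```python
import random

def qsc(word, p, q, rng=random):
    """Sigma-symmetric channel: each symbol is kept w.p. 1-p, and otherwise
    replaced by a uniformly random *other* symbol of the q-ary alphabet."""
    out = []
    for s in word:
        if rng.random() < p:
            out.append(rng.choice([t for t in range(q) if t != s]))
        else:
            out.append(s)
    return out

random.seed(0)
sent = [0] * 1000
received = qsc(sent, p=0.1, q=4)
flips = sum(1 for a, b in zip(sent, received) if a != b)
print(flips / 1000)  # empirical error fraction, close to p = 0.1
```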

Page 18:

Constructive versions

Shannon's functions: E random, D brute-force search.

Can we get poly-time E, D? [Forney '66]: Yes! (Using Reed-Solomon codes correcting an ε-fraction of errors + composition.)

[Sipser-Spielman '92, Spielman '94, Barg-Zemor '97]: Even in linear time!

Still didn't satisfy practical needs. Why?

[Berrou et al. '92] Turbo codes + belief propagation: no theorems; much excitement.

Page 19:

What is satisfaction?

Articulated by [Luby, Mitzenmacher, Shokrollahi, Spielman '96].

Practically interesting question: n = 10000, q = 2, p = 0.1, desired error prob. = 10^{-6}; k = ?

[Forney '66]: decoding time exp(1/(1 − H(p) − k/n)); at rate 90%, decoding time ≥ 2^100.

Right question: reduce decoding time to poly(n, 1/ε), where ε = 1 − H(p) − k/n.
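The slack ε above is easy to work out numerically; a sketch (my own arithmetic, with R = 0.5 as an illustrative rate below the capacity of the BSC(0.1); the slide's 90% figure is its own example):

```python
from math import log2, exp

def H2(p):
    """Binary entropy function H(p) = -p log2 p - (1-p) log2 (1-p)."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.1
capacity = 1 - H2(p)   # max achievable rate on the binary symmetric channel
R = 0.5                # illustrative rate below capacity
eps = capacity - R     # the slack eps = 1 - H(p) - k/n

print(round(capacity, 3))  # 0.531
print(round(eps, 3))       # 0.031
# A decoder paying exp(1/eps) is already astronomical at this modest slack:
print(f"exp(1/eps) ~ {exp(1 / eps):.1e}")
```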

Page 20:

Current state of the art

Luby et al.: propose the study of codes based on irregular graphs ("Irregular LDPC Codes").

No theorems so far for erroneous channels. Strong analysis for the (much) simpler case of erasure channels (symbols are erased); decoding time = O(n log(1/ε)). (Easy to get "composition"-based algorithms with decoding time = O(n·poly(1/ε)).)

Do have some proposals for errors as well (with analysis by Luby et al., Richardson & Urbanke), but none known to converge to the Shannon limit.

Page 21:

Part III:

Correcting Adversarial Errors


Page 22:

Motivation:

As notions of communication/storage get more complex, modeling error as oblivious (to message/encoding/decoding) may be too simplistic.

Need more general models of error + encoding/decoding for such models.

Most pessimistic model: errors are worst-case.


Page 23:

Gap between worst-case & random errors

In the Shannon model, with a binary channel: can correct up to 50% (random) errors (a 1 − 1/q fraction of errors if the channel is q-ary).

In the Hamming model, for a binary channel: a code with more than n codewords has distance at most 50%, so it corrects at most 25% worst-case errors (½·(1 − 1/q) errors in the q-ary case).

The Shannon model corrects twice as many errors: need new approaches to bridge the gap.


Page 25:

List-decoding

Main reason for the gap between Shannon & Hamming: the insistence on uniquely recovering the message.

List-decoding [Elias '57, Wozencraft '58]: relaxed notion of recovery from error. The decoder produces a small list (of L codewords) such that it includes the message.

A code is (p, L)-list-decodable if it corrects a p fraction of errors with lists of size L.
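The (p, L) definition corresponds to a brute-force decoder; a sketch (my own, with a toy three-word code for illustration):

```python
def hamming_distance(x, y):
    return sum(a != b for a, b in zip(x, y))

def list_decode(code, received, p):
    """Return all codewords within radius p*n of the received word.
    A code is (p, L)-list-decodable iff this list never exceeds size L."""
    n = len(received)
    return [c for c in code if hamming_distance(c, received) <= p * n]

# Toy code of length 4: the received word sits within radius 2 of two codewords,
# so unique decoding fails but list decoding returns both candidates.
code = [(0, 0, 0, 0), (1, 1, 1, 1), (0, 0, 1, 1)]
print(list_decode(code, (0, 0, 0, 1), p=0.5))  # [(0, 0, 0, 0), (0, 0, 1, 1)]
```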

Page 26:

What to do with list?

Probabilistic error: the list has size one w.p. nearly 1.

General channel: need side information of only O(log n) bits to disambiguate [Guruswami '03]. (Alternatively, if sender and receiver share O(log n) bits, then they can disambiguate [Langberg '04].)

Computationally bounded error: model introduced by [Lipton, Ding-Gopalan-L.]. List-decoding results can be extended (assuming PKI and some memory at the sender) [Micali et al.].

Page 27:

List-decoding: State of the art

[Zyablov-Pinsker / Blinovskii, late '80s]: there exist codes of rate 1 − H_q(p) − ε that are (p, O(1))-list-decodable.

Matches Shannon's converse perfectly! (So we can't do better even for random error!)

But [ZP/B] is non-constructive!

Page 28:

Algorithms for List-decoding

Not examined till '88.

First results: [Goldreich-Levin] for "Hadamard" codes (non-trivial in their setting).

More recent work: [S. '96, Shokrollahi-Wasserman '98, Guruswami-S. '99, Parvaresh-Vardy '05, Guruswami-Rudra '06]: decode algebraic codes.

[Guruswami-Indyk '00-'02]: decode graph-theoretic codes.

Page 29:

Results in List-decoding

q-ary case: [Guruswami-Rudra '06] codes of rate R correcting a 1 − R − ε fraction of errors, with q = q(ε). Matches the Shannon bound (except for q(ε)).

Binary case: ∃ codes of rate ε^c correcting a ½ − ε fraction of errors:
c = 4: Guruswami et al. 2000;
c → 3: implied by Parvaresh-Vardy '05;
c = 3: Guruswami-Rudra.

Page 30:

Major open question

Construct a (p, O(1))-list-decodable binary code of rate 1 − H(p) − ε with poly-time list decoding.

Note: if the running time is poly(1/ε), then this also implies a solution to the random-error problem.

Page 31:

Conclusions

Coding theory: Very practically motivated problems; solutions influence (if not directly alter) practice.

Many mysteries remain in combinatorial setting.

Significant progress in algorithmic setting, but many more questions to resolve.
