fast and robust sparse recovery

Fast and Robust Sparse Recovery: New Algorithms and Applications
Sheng Cai, Eric Chan, Minghua Chen, Sidharth Jaggi, Mohammad Jahangoshahi, Venkatesh Saligrama, Mayank Bakshi
The Institute of Network Coding (INC), The Chinese University of Hong Kong

Posted on 24-Feb-2016


TRANSCRIPT

Page 1: Fast and robust sparse recovery

Fast and robust sparse recovery: New Algorithms and Applications

Sheng Cai, Eric Chan, Minghua Chen, Sidharth Jaggi, Mohammad Jahangoshahi, Venkatesh Saligrama, Mayank Bakshi

The Institute of Network Coding (INC), The Chinese University of Hong Kong

Page 2: Fast and robust sparse recovery

[Slide figure: an unknown k-sparse vector x of length n is measured to produce a measurement output of length m, with m < n; the goal is to reconstruct x from the output.]

Page 3: Fast and robust sparse recovery

A. Compressive sensing

[Slide figure: the compressive sensing setup: a k-sparse unknown vector of length n, recovered from m measurements with k ≤ m < n.]

Page 4: Fast and robust sparse recovery

A. Robust compressive sensing

y = A(x + z) + e

z: approximate sparsity (x is k-sparse up to the perturbation z); e: measurement noise.
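The model above can be instantiated in a few lines. The following is a minimal sketch assuming a dense Gaussian measurement matrix (the talk's own constructions use sparse graph-based matrices); n, m, k and the noise levels are chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 1000, 100, 5

x = np.zeros(n)                          # exactly k-sparse signal
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

z = 0.01 * rng.normal(size=n)            # approximate-sparsity perturbation
e = 0.01 * rng.normal(size=m)            # measurement noise

A = rng.normal(size=(m, n)) / np.sqrt(m) # dense Gaussian stand-in for A
y = A @ (x + z) + e                      # the observed measurements
```

Any recovery algorithm then sees only y and A, and must return an estimate close to x.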

Page 5: Fast and robust sparse recovery

Computerized Axial Tomography (CAT scan)

Page 6: Fast and robust sparse recovery

B. Tomography

Estimate x given y and T

y = Tx

Page 7: Fast and robust sparse recovery

B. Network Tomography

Measurements y: end-to-end packet delays
Transform T: network connectivity matrix (known a priori)
Infer x: link/node congestion, hopefully "k-sparse"

Compressive sensing?

Challenge: the matrix T is "fixed", and only "some" types of measurements can be taken.

Page 8: Fast and robust sparse recovery

C. Robust group testing

[Slide: the noisy group testing setup: n items of which d are defective, with test outcomes flipped with probability q; a lower bound on the number of tests for Pr(error) < ε, and what's known for Noisy Combinatorial OMP [CCJS11].]

Page 9: Fast and robust sparse recovery

A. Robust compressive sensing

y = A(x + z) + e

z: approximate sparsity; e: measurement noise.

Page 10: Fast and robust sparse recovery

Apps: 1. Compression

[Slide: compression as measurement: the signal x + z is transformed as W(x + z) and then subsampled, so BW(x + z) = A(x + z).]

M.A. Davenport, M.F. Duarte, Y.C. Eldar, and G. Kutyniok, "Introduction to Compressed Sensing," in Compressed Sensing: Theory and Applications, 2012.

Page 11: Fast and robust sparse recovery

Apps: 2. Fast(er) Fourier Transform

H. Hassanieh, P. Indyk, D. Katabi, and E. Price, "Nearly Optimal Sparse Fourier Transform," in Proceedings of the 44th Symposium on Theory of Computing (STOC '12).

Page 12: Fast and robust sparse recovery

Apps: 3. One-pixel camera

http://dsp.rice.edu/sites/dsp.rice.edu/files/cs/cscam.gif

Page 13: Fast and robust sparse recovery

y = A(x + z) + e

Page 17: Fast and robust sparse recovery

y = A(x + z) + e

(Information-theoretically) order-optimal

Page 18: Fast and robust sparse recovery

(Information-theoretically) order-optimal

• Support Recovery


Page 19: Fast and robust sparse recovery

SHO-FA: SHO(rt)-FA(st)

Page 20: Fast and robust sparse recovery

O(k) measurements, O(k) time

Page 21: Fast and robust sparse recovery

1. Graph-Matrix

[Slide figure: a sparse bipartite graph with n left (signal) nodes and ck right (measurement) nodes, each left node having degree d = 3; this graph defines the measurement matrix A.]
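Such a graph-matrix can be sketched as follows; the parameters n, k, c and the uniformly random choice of neighbors are illustrative assumptions, not the talk's exact ensemble.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, c, d = 200, 5, 8, 3
m = c * k                                 # ck measurement (right) nodes

# Each of the n signal coordinates (left nodes) connects to exactly
# d = 3 measurements chosen at random, giving a sparse 0/1 matrix A.
A = np.zeros((m, n))
for j in range(n):
    rows = rng.choice(m, size=d, replace=False)
    A[rows, j] = 1.0
```

Each column then has exactly d ones, so computing A @ x costs O(dn) rather than O(mn) operations.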

Page 22: Fast and robust sparse recovery

1. Graph-Matrix

[Slide figure: the same bipartite graph-matrix A with n left nodes, ck right nodes, and degree d = 3.]

Page 23: Fast and robust sparse recovery


1. Graph-Matrix

Page 24: Fast and robust sparse recovery

2. (Most) x-expansion

[Slide figure: most subsets S of left nodes expand: their neighborhood has size at least 2|S|.]

Page 25: Fast and robust sparse recovery

3. “Many” leafs

Let S be the support of x, let L count the leaf measurement nodes (neighbors of exactly one node in S), and let L' count the remaining neighbors of S. Expansion gives L + L' ≥ 2|S|, while counting the 3|S| edges leaving S gives 3|S| ≥ L + 2L'. Subtracting, L' ≤ |S|, hence L ≥ |S|; together with L + L' ≤ 3|S| this yields L/(L + L') ≥ 1/3, and since L ≥ |S| ≥ L', in fact L/(L + L') ≥ 1/2.

Page 26: Fast and robust sparse recovery

4. Matrix


Page 27: Fast and robust sparse recovery

Encoding – Recap.

[Slide figure: encoding recap, with a sample sparse input vector (0, 1, 0, 1, 0) multiplied by the graph-matrix.]

Page 28: Fast and robust sparse recovery

Decoding – Initialization


Page 29: Fast and robust sparse recovery

Decoding – Leaf Check (2-Failed-ID)


Page 30: Fast and robust sparse recovery

Decoding – Leaf Check (4-Failed-VER)


Page 31: Fast and robust sparse recovery

Decoding – Leaf Check (1-Passed)


Page 32: Fast and robust sparse recovery

Decoding – Step 4 (4-Passed/STOP)


Page 33: Fast and robust sparse recovery

Decoding – Recap.

[Slide figure: decoding animation in which unknown entries (?) are resolved one leaf at a time until the sparse vector (0, 1, 0, 1, 0) is recovered.]

Page 34: Fast and robust sparse recovery

Decoding – Recap.

[Slide figure: the fully recovered sparse vector (0, 1, 0, 1, 0).]
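The leaf-check-and-peel loop above can be sketched in code. This is a simplified noiseless peeling decoder in the spirit of SHO-FA, not the authors' exact algorithm: the identity-weighted second measurement y_id, which lets a leaf reveal both the location and the value of its single neighbor, is an assumption standing in for the paper's identification/verification measurements.

```python
import numpy as np

def peel_decode(A, y_sum, y_id, max_sweeps=100):
    """A: 0/1 graph-matrix; y_sum = A x; y_id = A (diag(1..n) x)."""
    m, n = A.shape
    x_hat = np.zeros(n)
    for _ in range(max_sweeps):
        progressed = False
        for i in range(m):
            if abs(y_sum[i]) < 1e-9:
                continue                      # nothing left on this node
            loc = y_id[i] / y_sum[i]          # equals j + 1 iff i is a leaf
            j = int(round(loc)) - 1
            # leaf check: location must be a valid neighbor of node i
            if 0 <= j < n and A[i, j] == 1 and abs(loc - (j + 1)) < 1e-6:
                val = y_sum[i]
                x_hat[j] += val               # recover coordinate j...
                nbrs = A[:, j] == 1
                y_sum[nbrs] -= val            # ...and peel it off every
                y_id[nbrs] -= (j + 1) * val   # measurement it touches
                progressed = True
        if not progressed:
            break
    return x_hat

# Tiny worked example: a 4x4 graph-matrix and a 2-sparse signal.
A = np.array([[1, 0, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)
x = np.array([2.0, 0.0, 0.0, 3.0])
y_sum = A @ x
y_id = A @ (np.arange(1, 5) * x)
x_hat = peel_decode(A, y_sum.copy(), y_id.copy())
```

Each peeled coordinate may turn other measurement nodes into fresh leaves, which is what drives the O(k)-time decoding claim.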

Page 35: Fast and robust sparse recovery

Noise/approx. sparsity


Page 36: Fast and robust sparse recovery

Meas/phase error


Page 37: Fast and robust sparse recovery

Correlated phase measurements

Page 40: Fast and robust sparse recovery

Network Tomography

• Goal: infer network characteristics (edge or node delays)
• Difficulties:
  – Edge-by-edge (or node-by-node) monitoring is too slow
  – Inaccessible nodes

Page 41: Fast and robust sparse recovery

Network Tomography

• Goal: infer network characteristics (edge or node delays)
• Difficulties:
  – Edge-by-edge (or node-by-node) monitoring is too slow
  – Inaccessible nodes
• Network tomography:
  – with very few end-to-end measurements
  – quickly
  – for arbitrary network topologies

Page 42: Fast and robust sparse recovery

B. Network Tomography

Measurements y: end-to-end packet delays
Transform T: network connectivity matrix (known a priori)
Infer x: link/node congestion, hopefully "k-sparse"

Compressive sensing? Idea: "mimic" a random matrix. Challenge: the matrix T is "fixed", and only "some" types of measurements can be taken.

Our algorithm: FRANTIC (Fast Reference-based Algorithm for Network Tomography vIa Compressive sensing)
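The y = Tx model can be made concrete with a toy topology; the three paths and five links below are invented for illustration and are not from the talk.

```python
import numpy as np

# Each row of T is the 0/1 incidence vector of one end-to-end path,
# x holds per-link delays (sparse when few links are congested),
# and y collects the measured end-to-end path delays.
T = np.array([
    [1, 1, 0, 0, 0],   # path 1 traverses links 0 and 1
    [0, 1, 1, 0, 0],   # path 2 traverses links 1 and 2
    [0, 0, 1, 1, 1],   # path 3 traverses links 2, 3 and 4
])
x = np.array([0.0, 7.0, 0.0, 0.0, 0.0])   # only link 1 is congested
y = T @ x                                  # observed path delays
```

The challenge noted above is visible even here: the rows of T are dictated by which paths the network lets us probe, so they cannot simply be drawn at random as in standard compressive sensing.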

Page 43: Fast and robust sparse recovery
Page 44: Fast and robust sparse recovery

SHO-FA

[Slide figure: the SHO-FA bipartite graph-matrix A with n left nodes, ck right nodes, and degree d = 3.]

Page 45: Fast and robust sparse recovery

1. Integer-valued CS [BJCC12]: "SHO-FA-INT"

[Slide figure: the transform T.]

Page 46: Fast and robust sparse recovery

2. Better mimicking of desired T

Page 47: Fast and robust sparse recovery

Node delay estimation

[Slide figures: animation on a four-node network (v1, v2, v3, v4) illustrating node delay estimation.]

Page 50: Fast and robust sparse recovery

Edge delay estimation

[Slide figure: a network with edges e1 through e6.]

Page 51: Fast and robust sparse recovery

Idea 1: Cancellation


Page 52: Fast and robust sparse recovery

Idea 2: "Loopy" measurements

• Fewer measurements
• Arbitrary packet injection/reception
• Not just 0/1 matrices (as in SHO-FA)

Page 53: Fast and robust sparse recovery
Page 54: Fast and robust sparse recovery

C. GROTESQUE: Noisy GROup TESting (QUick and Efficient)

Page 55: Fast and robust sparse recovery

[Slide: the noisy group testing setup: n items of which d are defective, with test outcomes flipped with probability q; a lower bound on the number of tests for Pr(error) < ε, and what's known for Noisy Combinatorial OMP [CCJS11].]

Page 56: Fast and robust sparse recovery

[Slide table: decoding complexity vs. number of tests for the lower bounds and for adaptive, non-adaptive, and 2-stage adaptive schemes, with entries including O(poly(D) log N), O(D² log N) and O(DN), O(D log N) [NPR12], compared against this work.]

Page 57: Fast and robust sparse recovery

[Slide figure: plot of decoding complexity vs. number of tests, highlighting this work.]

Page 58: Fast and robust sparse recovery

Hammer: GROTESQUE testing

Page 59: Fast and robust sparse recovery

Multiplicity


Page 60: Fast and robust sparse recovery

Localization


Noiseless:

Noisy:

Page 61: Fast and robust sparse recovery

Nail: “Good” Partitioning

GROTESQUE

n items, d defectives

Page 62: Fast and robust sparse recovery

Adaptive Group Testing

O(n/d)

Page 63: Fast and robust sparse recovery

Adaptive Group Testing

O(n/d)

[Slide figure: several GROTESQUE instances applied in parallel, one per group.]

O(d log n) time and tests; a constant fraction of the defectives recovered

Page 64: Fast and robust sparse recovery

Adaptive Group Testing

• Each stage recovers a constant fraction of the remaining defectives
• The number of tests and the running time decay geometrically across stages

Page 65: Fast and robust sparse recovery

Adaptive Group Testing

T = O(log D)
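As a contrast to the multi-stage GROTESQUE scheme, plain adaptive binary splitting already shows why adaptivity helps: a negative pooled test clears its whole group at once, so the remaining work shrinks geometrically. The sketch below is a generic textbook-style procedure (noiseless and simplified), not the talk's algorithm.

```python
def find_defectives(items, is_defective):
    """Find all defectives via pooled tests and recursive splitting."""
    tests = 0

    def test(group):                 # one pooled test on a group
        nonlocal tests
        tests += 1
        return any(is_defective(i) for i in group)

    def search(group):
        if not test(group):          # negative: nobody here is defective
            return []
        if len(group) == 1:
            return list(group)
        mid = len(group) // 2
        return search(group[:mid]) + search(group[mid:])

    return search(list(items)), tests

# 64 items, 2 defectives: far fewer than 64 pooled tests are needed.
defectives = {3, 42}
found, tests = find_defectives(range(64), lambda i: i in defectives)
```

With d defectives among n items this uses roughly O(d log n) tests, but over many fully sequential rounds; the talk's point is achieving similar efficiency with only a constant or logarithmic number of adaptive stages.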

Page 66: Fast and robust sparse recovery

Non-Adaptive Group Testing

Constant fraction “good”

O(D log D)

Page 67: Fast and robust sparse recovery

Non-Adaptive Group Testing

Iterative Decoding

Page 68: Fast and robust sparse recovery

2-Stage Adaptive Group Testing

= D²

Page 69: Fast and robust sparse recovery

D. Threshold Group Testing

[Slide figure: the probability that a test output is positive, as a function of the number of defective items in the group: it rises from 0 to 1 between the lower threshold l and the upper threshold u.]

n items, d defectives

Each test:

Goal: find all d defectives

Our result: tests suffice; Previous best algorithms:

Page 70: Fast and robust sparse recovery

Summary

• Fast and robust sparse recovery algorithms
• Compressive sensing: order-optimal complexity and number of measurements
• Network tomography: nearly optimal complexity and number of measurements
• Group testing: optimal complexity, nearly optimal number of tests
• Threshold group testing: nearly optimal number of tests

Page 71: Fast and robust sparse recovery

THANK YOU 謝謝