TRANSCRIPT
Cooperative Compressed Sensing for
Decentralized Networks
Zhi (Gerry) Tian
Dept. of ECE, Michigan Tech Univ.
February 18, 2011
A presentation at
Sparsity-Aware Sensing and Communications Z. Tian, Michigan Tech
Ground-Breaking Recent Advances
Compressive sampling [Chen-Donoho-Saunders'98], [Candès et al.'04-06]
  Given y and H, the unknown s can be found with high probability, provided:
  (a1) s is sparse (locations of the nonzero entries unknown)
  (a2) H can be fat (K << N) and satisfies the restricted isometry property (RIP)
Sparse regression [Tibshirani'96], [Tipping'01]
  Least absolute shrinkage and selection operator (Lasso): min_s ||y - Hs||_2^2 + lambda*||s||_1
  variable selection + estimation via the l1 penalty
  Ex. (scalar case): closed-form solution (soft threshold)
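The scalar closed form mentioned above is the soft-threshold operator. A minimal sketch (the function name and the numbers are mine, for illustration only):

```python
import math

def soft_threshold(z: float, lam: float) -> float:
    """Scalar Lasso: argmin_s 0.5*(s - z)**2 + lam*abs(s).
    Inputs below the threshold lam are set exactly to zero (variable
    selection); larger inputs are shrunk toward zero by lam (estimation)."""
    return math.copysign(max(abs(z) - lam, 0.0), z)

print(soft_threshold(0.3, 0.5))   # 0.0  -> pruned
print(soft_threshold(2.0, 0.5))   # 1.5  -> kept, shrunk by lam
print(soft_threshold(-2.0, 0.5))  # -1.5
```

This one-liner is the building block reused later in the talk, both as the per-coordinate update inside iterative Lasso solvers and as the fusion rule that discards noisy spectrum estimates.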
Outline
Sparsity-aware sensing for global awareness
e.g., spectrum sensing in cognitive radio networks
Sparsity-aware sensing for local awareness
e.g., localized event detection in wireless sensor networks
Decentralized cooperative sensing
Summary and future research
“Sparsity-Aware Sensing in Networked Environments”
Sensing Network: Signal Model
signal vector s is sparse w.r.t. the grid points
source locations coincide with grid points of known locations, so localization comes as a byproduct
Two deployments: a virtual grid, where # sources (N) << # grid points (Ns), or a densely deployed sensor grid, where # sources (N) << # sensors (Nr)
Unknowns: signal source locations and amplitudes
Network Data Model
Sensor readings are additive in the source contributions
Hi is distance-dependent and known (obtained via learning)
Objective: recover the sparse s from the noisy sensor readings
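The recovery objective can be sketched as a Lasso solved by iterative soft thresholding (ISTA). This is a generic illustration, not the talk's exact algorithm; the dimensions, lambda, noise level, and seed are all made up:

```python
import numpy as np

def ista(H, y, lam=0.1, n_iter=500):
    """Solve min_s 0.5*||y - H s||_2^2 + lam*||s||_1 by proximal gradient:
    a gradient step on the quadratic fit, then a soft-threshold step."""
    L = np.linalg.norm(H, 2) ** 2           # Lipschitz constant of the gradient
    s = np.zeros(H.shape[1])
    for _ in range(n_iter):
        z = s - H.T @ (H @ s - y) / L       # gradient step
        s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return s

rng = np.random.default_rng(0)
Nr, Ns = 30, 100                            # fewer sensors than grid points (fat H)
H = rng.standard_normal((Nr, Ns)) / np.sqrt(Nr)
s_true = np.zeros(Ns); s_true[[7, 42, 77]] = [1.0, -0.8, 0.6]
y = H @ s_true + 0.01 * rng.standard_normal(Nr)
s_hat = ista(H, y)
print(np.flatnonzero(np.abs(s_hat) > 0.1))  # indices of the detected support
```

The nonzero indices of s_hat give the source locations (localization as a byproduct), and the nonzero values give the amplitudes.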
[Figure: network architectures. Centralized: all local info flows to a FUSION CENTER, which forms the global info and global awareness. Decentralized: no fusion center; nodes exchange local info directly, for either global or local awareness, gaining scalability, robustness, and independence from infrastructure]
Decentralized Processing for Global Awareness
Spectrum Sensing in Cooperative Cognitive Radio Networks
[Figure: centralized, global awareness requires long-range or multi-hop communication to the fusion center and raises security issues; decentralized, global awareness uses only one-hop exchanges (communication range rC) among the neighboring sensors of each sensor i]
Spectrum Scarcity Problem
Fixed spectrum access policies (US FCC) leave the useful radio spectrum pre-assigned, resulting in inefficient utilization
"Scarcity vs. Underutilization Dilemma"
[Figure: measured PSD over 0-6 GHz; source: Spectrum Sharing Inc.]
Cognitive Radio (CR)
CRs opportunistically use the spectrum under a user hierarchy
Cognitive radio network problems
Finding holes in the spectrum: wideband spectrum sensing
Allocating the open spectrum: dynamic resource allocation
Adjusting the transmit waveforms: waveform adaptation
[Figure: power vs. frequency, with cognitive radios transmitting in the gaps between legacy users; legacy user = Primary User (PU), cognitive radio = Secondary User (SU)]
Efficient Sharing Requires Sensing
Multiple CRs jointly detect the spectrum [Ghasemi-Sousa'05, Ganesan-Li'05, Bazerque-Giannakis'08, Tian'08]
Benefits:
  spatial diversity gain mitigates multipath fading and shadowing: multiple (random) paths are unlikely to fade simultaneously
  reduced sensing time and local processing
  increased reliability and the ability to detect hidden terminals
Tradeoff: cooperation gain vs. network overhead
[Figure: spatial diversity against fading; source: Office of Communications (UK)]
Distributed Cooperative CR Sensing
Idea: CRs collaborate to form a spatial map of the spectrum
Goal: a map of the power spectral density (PSD) across space and frequency
Specifications: a coarse approximation suffices, so Compressive Sampling (CS) is possible
Approach: basis expansion to form the PSD data
Modeling
[Figure: transmitters and sensing CRs deployed in space; frequency bases vs. sensed frequencies]
Sparsity is present in both space and frequency
Space-Frequency Basis Expansion
Superimposed Tx spectra measured at CR r, weighted by the average path loss and expanded over frequency bases
The result is a linear model in the unknown expansion coefficients
Sparse Regression
Seek a sparse s that captures the spectrum measured at all CRs
Lasso: min_s ||y - Hs||_2^2 + lambda*||s||_1
The soft threshold shrinks noisy estimates to zero
Similar to Akaike's Information Criterion, it penalizes the number of parameters
spectrum selection + estimation via the ||.||_1 penalty
The power spectrum is non-negative, which motivates non-negativity constraints
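With the non-negativity constraint, the soft threshold becomes one-sided: the prox of lambda*sum(s) over s >= 0 simply clips shifted values at zero. A small sketch (function name and numbers are mine):

```python
import numpy as np

def nonneg_soft_threshold(z, lam):
    """Prox of lam*||s||_1 restricted to s >= 0: max(z - lam, 0) elementwise.
    For power spectra the l1 penalty reduces to lam*sum(s), and negative
    (physically meaningless) estimates are clipped to zero."""
    return np.maximum(z - lam, 0.0)

z = np.array([0.8, -0.4, 0.2])
print(nonneg_soft_threshold(z, 0.3))   # approximately [0.5, 0.0, 0.0]
```

Replacing the two-sided shrinkage with this one-sided rule inside an ISTA loop yields a non-negative Lasso for PSD estimation.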
Consensus-based Distributed Optimization
Decentralized operation brings scalability, robustness, and freedom from infrastructure
Centralized Lasso: min_s sum_r ||y_r - H_r s||_2^2 + lambda*||s||_1
Decentralized equivalent: each node keeps a local copy of s, with constraints that impose consensus across the network
Each subproblem is solvable locally, with exchange of local estimates among one-hop neighbors
Decentralized Algorithm
Alternating-direction method of multipliers (ADMoM) applied to the augmented Lagrange function
Iterative implementation:
  each CR i reconstructs locally
  each CR i updates its multipliers
  each CR i broadcasts its local decision one hop
Scalable: one-hop communication, local computation
Globally optimal: guaranteed if the network is connected
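The three local steps can be sketched as a consensus-ADMM loop for a distributed Lasso, in the spirit of the slide rather than a faithful reproduction of the talk's derivation; the ring network, data sizes, and parameters (c, lam, iteration counts) are invented for illustration:

```python
import numpy as np

def decentralized_lasso(H_list, y_list, neighbors, lam=0.1, c=1.0,
                        outer=150, inner=100):
    """Consensus ADMM: node i keeps a local copy S[i] of the global sparse
    vector, reconstructs it locally, updates its multipliers P[i], and
    exchanges S[i] with one-hop neighbors; all copies agree at convergence."""
    N, Ns = len(H_list), H_list[0].shape[1]
    S = [np.zeros(Ns) for _ in range(N)]          # local decision vectors
    P = [np.zeros(Ns) for _ in range(N)]          # local Lagrange multipliers
    for _ in range(outer):
        S_old = [s.copy() for s in S]
        for i in range(N):                        # 1) local reconstruction
            d = len(neighbors[i])
            A = H_list[i].T @ H_list[i] + 2.0 * c * d * np.eye(Ns)
            b = (H_list[i].T @ y_list[i] - P[i]
                 + c * sum(S_old[i] + S_old[j] for j in neighbors[i]))
            L = np.linalg.norm(A, 2)
            s = S[i]
            for _ in range(inner):                # inner ISTA on the local Lasso
                z = s - (A @ s - b) / L
                s = np.sign(z) * np.maximum(np.abs(z) - lam / (N * L), 0.0)
            S[i] = s
        for i in range(N):                        # 2) multiplier update, using
            P[i] = P[i] + c * sum(S[i] - S[j] for j in neighbors[i])  # neighbors' broadcasts
    return S

# Ring network of 5 CRs, each sensing a 2-sparse 20-dim spectrum
rng = np.random.default_rng(1)
N, Nr, Ns = 5, 10, 20
H_list = [rng.standard_normal((Nr, Ns)) / np.sqrt(Nr) for _ in range(N)]
s_true = np.zeros(Ns); s_true[[3, 11]] = [1.0, 0.8]
y_list = [H @ s_true + 0.01 * rng.standard_normal(Nr) for H in H_list]
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
S = decentralized_lasso(H_list, y_list, neighbors)
```

Since the ring is connected, the local copies S[0], ..., S[4] agree at convergence and jointly recover the sparse spectrum, using only one-hop exchanges.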
Power Spectrum Cartography
[Figure: PSD maps estimated by NNLS vs. Lasso; 5 sources, Ns = 121 candidate locations, Nr = 50 CRs]
Sparsity-unaware NNLS is prone to false alarms
As a byproduct, Lasso localizes all sources via variable selection
Cooperative Compressed Sensing
[Figure: ROC curves (Prob. of Detection vs. Prob. of False Alarm), decentralized consensus vs. decentralized majority vote; 2 PUs, 3 CRs, SNR = -5 dB, 50% compression]
Performance gain by decentralized fusion over majority vote
Other Scenarios of Global Awareness
Compressive sampling at sub-Nyquist rates [ICASSP'07]
Edge detection using a wavelet basis [CROWNCOM'06]
Cooperative sensing at sub-Nyquist rates [GLOBECOM'08]
Cooperative sensing of a common PU's spectrum in the presence of local interference [ICC'10, JSAC'11]
Cooperative detection of multiple signals with common support [ICASSP'11]
Spectrum Hole/Edge Detection
[Figure: spectrum reconstruction and spectrum hole detection at compression rates of 20%, 33%, 50%, 75%, 90%, and 100%]
Compressive sampling at sub-Nyquist rates [ICASSP'07]
Edge detection using a wavelet basis [CROWNCOM'06]
Decentralized Processing for Local Awareness
Localized Event Detection in Large Networks
Characteristics: events are sparse and local, with limited influence
Applications: radioactive sources, targets, structural damage
Network considerations: energy efficiency, scalability, robustness
[Figure: decentralized architecture exchanging only local info for local awareness]
Localized Event Detection in Large Networks
Sensor grid: Nr = Ns = N
Localized events of limited influence: the influence of event sj on sensor vi is hij*sj
Sparsity-aware formulations [centralized]:
  Prior info: sources are sparse w.r.t. the grid points
  Quadratic programming: bounded noise energy
  Linear programming: bounded measurement errors
Global vs. Local Awareness
Global Awareness via Consensus
  each sensor optimizes one local copy of the decision vector s
  all local copies are forced to consent with neighbors via one-hop communication
  equivalent to centralized optimality if the network is connected
  the objective is separable across sensors
Decentralized implementation via iterative ADMoM
  iteratively exchange decision vectors with neighbors
  heavy communication load for a large network
Localized Event Detection
Reformulation:
  each sensor i optimizes one scalar variable si for itself
  based on linear programming for simplicity
  equivalent to centralized optimality if H is localized
Solution 1 via ADMoM
Introduce slack variables, Lagrange multipliers for the measurement constraints, and Lagrange multipliers for the non-negativity constraints
Iterative procedure (local decision si(t) -> si(t+1)):
  update the local decision variable at sensor i (linear computation)
  send the decision + multiplier scalars to neighbors (one-hop communication)
Solution 2 via DLP
Parallel computing under diagonal dominance [Tseng'90]
Reformulation: Decentralized Linear Programming (DLP) with uncoupled objective and constraints, yielding one scalar solution per sensor
Iterative implementation: send one decision scalar to neighboring sensors
Global optimality if H is localized and diagonally dominant
Simple computation, low-cost communication, fast convergence
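As a toy stand-in for the DLP updates (not the algorithm itself), a classical Jacobi iteration illustrates the "parallel computing under diagonal dominance" principle: each sensor refines one scalar from its own reading and its neighbors' latest scalars, and convergence is guaranteed when H is strictly diagonally dominant. All names and numbers below are mine:

```python
import numpy as np

def jacobi(H, x, iters=30):
    """Each sensor i refines its own scalar s[i] from its measurement x[i]
    and neighbors' latest scalars (the nonzero off-diagonals of row i);
    converges to the solution of H s = x if H is strictly diagonally dominant."""
    N = len(x)
    s = np.zeros(N)
    for _ in range(iters):
        s_new = np.empty(N)
        for i in range(N):
            # one-hop info: only s[j] with H[i, j] != 0 enters the update
            s_new[i] = (x[i] - H[i] @ s + H[i, i] * s[i]) / H[i, i]
        s = s_new
    return s

# Localized, strictly diagonally dominant influence matrix on a 1-D sensor chain
N = 8
H = np.eye(N) + 0.2 * np.eye(N, k=1) + 0.2 * np.eye(N, k=-1)
s_true = np.zeros(N); s_true[3] = 1.0          # a single localized event
x = H @ s_true                                  # noise-free readings
s_hat = jacobi(H, x)
print(np.round(s_hat, 3))
```

The per-sensor cost is one inner product over neighbors, and only one scalar is sent per iteration, mirroring the low computation/communication budget claimed for DLP.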
Comparison of Iteration Steps [ICASSP'10]
Each iteration alternates decision making (local computation) with information exchange (one-hop communication: send and collect):
  Global Consensus: compute, then send/collect full decision vectors
  Local ADMoM: compute, then send/collect decision and multiplier scalars
  Local DLP: compute, then send/collect a single decision scalar
Simulation Setup
Sensors on a grid (structural health monitoring)
  sensor locations: (xr, yr), x, y = 1, ..., L
  network size: N = LxL, with L = 10, N = 100
Damage to detect
  si = 1 at (3r, 5r)
  sj = 0.5 at (5r, 5r)
Influence is distance-dependent: a limited-influence function, with hij emulated from the model
Task: identify the locations & severity of the damage
Convergence: ADMoM & DLP
ADMoM converges in 20-30 steps; DLP converges in < 4 steps
scalable costs in communication and computation
global optimality via local cooperation
faster convergence than global awareness
Sleeping Networks
Randomly turn off a fraction of sensors to induce compression
Active sensors make decisions for themselves & neighboring sleeping sensors, but not for the entire network [Qing-Tian'2010]
Benefits: scalable complexity, energy saving, fast convergence, high resolution
Cooperative Support Detection
Multiple measurement vector (MMV) problem: sensors recover signals of different amplitudes but common support
No need for channel or location information: unknown environments, known sampling strategy
Decentralized Support Detection
Row Lasso for the MMV problem, similar to Group Lasso in centralized form [Yuan-Lin'06]
The mixed-norm penalty couples the variables across sensors
Distributed implementation. Q: what should the nodes consent on?
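The row-wise coupling is implemented by the prox of the mixed l2/l1 norm, which shrinks entire rows to zero and thereby selects a common support. A centralized Row-Lasso sketch via proximal gradient (sizes, lambda, and data are illustrative, not from the talk):

```python
import numpy as np

def row_soft_threshold(Z, tau):
    """Prox of tau * sum of row l2-norms: shrinks each row's norm by tau,
    zeroing whole rows -- this is what enforces a common support."""
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    return Z * np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)

def row_lasso(H_list, y_list, lam=0.1, n_iter=500):
    """min_S 0.5*sum_r ||y_r - H_r S[:, r]||^2 + lam * sum_n ||S[n, :]||_2."""
    R, Ns = len(H_list), H_list[0].shape[1]
    S = np.zeros((Ns, R))
    step = 1.0 / max(np.linalg.norm(H, 2) ** 2 for H in H_list)
    for _ in range(n_iter):
        G = np.column_stack([H_list[r].T @ (H_list[r] @ S[:, r] - y_list[r])
                             for r in range(R)])
        S = row_soft_threshold(S - step * G, lam * step)
    return S

rng = np.random.default_rng(3)
R, Nr, Ns = 4, 12, 30                      # 4 CRs, 12 samples each, 30 channels
support = [5, 17]
H_list = [rng.standard_normal((Nr, Ns)) / np.sqrt(Nr) for _ in range(R)]
amps = rng.uniform(0.5, 1.5, size=(len(support), R))   # amplitudes differ per CR
y_list = []
for r in range(R):
    s_r = np.zeros(Ns); s_r[support] = amps[:, r]
    y_list.append(H_list[r] @ s_r + 0.01 * rng.standard_normal(Nr))
S_hat = row_lasso(H_list, y_list)
row_energy = np.linalg.norm(S_hat, axis=1)
print(np.flatnonzero(row_energy > 0.3))    # detected common support
```

Each column of S is one sensor's signal; the row energies expose the shared support even though the per-sensor amplitudes differ.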
Consensus-based Support Detection
Energy-based consensus: consent on the energy vector rather than on the signal estimates
Consensus optimization formulation: the centralized R-Lasso is solved locally, with energy estimates exchanged among one-hop neighbors
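Consenting on the energy vector can be done with standard average-consensus iterations: each node repeatedly averages with its one-hop neighbors and converges to the network-wide mean on any connected graph. A generic sketch, not the paper's exact update; topology, step size, and data are invented:

```python
import numpy as np

def average_consensus(E, neighbors, eps=0.3, iters=100):
    """E: (num_nodes, dim) array of local energy vectors. Each step, node i
    moves toward its neighbors' values (symmetric weights preserve the sum),
    so every row converges to the global average."""
    E = E.astype(float).copy()
    for _ in range(iters):
        E_new = E.copy()
        for i in range(len(E)):
            for j in neighbors[i]:
                E_new[i] += eps * (E[j] - E[i])   # one-hop exchange only
        E = E_new
    return E

# Ring of 5 nodes, each with a noisy local energy measurement of 3 channels
rng = np.random.default_rng(2)
truth = np.array([1.0, 0.0, 0.5])
E0 = truth + 0.2 * rng.standard_normal((5, 3))
neighbors = {i: [(i - 1) % 5, (i + 1) % 5] for i in range(5)}
E = average_consensus(E0, neighbors)
```

Because only the energy vector is exchanged, the message size is fixed by the number of channels, not by the signal dimension at each node.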
Decentralized Algorithm
Alternating-direction method of multipliers (ADMoM) applied to the augmented Lagrange function
Iterative implementation:
  each CR i reconstructs locally
  each CR i updates its multipliers
  each CR i broadcasts its local decision one hop
Cooperative Support Detection
20 channels, 5 PUs, 6 cooperative CRs, SNR = 5 dB, 25% compression
Summary
Exploiting sparsity in networked environments
Global awareness
Decentralized cooperation via consensus optimization
Flexible problem formulations
Local awareness
Suitable for large networks that monitor localized phenomena
Improved convergence and reduced network overhead
Future Research
Wireless Sensor Networking
Infrastructure: centralized, decentralized, or hierarchical
Awareness: global vs. local
Tasks: long-term monitoring vs. time-critical exploration
Collaborative Information Processing
Iterative Consensus Optimization
benefits: one-hop, optimal, scalable, robust, asynchronous
issues: convergence speed
Assessment and optimization
When to collaborate? How to collaborate?
How to speed up the convergence rate?