
Secure Communication for Distributed Systems
Paul Cuff, Electrical Engineering, Princeton University

Overview
• Application: a framework for secrecy of distributed systems
• Theoretical result: information theory in a competitive context (zero-sum game)
• Two methods of coordination

Main Idea
• Secrecy for distributed systems
• Design encryption specifically for a system objective

[Diagram: in a distributed system, Node A observes information and sends a message to Node B, which takes an action; an adversary observes the message and mounts an attack.]

Communication in Distributed Systems

“Smart Grid”

Image from http://www.solarshop.com.au

Example: Rate-Limited Control

[Diagram: a sensor signal is encoded into a bit stream (rate-limited communication) and decoded into a control signal; an adversary observes the bit stream and injects an attack signal.]

Example: Feedback Stabilization

• “Data Rate Theorem” [Wong-Brockett 99, Baillieul 99]

[Diagram: a sensor measures a dynamic system, an encoder sends a bit stream to a decoder/controller, and the control signal is fed back to the system; an adversary observes the bit stream.]

Traditional View of Encryption

Information inside

Shannon Analysis
• 1948: channel capacity, lossless source coding, lossy compression
• 1949: perfect secrecy
  • The adversary learns nothing about the information
  • Only possible if the key is at least as large as the information
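In standard notation (not taken from the slides), perfect secrecy means the ciphertext is statistically independent of the plaintext, which forces the key to carry at least as much entropy as the message:

$$
I(M;C) = 0 \quad\Longrightarrow\quad H(K) \ge H(M).
$$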

C. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical Journal, vol. 28, pp. 656-715, Oct. 1949.

Shannon Model
• Schematic (diagram below)
• Assumption: the enemy knows everything about the system except the key
• Requirement: the decipherer accurately reconstructs the information

C. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical Journal, vol. 28, pp. 656-715, Oct. 1949.

[Diagram: the plaintext enters an encipherer, which uses a shared key to produce the ciphertext; the decipherer uses the same key to recover the plaintext, while the adversary observes only the ciphertext.]

For simple substitution:

Shannon Analysis
• Equivocation vs. redundancy (standard forms are sketched below)
• Equivocation is the conditional entropy of the secret given the ciphertext
• Redundancy is the lack of entropy in the source
• Equivocation decreases as redundancy increases

C. Shannon, "Communication Theory of Secrecy Systems," Bell System Technical Journal, vol. 28, pp. 656-715, Oct. 1949.
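A minimal sketch of these quantities in standard notation (assumed, not copied from the slides): for message M, key K, ciphertext C, and a source alphabet of size |X|,

$$
\text{message equivocation} = H(M \mid C), \qquad \text{key equivocation} = H(K \mid C), \qquad \text{redundancy } D = \log|\mathcal{X}| - H(X).
$$

Shannon's unicity-distance argument says the key equivocation falls to essentially zero after roughly H(K)/D ciphertext symbols, which is the sense in which equivocation is reduced by redundancy.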

Computational Secrecy
• Assume the adversary has limited computational resources
• Public-key encryption, trapdoor functions
• Difficulty not proven; can become a "cat and mouse" game
• Vulnerable to quantum computer attack

W. Diffie and M. Hellman, “New Directions in Cryptography,” IEEE Trans. on Info. Theory, 22(6), pp. 644-654, 1976.

Example (factoring): 1125897758834689 = 2147483647 × 524287
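As an illustration of the trapdoor idea behind public-key systems such as [Diffie-Hellman 76], here is a minimal sketch of a Diffie-Hellman exchange (toy parameters chosen for the sketch, far too small for real use): modular exponentiation is cheap, but inverting it (the discrete logarithm) is believed to be hard.

```python
# Toy Diffie-Hellman key exchange. Parameters are illustrative only.
import secrets

p = 2147483647        # public prime modulus (2^31 - 1, toy-sized)
g = 16807             # public generator (a primitive root mod p)

a = secrets.randbelow(p - 2) + 1    # Alice's private exponent
b = secrets.randbelow(p - 2) + 1    # Bob's private exponent

A = pow(g, a, p)      # Alice publishes g^a mod p
B = pow(g, b, p)      # Bob publishes g^b mod p

# Each side combines its private exponent with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # both now hold the same shared secret
```

An adversary who sees only p, g, A, and B must solve a discrete-log-type problem to recover the secret, which is exactly the kind of unproven hardness assumption the slide warns about.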

Information Theoretic Secrecy
• Achieve secrecy from randomness (a key or a channel), not from the computational limits of the adversary
• Physical-layer secrecy: Wyner's wiretap channel [Wyner 1975]
• Partial secrecy
  • Typically measured by "equivocation" (sketched below)
  • Other approaches: error exponent for a guessing eavesdropper [Merhav 2003]; cost inflicted by the adversary [this talk]
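In the usual notation (assumed here), the equivocation measure of partial secrecy is the normalized conditional entropy of the source given the eavesdropper's observation, and Wyner's result for the degraded wiretap channel gives the secrecy capacity:

$$
\Delta = \frac{1}{n} H(X^n \mid Z^n), \qquad C_s = \max_{p(x)} \big[\, I(X;Y) - I(X;Z) \,\big].
$$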

Equivocation
• Not an operationally defined quantity
• Bounds: list decoding; additional information needed for decryption
• Not concerned with structure

Our Framework
• Assume secrecy resources are available (secret key, private channel, etc.)
• How do we encode information optimally?
• Game theoretic
  • The eavesdropper is the adversary
  • System performance (for example, stability) is the payoff
  • Bayesian games
  • Information structure

Competitive Distributed System
[Diagram: Node A observes the information and, using a shared secret key, sends a message to Node B, which produces an action; the adversary observes the message and mounts an attack.]
• Encoder (Node A)
• Decoder (Node B)
• Adversary
• System payoff
(A sketch of these mappings in standard notation follows the next slide.)

Zero-Sum Game
• Value obtained by the system (sketched below)
• Objective: maximize the payoff
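A minimal sketch of the setup in standard block-coding notation (my notation, not taken verbatim from the slides): with an i.i.d. source X^n, a shared key K, and a public message M,

$$
\text{Encoder: } f : \mathcal{X}^n \times \mathcal{K} \to \mathcal{M}, \qquad
\text{Decoder: } g : \mathcal{M} \times \mathcal{K} \to \mathcal{Y}^n, \qquad
\text{Adversary: } z^n : \mathcal{M} \to \mathcal{Z}^n,
$$

and the value obtained by the system is the worst-case average payoff, which the encoder and decoder maximize and the adversary minimizes:

$$
\min_{z^n(\cdot)} \; \mathbb{E}\!\left[ \frac{1}{n} \sum_{t=1}^{n} \pi(X_t, Y_t, Z_t) \right].
$$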

Secrecy-Distortion Literature
• [Yamamoto 97]: cause an eavesdropper to have high reconstruction distortion
  • Replace the payoff (π) with distortion
  • No causal information to the eavesdropper
• Warning: the problem statement can be too optimistic!

How to Force High Distortion
• Randomly assign the source sequences to bins of equal size
• The adversary only knows the bin index
• The adversary's reconstruction of each source symbol depends only on its marginal posterior distribution given the bin

Example (Bern(1/3)): see the sketch below.
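A worked version of the Bern(1/3) example under Hamming distortion (my reconstruction of the calculation; only the source distribution is taken from the slide): when the bin is large, the adversary's posterior on each symbol is essentially the prior, so its best symbol-by-symbol guess is the more likely value,

$$
X \sim \mathrm{Bern}(1/3): \qquad \min_{z \in \{0,1\}} \mathbb{E}\big[\mathbf{1}\{X \ne z\}\big] = \min\{2/3,\ 1/3\} = \tfrac{1}{3},
$$

achieved by guessing z = 0 everywhere; the forced distortion is therefore 1/3 per symbol.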

THEORETICAL RESULTS

Information Theoretic Rate Regions

Provable Secrecy

Two Categories of Results
• Lossless transmission
  • Simplex interpretation, linear program
  • Hamming distortion
• General reward function
  • Common information
  • Secret key

Competitive Distributed System (recap)
[Same setup as before: Node A encodes the information into a message using the shared key, Node B decodes an action, and the adversary attacks based on the message.]
• Encoder, decoder, adversary, and system payoff as defined above

Zero-Sum Game (recap)
• Value obtained by the system; objective: maximize the payoff

Theorem: [Cuff 10]

Lossless Case
• Require Y = X
• Assume a payoff function
• Related to Yamamoto's work [97]
  • Difference: the adversary is more capable with more information

Also required:

Linear Program on the Simplex
• Constraint:
• Minimize:
• Maximize:
• U will only have mass at a small subset of points (extreme points); an illustrative program is sketched below
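The exact program from the talk is not reproduced here; the following is an illustrative linear program over a probability simplex (hypothetical payoff vector and constraint, my own numbers) showing why the optimal U concentrates on extreme points: a linear objective over a polytope is maximized at a vertex.

```python
# Illustrative LP over the probability simplex (hypothetical numbers, not the
# program from the talk): maximize a linear payoff c.u subject to u being a
# probability vector and one extra linear constraint.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 0.4, 0.7, 0.2])        # hypothetical payoff per atom of U
A_ub = np.array([[0.9, 0.1, 0.8, 0.3]])   # hypothetical rate/resource constraint
b_ub = np.array([0.5])
A_eq = np.ones((1, 4))                    # probabilities must sum to one
b_eq = np.array([1.0])

# linprog minimizes, so negate c to maximize the payoff.
res = linprog(-c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 4, method="highs")
print("optimal distribution of U:", np.round(res.x, 3))
print("maximized payoff:", round(-res.fun, 3))
```

The optimizer lands on a vertex of the feasible set, so only a couple of atoms of U receive mass, matching the "extreme points" observation on the slide.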


Binary-Hamming Case
• Binary source, Hamming distortion
• Optimal approach: reveal excess 0's or 1's to condition the hidden bits (see the sketch below)

Source:          0 1 0 0 1 0 0 0 0 1
Public message:  * * 0 0 * * 0 * 0 *
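A minimal sketch of the "reveal the excess 0's" idea as I read the example (my own illustration and naming, not code from the talk): publicly disclose just enough of the majority symbol that the hidden positions contain equally many 0's and 1's, so a symbol-wise guess of any hidden bit is wrong half the time.

```python
# Reveal surplus 0's from a biased binary source so the hidden positions are
# balanced between 0's and 1's; the hidden bits are then protected by the key.
import random

def reveal_excess_zeros(source):
    """Return a public string with surplus 0's revealed and '*' elsewhere."""
    zeros = [i for i, bit in enumerate(source) if bit == 0]
    ones = [i for i, bit in enumerate(source) if bit == 1]
    surplus = max(len(zeros) - len(ones), 0)
    revealed = set(random.sample(zeros, surplus))   # which 0's to disclose
    return ["0" if i in revealed else "*" for i in range(len(source))]

source = [0, 1, 0, 0, 1, 0, 0, 0, 0, 1]             # the sequence from the slide
public = reveal_excess_zeros(source)
print("Source:        ", " ".join(str(b) for b in source))
print("Public message:", " ".join(public))
```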

Binary Source (Example)
• Information source is Bern(p), usually zero (p < 0.5)
• Hamming payoff
• Secret key rate R0 required to guarantee eavesdropper error
[Plot: secret key rate R0 versus p and the resulting eavesdropper error.]

General Payoff Function
• No requirement for lossless transmission
• Any payoff function π(x, y, z)
• Any source distribution (i.i.d.)
• Adversary:

Payoff-Rate Function
• Maximum achievable average payoff
• Markov relationship:
Theorem:

Unlimited Public Communication
• Maximum achievable average payoff
• Conditional common information (the unconditional definition is recalled below):
Theorem (R = ∞):
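For reference, the (unconditional) Wyner common information, which the conditional quantity on the slide presumably extends by conditioning throughout (my assumption):

$$
C(X;Y) \;=\; \min_{p(u \mid x,y)\,:\; X - U - Y} I(X,Y;U).
$$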

RELATED COMMUNICATION METHODS
Two Coordination Results

Coordination Capacity
• References: [C., Permuter, Cover - IT Trans. 09], [C. - ISIT 08], [Bennett, Shor, Smolin, Thapliyal - IT Trans. 02], [C., Zhao - ITW 11]
• Ability to coordinate sequences ("actions") under communication limitations
  • Empirical coordination
  • Strong coordination

Empirical Coordination
[Figure: length-n sequences X1 ... Xn, Y1 ... Yn, Z1 ... Zn (binary rows in the example) and their joint empirical distribution over the triples 000, 001, ..., 111.]
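In the usual joint-type notation (assumed here), the empirical distribution of the three sequences is

$$
\bar{P}_{X^n Y^n Z^n}(x,y,z) \;=\; \frac{1}{n} \sum_{t=1}^{n} \mathbf{1}\{(X_t, Y_t, Z_t) = (x,y,z)\}.
$$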

Average Distortion

• Average values are a function of the empirical distribution

• Example: Squared error distortion

• Rate distortion theory fits in the empirical coordination context.
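A one-line worked version of the squared-error example (a standard identity):

$$
\frac{1}{n}\sum_{t=1}^{n} (X_t - Y_t)^2 \;=\; \sum_{x,y} \bar{P}_{X^n Y^n}(x,y)\,(x - y)^2,
$$

so any per-letter average distortion is determined by the empirical joint distribution alone.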

No Rate, No Channel
• No explicit communication channel
• Signal "A" serves both an analog and an information role
  • Analog: symbol-by-symbol relationship
  • (Digital): uses complex structure to carry information
[Diagram: a source, Processor 1 and Processor 2, and Actuator 1 and Actuator 2.]

Define Empirical Coordination
[Diagram: source, Processor 1, Processor 2.]
A desired joint distribution of the source and actions is achievable if the realized sequences match it empirically (sketched below):
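A sketch of the definition in the style of [C., Permuter, Cover 09] (notation assumed): the distribution p(x, a, b) is achievable if there exist processors such that the joint empirical distribution of the realized sequences converges to it in total variation,

$$
\big\| \bar{P}_{X^n A^n B^n} - p \big\|_{TV} \;\longrightarrow\; 0 \quad \text{in probability as } n \to \infty.
$$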

Coordination Region
• The coordination region gives us all results concerning average distortion.

Result: No Constraints
[Diagram: source, Processor 1, Processor 2.]
Achievability: make a codebook of (A^n, B^n) pairs.

General Results
• Variety of causality constraints (delay)
• Finite look-ahead

Alice and Bob Game
• Alice and Bob want to cooperatively score points by both correctly guessing a sequence of random binary numbers (one point each time they both guess correctly)
• Alice gets the entire sequence ahead of time
• Bob only sees the past binary numbers and Alice's past guesses
• What is the optimal score in the game?

Alice and Bob Game (answer)
• This is "online matching pennies" [Gossner, Hernandez, Neyman, 2003], also described as "online communication"
• Solution:

General (causal) solution
• The score in the Alice and Bob game is a first-order statistic
• Achievable empirical distributions (Processor 2 is strictly causal)
• Surprise: Bob doesn't need to see the past of the sequence.

Strong Coordination
[Figure: length-n sequences X1 ... Xn, Y1 ... Yn, Z1 ... Zn.]
• The joint distribution of the sequences is i.i.d. with respect to the desired joint distribution
• (Allow epsilon total variation distance.)

Point-to-point Coordination
• Theorem [C. 08]: strong coordination involves picking a V such that X - V - Y
  • Message: R > I(X;V)
  • Common randomness: R0 + R > I(X,Y;V)
  • Uses a randomized decoder (a channel from V to Y)
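Restating the region compactly (same conditions as the bullets above, in standard notation):

$$
\exists\, V \ \text{with} \ X - V - Y, \qquad R > I(X;V), \qquad R_0 + R > I(X,Y;V),
$$

where the decoder uses its local randomness to synthesize the channel from V to Y.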

[Diagram: Node A observes the source and sends a message to Node B; the two share common randomness; Node B produces the output through a synthetic channel p(y|x).]

Zero-Sum Game (recap)
• Value obtained by the system; objective: maximize the payoff

Encoding Scheme
• Coordination strategies
  • Empirical coordination for U
  • Strong coordination for Y

Converse

What the Adversary doesn’t know can hurt him.

Knowledge of Adversary
• [Yamamoto 97]:
• [Yamamoto 88]:

Proposed View of Encryption

Information obscured

Images from albo.co.uk