
Page 1: Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

MIT Systems Security and Cryptography Discussion Group, October 15, 2010

Page 2: Goal

Information-theoretic quantification of programs' impact on integrity of information [Denning 1982]

Page 3: What is Integrity?

Common Criteria:

Protection of assets from unauthorized modification

Biba (1977):
Guarantee that a subsystem will perform as it was intended
Isolation necessary for protection from subversion
Dual to confidentiality

Databases:
Constraints that relations must satisfy
Provenance of data
Utility of anonymized data

…no universal definition

Page 4: Our Notions of Integrity

Starting Point        Corruption Measure
Taint analysis        Contamination
Program correctness   Suppression

Corruption: damage to integrity

Page 5: Our Notions of Integrity

Starting Point        Corruption Measure
Taint analysis        Contamination
Program correctness   Suppression

Corruption: damage to integrity
Contamination: bad information present in output
Suppression: good information lost from output

…distinct, but interact

Page 6: Contamination

Goal: model taint analysis

[Diagram: User and Attacker send trusted and untrusted inputs to Program, which sends trusted and untrusted outputs to User and Attacker.]

Page 7: Contamination

Goal: model taint analysis

Untrusted input contaminates trusted output

[Same diagram as Page 6.]

Page 8: Contamination

u contaminates o

o := (t, u)

Page 9: Contamination

u contaminates o

(Can’t u be filtered from o?)

o := (t, u)

Page 10: Quantification of Contamination

Use information theory: information is surprise

X, Y, Z: distributions

I(X,Y): mutual information between X and Y (in bits)

I(X,Y | Z): conditional mutual information
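These quantities are easy to compute for small, explicit distributions. Here is a minimal Python sketch (my addition, not from the talk): a joint distribution is a dict from outcome tuples to probabilities, and entropy, mutual information, and conditional mutual information follow directly from the definitions above. Later sketches in this transcript reuse these helpers.

    # Minimal helpers over explicit joint distributions (dicts mapping
    # outcome tuples to probabilities); all entropies are in bits.
    from collections import defaultdict
    from math import log2

    def H(joint, idx=None):
        """Entropy of the marginal on coordinates idx (all coordinates if None)."""
        marg = defaultdict(float)
        for outcome, p in joint.items():
            key = outcome if idx is None else tuple(outcome[i] for i in idx)
            marg[key] += p
        return -sum(p * log2(p) for p in marg.values() if p > 0)

    def I(joint, x, y):
        """Mutual information I(X,Y) = H(X) + H(Y) - H(X,Y)."""
        return H(joint, x) + H(joint, y) - H(joint, x + y)

    def I_cond(joint, x, y, z):
        """Conditional mutual information I(X,Y | Z)."""
        return H(joint, x + z) + H(joint, y + z) - H(joint, z) - H(joint, x + y + z)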

Page 11: Quantification of Contamination

[Same diagram as Page 6.]

Page 12: Quantification of Contamination

[Same diagram as Page 6, with the inputs labeled Uin (untrusted) and Tin (trusted) and the trusted output labeled Tout.]

Page 13: Quantification of Contamination

Contamination = I(Uin, Tout | Tin)

[Same diagram as Page 12.]

[Newsome et al. 2009]

Dual of [Clark et al. 2005, 2007]

Page 14: Example of Contamination

o := (t, u)

Contamination = I(U, O | T) = k bits

if U is uniform on [0, 2^k − 1]
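A quick numeric check of this claim, reusing the helpers from Page 10; the joint distribution's coordinates are (u, o, t), and the uniform 2-bit trusted input is my arbitrary choice (any distribution for T works).

    # o := (t, u): contamination I(U, O | T) should be k bits when U is
    # uniform on [0, 2^k - 1].
    k = 3
    joint = {(u, (t, u), t): 1 / (2 ** k * 4)
             for u in range(2 ** k) for t in range(4)}
    print(I_cond(joint, [0], [1], [2]))   # -> 3.0 bits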

Page 15: Our Notions of Integrity

Starting Point        Corruption Measure
Taint analysis        Contamination
Program correctness   Suppression

Corruption: damage to integrity

Contamination: bad information present in output
Suppression: good information lost from output

Page 16: Program Suppression

Goal: model program (in)correctness

[Diagram: Sender sends trusted input through the Specification to the Receiver; the output is the correct output.]

(Specification must be deterministic)

Page 17: Program Suppression

Goal: model program (in)correctness

[Diagram: the Specification maps the Sender's trusted input to the correct output; the Implementation also takes the Attacker's untrusted input and produces the real output for the Receiver and the Attacker.]

Page 18: Program Suppression

Goal: model program (in)correctness

[Same diagram as Page 17.]

Implementation might suppress information about the correct output from the real output


Page 19: Example of Program Suppression

Spec.:   for (i=0; i<m; i++) { s := s + a[i]; }

a[0..m-1]: trusted

Page 20: Example of Program Suppression

Spec.:   for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 1: suppression (a[0] missing); no contamination

a[0..m-1]: trusted

Page 21: Example of Program Suppression

Spec.:   for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: suppression (a[0] missing); no contamination
Impl. 2: suppression (a[m] added); contamination

a[0..m-1]: trusted
a[m]: untrusted

Page 22: Suppression vs. Contamination

[Diagram contrasting contamination and suppression for the program below.]

output := input

Page 23: Suppression vs. Contamination

o := t

n := rnd(); o := t xor n     t suppressed by noise; no contamination

o := t xor u                 t suppressed by u; o contaminated by u

o := (t, u)                  o contaminated by u; no suppression
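This table can be reproduced mechanically. Below is a sketch of my own, reusing the Page 10 helpers, with one-bit t and u, a fresh random bit n, and suppression measured against the spec o := t (so suppression is H(T | O) and contamination is I(U, O | T)).

    # Tabulate suppression and contamination for the three programs above.
    from itertools import product

    def joint_for(program):
        """Joint over (u, o, t) where o = program(t, u, n)."""
        joint = defaultdict(float)
        for t, u, n in product((0, 1), repeat=3):
            joint[(u, program(t, u, n), t)] += 1 / 8
        return joint

    for name, prog in [("n := rnd(); o := t xor n", lambda t, u, n: t ^ n),
                       ("o := t xor u",             lambda t, u, n: t ^ u),
                       ("o := (t, u)",              lambda t, u, n: (t, u))]:
        j = joint_for(prog)
        supp = H(j, [1, 2]) - H(j, [1])       # H(T | O)
        cont = I_cond(j, [0], [1], [2])       # I(U, O | T)
        print(f"{name}: suppression={supp}, contamination={cont}")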

Page 24: Quantification of Program Suppression

[Same diagram as Page 17.]

Page 25: Quantification of Program Suppression

[Same diagram as Page 17, with the Specification's input and output labeled In and Spec.]

Page 26: Quantification of Program Suppression

[Same diagram as Page 25, with the trusted and untrusted inputs labeled Tin and Uin and the Implementation's output labeled Impl.]

Page 27: Quantification of Program Suppression

Program transmission = I(Spec, Impl)

[Same diagram as Page 26.]

Page 28: Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X|Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

Page 29: Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X|Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec

Page 30: Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X|Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl

Page 31: Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X|Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl
H(Spec | Impl): info NOT learned about Spec by observing Impl

Page 32: Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X|Y): conditional entropy of X given Y

Program transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

Program Suppression = H(Spec | Impl)

Page 33: Example of Program Suppression

Spec.:   for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1: for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 2: for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: suppression = H(A)
Impl. 2: suppression ≤ H(A)

A = distribution of individual array elements
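A numeric check of the Impl. 1 claim (my construction, reusing the Page 10 helpers and itertools.product from the Page 23 sketch), with m = 3 and one-bit array elements so that H(A) = 1:

    # Impl. 1 vs. Spec: given the sum of a[1..m-1], the only information
    # still missing about the full sum is a[0], so H(Spec | Impl) = H(A).
    m, elems = 3, (0, 1)              # A uniform on {0, 1}: H(A) = 1 bit
    joint = defaultdict(float)        # coordinates: (Spec output, Impl output)
    for a in product(elems, repeat=m):
        joint[(sum(a), sum(a[1:]))] += 1 / len(elems) ** m
    print(H(joint) - H(joint, [1]))   # H(Spec | Impl) -> 1.0 = H(A)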

Page 34: Echo Specification

output := input

[Diagram: Sender sends trusted input Tin through the specification; Receiver receives Tin unchanged.]

Page 35: Echo Specification

output := input

[Diagram: as on Page 34, plus an Implementation that takes Tin from the Sender and Uin from the Attacker and delivers Tout to the Receiver and the Attacker.]

Page 36: Echo Specification

output := input

[Same diagram as Page 35.]

Simplifies to information-theoretic model of channels, with attacker

Page 37: Channel Suppression

Channel transmission = I(Tin, Tout)
Channel suppression = H(Tin | Tout)

(Tout depends on Uin)

[Diagram: Sender sends Tin through the Channel to the Receiver as Tout; the Attacker's Uin also feeds the Channel.]
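For intuition, a hypothetical binary symmetric channel, computed with the Page 10 helpers: here the attacker's influence is modeled as noise that flips the trusted bit with probability 1/4. Note that transmission and suppression sum to H(Tin).

    # Binary symmetric channel: the bit Tin is flipped with probability p.
    p = 0.25
    joint = {(t, o): 0.5 * (p if t != o else 1 - p)   # coordinates: (Tin, Tout)
             for t in (0, 1) for o in (0, 1)}
    transmission = I(joint, [0], [1])                 # I(Tin, Tout) ≈ 0.189 bits
    suppression = H(joint) - H(joint, [1])            # H(Tin | Tout) ≈ 0.811 bits
    print(transmission + suppression)                 # -> 1.0 = H(Tin)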

Page 38: Belief-based Metrics

What if the user's/receiver's distribution on unobservable inputs is wrong? Belief-based information flow [Clarkson et al. 2005]

Belief-based generalizes information-theoretic:
On single executions: the same
In expectation: the same …if the user's/receiver's distribution is correct

Page 39: Suppression and Confidentiality

Declassifier: program that reveals (leaks) some information and suppresses the rest

Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

Thm. Leakage + Suppression is a constant.
What isn't leaked is suppressed.
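A small numeric illustration of the theorem (my example, reusing the Page 10 helpers): a declassifier that releases only the parity of a uniform 3-bit secret leaks 1 bit and suppresses the remaining 2, and the sum is H(S).

    # Declassifier o := s mod 2 on a uniform 3-bit secret s.
    joint = {(s, s % 2): 1 / 8 for s in range(8)}   # coordinates: (s, o)
    leakage = I(joint, [0], [1])                    # 1 bit: the parity
    suppression = H(joint) - H(joint, [1])          # 2 bits: everything else
    print(leakage + suppression)                    # -> 3.0 = H(S), a constant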

Page 40: Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy’s sake

[Diagram: User sends a query to the Database; the response passes through an Anonymizer, which returns an anonymized response to the User.]

Page 41: Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy's sake
…suppresses to avoid leakage

[Same diagram as Page 40, with the anonymizer's specification anon. resp. := resp.]

Page 42: Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy's sake
…suppresses to avoid leakage
…sacrifices integrity for confidentiality's sake

[Same diagram as Page 40.]

Page 43: K-anonymity

DB. Every individual must be anonymous within a set of size k [Sweeney 2002]

Iflow. Every output corresponds to k inputs

…no bound on leakage or suppression

Page 44: L-diversity

DB. Every individual's sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007]

DB. Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]

Iflow. H(Tin | tout) ≥ log L

…implies suppression ≥ log L
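A small check of this reading (my construction, reusing the Page 10 entropy helper): an anonymizer that buckets 8 equally likely inputs into blocks of size 4 satisfies entropy L-diversity for L = 4, since every output leaves H(Tin | tout) = 2 = log 4.

    # Entropy L-diversity as a per-output bound: H(Tin | tout) >= log2(L).
    from math import log2
    L, inputs = 4, range(8)                # uniform inputs, buckets of size L
    for tout in {t // L for t in inputs}:
        posterior = {(t,): 1 / L for t in inputs if t // L == tout}
        assert H(posterior) >= log2(L)     # holds with equality: 2.0 >= 2.0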

Page 45: Differential Privacy

DB. [Dwork et al. 2006, Dwork 2006]

No individual loses privacy by including data in database

Anonymized data set reveals no information about an individual beyond what other individuals already reveal

Iflow. Output reveals almost no information about individual input beyond what other inputs already reveal

…implies almost all information about an individual is suppressed
…quite similar to noninterference

Page 46: Summary

Measures of information corruption:

Contamination (generalizes taint analysis, dual to leakage)

Suppression (generalizes program correctness, no dual)

Application: database privacy (model anonymizers; relate utility and privacy; security conditions)

Page 47: More Integrity Measures

Attacker- and program-controlled suppression

Granularity:
Average over all executions
Single executions
Sequences of executions

…interaction of attacker with program

Applications: Error-correcting codes, malware

Page 48: Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

MIT Systems Security and Cryptography Discussion Group, October 15, 2010