
Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

MIT Systems Security and Cryptography Discussion Group, October 15, 2010

2

Goal

Information-theoretic quantification of programs' impact on integrity of information

[Denning 1982]

3

What is Integrity?

Common Criteria: Protection of assets from unauthorized modification

Biba (1977): Guarantee that a subsystem will perform as it was intended; isolation necessary for protection from subversion; dual to confidentiality

Databases: Constraints that relations must satisfy; provenance of data; utility of anonymized data

…no universal definition

4

Our Notions of Integrity

Starting Point         Corruption Measure
Taint analysis         Contamination
Program correctness    Suppression

Corruption: damage to integrity

5

Our Notions of Integrity

Starting Point         Corruption Measure
Taint analysis         Contamination
Program correctness    Suppression

Corruption: damage to integrity

Contamination: bad information present in output
Suppression: good information lost from output

…distinct, but interact

6

Contamination

Goal: model taint analysis

[Diagram: the User supplies trusted input and the Attacker supplies untrusted input to the Program, which produces trusted output for the User and untrusted output for the Attacker]

7

Contamination

Goal: model taint analysis

Untrusted input contaminates trusted output

[Diagram as before]

8

Contamination

u contaminates o

o:=(t,u)

9

Contamination

u contaminates o

(Can’t u be filtered from o?)

o:=(t,u)

10

Quantification of Contamination

Use information theory: information is surprise

X, Y, Z: distributions

I(X,Y): mutual information between X and Y (in bits)

I(X,Y | Z): conditional mutual information

11

Quantification of Contamination

[Diagram as before: User and Attacker supply trusted and untrusted inputs to the Program]

12

Quantification of Contamination

[Diagram as before, with the untrusted input labeled Uin, the trusted input Tin, and the trusted output Tout]

13

Quantification of Contamination

Contamination = I(Uin,Tout | Tin)

[Diagram as before, with labels Uin, Tin, Tout]

[Newsome et al. 2009]

Dual of [Clark et al. 2005, 2007]

14

Example of Contamination

o:=(t,u)

Contamination = I(U, O | T) = k bits

if U is uniform on [0, 2^k − 1]
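As a sanity check (not part of the original slides), the conditional mutual information can be computed directly from the joint distribution. The Python sketch below assumes T and U are independent and uniform on k-bit values, and that the program is o := (t, u):

    from math import log2

    def cond_mutual_info(joint):
        """I(U; O | T) from a {(u, o, t): probability} dict."""
        p_t, p_ut, p_ot = {}, {}, {}
        for (u, o, t), p in joint.items():
            p_t[t] = p_t.get(t, 0) + p
            p_ut[(u, t)] = p_ut.get((u, t), 0) + p
            p_ot[(o, t)] = p_ot.get((o, t), 0) + p
        return sum(p * log2(p * p_t[t] / (p_ut[(u, t)] * p_ot[(o, t)]))
                   for (u, o, t), p in joint.items() if p > 0)

    # o := (t, u), with T and U independent and uniform on [0, 2^k - 1]
    k = 3
    vals = range(2 ** k)
    joint = {(u, (t, u), t): 1 / len(vals) ** 2 for t in vals for u in vals}
    print(cond_mutual_info(joint))  # prints 3.0, i.e. k bits, matching the slide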

15

Our Notions of Integrity

Starting Point         Corruption Measure
Taint analysis         Contamination
Program correctness    Suppression

Corruption: damage to integrity

Contamination: bad information present in output
Suppression: good information lost from output

16

Program Suppression

Goal: model program (in)correctness

[Diagram: the Sender's trusted input goes to the Specification, which produces the correct output for the Receiver]

(Specification must be deterministic)

17

Program Suppression

Goal: model program (in)correctness

[Diagram: the Specification maps the Sender's trusted input to the correct output; the Implementation takes the Sender's trusted input and the Attacker's untrusted input and produces the real output, observed by the Receiver and the Attacker]

18

Program Suppression

Goal: model program (in)correctness

[Diagram as before: Specification produces the correct output, Implementation produces the real output]

Implementation might suppress information about correct output from real output

19

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }

a[0..m-1]: trusted

20

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 1: Suppression—a[0] missing; no contamination

a[0..m-1]: trusted

21

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: Suppression—a[0] missing; no contamination
Impl. 2: Suppression—a[m] added; contamination

a[0..m-1]: trusted
a[m]: untrusted

22

Suppression vs. Contamination

[Diagram: two channels with specification output := input; in one the Attacker causes contamination, in the other suppression]

23

Suppression vs. Contamination

Spec.: o := t

o := (t, u)                o contaminated by u; no suppression
o := t xor u               t suppressed by u; o contaminated by u
n := rnd(); o := t xor n   t suppressed by noise; no contamination

24

Quantification of Program Suppression

[Diagram: Specification and Implementation, as on the earlier Program Suppression slides]

25

Quantification of Program Suppression

[Diagram as before, with the Specification's input and output labeled In and Spec]

26

Quantification of Program Suppression

[Diagram as before, with the Implementation's inputs and output labeled Tin, Uin, and Impl, and the Specification's input and output labeled In and Spec]

27

Quantification of Program Suppression

[Diagram as before, with labels In, Spec, Tin, Uin, Impl]

Program transmission = I(Spec, Impl)

28

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program Transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

29

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program Transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec

30

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program Transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl

31

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program Transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

H(Spec): total info to learn about Spec
I(Spec, Impl): info actually learned about Spec by observing Impl
H(Spec | Impl): info NOT learned about Spec by observing Impl

32

Quantification of Program Suppression

H(X): entropy (uncertainty) of X
H(X | Y): conditional entropy of X given Y

Program Transmission = I(Spec, Impl) = H(Spec) − H(Spec | Impl)

Program Suppression = H(Spec | Impl)

33

Example of Program Suppression

Spec.:    for (i=0; i<m; i++) { s := s + a[i]; }

Impl. 1:  for (i=1; i<m; i++) { s := s + a[i]; }

Impl. 2:  for (i=0; i<=m; i++) { s := s + a[i]; }

Impl. 1: Suppression = H(A)
Impl. 2: Suppression ≤ H(A)

A = distribution of individual array elements
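These values can be checked numerically. The sketch below (an illustration, not from the original talk) assumes m = 2 trusted array elements plus one untrusted element a[2], all independent uniform bits, and computes H(Spec | Impl) for both implementations:

    from itertools import product
    from math import log2

    def cond_entropy(pairs):
        """H(Spec | Impl) from a list of equally likely (spec, impl) pairs."""
        n = len(pairs)
        p_joint, p_impl = {}, {}
        for s, i in pairs:
            p_joint[(s, i)] = p_joint.get((s, i), 0) + 1 / n
            p_impl[i] = p_impl.get(i, 0) + 1 / n
        return -sum(p * log2(p / p_impl[i]) for (s, i), p in p_joint.items())

    # Trusted a[0], a[1] and untrusted a[2] are independent uniform bits.
    worlds = list(product([0, 1], repeat=3))
    spec  = [a0 + a1      for a0, a1, a2 in worlds]  # sums a[0..m-1]
    impl1 = [     a1      for a0, a1, a2 in worlds]  # skips a[0]
    impl2 = [a0 + a1 + a2 for a0, a1, a2 in worlds]  # also adds untrusted a[m]

    print(cond_entropy(list(zip(spec, impl1))))  # 1.0 bit  = H(A)
    print(cond_entropy(list(zip(spec, impl2))))  # ~0.69 bit ≤ H(A)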

34

Echo Specification

Spec.: output := input

[Diagram: the Sender's trusted input Tin is delivered unchanged to the Receiver]

35

Echo Specification

Spec.: output := input

[Diagram: the Implementation takes the Sender's trusted input Tin and the Attacker's untrusted input Uin, and produces trusted output Tout for the Receiver]

36

Echo Specification

Spec.: output := input

[Diagram as before]

Simplifies to information-theoretic model of channels, with attacker

37

Channel Suppression

Channel transmission = I(Tin, Tout)
Channel suppression = H(Tin | Tout)

(Tout depends on Uin)

[Diagram: the Channel carries the Sender's trusted input Tin to the trusted output Tout received by the Receiver; the Attacker supplies untrusted input Uin]
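Applying these channel formulas, plus the earlier contamination formula, to the three small programs from the Suppression vs. Contamination slide reproduces that table. The Python sketch below is an illustration under the assumption that Tin and Uin are independent uniform bits and that the specification is o := t:

    from itertools import product
    from math import log2

    def entropy(dist):
        """Shannon entropy (bits) of a {value: probability} dict."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marg(joint, *idx):
        """Marginal distribution over the given key positions."""
        out = {}
        for key, p in joint.items():
            k = tuple(key[i] for i in idx)
            out[k] = out.get(k, 0) + p
        return out

    def measures(program):
        """Transmission I(Tin, Tout), suppression H(Tin | Tout), and
        contamination I(Uin, Tout | Tin) for a program (t, u) -> o."""
        joint = {(t, u, program(t, u)): 0.25 for t, u in product([0, 1], repeat=2)}
        h_t, h_o = entropy(marg(joint, 0)), entropy(marg(joint, 2))
        h_to, h_tu = entropy(marg(joint, 0, 2)), entropy(marg(joint, 0, 1))
        h_tuo = entropy(joint)                      # joint is keyed by (t, u, o)
        suppression = h_to - h_o                    # H(Tin | Tout)
        transmission = h_t - suppression            # I(Tin, Tout)
        contamination = h_to - h_t - h_tuo + h_tu   # I(Uin, Tout | Tin)
        return transmission, suppression, contamination

    print(measures(lambda t, u: t))        # (1.0, 0.0, 0.0)  o := t
    print(measures(lambda t, u: (t, u)))   # (1.0, 0.0, 1.0)  contaminated, no suppression
    print(measures(lambda t, u: t ^ u))    # (0.0, 1.0, 1.0)  suppressed and contaminated

The n := rnd(); o := t xor n case gives the same transmission and suppression as o := t xor u, but since the noise is not attacker-supplied it causes no contamination.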

38

Belief-based Metrics

What if user's/receiver's distribution on unobservable inputs is wrong? Belief-based information flow [Clarkson et al. 2005]

Belief-based generalizes information-theoretic:
On single executions, the same
In expectation, the same …if user's/receiver's distribution is correct

39

Suppression and Confidentiality

Declassifier: program that reveals (leaks) some information; suppresses rest

Leakage: [Denning 1982, Millen 1987, Gray 1991, Lowe 2002, Clark et al. 2005, 2007, Clarkson et al. 2005, McCamant & Ernst 2008, Backes et al. 2009]

Thm. Leakage + Suppression is a constant

What isn't leaked is suppressed
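The intuition can be checked on a toy declassifier. In the sketch below (an illustration under the assumption that leakage is measured as the mutual information I(S, O) between the secret S and the declassified output O), what is not leaked shows up exactly as suppression H(S | O):

    from math import log2

    def entropy(dist):
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    # Toy declassifier: secret S is uniform on 0..7; it releases only S mod 2.
    joint = {(s, s % 2): 1 / 8 for s in range(8)}   # (S, O) pairs
    p_s = {s: 1 / 8 for s in range(8)}
    p_o = {0: 0.5, 1: 0.5}

    h_s, h_o, h_so = entropy(p_s), entropy(p_o), entropy(joint)
    leakage = h_s + h_o - h_so            # I(S, O) = 1 bit
    suppression = h_so - h_o              # H(S | O) = 2 bits
    print(leakage + suppression == h_s)   # True: leakage + suppression = H(S)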

40

Database Privacy

Statistical database anonymizes query results:

…sacrifices utility for privacy’s sake

[Diagram: the User's query goes to the Database; the response passes through the Anonymizer, which returns an anonymized response to the User]

41

Database Privacy

Statistical database anonymizes query results:
…sacrifices utility for privacy's sake
…suppresses to avoid leakage

[Diagram as before, with the anonymizer's specification anon. resp. := resp.]

42

Database Privacy

Statistical database anonymizes query results:
…sacrifices utility for privacy's sake
…suppresses to avoid leakage
…sacrifices integrity for confidentiality's sake

[Diagram as before]

43

K-anonymity

DB. Every individual must be anonymous within set of size k [Sweeney 2002]

Iflow. Every output corresponds to k inputs

…no bound on leakage or suppression

44

L-diversity

DB. Every individual's sensitive information should appear to have L (roughly) equally likely values [Machanavajjhala et al. 2007]

DB. Entropy L-diversity: H(anon. block) ≥ log L [Øhrn and Ohno-Machado 1999, Machanavajjhala et al. 2007]

Iflow. H(Tin | tout) ≥ log L

…implies suppression ≥ log L
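As a small illustration (not from the slides), consider a toy anonymizer that releases only the block an individual's record falls into, with L = 2 equally likely sensitive values per block; the suppression it introduces meets the log L bound:

    from math import log2

    # Toy model: sensitive value Tin uniform on {0,1,2,3}; the anonymizer
    # outputs only the block id, so each output covers L = 2 equally likely values.
    L = 2
    joint = {(t, t // L): 0.25 for t in range(4)}    # (Tin, Tout) pairs

    p_out = {}
    for (t, o), p in joint.items():
        p_out[o] = p_out.get(o, 0) + p

    # Suppression H(Tin | Tout) = -sum p(t,o) log2 p(t|o)
    suppression = -sum(p * log2(p / p_out[o]) for (t, o), p in joint.items())
    print(suppression >= log2(L))   # True: 1.0 bit >= log2(2)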

45

Differential Privacy

DB. [Dwork et al. 2006, Dwork 2006]
No individual loses privacy by including data in the database
Anonymized data set reveals no information about an individual beyond what other individuals already reveal

Iflow. Output reveals almost no information about individual input beyond what other inputs already reveal

…implies almost all information about individual suppressed
…quite similar to noninterference

46

Summary

Measures of information corruption:

Contamination (generalizes taint analysis, dual to leakage)

Suppression (generalizes program correctness, no dual)

Application: database privacy (model anonymizers; relate utility and privacy; security conditions)

47

More Integrity Measures

Attacker- and program-controlled suppression

Granularity: average over all executions; single executions; sequences of executions

…interaction of attacker with program

Applications: Error-correcting codes, malware

Quantification of Integrity

Michael Clarkson and Fred B. Schneider, Cornell University

MIT Systems Security and Cryptography Discussion Group, October 15, 2010
