4th Annual Privacy & Security Workshop
Privacy by Design ®

From Anonymisation to Identification: The Technologies of Today and Tomorrow

Peter Hope-Tindall
Chief Privacy Architect™
dataPrivacy Partners Ltd.
[email protected]
November 7, 2003

© 2003 dataPrivacy Partners Ltd.



Biometrics Presentation © 2003 dataPrivacy Partners Ltd.

Agenda

• Biometrics and Privacy
• Privacy Concerns

• Design & Implementation Issues

• Technology to protect Privacy


Privacy

“Privacy is at the heart of liberty in the modern state.” Alan Westin

“the right to be let alone”* Warren & Brandeis

“the right to exercise control over your personal information.” Ann Cavoukian

* Warren and Brandeis, "The Right to Privacy" 4 Harvard Law Review 193 (1890). The phrase "right to be let alone" had been coined by Judge Cooley several years earlier. See THOMAS M. COOLEY, COOLEY ON TORTS 29

(2d ed. 1888).


Security and Privacy – a technical view

Security:
• authentication
• data integrity
• confidentiality
• access controls
• non-repudiation

Privacy:
• data protection - FIPs (not FIPS)

n.b. FIPs: Fair Information Practices
FIPS: Federal Information Processing Standards


Security vs. Privacy

Security:
• Accountable to President/CEO/Board of Directors.
• Risk-based assessment (how likely is it?).
• Access and use controls defined by the system owner.
• Has been focused on protecting against outsiders.

Privacy:
• Accountable to the data subject.
• Capabilities-based assessment (is it possible?).
• Access and use controls defined by use limitation, the consent of the data subject, and legislation.
• Protecting against outsiders, insiders and the system owner.


The Complex nature of Privacy

• Identity: measures the degree to which information is personally identifiable.
• Linkability: measures the degree to which data tuples or transactions are linked to each other.
• Observability: measures the degree to which identity or linkability may be impacted by the use of a system; which other data elements are visible, implicitly or explicitly.

With thanks and apologies to the Common Criteria


Biometrics

• Biometric is derived from the Greek words bio (life) and metric (the measure of).

• “The automated use of Physiological or Behavioral Characteristics to determine or verify identity”

International Biometric Group (IBG)

• “‘Biometrics’ are unique, measurable characteristics or traits of a human being for automatically recognizing or verifying identity. ”


Biometrics Schmetrics?

• Biometric: (noun) - one of various technologies that utilize behavioral or physiological characteristics to determine or verify identity. “Finger-scanning is a commonly used biometric.” Plural form also acceptable: “Retina-scan and iris-scan are eye-based biometrics."

• Biometrics: (noun) - Field relating to biometric identification. “What is the future of biometrics?”

• Biometric: (adjective) - Of or pertaining to technologies that utilize behavioral or physiological characteristics to determine or verify identity. “Do you plan to use biometric identification or older types of identification?”


Biometric Template

• A distinctive data file derived and encoded from the unique features of a biometric sample
• A basic element of biometric systems
• Templates, not samples, are used in biometric matching
• Contains much less data than the sample (1/100th to 1/1000th)
• Vendor-specific
• A different template is generated each time an individual provides a biometric sample
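Because a fresh template never exactly matches the enrolled one, matching is a threshold decision on a similarity score rather than an equality test. A minimal sketch in Python; the normalising "feature extraction" and the 0.95 operating point are illustrative assumptions, not any vendor's algorithm:

```python
import math

def extract_template(sample: list[float]) -> list[float]:
    """Toy 'feature extraction': normalise a raw sample vector.
    Real systems use vendor-specific, proprietary encodings."""
    norm = math.sqrt(sum(x * x for x in sample)) or 1.0
    return [x / norm for x in sample]

def match_score(t1: list[float], t2: list[float]) -> float:
    """Cosine similarity; 1.0 means identical templates."""
    return sum(a * b for a, b in zip(t1, t2))

# Two samples from the same finger are never identical,
# so the decision compares the score against a threshold.
enrolled = extract_template([0.9, 1.1, 2.0, 0.4])
candidate = extract_template([1.0, 1.0, 2.1, 0.5])
THRESHOLD = 0.95  # hypothetical operating point
print(match_score(enrolled, candidate) >= THRESHOLD)  # True
```

Raising the threshold makes the matcher stricter; the error-rate consequences of that choice are discussed under Biometric Performance below.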


Verification

• Also called 1:1 ‘Authentication’
• Performs comparison against a single biometric record
• Answers the question: “Am I who I say I am?”


Identification

• Also called 1:N Search
• Performs comparison against the entire biometric database
• Answers the question: “Who am I?”


Is DNA a biometric?

• DNA requires an actual physical sample
• DNA matching is not performed in real time
• DNA matching does not employ templates or feature extraction
• However, the policy issues and risks are identical


In a strict sense, then, DNA matching is not a biometric, in the same way that traditional forensic fingerprint examination is not a biometric.

Regardless of these distinctions, we believe that DNA-based technologies should be discussed alongside other biometric technologies inasmuch as they make use of a physiological characteristic to verify or determine identity. Beyond the definition, to most observers DNA looks like, acts like and may be used like other biometrics. The policy ramifications, while much more serious for DNA-based technologies, share some common attributes with other biometrics.


Taxonomy

Physiological Biometrics
• Finger Scanning
• Hand Geometry
• Facial Recognition
• Iris Scanning
• Retinal Scanning
• Finger Geometry

Behavioral Biometrics
• Voice Recognition
• Dynamic Signature Verification
• Keystroke Dynamics

(In reality all biometrics are both physiological and behavioral to some degree.)


Finger Scanning

Minutiae-based or pattern-based


Hand Geometry

Measures dimensions of hands

Easy to use / Widely used in access control applications


Facial Recognition

Based on distinctive facial features


Iris Scanning

Takes a picture of the iris and performs an analysis of the ‘features’ of the iris:
• Ridges
• Furrows
• Striations

Scan distance: up to 1 meter


Retinal Scanning

Utilizes distinctive patterns visible on retina at back of eye.


Finger Geometry

• Measures the shape and size of a single finger (or pair of fingers).


Voice Recognition

Performs an analysis of features from an audio waveform.


Dynamic Signature Verification

• Measures the pressure, vector and number of strokes of a signature.

• Can be used with existing signature applications.


Keystroke Dynamics

Measures the rhythm and distinctive timing patterns for keyboarding.


Other

• Ear Geometry

• Body Odour

• Gait (walking pattern)


Biometrics Summary

• Finger-Scanning: Accuracy High; Ease of Use High; User Acceptance Medium-Low; Stability High; Cost ***; Typical Applications: Traveler Clearance, Drivers License, Welfare; 1:1 Yes; 1:N Yes
• Hand Geometry: Accuracy High; Ease of Use High; User Acceptance Medium-High; Stability Medium-High; Cost ***; Typical Applications: Access Control, Traveler Clearance, Day Care; 1:1 Yes; 1:N No
• Facial Recognition: Accuracy High[i]; Ease of Use Medium-High; User Acceptance High; Stability Medium-Low; Cost ***; Typical Applications: Casino, Traveler Clearance; 1:1 Yes; 1:N Yes[ii]
• Iris Scanning: Accuracy Very High; Ease of Use Medium-Low; User Acceptance Medium-High; Stability High; Cost *****; Typical Applications: Prisons, Access Control, Traveler Clearance; 1:1 Yes; 1:N Yes
• Retinal Scanning: Accuracy Very High; Ease of Use Low; User Acceptance Low; Stability High; Cost ****; Typical Applications: Access Control, Traveler Clearance; 1:1 Yes; 1:N Yes
• Finger Geometry: Accuracy Medium; Ease of Use High; User Acceptance Medium-High; Stability Medium-High; Cost ***; Typical Applications: Access Control, Amusement Park Ticket Holder; 1:1 Yes; 1:N No
• Voice Recognition: Accuracy Medium; Ease of Use High; User Acceptance High; Stability Medium-Low; Cost *; Typical Applications: Low-security applications, telephone authentication; 1:1 Yes; 1:N No
• Signature Verification: Accuracy Medium; Ease of Use High; User Acceptance Medium-High; Stability Medium-Low; Cost **; Typical Applications: Low-security applications, applications with existing ‘signature’; 1:1 Yes; 1:N No

[i] Note: Although the ‘potential’ exists for high accuracy, recent pilot projects have indicated great difficulty in obtaining accurate results with 1:N systems.
[ii] Ibid.

Chart by Peter Hope-Tindall – developed for the OECD


How does a biometric system work?

• Scanning / Collection of Sample
• Feature Extraction
• Biometric template creation
• Biometric template matching
• Many vendors have proprietary searching subsystems and optimized hardware


Types of Function

• Identification (1:N): submission of a sample as a search candidate against the entire database
• Verification (1:1): validation of a sample against a presumed identity
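The two functions can be contrasted in a few lines of Python. This is a toy sketch: equality of small integer "templates" stands in for real similarity scoring, and all names and values are illustrative:

```python
def verify(candidate, claimed_id, db):
    """1:1 -- 'Am I who I say I am?': compare against one claimed record."""
    return db.get(claimed_id) == candidate

def identify(candidate, db):
    """1:N -- 'Who am I?': search the entire template database."""
    return [uid for uid, tmpl in db.items() if tmpl == candidate]

db = {"alice": 101, "bob": 202, "carol": 202}  # hypothetical template store

print(verify(202, "bob", db))   # True: one comparison against the claimed identity
print(identify(202, db))        # ['bob', 'carol']: every record is examined
```

Note that 1:N search can return more than one candidate, which is one reason identification systems are harder to get right than verification systems.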


Standard Biometric System

[Diagram: Sensor, Logic, Reference Database, Application]


[Diagram: the data subject presents to the biometric device, which performs capture and feature extraction. On the enrollment path, a reference template or dataset is created and stored in the database. On the verification path, a candidate match template or dataset is created and compared during biometric verification, with the result returned to the business application.]
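The enrollment and verification paths in the diagram can be sketched as follows. The hash-based `feature_extract` is a hypothetical stand-in for proprietary vendor extraction, and exact equality stands in for threshold matching:

```python
# Minimal sketch of a biometric system's two paths, assuming
# illustrative names; not a real vendor API.
reference_db: dict[str, int] = {}

def feature_extract(raw_sample: str) -> int:
    # Stand-in for proprietary feature extraction: a trivial stable hash.
    return sum(ord(c) for c in raw_sample)

def enroll(user_id: str, raw_sample: str) -> None:
    # Enrollment: create and store the reference template, not the sample.
    reference_db[user_id] = feature_extract(raw_sample)

def verify(user_id: str, raw_sample: str) -> bool:
    # Verification: create a candidate match template, compare 1:1.
    candidate = feature_extract(raw_sample)
    return reference_db.get(user_id) == candidate

enroll("alice", "ridge-pattern-A")
print(verify("alice", "ridge-pattern-A"))  # True
print(verify("alice", "ridge-pattern-B"))  # False
```

Keeping only templates in the database, as the slide notes, is what makes the stored data smaller and less directly reusable than raw samples.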


Metrics

• Scientific Method / Biometric Testing

“The real purpose of the scientific method is to make sure Nature hasn't misled you into thinking you know something you don't actually know.”

Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance


Perceptions

• Public perceptions
• Looking for a magic solution
• ‘Feel safe’ technology
• Post-terrorism opportunism
• Limited information


Biometric Performance

• “False Reject Rate”, a.k.a. False Non-Match Rate (FNMR)
• “False Acceptance Rate”, a.k.a. False Match Rate (FMR)
• “Equal Error Rate”
• Biometric System Error Trade-off


Equal error rate crossover

[Chart: False Accept and False Reject rates plotted against sensitivity; the two curves cross at the equal error rate.]
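The trade-off behind the crossover chart can be demonstrated numerically. The score distributions below are invented purely for illustration; raising the decision threshold lowers false matches but raises false non-matches:

```python
# Hypothetical match scores for same-person ("genuine") and
# different-person ("impostor") comparisons.
genuine  = [0.90, 0.85, 0.80, 0.75, 0.60]
impostor = [0.70, 0.55, 0.40, 0.30, 0.20]

def rates(threshold):
    """Return (FMR, FNMR) at a given decision threshold."""
    fnmr = sum(s < threshold for s in genuine) / len(genuine)    # false non-match
    fmr  = sum(s >= threshold for s in impostor) / len(impostor) # false match
    return fmr, fnmr

for t in (0.3, 0.5, 0.65, 0.8):
    fmr, fnmr = rates(t)
    print(f"threshold={t:.2f}  FMR={fmr:.2f}  FNMR={fnmr:.2f}")
# With these invented scores the two rates meet at threshold 0.65:
# that crossing point is the Equal Error Rate.
```

A lower threshold suits convenience-oriented 1:1 systems; a higher one suits security-critical deployments, at the cost of more false rejects.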


Other Metrics

• “Failure to Acquire”: missing fingers/eyes
• “Failure to Enroll”: insufficient features
• Throughput
• System Cost

May be as high as 2-4% in the general population (up to 20-30% in the elderly).


Publicly Available Independent Evaluations

• CESG: http://www.cesg.gov.uk/site/ast/index.cfm?menuSelected=4&displayPage=4
• Face Recognition Vendor Test: http://www.frvt.org
• Fingerprint Verification Competition: http://bias.csr.unibo.it/fvc2002
• US National Biometric Test Center: http://www.engr.sjsu.edu/biometrics/nbtccw.pdf


Security Concerns related to Biometrics

• Spoofing

• Countermeasures

• Replay Attacks

• Cannot revoke a biometric

• Improper Reliance

• Insufficient Enrolment Rigour


Liveness

Steve McCurry, photographer of ‘Afghan Girl’ portrait for National Geographic - 1984.

National Geographic

http://www.melia.com/ngm/0204/feature0/


Concerns about Biometric systems

• Rigour of enrollment process
• Lack of independent performance metrics
• No very-large-population biometric system examples
• Failure-to-enroll and failure-to-acquire underclass (maybe as high as 2-4%, or even 20-30%)
• Post-terrorism opportunism
• Technology panacea
• Large-scale biometric system failure


Privacy Concerns

1. Function Creep

2. Infrastructure of Surveillance/Unique Identifier
• Default method of identification
• Used inappropriately

3. Consent/Transparency

• Information Leakage
• Glaucoma
• DNA Profiling


Function Creep/Finality

• ‘Function Creep’ (also known as ‘purpose creep’) is the term used to describe the expansion of a process or system, where data collected for one specific purpose is subsequently used for another unintended or unauthorized purpose.

• In fair information practice terms, we may think of function creep as the subsequent use, retention or disclosure of data without the consent of the individual, and as an unauthorized change in the purpose specification for a given data collection.


Function Creep/Finality Example

• As an example, we may think of a social service (welfare) system that requires a finger scan to enroll. Let us assume that undertakings were made at enrollment to the user that the finger scan is being collected solely for the purposes of guarding against ‘double dipping’ (ensuring that the user is not already registered for welfare). If the finger scan were subsequently used for another purpose (e.g. a law enforcement purpose, something not described in the initial purpose specification) then we have ‘function creep’.


Infrastructure of Surveillance/Unique identifier

• An overarching concern for some people is that biometrics will become a technology of surveillance and social control. Perhaps as the ultimate personal identifier, they may be seen to facilitate all the ominous and dehumanizing aspects of an information society -- a society in which unparalleled amounts of personal information may be collected and used on a systematic basis.

see O’Connor, “Collected, Tagged, and Archived.”


Consent/Transparency

• Certain biometrics may be used without the consent or active participation (or indeed even the knowledge) of the individual.

• Iris scanning can already be performed at a substantial distance (a range of 18 to 24 inches)[i] from the subject. As the technology improves, it is quite likely that iris acquisition may take place from even greater distances and without any user involvement whatsoever.

• From a privacy perspective these situations can conflict with the collection limitation, openness and purpose specification principles.

[i] http://www.eweek.com/article2/0,3959,115743,00.asp


Implementation Modalities to Protect Privacy

• Statutory
• Policy
• Privacy Impact Assessment
• Threat Risk Assessment
• Common Criteria Scheme
• Standards
• Technology
• Tamper-proof hardware


Statutory

• In some jurisdictions, generalized or specific criminal sanction may be used to provide security protection for biometric systems and to outlaw certain activities to bypass security controls.

• Ontario Works Act
http://www.e-laws.gov.on.ca/DBLaws/Statutes/English/97o25a_e.htm

• Biometric Identifier Privacy Act – State of New Jersey

http://www.njleg.state.nj.us/2002/Bills/A2500/2448_I1.HTM


Statutory

• Statutory proscription and prohibition
• Problem: may always be modified or interpreted by the Government of the day
• Example: Statistics Canada 1906-1911 Census


Policy

• The Privacy Impact Assessment (PIA) and privacy audits can ensure that privacy policies are followed and that those policies meet the needs of a given level of privacy protection or compliance. Although these techniques are commonplace within government, they are just starting to appear in the private sector.

• Depends on the rigour and independence of the PIA process.


Technology

• STEPS: Security Technology Enabling Privacy
• Build security systems that are privacy-enabled
• Meet both Security and Privacy requirements
• Privacy Architecture
• De-Identification
• De-Linkability
• De-Observability
• Divide and conquer (similar to SIGINT)


Standard Biometric System

[Diagram: Sensor, Logic, Database, Application]


Standard Biometric System

• 1:1 and 1:N Functionality
• Maximizes Control for System Owner
• No Cards to lose
• Back-End Database Model
• Potential for Surveillance
• Greatest Potential for abuse


Smart Card Biometric System

[Diagram: Sensor, Logic, Database, Application]


Smart Card Biometric System

• 1:1 Functionality
• Balance of Control between System Owner and Data Subject
• Lost Card Issues
• Card Failure Issues
• Smart Card Infrastructure has Surveillance Potential
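The privacy property of the smart-card model is that the reference template travels with the data subject instead of sitting in a central database, so only 1:1 verification is possible. A toy sketch, with an illustrative card structure and a toy integer template:

```python
# Sketch of card-held verification, assuming hypothetical names;
# real cards store vendor-encoded templates and may match on-card.
from dataclasses import dataclass

@dataclass
class SmartCard:
    holder_id: str
    template: int   # toy template carried by the data subject

def verify_against_card(card: SmartCard, live_template: int) -> bool:
    # Comparison is against the single record the subject carries;
    # with no central database there is nothing to run a 1:N search over.
    return card.template == live_template

card = SmartCard("alice", 4711)
print(verify_against_card(card, 4711))  # True
print(verify_against_card(card, 9999))  # False
```

The trade-off listed above follows directly: losing the card, or a card failure, means losing the only copy of the reference template.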


Privacy Hardened Biometric System

• Designed for Toronto CIBS Project, 1997
• Tamperproof tokens in the scanner to prevent device substitution/direct image injection
• FPGA logic to restrict use of the system within preprogrammed guidelines
• Must be a live finger on the authorized scanner
• Discourage systematic ‘dumping’ of the identity database
• Keys required for identity resolution


Privacy Hardened 1:N System

[Diagram: Sensor, Logic, Database, Identity Resolver]


Privacy Hardened

[Diagram: the Identity Resolver maps pseudo-identities to real identities. Pseudo-identity records such as 17943568957845, 73458734857384 and 53798475839753 resolve to the real identity: Mr John Smith, 637-759-986, August 31st, 1953.]
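The separation above can be sketched in Python. The HMAC-based pseudo-identity and the single resolver key are illustrative assumptions about one way to realise "keys required for identity resolution"; a real deployment would split keys and custody far more carefully:

```python
# Sketch: the matching database stores only opaque pseudo-identities;
# a separately keyed resolver maps them to real identities, so dumping
# the biometric database reveals no names.
import hashlib
import hmac

RESOLVER_KEY = b"held-by-oversight-body"  # hypothetical; kept off the matching system

real_identities = {}  # pseudo-id -> personal data, reachable only via the resolver

def pseudo_id(record_number: str) -> str:
    return hmac.new(RESOLVER_KEY, record_number.encode(), hashlib.sha256).hexdigest()[:14]

def enroll(record_number: str, personal_data: dict) -> str:
    pid = pseudo_id(record_number)
    real_identities[pid] = personal_data
    return pid  # only this opaque value sits beside the biometric template

def resolve(pid: str, key: bytes):
    # Identity resolution fails without the key.
    if not hmac.compare_digest(key, RESOLVER_KEY):
        return None
    return real_identities.get(pid)

pid = enroll("637-759-986", {"name": "Mr John Smith", "dob": "1953-08-31"})
print(resolve(pid, b"wrong-key-here!!!!!!!!"))  # None
print(resolve(pid, RESOLVER_KEY))               # the real identity record
```

Holding `RESOLVER_KEY` with an oversight body rather than the system operator mirrors the "divide and conquer" approach mentioned earlier.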


Wildcard Option - Biometric Encryption

• Biometric Encryption• Feature Extraction provides an

encryption/decryption key• Promising techniques

• Optical Feature Extraction

• Fourier Transform of ‘visual’ plaintext using Biometric Feature data resulting in ‘visual’ ciphertext

• One Installed site in Canada• Needs further research to bring to

market


Biometric Encryption

Enrollment: the fingerprint pattern encrypts the PIN (73981946); only the encrypted PIN (%h*9%4Kd) is stored.

Authentication: the fingerprint pattern decrypts the stored value (%h*9%4Kd); the recovered PIN (73981946) is used for access.
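A toy illustration of that flow: the fingerprint pattern acts as a key that encrypts the PIN at enrollment and decrypts it at authentication, so neither the raw PIN nor a reusable biometric template needs to be stored. Real biometric encryption (e.g. the optical Fourier techniques above) must tolerate noisy samples; this sketch assumes an identical pattern both times, and all names are illustrative:

```python
# Hedged sketch: derive a keystream from the pattern and XOR the PIN.
import hashlib

def pattern_key(pattern: bytes, length: int) -> bytes:
    # Stand-in key derivation from extracted biometric features.
    return hashlib.sha256(pattern).digest()[:length]

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

pin = b"73981946"
pattern = b"fingerprint-pattern"  # stand-in for extracted features

# Enrollment: only the encrypted PIN is stored.
stored = xor(pin, pattern_key(pattern, len(pin)))
# Authentication: the same pattern regenerates the key and recovers the PIN.
recovered = xor(stored, pattern_key(pattern, len(pin)))
print(recovered)  # b'73981946'
```

The open research problem the slide alludes to is exactly the gap this sketch hides: deriving the same key from two noisy, never-identical biometric samples.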


from: http://www.darpa.mil/iao/HID.htm


Recommendations

• Communicate openly and honestly about any planned system.
• Prefer smaller, inward-looking systems.
• Focus on 1:1 authentication systems instead of 1:N identification systems.
• Whenever possible, develop opt-in, voluntary enrollment systems.


Recommendations

• Collect biometric samples openly and with the consent of the user.
• If possible, allow the user to retain custody of the biometric template (perhaps on a smart card or token) and do not store the biometric template in a central system.
• Where 1:N systems are required, craft protections in legislation/policy and technology.
• Oversight: restrictions on system usage/identity resolution.


Hope for the future

• Biometric Encryption

• Biometric sensor on a card

• Credential vs. Certificate (Brands)

• Trusted Extension of Self


Conclusions

• We need to incorporate statutory, policy and technological controls.

• Engage the issues honestly and openly.

• Don’t use a hammer to kill a fly.


Our Challenge

• Open discussion
• The technology is not evil
• Develop the best Technology
• Develop the best Policy
• Develop the best Statutory Protections
• Raise the bar
• Search for improvement


Epilogue

• Objectivity of privacy
• Question the appropriateness of ‘Public Acceptance’ as a measurement of anything
• Useful at telling us what is wrong
• Not so useful at telling us what is right
• Storing Template/Minutiae/Image: the privacy concerns are identical


Resources

• OECD: http://www.oecd.org
• Information and Privacy Commissioner/Ontario: http://www.ipc.on.ca
• dataPrivacy Partners Ltd.: http://www.dataprivacy.com
• Roger Clarke: http://www.anu.edu.au/people/Roger.Clarke/
• Biometric Consortium (US): http://www.biometrics.org
• CATA Biometrics Group (Canada): http://www.cata.ca/biometrics/


Contact Information

Peter Hope-Tindall
dataPrivacy Partners Ltd.
5744 Prairie Circle
Mississauga, ON L5N 6B5
+1 (416) 410-0240
[email protected]


[email protected]

http://www.dataprivacy.com