
SafeNet’s Tokenization Manager Solution Overview


Agenda

> Overview – Tokenization Manager

> The need for Tokenization / Addressing Regulations

> Tokenization vs. Encryption

> SafeNet Tokenization Manager

> Solution Components

> Features and Benefits

> PCI Compliance


Compliance with Regulations

> Privacy laws and regulations require that personal information, such as Personally Identifiable Information (PII), personal health information, and payment card cardholder data, be protected by the organizations that acquire or process it.

> The Payment Card Industry Data Security Standard (PCI DSS) specifically requires that payment cardholder data be protected wherever it is stored, processed, or transmitted throughout the organization.

> Selecting the right solution can reduce the regulatory scope of PCI DSS compliance.

> For more information on selecting the best solution for PCI compliance, refer to the white paper "Tokenization vs. Encryption: How to Determine the Best Data Protection Solution for PCI Compliance".


The Traditional Way: Encryption

It works! It is still the most widely accepted method.

But…

> The risk of sensitive data exposure is not completely eliminated:

> When sensitive data is being processed, it still needs to be decrypted

> Applications processing sensitive data have access to the encryption/decryption keys

> Database manipulation is harder and in some cases requires decrypting and re-encrypting the sensitive data


What is Tokenization?

> Tokenization replaces sensitive data (credit card numbers, social security numbers, etc.) with a surrogate value: a token

> The sensitive data is encrypted and stored, together with the token and a keyed hash, in a single secure data vault

> The token, rather than the sensitive data, is stored, processed, or transmitted throughout the organization
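To make the mechanism concrete, here is a minimal Java sketch of the contract just described; the interface and method names are illustrative assumptions, not the actual Tokenization Manager API.

    // Illustrative sketch only; not the SafeNet Tokenization Manager API.
    // It captures the contract described above: a sensitive value goes in,
    // a surrogate token comes out, and only the secure vault can map back.
    public interface TokenizationService {

        /** Replaces a sensitive value (e.g. a PAN) with a surrogate token.
         *  The sensitive value is encrypted and stored in the secure data vault;
         *  only the token is returned to the caller. */
        String tokenize(String sensitiveValue);

        /** Looks the token up in the vault and returns the original value.
         *  Only components that genuinely need the clear value call this. */
        String detokenize(String token);
    }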


Tokenization Protects Sensitive Data

> Reducing regulatory scope and audit costs

> Regulations usually exclude from scope system components that are not exposed to sensitive data

> Reducing the risk of exposure of sensitive data

> Minimizing data blooming and data footprint

> Minimizing deployment costs and hassle

“Companies that do not store any electronic payment card data on-site are able to significantly reduce the scope of PCI compliance audits.” (Gartner Hype Cycle GRC, 2010)


Tokenization vs. Encryption

Both database encryption and tokenization help organizations comply with regulations.

Organizations need to check which solution best addresses their needs. Factors that influence this decision include (but are not limited to):

• Tokenization does not work on unstructured data

• Deployment costs

• Number of systems that can be taken out of scope

• Existing infrastructure and integration costs

Both SafeNet ProtectDB (for database encryption) and the Tokenization Manager use SafeNet’s DataSecure appliance to provide a single, centralized interface for logging, auditing, and reporting access to protected data, keys, and tokens.


SafeNet Tokenization Manager


SafeNet Ties It All Together for You: From the Data Center to the Cloud

[Architecture diagram: ProtectApp, ProtectDB, ProtectFile, StorageSecure, ProtectV, KMIP/APIs, ICAPI, LKM, and Tokenization Manager protect web/application servers, database servers, file shares, storage, cloud/virtualization, and proprietary systems; all are managed by DataSecure & KeySecure for enterprise crypto management (SNMP, NTP, SYSLOG).]


Reducing PCI DSS Audit Scope: SafeNet Tokenization Manager

Features

Replacement of sensitive data with data of a similar size that is not sensitive (a “token”)

1-to-1 mapping of tokens to sensitive data

Customization of token formats

Benefits

Systems with tokens are taken out of the scope of compliance audits such as PCI

Data protection is “transparent”: no changes to database tables or file layouts

Format preserving, meaning no application changes for systems that don’t handle data in the clear

Wide support of various data types

[Diagram: card numbers from systems processing sensitive data pass through the Tokenization Manager; downstream systems hold only tokens and are out of scope.]


Protecting Your Credit Card Numbers

> A merchant’s database contains customers’ credit card numbers.

> Initially, credit card numbers are stored without encryption, protected only by access-control measures.

> The credit card numbers are used across systems.

> The tokenization technology is meant to reduce regulatory scope and minimize the risk of exposure of the credit card information in storage.

> The process replaces card data with randomized numbers that are useless outside the transaction scope. The real data is then deleted from the merchant’s DB.

> The full CC numbers (up to 20 characters) are replaced with 20-character tokens created according to a defined format.

> Only tokens are then present in the data storage systems.

[Diagram: a sample card number, 5467 1009 4594 5420, is tokenized to 5487 9811 0948 5420; the Merchant DB (out of PCI scope) contains only tokens representing credit card numbers, while the secured DB holds the credit card numbers that were tokenized.]


Deploying SafeNet Tokenization Manager

[Deployment diagram: out-of-scope applications call SafeNet Tokenization Manager (multiple instances/hardware servers), which uses the Token Vault and the DataSecure appliance.]


SafeNet Tokenization Manager and PCI-DSS

PCI-DSS v2.0 defines tokenization as a method for protecting Primary Account Numbers (PAN) and for reducing the Cardholder Data Environment (CDE) and audit scope.

SafeNet Tokenization Manager complies with:

• PCI Tokenization Guidelines (published August 2011)

• VISA Tokenization Best Practices

SafeNet Tokenization Manager API Web Services provide End-to-End Tokenization, reducing regulatory scope to a minimum.

• End-to-End Tokenization takes the organization completely out of scope if the tokenization is performed at the service provider.


SafeNet Tokenization Manager for Service Providers: Tokenization as a Service (TaaS)


[Diagram: data from systems processing sensitive data flows through the Tokenization Manager; downstream systems are out of scope.]

Amazon.com: PAN, PII, payment transactions

Best Buy: PII, accounts, cardholder data

Kaiser: patient billing

Payment processor: Tokenization-as-a-Service

End-to-End TaaS benefits:

PAN is never stored locally

Easily integrated into existing workflows

Potentially zero exposure to PCI for merchants


Service Provider Environment: End-to-End TaaS

[Diagram: the plaintext PAN flows from the Point of Sale to the Payment Acquisition Server and SafeNet Tokenization Manager; only the token reaches the Merchant Back Office (out of PCI scope).]


SafeNet Tokenization Manager for TaaS

> Fully complies with PCI-DSS requirements

> API Web Services allow easy integration and clear segmentation between the CDE and non-CDE

> Elastic deployment and a business model that best fit a service-provider environment and pricing

> Support for different Token Vaults for different merchants


SafeNet’s Tokenization Formats

> Replacement of sensitive structured data (max. 20 characters) with data of a similar size that is not sensitive (a “token”)

> Stores the sensitive data in an encrypted protected zone, apart from the original data store, which now contains only the tokens

> Data format and representation can be preserved

> Tokens may be generated using a variety of formats: Random, Sequential, Last_Four, First_Six, First_Two_Last_Four, First_Six_Last_Four, Fixed_Nineteen, Fixed_Twenty_Last_Four

> Alternatively, the token format can be user-defined (version > 5.3 (v2.0))
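As an illustration of what a format such as First_Six_Last_Four produces, the following minimal Java sketch keeps the first six and last four digits of a PAN and randomizes the rest; the class and method names are assumptions for illustration, not the product's token-generation code.

    import java.security.SecureRandom;

    // Conceptual sketch of a First_Six_Last_Four style token (illustration only):
    // the first six and last four digits are preserved, the middle is randomized.
    public final class FirstSixLastFourExample {
        private static final SecureRandom RNG = new SecureRandom();

        static String firstSixLastFour(String pan) {
            String head = pan.substring(0, 6);
            String tail = pan.substring(pan.length() - 4);
            StringBuilder middle = new StringBuilder();
            for (int i = 0; i < pan.length() - 10; i++) {
                middle.append(RNG.nextInt(10));   // a random digit for each hidden position
            }
            return head + middle + tail;
        }

        public static void main(String[] args) {
            // A 16-digit input yields a 16-digit token with the same first six and last four digits.
            System.out.println(firstSixLastFour("5467100945945420"));
        }
    }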


Format Preserving Tokenization

> Format Preserving Tokenization (FPT) uses tokens that preserve the length and format of the sensitive data

> FPT facilitates the implementation of a tokenization solution by minimizing the need to modify applications and databases

> Support for multiple formats:

> Different formats for credit card numbers

> SSNs and other PII data

> Support for alphanumeric data

> Supports the PCI-DSS guideline that tokens be distinguishable from PANs

> Achieved through Luhn algorithm enforcement (sketched below)
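For reference, the Luhn (mod-10) check itself is sketched below in Java; a token generator can enforce, for example, that a generated token fails this check so it can never be mistaken for a valid PAN. The class name is illustrative.

    // Standard Luhn (mod-10) check, shown for reference; a tokenization engine can
    // enforce that generated tokens fail (or deliberately pass) this check so that
    // tokens remain distinguishable from real PANs.
    public final class LuhnCheck {

        static boolean passesLuhn(String digits) {
            int sum = 0;
            boolean doubleIt = false;             // every second digit from the right is doubled
            for (int i = digits.length() - 1; i >= 0; i--) {
                int d = Character.digit(digits.charAt(i), 10);
                if (doubleIt) {
                    d *= 2;
                    if (d > 9) d -= 9;            // equivalent to summing the two digits
                }
                sum += d;
                doubleIt = !doubleIt;
            }
            return sum % 10 == 0;
        }

        public static void main(String[] args) {
            System.out.println(passesLuhn("79927398713"));   // true: a classic Luhn test number
            System.out.println(passesLuhn("79927398710"));   // false
        }
    }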


Tokenization Solution Components

> The customer’s application: accesses the database to retrieve or insert sensitive data such as an SSN or CC number.

> Token Vault: holds the token and key information. Two tables are added: the Token Vault and the Key Table.

> Token Server: runs the Token Manager program and handles all tokenization functions. It can run on an application server or a Web server, either as an application or as a Web service.

> DataSecure appliance: performs all Token Manager crypto operations and holds the encryption key, the HMAC key, and the user owning the keys.


Token Generation

Token generation: plaintext (the sensitive information) is sent by the application with a request for tokenization (Insert Token):

> A keyed hash is generated using the hash key on DataSecure.

> A lookup on the hash is performed.

> If a hash already exists for the input value, the corresponding token is returned.

> If no hash exists: a token is generated, the original value is encrypted, and the token, ciphertext, and hash are written to the token vault.

De-tokenization: a token is sent by the application with a request for the plaintext value (Get Token):

> The token is looked up.

> The corresponding ciphertext is decrypted and sent back to the application.

[Diagram: the application calls the Token Servers, which use the Vault & Key Table in the protected zone and the DataSecure appliance, which holds the HMAC SHA-256 key and the AES-256 versioned key.]
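The Java sketch below restates the Insert Token and Get Token flows described above; the DataSecureClient and TokenVault interfaces are hypothetical stand-ins used only to show where each step happens, not the real client APIs.

    import java.security.SecureRandom;

    // Hedged sketch of the flows above, with illustrative interface names.
    public final class TokenServerSketch {

        interface DataSecureClient {                     // crypto is delegated to the DataSecure appliance
            byte[] hmacSha256(String plaintext);         // keyed hash using the HMAC key held on DataSecure
            byte[] aesEncrypt(String plaintext);         // AES-256 encryption with the versioned key
            String aesDecrypt(byte[] ciphertext);
        }

        interface TokenVault {                           // the Vault & Key Table in the protected zone
            String findTokenByHash(byte[] hash);         // returns null when no entry exists
            byte[] findCiphertextByToken(String token);
            void store(String token, byte[] ciphertext, byte[] hash);
        }

        private final DataSecureClient dataSecure;
        private final TokenVault vault;
        private final SecureRandom rng = new SecureRandom();

        TokenServerSketch(DataSecureClient dataSecure, TokenVault vault) {
            this.dataSecure = dataSecure;
            this.vault = vault;
        }

        /** Insert Token: return the existing token for this value, or create a new one. */
        String insertToken(String sensitiveValue) {
            byte[] hash = dataSecure.hmacSha256(sensitiveValue);        // 1. keyed hash on DataSecure
            String existing = vault.findTokenByHash(hash);              // 2. lookup on the hash
            if (existing != null) {
                return existing;                                        // 3. hash exists: return its token
            }
            String token = generateToken(sensitiveValue);               // 4a. generate a token
            byte[] ciphertext = dataSecure.aesEncrypt(sensitiveValue);  // 4b. encrypt the original value
            vault.store(token, ciphertext, hash);                       // 4c. write token, ciphertext, hash
            return token;
        }

        /** Get Token (de-tokenization): look the token up and decrypt its ciphertext. */
        String getPlaintext(String token) {
            byte[] ciphertext = vault.findCiphertextByToken(token);
            return dataSecure.aesDecrypt(ciphertext);
        }

        private String generateToken(String sensitiveValue) {
            // Simplest stand-in: a random digit string of the same length;
            // real formats (Random, First_Six_Last_Four, ...) were listed earlier.
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < sensitiveValue.length(); i++) {
                sb.append(rng.nextInt(10));
            }
            return sb.toString();
        }
    }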


Token Vault

Enc CC               | Hash CC          | Custom Data | Rotation Date | Creation Date | Token
AF6754CFD89          | 5647fdge6785aabc | Apple       | 12-06-2011    | 12-12-2010    | 1234567834561594
JKSDIUWUEUJS#*%%&@#( | 56e7fdge67wea7bc | Toy R Us    | 12-06-2011    | 10-10-2010    | 458045678654322

Token Vault VAULT1 [In PCI Scope]


Key Table

Token Vault Name | Key Enc Name     | Key Hash Name | Key Rotation Date
Vault1           | AES256_Enc_Key_1 | HMAC256_Key_1 | 12-12-2010
Vault2           | AES256_Enc_Key_2 | HMAC256_Key_2 | 10-10-2010

Key Table [In PCI Scope]. For each Token Vault there is one row in the Key Table.

[Diagram: the DataSecure appliance holds the AES-256 versioned key, the HMAC SHA-256 key, and the local DS key owner.]


Format Table

Format ID | Format Description | Lead Positions | Trail Positions | Lead Mask | Luhn Check | Token Length
101       | Null               | 0              | 4               | 7777      | 0          | 20
102       | Null               | 1              | 1               | Null      | 1          | 25

Format Table [In PCI Scope]


Using the Tokenization Manager

Getting a Token

> A user can check whether the data has already been tokenized: getToken() requires the input value and the token table name. For example: getToken("1234567890123456", null, Credit_Card_Numbers).

> A user can retrieve tokens by date: getTokensByDate() returns all the tokens created on or before the specified date. This method requires a Calendar date and the token table name. For example: getTokensByDate(null, today, Credit_Card_Numbers).
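A hedged usage sketch follows; the TokenManager interface below is a hypothetical stand-in whose argument types are inferred from the two calls on this slide (the middle null argument is assumed to be optional custom data), so the real client class may differ.

    import java.util.Calendar;
    import java.util.List;

    // Usage sketch only; the interface is an assumption shaped after the calls above.
    public class GetTokenExample {

        interface TokenManager {
            String getToken(String value, String customData, String tokenTable);
            List<String> getTokensByDate(String customData, Calendar date, String tokenTable);
        }

        static void demo(TokenManager tm) {
            // Has this card number already been tokenized in Credit_Card_Numbers?
            String token = tm.getToken("1234567890123456", null, "Credit_Card_Numbers");
            System.out.println(token == null ? "not tokenized yet" : "token: " + token);

            // All tokens created on or before today.
            Calendar today = Calendar.getInstance();
            List<String> tokens = tm.getTokensByDate(null, today, "Credit_Card_Numbers");
            System.out.println(tokens.size() + " tokens created on or before today");
        }
    }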


Using the Tokenization Manager:

Creating a New Persisted Custom Format

> If customers want to tokenize their data using a tokenization format that is not predefined by Tokenization Manager, they can create their own custom format.

> For example, to create a new token format that keeps no leading characters, keeps the last 3 characters, adds the leading mask "777" to each token, passes the Luhn check, and is always 25 characters long (independent of the input data), the user calls:

    int newFormat = createNewFormat(0, 3, "777", 1, 25);

and then uses it to create a new token:

    insert("1234 5678 1234", Credit_Card_Numbers, newFormat, true);
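Putting the two calls above together, a minimal end-to-end sketch might look like the following; only createNewFormat and insert come from the slide, while the wrapping interface, the return types, and the string form of the table name are assumptions.

    // Sketch combining the two calls on this slide; parameter names follow the
    // Format Table columns (lead positions, trail positions, lead mask, Luhn check, token length).
    public class CustomFormatExample {

        interface TokenManager {
            int createNewFormat(int leadPositions, int trailPositions,
                                String leadMask, int luhnCheck, int tokenLength);
            String insert(String value, String tokenTable, int formatId, boolean persist);
        }

        static void demo(TokenManager tm) {
            // Keep 0 leading and 3 trailing characters, prefix the mask "777",
            // enforce the Luhn check, and fix the token length at 25 characters.
            int newFormat = tm.createNewFormat(0, 3, "777", 1, 25);

            // Tokenize a value using the newly created custom format.
            String token = tm.insert("1234 5678 1234", "Credit_Card_Numbers", newFormat, true);
            System.out.println("token: " + token);
        }
    }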


More new features….

> Allows specifying the maximum token length for each token vault.

> Provides the ability to query the Tokenization Manager version.

> Updated ReKey CLI: passwords are removed from the command-line parameters; rekey progress and status updates are provided via log file entries; in case of a crash, the rekey process does not start from the beginning but resumes where it left off.

> Provides the ability to upgrade and roll back on DataSecure, the client side, and the database (Token Vault).

> Tokenization Manager tested with Axis2.

> For license tracking, the IP address of each WS client is logged.

> Allows ProServ to plug in code to support other databases (out of the box, Tokenization Manager supports only Oracle and SQL Server).


Benefits

The Tokenization Manager operates on SafeNet’s DataSecure, which complies with National Institute of Standards and Technology (NIST) requirements and has been FIPS 140-2 validated.

Format Preserving Tokenization minimizes the need to modify applications and databases that store, process, or transmit the token.

End-to-End Tokenization removes even the data entry points from scope.

The administration console includes a comprehensive logging and alerting system that reports abnormal activity in the tokenization/de-tokenization process.

SafeNet Tokenization Manager is highly scalable and can generate and retrieve millions of tokens per day.


Tokenization - Positioning

> Applicable for small pieces of data (SSNs, PANs, CC numbers)

> Some integration work needed (with the API or Web service)

> No changes to existing databases or 3rd-party applications

> The token preserves the original data format and fits into the original field

> Scalable solution (per instance: max. ~250 ops/sec for single calls, ~1000 ops/sec with bulk calls)

> Made for PCI-DSS compliance

> Reduces the scope of audits


Summary

SafeNet Tokenization Manager:

Supports Format Preserving Tokenization, minimizing the need to modify applications

Offers deployment elasticity to ensure a cost-effective implementation

Is managed on SafeNet’s DataSecure appliance, for robust key management and maximum security

Provides End-to-End Tokenization, reducing regulatory scope to a minimum

Is fully compliant with the PCI Tokenization Guidelines and the VISA Tokenization Best Practices

Tokenization is part of SafeNet’s Data Encryption and Control suite, protecting different types of data across multiple environments.

TaaS enables service providers to offer Tokenization as a Service to their customers while generating a new revenue stream.


SafeNet Data Encryption & Control Solutions

Data encryption solutions to achieve PCI DSS compliance


> Cryptographic solutions to meet PCI DSS requirements:

3.4 – Render PAN unreadable

3.5 – Protect any keys used

3.6.4 – Cryptographic key changes

4.1 – Use strong cryptography to safeguard sensitive cardholder data during transmission

7.1 – Least privileges

> Case study

> SafeNet solutions portfolio for PCI DSS compliance

> Q&A


PCI DSS 3.4: Render PAN unreadable


PCI DSS 3.5: Protect any keys used


PCI DSS 3.6.4: Cryptographic key changes


PCI DSS 4.1: Protect PAN while in transit


PCI DSS 7.1: Least privileges


PCI DSS 3.6, 8.2, 8.4, 10.5

• 3.6 Fully document and implement all the processes and procedures for the management of cryptographic keys used for encryption of cardholder data [...]

• 8.2 In addition to assigning a unique ID, use [...] [strong] authentication of all users

• 8.4 Render all passwords unreadable during transmission and storage on all system components using strong cryptography.

• 10.5 Secure audit trails so they cannot be changed.

PCI DSS Virtualization Guidelines

• p. 32 Do not virtualize critical resources used in the generation of cryptographic keys

• 4.1.4 Implement defense in depth […]; consider how security can be applied to protect each technical layer, including but not limited to […] VMs, […] application, and data layers.

Beyond PCI DSS

• Put here whatever local or international regulation is relevant

• …

Cryptographic foundation and data security solutions applicable not only to the previous requirements.