SHARE Orlando 2011 – Ulf Mattsson – Session 9353
PCI Compliance Without Compensating Controls – How to Take Your Mainframe Out of Scope
Ulf Mattsson, CTO, Protegrity
August 8, 2011 – Session 9353
Ulf Mattsson
• 20 years with IBM Software Development
• Received US Green Card 'EB 11 – Individual of Extraordinary Ability', endorsed by IBM Research
• Inventor of 21 patents, covering Encryption Key Management, Policy Driven Data Encryption, Distributed Tokenization and Intrusion Prevention
• Research member of the International Federation for Information Processing (IFIP) WG 11.3 Data and Application Security
• Created the architecture of the Protegrity Database Security Technology
• Received the industry's 2008 Most Valuable Performers (MVP) award together with technology leaders from IBM, Google, Cisco, Ingres and other leading companies
Best Source of Incident Data
"It is fascinating that the top threat events in both 2010 and 2011 are the same and involve external agents hacking and installing malware to compromise the confidentiality and integrity of servers."
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team
Compromised Data Types – # Records
[Chart: percentage of compromised records by data type.]
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
Attacks at Different System Layers
[Diagram: attacks at different system layers – SQL injection against the application layer, malware/trojans at data entry, database attacks, file attacks on the file system, media attacks on storage and backup, and sniffer attacks on the network. Threat actors include authorized and unauthorized users, HW service contractors, vendors, database administrators and system administrators.]
"The perimeter is gone – need for new security approaches"
PCI DSS - Payment Card Industry Data Security Standard
• Applies to all organizations that hold, process, or exchange cardholder information
• A worldwide information security standard defined by the Payment Card Industry Security Standards Council (formed in 2004)
• Began as five different programs: Visa Card Information Security Program, MasterCard Site Data Protection, American Express Data Security Operating Policy, Discover Information Security and Compliance, and the JCB Data Security Program
• 12 requirements for compliance, organized into six logically related groups called "control objectives"
PCI DSS #3, 6, 7, 10 & 12
Build and maintain a secure network:
1. Install and maintain a firewall configuration to protect data
2. Do not use vendor-supplied defaults for system passwords and other security parameters
Protect cardholder data:
3. Protect stored data
4. Encrypt transmission of cardholder data and sensitive information across public networks
Maintain a vulnerability management program:
5. Use and regularly update anti-virus software
6. Develop and maintain secure systems and applications
Implement strong access control measures:
7. Restrict access to data by business need-to-know
8. Assign a unique ID to each person with computer access
9. Restrict physical access to cardholder data
Regularly monitor and test networks:
10. Track and monitor all access to network resources and cardholder data
11. Regularly test security systems and processes
Maintain an information security policy:
12. Maintain a policy that addresses information security
PCI DSS #3 & 4 – Protect Cardholder Data
• 3.4 Render PAN, at minimum, unreadable anywhere it is stored by using any of the following approaches:
  • One-way hashes based on strong cryptography
  • Truncation
  • Index tokens and pads (pads must be securely stored)
  • Strong cryptography with associated key-management processes and procedures
• 4.1 Use strong cryptography to safeguard sensitive cardholder data during transmission over open, public networks
• Comments – cost-effective compliance:
  • Encrypted PAN is always "in PCI scope"
  • Tokens can be "out of PCI scope"
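Two of the 3.4 options can be sketched in a few lines; a minimal illustration in Python using only the standard library (the salt handling and the "mask all but the last 4" truncation rule are assumptions for the example, not PCI guidance):

```python
import hashlib
import secrets

def truncate_pan(pan: str) -> str:
    """Truncation: store only the last 4 digits, mask the rest."""
    digits = pan.replace(" ", "")
    return "x" * (len(digits) - 4) + digits[-4:]

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way hash based on strong cryptography (SHA-256 with a secret salt)."""
    digits = pan.replace(" ", "").encode()
    return hashlib.sha256(salt + digits).hexdigest()

salt = secrets.token_bytes(16)  # the salt must itself be stored securely
print(truncate_pan("3872 3789 1620 3675"))  # xxxxxxxxxxxx3675
print(hash_pan("3872 3789 1620 3675", salt))
```

Note the trade-off the slide points at: the hash is irreversible but still derived from the PAN (so it stays in scope under some interpretations), while a random token has no mathematical relationship to the PAN at all.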
PCI DSS - Appendix B: Compensating Controls
• Compensating controls may be considered for most PCI DSS requirements when an entity cannot meet a requirement explicitly as stated, due to legitimate technical or documented business constraints, but has sufficiently mitigated the associated risk through other, compensating, controls.
• Compensating controls must satisfy the following criteria:
  • Meet the intent and rigor of the original PCI DSS requirement
  • Provide a similar level of defense as the original requirement, such that the compensating control sufficiently offsets the risk the original requirement was designed to defend against
  • Be "above and beyond" other PCI DSS requirements (simply being in compliance with other PCI DSS requirements is not a compensating control)
PCI DSS - Network Segmentation
• Network segmentation – isolating the cardholder data environment from the remainder of an entity's network – is not a PCI DSS requirement.
• However, it is strongly recommended as a method that may reduce:
  • The scope of the PCI DSS assessment
  • The cost of the PCI DSS assessment
  • The cost and difficulty of implementing and maintaining PCI DSS controls
  • The risk to an organization (reduced by consolidating cardholder data into fewer, more controlled locations)
Speed of Different Protection Methods
[Chart: transactions per second for 16-digit fields, log scale from 100 to 10,000,000, comparing Format Preserving Encryption, Data Type Preservation Encryption, Memory Data Tokenization, AES CBC Standard Encryption and Traditional Data Tokenization.]
*: Speed will depend on the configuration
Security of Different Protection Methods
[Chart: relative security level, from low to high, for Format Preserving Encryption, Data Type Preservation Encryption, Memory Data Tokenization, AES CBC Standard Encryption and Traditional Data Tokenization.]
Speed and Security
[Chart: speed (transactions per second, 16-digit fields, log scale from 100 to 10,000,000) and security level plotted together for Format Preserving Encryption, Data Type Preservation Encryption, Memory Data Tokenization, AES CBC Standard Encryption and Traditional Data Tokenization.]
*: Speed will depend on the configuration
Different Approaches for Tokenization
• Traditional Tokenization
  • Dynamic model or pre-generated model
  • 5 – 5,000 tokenizations per second
• Next Generation Tokenization
  • Memory tokenization
  • 200,000 – 9,000,000+ tokenizations per second
  • "The tokenization scheme offers excellent security, since it is based on fully randomized tables." *
  • "This is a fully distributed tokenization approach with no need for synchronization and there is no risk for collisions." *
*: Prof. Dr. Ir. Bart Preneel, Katholieke Universiteit Leuven, Belgium
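The table-based idea behind memory tokenization can be sketched roughly as follows; a minimal Python illustration, assuming one randomized substitution table per digit position (real products generate, distribute and protect their tables very differently, and use far stronger constructions – this only shows why lookups need no central database or synchronization):

```python
import random

rng = random.SystemRandom()  # OS-provided randomness for the tables

def build_tables(positions: int) -> list[list[int]]:
    """One randomized permutation of the digits 0-9 per position."""
    tables = []
    for _ in range(positions):
        perm = list(range(10))
        rng.shuffle(perm)
        tables.append(perm)
    return tables

def tokenize(pan: str, tables: list[list[int]]) -> str:
    """Replace each digit via its position's table: pure in-memory reads,
    no token database, and (being a bijection) no collisions."""
    return "".join(str(tables[i][int(d)]) for i, d in enumerate(pan))

tables = build_tables(16)
token = tokenize("3872378916203675", tables)
```

Because each position's table is a permutation, the overall mapping is a bijection on the 16-digit space, so two different PANs can never collide on the same token; any server holding a copy of the same tables produces identical tokens without coordinating.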
Evaluating Encryption & Tokenization Approaches
[Slide table: evaluation criteria rated from Best to Worst (ratings shown graphically) across four approaches – Database File Encryption, Database Column Encryption, Centralized Tokenization (old) and Memory Tokenization (new). Criteria: Scalability, Availability, Latency, CPU Consumption, Security, Data Flow Protection, Compliance Scoping, Key Management, Randomness, Separation of Duties.]
Evaluating Field Encryption & Distributed Tokenization
[Slide table: evaluation criteria rated from Best to Worst (ratings shown graphically) for Strong Field Encryption, Formatted Encryption and Memory Tokenization. Criteria: Disconnected environments, Distributed environments, Performance impact when loading data, Transparent to applications, Expanded storage size, Transparent to database schema, Long life-cycle data, Unix or Windows mixed with "big iron" (EBCDIC), Easy re-keying of data in a data flow, High risk data, Security – compliance to PCI, NIST.]
Token Flexibility for Different Categories of Data

Type of Data     Input                  Token                  Comment
-- Token properties --
Credit Card      3872 3789 1620 3675    8278 2789 2990 2789    Numeric
Medical ID       29M2009ID              497HF390D              Alpha-numeric
Date             10/30/1955             12/25/2034             Date
E-mail Address   [email protected]  [email protected]  Alpha-numeric, delimiters in input preserved
SSN              075-67-2278            287-38-2567            Numeric, delimiters in input preserved
-- Policy masking --
Credit Card      3872 3789 1620 3675    8278 2789 2990 3675    Numeric, last 4 digits exposed
Credit Card      3872 3789 1620 3675    3872 37## #### ####    Presentation mask: expose first 6 digits (clear, encrypted or tokenized at rest)
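Two of the token properties above – "last 4 digits exposed" and the first-6 presentation mask – can be sketched in Python (the random-digit generation strategy is an assumption for illustration; a real token server additionally guarantees uniqueness and protects its tables or vault):

```python
import secrets

def tokenize_keep_last4(pan: str) -> str:
    """Numeric token preserving format, delimiters and the last 4 digits."""
    digits = [c for c in pan if c.isdigit()]
    head = [str(secrets.randbelow(10)) for _ in range(len(digits) - 4)]
    new_digits = head + digits[-4:]
    # Re-insert original delimiters (spaces, dashes) in their positions.
    out, i = [], 0
    for c in pan:
        if c.isdigit():
            out.append(new_digits[i]); i += 1
        else:
            out.append(c)
    return "".join(out)

def present_mask(pan: str) -> str:
    """Presentation mask: expose the first 6 digits, mask the rest with '#'."""
    seen, out = 0, []
    for c in pan:
        if c.isdigit():
            seen += 1
            out.append(c if seen <= 6 else "#")
        else:
            out.append(c)
    return "".join(out)

print(present_mask("3872 3789 1620 3675"))  # 3872 37## #### ####
```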
Some Tokenization Use Cases
• Customer 1
  • Vendor lock-in: what if we want to switch payment processor?
  • Performance challenge: what if we want to rotate the tokens?
  • Performance challenge with initial tokenization
• Customer 2
  • Reduced PCI compliance cost by 50%
  • Performance challenge with initial tokenization
  • End-to-end: looking to expand tokenization to all stores
• Customer 3
  • Desired a single vendor
  • Desired use of encryption and tokenization
  • Looking to expand tokens beyond CCN to PII
• Customer 4
  • Remove compensating controls on the mainframe
  • Pushing tokens through to avoid compensating controls
Tokenization Use Case #2
• A leading retail chain
  • 1,500 locations in the U.S. market
• Simplify PCI compliance
  • 98% of use cases out of audit scope
  • Ease of install (had 18 PCI initiatives at one time)
• Tokenization solution was implemented in 2 weeks
  • Reduced PCI audit from 7 months to 3 months
  • No 3rd-party code modifications
  • Proved to be the best-performing option
  • 700,000 transactions per day
  • 50 million cardholder data records
  • Conversion took 90 minutes (plan was 30 days)
  • Next step: tokenization servers at 1,500 locations
Case Study 1: Goal – PCI Compliance & Application Transparency
[Diagram: retail store applications with credit card entry and file encryption (Windows) send files via FTP to the central HQ location, where files are decrypted and data is protected with database encryption (DB2 on z/OS and iSeries, Oracle, SQL Server) and file encryption (Windows, UNIX, Linux, z/OS). Encryption services are deployed at each point.]
Case Study 2: Goal – Addressing Advanced Attacks & PCI DSS
[Diagram: end-to-end encryption (E2EE) from credit card entry at the retail store, through application-level encryption and FTP file transfer, to the central HQ location with database encryption (DB2, SQL Server) and file encryption (Windows, UNIX, z/OS). Encryption services are deployed at each point.]
Encryption Topologies – Mainframe Example
[Diagram: DB2 on the mainframe (z/OS) can invoke encryption locally or remotely. Local encryption: UDF/VIEW (User Defined Function), EDITPROC or FIELDPROC calling ICSF (Integrated Cryptographic Services Facility) and CPACF (CP Assist for Cryptographic Function, formerly CCF), at roughly 1 microsecond per 20-byte operation. Remote encryption: UDF/VIEW over TCP/IP to a crypto server backed by a key server, at roughly 1,000 microseconds per 20-byte operation.]
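The 1 µs vs. 1,000 µs figures on the topology slide translate directly into throughput ceilings; a back-of-the-envelope calculation in Python (the 50-million-row batch size is an illustration value, echoing the use case earlier in the deck):

```python
# Per-field encryption cost from the topology slide (20-byte fields).
LOCAL_US = 1        # local ICSF/CPACF call, microseconds
REMOTE_US = 1_000   # remote crypto server over TCP/IP, microseconds

def max_fields_per_second(cost_us: float) -> float:
    """Upper bound on encrypted fields per second for one serial stream."""
    return 1_000_000 / cost_us

def batch_hours(rows: int, fields_per_row: int, cost_us: float) -> float:
    """Time to encrypt a batch load, in hours, single-threaded."""
    return rows * fields_per_row * cost_us / 1_000_000 / 3600

print(max_fields_per_second(LOCAL_US))        # 1,000,000 fields/s locally
print(max_fields_per_second(REMOTE_US))       # 1,000 fields/s remotely
print(batch_hours(50_000_000, 1, REMOTE_US))  # ~13.9 hours for 50M rows remotely
```

The three-orders-of-magnitude gap is why the deck positions local (in-address-space or CPACF-assisted) encryption as the only viable topology for batch loads and data-warehouse scans.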
Column Encryption Performance – Different Topologies
[Chart: rows decrypted per second (100-byte rows), log scale from 1,000 to 1,000,000, for data loading (batch) and queries (data warehouse & OLTP), comparing network-attached encryption (SW/HW) with local encryption (SW/HW) using z/OS hardware crypto – CPACF, all operations.]
Evaluation of Encryption Options for DB2 on z/OS
[Slide table: encryption interfaces (API, UDF on DB2 V8, UDF on DB2 V9, Fieldproc, Editproc) rated from Best to Worst (ratings shown graphically) on Performance, PCI DSS, Security and Transparency.]
Different Tokenization Approaches – Performance
[Chart: PAN tokenizations per second, log scale from 5 to 200,000, comparing the old centralized tokenization approach (outsourced and on-site, enterprise total) with the new distributed tokenization approach (on-site, per deployed token server).]
Evaluating Different Tokenization Implementations
[Slide table: evaluation criteria rated from Best to Worst (ratings shown graphically) across hosted/outsourced and on-site/on-premises deployments, each in central (old), distributed and integrated variants. Criteria by area – Operational Needs: Availability, Scalability, Performance; Pricing Model: Per Server, Per Transaction; Data Types: Identifiable (PII), Cardholder (PCI); Security: Separation, Compliance Scope.]
PCI DSS - Ways to Render the PAN* Unreadable
• Two-way cryptography with associated key management processes
• One-way cryptographic hash functions
• Index tokens and pads
• Truncation (or masking – xxxxxx xxxxxx 6781)
*: PAN: Primary Account Number (Credit Card Number)
How to Not Break the Data Format
[Slide: a clear-text data field (CCN/PAN: 123456 777777 1234) shown under different protection methods. Hashing (!@#$%a^&*B()_+!@4#$2%p^&*) changes both length and type; binary encryption (!@#$%a^&*B()_+!@) changes the type; alpha encoding (aVdSaH gF4fJh sDla) and partial encoding (123456 123456 1234) preserve more of the format; tokenizing or formatted encryption (666666 777777 8888) preserves both length and type.]
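The format impact of these methods is easy to reproduce with standard-library calls; a small Python sketch (base64 stands in for alpha encoding, and the token is randomly generated purely for illustration):

```python
import base64
import hashlib
import secrets

pan = "1234567777771234"

# Hashing: fixed-length hex output, so both field length and type change.
hashed = hashlib.sha256(pan.encode()).hexdigest()  # always 64 hex chars

# Alpha encoding (base64 of the bytes): type changes, storage expands.
encoded = base64.b64encode(pan.encode()).decode()

# Tokenizing / formatted encryption: same length, same type (all digits),
# so existing database schemas and applications keep working unchanged.
token = "".join(str(secrets.randbelow(10)) for _ in pan)

print(len(hashed), len(encoded), len(token))  # 64 24 16
```

Only the last output could be stored back into a CHAR(16)/NUMERIC column without schema changes, which is the transparency argument the slide is making.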
Different Security Options for Data Fields
[Slide table: evaluation criteria rated from Best to Worst (ratings shown graphically) for Strong Encryption, Formatted Encryption, New Distributed Tokenization and Old Central Tokenization. Criteria: Disconnected environments, Distributed environments, Performance impact – data loading, Transparent to applications, Expanded storage size, Transparent to database schema, Long life-cycle data, Unix or Windows & "big iron", Re-keying of data in a data flow, High risk data, Compliance to PCI, NIST.]
Choose Your Defenses – Positioning of Alternatives
[Slide table: database protection approaches (Monitoring/Blocking/Masking, Column-Level Formatted Encryption, Column-Level Strong Encryption, Distributed Tokenization, Central Tokenization, Database File Encryption) rated from Best to Worst (ratings shown graphically) on Performance, Storage, Availability, Transparency and Security.]
Data Protection Challenges
• Actual protection is not the challenge
• Management of solutions
  • Key management
  • Security policy
  • Auditing and reporting
• Minimizing impact on business operations
  • Transparency
  • Performance vs. security
• Minimizing the cost implications
• Maintaining compliance
• Implementation time
Single Point of Control for Data Encryption
[Diagram: a central manager for encryption keys, security policy and reporting controls encryption services across the mainframe z/OS environment (RACF, applications, DB2 z/OS, files, ICSF, hardware security) and other platforms (DB2 LUW, Informix, System i and others), accessed via API and hardware security.]
Data Security Management – Broad Platform Support
[Diagram: an Enterprise Data Security Administrator manages policy, secure distribution, audit log, secure collection and secure archive centrally, across Database Protectors, File System Protectors, Application Protectors and Tokenization Servers.]
Hiding Data in Plain Sight – Data Tokenization
[Diagram: at data entry, the cleartext value 400000 123456 7899 is sent to the tokenization server, which stores the protected value (Y&SFD%))S() and returns the token 400000 222222 7899 to applications and databases.]
What is Encryption and Tokenization?
• Encryption is a cipher system: it uses cryptographic algorithms and cryptographic keys.
• Tokenization is a code system: it uses code books and index tokens.
Source: McGraw-Hill Encyclopedia of Science & Technology
Comments on Visa's Tokenization Best Practices
• Visa's recommendation should simply have been to use a random number
• You should not write your own "home-grown" token servers
Reducing the Attack Surface
[Diagram: applications and databases holding unprotected sensitive information (cleartext PANs such as 123456 999999 1234) are replaced by data tokens (123456 123456 1234), leaving only a few controlled locations with protected sensitive information.]
Positioning of Different Protection Options
[Slide table: Strong Encryption, Formatted Encryption and Data Tokens rated from Best to Worst (ratings shown graphically) on Security & Compliance, Total Cost of Ownership and Use of Encoded Data.]
Why Tokenization – A Triple Play
1. No masking
2. No encryption
3. No key management
Why In-memory Tokenization
1. Better
2. Faster
3. Lower cost / TCO