Software Security Metrics

Software Security Metrics OWASP AppSec California, January 2016 Caroline Wong, CISSP, Director


TRANSCRIPT

Page 1: Software Security Metrics

Software Security Metrics

OWASP AppSec California, January 2016

Caroline Wong, CISSP, Director

Page 2: Software Security Metrics

• “What most people do when faced with creating a metrics program is calculate a few measurements that seem interesting on the surface. This is the traffic light approach that oversimplifies the data.

• Or they barrage the audience with a ton of detailed metrics that overwhelm the reader.

• But for most organizations, none of that works.

• And what happens if you just do nothing? Then you have little to no understanding of the effectiveness of your AppSec program.”

• BSIMM Community Member, 2015

Page 3: Software Security Metrics

Agenda

1. Questions from Executives

2. AppSec Capabilities and Metrics

3. Common Metrics Scenarios

4. Developing Key Metrics

5. A Detailed Example

Page 4: Software Security Metrics

Questions from Executives

• More often than not, company executives ask the “wrong” questions about AppSec:

• How does our bug count compare to that of our competitors?

o This data is often not available, and even if it is, it’s very hard to find an “apples to apples” comparison.

• What’s our mean time to recover from a security incident?

o Mean time to recover is largely outside of your control. It depends on the incident!

Page 5: Software Security Metrics

Questions from Executives

• What about when executives ask the “right” questions?

• “We’ve invested so much money into the AppSec program…

• What’s the impact on the firm’s risk posture?

• What value are we getting out of the dollars spent?”

• What kinds of questions are your executives asking about your program?

• How do you respond? What challenges do you face in answering their questions?

Page 6: Software Security Metrics

Why Metrics?

• Execs (and customers, auditors, regulators, etc.) want to know about risk management.

• How do you talk about AppSec and risk management?

• Good software helps business.

• Bad software hurts business.

o We’re doing all of these things to make our software good and prevent it from being bad.

Page 7: Software Security Metrics

Why Metrics?

• But how do we know we’re doing the right things? How do we know if we’re doing enough? Too much? Too little?

1. Start with risk management objectives
2. Ask questions about managing risk
3. Answer those questions with data based on the activities you are doing

Page 8: Software Security Metrics

Vocabulary

• Measurement vs. Metric – what’s the difference?

• A measurement is the value of a specific characteristic of a given entity

• A metric is the aggregation of one or more measurements to create a piece of business intelligence.

o What is the question the metric answers?
o What is the decision the metric supports?
o What is the environmental context?

• What types of measurements do you collect from your program?

• What types of questions and decisions do these measurements help to support?
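As a concrete (hypothetical) illustration of the distinction: each raw observation below is a measurement, and the aggregated percentage that answers a stakeholder question is the metric. The data and application names are invented for illustration only.

```python
# Hypothetical illustration: measurements are raw observations;
# a metric aggregates them to answer a specific question.

# Measurements: one value per static-analysis scan (open findings per application).
scan_findings = {"app-payments": 12, "app-portal": 3, "app-reports": 0}

# Metric: "How many applications have zero open findings?"
# (supports the decision of which applications can be signed off)
apps_total = len(scan_findings)
apps_clean = sum(1 for count in scan_findings.values() if count == 0)
pct_clean = 100.0 * apps_clean / apps_total

print(f"{apps_clean}/{apps_total} applications clean ({pct_clean:.0f}%)")
```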

Page 9: Software Security Metrics

Security Metrics Phases

Phase 1 Phase 2 Phase 3

Page 10: Software Security Metrics

AppSec Capabilities

Capability Maturity

1. Risk Identification (Find defects)

2. Policy Compliance (Require testing)

3. Risk Reduction (Fix defects)

4. Risk Prevention (Prevent defects)

How mature are the capabilities in your program?

Page 11: Software Security Metrics

AppSec Metrics

Capability Maturity vs. Data Availability

1. Defect Discovery Participation / Coverage Metrics

2. Policy Compliance Metrics

3. Effectiveness Metrics

4. Risk Prevention Metrics

How mature are your program metrics?

Page 12: Software Security Metrics

Scenario #1 - Explain the Incident

• An incident occurs.

• “Check the box” metrics are implemented, stay the same, and fail to provide any useful information.

Page 13: Software Security Metrics

Scenario #2A - Vanity Metrics

• We have “metrics!”

• BUT – the AppSec Team cannot explain the impact of their effort. Executive Management cannot make decisions based on the information.

Page 14: Software Security Metrics

Scenario #2B - Lots of Effort, Little Reward

• AppSec counts a lot of things, shares those counts with some people, and calls them metrics.

• Executive Management asks, “so what?” and AppSec struggles to come up with a satisfactory response.

Page 15: Software Security Metrics

Scenario #2C - Charlie Brown Grown Up Speak

• Executive Management doesn’t understand what is being presented by AppSec

• AppSec earns a reputation for being wasteful or simply impossible to understand

Page 16: Software Security Metrics

Scenario #3 - Proactive Communication

• The AppSec Team explains AppSec in a way that is understood by Executive Management.

• The AppSec Team provides context for metrics and explains how to interpret the data, helping stakeholders to understand the intended message.

Page 17: Software Security Metrics

The Cigital Approach

1. Identify Risk Management Objectives
2. Take an Inventory of Current and Planned Activities
3. Define Key Metrics

Page 18: Software Security Metrics

Why Metrics?

• Execs (and customers, auditors, regulators, etc.) want to know about risk management.

• How do you talk about AppSec and risk management?

• Good software helps business.

• Bad software hurts business.

o We’re doing all of these things to make our software good and prevent it from being bad.

Page 19: Software Security Metrics

1. Identify Risk Management Objectives: Application Portfolio

We are not appropriately managing AppSec risk if we are not:

• Able to enumerate our current software portfolio
• Able to enumerate our deployed applications and databases
• Using an AppSec program to ensure, with sign-off, the appropriate security posture for every application
• Assigning a “risk designator” to every software asset, software project, software security defect, and data asset
• Managing risk across the entire portfolio
• Providing a complete risk picture for executive management

Page 20: Software Security Metrics

1. Identify Risk Management Objectives: SSDLC

We are not appropriately managing AppSec risk if we are not:

• Guiding every software project through a Secure SDLC
• Ensuring appropriate levels of defect discovery are applied
• Ensuring defects are documented and remediated, and variances are documented and tracked
• Tuning our Secure SDLC to reduce friction with engineering
• Moving efforts “left” in the Secure SDLC
• Analyzing the risk associated with hundreds of “medium” security defects in production
• Using threat and attack intelligence to continually improve

Page 21: Software Security Metrics

1. Identify Risk Management Objectives: Policies, Standards, and Outreach

We are not appropriately managing AppSec risk if we are not:

• Using a foundational governance structure of policies and standards

• Incorporating every stakeholder in the software security strategy

• Performing regular outreach to executives and to all stakeholders

• Ensuring all stakeholders have the appropriate level of training

Page 22: Software Security Metrics

1. Identify Risk Management Objectives: Context – Software Environment and Vendors

We are not appropriately managing AppSec risk if we are not:

• Ensuring all adjacent IT, information, and data security practices are sufficiently mature

• Establishing software security requirements with all software vendors

Page 23: Software Security Metrics

1. Identify Risk Management Objectives: Continuous Improvement

We are not appropriately managing AppSec risk if we are not:

• Aiming for a level of maturity beyond simple compliance with external drivers

• Using customized metrics and KPIs to chart ongoing progress

Page 24: Software Security Metrics

2. Create an Inventory of Current and Planned Activities

• Secure SDLC with Gates
• Satellite
• Metrics
• Portfolio Management
• Policy and Standards
• Vendor Management
• Defect Discovery: Design
• Defect Discovery: Fuzzing
• Defect Discovery: Penetration Testing
• Defect Discovery: Quality Assurance
• Defect Discovery: Code Review
• Defect Discovery: Research
• Defect Management
• Attack Intelligence
• Open Source Management
• Risk and Compliance
• Secure By Design
• AppSec Outreach
• Competency Management
• IT Operations

Page 25: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• What percentage of the applications in the portfolio have been reviewed and signed off, indicating an acceptable level of security?

• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type

Page 26: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• What percentage of software projects in the last 12 months have been reviewed and signed off, indicating an acceptable level of security?

• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type

Page 27: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• What percentage of software projects in the last 12 months did not go through the Secure SDLC?

• Per reason
• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type

Page 28: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• What percentage of software projects in the last 12 months have passed all software security checkpoints?

• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type

Page 29: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• What percentage of the applications have one or more open exceptions for not passing a Secure SDLC gate?

• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type

Page 30: Software Security Metrics

3. Define Key Metrics

• We are not appropriately managing AppSec risk if we are not guiding every software project through a Secure SDLC that determines whether the software is acceptably secure

• For each security checkpoint in the Secure SDLC, what is the average percentage of artifacts provided versus expected across all software projects in the last 12 months?

• Per risk ranking
• Per tech stack
• Per business unit
• Per software project type
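To make the “per risk ranking / per tech stack / per business unit” breakdowns on these slides concrete, here is a minimal sketch of how a sign-off coverage percentage could be grouped by any one of those dimensions. The project inventory, field names, and values are invented assumptions, not data from the talk.

```python
from collections import defaultdict

# Hypothetical project inventory; field names are invented for illustration.
projects = [
    {"name": "payments-api", "risk": "high",   "signed_off": True},
    {"name": "marketing-cms", "risk": "low",    "signed_off": True},
    {"name": "hr-portal",     "risk": "medium", "signed_off": False},
    {"name": "trade-engine",  "risk": "high",   "signed_off": False},
]

def signoff_pct_by(dimension, records):
    """Percentage of projects reviewed and signed off, grouped by a dimension."""
    totals, signed = defaultdict(int), defaultdict(int)
    for rec in records:
        key = rec[dimension]
        totals[key] += 1
        signed[key] += rec["signed_off"]
    return {key: 100.0 * signed[key] / totals[key] for key in totals}

print(signoff_pct_by("risk", projects))
# e.g. {'high': 50.0, 'low': 100.0, 'medium': 0.0}
```

The same function would answer the “per tech stack” or “per business unit” variants simply by adding those fields to the inventory records and passing a different dimension name.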

Page 31: Software Security Metrics

Evolve the Program, Evolve the Metrics

1. Identify Risk Management Objectives
2. Take an Inventory of Current and Planned Activities
3. Define Key Metrics

Page 32: Software Security Metrics

APPENDIX

Page 33: Software Security Metrics

1. Identify Risk Management Objectives

We may not be appropriately managing software security risk if we are not:

1. Using an SSI with a full-time SSG to ensure, with sign-off, the appropriate security posture for every application in the firm’s portfolio

2. Using a foundational governance structure of policies and standards and measuring adherence to their requirements

3. Able to enumerate their current software portfolio, including open source software

4. Able to enumerate their deployed applications and databases, including the various kinds of PII processed and stored

5. Performing regular outreach on management issues by the SSG to executives and on technical issues by the satellite to all stakeholders

6. Guiding every software project (whether in-house development, out-sourced development, or COTS acquisition) through a Secure SDLC (an SDLC with software security checkpoints) that determines whether the software is acceptably secure

7. Assigning a “risk designator” to every software asset (application risk ranking), software project (project impact assessment), software security defect (defect severity rating), and data asset (data classification label)

Page 34: Software Security Metrics

1. Identify Risk Management Objectives

We may not be appropriately managing software security risk if we are not:

8. Aiming for a level of maturity beyond simple compliance with external drivers

9. Managing risk across the portfolio rather than only managing budget by neglecting portions of the portfolio

10. Establishing software security requirements with all software vendors, including those whose software remotely processes sensitive data

11. Ensuring appropriate levels of defect discovery are applied to all software at required checkpoints and also periodically regardless of whether it’s been modified

12. Ensuring all software security defects are documented, all are remediated according to policy, and all variances are documented and tracked

13. Using threat and attack intelligence to continually improve the Secure SDLC and the portfolio

14. Providing a complete software portfolio risk picture for executive management

Page 35: Software Security Metrics

1. Identify Risk Management Objectives

We may not be appropriately managing software security risk if we are not:

15. Moving efforts “left” in the Secure SDLC to maximize prevention efforts

16. Analyzing the math associated with allowing dozens or even hundreds of “medium” security defects in production while dropping everything to fix one “high” defect

17. Tuning their Secure SDLC to both reduce friction with and work at the speeds required by engineering

18. Using customized metrics and KPIs to chart ongoing progress

19. Incorporating every stakeholder in the software security strategy

20. Ensuring all stakeholders have the appropriate level of software security training

21. Ensuring all adjacent IT, information, and data security practices are sufficiently mature to not undermine software security efforts

Page 36: Software Security Metrics

What makes a metric?

• Metric Name – a unique, descriptive name that humans can understand

• Description – a short narrative explaining the metric and its importance

• Intended Audience – names the stakeholders for whom the metric is being created

• Question Answered – Write out the exact question the metric answers, given that it may take several evolutions of the metric to fully answer the question or that it may actually be the trend line that answers the question

o The question will likely also evolve multiple times as the stakeholders get a handle on what’s actually important

• Component Measurements – Describe each of the metric’s component parts, including each associated data source

o Include any useful comments about the data and its collection, such as whether it’s manual or automated, whether it’s dependent upon a particular person, whether the data are reliable, whether special access is required, and so on

• Metric Calculation – Give the formula for combining the components to create the metric

o Many formulas may be as simple as “A over B”

Page 37: Software Security Metrics

What makes a metric?

• Update Cycle – Tell how often the metric is calculated

• Location – Tell where the metric can be found by those authorized to access it

• Expected Value Range – The acceptable upper and lower boundaries for the metric

o Upper Trigger Action – The action taken when the metric value rises above its upper boundary

o Lower Trigger Action – The action taken when the metric value falls below its lower boundary

• Expected Trend – Tell how the values are expected to move over time

o An upward trend may be good for some numbers and bad for others. There may be a need for upper and lower values and triggers specific to the trend.

• Targets – Note the metric values expected to be achieved at specific times, if any

Page 38: Software Security Metrics

What makes a metric?

• Benchmark – Describe any reference point used for comparison.

o This might be a similar metric from another firm, the same metric from some past time period (e.g., year-on-year), and so on

• Precision and Accuracy (optional) – Describe any known data capture issues in these areas.

o Although the expectation is that data capture is always 100% precise and 100% accurate, that often isn’t true. Document cases where it’s possible to precisely capture data known to be inaccurate and where it may not be possible to precisely capture the accurate data.

• Feedback Loop (optional) – Describes the periodic process by which a group judges the metric in terms of usefulness, accuracy, and so on, and directs efforts to make any required changes
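One way to keep these attributes consistent across metrics is to capture the template as a structured record. The sketch below is only an illustration: the field names mirror the slide headings and the sample values are invented, not a schema prescribed by the talk.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MetricDefinition:
    """Attributes of a metric definition, following the slides' template."""
    name: str                     # Metric Name
    description: str              # Description
    intended_audience: str        # Intended Audience
    question_answered: str        # Question Answered
    component_measurements: list  # Component Measurements (with data sources)
    calculation: str              # Metric Calculation (e.g., "A over B")
    update_cycle: str             # Update Cycle
    location: str                 # Location
    expected_value_range: tuple   # (lower boundary, upper boundary)
    upper_trigger_action: str     # action when the value rises above range
    lower_trigger_action: str     # action when the value falls below range
    expected_trend: str           # Expected Trend
    targets: Optional[str] = None
    benchmark: Optional[str] = None
    precision_and_accuracy: Optional[str] = None  # optional per the slides
    feedback_loop: Optional[str] = None           # optional per the slides

# Invented example values, for illustration only.
example = MetricDefinition(
    name="Static Analysis Participation",
    description="Share of released code scanned by static analysis.",
    intended_audience="Executive Management",
    question_answered="How much of our code is scanned before release?",
    component_measurements=["KLOC scanned (tool report)", "KLOC released (build system)"],
    calculation="KLOC scanned / KLOC released",
    update_cycle="Monthly",
    location="AppSec metrics dashboard",
    expected_value_range=(0.8, 1.0),
    upper_trigger_action="None",
    lower_trigger_action="Escalate to development leadership",
    expected_trend="Upward toward 100%",
)
```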

Page 39: Software Security Metrics

My Fitness Pal (iPhone App)

• I ask questions and make decisions about my health every day

What should I eat for breakfast? How much? How often? What kind of exercise should I do? For what length of time? How often?

• I can change my behavior by setting goals and measuring progress

SMART goals: Specific, Measurable, Actionable, Reasonable, Time-based

Page 40: Software Security Metrics

Vocabulary

• Measurement vs. Metric – what’s the difference?

o It is 67 degrees Fahrenheit in San Francisco
o I had 2 cups of coffee this morning

Page 41: Software Security Metrics

The Cigital Approach

1. Identify Objectives

Firm: Publicly owned firm generates revenue primarily through 10 Internet-facing web applications.

Audience: Executives

SSI Objectives:
• Achieve a defensible level of “due care” as expected by various groups such as shareholders, the Board of Directors, regulators, law enforcement, and the public.
• Do not allow into production bugs for which well-known automated attacks exist.

Page 42: Software Security Metrics

The Cigital Approach

2. Create an Inventory

Data from SSI Inventory:

• The SSG uses static analysis during development and penetration testing during QA to check for bugs for which well-known automated attacks exist. The SSG has deployed a commercial static analysis tool with appropriate rules enabled and runs the tool on applications during development.

• The SSG relies on external penetration testers who, as part of their penetration testing service, use a commercial tool that performs dynamic scanning to discover ~40 common vulnerabilities.

• The SSG has issued a policy that states development teams must fix any exploitable software security bug discovered by an automated commercial tool prior to the code going to production.

Page 43: Software Security Metrics

The Cigital Approach

3. Define Key Metrics

SSG Communication Objectives:

• Coverage: The scope of the SSI is all 10 Internet-facing web applications; however, only eight are undergoing penetration testing during quality assurance and only five currently receive static analysis during development. The SSG wants to increase the Executive understanding of coverage for these software security activities.

• Policy Variance: Some application teams comply with the stated policy while others do not. The SSG wants to increase the Executive understanding of policy compliance by the application teams.

• Effectiveness: Depending on the level of coverage and policy compliance for each application team, the effectiveness of the software security controls will vary. The SSG wants to compare the effectiveness of the controls across the application portfolio by looking at software security bugs discovered by an automated commercial tool that are found versus fixed, as well as which are discovered post-production during a software security incident.

Page 44: Software Security Metrics

The Cigital Approach

3. Define Key Metrics

Questions:

• What is the level of software security testing coverage across the revenue generating web applications?

• Which application teams comply with the software security policy?

• How effective is the software security testing and defect management capability, given the various levels of coverage and compliance?

Page 45: Software Security Metrics

The Cigital Approach

3. Define Key Metrics

Static Analysis Effectiveness
What is the effectiveness of preventing critical severity defects found through Fortify static analysis from going into production?
Tool Efficacy = # Critical Severity Defects Fixed / # Critical Severity Defects Reported

Ineffectiveness Indicator
What percentage of software security defects found in production were also found prior to production (but not addressed)?
Ineffectiveness Indicator = # Software Security Defects found in production that were also found prior to production / Total # Software Security Defects found in production

Page 46: Software Security Metrics

The Cigital Approach

3. Define Key Metrics

Penetration Testing Participation
Are revenue-generating development teams employing penetration testing to discover risks?
Penetration Testing Participation [for time period] [by business unit] = # Applications Pen Tested / # Total Applications

Static Analysis (Fortify) Participation
How many Fake Firm developers are using Fortify to scan their code for security defects?
% Code Scanned by Static Analysis = KLOC scanned / KLOC released

Software Security Defect Density in Production
What is the density of open exploitable critical, high, and medium severity software security defects discovered by any automated commercial tool and allowed to go to production?
Software Security Defect Density in Production [by application] = # Open Exploitable Critical, High, and Medium Severity Defects / KLOC
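The ratios defined on the last two slides reduce to simple divisions once the underlying counts are pulled from the defect tracker, scan reports, and build system. The sketch below uses invented counts (none of these numbers come from the talk) purely to show the calculations side by side.

```python
# Hypothetical counts, for illustration only.
critical_defects_reported = 40      # critical findings from static analysis (e.g., Fortify)
critical_defects_fixed = 34         # of those, fixed before going to production
prod_defects_total = 10             # security defects discovered in production
prod_defects_previously_found = 3   # of those, also found (but not fixed) pre-production
apps_pen_tested, apps_total = 8, 10
kloc_scanned, kloc_released = 450, 600
open_exploitable_defects, app_kloc = 12, 120  # per-application defect count and size

tool_efficacy = critical_defects_fixed / critical_defects_reported
ineffectiveness_indicator = prod_defects_previously_found / prod_defects_total
pen_test_participation = apps_pen_tested / apps_total
pct_code_scanned = kloc_scanned / kloc_released
defect_density = open_exploitable_defects / app_kloc  # defects per KLOC

print(f"Tool efficacy:             {tool_efficacy:.0%}")
print(f"Ineffectiveness indicator: {ineffectiveness_indicator:.0%}")
print(f"Pen test participation:    {pen_test_participation:.0%}")
print(f"Code scanned:              {pct_code_scanned:.0%}")
print(f"Defect density (prod):     {defect_density:.2f} per KLOC")
```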

Page 47: Software Security Metrics

Security Metrics Phases

Phase 1
• Software security objectives
• Activity inventory
• Key metrics definitions
• Guidance on visualization and outreach

Phase 2
• Identify process flows, data sources, and owners for each
• Obtain data and automate data collection
• Populate metrics
• Visualize metrics

Phase 3
• Communicate metrics to the right people at the right time with the right visualizations

Page 48: Software Security Metrics

Security Metrics Phases

Phase 1 Phase 2 Phase 3

Option 1: Already have metrics, need help to get to the next level

Option 2: Don’t have metrics, want to get started quickly

This approach works for firms that already have metrics, and for firms that don’t.

Page 49: Software Security Metrics

Security Metrics Deliverables: Phase 1

The deliverable for Phase 1 is a final report, including the items below:

Executive Summary – High-level overview of project goals, approach, and conclusions.

Methodology Overview – Description of Cigital’s interview and artifact review-driven software security metrics development process.

Software Security Context & Objectives – Based on client interviews, a description of the client’s unique external, internal, and organizational context and goals for the software security program.

Key Metrics Definitions – # recommended metrics definitions customized to the client’s unique risk management view, current software security activities, and planned software security activities.

Metrics Template – List and descriptions of the 14 attributes of a mature and comprehensive metrics definition.

Page 50: Software Security Metrics

Sample Schedule: Phase 1

Day 1 Day 2 Day 3 Day 4 Day 5

Days 1 and 2 (on-site)

• Cigital consultants go on-site to deliver software security instructor-led training (security metrics theory)

• Data gathering is performed via interviews and artifact review to identify software security objectives and inventory software security activities.

Days 3, 4, and 5 (remote)

• Cigital consultants perform analysis and develop an initial draft of the customized key metrics definitions.

• A detailed review of the initial key metrics definitions is conducted with the client and feedback is obtained.

• Client feedback is incorporated into a final report and presented in a read-out meeting.

• Each metrics engagement is unique and should be scoped individually.
• A Phase 1 metrics engagement will require a minimum of 1 week of effort.
• Clients can increase the depth and breadth of a Phase 1 engagement by scheduling additional weeks of effort.

Page 51: Software Security Metrics

Security Metrics Deliverables: Phase 2

The deliverable for a Phase 2 engagement is a specification document and a deployed technology solution, including the items below:

* A Phase 2 metrics engagement assumes that Phase 1 has already been completed, either by Cigital or by the client.

Process Flow (Document) – Based on client security metrics definitions and interviews, a description and diagram of current security activity process flows and supporting data sources with identified owners.

Architecture and API Specification (Document) – Recommended architecture and API specifications for data collection and key metrics implementation.

Stakeholder Roles and Responsibilities (Document) – A description of roles and responsibilities which will be required of the process and data source owners to support on-going data collection, metrics calculations, and dashboard population.

Deployed Technology Solution – With support from the necessary client stakeholders, Cigital will build, test, and deploy the solution as described in the specification document.

Page 52: Software Security Metrics

Sample Schedule: Phase 2

Week 1 Week 2 Week 3 Week 4

Weeks 1 and 2

• Review client security metrics definitions and create detailed documentation of relevant security activity process flows and data sources.

• Identify owners for process flows and source data systems.

• Define architecture and API specification for metrics implementation

Weeks 3 and 4

• Define roles and responsibilities for process and data source owners to support on-going data collection, metrics calculation, and dashboard population.

• Secure stakeholder buy-in for solution implementation

• Build, test, and deploy the solution

• Each metrics engagement is unique and should be scoped individually.
• The Phase 2 schedule will be highly dependent on the complexity of the chosen client metrics and security activity processes.

Page 53: Software Security Metrics

Security Metrics Deliverables: Phase 3

The deliverable for a Phase 3 engagement is a final report and a presentation, including the items below:

* A Phase 3 metrics engagement assumes that Phases 1 and 2 have already been completed, either by Cigital or by the client.

Metrics Narratives and Visualizations (Presentation) – Cigital will create a custom presentation including the firm’s software security metrics, contextual narratives for each, and visualizations to meaningfully display the data.

Stakeholders, Objections, and Responses (Report) – A list and detailed description of the client’s software security stakeholders – the metrics recipients – along with a set of customized potential questions from stakeholders in response to the metrics and recommended responses for the client to use in objection handling.

Page 54: Software Security Metrics

Sample Schedule: Phase 3

Day 1 Day 2 Day 3 Day 4 Day 5

Days 1 and 2

• Cigital will conduct interviews to understand what the SSI owner is trying to achieve with the SSI and how the metrics and context shared around those metrics tell that story.

• Cigital will also lead an interactive discussion with the SSG on the roles and perspectives of various metrics recipients (software security stakeholders).

• If applicable, Cigital consultants will conduct interviews with software security stakeholders (metrics recipients) to obtain a first hand perspective on software security metrics and communications received to date.

Days 3, 4, and 5

• Cigital consultants perform analysis and develop an initial draft of the custom metrics presentation.

• Cigital consultants perform analysis and develop an initial draft of the potential questions from stakeholders and recommended responses.

• A detailed review of the presentation and report is conducted with the client and feedback is obtained.

• Client feedback is incorporated into a final report and presentation. Cigital presents the final report and presentation in a read-out meeting.

• Each metrics engagement is unique and should be scoped individually.

• A Phase 3 metrics engagement will typically require 1 week of effort.

• Clients can increase the depth and breadth of a Phase 3 engagement by scheduling additional weeks of effort.

Page 55: Software Security Metrics

1. Identify Risk Management Objectives: Firm-Specific Context

• Existing and planned software security processes
• Existing definitions for data classification levels, application risk classification levels, development project impact levels, security defect severity levels, technology stacks
• External environmental context for the SSI – e.g. regulatory or contractual requirements, legal precedents in standards of due care, customer demands, or market drivers
• Internal environmental context for the SSI – e.g. related business objectives, culture, how decisions are made, how projects are funded, how values are embedded and objectives are communicated
• The firm’s risk tolerance and the factors that affect risk tolerance
• Role and value of software in the organization
• Structure of software in the organization – how software is developed, acquired, deployed
• Application portfolio inventory and status
• Purpose, impact, or desired outcome of the SSI – e.g. compliance, improvement, marketing discriminator

Page 56: Software Security Metrics

One Dozen Software Security Metrics

1. Application Portfolio Visibility
• What parts of the application portfolio do we have visibility into from a security perspective?

2. Application Portfolio Risk
• What parts of the application portfolio have the highest risk?

3. Testing Frequency by Risk Level
• How frequently do apps at different risk levels undergo security testing?

Page 57: Software Security Metrics

One Dozen Software Security Metrics

4. Defect Discovery Participation
• Are teams employing [defect discovery method] to discover risks?

5. Defect Density by Risk Level
• What is the density of open critical severity defects by risk level?

6. Defect Density by Tech Type
• What is the density of open critical severity defects by technology type?

7. Defect Management Effectiveness
• How many of the critical defects found actually get fixed?

8. Defect Remediation Timeframes
• What percentage of defects found are fixed within an appropriate amount of time?
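As a worked illustration of metric #8 above (Defect Remediation Timeframes), the sketch below computes the share of closed defects fixed within a policy-defined window. The SLA values and defect records are hypothetical, and the severity labels are assumptions for the example.

```python
from datetime import date

# Hypothetical remediation policy: days allowed to fix, by severity.
SLA_DAYS = {"critical": 30, "high": 60, "medium": 90}

# Hypothetical closed-defect records pulled from a defect tracker.
defects = [
    {"severity": "critical", "opened": date(2016, 1, 4),  "fixed": date(2016, 1, 20)},
    {"severity": "high",     "opened": date(2016, 1, 10), "fixed": date(2016, 4, 1)},
    {"severity": "medium",   "opened": date(2016, 2, 1),  "fixed": date(2016, 3, 15)},
]

# Count defects whose fix time fell within the window for their severity.
within_sla = sum(
    (d["fixed"] - d["opened"]).days <= SLA_DAYS[d["severity"]] for d in defects
)
print(f"Fixed within SLA: {within_sla}/{len(defects)} "
      f"({100.0 * within_sla / len(defects):.0f}%)")
```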

Page 58: Software Security Metrics

One Dozen Software Security Metrics

9. SSDLC (Secure Software Development Lifecycle) Gates
• What percentage of software development projects pass all required security gates?

10. Compliance Approval
• How much of the app portfolio has been reviewed for compliance and approved?

11. Software Vendor Security
• How many of the software vendors have been reviewed for security and approved?

12. Competency Management
• How many software developers have taken software security training in the past year?