Web Applications Under Siege: Defending Against Attack Outbreaks
TRANSCRIPT
Amichai Shulman, CTO, Imperva
Agenda
Introduction to our Hacker Intelligence Initiative (HII) and Web Application Attack Report (WAAR)
Taking a new approach
Analyzing real-life attack traffic
+ Key findings
+ Take-aways
Summary of recommendations
Amichai Shulman – CTO, Imperva
One of InfoWorld's "Top 25 CTOs"
Speaker at industry events
+ RSA, Sybase Techwave, Info Security UK, Black Hat
Lecturer on information security
+ Technion – Israel Institute of Technology
Former security consultant to banks & financial services firms
Leads the Application Defense Center (ADC)
+ Discovered over 20 commercial application vulnerabilities – credited by Oracle, MS-SQL, IBM and others
CONFIDENTIAL
Introduction to HII and WAAR
Hacker Intelligence Initiative is focused on understanding how attackers are operating in practice
+ A different approach from vulnerability research
Data set composition
+ ~50 real-world applications
+ Anonymous proxies
+ More than 18 months of data
Powerful analysis system
+ Combines analytic tools with drill-down capabilities
HII - Hacker Intelligence Initiative
HII - Motivation
Focus on actual threats
+ Focus on what hackers want, helping good guys prioritize
+ Technical insight into hacker activity
+ Business trends of hacker activity
+ Future directions of hacker activity
Eliminate uncertainties
+ Active attack sources
+ Explicit attack vectors
+ Spam content
Devise new defenses based on real data
+ Reduce guesswork
HII Reports
Monthly reports based on data collection and analysis
Drill down into specific incidents or attack types
2011 / 2012 reports
+ Remote File Inclusion
+ Search Engine Poisoning
+ The Convergence of Google and Bots
+ Anatomy of a SQLi Attack
+ Hacker Forums Statistics
+ Automated Hacking
+ Password Worst Practices
+ Dissecting Hacktivist Attacks
+ CAPTCHA Analysis
WAAR – Web Application Attack Report
Semi-annual, based on aggregated analysis of 6 / 12 months of data
Motivation
+ Pick up trends
+ High-level take-outs
+ Create comparative measurements over time
Download reports: WAAR Edition #1, WAAR Edition #2, WAAR Edition #3
Taking a New Approach
Retrospective
Assumptions
+ Attack requests are more or less evenly spread over time
+ Applications are more or less similar
Method
+ Count and analyze individual requests
+ Look at averages over time / application
Consequence
+ "An application experiences an attack every other minute"
Contemplation
Observations
+ Attack traffic has a bursty nature
+ Applications in our data set show some outliers
Reflections
+ Do organizations really need to handle an alert every two minutes?
+ Can organizations handle a steady stream of evenly distributed attacks?
Resolution
Abandon individual requests and look at incidents
+ 30 requests (or more) within 5 minutes
+ Characterized by intensity and durability
Further aggregate incidents into "battle days"
+ A day that includes at least one incident
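This aggregation rule can be sketched in code. The version below is a minimal reading of the "30 requests within 5 minutes" definition using fixed 5-minute buckets; the report's actual windowing may differ, and all function names and thresholds here are illustrative:

```python
from collections import Counter
from datetime import timedelta

BUCKET = timedelta(minutes=5)
THRESHOLD = 30  # requests per bucket that qualify as an incident

def find_incidents(timestamps):
    """Return start times of 5-minute buckets holding >= 30 requests."""
    base = min(timestamps)
    # Floor-divide each offset by the bucket width to get a bucket index.
    counts = Counter((ts - base) // BUCKET for ts in timestamps)
    return sorted(base + n * BUCKET for n, c in counts.items() if c >= THRESHOLD)

def battle_days(incident_starts):
    """Calendar days containing at least one incident."""
    return {t.date() for t in incident_starts}
```

A burst of 40 requests in a few minutes becomes a single incident, while the same 40 requests spread over a day produce none, matching the shift from counting requests to counting incidents.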
Resolution (cont.)
"Then there is the man who drowned crossing a stream with an average depth of six inches." – W.I.E. Gates
+ Distribution of web attacks is asymmetric and includes rare, yet extremely meaningful, outliers
+ Security professionals who prepare for the "average case" will be overwhelmed by the intensity of incidents when they actually happen
+ We shifted away from average into other measures like median and quartiles
+ Use Box & Whisker charts to display data – Express dispersion and skewness
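The shift from mean to median and quartiles can be illustrated with Python's standard library; the daily incident counts below are invented for illustration, not taken from the report's data set:

```python
import statistics

# Hypothetical daily incident counts for one application, including a
# single extreme burst day (45) of the kind the report describes.
daily_incidents = [0, 1, 0, 2, 1, 0, 3, 1, 45, 0, 2, 1, 0, 1, 2]

mean = statistics.mean(daily_incidents)
median = statistics.median(daily_incidents)
# quantiles(..., n=4) returns the three quartile cut points (Q1, Q2, Q3).
q1, q2, q3 = statistics.quantiles(daily_incidents, n=4)

# The one outlier drags the mean far above the median, which is why
# median and quartiles describe this skewed distribution better.
print(f"mean={mean:.2f} median={median} Q1={q1} Q3={q3}")
```

With this sample, the mean (~3.9) is almost four times the median (1), exactly the "average depth" trap the quote warns about.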
Box and Whisker
[Chart: box-and-whisker plot — box from the 25th to the 75th percentile with the median marked; whiskers at the 5th and 95th percentiles]
CONFIDENTIAL
Data Analysis
Goals
Frequency
+ How many incidents / battle days per time frame
Persistency
+ Duration of incidents
Magnitude
+ Volume of traffic involved in an incident / battle day
Predictability
+ Can one predict the timing of the next incident from the timing of past incidents?
Overview
Metric                                    Typical (median)   Worst-case (max)
Battle days (over a 6-month period)       59                 141
Incidents (over a 6-month period)         137                1,383
Incident magnitude (requests/incident)    195                8,790
Incident duration (minutes)               7.7                79
Overview – Frequency
An incident is expected every 3rd day
Some applications are attacked almost every day
A battle day usually includes more than a single attack
Expected frequency affects the resources an organization needs to allocate on a constant basis for handling attacks
Overview – Frequency
Take-away #1: Find out your expected attack frequency
Overview - Magnitude
Typical case is ~200 requests
Average is 1 request every 2 minutes
Worst case is more than 400 times that number
Affects the size of equipment an organization needs for handling attacks
Affects the capabilities required for handling incidents
+ Aggregation and summary
+ Quickly take action based on the summary
Overview - Magnitude
Take-away #2: The baseline for scaling should be typical numbers. Aim for the 3rd quartile.
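One hedged way to apply this take-away is to size capacity from the 3rd quartile of observed incident magnitudes rather than from the mean; the sample figures below are invented, not from the report:

```python
import statistics

# Hypothetical requests-per-incident samples; a real deployment would
# draw these from its own incident logs.
incident_magnitudes = [120, 150, 180, 195, 210, 240, 300, 650, 8790]

q1, q2, q3 = statistics.quantiles(incident_magnitudes, n=4)
# Scale for the 3rd quartile rather than the mean, which the single
# extreme outlier (8790) inflates beyond any sensible baseline.
baseline = q3
print(f"capacity baseline: {baseline:.0f} requests per incident")
```

The design point is that the 3rd quartile covers the typical bad day without forcing permanent provisioning for the rare worst case, which is better handled by the automated-response procedures discussed later.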
Granular Comparison - Frequency
[Chart: number of incidents (0–350) by attack type — SQLi, RFI, LFI, DT, XSS, HTTP]
Granular Comparison - Frequency
SQL injection is the most prevalent attack type
+ As opposed to the previous edition, which showed XSS and DT
RFI attacks are much more common than indicated by just looking at the number of requests
Outliers indicate that some applications are heavily targeted by a specific type of attack
– SQLi
– HTTP (malformed requests of various types)
– DT
Granular Comparison - Frequency
Take-away #3: Attackers will try attacks with better potential benefit, regardless of vulnerability assessment results.
Granular Comparison – Frequency – Battle Days
[Chart: number of battle days in 6 months (0–80) by attack type — SQLi, RFI, LFI, DT, XSS, HTTP]
Granular Comparison - Magnitude
[Chart: requests per incident (0–1600) by attack type — SQLi, RFI, LFI, DT, XSS, HTTP]
Granular Comparison - Intensity
LFI is typically the most intensive attack
RFI attacks tend to be more intensive than DT and SQLi
Incidents are usually in the low hundreds of requests, with extreme cases in the low thousands
Granular Comparison - Intensity
Take-away #4: Make sure your solution tackles SQL injection and RFI at large scales.
Granular Comparison - Persistence
[Chart: minutes per incident (0–40) by attack type — SQLi, RFI, LFI, DT, XSS, HTTP]
Granular Comparison - Persistence
The majority of attacks are short
+ No more than 15 minutes
+ Usually below 10 minutes
DT attacks tend to last longer, while XSS attacks tend to be shorter
Figures suggest that attack type does not affect the intensity (requests per second) of attacks
+ LFI seems to have a higher tendency toward intense incidents (higher magnitude with lower persistence)
This supports our assumption about the bursty nature of attack traffic
Granular Comparison - Persistence
Take-away #5: There is no time to analyze individual requests and attack vectors during an ongoing attack.
Worst Case Analysis
                                      SQLi      RFI      LFI     DT      XSS
Magnitude (requests)                  359,390   35,276   3,941   8,197   16,222
Intensity (requests per minute)       543.2     742.2    418.4   378.0   455.4
Intensity (requests per battle day)   359,465   41,495   8,343   11,549  21,113
Trending – A Single Application View
[Chart: number of attacks per week, by type (SQLi, RFI, LFI, DT, XSS), for a single application, 05/06/2011 – 03/06/2012]
Trending – A Single Application View
The bursty nature of attacks clearly shows in this graph
Extreme attack load during January
The second half (even without the January burst) shows more attacks than the first half (576 vs. 322)
This trend also holds for general malformed HTTP requests
+ Empirical evidence of the correlation between malformed HTTP traffic and attacks
Predictability - Goals
Try to predict the timing of the next attack / battle day based on the history of attacks / battle days
We've shown that if an application faces an incident during a specific day, it is likely to experience more incidents that same day
+ Probably due to being part of a list distributed to attack bots
+ Maybe due to a change that made it pop up on the to-do list of attack bots
Being able to predict would affect the ability to effectively allocate resources
Predictability - Method
Looked for linear prediction between battle days
Used the Auto-Correlation Function (ACF)
We employed Wessa, a freely available online service that performs auto-correlation
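The same analysis can be sketched with a hand-rolled autocorrelation function standing in for the Wessa service; the weekly test series below is invented to show what a periodic signal (such as a scheduled scan) would look like:

```python
import statistics

def acf(series, max_lag):
    """Sample autocorrelation of `series` for lags 1..max_lag."""
    n = len(series)
    mean = statistics.mean(series)
    var = sum((x - mean) ** 2 for x in series)
    out = []
    for lag in range(1, max_lag + 1):
        # Covariance of the series with a lag-shifted copy of itself,
        # normalized by the overall variance.
        cov = sum((series[i] - mean) * (series[i + lag] - mean)
                  for i in range(n - lag))
        out.append(cov / var)
    return out
```

For irregular battle-day data the ACF stays near zero at every lag (no apparent correlation), whereas a strictly weekly pattern produces a dominant spike at lag 7 — the signature behind the "unreported, periodic vulnerability scan" finding below.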
Predictability - Results
No apparent correlation over a simple time gap
Predictability - Results
Unreported, periodic vulnerability scan
Summary – Previous Advice Still Holds True
Acquire intelligence on malicious sources and apply it in real time.
Detect known vulnerability attacks.
Deploy security solutions that deter automated attacks.
Participate in a security community and share data on attacks.
Summary – The Bursty Nature of Attacks
Deploy for the right scale – don't be fooled by "average" good weather
Automated response procedures – for when the attack volume is too high
Aggregate and summarize data in real time – too many individual attacks to examine one by one
Be prepared – bursts are unpredictable; test your team's readiness
Usage Audit
Access Control
Rights Management
Attack Protection
Reputation Controls
Virtual Patching
Imperva: Our Story in 60 Seconds
Webinar Materials
Join the Imperva LinkedIn Group, Imperva Data Security Direct, for:
+ Post-webinar discussions
+ Answers to attendee questions
+ Webinar recording link
www.imperva.com