

PC Anti-Virus Protection 2013

A DYNAMIC ANTI-MALWARE COMPARISON TEST

Dennis Technology Labs, 03/08/2012
www.DennisTechnologyLabs.com

This report aims to compare the effectiveness of Symantec’s Norton Internet Security 2013 product with competing products available from other well-known security companies.

The products were exposed to internet threats that were live during the test period. This exposure was carried out in a realistic way, closely reflecting a customer’s experience.

For example, each test system visited real, infected websites that significant numbers of internet users were encountering at the time of the test. These results reflect what would have happened if those users were using one of the products tested.

EXECUTIVE SUMMARY

Products tested

AVG Internet Security 2012
Avira Internet Security 2012
Avast! Internet Security 7
BitDefender Internet Security 2013
ESET Smart Security 5
F-Secure Internet Security 2012
Kaspersky Internet Security 2012
Microsoft Security Essentials
McAfee Total Protection 2012
Panda Internet Security 2012
Symantec Norton Internet Security 2013
Trend Micro Titanium Maximum Security 2012
Webroot SecureAnywhere Complete 2012

The effectiveness of premium anti-malware security suites varies wildly in the paid-for market.
Nearly every product was compromised at least once. The most effective were compromised just once or not at all, while the least effective were compromised by one third of the threats, despite the relatively small sample size.

Blocking malicious sites based on reputation is an effective approach.
Those products that prevented users from visiting the malicious sites in the first place gained a significant advantage. If the malware can’t download onto the victim’s computer then the anti-malware software faces less of an ongoing challenge.

Some anti-malware programs are too harsh when evaluating legitimate software.
All of the software generated at least one false positive. Avira Internet Security 2012 was by far the least accurate in this respect, blocking the majority of legitimate applications.

Which was the best product?
The most accurate programs were Symantec Norton Internet Security 2013; BitDefender Internet Security 2013; Kaspersky Internet Security 2012; ESET Smart Security 5; and Trend Micro Titanium Maximum Security 2012. Symantec’s product stands at the very top of this distinct group, having protected against all of the threats while generating just one false positive result. BitDefender’s product comes a very close second.

Simon Edwards, Dennis Technology Labs


PC Anti-Virus Protection 2013 Page 2 of 21

CONTENTS

Executive summary ..................................................................................................................................................................... 1

Contents ........................................................................................................................................................................................ 2

1. Total Accuracy Ratings........................................................................................................................................................... 3

2. Protection Ratings ................................................................................................................................................................... 5

3. Protection Scores..................................................................................................................................................................... 7

4. Protection Details.................................................................................................................................................................... 9

5. False Positives ........................................................................................................................................................................ 11

6. The Tests ................................................................................................................................................................................ 15

7. Test Details............................................................................................................................................................................. 16

8. Conclusions ............................................................................................................................................................................ 19

Appendix A: Terms Used......................................................................................................................................................... 20

Appendix B: Terms of the test ................................................................................................................................................ 21


1. TOTAL ACCURACY RATINGS

The total accuracy ratings provide a way to judge how effectively the security programs work by looking at a single graph. They take into account how accurately the programs treated threats and handled legitimate software.

We believe that anti-malware software should not just detect threats. It should allow legitimate software to run unhindered as well.

The total accuracy ratings take into account successes and failures with both malware and legitimate applications.

We ran two distinct tests: one that measured how the products handled internet threats and one that measured how they handled legitimate programs.

When a product fails to protect the system against a threat it is compromised. When it warns against, or even blocks, legitimate software then it generates a ‘false positive’ result.

Products gain points for stopping threats successfully and for allowing users to install and run legitimate software. Products lose points for failing to stop threats and when they handle legitimate files incorrectly.

Each product then receives a final rating based on its performance in each of the ‘threat’ and ‘legitimate software’ tests.

The following results show a combined accuracy rating, taking into account each product’s performance with both threats and non-malicious software. There is a maximum possible score of 150 and a minimum of -350.

See 5. False Positives for detailed results and an explanation of how the false positive ratings are calculated.
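The report does not spell out how the two tests are combined, but the published figures are consistent with the total accuracy rating simply being the sum of the protection rating (section 2, range -100 to 100) and the false positive rating (section 5.5, range -250 to 50). A minimal sketch of that assumed calculation, checked against the tables in this report:

```python
# Assumption (not stated explicitly in the report): the total accuracy
# rating is the sum of the protection rating and the false positive
# rating. The figures below are taken from the published tables.

def total_accuracy(protection_rating, false_positive_rating):
    # Maximum: 100 + 50 = 150; minimum: -100 + (-250) = -350,
    # matching the score range quoted in this section.
    return protection_rating + false_positive_rating

# Cross-checks against the Total Accuracy Ratings table:
assert total_accuracy(99, 49.5) == 148.5    # Norton Internet Security 2013
assert total_accuracy(94, 49.75) == 143.75  # BitDefender Internet Security 2013
assert total_accuracy(95, 46) == 141        # Kaspersky Internet Security 2012
```

Every row in the Total Accuracy Ratings table matches this sum, which supports the assumption.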

[Chart: Total Accuracy ratings by product, on a scale of -10 to 150; the figures are listed in the table below.]


TOTAL ACCURACY RATINGS

Product Total Accuracy Rating

Norton Internet Security 2013 148.5

BitDefender Internet Security 2013 143.75

Kaspersky Internet Security 2012 141

ESET Smart Security 5 140.5

Trend Micro Titanium Maximum Security 2012 135.5

AVG Internet Security 2012 132.6

Webroot SecureAnywhere Complete 2012 131.5

F-Secure Internet Security 2012 131

Avast! Internet Security 7 127.95

Panda Internet Security 2012 122.5

McAfee Total Protection 2012 112

Microsoft Security Essentials 86.5

Avira Internet Security 2012 63.4


2. PROTECTION RATINGS

The following results show how each product has been scored for its accuracy in detecting and handling malware only. They do not take into account false positives.

We awarded two points for defending against a threat, one for neutralizing it and deducted two points every time a product allowed the system to be compromised. The best possible score is 100 and the worst is -100.

The reason behind this score weighting is to give credit to products that deny malware an opportunity to tamper with the system and to penalize those that allow malware to damage it.

With protection ratings we award products extra points for completely blocking a threat, while removing points when they are compromised by a threat.

[Chart: Protection Ratings by product, on a scale of 0 to 100; the figures are listed in the table below.]


It is quite possible that a compromised system will be made unstable, or even unusable without expert knowledge. Even if active malware was removed, we considered such damaged systems to count as being compromised.

Symantec Norton Internet Security 2013 defended against 49 out of the 50 threats. It gains double points for each defense (2x49), totaling 98. It neutralized one threat (1x1), gaining one further point and bringing the total to 99.

ESET Smart Security 5 scored lower because its results included one compromise. It defended 49 times and was compromised once. Its score is calculated like this: (2x49) + (-2x1) = 96.
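The arithmetic in these examples can be expressed compactly. The sketch below applies the score weighting defined above (+2 per defense, +1 per neutralization, -2 per compromise) and checks it against the published figures:

```python
def protection_rating(defended, neutralized, compromised):
    # Score weighting from section 2: with 50 threats the best
    # possible score is 100 and the worst is -100.
    return 2 * defended + 1 * neutralized - 2 * compromised

# Cross-checks against the Protection Details and Protection Ratings tables:
assert protection_rating(49, 1, 0) == 99   # Norton Internet Security 2013
assert protection_rating(49, 0, 1) == 96   # ESET Smart Security 5
assert protection_rating(47, 2, 1) == 94   # BitDefender Internet Security 2013
```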

PROTECTION RATINGS

Product Protection Rating

Norton Internet Security 2013 99

ESET Smart Security 5 96

Kaspersky Internet Security 2012 95

Trend Micro Titanium Maximum Security 2012 95

BitDefender Internet Security 2013 94

AVG Internet Security 2012 86

Webroot SecureAnywhere Complete 2012 86

F-Secure Internet Security 2012 83

Avast! Internet Security 7 81

Panda Internet Security 2012 73

McAfee Total Protection 2012 65

Avira Internet Security 2012 60

Microsoft Security Essentials 37


3. PROTECTION SCORES

The following illustrates the general level of protection provided by each of the security products, combining the defended and neutralized incidents.

These figures illustrate how many times the systems were protected by a defense or neutralization.

They are not weighted with an arbitrary scoring system as in 1. Total Accuracy Ratings and 2. Protection Ratings.

The average protection level afforded by the tested products was 92 per cent.

The protection scores simply indicate how many times each product prevented a threat from compromising the system.

[Chart: Protection Scores by product, out of 50; the figures are listed in the table below.]


PROTECTION SCORES

Product Protected Score

Norton Internet Security 2013 50 100%

BitDefender Internet Security 2013 49 98%

ESET Smart Security 5 49 98%

Kaspersky Internet Security 2012 49 98%

Trend Micro Titanium Maximum Security 2012 49 98%

AVG Internet Security 2012 47 94%

Webroot SecureAnywhere Complete 2012 47 94%

Avast! Internet Security 7 46 92%

F-Secure Internet Security 2012 46 92%

Panda Internet Security 2012 45 90%

McAfee Total Protection 2012 42 84%

Avira Internet Security 2012 41 82%

Microsoft Security Essentials 36 72%

(Average: 92 per cent)
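Unlike the weighted ratings, these protection scores are a straight count. A small sketch of that calculation (the tuple of count and percentage mirrors the two columns in the table above):

```python
def protection_score(defended, neutralized, threats=50):
    # A system counts as protected whether the threat was defended
    # against or neutralized; the percentage is taken over all
    # 50 threats used in the test.
    protected = defended + neutralized
    return protected, 100 * protected // threats

# Cross-checks against the Protection Scores and Protection Details tables:
assert protection_score(49, 1) == (50, 100)  # Norton Internet Security 2013
assert protection_score(29, 7) == (36, 72)   # Microsoft Security Essentials
```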


4. PROTECTION DETAILS

The security products provided different levels of protection. When a product defended against a threat, it prevented the malware from gaining a foothold on the target system. A threat might have been able to exploit or infect the system and, in some cases, the product neutralized it either after the exploit ran or later. When it couldn’t, the system was compromised.

The products are ordered according to their Defended and Compromised results, and then alphabetically. For overall protection scores see 3. Protection Scores on page 7.

[Chart: Protection Details by product, stacked to 50: Sum Defended, Sum Neutralized and Sum Compromised; the figures are listed in the table below.]


PROTECTION DETAILS

Product Sum Defended Sum Neutralized Sum Compromised

Norton Internet Security 2013 49 1 0

ESET Smart Security 5 49 0 1

Kaspersky Internet Security 2012 48 1 1

Trend Micro Titanium Maximum Security 2012 48 1 1

BitDefender Internet Security 2013 47 2 1

AVG Internet Security 2012 45 2 3

Webroot SecureAnywhere Complete 2012 45 2 3

F-Secure Internet Security 2012 45 1 4

Avast! Internet Security 7 43 3 4

McAfee Total Protection 2012 39 3 8

Panda Internet Security 2012 38 7 5

Avira Internet Security 2012 37 4 9

Microsoft Security Essentials 29 7 14


5. FALSE POSITIVES

5.1 False positive scores
A security product needs to be able to protect the system from threats, while allowing legitimate software to work properly. When legitimate software is misclassified a false positive is generated.

We split the results into two main groups because most products we test take one of two basic approaches when attempting to protect the system from the legitimate programs. They either warn that the software was suspicious or take the more decisive step of blocking it.

Blocking a legitimate application is more serious than issuing a warning because it directly hampers the user.

When generating a false positive the products were more likely to warn against installing or running a program than to block it completely.

[Chart: False Positive Scores, showing warnings and blockings per product on a scale of 0 to 40; the figures are listed in the table below.]


FALSE POSITIVE SCORES

False positive type  Product  Total

Warnings
Avira Internet Security 2012  23
Trend Micro Titanium Maximum Security 2012
Webroot SecureAnywhere Complete 2012  2
ESET Smart Security 5
F-Secure Internet Security 2012  1
Norton Internet Security 2013
Avast! Internet Security 7  2
Panda Internet Security 2012
Microsoft Security Essentials
McAfee Total Protection 2012  1
AVG Internet Security 2012  7
Kaspersky Internet Security 2012  2
BitDefender Internet Security 2013  1

Blockings
Avira Internet Security 2012  11
Trend Micro Titanium Maximum Security 2012  4
Webroot SecureAnywhere Complete 2012  2
ESET Smart Security 5  2
F-Secure Internet Security 2012  2
Norton Internet Security 2013  1
Avast! Internet Security 7  1
Panda Internet Security 2012  1
Microsoft Security Essentials  1
McAfee Total Protection 2012  1
AVG Internet Security 2012  1
Kaspersky Internet Security 2012  1
BitDefender Internet Security 2013

5.2 Taking file prevalence into account
The prevalence of each file is significant. If a product misclassified a common file then the situation would be more serious than if it failed to detect a less common one. That said, it is usually expected that anti-malware programs should not misclassify any legitimate software.

The files selected for the false positive testing were organized into five groups: Very High Impact, High Impact, Medium Impact, Low Impact and Very Low Impact.

These categories were based on download numbers as reported by sites including Download.com at the time of testing. The ranges for these categories are recorded in the table below:

FALSE POSITIVE PREVALENCE CATEGORIES

Impact category  Prevalence (downloads in the previous week)

Very High Impact  >20,000
High Impact  1,000 – 20,000
Medium Impact  100 – 999
Low Impact  25 – 99
Very Low Impact  <25


5.3 Modifying scores
The following set of score modifiers was used to create an impact-weighted accuracy score. Each time a product allowed a new legitimate program to install and run it was awarded one point. It lost points (or fractions of a point) if and when it generated a false positive. We used the following score modifiers:

FALSE POSITIVE PREVALENCE SCORE MODIFIERS

False positive action  Impact category  Score modifier

Blocked
Very High Impact  -5
High Impact  -2
Medium Impact  -1
Low Impact  -0.5
Very Low Impact  -0.1

Warning
Very High Impact  -2.5
High Impact  -1
Medium Impact  -0.5
Low Impact  -0.25
Very Low Impact  -0.05
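Combined with the one-point award per legitimate application, these modifiers are enough to reproduce the ratings in section 5.5. The sketch below assumes a base of 50 points (one per application tested) with the relevant modifier added for each false positive; the example event mixes are hypothetical:

```python
# Score modifiers copied from the table above. The base score of 50
# (one point per legitimate application tested) is an assumption
# consistent with the published ratings.
MODIFIERS = {
    ("blocked", "very high"): -5.0,  ("warning", "very high"): -2.5,
    ("blocked", "high"): -2.0,       ("warning", "high"): -1.0,
    ("blocked", "medium"): -1.0,     ("warning", "medium"): -0.5,
    ("blocked", "low"): -0.5,        ("warning", "low"): -0.25,
    ("blocked", "very low"): -0.1,   ("warning", "very low"): -0.05,
}

def false_positive_rating(events, applications=50):
    """events: (action, impact category) pairs, one per false positive."""
    return applications + sum(MODIFIERS[event] for event in events)

# A product with no false positives keeps the maximum rating:
assert false_positive_rating([]) == 50
# One warning against a Low Impact program costs a quarter point,
# which would match BitDefender's published 49.75 (hypothetical mix):
assert false_positive_rating([("warning", "low")]) == 49.75
```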

5.4 Distribution of impact categories
Products that scored highest were the most accurate when handling the legitimate applications used in the test. The best score possible is 50, while the worst would be -250 (assuming that all applications were classified as Very High Impact and were blocked). In fact the distribution of applications in the impact categories was not restricted only to Very High Impact. The table below shows the true distribution:

FALSE POSITIVE CATEGORY FREQUENCY

Prevalence Rating Frequency

Very High Impact 17

High Impact 17

Medium Impact 6

Low Impact 5

Very Low Impact 5


5.5 False positive ratings
Combining the impact categories with weighted scores produces the following false positive accuracy ratings.

When a product misclassified a popular program it faced a stronger penalty than if the file was more obscure.

FALSE POSITIVE RATINGS

Product Accuracy Rating

BitDefender Internet Security 2013 49.75

Norton Internet Security 2013 49.5

Microsoft Security Essentials 49.5

Panda Internet Security 2012 49.5

F-Secure Internet Security 2012 48

McAfee Total Protection 2012 47

Avast! Internet Security 7 46.95

AVG Internet Security 2012 46.6

Kaspersky Internet Security 2012 46

Webroot SecureAnywhere Complete 2012 45.5

ESET Smart Security 5 44.5

Trend Micro Titanium Maximum Security 2012 40.5

Avira Internet Security 2012 3.4

[Chart: False Positive Ratings by product, on a scale of 0 to 50; the figures are listed in the table above.]


6. THE TESTS

6.1 The threats
Providing a realistic user experience was important in order to illustrate what really happens when a user encounters a threat on the internet.

For example, in these tests web-based malware was accessed by visiting an original, infected website using a web browser, and not downloaded from a CD or internal test website.

All target systems were fully exposed to the threats. This means that any exploit code was allowed to run, as were other malicious files. They were run and permitted to perform exactly as they were designed to, subject to checks made by the installed security software. A minimum time period of five minutes was provided to allow the malware an opportunity to act.

6.2 Test rounds
Tests were conducted in rounds. Each round recorded the exposure of every product to a specific threat. For example, in ‘round one’ each of the products was exposed to the same malicious website.

At the end of each round the test systems were completely reset to remove any possible trace of malware before the next test began.

6.3 Monitoring
Close logging of the target systems was necessary to gauge the relative successes of the malware and the anti-malware software. This included recording activity such as network traffic, the creation of files and processes and changes made to important files.

6.4 Levels of protection
The products displayed different levels of protection. Sometimes a product would prevent a threat from executing, or at least making any significant changes to the target system.

In other cases a threat might be able to perform some tasks on the target (such as exploiting a security vulnerability or executing a malicious program), after which the security product would intervene and remove some or all of the malware.

Finally, a threat may be able to bypass the security product and carry out its malicious tasks unhindered. It may even be able to disable the security software.

Occasionally Windows’ own protection system might handle a threat while the anti-virus program ignored it. Another outcome is that the malware may crash for various reasons.

The different levels of protection provided by each product were recorded following analysis of the log files.

If malware failed to perform properly in a given incident, perhaps because of the very presence of the security product rather than any specific defending action that the product took, the product was given the benefit of the doubt and a Defended result was recorded.

If the test system was damaged, becoming hard to use following an attempted attack, this was counted as a compromise even if the active parts of the malware had eventually been removed by the product.

6.5 Types of protection
All of the products tested provided two main types of protection: real-time and on-demand. Real-time protection monitors the system constantly in an attempt to prevent a threat from gaining access.

On-demand protection is essentially a ‘virus scan’ that is run by the user at an arbitrary time.

The test results note each product’s behavior when a threat is introduced and afterwards. The real-time protection mechanism was monitored throughout the test, while an on-demand scan was run towards the end of each test to measure how safe the product determined the system to be.

Manual scans were run only when a tester determined that malware had made an interaction with the target system. In other words, if the security product claimed to block the attack at the initial stage, and the monitoring logs supported this claim, the case was considered closed and a Defended result was recorded.


7. TEST DETAILS

7.1 The targets
To create a fair testing environment, each product was installed on a clean Windows XP Professional target system. The operating system was updated with Windows XP Service Pack 3 (SP3), although no later patches or updates were applied.

We test with Windows XP SP3 and Internet Explorer 7 due to the high prevalence of internet threats that rely on this combination. The prevalence of these threats suggests that there are many systems with this level of patching currently connected to the internet.

A selection of legitimate but old software was pre-installed on the target systems. These posed security risks, as they contained known vulnerabilities. They included out-of-date versions of Adobe Flash Player and Adobe Reader.

A different security product was then installed on each system. Each product’s update mechanism was used to download the latest version with the most recent definitions and other elements.

Due to the dynamic nature of the tests, which were carried out in real-time with live malicious websites, the products’ update systems were allowed to run automatically and were also run manually before each test round was carried out.

The products were also allowed to ‘call home’ should they be programmed to query databases in real-time. Some products might automatically upgrade themselves during the test. At any given time of testing, the very latest version of each program was used.

Target systems used identical hardware, including an Intel Core 2 Duo processor, 1GB RAM, 160GB hard disk and DVD-ROM drive. Each was connected to the internet via its own virtual network (VLAN) to avoid cross-infection of malware.

7.2 Threat selection
The malicious web links (URLs) used in the tests were not provided by any anti-malware vendor. They were picked from lists generated by Dennis Technology Labs’ own malicious site detection system, which uses popular search engine keywords submitted to Google. It analyses sites that are returned in the search results from a number of search engines and adds them to a database of malicious websites.

In all cases, a control system (Verification Target System - VTS) was used to confirm that the URLs linked to actively malicious sites.

Malicious URLs and files are not shared with any vendors during the testing process.

7.3 Test stages
There were three main stages in each individual test:

1. Introduction
2. Observation
3. Remediation

During the Introduction stage, the target system was exposed to a threat. Before the threat was introduced, a snapshot was taken of the system. This created a list of Registry entries and files on the hard disk. We used Regshot (see Appendix D: Tools) to take and compare system snapshots. The threat was then introduced.

Immediately after the system’s exposure to the threat, the Observation stage is reached. During this time, which typically lasted at least 10 minutes, the tester monitored the system both visually and using a range of third-party tools.

The tester reacted to pop-ups and other prompts according to the directives described below (see 7.6 Observation and intervention).

In the event that hostile activity to other internet users was observed, such as when spam was being sent by the target, this stage was cut short. The Observation stage concluded with another system snapshot. This ‘exposed’ snapshot was compared to the original ‘clean’ snapshot and a report generated. The system was then rebooted.

The Remediation stage is designed to test the products’ ability to clean an infected system. If a product defended against the threat in the Observation stage then we skipped this stage. An on-demand scan was run on the target, after which a ‘scanned’ snapshot was taken. This was compared to the original ‘clean’ snapshot and a report was generated.

All log files, including the snapshot reports and the product’s own log files, were recovered from the target. In some cases the target became so damaged that log recovery was considered impractical. The target was then reset to a clean state, ready for the next test.
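The snapshot-and-compare procedure can be pictured as a set difference over recorded artifacts. This is an illustrative sketch only, not the actual Regshot output format, and the file and Registry names are made up for the example:

```python
def compare_snapshots(clean, exposed):
    # Each snapshot is modeled as a set of file paths and Registry
    # entries; the comparison report lists what appeared or
    # disappeared between the 'clean' and 'exposed' states.
    return {
        "added": sorted(exposed - clean),
        "removed": sorted(clean - exposed),
    }

# Hypothetical snapshots: the exposed system gained one dropped file.
clean = {r"C:\Windows\explorer.exe", r"HKLM\Software\Example\Run"}
exposed = clean | {r"C:\Documents and Settings\user\dropper.exe"}

report = compare_snapshots(clean, exposed)
assert report["added"] == [r"C:\Documents and Settings\user\dropper.exe"]
assert report["removed"] == []
```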


7.4 Threat introductionMalicious websites were visited in real-time usingInternet Explorer. This risky behavior was conductedusing live internet connections. URLs were typedmanually into Internet Explorer’s address bar.

Web-hosted malware often changes over time.Visiting the same site over a short period of time canexpose systems to what appear to be a range ofthreats (although it may be the same threat, slightlyaltered to avoid detection). Also, many infected siteswill only attack a particular IP address once, whichmakes it hard to test more than one product againstthe same threat.

In order to improve the chances that each target system received the same experience from a malicious web server, we used a web replay system.

When the verification target systems visited a malicious site, the page’s content, including malicious code, was downloaded, stored and loaded into the replay system. When each target system subsequently visited the site, it received exactly the same content.

The network configurations were set to allow all products unfettered access to the internet throughout the test, regardless of the web replay systems.
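The record-and-replay behavior described above can be modelled as a URL-keyed cache: the first visit stores the server's response and every later visit is answered from the store. This is a simplified, hypothetical model; the real replay system was a transparent proxy on the network.

```python
class ReplayCache:
    """Record a site's response on first visit; replay it on every later visit."""

    def __init__(self, fetch):
        self._fetch = fetch   # function that performs a real download
        self._store = {}      # url -> recorded response body

    def get(self, url: str) -> bytes:
        if url not in self._store:
            self._store[url] = self._fetch(url)  # record once
        return self._store[url]                  # replay identical content

# Hypothetical usage: both 'targets' receive byte-identical content,
# even though the live site would have served something different.
responses = iter([b"<html>exploit v1</html>", b"<html>exploit v2</html>"])
cache = ReplayCache(lambda url: next(responses))
first = cache.get("http://badsite.example.com/")
second = cache.get("http://badsite.example.com/")
print(first == second)  # True
```

This is why each product in a round could be tested against what was effectively the same threat.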

7.5 Secondary downloads

Established malware may attempt to download further files (secondary downloads), which are stored in a cache by a proxy on the network and re-served to other targets in some circumstances. These circumstances include cases where:

1. The download request is made using HTTP (e.g. http://badsite.example.com/...) and

2. The same filename is requested each time (e.g. badfile1.exe)

There are scenarios in which target systems receive different secondary downloads. These include cases where:

1. The download request is made using HTTPS or a non-web protocol such as FTP, or

2. A different filename is requested each time (e.g. badfile2.exe; random357.exe)
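The conditions above can be collected into a single predicate. This is a hypothetical illustration of the proxy's caching decision, not its actual implementation:

```python
from urllib.parse import urlparse

def served_from_cache(url: str, cache: dict) -> bool:
    """Decide whether a secondary download is re-served from the proxy cache.

    Per the conditions above: the request must use plain HTTP (not HTTPS
    or a non-web protocol such as FTP), and the same host and filename must
    have been requested before. `cache` records previously seen requests.
    """
    parsed = urlparse(url)
    if parsed.scheme != "http":
        return False                  # HTTPS/FTP requests bypass the cache
    key = (parsed.netloc, parsed.path)
    if key in cache:
        return True                   # same filename as before: replay it
    cache[key] = None                 # record this request for next time
    return False

cache = {}
print(served_from_cache("http://badsite.example.com/badfile1.exe", cache))   # False (first request)
print(served_from_cache("http://badsite.example.com/badfile1.exe", cache))   # True  (same filename)
print(served_from_cache("http://badsite.example.com/random357.exe", cache))  # False (new filename)
print(served_from_cache("https://badsite.example.com/badfile1.exe", cache))  # False (HTTPS)
```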

7.6 Observation and intervention

Throughout each test, the target system was observed both manually and in real-time. This enabled the tester to take comprehensive notes about the system’s perceived behavior, as well as to compare visual alerts with the products’ log entries. At certain stages the tester was required to act as a regular user.

To achieve consistency, the tester followed a policy for handling certain situations, including dealing with pop-ups displayed by products or the operating system, system crashes, invitations by malware to perform tasks and so on.

This user behavior policy included the following directives:

1. Act naively. Allow the threat a good chance to introduce itself to the target by clicking OK to malicious prompts, for example.

2. Don’t be too stubborn in retrying blocked downloads. If a product warns against visiting a site, don’t take further measures to visit that site.

3. Where malware is downloaded as a Zip file, or similar, extract it to the Desktop then attempt to run it. If the archive is protected by a password, and that password is known to you (e.g. it was included in the body of the original malicious email), use it.

4. Always click the default option. This applies to security product pop-ups, operating system prompts (including Windows firewall) and malware invitations to act.

5. If there is no default option, wait. Give the prompt 20 seconds to choose a course of action automatically.

6. If no action is taken automatically, choose the first option. Where options are listed vertically, choose the top one. Where options are listed horizontally, choose the left-hand one.
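Directives 4 to 6 form a simple fallback chain, which can be sketched as follows. This is a hypothetical encoding of the policy for illustration; the testers applied it manually.

```python
def choose_option(options, default=None, auto_acted=False):
    """Apply the prompt-handling policy (directives 4-6): click the default
    option if there is one; otherwise let the prompt act on its own (given
    up to 20 seconds); failing that, pick the first (top/left-hand) option."""
    if default is not None:
        return default        # directive 4: always click the default
    if auto_acted:
        return None           # directive 5: prompt chose for itself
    return options[0]         # directive 6: top/left-hand option

print(choose_option(["Allow", "Block"], default="Block"))  # Block
print(choose_option(["Allow", "Block"]))                   # Allow
```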

7.7 Remediation

When a target is exposed to malware, the threat may have a number of opportunities to infect the system. The security product also has a number of chances to protect the target. The snapshots explained in 7.3 Test stages provided information that was used to analyze a system’s final state at the end of a test.

Before, during and after each test, a ‘snapshot’ of the target system was taken to provide information about what had changed during the exposure to malware. For example, comparing a snapshot taken before a malicious website was visited to one taken after might highlight new entries in the Registry and new files on the hard disk.

Snapshots were also used to determine how effective a product was at removing a threat that had managed to establish itself on the target system. This analysis gives an indication as to the levels of protection that a product has provided.


These levels of protection have been recorded using three main terms: defended, neutralized, and compromised. A threat that was unable to gain a foothold on the target was defended against; one that was prevented from continuing its activities was neutralized; while a successful threat was considered to have compromised the target.

A defended incident occurs where no malicious activity is observed with the naked eye or third-party monitoring tools following the initial threat introduction. The snapshot report files are used to verify this happy state.

If a threat is observed to run actively on the system, but not beyond the point where an on-demand scan is run, it is considered to have been neutralized.

Comparing the snapshot reports should show that malicious files were created and Registry entries were made after the introduction. However, as long as the ‘scanned’ snapshot report shows that either the files have been removed or the Registry entries have been deleted, the threat has been neutralized.

The target is compromised if malware is observed to run after the on-demand scan. In some cases a product might request a further scan to complete the removal. We considered secondary scans to be acceptable, but further scan requests would be ignored.

Even if no malware was observed, a compromise result was recorded if snapshot reports showed the existence of new, presumably malicious files on the hard disk, in conjunction with Registry entries designed to run at least one of these files when the system booted.

An edited ‘hosts’ file or altered system file also counted as a compromise.
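The criteria above can be collected into one classification function. This is a hypothetical restatement for clarity, not the lab's actual tooling:

```python
def classify(ran_initially, ran_after_scan, autorun_persists, system_files_altered):
    """Classify a test outcome per section 7.7.

    defended     -- no malicious activity observed at all
    neutralized  -- malware ran, but not beyond the on-demand scan
    compromised  -- malware ran after the scan, or persistent autorun
                    entries / altered system files (e.g. 'hosts') remain
    """
    if ran_after_scan or autorun_persists or system_files_altered:
        return "compromised"
    if ran_initially:
        return "neutralized"
    return "defended"

print(classify(False, False, False, False))  # defended
print(classify(True, False, False, False))   # neutralized
print(classify(True, True, False, False))    # compromised
print(classify(False, False, True, False))   # compromised (autorun entry survived the scan)
```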

7.8 Automatic monitoring

Logs were generated using third-party applications, as well as by the security products themselves. Manual observation of the target system throughout its exposure to malware (and legitimate applications) provided more information about the security products’ behavior.

Monitoring was performed directly on the targetsystem and on the network.

Client-side logging

A combination of Process Explorer, Process Monitor, TcpView and Wireshark was used to monitor the target systems. Regshot was used between each testing stage to record a system snapshot.

A number of Dennis Technology Labs-created scripts were also used to provide additional system information. Each product was able to generate some level of logging itself.

Process Explorer and TcpView were run throughout the tests, providing a visual cue to the tester about possible malicious activity on the system. In addition, Wireshark’s real-time output, and the display from the web proxy (see Network logging, below), indicated specific network activity such as secondary downloads.

Process Monitor also provided valuable information to help reconstruct malicious incidents. Both Process Monitor and Wireshark were configured to save their logs automatically to a file. This reduced data loss when malware caused a target to crash or reboot.

In-built Windows commands such as 'systeminfo' and 'sc query' were used in custom scripts to provide additional snapshots of the running system's state.
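A custom script of this kind might, for example, capture 'sc query' output and extract each service's name and state, so that two snapshots can be compared. The parser below is a hypothetical sketch, written against the standard layout of 'sc query' output; the lab's own scripts are not published.

```python
import re

def parse_sc_query(output: str) -> dict:
    """Extract service-name -> state pairs from 'sc query' output."""
    services = {}
    current = None
    for line in output.splitlines():
        name = re.match(r"\s*SERVICE_NAME:\s*(\S+)", line)
        if name:
            current = name.group(1)
            continue
        state = re.match(r"\s*STATE\s*:\s*\d+\s+(\w+)", line)
        if state and current:
            services[current] = state.group(1)
    return services

# Sample text in the format produced by 'sc query' on Windows.
sample = """SERVICE_NAME: wuauserv
        STATE              : 4  RUNNING

SERVICE_NAME: WinDefend
        STATE              : 1  STOPPED
"""
print(parse_sc_query(sample))  # {'wuauserv': 'RUNNING', 'WinDefend': 'STOPPED'}
```

Diffing two such dictionaries would reveal, for instance, a security service that malware had stopped.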

Network logging

All target systems were connected to a live internet connection, which incorporated a transparent web proxy and a network monitoring system. All traffic to and from the internet had to pass through this system.

Further to that, all web traffic had to pass through the proxy as well. This allowed the testers to capture files containing the complete network traffic. It also provided a quick and easy view of web-based traffic, which was displayed to the testers in real-time.

The network monitor was a dual-homed Linux system running as a transparent router, passing all web traffic through a Squid proxy.

An HTTP replay system ensured that all target systems received the same malware as each other. It was configured to allow access to the internet so that products could download updates and communicate with any available ‘in the cloud’ servers.


8. CONCLUSIONS

Where are the threats?

The threats used in this test were genuine, real-life threats that were infecting victims globally at the same time as we tested the products. In almost every case the threat was launched from a legitimate website that had been compromised by an attacker.

The types of infected or malicious sites were varied, which demonstrates that effective anti-virus software is essential for anyone who wants to browse the web on a Windows PC.

Most threats installed automatically when a user visited the infected webpage. This infection was often invisible to a casual observer.

Where does protection start?

There were a significant number of compromises in this test, as well as a relatively large number of neutralizations. The strongest products blocked the site before it was even able to deliver its payload.

Symantec’s Norton Internet Security 2013 scored highest in terms of malware protection, while ESET Smart Security 5 came a close second. The Norton product produced just one false positive, while ESET’s produced just two. This was a close-run race.

Sorting the wheat from the chaff

BitDefender Internet Security 2013 was the best in terms of avoiding significant false positives. Combining these results with its ability to stop malware puts it into a respectable second place overall.

Anti-malware products need to be able to distinguish between malicious and non-malicious programs. This is where Avira’s product performed particularly poorly. Not only did it fail to protect against one in five of the threats, it also blocked eleven legitimate applications and warned against a further 23. This means that it had issues with well over half of the legitimate programs.

The false positive results affected Avira Internet Security 2012’s total accuracy rating, pushing it from last-but-one place firmly into last place. It was only the ability of Microsoft Security Essentials to allow access to almost all legitimate applications that saved it from being the least effective product tested.

Anti-virus is important (but not a panacea)

This test shows that even with a relatively small sample set of 50 threats there is a significant difference in performance between the anti-virus programs. Most importantly, it illustrates this difference using real threats that were attacking real computers at the time of testing.

The average protection level of the tested products is 92 per cent (see 3. Protection Scores), which is a significant value for two reasons. First, it is very close to the average figures published in previous Dennis Technology Labs reports over the years. Second, it is much lower than some detection results typically quoted in anti-malware marketing material.

The presence of anti-virus software can be seen to decrease the chances of a malware infection even when the only sites being visited are proven to be actively malicious. That said, no product produced a 100 per cent protection rate, while all generated at least one false positive result.


APPENDIX A: TERMS USED

Compromised Malware continues to run on an infected system, even after an on-demand scan.

Defended Malware was prevented from running on, or making changes to, the target.

False Positive A legitimate application was incorrectly classified as being malicious.

Introduction Test stage where a target system is exposed to a threat.

Neutralized Malware or exploit was able to run on the target, but was then removed by the security product.

Observation Test stage during which malware may affect the target.

On-demand (protection) Manual ‘virus’ scan, run by the user at an arbitrary time.

Prompt Questions asked by software, including malware, security products and the operating system. With security products, prompts usually appear in the form of pop-up windows. Some prompts don’t ask questions but provide alerts. When these appear and disappear without a user’s interaction, they are called ‘toasters’.

Real-time (protection) The ‘always-on’ protection offered by many security products.

Remediation Test stage that measures a product’s abilities to remove any installed threat.

Round Test series of multiple products, exposing each target to the same threat.

Snapshot Record of a target’s file system and Registry contents.

Target Test system exposed to threats in order to monitor the behavior of security products.

Threat A program or other measure designed to subvert a system.

Update Code provided by a vendor to keep its software up to date. This includes virus definitions, engine updates and operating system patches.


APPENDIX B: TERMS OF THE TEST

This test was sponsored by Symantec. The test rounds were conducted between 27/06/2012 and 19/07/2012 using the most up-to-date versions of the software available on any given day. All products were able to communicate with their back-end systems over the internet. The products selected for this test were chosen by Symantec, with advice from Dennis Technology Labs. Samples were located and verified by Dennis Technology Labs. Products were exposed to threats within 24 hours of the same threats being verified. In practice there was only a delay of up to three to four hours. Details of the samples, including their URLs and code, were provided to Symantec only after the test was complete. The sample set comprised 50 actively-malicious URLs and 50 legitimate applications.

FAQs

Does the sponsor know what samples are used, before or during the test?

No. We don’t even know what threats will be used until the test starts. Each day we find new ones, so it is impossible for us to give this information before the test starts. Neither do we disclose this information until the test has concluded. If we did, the sponsor might be able to gain an advantage that would not reflect real life.

Do you share samples with the vendors?

The sponsor is able to download all samples from us after the test is complete. Other vendors may request a subset of the threats that compromised their products in order for them to verify our results.

The same applies to client-side logs, including the network capture files. There is a small administration fee for theprovision of this service.

What is a sample?

In our tests a sample is not simply a set of malicious executable files that runs on the system. A sample is an entire replay archive that enables researchers to replicate the incident, even if the original infected website is no longer available. This means that it is possible to reproduce the attack and to determine which layer of protection the threat was able to bypass. Replaying the attack should, in most cases, produce the relevant executable files. If not, these are available in the client-side network capture (pcap) file.

Does the sponsor have a completely free choice of products?

No. While the sponsor may specify which products it wants us to compare, we will always advise on this decision and may refuse to include certain products if we feel that a comparison with the others is not fair.