Bern University thesis on testing AV software

Languy99, my bad; lucky I rechecked in time.

Here is the link: SEL :stuck_out_tongue:

This guy is probably doing his thesis, but it makes for very interesting results and reading. If you want the results, scroll all the way to page 171.

As the document is big and some people have slow connections, I’ll state the findings here.

Norton IS: 82.6%
Kaspersky: 78.9%
Sophos ES: 72.4%
F-Secure IS: 91.8%
Comodo IS: 97.8%

Comodo does well in these tests, as he tests all layers of its security.

Enjoy, and always strive to improve. Thank you, Wilders Security member, for this valuable find, and take care.

link or paper?

Are all of these suites? Or are traditional antivirus programs included as well, as the headline suggests?

According to the cited PDF, they all appear to be suites.

edit: The thesis (PDF) is actually entitled…

Building an Automated Large-Scale Analysis System for Measuring the Effectiveness of AV-Software on the End Host by Peter Linder

Nice way of describing that there are different ways of getting “antivirus protection” from tools. It takes a “large-scale analysis” to get an overview of their effectiveness :wink:

Interesting comments about the effectiveness of the suite.

http://sel.bfh.ch/resources/thesis/linder11.pdf

[u]Results–Discussion–Page 171[/u]

Product          | oa            | oe            | detection      | no detection
Norton IS ’10    | 6,367 (44.7%) | 5,395 (37.9%) | 11,762 (82.6%) | 2,438 (17.4%)
Kaspersky IS ’10 | 8,878 (62.3%) | 2,351 (16.5%) | 11,229 (78.9%) | 2,971 (21.1%)
Sophos ES        | 8,222 (57.7%) | 2,090 (14.7%) | 10,312 (72.4%) | 3,888 (27.6%)
F-Secure IS ’10  | 8,706 (61.1%) | 4,370 (30.7%) | 13,076 (91.8%) | 1,124 (8.2%)
Comodo IS        | 5,322 (37.4%) | 8,603 (60.4%) | 13,925 (97.8%) |   275 (2.2%)

[…] identical computers based on our configuration, the following number of computers would have been infected:
• 2,438 times, if using Norton IS 2010
• 2,971 times, if using Kaspersky IS 2010
• 3,888 times, if using Sophos ES
• 1,124 times, if using F-Secure IS 2010
• 275 times, if using Comodo IS
The first question that arises is: why are the detection rates of Comodo IS and F-Secure better than those of the rest of the AV software? The detection rate of Comodo IS can easily be explained, as this product follows a whitelisting strategy for unknown executables. Whenever a sample is executed and is not known to Comodo or carries no trusted digital signature, the sample is put in a sandbox and a heuristic scanner is used to determine its maliciousness. Based on the classification of the heuristic scanner, the HIPS of Comodo IS is applied with a corresponding policy (which is usually very strict). As soon as the policy is breached, the user is asked whether or not he wants to allow the action triggered by the unknown executable. In general, all detections by the HIPS must be handled with the support of the AV software, which provides the user with additional information. F-Secure IS 2010 uses a similar strategy, but blocks suspicious executables by default. Our predefined and simulated user had therefore been infected a lot more with Comodo IS 2010 than with F-Secure IS 2010.
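As a quick consistency check on the quoted table: the two detection-count columns sum exactly to the “detection” column, and every row covers the same 14,200 samples, so the “no detection” column is exactly the infection count listed above. A minimal Python snippet with the counts copied from page 171 (it only checks the counts; the percentages in the table are the thesis’s own figures):

[code]
# Counts copied from the table on page 171 of the thesis:
# (oa count, oe count, detection count, no-detection count)
rows = {
    "Norton IS '10":    (6367, 5395, 11762, 2438),
    "Kaspersky IS '10": (8878, 2351, 11229, 2971),
    "Sophos ES":        (8222, 2090, 10312, 3888),
    "F-Secure IS '10":  (8706, 4370, 13076, 1124),
    "Comodo IS":        (5322, 8603, 13925,  275),
}

for product, (oa, oe, detected, missed) in rows.items():
    assert oa + oe == detected          # the two detection columns sum to "detection"
    assert detected + missed == 14_200  # every row covers the same 14,200 samples
    print(f"{product:18} detected {detected:6,}, missed {missed:5,} of 14,200")
[/code]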

We welcome any real “independent” testing research like this one. The author has put a lot of time and thought into this work. Well done!

The quotes are citations from the test, collected in SivaSuresh’s post:

What?

The detection rate (of malware among unknowns) is not a matter of “comparing an entry to a whitelist”.
Furthermore, Comodo uses a default deny plus a sandbox for unknown executables. Unknown executables are by definition on no list, as they wouldn’t be unknown otherwise!
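To make the distinction concrete, here is a minimal sketch of the two models in Python. Everything in it (Sample, WHITELIST, the helper fields) is a made-up placeholder for illustration, not Comodo’s actual logic:

[code]
from dataclasses import dataclass

@dataclass
class Sample:
    sha256: str
    signed_by_trusted_vendor: bool
    looks_malicious: bool  # stand-in for a heuristic scanner's verdict

WHITELIST = {"aaa111"}  # hashes of known-good executables (illustrative)

def whitelist_only(s: Sample) -> str:
    # Pure whitelisting: allow only what is already on the list.
    return "run" if s.sha256 in WHITELIST else "block"

def default_deny(s: Sample) -> str:
    # Default deny, roughly as the thesis describes Comodo's handling:
    # known or trusted-signed samples run normally; everything unknown
    # is sandboxed, with the HIPS policy chosen by a heuristic scan and
    # the user prompted on policy violations.
    if s.sha256 in WHITELIST or s.signed_by_trusted_vendor:
        return "run"
    if s.looks_malicious:
        return "sandboxed under a strict policy, user prompted on violations"
    return "sandboxed under the default policy"

unknown = Sample("bbb222", signed_by_trusted_vendor=False, looks_malicious=True)
print(whitelist_only(unknown))  # -> block (unknown, so not on any list)
print(default_deny(unknown))    # -> sandboxed under a strict policy, ...
[/code]

The point being: an unknown executable never matches a list entry, so the interesting part is what happens after the lookup fails.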

That’s what I’ve been trying to say from time to time :smiley:

This work seems interesting and well done; I’ll give it a look…

Another discussion about the same topic is found at Automated whole product testing of 5 programs (2010 versions) vs. over 14,000 samples | Wilders Security Forums.