AV-Comparatives Dynamic Testing - December 2009

Dynamic Test

The goal of this Whole Product Dynamic Test is to compare the protection offered by various security solutions by testing them under real-world conditions. There has been a lot of talk in recent years about such tests and their value for home users. Among the issues with these tests are that they are very expensive to perform (due to the time and personnel required) and difficult to replicate. Nonetheless, such tests are very important, as they show the ability of the various security products to protect users against malware.


http://www.av-comparatives.org/

On the left, click Comparatives and Reviews,
then click Dynamic Testing.







Need a PDF reader/viewer?
http://www.foxitsoftware.com/pdf/reader/
Good, and free.

only 100 test cases :frowning:

I appreciate this is their first version and hope to see a credible number of test cases in their future tests. Also, they don’t claim this to be AMTSO-compliant yet, do they?

Melih

AV-C has locked the PDF, so I can’t copy and paste/quote the testing methodology…
Suggest you read it… it sounds like what you’re looking for, Melih.

Oh great, do they say it’s AMTSO-compliant?
Also, are they just testing with 100 malware samples/malicious sites?

Melih

Shows how much research you do :slight_smile:

Melih, I’m not going to retype the whole thing, and I can’t copy and paste it either without a lot of hassle, as the PDF is locked. I suggest you read it, as I think you would like the methodology. (Honestly.)

Given that Melih said the test used 100 samples (a figure contained in the article), I think you can assume he read it. :wink:

Thank you, Panic…
Obviously they totally missed the reason why I asked those questions :wink: (and, as you rightly guessed, not because I didn’t know the answer, but to try to get the posters to read and understand… but I failed :frowning: )
What I should have said in hindsight: Hey look, it’s not AMTSO-compliant and they only test with 100 malware samples! (But I was being diplomatic :slight_smile: ). Still, I am glad that after so much talking, AV-Comparatives are showing some signs of dynamic testing. Now we need them to run tests following AMTSO guidelines, with a representative amount of malware (100 is NOT enough).

Melih
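
To put a number on the “100 is NOT enough” point, here is a minimal back-of-the-envelope sketch (my own illustration, not from the report or the posters): treating each test case as an independent pass/fail trial, a standard normal-approximation binomial confidence interval shows how wide the error bars on a 100-case protection score are.

```python
import math

def protection_ci(blocked: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% confidence interval for a measured protection rate,
    using the normal approximation to the binomial distribution."""
    p = blocked / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# A product that blocks 95 of 100 test cases:
low, high = protection_ci(95, 100)
print(f"95/100   -> 95% CI: {low:.1%} .. {high:.1%}")   # ~90.7% .. ~99.3%

# The same underlying rate measured over 1000 cases is much tighter:
low, high = protection_ci(950, 1000)
print(f"950/1000 -> 95% CI: {low:.1%} .. {high:.1%}")   # ~93.6% .. ~96.4%
```

With roughly ±4 percentage points of uncertainty at n = 100, products scoring 93/100 and 97/100 have heavily overlapping intervals and cannot be reliably ranked against each other; that is the statistical core of the sample-size objection.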

I think it will be a while before that happens.

http://amtso.org/amtso---faqs---will-amtso-endorse-my-methodology-and-competence-to-test.html

Q: Will AMTSO endorse my methodology and competence to test?

A: One of the issues that AMTSO and affiliated organizations are looking at is how testing organizations can demonstrate the effectiveness of their testing and their own competence, for instance by some form of certification for testers. However, AMTSO does not currently offer such certification, or any other formal mechanism for auditing a testing methodology or specific product review. Thus, product vendors and testers are not able to claim that their service is “endorsed” or “confirmed” by AMTSO or its members. Please note, however, that once AMTSO standards and guidelines are published, we will encourage AMTSO members and others to publicly reference conformity to such guidelines.


Here is an excerpt:

How we tested

The Whole-Product-Dynamic Test is not a simple “detection” test as usual; it is more a “protection/prevention” test. The test mimics malware reaching and executing on a user’s machine, as happens in the real world (e.g. by visiting a website with a malicious payload such as drive-by downloads/exploits, or by being fooled into downloading a malicious file by social-engineering tactics). This means that not only the signatures, heuristics and in-the-cloud detections are evaluated, but URL-blockers, web reputation services, exploit shields, in-the-cloud heuristics, HIPS and behavioural detection are also considered. Firewall warnings when the malware was already running and just trying to connect to the outside world were considered a fail. We browsed to websites with exploits/drive-by downloads, and also to a few websites with malicious files that we downloaded and executed. The criteria for success/failure are independent of the technology used by the products. What matters is that the products provide reliable protection to the user, ideally without requiring any user decisions as to whether something is malicious or not.

For the Whole-Product-Dynamic Test we used 16 identical physical PCs (not virtual machines), with identical hardware, software and OS configuration (administrator account). Each PC had one security product installed. We used the security suite product of each vendor where available, evaluating the overall protection provided. Products were always up-to-date and had a live Internet connection, as in the real world. Each machine had its own IP address. We used the default settings of the products.

The test started on the 16th November 2009. Each day we tested about 15 or 20 test cases (new URLs with fresh/relevant exploits/malware, but taking care not to use URLs which deliver identical malware) gathered from our own crawler. As each machine had to be inspected, and all machines returned to their original state (which meant waiting until all machines were ready for the next threat), it took nearly 12 hours each day for 4 people to perform the tests (although we developed tools to speed up some procedures).

Each test case was first verified by browsing to it on an unprotected system (with no security software installed), in order to see if the sample was valid and did something in the test environment. After that, all 15 security products were updated before browsing to any test case. We took care that the site exposed all the machines to the same threat. All URLs were browsed to at the same moment, and screenshots taken in the event that the security product reacted; otherwise, we checked to see if the product had taken any action silently, or if the threat had been successful in compromising the machine (i.e. the security product had failed).

Anti-Virus Comparative - December 2009
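
For readers who want the procedure above at a glance, here is a minimal sketch of the per-test-case workflow it describes. This is not AV-Comparatives’ actual tooling (the report only says in-house tools were developed to speed up some procedures); the Machine class and every method on it are hypothetical placeholders for the mostly manual steps.

```python
# Hypothetical sketch of the per-test-case workflow quoted above. This is NOT
# AV-Comparatives' actual tooling; the Machine class and all of its methods
# are invented placeholders for the manual/in-house steps the report describes.

class Machine:
    """Stands in for one of the identical physical test PCs."""

    def __init__(self, product: str):
        self.product = product
        self.alerted = False          # product showed a warning/block
        self.acted_silently = False   # product removed the threat quietly

    def update_product(self) -> None:
        """Placeholder: bring signatures/cloud components fully up to date."""

    def browse(self, url: str) -> None:
        """Placeholder: open the URL; the flags above are set by inspection."""

    def restore_clean_image(self) -> None:
        """Reset the PC to its original state before the next test case."""
        self.alerted = self.acted_silently = False


def run_test_case(url: str, machines: list[Machine],
                  valid_on_unprotected: bool) -> dict[str, str]:
    """Score one URL against every product, mirroring the quoted procedure."""
    # 1. The sample only counts if it actually compromised a machine with no
    #    security software installed.
    if not valid_on_unprotected:
        return {}

    # 2. Update all products before exposing any machine.
    for m in machines:
        m.update_product()

    # 3. Browse to the URL on all machines at the same moment.
    for m in machines:
        m.browse(url)

    # 4. Inspect each machine. A firewall prompt after the malware is already
    #    running counts as a fail under the methodology above.
    results = {}
    for m in machines:
        if m.alerted:
            results[m.product] = "protected (alert shown)"
        elif m.acted_silently:
            results[m.product] = "protected (silent action)"
        else:
            results[m.product] = "compromised"
        m.restore_clean_image()
    return results
```

The ordering mirrors the excerpt: validate the sample on an unprotected machine first, update every product, expose all machines at the same moment, then inspect and restore each one before the next threat.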

Melih, can you please tell us what’s wrong with the above methodology?
EDIT: Also, set aside the 100-sample count; the reason it was that low is explained in the PDF, and it will be higher in later tests from 2010 onwards.

AMTSO has a review board that will give its verdict on a test. I believe someone posted saying NSS is going through this review board as we speak.

The methodology is OK, as long as it’s executed right. So how do we know it’s executed right? We rely on the AMTSO review board’s result. (Please refer to the other threads where this was discussed to death.)

Melih

::Deleted my above posts and quotes, as I can’t openly share a private email::

Andreas mentioned, in an email discussion we had, that AV vendors that want to and haven’t done so far can get in contact with AV-C, or can discuss testing methods, guidelines, etc. at the next AMTSO meeting.

Since the first draft of the AMTSO review process was approved (May 2009), AMTSO-affiliated testers don’t appear to have released many tests to the public.

It doesn’t matter whether they believe their tests comply with AMTSO guidelines, whether they would endorse an informal post-publication approach without leveraging the formal/official AMTSO Review Board, or whether they would actually be willing to relegate that review process to an exceptional/optional measure…

…I hope that the next AMTSO meeting will encourage AMTSO-affiliated testers to submit their tests “due for imminent publication” to the AMTSO Review Board at their own (earliest) convenience.

The result of the AMTSO review could even be added to such tests once it has been finalized (for all I care), though a “pending” AMTSO review should be flagged from the moment of publication.

If there actually are testers affiliated with AMTSO who are opposed to such an approach, I hope their concerns will be addressed at the next AMTSO meeting.

If AMTSO as a whole actually turns out to be opposed to such an approach, then there is no helping it. :-\

??

It’s simple… just give us some credible tests (more than 100 malware samples) that follow AMTSO guidelines and are reviewed by the AMTSO review board… what is there to discuss?

Melih

As stated, this first dynamic test was conducted with only 100 samples; the tests that follow will include many more.

Will you or someone representing Comodo be attending the meeting with various other vendors on the 25th of February to discuss such things with AMTSO? Posting on here won’t change things :smiley:

Yes, we are…
But it’s not an AMTSO thing anymore… it’s about getting testing organisations to put their money where their mouths are… AMTSO has done its work and provided guidelines and a review board…

Melih