AV Testing: Service or Selfishness?

Please read this before voting and specifying your answer: http://www.melih.com/2008/09/13/service-to-human-race-or-fame-seeking-selfishness/

Now let's see people's thoughts! BUT ABIDE BY THE FORUM POLICY:

Josh

I do not claim to know the internal motivations of AV testers (attitudes of service and selfishness can manifest in the same actions). However, the results of some AV testers have benefited users. Many AV companies have improved their products after public embarrassment resulting from AV tests. Let me provide examples.

Regarding rootkits, simply having the signatures for them is insufficient. Many companies have added new methods for rootkit detection after tests like this: http://www.anti-malware-test.com/?q=taxonomy/term/7

Self-protection is important for the window before the relevant signature update is released and downloaded. Some companies improved or began implementing self-protection after tests like this: http://www.anti-malware-test.com/?q=taxonomy/term/16
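To make "self-protection" concrete: one crude check a tester can script is whether ordinary user-level code can simply kill the product's processes. A toy sketch, Windows-only, where the process name is a placeholder (real tests go much further than this):

```python
# Toy probe: try to force-kill a (hypothetical) AV service process from an
# ordinary user session. A product with self-protection should make this
# fail, typically with "access denied".
import subprocess

result = subprocess.run(
    ["taskkill", "/IM", "av_service.exe", "/F"],  # placeholder image name
    capture_output=True, text=True,
)
print("return code:", result.returncode)
print(result.stderr.strip() or result.stdout.strip())
```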

Treatment of active infections is important after the relevant signature update is released and downloaded. Some companies improved their treatment of active infections after tests like this: http://www.anti-malware-test.com/?q=taxonomy/term/14

I do not know as much as Melih about AV architecture and strategies, but it seems to me that AV software that can detect a whole class of polymorphic virus variants with one signature (generic detection?) provides better proactive security than software that requires a separate signature for each variant. See http://www.anti-malware-test.com/?q=taxonomy/term/20
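To illustrate the difference with a minimal sketch (the hash, byte pattern, and names below are all made up for illustration): a specific signature only ever matches the one sample it was taken from, while a generic signature keys on something the whole family shares.

```python
import hashlib
import re

# Hypothetical "specific" signatures: one exact SHA-256 per known sample,
# so every new polymorphic variant needs its own database entry.
SPECIFIC_SIGNATURES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08": "Worm.Example.A",
}

# Hypothetical "generic" signature: a byte pattern assumed to survive the
# polymorphic engine's mutations, so one entry covers many variants.
GENERIC_SIGNATURE = re.compile(rb"\x55\x8b\xec.{0,16}\xde\xad\xbe\xef", re.DOTALL)

def scan(sample: bytes):
    digest = hashlib.sha256(sample).hexdigest()
    if digest in SPECIFIC_SIGNATURES:
        return SPECIFIC_SIGNATURES[digest]   # matches exactly one variant
    if GENERIC_SIGNATURE.search(sample):
        return "Worm.Example.gen"            # matches the whole family
    return None
```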

It also seems to me that an AV on-access scanner with heuristics has certain benefits even when a HIPS (such as Defense+) is installed. When downloading free software, users would tell the HIPS to trust the installation, but they may change their mind if the AV heuristics flag it as suspicious. Also, the on-access scanner catches malware before installation, while a HIPS allows partial installation before alerting on suspicious behavior (assuming the user allowed the installation because of social engineering). Here is a test of AV heuristics: http://www.anti-malware-test.com/?q=taxonomy/term/17
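A toy sketch of that layering (the empty signature set, entropy heuristic, and threshold are stand-ins, not a real engine): static checks run before anything executes, while the HIPS only gets a say once behavior is observed.

```python
import hashlib
import math
from collections import Counter

KNOWN_BAD_HASHES: set[str] = set()  # stand-in signature database

def on_access_scan(data: bytes) -> bool:
    """Signature check at download time, before the file ever runs."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

def heuristic_suspicious(data: bytes) -> bool:
    """Toy heuristic: very high byte entropy often indicates packing."""
    if not data:
        return False
    counts = Counter(data)
    entropy = -sum(c / len(data) * math.log2(c / len(data)) for c in counts.values())
    return entropy > 7.5

def handle_download(data: bytes) -> str:
    if on_access_scan(data):
        return "blocked by signature, before installation"
    if heuristic_suspicious(data):
        return "heuristic warning: user can reconsider trusting the installer"
    return "allowed for now: HIPS still watches its runtime behaviour"
```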

By the way, I am not an AV tester or employed by an AV software company. I am an embedded software engineer.

hi SilentMusic7

I agree with you. As per my post, the "capability" of an AV engine is what AV testers should "test", NOT whether they catch some percentage or not.

That is exactly my point. However, AV testers insist on having a malware library that they don't share with AV vendors, and then say, "Ha ha, you don't catch these, so you must be bad!"

Instead, as per my article, they should share these baddies with all AV vendors, then check who can or cannot detect them and judge the AV products accordingly!

thanks

Melih

I can see a scenario where an unselfish AV tester does not share malware with AV vendors. To fairly test AV software's ability to detect a polymorphic virus, the tester would want malware for which all AV software could be expected to have a generic signature, but for which none of the vendors has a specific signature during the test period, which may be a few weeks. Any malware that is out in the wild could be detected using a specific signature, so that wouldn't work. One way to achieve the goal is to take malware that is already in every AV vendor's database and make a variant of it (a sketch after the list below shows why the variant evades the specific signature). The AV tester should not release his new malware to AV vendors because:

  1. It was not found in the wild.
  2. It could not be used in the future for polymorphic virus testing because AV vendors would create a specific signature for it.
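Here is that sketch, using a harmless buffer instead of real malware: flipping a single byte completely changes the cryptographic hash, so an exact "specific" signature taken from the parent no longer matches the variant, while a generic signature could still match.

```python
# Illustrative only, on a harmless buffer: one flipped byte changes the
# hash entirely, defeating any exact-hash signature of the parent.
import hashlib

parent = b"HARMLESS-TEST-SAMPLE" * 4
variant = bytearray(parent)
variant[5] ^= 0xFF  # one-byte "mutation"

print(hashlib.sha256(parent).hexdigest())
print(hashlib.sha256(bytes(variant)).hexdigest())  # entirely different
```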

I have to respectfully disagree.

How will the AV tester know which AV companies have already got this variant and which haven't? How can they guarantee that the mutation they created from this polymorphic malware doesn't already exist in the wild? No AV tester has a full view of what's out there.

I would suggest there are better ways to test whether AV engines unpack or have emulation than mutating a piece of malware and testing with it.
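One such way (a sketch, using the industry-standard, harmless EICAR test file rather than real malware): wrap a sample every engine already detects inside a container, and see whether the engine still flags it. The scan step is left to whatever product is being evaluated.

```python
# Build a ZIP containing the harmless EICAR test file. An engine that
# unpacks archives should still flag the ZIP; one that only hashes the
# outer file will not. (Writing EICAR may itself trigger your AV; that
# is exactly the test file's purpose.)
import io
import zipfile

EICAR = (b"X5O!P%@AP[4\\PZX54(P^)7CC)7}$EICAR-STANDARD-"
         b"ANTIVIRUS-TEST-FILE!$H+H*")

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("eicar.com", EICAR)

with open("eicar_packed.zip", "wb") as f:
    f.write(buf.getvalue())
# Now point the engine's on-demand scanner at eicar_packed.zip.
```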

Thanks
Melih

Hmmm. Would it also be the job of the AV developer to make variants and add them to the list of malware to increase the possible detection rate? Or is the AV software supposed to be smart enough to detect the variant without a signature for it, so that only one entry is needed per "parent" malware (the one the variants are derived from)?

Also, if the AV tester makes his/her own variant, then wouldn't he/she be considered a virus writer? If it is unclear whether the law in the AV tester's own country would get him/her in trouble, then I can see why they would not disclose the fact that they made a variant. Countries' laws could differ on this subject, I think. Still, the AV tester does have a responsibility to submit that variant to the AV company if the AV product does not detect it. I really don't like legal-speak.

Anyways, this is an interesting topic of discussion. :)

The objections you raise are potential reasons why an AV tester's methodology is uncontrolled or unfair. You could then propose another methodology in detail and explain why it is more complete and fair. If you present this analysis to the AV tester, a fair-minded tester may change his methodology to yours. But if you accuse the AV tester of being selfish, he may think you are unreasonable and ignore you.

I find that people sometimes get stubborn or defensive if I complain about their work. But if I give a detailed suggestion and explain how it satisfies the listener's concerns and offers benefits they haven't considered, then I notice that people are more likely to listen.

One concern I suspect AV testers have is that AV vendors will "design for the test" (without benefit to end users) if the tester reveals his malware. Another concern is that a publicly revealed methodology gives information to malware writers. So a new methodology that is transparent to all must address these concerns.

If a truly selfish AV tester does not accept your new and superior methodology, then here are some suggestions for overcoming this:

  1. Post your analysis and new methodology to the public.
  2. Send a link to computer magazines suggesting that it would make an interesting article.
  3. Set up a site, as Comodo did with http://www.testmypcsecurity.com, to implement and improve upon the new AV testing methodology.
  4. Create an industry trade organization of AV vendors, using your suggestion as the starting point for a working group.
  5. Have AV vendors document their testing methodology, and the problems with other methodologies, on their websites.

An industry trade organization, by showing competence in its best practices, could establish more credibility than a lone AV tester. It would go toward creating a safe and secure internet.

But can we really trust the AV vendor industry to police itself? Some AV vendors may not see any benefit to themselves in cooperating with an industry trade organization, as they may rely on advertising an illusion of security rather than revealing the truth about their performance.

This is a really tricky situation! I salute Melih for how Comodo has already raised the bar for the security software industry.

I outlined some suggestions about this in my blog.

It's about the "capability" of the AV, rather than whether or not it detects a piece of malware from the 1980s.

Then the issue is defining capability. This could include: being able to handle packed malware (unpacking) and other kinds of malware, maybe some heuristics, speed of creating signatures, speed of on-access scanning, speed of on-demand scanning, and so on.

These would be the things that matter to end users; for example, the speed of the on-access component directly affects a user's day-to-day usage of their computer!
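For the speed numbers, a tester only needs a stopwatch around a fixed file set. A rough sketch (the scanner command and flag are placeholders for whatever product is being measured):

```python
# Time an on-demand scan over a fixed corpus and report throughput.
# "scanner-cli --scan" is a placeholder command line, not a real product's.
import pathlib
import subprocess
import time

files = [p for p in pathlib.Path("testset").rglob("*") if p.is_file()]
start = time.perf_counter()
subprocess.run(["scanner-cli", "--scan", "testset"], check=False)
elapsed = time.perf_counter() - start
print(f"scanned {len(files)} files in {elapsed:.1f} s "
      f"({len(files) / elapsed:.1f} files/s)")
```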

thanks
Melih