Signature Updates: New Threats vs False Positives

Every time the signature database is updated, it may:

  1. Add detection of one or more new threats;
  2. Fix detection of one or more older threats that were not previously detected;
  3. Correct previous False Positives.

Just as a matter of interest, I am wondering about the average number of each type of “fix” per week
(not the number of signature database updates),
and I am especially interested in the ratio between them.

If 100 threats are dealt with over a period of time:
What is the percentage success rate (a failure being a fix that has to be attempted again)?
How many new False Positives will those 100 fixes produce?
(See the sketch below for the sort of arithmetic I have in mind.)
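Purely to illustrate that arithmetic, here is a minimal Python sketch. Every count and rate in it is a made-up placeholder, not real vendor data.

```python
# Hypothetical weekly counts for the three kinds of "fix" (placeholders only).
weekly_fixes = {
    "new_threats": 120,          # detections added for newly seen threats
    "older_threats": 30,         # detections fixed for previously missed older threats
    "false_positive_fixes": 10,  # corrections of earlier False Positives
}

total = sum(weekly_fixes.values())

# Share of each fix type, i.e. the "ratio" being asked about.
for kind, count in weekly_fixes.items():
    print(f"{kind}: {count} per week ({count / total:.0%} of all fixes)")

# "If 100 threats are dealt with": success rate and new False Positives,
# again using invented figures purely for illustration.
threats_handled = 100
fixes_needing_retry = 5    # hypothetical: fixes that had to be attempted again
new_false_positives = 2    # hypothetical: False Positives introduced by those fixes

success_rate = (threats_handled - fixes_needing_retry) / threats_handled
print(f"Success rate: {success_rate:.0%}")
print(f"New False Positives per 100 fixes: {new_false_positives}")
```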

Firstly, I would like to know the general A.V. industry average number of fixes per week,
and secondly, it would be interesting to know how Comodo stacks up against the competition.

Regards
Alan

And you want that info for free? Hmm.

Yes please.

On the CCleaner user forum today there was a complaint that the download from FileHippo was infected,
but the same file from Piriform was not.

The consensus was that his A.V. signatures caused a False Positive when he downloaded from FileHippo,
but the False Positive had already been reported and fixed by an A.V. update before he downloaded the same file from Piriform.

He was NOT using Comodo for protection.

Problem solved.

It makes me wonder, however, what the percentage failure rate of a detection fix is in terms of:
still not detecting the threat on all hardware and software (O.S. and Applications);
False Positives.

e.g. is a fix perfect 999 times out of 1000,
or will 1000 fixes produce 100 False Positives to be dealt with as and when they are reported?
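To put numbers on those two extremes, a tiny sketch; the 1-in-1000 and 100-in-1000 figures are just the hypothetical cases from the question, not measurements.

```python
def fp_rate(false_positives: int, fixes: int) -> float:
    """False Positives introduced per fix, as a fraction."""
    return false_positives / fixes

best_case = fp_rate(1, 1000)     # "perfect 999 times out of 1000"
worst_case = fp_rate(100, 1000)  # "1000 fixes produce 100 False Positives"

print(f"Best case:  {best_case:.1%} of fixes introduce a False Positive")
print(f"Worst case: {worst_case:.1%} of fixes introduce a False Positive")
```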

I am mostly interested in “industry averages”,
but a ranking list comparing individual vendors would not be ignored!

Alan

Good questions. :-TU