Video: a large problem with suggested solution (old but persistent problem)

I am using Comodo 5.10, but I have seen this issue in many other versions, so there is no hope of it getting fixed unless someone does something about it.

I have a 2 core CPU.
I have 450+ programs in my Defense+ list.
Whenever I add a new rule to Defense+, it takes ~50 seconds and burns roughly 125,000,000,000 CPU cycles to do so.

Here is a video:

0:00 - 0:12 computer doing nothing
0:12 - 1:11 computer adding a rule
1:11 - 1:18 computer allowing a program without adding a rule, which it handles just fine
1:18 - 2:22 adding a rule again
It is worth the download as it has quite a bit of information in it.

If processing rules for 450 programs is a mathematically intractable problem, THEN THAT’S FINE! But I don’t think it is. ;D

If the process of adding a rule inherently takes a long time, then you could implement the following technique. Keep rules in two places:

  1. Add each new rule to a simple list that is slow to check against but never holds too many entries.
  2. Check new programs against both the main rule data structure and the small list.
  3. Every 10 minutes, spend half an hour at 5% CPU recompiling your complicated main rule data structure, folding in the rules from the small list. :wink:
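The idea above can be sketched roughly like this (a hypothetical illustration, not Comodo’s actual internals; all names are made up). New rules land in a cheap pending list, lookups consult both stores, and a periodic low-priority merge folds the pending rules into the optimized main structure:

```python
class RuleStore:
    def __init__(self):
        self.main = set()    # large, optimized structure (modeled here as a set)
        self.pending = []    # small list: slow to scan, but O(1) to append to

    def add_rule(self, rule):
        # Fast path: just append; no expensive rebuild of the main structure.
        self.pending.append(rule)

    def matches(self, program):
        # Check both the main structure and the small pending list.
        return program in self.main or program in self.pending

    def merge_pending(self):
        # Slow path: run periodically (e.g. every 10 minutes) at low CPU
        # priority, folding pending rules into the main structure.
        self.main.update(self.pending)
        self.pending.clear()
```

The point is that `add_rule` stays instant for the user, while the expensive rebuild happens in the background on its own schedule.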

I know your current method of handling rules has stayed the same for many years, at least since 3.9, but that just means the method hasn’t been fixed yet.

Glad I could help and happy to do so again! Thanks for making such an awesome product that beats everything I’ve ever used! (save this one (massive) flaw)

Could you upload the video to a video-hosting site?
I naturally hesitate to download from filehosters for a “did you see that?”
It’s a reflex :smiley:

No, doing it every 10 minutes/at fixed intervals sounds bad. And 5% CPU is relative :wink:

Try erasing the lists of Unrecognised Files and Submitted Files. See if that helps.

Removing 20% of entries makes it 20% faster than really slow. It didn’t help.

Description of video above. - YouTube

I was making a suggestion to spread the recreation of the large data structure over a longer period of time. If Comodo accepts new rules quickly, it will still need to do its slow work eventually; the 5% CPU merely allows it to get the job done… eventually.

I am unsure of how the Defense+ rules are managed, but in my experience using a DOM parser and writer for large data structures is slow. It’s great for keeping things organized and it simplifies operations, but it stops being useful after a certain size. No idea about any details regarding this or whether it’s even relevant.
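To illustrate what I mean (purely hypothetical; I have no idea how Comodo actually stores its rules), here is the DOM pattern that gets slow: to add one rule you parse the entire file and then rewrite every existing rule back out, so adding rules one at a time costs O(n²) total work, while batching amortizes the parse/write:

```python
import xml.etree.ElementTree as ET

def add_rule_dom(path, rule_name):
    # Parse the entire file just to append one element...
    tree = ET.parse(path)
    ET.SubElement(tree.getroot(), "rule", name=rule_name)
    tree.write(path)  # ...then rewrite every existing rule too.

def add_rules_batched(path, rule_names):
    # Parsing and writing once for a whole batch amortizes the cost.
    tree = ET.parse(path)
    root = tree.getroot()
    for name in rule_names:
        ET.SubElement(root, "rule", name=name)
    tree.write(path)
```

With 450+ rules in the file, the one-at-a-time version re-serializes all of them on every single add, which would fit the symptoms in the video.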