Cloud computing for CIS

I’ve thought about this program a lot, and after using it for some time I think a lot could be improved.

After a while of use there is a buildup of files in the proactive security section under “files waiting for check”. We are asked to review them, send each file in, and so on.

This could be handled with a modern cloud check, like the new Microsoft anti-virus tool. When it finds a file it does not recognize in its virus list, it checks it by sending it to Microsoft’s cloud, where the file is checked to see whether it is safe. The verdict is sent back within three to five seconds and the program takes care of the rest. No human intervention is needed.
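A minimal sketch of how such a cloud check might work, assuming the client sends only a hash of the unknown file rather than the whole file (which is how a verdict can come back in seconds). The cloud service is simulated here by a local dictionary; `CLOUD_VERDICTS` and `check_file` are hypothetical names, not Microsoft’s actual API:

```python
import hashlib

# Hypothetical cloud verdict database, stood in for here by a local dict
# keyed on SHA-256 hashes. A real client would query a web service instead.
CLOUD_VERDICTS = {
    hashlib.sha256(b"known good installer").hexdigest(): "safe",
    hashlib.sha256(b"known bad payload").hexdigest(): "malicious",
}

def check_file(contents: bytes) -> str:
    """Return 'safe', 'malicious', or 'unknown' for a file's contents.

    Only the fixed-size hash would leave the machine, so the lookup
    is a small, fast round trip regardless of file size.
    """
    digest = hashlib.sha256(contents).hexdigest()
    return CLOUD_VERDICTS.get(digest, "unknown")
```

The point of the sketch is that the cloud side only answers a hash lookup against a known-files list, which is cheap; no analysis of the file itself happens during the query.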

There are too many links to click and actions to be taken. This, in my humble opinion, could be cut by 75% so it becomes understandable for kids and older folks too.

How can a file be determined safe within a couple of seconds? Not possible IMO…

The problem with the cloud concept (and it is indeed a big problem) with regard to an AV solution is that you are dependent on Internet access. There is a lot of malware that kills net access. What do you do then? 88)

Even if your access is active, what if the servers hosting the info are down? I’m not all that enthused about this “cloud” stuff.

I’m not much enthused about it either; it just doesn’t make sense for a process as critical as a security application. But you make a good point. Even the goliath Google can’t keep their (paid) cloud systems up as reliably as they promised.

I think you all need to step out of the small Comodo thinking box and into the future for a second. Oh wait, it’s already working for M$, so it can be done.

1 - The AV gets its normal daily virus updates (pushed); no cloud needed for that.
2 - Only unknown files/signatures need to be checked by the cloud.
3 - When there is no Internet, the normal AV still does its job, but it can’t get updates either and so becomes unsafe as well.
4 - If malware could kill the Internet, the AV is not good enough (same as point 3).
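The four points above boil down to a simple decision flow. Here is a hedged sketch of that flow (the function and parameter names are hypothetical, not Comodo’s or Microsoft’s code):

```python
def scan(file_hash: str, local_signatures: set, internet_up: bool,
         cloud_lookup) -> str:
    """Decision flow for the four points above (a sketch, not real AV code).

    - Known-bad hashes come from the normal pushed daily updates (point 1).
    - Only hashes the local list doesn't know go to the cloud (point 2).
    - With no Internet, we fall back to the local list alone (point 3).
    """
    if file_hash in local_signatures:
        return "malicious"            # caught by the pushed daily updates
    if not internet_up:
        return "unknown"              # offline: local AV still runs, no cloud
    return cloud_lookup(file_hash)    # ask the cloud about the unknown file
```

Example: `scan("abc123", {"deadbeef"}, True, lambda h: "safe")` returns `"safe"`, because the hash is not in the local list and the cloud lookup answers for it.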

It’s working for MS as long as the servers stay up. Also, I am using MSE on the other machine here and it has problems with its auto updates. Sometimes they work and sometimes you go days without being updated. The problem is all over their forums and occurs across all versions of Windows. The manual updater works, but you shouldn’t have to rely on that for a real-time AV product.

The default period for updates is 24 hours, but if the machine is not on at that time, MSE will not do a catch-up check when the machine is started. It’s supposed to do that, but in most cases it doesn’t. So MSE is a poor example to use.

With the way it is currently working, MS says that if you receive something that has no signature but is detected as suspicious by MSE’s heuristics, then it will check the cloud to see if the file has been reported as bad and will download available updates. To me, this is not acceptable when the auto updater is not working as intended; it seems like a coverup for a flawed update system. They also need to set it to check for updates more frequently than every 24 hours.
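The catch-up behavior described above is simple to state precisely: at startup, if more time has passed since the last update than the scheduled interval, update immediately. A minimal sketch (the 24-hour default is from the post; the function name is hypothetical):

```python
from datetime import datetime, timedelta

UPDATE_INTERVAL = timedelta(hours=24)  # the default period described above

def needs_catchup(last_update: datetime, now: datetime) -> bool:
    """The catch-up check a scheduler should do at startup: if the
    machine was off when the scheduled update came due, the elapsed
    time exceeds the interval and an immediate update is needed."""
    return now - last_update >= UPDATE_INTERVAL
```

If this check ran on every boot, a machine that was off at the scheduled time would still never go more than one interval past due.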

Thanks for this explanation. So the M$ program has its problems and maybe does not work correctly yet. I’m not saying the M$ AV is good or better; Comodo has its problems too. What I was trying to show is that the technique could be implemented and improved into a well-working part of Comodo. This could move a lot of decision making from the user back to the program.

Using the cloud in a security program could cause problems, in my opinion. Its only advantage would be more frequent access to updates than the programmed update checks would provide. If the security app used what Norton calls pulse updates, which send out incremental updates every 10 minutes, it’s hard to see how a cloud feature could be any better. Actually, the Threatcast feature of CIS is kind of like a cloud anyway, but it doesn’t always work.
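The incremental-update idea mentioned above can be sketched roughly: the client reports its current signature version and receives only the deltas published since then, instead of a full database download. This is a hypothetical illustration of the general technique, not Norton’s actual protocol:

```python
def pulse_update(client_version: int, server_deltas: dict) -> tuple:
    """Incremental ('pulse'-style) update sketch.

    server_deltas maps a version number to the list of signatures
    added in that version (a stand-in for the vendor's feed).
    Returns the new version and only the signatures the client lacks.
    """
    new_sigs = []
    latest = client_version
    for version in sorted(server_deltas):
        if version > client_version:
            new_sigs.extend(server_deltas[version])
            latest = version
    return latest, new_sigs
```

Because each pull transfers only what changed, it can run every few minutes without the bandwidth cost of repeated full downloads.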

I also am not impressed by “cloud” systems, I would never trust them.

I think one day the AVs will have to be in the cloud… signatures will continue to grow, and keeping them on the user’s PC won’t be practical.

I’m not convinced either way which is right or wrong, but is cloud really that bad ???
Worried about reliability? Well, of course they would have multiple servers. I doubt that’s an issue at all.

People also mention servers getting “hacked”. Seriously, what’s the chance there would be any success? Though if that 0.0001% chance does occur… that’s why you have another layer to cover it.

I was reading more about Panda today; it also takes note of safe files. What a great way of expanding a whitelist. I think I read they are at 81.050.002 known files.

In the future, compare a HIPS with a giiiiiiaaaannnnnttttt whitelist vs an AV blacklist. How’s that for usability and protection against unknowns? :smiley:
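The whitelist-first idea contrasts with the blacklist model in one line of logic: instead of running anything not known-bad, you contain anything not known-good. A hedged sketch (names are hypothetical, not CIS code):

```python
def hips_decide(file_hash: str, whitelist: set, blacklist: set) -> str:
    """Whitelist-first HIPS decision: anything not explicitly known-good
    is contained, instead of the blacklist model where anything not
    known-bad runs freely."""
    if file_hash in whitelist:
        return "allow"       # known-safe file, run normally
    if file_hash in blacklist:
        return "block"       # known malware, stop it
    return "restrict"        # unknown: sandbox or ask, don't trust it
```

The safety of the model comes from the default: an unknown file falls into `"restrict"` rather than being allowed to run.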

That might be perfectly fine, but I was really going off topic: relying on something like Google Documents not to lose your data is a bad idea…

Why?

Yeah, I’m not worried about Google losing anything. They’re experts at keeping everything they get their hands on. However, as I stated earlier, even Google with all of its massive server farms can’t keep the uptime that they promised users. In other words, your data is there, you just can’t access it…

If M$ and Google can’t pull off the cloud concept yet, what hope does anyone else have? They are pretty much the only companies with as good as unlimited resources to throw at the problem. Cloud just isn’t reliable yet.

Hey Heffed, what Google service exactly are you talking about? I’ve never ever had any experience where google.com or its Gmail has been affected or down, never.
++ What Microsoft cloud services are you talking about?

There have been several fairly well publicized outages since Google started charging for some of their services. I don’t use those services; I’ve only read about them. I haven’t had any problems accessing Gmail either, but I’ve read articles within the last couple of months about roving Gmail outages. You can do searches on “Gmail outage” and “Google outage” if you want to read pages and pages of articles.

M$ has a cloud component in their security essentials package. If it finds something suspicious that it doesn’t have a signature for, it checks the cloud.

You can’t be serious. There are still ways Google could lose your data: natural disaster, war, really bad timing for a large-scale hardware failure, etc. Pretty improbable, but not impossible.

This is why there is a thing called “backups”, for when the unforeseen happens.

Even if the problem were just as simple as downtime, what if you needed that document for a critical reason on that day?

It’s just not sensible to trust your data to one location.

Oh, but I am serious. In all the scenarios you mention, I figure my data is safer mirrored around the globe than any backup I can make. Chances are pretty good that any backup private users make is going to be stored in their house. If the house burns down or something of that nature, my backups are going to be pretty far down the list of things to immediately save, and I’d be willing to bet the same for you. Even if, say, I take the backup off site, perhaps to a safety deposit box in my bank’s vault, in the case of a natural disaster my bank will likely be affected as well.

Yup. That is exactly the reason I say cloud isn’t there yet.

Again, you are correct. That’s why having Google mirror data around the world isn’t such a bad storage option.

If it sounds like I’m contradicting myself, I’m really not. I don’t feel anyone has the infrastructure in place to serve timing-critical data 100% of the time. With a cloud-based AV, this is a crucial issue!

However, for non-timing critical data, a cloud setup like Google is likely a safer way to store data than any backup solution I could come up with. Google is pretty quiet about just how many data centers they have and their location, but I believe 6 in the US alone are known as definite. I’d feel pretty good if I had my backups stored in 6 different locations in a single country, let alone who knows how many across the globe…

It seems that the update problem with MSE has been fixed. My other machine is getting updates now, sometimes twice a day. Unless it was my editing of the update interval in the registry that fixed it, they have done something.