FTP connection times out before backup is created [CLOSED]

I searched the forum for similar topics and found one that says a ticket has been added. I am wondering what the chances are of getting this fixed anytime soon?

My backup is rather large, and the FTP connection times out before the backup has finished being created. So when Comodo attempts to upload the backup, it just sits at that step, but can't finish because it has no connection.

After editing the timeout on the server, I also noticed that my backup fails to upload anything. After further digging on the forum I saw mention of a 2 GB limit on zip archives. Is there a workaround for this as well, aside from not zipping?

Also, since my backup is over 2 GB, is it possible to have the software report this as some sort of failure?

I love the product; the GUI design is well thought out. Hopefully we can get this fixed soon.


OK, so I'm hoping to get some feedback here, as I like the idea of this product, but honestly it's really failing to perform at this point.

I have 7.4 GB of data to back up to a remote server via FTP. When attempting an incremental backup, the application would hang on file transfers about 10 to 20 minutes into the process, with no errors in the log, nothing.

I then decided to FTP the files myself using FileZilla. This worked fine, and I now have a full archive set on the server, though the timestamps on the files were not preserved.

When I attempt to run a backup, it seems to stop processing (the backups in the queue go to 0, and the FTP connection times out) on the step "Deleting old files in folder".

I have yet to get the application to work on anything but very small transfers. This seems like a wonderful product, and there is certainly a need for it, but with these current issues the application is failing to meet my requirements. (:AGY)

Hi bruceburge

Welcome to Comodo Backup

As far as I know there is no workaround to the 2 GB limit for a zip file. Given the size of backups now, I think this would be a good item to suggest in the wishlist (again). It has already been suggested, but it wouldn't hurt to suggest it again. We have not had an update since May, so I hope we can get one soon.


Thanks for the reply, but honestly, out of all the information mentioned in my two posts, the zip limit is a moot point if the software fails to get the data to the server at all.

Do you have any insight into the hang on the "deleting old folders" step? Is there some method of configuring the application so that it gives a more detailed report of the errors it is having?

Is there a pro version of this application that would offer non-community-based support?

The "deleting old folders" hang is a new issue that I have not run up against before. You can get a more detailed report of the backup from the log file at Program Files → Comodo → Backup → CmdBkpSvc.log.

Let us know if you find anything there that can help.

There is no pro version; this is it, and the version you have is the latest.



10/16/2008 at 3:00:00 AM | Backup "datasrv_backup_incro" started.
10/16/2008 at 3:00:00 AM | Reg Key Backup\datasrv_backup_incro
10/16/2008 at 3:00:00 AM | Resume started
10/16/2008 at 3:00:00 AM | Calculating backup size...
10/16/2008 at 3:00:05 AM | Backup size - 5741,142 KB.
10/16/2008 at 3:00:05 AM | Connecting to FTP-server...
10/16/2008 at 3:00:06 AM | Connected
10/16/2008 at 3:00:06 AM | Copy E:.
10/16/2008 at 3:00:07 AM | Copying of folder E:

Here is an example of the log. The times don't match; I had the log open when I did the last two attempts to run, so I'm guessing it was still able to write, but what I have posted is consistent with the issue. The log never mentions "deleting old folders". I am running as administrator, and the service was set up with the same account; a test backup works with no errors reported.

I am running Comodo Backup on Windows Server 2003 with the latest service packs, using the latest version of the application.

I see the size of the backup is different, as well as the timestamp. Your top screenshot looks like a successful backup, as it took almost 2 minutes. The second shot shows only a 7-second run, so it was probably not a successful backup.

This has me puzzled as well; maybe we can get some more Backup users to help us out on this one.


The screenshot is of two attempts to run the backup; both stop at "deleting old files in folders", which is never even written to the log. The size difference is likely due to users on the system removing files.

I have yet to get a successful backup outside of a test using a few selected directories. I'm wondering if the size and number of files is overloading the app.

Have you ever split your backup into multiple backups of, say, around 2 GB each? I'm thinking I could do multiple zip archives. What are your thoughts on this as an alternative?

Sadly, attempting to split the zip file up resulted in hangs and inconsistent behavior. I have given this application a fair shake, but it has failed to consistently perform the backups in a usable fashion.

I have instead replaced the application with WinRAR, WinSCP, and a batch file.
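For anyone else hitting the same walls, a batch file like the one described above might look roughly like this. This is only a sketch: the paths, hostname, credentials, and remote directory are placeholders, and the exact switches should be checked against the WinRAR and WinSCP versions you have installed.

```bat
@echo off
rem Archive E:\data into RAR volumes of 2000 MB each, staying
rem under the 2 GB limit (paths and names are placeholders).
"C:\Program Files\WinRAR\rar.exe" a -r -v2000m C:\backups\datasrv.rar E:\data

rem Upload all volumes over FTP using WinSCP's scripting interface.
rem Replace user, password, host, and remote path with real values.
"C:\Program Files\WinSCP\winscp.com" /command ^
    "open ftp://user:password@ftp.example.com/" ^
    "put C:\backups\datasrv.part*.rar /backups/" ^
    "exit"
```

Splitting at the archiver level like this sidesteps both the 2 GB zip limit and the long single-file transfers that seem to trigger the FTP timeout.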