How best to update remote compressed, encrypted archives incrementally?

robert no-spam at no-spam-no-spam.com
Fri Mar 10 09:13:07 EST 2006


Hello,

I want to push changed/new files from a big file tree incrementally to a 
remote backup server via FTP, SFTP or DAV, directly compressed and 
encrypted with just a password. Ideally this would be a self-contained 
algorithm inside Python, without extra shell tools.
(The method should work with any protocol that allows reading, writing 
and seeking in a remote file.)
Unencrypted data should never appear on the server or on the wire.

Usually one would create a big archive, compress it, encrypt it (e.g. 
with gpg -c file) and then transfer it. However, that method needs a lot 
of free temporary disk space and, worst of all, always transfers the 
complete archive.
With a proven whole-file encryption tool like GPG I don't get the 
flexibility needed for my task, I guess?
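
(To make clear what I mean, roughly; host, paths and passphrase are just 
placeholders, and gpg is of course an external tool:)

    import tarfile, subprocess, ftplib

    # 1. one big compressed archive on local disk (needs lots of temp space)
    tar = tarfile.open('/tmp/backup.tar.gz', 'w:gz')
    tar.add('/data/tree')                        # placeholder source tree
    tar.close()

    # 2. symmetric, password-only encryption via the external gpg tool
    #    (newer gpg 2.x versions may also need --pinentry-mode loopback)
    p = subprocess.Popen(['gpg', '--batch', '--passphrase-fd', '0', '-c',
                          '-o', '/tmp/backup.tar.gz.gpg',
                          '/tmp/backup.tar.gz'],
                         stdin=subprocess.PIPE)
    p.communicate(b'secret-passphrase\n')        # placeholder passphrase

    # 3. transfer the *complete* archive every time
    ftp = ftplib.FTP('backup.example.com', 'user', 'password')  # placeholders
    f = open('/tmp/backup.tar.gz.gpg', 'rb')
    ftp.storbinary('STOR backup.tar.gz.gpg', f)
    f.close()
    ftp.quit()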

The ZIP 2.0 format allows encryption (is that ZIP encryption method 
supported in Python at all?). It would also be possible to navigate 
within a remote ZIP (e.g. over FTP). But the classic ZIP encryption is 
known to be very weak and can be cracked within a few hours of computing 
time, at least when every file uses the same password.
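
(As far as I can tell, the standard zipfile module at best reads such 
archives but cannot create them; a quick check, assuming a Python new 
enough to have setpassword():)

    import zipfile

    # Reading a ZIP 2.0 ("ZipCrypto") protected archive works ...
    zf = zipfile.ZipFile('backup.zip')       # placeholder local test archive
    zf.setpassword(b'secret')                # placeholder password
    print(zf.read('some/file.txt'))          # placeholder member name

    # ... but writing encrypted members is not supported by zipfile at all,
    # so even the weak ZIP encryption could not be produced from pure Python
    # without reimplementing the algorithm by hand.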

Another method would be to produce slice files: create incremental 
TAR/ZIP archives, encrypt them locally with "gpg -c" and upload each as a 
separate file. That is still a fragile setup: it allows only coarse 
control, needs a common archive timestamp (comparing individual file 
attributes is not possible), and relies on external tools.
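
(The slice selection itself would be simple enough in Python; roughly, 
with placeholder paths, and encryption/upload as in the gpg/ftplib sketch 
above:)

    import os, time, tarfile

    SRC = '/data/tree'                       # placeholder source tree
    STAMP = '/var/backup/last_run'           # placeholder timestamp file

    if os.path.exists(STAMP):
        since = os.path.getmtime(STAMP)      # time of the previous slice
    else:
        since = 0                            # first run: take everything

    slice_name = 'slice-%d.tar.gz' % int(time.time())
    tar = tarfile.open(slice_name, 'w:gz')
    for root, dirs, files in os.walk(SRC):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) > since:   # the whole "incremental" logic
                tar.add(path)
    tar.close()

    # then encrypt slice_name with gpg and upload it as a separate file,
    # and touch STAMP for the next run
    open(STAMP, 'w').close()

The single common timestamp is the only thing deciding what goes into a 
slice, which is exactly the kind of rough control I would like to avoid.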

Very nice would be a method which can directly compare against, and 
update in place, a single consistent remote file like
ftp://..../archive.zip.gpg
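
(For FTP at least, reading from an offset and appending seem possible 
with ftplib, so the transport building blocks might look like this; host, 
file names and the offset are just placeholders:)

    import ftplib

    ftp = ftplib.FTP('backup.example.com', 'user', 'password')  # placeholders

    # "seek + read": fetch the remote archive from a known offset onwards,
    # e.g. to re-read an index block stored near the end of the archive
    chunks = []
    ftp.retrbinary('RETR archive.zip.gpg', chunks.append, rest=12345)
    tail = b''.join(chunks)

    # "write": append a new encrypted chunk to the same single remote file
    f = open('new-chunk.gpg', 'rb')          # placeholder, encrypted locally
    ftp.storbinary('APPE archive.zip.gpg', f)
    f.close()

    ftp.quit()

The open question is the archive/encryption format on top of that: it 
would have to tolerate appends and partial reads, which a plain "gpg -c" 
stream presumably does not.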

Is something like this possible?

Robert


