[Borgbackup] how to back up 300 million files

Marian Beermann public at enkore.de
Thu May 4 06:26:36 EDT 2017


As far as I can see, the information there is correct and complete.
MAX_OBJECT_SIZE is an internal constant, not a run-time parameter.

> ... limited in size to MAX_OBJECT_SIZE (20MiB).
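To illustrate what "internal constant" means here: the limit is baked into borg's source rather than exposed as an option. Below is a minimal sketch of that kind of size check; the function name and error message are illustrative, not borg's actual code path.

```python
# Illustrative sketch only -- not borg's actual implementation.
# borg defines MAX_OBJECT_SIZE as a constant in its source (20 MiB);
# it cannot be changed via command-line options or the config file.

MAX_OBJECT_SIZE = 20 * 1024 * 1024  # 20 MiB, matching the documented limit

def check_object(data: bytes) -> None:
    """Reject any single repository object larger than MAX_OBJECT_SIZE."""
    if len(data) > MAX_OBJECT_SIZE:
        raise ValueError(
            f"object of {len(data)} bytes exceeds MAX_OBJECT_SIZE "
            f"({MAX_OBJECT_SIZE} bytes)"
        )

check_object(b"x" * 1024)  # a small object passes the check
```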

Regarding 1.1.x beta compatibility with stable releases: there is no
intent to break it. Doing so would make the betas pointless, since no
one would test such an unstable release.
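For the remote setup asked about below (backing up to a server that runs "borg serve" against its local disk), a minimal command/config sketch. The host name, user, repository path, and SSH key are placeholders; the forced-command line is one common way to restrict what the client may do on the server.

```shell
# On the backup server: restrict the client's SSH key to "borg serve",
# confined to one repository path (line in ~/.ssh/authorized_keys):
# command="borg serve --restrict-to-path /srv/borg/repo",restrict ssh-ed25519 AAAA... client-key

# On the client: initialize the repository once, then create archives.
borg init --encryption=repokey ssh://backup@server.example/srv/borg/repo
borg create --stats --compression lz4 \
    ssh://backup@server.example/srv/borg/repo::files-{now:%Y-%m-%d} \
    /data/files
```

This keeps all repository I/O on the server's local disk, with only the borg protocol going over SSH.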

Cheers, Marian

On 04.05.2017 12:20, Zsolt Ero wrote:
> Also, is this page still not updated to reflect the 1.1.0 changes?
> http://borgbackup.readthedocs.io/en/latest/internals/data-structures.html#note-about-archive-limitations
> 
> Is MAX_OBJECT_SIZE a constant, or can it be set using run-time parameters?
> 
> On 4 May 2017 at 11:42, Zsolt Ero <zsolt.ero at gmail.com> wrote:
>> Hi,
>>
>> I am trying to solve the problem of backing up 300 million files,
>> preferably with borg. The files are small, totalling only about
>> 100 GB altogether (12 kb on average).
>>
>> Is it possible to use borg to handle such a task? The documentation says
>> it is quite limited in the number of files. I am looking for a
>> solution which doesn't need to be changed even if the files grow to
>> 10x the size of this.
>>
>> Preferably I'd like to back up to a remote server that runs "borg
>> serve" against its local disk.
>>
>> I am OK with using the 1.1.0 beta as long as the archives stay
>> compatible with the final 1.1.0 version. Is that the case?
>>
>> Zsolt
> _______________________________________________
> Borgbackup mailing list
> Borgbackup at python.org
> https://mail.python.org/mailman/listinfo/borgbackup
> 