[Borgbackup] how to back up 300 million files

Mario Emmenlauer mario at emmenlauer.de
Thu May 4 06:59:02 EDT 2017


On 04.05.2017 12:47, Maurice Libes wrote:
> On 04/05/2017 at 12:26, Marian Beermann wrote:
>> As far as I can see the information there is correct and complete.
>> MAX_OBJECT_SIZE is an internal constant
>>
>>> ... limited in size to MAX_OBJECT_SIZE (20MiB).
>> Regarding 1.1.x beta compatibility with stable releases: there is no
>> intent to break it. Doing so would make them pointless, since no one
>> would test such an unstable release.
>>
>> Cheers, Marian
>>
>> On 04.05.2017 12:20, Zsolt Ero wrote:
>>> Also, is this page still not updated to reflect the 1.1.0 changes?
>>> http://borgbackup.readthedocs.io/en/latest/internals/data-structures.html#note-about-archive-limitations
>>>
>>> Is MAX_OBJECT_SIZE a constant, or can it be set via runtime parameters?
>>>
>>> On 4 May 2017 at 11:42, Zsolt Ero <zsolt.ero at gmail.com> wrote:
>>>> Hi,
>>>>
>>>> I am trying to solve the problem of backing up 300 million files,
>>>> preferably with borg. The files are small, totalling only 100 GB all
>>>> together (12 KB on average).
> Another answer/question, from a neophyte's point of view:
> 
> Is borg an appropriate solution in this case of very small files (12 KB), since borg will never split such files into chunks?
> So don't we lose the benefit of deduplication?
> Or am I wrong?
> I don't remember what the size limit is for a file to be split into chunks.

If some of the files are identical, they will still be de-duplicated.
But I agree that this is not the standard use case for borg.
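To check how much file-level deduplication would actually buy, one could hash every file and count repeats. A minimal sketch (not part of borg; the function name and the SHA-256 choice are just illustrative assumptions):

```python
import hashlib
from collections import Counter
from pathlib import Path

def duplicate_stats(root):
    """Return (total_files, redundant_files) under `root`,
    where files with identical content count as duplicates."""
    counts = Counter()
    for p in Path(root).rglob("*"):
        if p.is_file():
            # Files are small (~12 KB), so reading them whole is fine here.
            counts[hashlib.sha256(p.read_bytes()).hexdigest()] += 1
    total = sum(counts.values())
    redundant = sum(n - 1 for n in counts.values())
    return total, redundant
```

If `redundant` comes out near zero, deduplication gains little on such a collection.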

Zsolt, do you have many duplicate files in your collection? If not, do
the files change often? Have you considered a simpler backup solution
like rsync with hard links?

Just my two cents.

Cheers,

    Mario


