On Fri, Jun 27, 2025 at 10:41:05PM -0400, Reese Johnson wrote:
> Backup into got. :)

I realize you might be joking, but to anyone taking this seriously: don't do
this!
The most well-known caveat is that this is not a suitable solution for large
files.
But in any case you will eventually hit limits where it stops working.

I ran an experiment once where I committed all my archived email and newly
arriving messages into a (private) git repository every 5 minutes, just to
see what would happen.
(Email was simply the most convenient source of plenty of text data
available to me.)
This repo ended up being about 600GB in size, at which point I stopped the
experiment.
8 GB of the objects were already packed, but with thousands of loose objects
created via regular commits it became impossible to repack the repository
due to memory constraints.
On a box with 16GB of RAM I ended up with 592GB of loose objects which could
not be compressed, neither by git gc nor by gotadmin cleanup.
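For anyone wondering whether their own repository is heading the same way,
git can report loose vs. packed object counts, and the memory used for delta
search during repacking can be capped. A sketch (the limit values below are
illustrative, and as the experiment above shows, such caps may not be enough
once the loose-object count gets out of hand):

```shell
# Show the number and size of loose objects vs. packed objects.
# "count" / "size" are loose objects; "in-pack" / "size-pack" are packed.
git count-objects -v

# Cap the memory git may use per delta-search window during repacking,
# and restrict it to one thread. Values are illustrative only.
git config pack.windowMemory 256m
git config pack.threads 1

# Attempt a repack under those limits.
git gc
```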
