Hi there,

I've been using git for some time now, and I host my remote bare
repositories on my shared hosting account at Dreamhost.com.  As a
protective measure on their shared hosts, they enforce a policy
that kills processes which consume too much memory.  This sometimes
happens to git.

By "sometimes" I mean on large repos (>~500MB), when performing
operations like git gc and git fsck and, most annoyingly, when doing a
clone.  It seems to happen in the pack phase, but I can't be sure
exactly.

I've messed around with config options like pack.threads and
pack.packSizeLimit, and basically anything on the git-config manpage
that mentions memory.  I've limited all of these to 1, or 0, or 1m
where applicable, just to be sure.  To be honest, I really don't know
what I'm doing ;)
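
For reference, here is roughly what the relevant part of my
.git/config looks like after all that fiddling (the exact values are
guesses on my part, so treat this as a sketch of what I tried rather
than anything I know to be sensible):

    [pack]
            # single-threaded packing, and a tiny delta window/cache
            threads = 1
            windowMemory = 1m
            deltaCacheSize = 1m
            # split packs so no single pack file grows too large
            packSizeLimit = 100m
    [core]
            # limit how much of the pack files gets mapped at once
            packedGitLimit = 32m
            packedGitWindowSize = 32m
            deltaBaseCacheLimit = 1m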

Oddly enough, I'm now having trouble reproducing the issue with
anything but git fsck.  Clones were failing in the past, but after a
successful git gc, everything seems to be OK(?)

Anyway, I'd like some advice on which settings actually limit memory
usage, and how to estimate what the peak memory usage will be for
given values.

Thanks!