We have a hard limit of 5 GB in the service for a single push.

The advice we've given other customers is to do partial pushes by checking out 
an older commit, pushing that, and then checking out a newer commit, pushing, 
etc.  You have to push multiple times, but you can build up the entire history 
that way.
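
As a rough sketch of that approach (the branch name gitTest and the placeholder 
commit IDs <sha1>/<sha2> are only illustrative; pick intermediate commits so 
that each push stays under the 5 GB limit):

$ git checkout <sha1>                         # an older commit partway through the history
$ git push origin HEAD:refs/heads/gitTest     # publish history up to that commit
$ git checkout <sha2>                         # a newer intermediate commit
$ git push origin HEAD:refs/heads/gitTest
$ git checkout gitTest
$ git push origin gitTest                     # final push completes the branch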

This is due to a limit set by the TFS product.

-----Original Message-----
From: git-ow...@vger.kernel.org <git-ow...@vger.kernel.org> On Behalf Of Aram 
Maliachi (WIPRO LIMITED)
Sent: Friday, June 14, 2019 11:48 AM
To: git@vger.kernel.org
Cc: Kranz, Peter <kranz.peter....@siemens-healthineers.com>; Brettschneider, 
Marco <marco.brettschneider....@siemens-healthineers.com>
Subject: commit sized around 100 gb in changes failed to push to a TFS remote - 
Git

To @Git Community
From the perspective of an Azure DevOps support engineer: I have a customer 
who is unable to push, with the following error:

fatal: The remote end hung up unexpectedly
failed to push some refs to 
https://zelos.healthcare.siemens.com/tfs/Hoover/VA20A.DevInt.Gvfs/_git/Saturn

The local repository has only one change compared to the remote: a commit with 
SHA value 504aedfdbb on a branch called gitTest. That said, the scheme is as 
follows:

[Remote] - master
b946c27c

[Local] - gitTest branch
504aedfdbb
b946c27c


Important data:
- The commit 504aedfdbb contains over 100 GB of file changes
- The remote git repository is a TFS server
- Customer isn't building code - they are using the remote more or less as a 
storage service <- We understand this is not a best practice, but it is the way 
the customer is using Git and TFS. If the @Git Community could confirm/elaborate 
on this, the customer may change their current approach.

Things tried:
- reset the local repository's history back to the latest shared commit 
b946c27c and committed something small, which pushed successfully to a 
brand-new remote branch by running $ git push origin <name of local branch>
- cherry-picked the commit onto local master and attempted to push = failed. <- 
this makes me think the failure is caused entirely by the oversized commit
- increased the http.postBuffer configuration = failed. Rolled the configuration 
back to the default according to the MSFT docs (see the sketch after this list) 
https://docs.microsoft.com/en-us/azure/devops/repos/git/rpc-failures-http-postbuffer?view=azure-devops
- since this is a TFS server, I initially thought this could be caused by 
insufficient disk storage capacity on the server hosting the TFS product. 
But @Vimal Thiagaraj has confirmed that the repository size limit depends on 
the remote TFS databases and not the server itself. Is there a limit on these 
databases, or on how many changes a git commit can contain?
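
For reference, the postBuffer step above amounted to something like the 
following (the 500 MB value is only an example, not necessarily what the 
customer used):

$ git config http.postBuffer 524288000    # raise the HTTP POST buffer to ~500 MB
$ git push origin gitTest                 # still failed with the same error
$ git config --unset http.postBuffer      # roll back to the default, per the MSFT docs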

Things I've suggested to the customer:
- commit more frequently, in smaller batches (see the sketch below)
- understand that Git is meant for collaborating and tracking versions of 
files over time - it is not a cloud storage provider
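
A minimal sketch of the smaller-batches suggestion (the paths data/part1 and 
data/part2 are hypothetical; the idea is that no single push carries the full 
100 GB of new objects):

$ git add data/part1
$ git commit -m "Add first batch of files"
$ git push origin gitTest
$ git add data/part2
$ git commit -m "Add second batch of files"
$ git push origin gitTest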

I would appreciate any insight on this, @Git Community. Thanks to @Phillip 
Oakley, who took the time to answer the last time I posted a question to this 
mailing list.
