Since I started this thread, let me chime back in. I know the need for a VCS, but using one for binaries isn't, I think, a good fit for my setup. What I have done instead is keep a copy of every compiled COBOL and C program, with an extension matching the date it was taken out of service, plus a directory tree organized by version number. We are a small shop: one manager and three programmers. I don't count myself as a programmer, just the keeper of the "keys".
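A directory-per-version layout like the one described above makes rollback a one-symlink operation. Here is a minimal sketch of that idea; the paths, version numbers, and `rollback` helper are hypothetical (built in a temp directory for illustration), not the poster's actual tooling:

```python
# Sketch: keep each build in its own versioned directory and point a
# "current" symlink at the live one. All paths here are hypothetical.
import os
import tempfile

root = tempfile.mkdtemp()
for version in ("736", "737"):
    os.makedirs(os.path.join(root, "versions", version))

current = os.path.join(root, "current")
os.symlink(os.path.join("versions", "737"), current)  # 737 is live (and broken)

def rollback(root, version):
    """Atomically repoint 'current' at a known-good version directory."""
    cur = os.path.join(root, "current")
    tmp = cur + ".new"
    os.symlink(os.path.join("versions", version), tmp)
    os.replace(tmp, cur)  # rename() is atomic, so there is no broken window

rollback(root, "736")
print(os.readlink(current))  # versions/736
```

The atomic `os.replace` matters for the "back in minutes" requirement discussed later in the thread: at no point does `current` dangle or point at a half-installed version.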
I agree with Yves about pulling a ton of data when you pull the repository.

>>> Yves Dorfsman <y...@zioup.com> 05/06/10 9:16 PM >>>

On 10-05-06 01:25 PM, Brian Mathis wrote:
> Please don't start a "which VCS is better even though I know it's git" war.
> All we need to talk about are which options are out there, which I believe
> we have already done, and then allow the OP to make the right decision
> based on their own requirements.

I'm not getting into git vs. hg vs. ... but if the nature of the work means you store a lot of binaries, or large binaries, then I do believe distributed VCSs aren't a good fit. For the sake of full disclosure, I use hg (Mercurial) for my pet programming/scripting projects.

> On Thu, May 6, 2010 at 3:22 PM, Trey Harris <t...@lopsa.org> wrote:
>>
>> Why do you say to use old-fashioned tools for storing binaries? In my
>> experience, git does a fine job managing binaries. You can even set an
>> attribute to tell git what tool to use instead of diff to compare
>> revisions of binaries (if such a tool is available to dump the file into
>> text form).

Version control systems keep one (or a few) full versions of a file, plus the deltas needed to recreate every other version. But deltas are typically meaningless for binary files, and binary files tend to be orders of magnitude bigger than text files. Say you have a 20 KB text file with 100 changes of 10 lines each (at ~80 bytes per line). The size of this file's history in the repository is roughly 20,000 + 100 × 10 × 80 ≈ 100 KB. Now take a 30 MB binary file with 10 revisions: since each revision is stored more or less in full, its history occupies roughly 300 MB in the repository (MB vs. MiB is irrelevant here).
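The arithmetic above can be sketched as a rough model. This is not how any real VCS actually stores data (real systems compress and delta-encode), and it assumes each of the 100 changes adds 10 lines of ~80 bytes, so treat the results as loose upper bounds:

```python
# Rough repository-size model: full base copy plus per-change deltas for
# text, versus a full copy per revision for binaries.

def text_history_size(base_bytes, changes, lines_per_change, bytes_per_line=80):
    """Text file: one full copy plus a small delta per change."""
    return base_bytes + changes * lines_per_change * bytes_per_line

def binary_history_size(file_bytes, revisions):
    """Binary file: deltas rarely help, so assume a full copy per revision."""
    return file_bytes * revisions

text_kb = text_history_size(20_000, changes=100, lines_per_change=10) / 1_000
binary_mb = binary_history_size(30_000_000, revisions=10) / 1_000_000

print(f"text file history:   ~{text_kb:.0f} KB")    # ~100 KB
print(f"binary file history: ~{binary_mb:.0f} MB")  # ~300 MB
```

The point of the model is the ratio, not the exact numbers: the binary's history is three orders of magnitude larger, and with a distributed VCS every clone carries all of it.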
Checking out these two files from a centralised VCS means pulling 20 KB + 30 MB ≈ 30 MB. Checking them out with a distributed VCS, which essentially copies the entire repository, means pulling 300 MB. Also, if the binaries change a lot in a particular branch, you will keep pulling all of those changed binaries every time you sync up, even if you never work on that branch.

I know of companies that use a lot of large graphic files as part of their development (computer games, for example), and of companies that keep 10 versions of their executables for some applications. The reasoning is that if a new version is broken, they need to switch back to the previous version in a matter of minutes. They don't have time to rebuild the system, or even to ask for backups to be restored; they want to be able to point at version 736, the last one that worked perfectly, and have it installed again in minutes.

-- 
Yves.
http://www.SollerS.ca/
xmpp:y...@zioup.com
_______________________________________________
Discuss mailing list
Discuss@lopsa.org
http://lopsa.org/cgi-bin/mailman/listinfo/discuss
This list provided by the League of Professional System Administrators
http://lopsa.org/