I'm also interested in hearing opinions on this. We do the latter (many repos), and have over 250 of them. This causes some difficulties, especially because GitHub's search *sucks*[1].
Onboarding: People need to *find out about* the repos they need, have permission to access them, etc.

Reading code: If you're looking at a part of the code you haven't looked at before, you may have to stop and clone a repo.

Searching: Interested in which services import your package? GitHub's search is *useless*[1]. grep/ack-grep/suchen are great, but you have to have everything cloned.

Keeping up-to-date: Try having 250+ repos cloned locally, each with the latest version checked out. That requires an automated process all on its own.

I like the idea of a monorepo: keeping internal and vendored packages in one place, with a single command to test/build everything to prove you haven't broken others' work. Other than the sheer size and the extra coordination that has to go into preventing/resolving merge conflicts, are there any big downsides?

[1] https://help.github.com/articles/searching-code/ They delete/ignore special characters, making it almost impossible to find anything across repos (or even within them). Forget a regex search... WTF