Hal Murray <hmur...@megapathdsl.net>:

> It's reasonably easy to setup a cron job to make a copy of the data in git.
> How many of us should do it?
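For anyone who wants to do that, a minimal sketch of such a job is a
one-time mirror clone plus a crontab entry that refreshes it nightly
(the clone URL and backup path below are assumptions, not canonical):

    # one-time setup: make a bare mirror of the repository
    git clone --mirror https://gitlab.com/NTPsec/ntpsec.git $HOME/backups/ntpsec.git

    # crontab entry: refresh the mirror (pruning deleted refs) at 03:17 daily
    17 3 * * *  git --git-dir=$HOME/backups/ntpsec.git remote update --prune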
Git data and metadata are not the problem. Every pull of any of our
repositories replicates its entire state. What we risk losing is tracker
data, and possibly the hooks for the CI machinery.

> Is there any way to grab a copy of the issues data?

There does not seem to be yet, though I won't know for sure until I can
read the issue discussion on GitLab's tracker. It is, of course, down.

I bet there's going to be a lot of pressure on GitLab to provide an
export after this...

> What do we have in the way of backup for our other stuff? How much is
> there that's not on gitlab?

Ask Gary and John that. We have a website and a bunch of other state on
ntpsec.org, which is hosted at OSU. I don't know how it's backed up.

> A while ago, I signed up for lots of email from the issue stuff. I've
> pushed most of it off to a folder. Do you want a copy? (Some of it is
> full of html bloat/crap.)

Hold on to that. GitLab claimed to be 60% done with their DB copy two
hours ago. If they come back up and our tracker data is missing, I will
want to grovel through what you have and see what I can reconstruct.

This could be much worse. I am feeling extremely grateful that we managed
to stay on top of our issues list as well as we have. Losing it all would
be annoying, but not a disaster.

--
Eric S. Raymond <http://www.catb.org/~esr/>