Hi! Simon Tournier <zimon.touto...@gmail.com> skribis:
>> I was wondering whether we’re now doing better for Bioconductor
>> tarballs.  The answer, based on a small sample, seems to be “not quite”:

[...]

> but, now the past reads,
>
> $ for url in https://bioconductor.org/packages/release/bioc/src/contrib/BiocNeighbors_1.20.0.tar.gz \
>              https://bioconductor.org/packages/3.18/bioc/src/contrib/BiocNeighbors_1.20.0.tar.gz ; \
>   do guix download $url ; done

Thanks for investigating & explaining!

In my previous message, I wrote:

> As for past tarballs, #swh-devel comrades say we could send them a list
> of URLs and they’d create “Save Code Now” requests on our behalf (we
> cannot do it ourselves since the site doesn’t accept plain tarballs.)

Were you able to retrieve some of these?  What are the chances of
success?

> Hence the discussion we had: switch from url-fetch to git-fetch.
> However, after some investigation, it does not seem straightforward,
> the main issue being the almost fully automatic current updater.  See
> [2] for details.

[...]

> https://issues.guix.gnu.org/msgid/878rnwuemq....@elephly.net

Indeed, thanks for the link.

I agree that moving to ‘git-fetch’ sounds preferable in the long term,
but there are quite a few obstacles to overcome.

Ludo’.