Possible error on glibc package specification
I noticed that even after installing glibc-2.31, gcc still can't build static binaries. I just used the -static flag, which shows that libc.a is not present. I'm new to Guix, so I don't know how to create packages or patches, but I believe the problem is around this part of the package specification: https://git.savannah.gnu.org/cgit/guix.git/tree/gnu/packages/base.scm#n851

I don't know Scheme at all, but the lines around line 851 seem to be responsible for moving (or not moving) the statically linked libraries correctly. I will probably be able to help once I learn how to debug package specifications, but I'm sending this email because I think it's worth someone taking a look.

Since some applications rely on gcc being able to produce static binaries, another improvement could be to let gcc produce static-pie binaries by default. Let me know what you all think.

I will probably get on IRC in a few days; I just haven't been online because I haven't had time to configure Tor SASL authentication on freenode... For now I just want to say that I'm enjoying using Guix :)

Regards,
Anonymous_
Re: Possible error on glibc package specification
Hi anon,

Anonymous_ wrote:
> I noticed that even after installing glibc-2.31, gcc still can't build
> static binaries. I just used the -static flag, which shows that libc.a
> is not present.

On Guix, libc.a is provided by the glibc:static package output, which is what https://git.savannah.gnu.org/cgit/guix.git/tree/gnu/packages/base.scm#n851 produces. After installing it, ‘gcc -static’ works fine. I don't think there's an error there.

Kind regards,

T G-R
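Concretely, here is a minimal sketch of that workflow; the throwaway environment and the file name hello.c are just an illustration, not a prescribed setup:

  # One-off environment with the compiler plus the static libc.
  # glibc:static is the "static" output of glibc and provides libc.a.
  guix environment --ad-hoc gcc-toolchain glibc:static

  # Inside that environment, -static should find libc.a through the
  # profile's library search path.
  echo 'int main(void) { return 0; }' > hello.c
  gcc -static hello.c -o hello
  ./hello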
Re: Proposed change for the disruptive changes process (staging/core-updates)
Hi,

On Fri, 23 Oct 2020 at 18:20, Christopher Baines wrote:

> A simple process change that I think would help to address this is as
> follows (I'll use core-updates as the example, but this applies for
> staging as well):
>
> - core-updates is effectively renamed to core-updates-next
>
> - When you want to merge core-updates-next in to master, you create
>   core-updates pointing at the same commit as core-updates-next. This
>   begins the freeze.
>
> - Once a sufficient amount of time has passed for the things on
>   core-updates to have been built, you merge in to master
>
> - Shortly after the merge to master, you then delete the core-updates
>   branch
>
> This would mean that a build server can track core-updates, and it'll
> only build things when they're relevant for substitutes. For
> ci.guix.gnu.org, maybe it could build both branches initially, to
> replicate the current setup, but I think in the long run, it would be
> helpful to separate out the behaviour so that ci.guix.gnu.org
> concentrates on builds for substitutes, and there's another thing for
> actually testing out potential core-updates changes.

Based on the current CI issues, and orthogonal to Chris's and Mathieu's efforts (the Build Coordinator and Cuirass; thanks a lot for all the tough work), I agree with this proposal. It would also help reduce the load on Berlin and so increase the throughput of substitutes.

BTW, I agree that it seems better to separate what is “test” from what is “production”, i.e., to build on separate machines. All the wip-* branches could be built on Bayfront. This implies a rebuild once merged, but such rebuilds already happen more often than not anyway.

All the best,
simon
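In plain git terms, the proposed lifecycle would look roughly like the following; the remote name "origin" and the exact commands are only an illustration of the idea, not an agreed procedure:

  # Disruptive work keeps landing on core-updates-next.
  # To start a freeze, create core-updates at the same commit:
  git branch core-updates core-updates-next
  git push origin core-updates

  # ...build servers track core-updates while it is being built out...

  # Once enough has been built, merge into master:
  git checkout master
  git merge core-updates
  git push origin master

  # Shortly after the merge, delete the frozen branch:
  git push origin --delete core-updates
  git branch -d core-updates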
[outreachy] Walk through the Git history (guix git {authenticate, log})
Hi,

Discussing how to walk through the Git history in ~/.cache/guix/checkouts/, there are two ways to do it:

1. Loop with commit-parents, as is done for ’commit-closure’ in guix/git.scm.

2. Bind git_revwalk_* and use it instead.

WDYT? Well, #1 is more straightforward but less efficient, IIUC.

All the best,
simon
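For intuition only: option #1 is roughly the command-line behaviour of ‘git log’ over one of the cached checkouts; the directory name below is a placeholder, since the real ones are hashed:

  cd ~/.cache/guix/checkouts/<some-checkout>
  # Walk from HEAD through the parent commits, newest first:
  git log --format='%H %s'
  # Lower-level equivalent, printing only the commit IDs:
  git rev-list HEAD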
Re: Manual PDF and translation (modular texlive?)
Hello Ricardo,

I have a huge backlog of messages from guix. I read this thread today and I just want to say: thank you so much for your work on this!!!

I still don't know whether this work on modular texlive is finished or not; in any case you solved a lot of issues and, as we can see, texlive workflows are very hard to debug. Reproducible TeX documents with modular packages is a really great feature.

[...]

Thanks! Gio'

--
Giovanni Biscuolo

Xelera IT Infrastructures
Re: [outreachy] Walk through the Git history (guix git {authenticate,log})
Hi,

On 11/12/2020 14:41, zimoun wrote:
> Hi,
>
> Discussing how to walk through the Git history in
> ~/.cache/guix/checkouts/, there are two ways to do it:
>
> 1. Loop with commit-parents, as is done for ’commit-closure’ in
>    guix/git.scm.
>
> 2. Bind git_revwalk_* and use it instead.
>
> WDYT? Well, #1 is more straightforward but less efficient, IIUC.
>
> All the best,
> simon

I think #2 is the way to go. It may be more difficult to implement right now, but it will make things easier later if, for instance, we want to add an option that lets users choose the order in which commits are shown.

Magali
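To make the ordering point concrete: libgit2's revwalk has sorting modes (GIT_SORT_TOPOLOGICAL, GIT_SORT_TIME, GIT_SORT_REVERSE) that correspond roughly to what the git CLI exposes; this is only an analogy, not the actual binding:

  git rev-list HEAD               # newest first (default)
  git rev-list --topo-order HEAD  # children before parents
  git rev-list --date-order HEAD  # by committer date
  git rev-list --reverse HEAD     # oldest first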
Re: Guix day: Summary of rust BoF session
Hi Hartmut and Sébastien,

I started with some talking points, which I will attach. We did not get to everything, though. Mostly we discussed the possibility of using system dependencies and installing actual compiled artifacts. Someone mentioned that the possibility of using system dependencies came up on the #rust IRC channel. After some research, I think I may take a look at it soon.

The other topics I remember coming up were some packaging idioms and module organization. I think consensus was generally reached that we should try to keep package definitions organized by "domain" or "use" rather than by language, i.e. put Rust web libraries in a web module. I think the newer crates-graphics and other modules are decent examples of this. In future, perhaps we move packages out of rust-apps.scm, though.

In other news, I am glad the importer is merged now! Congratulations!

- John

#+TITLE: (BoF) Rust + Cargo Discussion

Rust has come a long way since I started using Guix in 2018: crate importer, cargo-build-system, bootstrapping with mrustc, etc. I use Guix to write Rust for work every day now! Still a long way to go. Some topics below (add your own before we start).

* Topics

** Improved, semantic-versioning-aware crate importer
- New work in https://issues.guix.gnu.org/44560
- Original issue https://issues.guix.gnu.org/38408

** rustfmt as an output of rust
- Open patch in https://issues.guix.gnu.org/42295
- Other candidates include rls/rust-analyzer, clippy, racer

** Keeping rust versions up to date
- On a 6-week release cycle, perhaps we need a rust-updates branch?
- I have been using version 1.46 without issue

** Packaging idioms
- How best to remove vendored sources?
- How to propagate the required environment variables?
- When to include the minor version in the package variable name?

** guix refresh does not pick up dependencies between rust dependencies
#+begin_src bash
guix refresh --list-dependent rust-serde
#+end_src

#+RESULTS:
: No dependents other than itself: rust-serde@1.0.117

** Incremental compilation/shared libraries possible?
- Use the store as a registry?
- $CARGO_OUT_DIR to put artifacts in build outputs: https://github.com/rust-lang/cargo/issues/6790
- cargo metadata, guile-toml, cargo-build --manifest-path=...?

** Packaging efforts and updates
pijul patches available. Others waiting to submit:
- teip
- skim
- dog
- sd
- zoxide
- tealdeer

** Collaboration with Rust community directly
- Start with communication, maybe advance to an RFC?
- Collaborate with Nix to understand how to make cargo work better for functional package managers?

** Wasm32 target support
- May need to patch-cargo-checksums of Cargo
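Since the importer came up, here is what basic usage looks like for anyone who has not tried it yet; the crate name "serde" is only an example:

  # Generate a Guix package definition for a single crate:
  guix import crate serde

  # Recursively import the crate together with dependencies that are
  # not yet packaged in Guix:
  guix import crate --recursive serde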
Offline build failure
I am attempting an offline build without success.

I have a Guix 1.2.0 node with internet access on which I download sources with transitive dependencies:

$ guix build --sources=transitive tzdata > ~/transfer

I then copy the files as root to a Guix 1.2.0 node without internet access (only local network access):

# cat /home//transfer | xargs -n 1 -I{} scp -p {} :{}

I then attempt to build on the offline node:

$ guix build --no-substitutes tzdata

Guix starts downloading and the transferred file is gone! I'm lost as to why a new download attempt is made as the file data and timestamps match the original server.

The following derivations will be built:
  /gnu/store/83aqdbhnk33p6phd92b48aqb2wlhr7fa-tzdata-2020a.drv
  /gnu/store/0j6gi10v9hz2s2413nhwbzihrydafgvj-glibc-2.31.drv
  /gnu/store/04s4qih73dwvpzpvxs299wcg9dam8f0y-bash-mesboot-4.4.drv
  /gnu/store/xxwjr96ihqhpv7ag4cbwzk0r0p5cxakk-tzdata2020a.tar.gz.drv
building /gnu/store/m5z2bvv96wqhxs4aahjp5xxm910czkhl-tzcode2020a.tar.gz.drv...
Starting download of /gnu/store/r7xqxgfmm5msvazlzxaccw7h0h6dm898-tzcode2020a.tar.gz
From https://data.iana.org/time-zones/releases/tzcode2020a.tar.gz...

Building on the original, internet-connected server, no re-download is attempted, though additional packages not included in the transitive sources are requested.

The following derivations will be built:
  /gnu/store/83aqdbhnk33p6phd92b48aqb2wlhr7fa-tzdata-2020a.drv
  /gnu/store/0j6gi10v9hz2s2413nhwbzihrydafgvj-glibc-2.31.drv
  /gnu/store/04s4qih73dwvpzpvxs299wcg9dam8f0y-bash-mesboot-4.4.drv
  /gnu/store/69z1j6lx09krsa019a2r9v72i80lnkql-libsigsegv-2.12.drv
  /gnu/store/0npjqnxv0acwdkxnc6askz2ac4dikj73-libsigsegv-2.12.tar.gz.drv
building /gnu/store/4lqmlyvbvddzi50cbqjwh8lzb2lvmkia-module-import-compiled.drv...
successfully built /gnu/store/4lqmlyvbvddzi50cbqjwh8lzb2lvmkia-module-import-compiled.drv
building /gnu/store/4d3cjvjrgs0q38fs3bgfajcw7apc0vml-Python-3.5.9.tar.xz.drv...
Starting download of /gnu/store/f99fblkzb6ip268sg096shhs7wzjyp55-Python-3.5.9.tar.xz
From https://www.python.org/ftp/python/3.5.9/Python-3.5.9.tar.xz...
downloading from https://www.python.org/ftp/python/3.5.9/Python-3.5.9.tar.xz ...
 Python-3.5.9.tar.xz  14.7MiB  141.9MiB/s  00:00  [##] 100.0%
successfully built /gnu/store/4d3cjvjrgs0q38fs3bgfajcw7apc0vml-Python-3.5.9.tar.xz.drv
building /gnu/store/ldkjb6wjcl6dbf0glpfw9f5wh1ia8bq3-bash-2.05b.tar.gz.drv...
Starting download of /gnu/store/zkkcabiqcy11wy4wkn8bysava6qz8w7v-bash-2.05b.tar.gz
From https://ftpmirror.gnu.org/gnu/bash/bash-2.05b.tar.gz...
following redirection to `https://gnu.askapache.com/bash/bash-2.05b.tar.gz'...
downloading from https://ftpmirror.gnu.org/gnu/bash/bash-2.05b.tar.gz ...
 bash-2.05b.tar.gz  1.9MiB  4.1MiB/s  00:00  [##] 100.0%
successfully built /gnu/store/ldkjb6wjcl6dbf0glpfw9f5wh1ia8bq3-bash-2.05b.tar.gz.drv
building /gnu/store/8nnmv6hfwg543s7b0sbx54ysi514dsz6-bash-4.4.tar.gz.drv...

--

If there is a better way to setup / configure / execute offline builds please let me know!

Thanks,
Greg
Re: Offline build failure
Hullo Greg,

Greg Hogan wrote:
> If there is a better way to setup / configure / execute offline builds
> please let me know!

...yes :-)

> I am attempting an offline build without success.
>
> I have a Guix 1.2.0 node with internet access on which I download
> sources with transitive dependencies:
>
> $ guix build --sources=transitive tzdata > ~/transfer

OK.

> I then copy the files as root to a Guix 1.2.0 node without internet
> access (only local network access):
>
> # cat /home//transfer | xargs -n 1 -I{} scp -p {} :{}

Now you've basically reinvented ‘guix copy --to=’, but in a way that won't update the store database in /var/guix/db. I'm afraid that won't work. Guix won't ‘see’ the files you copy to the remote store and will consider them G to be C'd next time you run ‘guix gc’. Or in this case:

> Guix starts downloading and the transferred file is gone!

Same thing.

> I'm lost as to why a new download attempt is made as the file data and
> timestamps match the original server.

If the file isn't registered in the database, the store item is never considered valid. Guix doesn't (yet) care about the data/timestamps at this point.

If there's a reason you can't/won't use ‘guix copy’, you might work around that by copying each file in ~/transfer to, say, :/tmp/staging (instead of :/gnu/store), then running ‘guix download /tmp/staging/...’ on the remote host.

Kind regards,

T G-R
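For completeness, a sketch of the two supported routes; "offline-host" is a placeholder, and both assume the offline node runs the Guix daemon and has authorized the online node's signing key:

  # On the offline node, once (using a copy of the online node's
  # /etc/guix/signing-key.pub):
  guix archive --authorize < signing-key.pub

  # From the online node, over SSH: copies the items *and* registers
  # them in the remote store database.
  guix copy --to=offline-host $(guix build --sources=transitive tzdata)

  # Alternative without SSH: export to a file, move it however you
  # like, then import on the offline node.
  guix archive --export -r $(guix build --sources=transitive tzdata) > sources.nar
  guix archive --import < sources.nar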