I noticed a few of our test scripts taking a long time to run, even
though they used to be quick. Here's one:
$ time ./t7612-merge-verify-signatures.sh
ok 1 - create signed commits
ok 2 - merge unsigned commit with verification
ok 3 - merge commit with bad signature with verification
ok 4 - merge commit with untrusted signature with verification
ok 5 - merge signed commit with verification
ok 6 - merge commit with bad signature without verification
# passed all 6 test(s)
1..6
real 0m12.285s
user 0m0.048s
sys 0m0.044s
That's a lot of time not using any CPU. What's going on? Running with
"sh -x" shows that we spend most of the time in this line from
lib-gpg.sh:
gpg --homedir "${GNUPGHOME}" 2>/dev/null --import \
	"$TEST_DIRECTORY"/lib-gpg/keyring.gpg
And running gpg with "--debug-level guru" shows that we are blocking
while waiting for entropy. Has anybody else seen this? I feel like I
noticed it starting a few weeks ago, and indeed dropping back to gpg
2.0.26 (from 2.1.17) makes the problem go away.
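(For reference, the debug run is just the same import with the extra
option tacked on, roughly:

$ gpg --homedir "$GNUPGHOME" --debug-level guru --import \
	t/lib-gpg/keyring.gpg

and watching where the output pauses is what pointed me at the entropy
wait.)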
Is it a bug in gpg (oddly, the kernel reports lots of entropy available,
and generating the signatures themselves is quite fast)? Or is the new
version doing something special in the import process that we need to
work around or disable?
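(By "lots of entropy available" I mean the usual kernel counter, e.g.
on Linux:

$ cat /proc/sys/kernel/random/entropy_avail

which reports plenty here even while the import sits there.)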
The reason we use --import at all is to handle differences in the .gnupg
format between versions 1 and 2. So the nuclear option would be to just
carry pre-made .gnupg directories for _both_ versions in our test suite,
and pick the appropriate one based on the output of "gpg --version".
That feels like a hack, but it gives us a lot of control.
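To make that concrete, I'm imagining lib-gpg.sh doing something along
these lines (just a sketch; the gnupg-v1/gnupg-v2 directory names are
made up, and we'd have to actually generate and commit their contents):

case "$(gpg --version | sed -n '1s/.* //p')" in
1.*)
	cp -R "$TEST_DIRECTORY/lib-gpg/gnupg-v1/." "$GNUPGHOME"
	;;
2.*)
	cp -R "$TEST_DIRECTORY/lib-gpg/gnupg-v2/." "$GNUPGHOME"
	;;
esac
chmod 700 "$GNUPGHOME"

That sidesteps --import entirely, at the cost of carrying two binary
blobs in the tree and regenerating them whenever the keys change.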
I'd love it if we could figure out a way of making --import reliably
fast, though.
-Peff