On 2024-08-05 19:08, Björn Persson wrote:
Daniel Kahn Gillmor via Gnupg-users wrote:
For example, if the detached signature contains multiple signatures, and
gpgv can't verify one of them, it will return a non-zero error code,
even if it *can* verify the other signature.

It's true that requiring verification of all the signatures is not
always desirable. Allowing all but one to fail is not always right
either. Deciding how many correct signatures should be required is
nontrivial. I doubt any general verification tool can automatically do
the right thing in every case without parameters telling it what's
desired in the specific use case.

The easiest way to do this would be to add an option to require N
verified signatures for success rather than one (which IMO should
still be the default).
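For illustration, a minimal sketch of such an N-of-M policy can already be layered on top of gpgv's existing --status-fd output, which emits one GOODSIG status line per signature it can verify. The counting wrapper and the threshold parameter below are hypothetical; gpgv itself has no such option:

```python
# Sketch of an N-of-M policy on top of gpgv's --status-fd output.
# gpgv prints one "[GNUPG:] GOODSIG <keyid> <uid>" line per verified
# signature; we count those lines and compare against a threshold.
import subprocess

def count_good_signatures(status_output: str) -> int:
    """Count GOODSIG status lines in gpgv --status-fd output."""
    return sum(
        1 for line in status_output.splitlines()
        if line.startswith("[GNUPG:] GOODSIG")
    )

def verify_n_of_m(sig_file: str, data_file: str, required: int = 2) -> bool:
    """Run gpgv and require at least `required` good signatures.

    `required` is the hypothetical policy knob discussed above, not a
    real gpgv option.
    """
    proc = subprocess.run(
        ["gpgv", "--status-fd", "1", sig_file, data_file],
        capture_output=True, text=True,
    )
    return count_good_signatures(proc.stdout) >= required
```

A call like verify_n_of_m("doc.sig", "doc.txt", required=2) would then succeed only when at least two signatures check out, independently of gpgv's own all-or-nothing exit code.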

The usual approach for that would be to sign the file with both the old
and new algorithms, and put them both in the detached signature file.
That way, existing implementations can continue to verify the old
algorithm, and newer implementations can verify the new one.
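For what it's worth, producing such a combined detached signature needs no special tooling: a binary (non-armored) detached signature file is just a sequence of OpenPGP signature packets, so two separately produced signatures can simply be concatenated. A rough sketch, with illustrative filenames:

```python
# Sketch: merge two detached signatures (e.g. one RSA, one EdDSA, made
# with two gpg --detach-sign runs) into a single signature file. For
# binary signatures, appending the raw packet bytes is sufficient; a
# verifier iterates over every signature packet it finds in the file.

def combine_signatures(sig_paths, out_path):
    """Append the raw bytes of each detached signature into out_path."""
    with open(out_path, "wb") as out:
        for path in sig_paths:
            with open(path, "rb") as sig:
                out.write(sig.read())
```

(ASCII-armored signatures would need de-armoring first; this only holds for the binary form.)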

I don't remember ever seeing an example of that approach. As far as I
can tell, the usual approach is to have verification tools support both
old and new algorithms. Signers switch keys and signatures to the new
algorithm when they think support for it is sufficiently widespread.
Then support for the old algorithm is removed from tools when almost
nobody uses it anymore.

The downside of asking signers to guess when support for a signature
algorithm is "sufficiently widespread" is that it forces them to be
excessively conservative in what they produce, because even a small
number of verification failures (and the complaints that follow)
carries a high cost. The principle of "a broken signature is a
missing signature" is much more robust and future-proof.

A

_______________________________________________
Gnupg-users mailing list
Gnupg-users@gnupg.org
https://lists.gnupg.org/mailman/listinfo/gnupg-users
