Do I understand correctly that, while the cost of conventional
factoring grows sub-exponentially (but still far faster than any
polynomial) in RSA key length, Shor's algorithm has a space
requirement that scales only linearly, and the engineering challenges
implied by that linear growth in qubits are what scale exponentially?
The keyspace equivalency is made under the assumption that the
attacker is using the general number field sieve (GNFS). It's
unsurprising for work factors to differ between algorithmic
approaches. Your conclusion is sound, but please be careful not to
draw inferences about one approach from the behavior of the other.
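For reference, the work factor assumed in those equivalencies is the
usual GNFS heuristic complexity, in L-notation (for a modulus N):

    L_N[1/3, c] = \exp\left((c + o(1))\,(\ln N)^{1/3}\,
                  (\ln \ln N)^{2/3}\right),
    \qquad c = (64/9)^{1/3} \approx 1.923

That's sub-exponential in the size of N, but still grows much faster
than any polynomial, and it says nothing about how a quantum attack
scales.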
Do those figures include error correction, or are they /before/
error correction?
Before.
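And that distinction is where the "engineering challenges" from the
first question live. A back-of-the-envelope sketch (my own, not
anyone's published requirement): Beauregard's circuit factors an
n-bit modulus with about 2n+3 logical qubits, and error correction
multiplies that by a large, error-rate-dependent factor. The 1,000x
overhead below is purely an illustrative assumption; for a published
end-to-end estimate, Gidney and Ekerå put RSA-2048 at roughly 20
million noisy physical qubits.

    # Rough qubit estimates for Shor's algorithm on an n-bit modulus.
    # Assumptions (illustrative, not authoritative):
    #   - logical qubits ~ 2n + 3 (Beauregard's circuit)
    #   - ~1,000 physical qubits per logical qubit (assumed
    #     error-correction overhead; the real figure depends heavily
    #     on gate error rates and the code used)
    PHYSICAL_PER_LOGICAL = 1_000

    def shor_qubit_estimate(n_bits: int) -> tuple[int, int]:
        logical = 2 * n_bits + 3
        return logical, logical * PHYSICAL_PER_LOGICAL

    for n in (1024, 2048, 3072, 4096):
        logical, physical = shor_qubit_estimate(n)
        print(f"RSA-{n}: ~{logical:,} logical, "
              f"~{physical:,} physical qubits")

The logical column grows only linearly with key length; it's the
physical column, and the engineering behind every entry in it, that
nobody has demonstrated yet.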
Agreed. I also suspect that we may have practically-relevant PQC
right under our noses, right now: RSA.
Well, RSA is arguably quantum resistant in the practical, near-term
sense: Shor's algorithm breaks it in principle, but cranking up the
key size keeps it ahead of any hardware on the horizon. McEliece,
another veteran algorithm, is widely believed to be truly
post-quantum, i.e., secure even against a full-scale quantum
computer.
Do the mathematical complexities that prevent conventional
computing from making short work of 256-bit ECC keys also affect
Shor's algorithm?
My understanding is "no": Shor's period-finding adapts directly to
elliptic-curve discrete logarithms, so the classical hardness of ECC
buys nothing against it.
If not, that would mean that RSA is actually /stronger/ against
future threats, and TLS should be moving back to RSA and
Diffie-Hellman posthaste.
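There's a kernel of truth there, at least on qubit counts. A sketch
comparing published circuit-level estimates (logical qubits, before
error correction; other circuits give different constants):
Roetteler, Naehrig, Svore, and Lauter (2017) estimate about
9n + 2*ceil(lg n) + 10 qubits for discrete logs on an n-bit curve,
versus about 2n + 3 for an n-bit RSA modulus with Beauregard's
circuit.

    # Compare rough logical-qubit counts for quantum attacks on
    # ECC vs. RSA (circuit-level figures, before error correction).
    from math import ceil, log2

    def ecc_qubits(n_bits: int) -> int:
        # Roetteler et al. 2017: 9n + 2*ceil(lg n) + 10
        return 9 * n_bits + 2 * ceil(log2(n_bits)) + 10

    def rsa_qubits(n_bits: int) -> int:
        # Beauregard 2003: 2n + 3
        return 2 * n_bits + 3

    # ~128-bit classical security on each side: P-256 vs. RSA-3072.
    print(f"P-256:    ~{ecc_qubits(256):,} logical qubits")   # ~2,330
    print(f"RSA-3072: ~{rsa_qubits(3072):,} logical qubits")  # ~6,147

At matched ~128-bit classical security, the RSA target needs roughly
2.6x the logical qubits of the curve, which is the sense in which RSA
looks "stronger" here. Whether that argues for moving TLS back is
another matter: finite-field Diffie-Hellman falls to the same
period-finding attack as RSA.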
The US government's belief is that RSA-3072 will be sufficient for
protection of Top Secret/SCI data for the next twenty-five years.
(How do I know twenty-five years? Because whenever a document is
classified, it gets an automatic declassification date. The default
declassification date at the TS level is twenty-five years. So if NSA
says TS/SCI data today may be protected with RSA-3072, it follows they
don't believe quantum computing will break RSA-3072 for at least
twenty-five years.)
https://media.defense.gov/2022/Sep/07/2003071834/-1/-1/0/CSA_CNSA_2.0_ALGORITHMS_.PDF