Alessandro Vesely wrote in
 <36be1ab7-4317-4ae6-9370-f2f2e8323...@tana.it>:
 ...
 |[.] Indeed, a DKIM signing filter running
 |during the outgoing connection can take also into account any 8BITMIME and
 |SMTPUTF8 parameters, which current DKIM implementations can only
 |try to guess.

I would say that standards like S/MIME etc. usually enforce or
require a 7-bit transfer encoding in order to create reliably
reproducible results in the signature-covered data.

SMTPUTF8, in my humble opinion, really, really does not.
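
(A minimal sketch in Python, only to illustrate that point; the
body string and the "7-bit-only next hop" are assumptions made up
for the example, not anybody's real mail flow:)

  import hashlib
  import quopri

  raw_8bit = "Grüße aus Köln\r\n".encode("utf-8")

  # Hash over the raw 8-bit body, as a signer covering these octets would.
  h_signed = hashlib.sha256(raw_8bit).hexdigest()

  # A 7-bit-only next hop downgrades the body to quoted-printable.
  downgraded = quopri.encodestring(raw_8bit)
  h_received = hashlib.sha256(downgraded).hexdigest()

  print(h_signed == h_received)  # False: the covered octets changed in transit

A body that is committed to a 7-bit form before signing leaves a
relay nothing to re-encode, which is exactly the reproducibility
those standards are after.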

For 8BITMIME: the MUA i maintain was somewhat famous in Germany
>20 years ago (a decade+ before i took over maintainership), and
iirc it always used 8-bit encoding (by default) over SMTP, even
without supporting the 8BITMIME extension.  Only with the next
release will it support that (with credits to Mr. John Levine).
Nobody ever complained, as far as i know.  Changing the default
content-transfer-encoding was one of the first things i did after
taking over maintainership.  All that iirc, but i think it is so.

That is: Jonathan B. Postel and his wonderful law, which the
current engineers of the IETF in all their brilliance doubt and
have cast that doubt in stone in their own RFCs, worked very well
for a long time.  Other than that, everybody should simply keep
their hands off, and suddenly things will start working again.

DKIM signers/verifiers do not have to deal with content transfer
encoding; they work on the message data as it goes over the SMTP
connection.  And that, *in my humble opinion*, is what is most
reproducible.
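
(Again only a hedged Python sketch, this time of the RFC 6376
"simple" body canonicalization and body hash (the bh= tag),
computed over whatever octets actually cross the wire; not a full
DKIM implementation:)

  import base64
  import hashlib

  def simple_body_hash(body: bytes) -> str:
      # "simple" body canonicalization (RFC 6376, section 3.4.3): ignore
      # trailing empty lines and make sure the body ends in one CRLF.
      while body.endswith(b"\r\n\r\n"):
          body = body[:-2]
      if not body.endswith(b"\r\n"):
          body += b"\r\n"
      # bh= is the base64 of the hash over the canonicalized body octets.
      return base64.b64encode(hashlib.sha256(body).digest()).decode()

  # Trailing empty lines do not change the hash; the octets themselves,
  # in whatever content-transfer-encoding they happen to use, do.
  print(simple_body_hash(b"Hello, world\r\n\r\n\r\n")
        == simple_body_hash(b"Hello, world\r\n"))  # True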

--steffen
|
|Der Kragenbaer,                The moon bear,
|der holt sich munter           he cheerfully and one by one
|einen nach dem anderen runter  wa.ks himself off
|(By Robert Gernhardt)
