On Oct 5, 2015, at 5:17 AM, Eric Rescorla <e...@rtfm.com> wrote:
>
> The problem is that we don't know how to generically provide compression
> safely. To take a concrete example: HTTP2's solution to header compression,
> HPACK, is extremely limited compared to a generic compression system
> like gzip, LZ77, etc., as well as being tightly coupled to HTTP, and yet we
> still know that there are potential security problems [0]. Doing something
> generically secure is much harder.
>
> If you have a solution to this problem, then great. But the mere fact that
> it's desirable doesn't mean we have an answer.
A very, very limited answer is that, if you use a compression method with a fixed, small dictionary and non-adaptive compression (i.e., the dictionary is never updated from the data being compressed), then recovering a high-entropy secret from a compressed-then-encrypted value is hard, even against an adversary that adaptively chooses the prefix/suffix of the plaintext.

This is a much weaker security property than the standard indistinguishability / semantic security of encryption, and would be unsuitable for general encryption purposes. Being a non-adaptive technique, the compression rate is also quite poor, but it can be better than nothing when the dictionary is chosen well for the typical document.

Douglas
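For concreteness, a minimal sketch of what such a fixed-dictionary, non-adaptive scheme could look like (the dictionary contents, the names STATIC_DICT / compress_fixed, and the one-byte token encoding below are purely illustrative assumptions, not any proposed or standardized format):

# Toy fixed-dictionary, non-adaptive compressor (illustrative only).
# The dictionary is never updated from the input, so compressing
# attacker-chosen bytes next to a secret creates no cross-references
# between them, unlike adaptive schemes such as gzip/LZ77.

STATIC_DICT = {
    b"Content-Type: application/json\r\n": b"\x01",
    b"Authorization: Bearer ": b"\x02",
    b"Accept-Encoding: gzip\r\n": b"\x03",
}

def compress_fixed(plaintext: bytes) -> bytes:
    # A real scheme would also need to escape literal 0x01-0x03 bytes
    # in the input; omitted to keep the sketch short.
    out = plaintext
    for phrase, token in STATIC_DICT.items():
        out = out.replace(phrase, token)
    return out

def decompress_fixed(compressed: bytes) -> bytes:
    out = compressed
    for phrase, token in STATIC_DICT.items():
        out = out.replace(token, phrase)
    return out

# The compressed length depends only on how many fixed phrases occur,
# never on whether attacker-chosen bytes happen to match the secret.
msg = b"Authorization: Bearer SECRETTOKEN\r\nContent-Type: application/json\r\n"
assert decompress_fixed(compress_fixed(msg)) == msg

Because the dictionary never grows from the input, an attacker-supplied guess compresses the same whether or not it matches a secret elsewhere in the plaintext, which removes the length side channel that attacks on adaptive compressors (e.g., CRIME/BREACH) exploit.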