> > If we want secure compare by hash, then almost any sync protocol
> > that uses SHA-256 will be fine but almost any that uses MD5 will
> > not. Why? Because SHA-256 is good for compare by hash and MD5 is
> > not. Any protocol that's not brain-damaged that uses SHA-256 will
> > work, and any that uses MD5 will not.
> MD5 is (still) vastly stronger (no known second-preimage attacks) in
> most applications than the weakest parts in real security systems.
> Spending time choosing between MD5 and SHA1 is in most cases a waste
> of time. Sure, use SHA1, it is best practice to do so, but this is
> extremely unlikely to have any positive impact on the security of the
> system in question:

Actually, MD5 is almost worthless for compare-by-hash.

> http://www.schneier.com/book-practical-preface.html
>
> Security is only as strong as the weakest link, and the mathematics
> of cryptography is almost never the weakest link. The fundamentals
> of cryptography are important, but far more important is how those
> fundamentals are implemented and used. Arguing about whether a
> key should be 112 bits or 128 bits long is rather like pounding
> a huge stake into the ground and hoping the attacker runs right
> into it. You can argue whether the stake should be a mile or a
> mile-and-a-half high, but the attacker is simply going to walk around
> the stake. Security is a broad stockade: it's the things around the
> cryptography that make the cryptography effective.

That's certainly not true for the specific case where there are known
deficiencies in an algorithm and the algorithm is used such that those
deficiencies break the guarantees the system is supposed to provide.
Using MD5 in a case where the ability of an attacker to create two
plaintexts with the same hash is fatal would be suicide. MD5 was
specifically designed to make that impossible.

> If leaving MD5 enabled improves interoperability, leave it enabled...

That depends on the application. MD5 is perfectly suitable for some
applications and completely unsuitable for others. For some
applications, if an attacker who knows the real data can craft modified
data with the same hash, you are completely sunk.

IOW, to evaluate the protocol sensibly, you not only have to know that
it uses MD5, you have to know *how* it uses MD5. If it uses it as part
of a signature algorithm, you are relatively safe. If it uses it to
validate data an attacker cannot know, you are again safe. If it uses
it to validate data known to an attacker, you are not safe. MD5 is
simply now broken for that purpose. (A rough sketch of the difference
appears at the end of this message.)

> Yes, once you have a decent protocol, disable legacy symmetric ciphers
> weaker than ~80 bits (no 40-bit "export" ciphers, no single-DES),
> but choose the protocol that addresses the right security model (say
> secure transport) and don't sweat the algorithms too much, the protocol
> designers should have taken care of that.

In other words, once you choose the protocol, disable the algorithms
that don't meet your requirements and enable the ones that do. How can
you do that unless you *know* which algorithms meet your requirements
and which don't?

> This advice is of course for application developers, not cryptographers.
>
> Still, somehow, I don't think we're likely to reach consensus, over
> and out.

It doesn't seem like it.

DS
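To make the "how it uses MD5" distinction concrete, here is a minimal
sketch, assuming a Python-style compare-by-hash check. The helper names
(digest, already_have, tag) and the chunk size are purely illustrative,
not taken from any particular sync protocol. The first pattern treats
"same digest" as "same data" over input an attacker may know or choose,
so it depends entirely on collision resistance; the second is a keyed
use, which the known MD5 collision attacks do not touch.

    import hashlib
    import hmac

    def digest(path, algo="sha256"):
        # Hash a file in chunks. For compare-by-hash, SHA-256 is fine;
        # MD5 is not, because colliding inputs can be constructed.
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def already_have(local_path, remote_digest):
        # The sync decision: "same digest" is treated as "same data",
        # so the data is never transferred or re-checked if it matches.
        return digest(local_path) == remote_digest

    def tag(key: bytes, data: bytes) -> str:
        # A keyed use (HMAC-MD5): this does not rely on collision
        # resistance, so the known MD5 weaknesses are not directly
        # fatal here.
        return hmac.new(key, data, hashlib.md5).hexdigest()

The only point of the sketch is that the first use stands or falls on
collision resistance and the second does not; deciding which category a
given protocol falls into is exactly the "how it uses MD5" question
above.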