For what it's worth, Bob Jenkins has some good discussion around this, and some code that can help measure such properties of hashes:
http://burtleburtle.net/bob/hash/index.html

Specifically, the "tests" section may offer a good start for measuring the
characteristics of the hash output with quantitative results.

On Mon, Dec 29, 2014 at 3:08 PM, Ben Pfaff <b...@nicira.com> wrote:
> On Mon, Dec 22, 2014 at 03:35:22PM -0800, Joe Stringer wrote:
> > Previously, when using the 128-bit hash in conjunction with the 32-bit
> > hash tables, we would ignore the upper 96 bits rather than attempting to
> > redistribute the hash across the 32-bit output space. This patch adds a
> > new function to translate the hash down from 128 bits to 32 bits for
> > this use case.
>
> Suppose that we had 128-bit random numbers instead of 128-bit hashes.
> Then, if combining the 32-bit pieces of that random number gave us a
> higher-quality random 32-bit number than just taking any one 32-bit
> piece, it would mean that the random numbers weren't very random.
>
> By analogy, I think that this patch (without reading it) should only
> make a difference if the 128-bit hash isn't very high-quality. If so,
> it might be better to consider improving our 128-bit hash function,
> instead of the approach taken here.
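For anyone who hasn't seen the patch, the kind of 128-to-32-bit fold being
discussed can be as simple as XOR-ing together the four 32-bit words of the
wider hash. A minimal sketch, with placeholder type and function names
rather than the actual OVS definitions:

#include <stdint.h>

/* Hypothetical 128-bit hash container; the real code may use a
 * different type and layout. */
struct hash128 {
    uint32_t u32[4];
};

/* Fold all 128 bits into 32 so that no 32-bit slice of the input is
 * discarded outright.  If the 128-bit hash were uniformly distributed,
 * this should be no better than just taking u32[0], which is Ben's
 * point above. */
static inline uint32_t
hash_128_to_32(const struct hash128 *h)
{
    return h->u32[0] ^ h->u32[1] ^ h->u32[2] ^ h->u32[3];
}

Whether a fold like that (or extra mixing on top of it) actually buys
anything over taking a single 32-bit slice is exactly the kind of question
the avalanche/distribution tests on Bob's page can answer quantitatively.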
--
Andrew Mann
DivvyCloud Inc.
www.divvycloud.com