Robert Haas <robertmh...@gmail.com> writes:
> On Sat, Oct 30, 2010 at 10:01 AM, Tom Lane <t...@sss.pgh.pa.us> wrote:
>> marcin mank <marcin.m...@gmail.com> writes:
>>> This would make the hash the same for arrays with elements 32 apart
>>> swapped.
>>
>> Well, there are *always* going to be cases where you get the same hash
>> value for two different inputs; it's unavoidable given that you have to
>> combine N 32-bit hash values into only one 32-bit output.
> Sure.  The goal is to make those hard to predict, though.

Really?  I think "I don't understand when this fails" isn't obviously
better than being able to predict when it fails ...

> I think "multiply by 31 and add the next value" is a fairly standard
> way of getting that behavior.  It mixes up the bits a lot more than
> just left-shifting by a variable offset.

What concerns me about that is that it tends to push the bits to the
left --- I think the effects of the earlier inputs are going to overflow
out as you incorporate a lot of newer inputs.  What you want is a scheme
where every input item affects all the bits of the final result about
equally, and my gut feeling is this doesn't provide that.  I'd be
happier about this approach if there were some actual theory behind it
... maybe there's some analysis out there, but the one link that was
given was just about entirely unconvincing.

			regards, tom lane
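A minimal standalone C sketch (hypothetical code, not taken from any
PostgreSQL source; the element values are arbitrary) makes the concern
concrete: with h = h*31 + x computed mod 2^32, multiplication can only
carry information toward higher bit positions, so flipping the top bit
of an early element flips exactly one bit of the final hash, no matter
how many elements are combined afterwards.

#include <stdio.h>
#include <stdint.h>

/* The "multiply by 31 and add the next value" combiner under discussion. */
static uint32_t
combine31(uint32_t h, uint32_t x)
{
	return h * 31u + x;
}

int
main(void)
{
	uint32_t	a = 0;
	uint32_t	b = 0;
	int			i;

	for (i = 0; i < 64; i++)
	{
		uint32_t	e = 0xdeadbeefu ^ (uint32_t) i; /* arbitrary element hashes */

		a = combine31(a, e);
		/* same stream, but with the top bit of the first element flipped */
		b = combine31(b, i == 0 ? e ^ 0x80000000u : e);
	}

	/*
	 * The initial difference of 2^31 is multiplied by 31 at each step, and
	 * 31 * 2^31 mod 2^32 = 2^31, so the difference stays pinned at bit 31
	 * forever: this prints xor = 80000000.
	 */
	printf("a = %08x, b = %08x, xor = %08x\n",
		   (unsigned) a, (unsigned) b, (unsigned) (a ^ b));
	return 0;
}

By contrast, a combiner that also moves bits rightward (a rotate
combined with the multiply, or a finalization mix over the accumulated
value) lets every input bit reach every output bit, which is the
property being asked for above.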