"George Sakkis" <[EMAIL PROTECTED]> wrote in message 
news:[EMAIL PROTECTED]
> "Christian Stapfer" <[EMAIL PROTECTED]> wrote:
>
>> <[EMAIL PROTECTED]> wrote:
>> > try to use set.
>>
>>     Sorting the two lists and then extracting
>> A-B, B-A, A|B, A & B and A ^ B in one single
>> pass seems to me very likely to be much faster
>> for large lists.
>
> Why don't you implement it, test it and time it
> to be more convincing about your intuition ?

The problem lies in generating the test data.
Merely producing a set of suitably "average",
"random", and "worst-case" datasets could
turn out to be a major undertaking.
 If the documentation stated the order-of-magnitude
behavior of those basic operations up front, then
I (and *anyone* else who ever wanted to use those
operations on large lists / large sets) could make
a quick order-of-magnitude estimate of how
a certain program design will behave, performance-
wise.
  *Experimenting* is not necessarily as easy to
do as you seem to believe. How do you, for example,
hit upon the worst-case behavior with your test
data? - Without knowing *anything* about the
implementation it might be a matter of sheer luck.
If you *know* something about the implementation
then, of course, you might be able to figure it
out. (But note that if you know *that* much about
the implementation, you usually have an order-of-
magnitude estimate anyway and don't need to do
*any* experimenting in order to answer my question.)
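
(For what it's worth, here is a rough sketch of the
single-pass idea from earlier in the thread - a
hypothetical helper, assuming both inputs are sorted
lists of unique elements; it is not meant as a tuned
implementation, only as an illustration:)

```python
def merged_ops(a, b):
    """Given sorted lists a and b (unique elements), compute
    A-B, B-A, A|B, A&B and A^B in one merge-style pass."""
    a_minus_b, b_minus_a, union, inter, sym = [], [], [], [], []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            # element only in a
            a_minus_b.append(a[i]); sym.append(a[i]); union.append(a[i])
            i += 1
        elif a[i] > b[j]:
            # element only in b
            b_minus_a.append(b[j]); sym.append(b[j]); union.append(b[j])
            j += 1
        else:
            # element in both
            inter.append(a[i]); union.append(a[i])
            i += 1; j += 1
    # drain whichever list still has elements left
    a_minus_b += a[i:]; sym += a[i:]; union += a[i:]
    b_minus_a += b[j:]; sym += b[j:]; union += b[j:]
    return a_minus_b, b_minus_a, union, inter, sym
```

(Whether this beats the builtin set operations for
large inputs is exactly the kind of question that an
order-of-magnitude statement in the docs would let
one answer without benchmarking.)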

Regards,
Christian


-- 
http://mail.python.org/mailman/listinfo/python-list
