Re: Slow down while creating a big list and iterating over it

2010-01-31 Thread marc magrans de abril
Hi! I have found a good-enough solution, although it only works if the number of patterns (clusters) is not very big:

    def classify(f):
        THRESHOLD = 0.1
        patterns = {}
        for l in f:
            found = False
            for p, c in patterns.items():
                if dist(l, p) < THRESHOLD:
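The archive preview cuts the function off here. A minimal runnable sketch of the same threshold-clustering idea follows; the completion of the loop body and the dist() function are assumptions (the original dist() is never shown in the thread), with a normalized difflib similarity standing in for it:

```python
from difflib import SequenceMatcher

THRESHOLD = 0.1

def dist(a, b):
    # Hypothetical distance: 1 minus difflib's similarity ratio.
    # The thread never shows the real dist(), so this is a stand-in.
    return 1.0 - SequenceMatcher(None, a, b).ratio()

def classify(f):
    # Map each cluster's representative pattern to its member lines.
    patterns = {}
    for l in f:
        found = False
        for p, members in patterns.items():
            if dist(l, p) < THRESHOLD:
                members.append(l)   # close enough: join this cluster
                found = True
                break
        if not found:
            patterns[l] = [l]       # start a new cluster around this line
    return patterns
```

As the author notes, every line is compared against every existing representative, so the cost grows linearly with the number of clusters.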

Re: Slow down while creating a big list and iterating over it

2010-01-31 Thread marc magrans de abril
> Find out which pattern is being used on the second iteration and then try it
> on the first iteration. Is it just as slow?

You were right: the second pattern was 1891 bytes but the first was just 142 :P I will need to put more thought than I expected into the "small script".
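The 142-byte versus 1891-byte gap explains the slowdown if dist() is linear (or worse) in pattern length: every comparison against the big pattern does roughly 13x the work. A sketch with a hypothetical linear-time dist() (the thread's real dist() is not shown):

```python
import timeit

def dist(a, b):
    # Hypothetical linear-time distance (a stand-in for the thread's dist()):
    # fraction of differing positions, plus the length mismatch.
    n = max(len(a), len(b))
    if n == 0:
        return 0.0
    diffs = sum(ca != cb for ca, cb in zip(a, b)) + abs(len(a) - len(b))
    return diffs / n

small = "a" * 142    # size of the first pattern mentioned above
big = "b" * 1891     # size of the second pattern

t_small = timeit.timeit(lambda: dist(small, small), number=2000)
t_big = timeit.timeit(lambda: dist(big, big), number=2000)
# Each comparison against the big pattern takes proportionally longer,
# so a cluster keyed on the big pattern slows the whole loop down.
```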

Slow down while creating a big list and iterating over it

2010-01-30 Thread marc magrans de abril
Does anyone know the reason for this behavior? How should I write a program that deals with large data sets in Python? Thanks a lot! marc magrans de abril -- http://mail.python.org/mailman/listinfo/python-list
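The message is truncated, but a frequent cause of this symptom in the CPython 2.x of that era was the cyclic garbage collector re-scanning an ever-growing heap while millions of container objects were created. This is an assumed diagnosis, not confirmed by the preview; the usual workaround was to disable collection during the bulk build:

```python
import gc
import time

def build(n):
    # Creating many small container objects can trigger repeated
    # cyclic-GC passes as the heap grows.
    return [{"id": i} for i in range(n)]

start = time.perf_counter()
gc.disable()                      # skip cyclic GC during the bulk build
try:
    data = build(100_000)
finally:
    gc.enable()                   # always restore the collector
elapsed = time.perf_counter() - start
print(len(data), f"{elapsed:.3f}s")
```

Reference-counting still frees garbage while the collector is off; only cycle detection is paused, so this is safe for a build phase that creates no reference cycles.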

profiling differences using an extra function call

2009-11-23 Thread marc magrans de abril
Hi, I was trying to profile a small script, and after shrinking the code to the minimum I got an interesting profile difference. Given two test functions, test1 and test2, that differ only by an extra level of indirection (i.e. find_substr), I wonder why I got a timing difference >50%? What is