Christian Doll wrote:

> I have a little performance problem with my code...
>
> I have to compare many lists of very many floats. At the moment I have
> nested for-loops:
>
> for a in range(len(lists)):
>     for b in range(a + 1, len(lists)):
>         for valuea in lists[a]:
>             equal = False
>             for valueb in lists[b]:
>                 # inTolerance is my own function; it checks that the
>                 # difference of valuea and valueb is not more than 1.0%
>                 if inTolerance(valuea, valueb, 1.0):
>                     equal = True
>                     break
>         if equal:
>             print a, "and", b, "are equal"

My crystal ball says that if you profile the above you'll find that it spends most of its time in the inTolerance() function that you don't provide.
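Since the function isn't shown, its exact semantics are a guess; a typical relative-tolerance check might look like this (a sketch only -- the choice of valueb as the reference value is an assumption):

    def inTolerance(valuea, valueb, percent):
        # Sketch: true if the two values differ by no more than `percent`
        # percent of valueb's magnitude. Using valueb as the reference is
        # an assumption; your function may normalise differently.
        return abs(valuea - valueb) <= abs(valueb) * percent / 100.0

A Python-level call like that is cheap on its own, but it runs len(lists[a]) * len(lists[b]) times for every pair of lists, and that is where the time goes.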
> I found a version with set which is faster, but I cannot assign a
> tolerance (%):
>
> for a in range(len(lists)):
>     for b in range(a + 1, len(lists)):
>         if len(lists[a]) == len(set(lists[a]).intersection(set(lists[b]))):
>             print a, "and", b, "are equal"

I can't think of a problem that can be solved with that ;)

> Do you have an idea how I can change my code so that I can compare many
> lists of floats with a percentage tolerance very fast?

You can usually speed up number-crunching tasks with numpy, but it looks like you don't have a clear notion of which vectors should be regarded as equal. Perhaps you can provide a bit of background information about the problem you are trying to solve with the code, preferably in plain (if bad) English.
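In the meantime, here is how the tolerance test could be vectorised, assuming "equal" means that every value in one list has at least one counterpart in the other within the given percentage (my reading of your loops -- it may not be what you want):

    import numpy as np

    def all_within_tolerance(xs, ys, percent=1.0):
        # Sketch under the assumption above: for every x in xs there is
        # some y in ys with |x - y| <= |y| * percent / 100.
        xs = np.asarray(xs, dtype=float)
        ys = np.asarray(ys, dtype=float)
        # Broadcasting builds a len(xs) x len(ys) matrix of pairwise checks.
        close = np.abs(xs[:, None] - ys[None, :]) <= np.abs(ys[None, :]) * percent / 100.0
        return bool(close.any(axis=1).all())

    for a in range(len(lists)):
        for b in range(a + 1, len(lists)):
            if all_within_tolerance(lists[a], lists[b]):
                print a, "and", b, "are equal"

Note that this trades memory for speed: the intermediate matrix grows with the product of the two list lengths, so for very long lists a sort-and-search approach would scale better.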