On 9 May 2005 15:36:37 -0700, "[EMAIL PROTECTED]" <[EMAIL PROTECTED]> wrote:
>OK I need to be more clear I guess! Unique Elements I mean, elements
>that are non repeating. so in the above list 0.4, 0.9 are unique as
>they exist only once in the list.
>

You want to be careful of your definitions, especially with floating-point
values, which may surprise the uninitiated. Dicts and sets hash numerically
equal values to the same hash, and then do an equality test, so you get
legitimate results like:

>>> set([9*.1-8*.1, .1])
set([0.10000000000000001, 0.099999999999999978])
>>> set([123, 123.0, 123L])
set([123])
>>> set([123.0, 123, 123L])
set([123.0])
>>> set([123L, 123, 123.0])
set([123L])

You may want to consider creating representations other than the original
data to use in your uniqueness testing, depending on your definition.

If you are happy with the way dicts and sets compare elements, you could do
a dict that keeps counts, or FTHOI something different (hot off the untested
griddle, so test before relying on it ;-)

>>> data = [0.1,0.5,0.6,0.4,0.1,0.5,0.6,0.9]
>>> once=set()
>>> more=set()
>>> for el in data:
...     if el in once: more.add(el)
...     else: once.add(el)
...
>>> once-more
set([0.90000000000000002, 0.40000000000000002])

Not the most efficient space-wise, and I'm not sure about speed.

Regards,
Bengt Richter
--
http://mail.python.org/mailman/listinfo/python-list
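
[Editor's note: a minimal sketch of the "dict that keeps counts" idea mentioned
above. The variable names are illustrative assumptions, and the exact float
reprs printed will vary with your Python version.]

data = [0.1, 0.5, 0.6, 0.4, 0.1, 0.5, 0.6, 0.9]

# Count how many times each (hash-equal) value occurs.
counts = {}
for el in data:
    counts[el] = counts.get(el, 0) + 1

# Keep only the values seen exactly once, preserving the original list order.
unique = [el for el in data if counts[el] == 1]
print(unique)    # [0.4, 0.9] (float reprs depend on the Python version)

This counts with whatever equality dicts already use, so 123, 123.0 and 123L
would still collapse onto one key, just as in the set examples in the post.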
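
[Editor's note: and a sketch of the "other representation" idea, keying
uniqueness on a rounded stand-in so floats that differ only by representation
error count as the same element. The 12-digit precision is an arbitrary
assumption; pick whatever tolerance matches your definition of "the same
number".]

data = [9*.1-8*.1, .1, 0.5, 0.6, 0.4, 0.5, 0.6, 0.9]

# Key each element by a rounded stand-in rather than the raw float.
counts = {}
for el in data:
    key = round(el, 12)
    counts[key] = counts.get(key, 0) + 1

# 9*.1-8*.1 and .1 now share the key 0.1, so neither is reported as unique.
print(sorted(k for k, n in counts.items() if n == 1))    # [0.4, 0.9]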