Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-23 Thread Skip Montanaro
> Have tracked-down and communicated with the site owner/operator. He
> advised a loop-back problem which has now been blocked.

I believe this has been corrected in the past, more than once, though my memory is a bit hazy now. It's not clear to me why this particular site keeps messing up thei

Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-23 Thread DL Neil via Python-list
Have tracked-down and communicated with the site owner/operator. He advised a loop-back problem which has now been blocked. -- Regards =dn -- https://mail.python.org/mailman/listinfo/python-list

Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-22 Thread Skip Montanaro
> I filter out these messages in my news setup (using gnus on Emacs) on
> the header:
>
> ("head"
>  ("Injection-Info: news.bbs.nz" -1002 nil s))
>
> i.e. each message that contains "news.bbs.nz" in the "Injection-Info"
> header will be made invisible.

This solved the problem for me. Thanks. M

Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-22 Thread Pieter van Oostrum
Skip Montanaro writes:
>> This just arrived at my newserver:
>> ...
>> I find that very curious because the post is mine but which I
>> sent out with these headers:

I filter out these messages in my news setup (using gnus on Emacs) on the header:

("head"
 ("Injection-Info: news.bbs.nz" -

Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-22 Thread Skip Montanaro
> This just arrived at my newserver:
> ...
> I find that very curious because the post is mine but which I
> sent out with these headers:
> ...
> The timezone on the date header has changed, the subject has been
> truncated, the Path and injection info is all different, and most
> crucially, the ME

Re: news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-22 Thread Peter Pearson
On Tue, 21 Apr 2020 21:42:42 + (UTC), Eli the Bearded wrote:
> This just arrived at my newserver:
>
> Path:
> reader2.panix.com!panix!goblin2!goblin.stu.neva.ru!news.unit0.net!2.eu.feeder.erje.net!4.us.feeder.erje.net!feeder.erje.net!xmission!csiph.com!news.bbs.nz!.POSTED.agency.bbs.nz!not

news.bbs.nz is spewing duplicates to comp.lang.python

2020-04-21 Thread Eli the Bearded
This just arrived at my newserver:

Path: reader2.panix.com!panix!goblin2!goblin.stu.neva.ru!news.unit0.net!2.eu.feeder.erje.net!4.us.feeder.erje.net!feeder.erje.net!xmission!csiph.com!news.bbs.nz!.POSTED.agency.bbs.nz!not-for-mail
From: Eli the Bearded <*@eli.users.panix.com> (Eli the Bea

Re: how to remove duplicates dict from the list of dictionary based on one of the elements is duplicate in list of dict

2019-11-16 Thread Cameron Simpson
On 17Nov2019 12:26, Iranna Mathapati wrote:

How to remove duplicates dict from the list of dictionary based on one of the duplicate elements in the dictionary,

l = [{"component":"software", "version":"1.2" },
     {"component":"

how to remove duplicates dict from the list of dictionary based on one of the elements is duplicate in list of dict

2019-11-16 Thread Iranna Mathapati
Hi,

How to remove duplicate dicts from a list of dictionaries, based on one of the elements being duplicated across dicts?

l = [{"component":"software", "version":"1.2" },
     {"component":"hardware", "version":"2.2"
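
For reference, what the question above asks for reduces to keeping the first dict seen per key value. A minimal sketch; the sample list is re-typed and completed from the truncated post, so the third entry and its version number are illustrative assumptions:

```python
# Keep only the first dict seen for each value of `key`,
# preserving the original list order.
def dedupe_by_key(dicts, key):
    seen = set()
    result = []
    for d in dicts:
        if d[key] not in seen:
            seen.add(d[key])
            result.append(d)
    return result

l = [{"component": "software", "version": "1.2"},
     {"component": "hardware", "version": "2.2"},
     {"component": "software", "version": "1.3"}]   # hypothetical duplicate

print(dedupe_by_key(l, "component"))
```

The key values must be hashable; for composite keys, pass a tuple of fields instead.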

RE: Duplicates

2019-03-08 Thread Steve
To: python-list@python.org Subject: Duplicates I get two messages for every post - p...@netrh.com -- Regards, Milt m...@ratcliffnet.com -- https://mail.python.org/mailman/listinfo/python-list -- https://mail.python.org/mailman/listinfo/python-list

Duplicates

2019-03-08 Thread Milt
I get two messages for every post - p...@netrh.com -- Regards, Milt m...@ratcliffnet.com -- https://mail.python.org/mailman/listinfo/python-list

Re: I am trying to delete duplicates but the job just finishes with an exit code 0

2017-11-07 Thread Peter Otten
tysondog...@gmail.com wrote:

> I am trying to delete duplicates but the job just finishes with an exit
> code 0 and does not delete any duplicates.
>
> The duplicates for the data always exist in Column F and I am desiring to
> delete the entire row B-I
>
> Any ideas?

I am trying to delete duplicates but the job just finishes with an exit code 0

2017-11-06 Thread tysondogerz
I am trying to delete duplicates but the job just finishes with an exit code 0 and does not delete any duplicates. The duplicates for the data always exist in Column F and I want to delete the entire row B-I. Any ideas?

import openpyxl
wb1 = openpyxl.load_workbook('C:/dwad/SWWA
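
The core of such a script can be sketched without Excel at all: remember the column-F values already seen and keep only the first row per value. The function name, column index, and sample rows below are illustrative, not from the original workbook. With openpyxl you would apply the same logic, delete the flagged rows from the bottom up (deleting while iterating forward skips rows), and remember to call wb.save(); forgetting save() is a common reason such a script exits with code 0 without changing the file.

```python
# Core de-duplication logic: rows are tuples for columns A-I, and the
# de-dup key lives in column F (zero-based index 5).  Keep first hits only.
def rows_without_duplicates(rows, key_index=5):
    seen = set()
    kept = []
    for row in rows:
        if row[key_index] not in seen:
            seen.add(row[key_index])
            kept.append(row)
    return kept

rows = [
    ("a", "b", "c", "d", "e", "dup1", "g", "h", "i"),
    ("a", "b", "c", "d", "e", "dup2", "g", "h", "i"),
    ("a", "b", "c", "d", "e", "dup1", "g", "h", "i"),  # duplicate F value
]
print(rows_without_duplicates(rows))
```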

Re: pandas dataframe, find duplicates and add suffix

2017-03-30 Thread Pavol Lisy
', 'b.jpg', 'c.jpg', 'd.jpg',
> 'e.jpg', 'f.jpg', 'g.jpg'], columns=['model', 'dtime'])
>
> print(df.head(10))
>
>        model              dtime
> a.jpg  first  2017-01-01_112233
> b.jpg  fir

pandas dataframe, find duplicates and add suffix

2017-03-28 Thread zljubisic
pg', 'g.jpg'], columns=['model', 'dtime'])

print(df.head(10))

        model               dtime
a.jpg   first   2017-01-01_112233
b.jpg   first   2017-01-01_112234
c.jpg   second  2017-01-01_112234
d.jpg   second  2017-01-01_112234
e.jpg   second  2017-01-01_112234
f.j
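
The suffixing asked about here (make repeated 'model' values unique) amounts to a running count per value; in pandas itself that count is what df.groupby('model').cumcount() returns. A dependency-free sketch of the same idea, with illustrative data standing in for the truncated frame:

```python
# Append _1, _2, ... to the second and later occurrences of each value,
# leaving the first occurrence untouched.
def suffix_duplicates(values):
    counts = {}
    out = []
    for v in values:
        n = counts.get(v, 0)
        out.append(v if n == 0 else "%s_%d" % (v, n))
        counts[v] = n + 1
    return out

print(suffix_duplicates(["first", "first", "second", "second", "second"]))
```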

Re: Duplicates of third-party libraries

2009-12-08 Thread Martin P. Hellwig
Grant Edwards wrote: Does windows even _have_ a library dependancy system that lets an application specify which versions of which libraries it requires? Well you could argue that easy_install does it a bit during install. Then there is 'Windows Side By Side' (winsxs) system which sorta does i

Re: Duplicates of third-party libraries

2009-12-08 Thread Grant Edwards
On 2009-12-08, Martin P. Hellwig wrote: > Lie Ryan wrote: > >> >> The only thing that package managers couldn't provide is for the >> extremist bleeding edge; those that want the latest and the greatest in >> the first few seconds the developers releases them. The majority of >> users don't fa

Re: Duplicates of third-party libraries

2009-12-08 Thread Martin P. Hellwig
Lie Ryan wrote: The only thing that package managers couldn't provide is for the extremist bleeding edge; those that want the latest and the greatest in the first few seconds the developers releases them. The majority of users don't fall into that category, most users are willing to wait a

Re: Duplicates of third-party libraries

2009-12-08 Thread Grant Edwards
On 2009-12-08, Martin P. Hellwig wrote: > - In the ideal world, a upgrade of a dependency won't break > your program, in reality users fear upgrading dependencies > because they don't know for sure it won't result in a dll > hell type of problem. In my experience with binary-based distros

Re: Duplicates of third-party libraries

2009-12-08 Thread Lie Ryan
On 12/9/2009 12:02 AM, David Cournapeau wrote: On Tue, Dec 8, 2009 at 9:02 PM, Lie Ryan wrote: I disagree, what you should have is an Operating System with a package management system that addresses those issues. The package management must update your software and your dependencies, and keep

Re: Duplicates of third-party libraries

2009-12-08 Thread David Cournapeau
On Tue, Dec 8, 2009 at 9:02 PM, Lie Ryan wrote: > > I disagree, what you should have is an Operating System with a package > management system that addresses those issues. The package management must > update your software and your dependencies, and keep track of > incompatibilities between you a

Re: Duplicates of third-party libraries

2009-12-08 Thread Martin P. Hellwig
Lie Ryan wrote:

Yes, from an argumentative perspective you are right. But given the choice of being right and alienating the vast majority of my potential user base, I'd rather be wrong. For me, 'Although practicality beats purity' is more important than trying to beat a dead horse that is a p

Re: Duplicates of third-party libraries

2009-12-08 Thread Lie Ryan
On 12/8/2009 3:25 PM, Martin P. Hellwig wrote: Ben Finney wrote: "Martin P. Hellwig" writes: Along with the duplication this introduces, it also means that any bug fixes — even severe security fixes — in the third-party code will not be addressed in your duplicate. I disagree, what you ne

Re: Duplicates of third-party libraries

2009-12-07 Thread Martin P. Hellwig
Ben Finney wrote: This omits the heart of the problem: There is an extra delay between release and propagation of the security fix. When the third-party code is released with a security fix, and is available in the operating system, the duplicate in your application will not gain the advantage o

Re: Duplicates of third-party libraries

2009-12-07 Thread Ben Finney
"Martin P. Hellwig" writes: > Ben Finney wrote: > > Along with the duplication this introduces, it also means that any bug > > fixes — even severe security fixes — in the third-party code will not be > > addressed in your duplicate. > I disagree, what you need is: > - An automated build system

Re: Duplicates of third-party libraries

2009-12-07 Thread Martin P. Hellwig
Ben Finney wrote: "Martin P. Hellwig" writes: Along with the duplication this introduces, it also means that any bug fixes — even severe security fixes — in the third-party code will not be addressed in your duplicate. I disagree, what you need is: - An automated build system for your del

Duplicates of third-party libraries (was: When will Python 3 be fully deployed)

2009-12-07 Thread Ben Finney
ill not be addressed in your duplicate. This defeats one of the many benefits of a package management operating system: that libraries, updated once, will benefit any other package depending on them. Please reconsider policies like including duplicates of third-party code. Don't Repeat Yourself is

Re: extracting duplicates from CSV file by specific fields

2009-04-28 Thread VP
unqs.append(row)

print "\nUniques:\n"
for row in unqs:
    print row
print "\nDuplicates:\n"
for row in dups:
    print row
print "\n"

Result:
-
Originals:
['a.a', 'sn-01']
['b.b', 'sn-02']
['c.c', 

Re: extracting duplicates from CSV file by specific fields

2009-04-28 Thread Rhodri James
3'
'ccc.444', 'T400', 'pn123', 'sn444'
'ddd', 'T500', 'pn123', 'sn555'
'eee.666', 'T600', 'pn123', 'sn444'
'fff.777', 'T700', 'pn123', '

Re: extracting duplicates from CSV file by specific fields

2009-04-28 Thread MRAB
0', 'pn123', 'sn444'
'ddd', 'T500', 'pn123', 'sn555'
'eee.666', 'T600', 'pn123', 'sn444'
'fff.777', 'T700', 'pn123', 'sn777'

How can I extract duplicates che

extracting duplicates from CSV file by specific fields

2009-04-28 Thread VP
'pn123', 'sn444'
'ddd', 'T500', 'pn123', 'sn555'
'eee.666', 'T600', 'pn123', 'sn444'
'fff.777', 'T700', 'pn123', 'sn777'

How can I extract duplicates checking eac
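
One way to do the requested extraction with the stdlib csv module is to group rows by the field of interest, then split groups into uniques and duplicates. The sample data is re-typed from the truncated post (with plain CSV quoting), and treating index 3 as the serial-number key column is an assumption:

```python
import csv
import io
from collections import defaultdict

# Sample rows modeled on the post's data: name, type, part number, serial.
data = """aaa.111,T100,pn123,sn444
ddd,T500,pn123,sn555
eee.666,T600,pn123,sn444
fff.777,T700,pn123,sn777
"""

# Group every row by its serial-number field (index 3).
groups = defaultdict(list)
for row in csv.reader(io.StringIO(data)):
    groups[row[3]].append(row)

# Keys seen more than once are the duplicates; the rest are uniques.
dups = {key: rows for key, rows in groups.items() if len(rows) > 1}
uniques = [rows[0] for rows in groups.values() if len(rows) == 1]
print(dups)
```

For real files, replace the StringIO with `open(path, newline='')` and pick whichever field index (or tuple of indices) defines "duplicate" for your data.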

Strange pexpect behaviour: just duplicates stdin

2009-03-30 Thread Nikolaus Rath
Hello, I have a strange problem with pexpect:

$ cat test.py
#!/usr/bin/python
import pexpect

child = pexpect.spawn("./test.pl")
while True:
    try:
        line = raw_input()
    except EOFError:
        break
    child.sendline(line)
    print child.readline().rstrip("\r\n")
child.close()

Re: Find duplicates in a list/array and count them ...

2009-03-27 Thread MRAB
--- delete end ---

And here you're scanning the entire list _for every item_; if there are 'n' items then it's being scanned 'n' times!

The number of times each item occurred is now stored in oid_count.

for oid in tmp_list:
    a = str(oid) + ' '

Find duplicates in a list/array and count them ...

2009-03-27 Thread Paul . Scipione
ows.Next()
writeMessage(' ')
writeMessage(str(strftime("%H:%M:%S", localtime())) + ' generating statistics...')
dup_count = len(tmp_list)
tmp_list = list(set(tmp_list))
tmp_list.sort()
for oid in tmp_list:
    a = str(oid) + ' '
    while len(a) < 2

Re: Find duplicates in a list and count them ...

2009-03-26 Thread Josh Dukes
On Thu, 26 Mar 2009 16:02:20 -0400 "D'Arcy J.M. Cain" wrote:

or

l = ( randint(0,9) for x in xrange(8) )

> On Thu, 26 Mar 2009 16:00:01 -0400
> Albert Hopkins wrote:
> > > l = list()
> > > for i in xrange(8):
> > >     l.append(randint(0,10))
> > ^^^
> > should

Re: Find duplicates in a list and count them ...

2009-03-26 Thread John Machin
On Mar 27, 8:14 am, paul.scipi...@aps.com wrote: > Hi D'Arcy J.M. Cain, > > Thank you.  I tried this and my list of 76,979 integers got reduced to a > dictionary of 76,963 items, each item listing the integer value from the > list, a comma, and a 1. I doubt this very much. Please show: (a) your

Re: Find duplicates in a list and count them ...

2009-03-26 Thread MRAB
ontain only 11 items listing 11 integer values and the number of times they appear in my original list. Not all of the values are 1. The 11 duplicates will be higher.

Just iterate through the dict to find all keys with values > 1.

>>> icounts
{1: 2, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7

Re: Find duplicates in a list and count them ...

2009-03-26 Thread Benjamin Kaplan
umber of times they appear in my original
> list.
> Not all of the values are 1. The 11 duplicates will be higher.

Just iterate through the dict to find all keys with values > 1.

>>> icounts
{1: 2, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1, 7: 5, 8: 3, 9: 1, 10: 1, 11: 1}

Python 2.x:
>

RE: Find duplicates in a list and count them ...

2009-03-26 Thread Paul . Scipione
abase Administrator
work: 602-371-7091
cell: 480-980-4721

-Original Message-
From: D'Arcy J.M. Cain [mailto:da...@druid.net]
Sent: Thursday, March 26, 2009 12:50 PM
To: Scipione, Paul (ZP5296)
Cc: python-list@python.org
Subject: Re: Find duplicates in a list and count them ...

On Thu, 2

Re: Find duplicates in a list and count them ...

2009-03-26 Thread Paul Rubin
"D'Arcy J.M. Cain" writes:
> icount = {}
> for i in list_of_ints:
>     icount[i] = icount.get(i, 0) + 1

from collections import defaultdict
icount = defaultdict(int)
for i in list_of_ints:
    icount[i] += 1
--
http://mail.python.org/mailman/listinfo/python-list

Re: Find duplicates in a list and count them ...

2009-03-26 Thread D'Arcy J.M. Cain
On Thu, 26 Mar 2009 16:00:01 -0400 Albert Hopkins wrote:
> > l = list()
> > for i in xrange(8):
> >     l.append(randint(0,10))
> ^^^
> should have been:
>     l.append(randint(0,9))

Or even:

l = [randint(0,9) for x in xrange(8)]
--
D'Arcy J.M. Cain

Re: Find duplicates in a list and count them ...

2009-03-26 Thread Albert Hopkins
On Thu, 2009-03-26 at 15:54 -0400, Albert Hopkins wrote:
[...]
> $ cat test.py
> from random import randint
>
> l = list()
> for i in xrange(8):
>     l.append(randint(0,10))
^^^
should have been:
    l.append(randint(0,9))

> hist = dict()
> for i in l:
>

Re: Find duplicates in a list and count them ...

2009-03-26 Thread Albert Hopkins
On Thu, 2009-03-26 at 12:22 -0700, paul.scipi...@aps.com wrote: > Hello, > > I'm a newbie to Python. I have a list which contains integers (about > 80,000). I want to find a quick way to get the numbers that occur in > the list more than once, and how many times that number is duplicated > in t

Re: Find duplicates in a list and count them ...

2009-03-26 Thread D'Arcy J.M. Cain
On Thu, 26 Mar 2009 12:22:27 -0700 paul.scipi...@aps.com wrote: > I'm a newbie to Python. I have a list which contains integers (about > 80,000). I want to find a quick way to get the numbers that occur in the > list more than once, and how many times that number is duplicated in the > list.

Find duplicates in a list and count them ...

2009-03-26 Thread Paul . Scipione
Hello, I'm a newbie to Python. I have a list which contains integers (about 80,000). I want to find a quick way to get the numbers that occur in the list more than once, and how many times that number is duplicated in the list. I've done this right now by looping through the list, getting a
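
The standard answer to this question today is collections.Counter, which counts all ~80,000 values in a single O(n) pass; the input below is toy data, not the poster's list:

```python
from collections import Counter

# Count every value once, then keep only those that occur more than once.
numbers = [1, 1, 2, 3, 7, 7, 7, 8, 8, 8]
counts = Counter(numbers)
duplicates = {n: c for n, c in counts.items() if c > 1}
print(duplicates)
```

This is the same dict-of-counts the replies in this thread build by hand with `icount.get(i, 0) + 1` or `defaultdict(int)`.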

Re: Checking for the existence of Duplicates

2007-09-28 Thread Paul Hankin
plicate checking code, although fast, is executed so many times. For a sudoku solver, you may be better dodging the problem, and maintaining a set per row, column and box saying which numbers have been placed already - and thus avoiding adding duplicates in the first place. It may be better to use a

Checking for the existence of Duplicates

2007-09-28 Thread AndyB
I have found a lot of material on removing duplicates from a list, but I am trying to find the most efficient way to just check for the existence of duplicates in a list. Here is the best I have come up with so far:

CheckList = [x[ValIndex] for x in self.__XRList[z
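
For a bare existence check, the usual idiom compares the list's length with the length of a set built from it; an early-exit scan can stop at the first repeat instead of always touching every element. A sketch (the poster's CheckList is truncated above, so plain lists stand in for it):

```python
# Whole-list check: build the set once and compare sizes.  O(n), no early exit.
def has_duplicates(seq):
    return len(set(seq)) != len(seq)

# Early-exit variant: stops scanning as soon as the first repeat is found,
# which helps when duplicates tend to appear early in a long sequence.
def has_duplicates_early(seq):
    seen = set()
    for x in seq:
        if x in seen:
            return True
        seen.add(x)
    return False
```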

Re: removing duplicates, or, converting Set() to string

2006-07-27 Thread maphew
thank you everybody for your help! That worked perfectly. :) I really appreciate the time you spent answering what is probably a pretty basic question for you. It's nice not to be ignored. be well, -matt -- http://mail.python.org/mailman/listinfo/python-list

Re: removing duplicates, or, converting Set() to string

2006-07-26 Thread John Machin
Simon Forman wrote: > > Do ','.join(clean) to make a single string with commas between the > items in the set. (If the items aren't all strings, you'll need to > convert them to strings first.) > And if the items themselves could contain commas, or quote characters, you might like to look at the

Re: removing duplicates, or, converting Set() to string

2006-07-26 Thread John Machin
[EMAIL PROTECTED] wrote: > Hello, > > I have some lists for which I need to remove duplicates. I found the > sets.Sets() module which does exactly this I think you mean that you found the sets.Set() constructor in the set module. If you are using Python 2.4, use the built-in se

Re: removing duplicates, or, converting Set() to string

2006-07-26 Thread Simon Forman
[EMAIL PROTECTED] wrote:
> Hello,
>
> I have some lists for which I need to remove duplicates. I found the
> sets.Sets() module which does exactly this, but how do I get the set
> back out again?
>
> # existing input: A,B,B,C,D
> # desired result: A,B,C,D
>
> import

Re: removing duplicates, or, converting Set() to string

2006-07-26 Thread bearophileHUGS
The write accepts strings only, so you may do:

out.write( repr(list(clean)) )

Notes:
- If you need the strings in a nice order, you may sort them before saving them: out.write( repr(sorted(clean)) )
- If you need them in the original order you need a stable method, you can extract the relevant co

removing duplicates, or, converting Set() to string

2006-07-26 Thread maphew
Hello,

I have some lists for which I need to remove duplicates. I found the sets.Sets() module which does exactly this, but how do I get the set back out again?

# existing input: A,B,B,C,D
# desired result: A,B,C,D

import sets
dupes = ['A','B','B','C',
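
The missing step the poster asks about is joining the set back into a string. With the built-in set type (the old sets module was removed in Python 3), a sketch:

```python
# Dedupe, then join back into a comma-separated string.  sorted() gives a
# deterministic order; for this input the sorted order happens to match the
# desired A,B,C,D.  Use an order-preserving dedup if original order matters.
dupes = ['A', 'B', 'B', 'C', 'D']
clean = ','.join(sorted(set(dupes)))
print(clean)
```

If the items are not strings, convert first, e.g. `','.join(str(x) for x in sorted(set(items)))`.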

Re: Removing duplicates from a list

2005-09-16 Thread Steven Bethard
drochom wrote:
> i suppose this one is faster (but in most cases efficiency doesn't
> matter)
>
> def stable_unique(s):
>     e = {}
>     ret = []
>     for x in s:
>         if not e.has_key(x):
>             e[x] = 1
>             ret.append(x)
>     retur

Re: Removing duplicates from a list

2005-09-16 Thread martijn
Thanks for all the information. And now I understand the timeit module ;) GC-Martijn -- http://mail.python.org/mailman/listinfo/python-list

Re: Removing duplicates from a list

2005-09-15 Thread drochom
thanks, nice job. but this benchmark is pretty deceptive: try this (definition of unique2 and unique3 as above):

>>> import timeit
>>> a = range(1000)
>>> t = timeit.Timer('unique2(a)','from __main__ import unique2,a')
>>> t2 = timeit.Timer('stable_unique(a)','from __main__ import stable_unique,a

Re: Removing duplicates from a list

2005-09-15 Thread martijn
Ow thanks, I'm a newbie and I did this test. (don't know if this is the best way to do a small speed test)

import timeit

def unique2(keys):
    unique = []
    for i in keys:
        if i not in unique: unique.append(i)
    return unique

def unique3(s):
    e = {}
    ret = []
    for x in s:

Re: Removing duplicates from a list

2005-09-15 Thread drochom
i suppose this one is faster (but in most cases efficiency doesn't matter)

>>> def stable_unique(s):
        e = {}
        ret = []
        for x in s:
            if not e.has_key(x):
                e[x] = 1
                ret.append(x)
        return ret

cheers,
przemek

Re: Removing duplicates from a list

2005-09-15 Thread drochom
there wasn't any information about ordering... maybe i'll find something better which don't destroy original ordering regards przemek -- http://mail.python.org/mailman/listinfo/python-list

Re: Removing duplicates from a list

2005-09-15 Thread martijn
Look at the code below

def unique(s):
    return list(set(s))

def unique2(keys):
    unique = []
    for i in keys:
        if i not in unique: unique.append(i)
    return unique

tmp = [0,1,2,4,2,2,3,4,1,3,2]
print tmp
print unique(tmp)
print unique2(tmp)
--
[0, 1, 2, 4, 2

Re: Removing duplicates from a list

2005-09-15 Thread drochom
Rubinho wrote:
> I've a list with duplicate members and I need to make each entry
> unique.

hi, other possibility (my newest discovery:) )

>>> a = [1,2,2,4,2,1,3,4]
>>> unique = dict.fromkeys(a).keys()
>>> unique
[1, 2, 3, 4]

regards
przemek
--
http://mail.python.org/mailman/listinfo/pyth

Re: Removing duplicates from a list

2005-09-14 Thread Steven Bethard
przemek drochomirecki wrote:
> def unique(s):
>     e = {}
>     for x in s:
>         if not e.has_key(x):
>             e[x] = 1
>     return e.keys()

This is basically identical in functionality to the code:

def unique(s):
    return list(set(s))

And with the new-and-improved C implementation of sets comin

Re: Removing duplicates from a list

2005-09-14 Thread tcc . chapman
This works too, if speed isn't your thing..

>>> a = [ 1,2,3,2,6,1,3,4,1,7,5,6,7]
>>> a = dict( ( (i,None) for i in a)).keys()
>>> a
[1, 2, 3, 4, 5, 6, 7]
--
http://mail.python.org/mailman/listinfo/python-list

Re: Removing duplicates from a list

2005-09-14 Thread przemek drochomirecki
roach)
>
> for x in mylist:
>     if mylist.count(x) > 1:
>         mylist.remove(x)
>
> Method 2 (not so traditional)
>
> mylist = set(mylist)
> mylist = list(mylist)
>
> Converting to a set drops all the duplicates and converting back to a
> list, well, gets it back to

Re: Removing duplicates from a list

2005-09-14 Thread Will McGugan
Steven D'Aprano wrote: > > > Don't imagine, measure. > > Resist the temptation to guess. Write some test functions and time the two > different methods. But first test that the functions do what you expect: > there is no point having a blindingly fast bug. Thats is absolutely correct. Although

Re: Removing duplicates from a list

2005-09-14 Thread Steven D'Aprano
On Wed, 14 Sep 2005 13:28:58 +0100, Will McGugan wrote: > Rubinho wrote: >> I can't imagine one being much faster than the other except in the case >> of a huge list and mine's going to typically have less than 1000 >> elements. > > I would imagine that 2 would be significantly faster. Don't

Re: Removing duplicates from a list

2005-09-14 Thread Rocco Moretti
Rubinho wrote: > I can't imagine one being much faster than the other except in the case > of a huge list and mine's going to typically have less than 1000 > elements. To add to what others said, I'd imagine that the technique that's going to be fastest is going to depend not only on the lengt

Re: Removing duplicates from a list

2005-09-14 Thread martijn
I do this:

def unique(keys):
    unique = []
    for i in keys:
        if i not in unique: unique.append(i)
    return unique

I don't know what is faster at the moment.
--
http://mail.python.org/mailman/listinfo/python-list

Re: Removing duplicates from a list

2005-09-14 Thread Christian Stapfer
c, O(n^2), in the length n of the list if all keys are unique. Conversion to a set just might use a better sorting algorithm than this (i.e. n*log(n)) and throwing out duplicates (which, after sorting, are positioned next to each other) is O(n). If conversion to a set should turn out to be slow

Re: Removing duplicates from a list

2005-09-14 Thread Rubinho
Peter Otten wrote: > Rubinho wrote: > > > I've a list with duplicate members and I need to make each entry > > unique. > > > > I've come up with two ways of doing it and I'd like some input on what > > would be considered more pythonic (or at least best practice). > > > > Method 1 (the traditional

Re: Removing duplicates from a list

2005-09-14 Thread Peter Otten
Rubinho wrote:
> I've a list with duplicate members and I need to make each entry
> unique.
>
> I've come up with two ways of doing it and I'd like some input on what
> would be considered more pythonic (or at least best practice).
>
> Method 1 (the traditional approach)
>
> for x in mylist:
>

Re: Removing duplicates from a list

2005-09-14 Thread Will McGugan
ional approach)
>
> for x in mylist:
>     if mylist.count(x) > 1:
>         mylist.remove(x)
>
> Method 2 (not so traditional)
>
> mylist = set(mylist)
> mylist = list(mylist)
>
> Converting to a set drops all the duplicates and converting back to a
> list,

Re: Removing duplicates from a list

2005-09-14 Thread Thomas Guettler
ctice).

> mylist = set(mylist)
> mylist = list(mylist)
>
> Converting to a set drops all the duplicates and converting back to a
> list, well, gets it back to a list which is what I want.
>
> I can't imagine one being much faster than the other except in the case
> of a

Removing duplicates from a list

2005-09-14 Thread Rubinho
st.count(x) > 1:
        mylist.remove(x)

Method 2 (not so traditional)

mylist = set(mylist)
mylist = list(mylist)

Converting to a set drops all the duplicates and converting back to a list, well, gets it back to a list which is what I want. I can't imagine one being much faster than the other
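
Method 1 has a correctness problem on top of its O(n^2) cost: removing items from a list while iterating over it makes the iterator's position and the shrinking list drift out of step, so some duplicates can survive. A small demonstration with illustrative data:

```python
# Method 1: mutating the list while iterating over it.  Each remove()
# shifts later elements left past the iterator, so elements get skipped.
mylist = [1, 2, 1, 2, 1]
for x in mylist:
    if mylist.count(x) > 1:
        mylist.remove(x)
buggy_result = mylist       # [2, 2, 1] -- the 2s were never deduplicated

# Method 2: the set round-trip is both correct and O(n) via hashing
# (though it discards the original order).
dedup = list(set([1, 2, 1, 2, 1]))
print(buggy_result, sorted(dedup))
```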

Re: Remove duplicates from list

2005-06-10 Thread Derek Perriero
em.Saturday + item.Sunday,  the order is> already this preset configuration.  I want 'collect' to be static so it can> compare it against another libraries hours and group it if necessary.  The > libraries that fail to be duplicates of other libraries will be generated as> u

Re: Remove duplicates from list

2005-06-09 Thread Chris Lambacher
sday + item.Wednesday + > item.Thursday + item.Friday + item.Saturday + item.Sunday, the order is > already this preset configuration. I want 'collect' to be static so it can > compare it against another libraries hours and group it if necessary. The > libraries that fai

Re: Remove duplicates from list

2005-06-09 Thread Derek Perriero
sary.  The libraries that fail to be duplicates of other libraries will be generated as usual under the grouped libraries.  They will have a single heading. An example can be seen here of what I am trying to achieve: http://www.libraries.wvu.edu/hours/summer.pdf These are the outputs I failed to

Re: Remove duplicates from list

2005-06-09 Thread Chris Lambacher
string addition and the result is a string. The ouput you provide is in fact a list with no duplicates, i.e. there are no two strings the same. If order is not important to you a structure that will give you an 'unordered list with no duplicates' is a set (available in the std library

Remove duplicates from list

2005-06-09 Thread Derek Perriero
I've been un-triumphantly trying to get a list of mine to have no repeats in it.   First, I'm pulling attributes from Zope and forming a list.  Next, I'm  pulling those same values and comparing them against the same list and if the values equal each other and are not already in the list, they appe

Re: remove duplicates from list *preserving order*

2005-02-07 Thread Alex Martelli
Steven Bethard <[EMAIL PROTECTED]> wrote: ... > I have a list[1] of objects from which I need to remove duplicates. I > have to maintain the list order though, so solutions like set(lst), etc. > will not work for me. What are my options? So far, I can see: I think the recipe

Re: remove duplicates from list *preserving order*

2005-02-06 Thread Steven Bethard
John Machin wrote: So, just to remove ambiguity, WHICH one of the bunch should be retained? Short answer: "the first seen" is what the proverbial "man in the street" would expect For my purposes, it doesn't matter which instance is retained and which are removed, so yes, retaining the first one is

Re: remove duplicates from list *preserving order*

2005-02-06 Thread John Machin
> You have to exhaust the iteratable before yielding anything. Last solution? All of them have essentially the same logic to decide which items to reject. Further, what you say is true only if you are interpreting Steven's ambiguous(?) requirement as: remove ALL instances of a bunch of duplicat

Re: remove duplicates from list *preserving order*

2005-02-06 Thread Steven Bethard
Francis Girard wrote: I think your last solution is not good unless your "list" is sorted (in which case the solution is trivial) since you certainly do have to see all the elements in the list before deciding that a given element is not a duplicate. You have to exhaust the iteratable before yie

Re: remove duplicates from list *preserving order*

2005-02-06 Thread Francis Girard
gt; I'm sorry, I assume this has been discussed somewhere already, but I > found only a few hits in Google Groups... If you know where there's a > good summary, please feel free to direct me there. > > > I have a list[1] of objects from which I need to remove duplicates.

Re: remove duplicates from list *preserving order*

2005-02-03 Thread Michael Spencer
Steven Bethard wrote: I'm sorry, I assume this has been discussed somewhere already, but I found only a few hits in Google Groups... If you know where there's a good summary, please feel free to direct me there. I have a list[1] of objects from which I need to remove duplicates.

Re: remove duplicates from list *preserving order*

2005-02-03 Thread [EMAIL PROTECTED]
You could create a class based on a list which takes a list as argument, like this:

class uniquelist(list):
    def __init__(self, l):
        for item in l:
            self.append(item)

    def append(self, item):
        if item not in self:
            list.append(self, item)

l = [1

Re: remove duplicates from list *preserving order*

2005-02-03 Thread Steven Bethard
Carl Banks wrote:

from itertools import *
[ x for (x,s) in izip(iterable,repeat(set())) if (x not in s,s.add(x))[0] ]

Wow, that's evil! Pretty cool, but for the sake of readers of my code, I think I'll have to opt against it. ;)

STeVe
--
http://mail.python.org/mailman/listinfo/python-list

Re: remove duplicates from list *preserving order*

2005-02-03 Thread Larry Bates
e's a good summary, please feel free to direct me there. I have a list[1] of objects from which I need to remove duplicates. I have to maintain the list order though, so solutions like set(lst), etc. will not work for me. What are my options? So far, I can see: def filterdups(iterable):

Re: remove duplicates from list *preserving order*

2005-02-03 Thread [EMAIL PROTECTED]
You could do it with a class, like this. I guess it is a bit faster than option 1, although I'm no connoisseur of Python internals.

class uniquelist(list):
    def __init__(self, l):
        for item in l:
            self.append(item)
    def append(self, item):
        if item not in s

Re: remove duplicates from list *preserving order*

2005-02-03 Thread Carl Banks
Steven Bethard wrote: > I'm sorry, I assume this has been discussed somewhere already, but I > found only a few hits in Google Groups... If you know where there's a > good summary, please feel free to direct me there. > > > I have a list[1] of objects from which I

remove duplicates from list *preserving order*

2005-02-03 Thread Steven Bethard
I'm sorry, I assume this has been discussed somewhere already, but I found only a few hits in Google Groups... If you know where there's a good summary, please feel free to direct me there. I have a list[1] of objects from which I need to remove duplicates. I have to maintain the
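
The approach the thread converges on, a "seen" set consulted while the original order is preserved, can be sketched as follows (the function name filterdups appears in the replies; the sample input is illustrative):

```python
# Order-preserving de-duplication: remember what has been yielded and skip
# repeats.  Lazy, works on any iterable of hashable items, and keeps the
# first occurrence of each value.
def filterdups(iterable):
    seen = set()
    for item in iterable:
        if item not in seen:
            seen.add(item)
            yield item

print(list(filterdups([3, 1, 3, 2, 1])))

# On Python 3.7+, plain dicts preserve insertion order, so the same result
# is available as a one-liner:
print(list(dict.fromkeys([3, 1, 3, 2, 1])))
```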