MRAB, Terry, Ethan, and others ...
Thank you - collections.deque is exactly what I was looking for.
Malcolm
On 06/22/2014 11:03 AM, pyt...@bdurham.com wrote:
> Should I have any performance concerns with the index position used
> to pop() values off of large lists?

I believe lists are optimized for adding and removing items at the end.

> In other words, should pop(0) and
> pop() be time equivalent operations with long lists?
No. If you want this, use collections.deque.
--
Terry Jan Reedy
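A minimal sketch of what Terry is suggesting: collections.deque supports
O(1) appends and pops at both ends, so consuming from the front stays
cheap no matter how long the queue gets.

    from collections import deque

    d = deque(range(1000000))

    first = d.popleft()   # O(1): nothing shifts, unlike list.pop(0)
    last = d.pop()        # O(1) at the right end too
    d.appendleft(first)   # adding is cheap at either end as well
    d.append(last)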
On 2014-06-22 19:03, pyt...@bdurham.com wrote:
> Should I have any performance concerns with the index position used
> to pop() values off of large lists?
> In other words, should pop(0) and pop() be time equivalent operations
> with long lists?
When an item is popped from a list, all of the later items have to be
shifted down one place, so pop(0) does work proportional to the length
of the list, while pop() from the end is constant time.
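A rough way to see the asymmetry MRAB describes (the list size and
repeat counts here are arbitrary; absolute timings vary by machine):

    import timeit

    # pop() from the end: nothing has to move.
    print(timeit.timeit("items.pop()",
                        setup="items = list(range(100000))",
                        number=50000))

    # pop(0) from the front: every remaining item shifts left one slot.
    print(timeit.timeit("items.pop(0)",
                        setup="items = list(range(100000))",
                        number=50000))

    # deque.popleft() pops the front in O(1).
    print(timeit.timeit("items.popleft()",
                        setup="from collections import deque; "
                              "items = deque(range(100000))",
                        number=50000))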
Should I have any performance concerns with the index position used to
pop() values off of large lists?
In other words, should pop(0) and pop() be time equivalent operations
with long lists?
On Wed, 2011-02-23 at 13:57 +0000, Jorgen Grahn wrote:
> If that's the *only* such use, I'd experiment with writing them as
> sortable text to file, and run GNU sort (the Unix utility) on the file.
> It seems to have a clever file-backed sort algorithm.
+1 - and experiment with the different flags.
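A sketch of the file-backed approach (the file names and record layout
are made up for illustration): write one sortable line per record, let
GNU sort do the disk-based merging, then stream the sorted file back.

    import subprocess

    # Write records as tab-separated lines; the first field is the sort key.
    with open("records.txt", "w") as out:
        for key, payload in [(3, "c"), (1, "a"), (2, "b")]:
            out.write("%d\t%s\n" % (key, payload))

    # GNU sort spills to temporary files, so it handles inputs larger
    # than RAM. -n: numeric; -t: field separator; -k1,1: first field only.
    subprocess.check_call(["sort", "-n", "-t", "\t", "-k1,1",
                           "-o", "sorted.txt", "records.txt"])

    with open("sorted.txt") as f:
        for line in f:
            key, payload = line.rstrip("\n").split("\t", 1)
            # process one record at a time here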
On Tue, 2011-02-22, Ben Finney wrote:
> Kelson Zawack writes:
>
>> I have a large (10gb) data file for which I want to parse each line
>> into an object and then append this object to a list for sorting and
>> further processing.
>
> What is the nature of the further processing?
>
> Does that further processing access the items sequentially? If so, ...
On 2/22/2011 4:40 AM, Kelson Zawack wrote:
> The answer it turns out is the garbage collector. When I disable the
> garbage collector before the loop that loads the data into the list
> and then enable it after the loop the program runs without issue.
> This raises a question though: can the logic of the garbage collector
> be changed so that it is not triggered in instances like this, where
> you really do want to put lots and lots of stuff in memory?
I am using Python 2.6.2, so it may no longer be a problem.
I am open to using another data type, but the way I read the
documentation, array.array only supports numeric types, not arbitrary
objects. I also tried playing around with numpy arrays, albeit for
only a short time, and it seems that although ...
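Kelson's reading of the docs is right: array.array stores unboxed
C-level numbers (far more compact than a list of objects) and rejects
anything else. For example:

    import array

    # 'd' means C double; values are stored unboxed, so memory per item
    # is a fraction of what a list of Python floats needs.
    a = array.array("d", [1.0, 2.5, 3.25])
    a.append(4.0)

    # Arbitrary objects are rejected outright:
    try:
        a.append(object())
    except TypeError as e:
        print("rejected:", e)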
Kelson Zawack wrote:
> The answer it turns out is the garbage collector. When I disable the
> garbage collector before the loop that loads the data into the list
> and then enable it after the loop the program runs without issue.
> This raises a question though: can the logic of the garbage collector
> be changed so that it is not triggered in instances like this, where
> you really do want to put lots and lots of stuff in memory?
Kelson Zawack writes:
> This raises a question though: can the logic of the garbage collector
> be changed so that it is not triggered in instances like this, where
> you really do want to put lots and lots of stuff in memory?
Have you considered using a more specialised data type for such large
data sets?
The answer it turns out is the garbage collector. When I disable the
garbage collector before the loop that loads the data into the list
and then enable it after the loop the program runs without issue.
This raises a question though: can the logic of the garbage collector
be changed so that it is not triggered in instances like this, where
you really do want to put lots and lots of stuff in memory?
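A minimal sketch of the workaround described above; parse_line is a
hypothetical stand-in for whatever turns a line into an object.

    import gc

    def load(path, parse_line):
        # The cyclic collector rescans tracked objects as allocations
        # pile up, which can make a huge run of appends crawl. Plain
        # reference counting still reclaims ordinary garbage while the
        # collector is disabled.
        items = []
        gc.disable()
        try:
            with open(path) as f:
                for line in f:
                    items.append(parse_line(line))
        finally:
            gc.enable()   # re-enable even if parsing raises
        return items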
On Mon, Feb 21, 2011 at 7:24 PM, Dan Stromberg wrote:
>
> On Mon, Feb 21, 2011 at 6:57 PM, Kelson Zawack
> <zawack...@gis.a-star.edu.sg> wrote:
>
>> I have a large (10gb) data file for which I want to parse each line into
>> an object and then append this object to a list for sorting and further
>> processing.
Kelson Zawack writes:
> I have a large (10gb) data file for which I want to parse each line
> into an object and then append this object to a list for sorting and
> further processing.
What is the nature of the further processing?
Does that further processing access the items sequentially? If so, you
may not need to hold the whole list in memory at once.
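The point behind the question: if the further processing is a single
pass in order, the parsed objects can be streamed with a generator
instead of materialised as one giant list (parse_line and process are
placeholders, not anything from the original script):

    def parsed_records(path, parse_line):
        # Yield one parsed object at a time; only the current record is
        # alive, so memory use stays flat regardless of file size.
        with open(path) as f:
            for line in f:
                yield parse_line(line)

    def run(path, parse_line, process):
        for record in parsed_records(path, parse_line):
            process(record)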
On Mon, Feb 21, 2011 at 6:57 PM, Kelson Zawack
wrote:
> I have a large (10gb) data file for which I want to parse each line into an
> object and then append this object to a list for sorting and further
> processing. I have noticed however that as the length of the list increases
> the rate at which objects are added to it decreases dramatically.
alex23 writes:
> On Feb 22, 12:57 pm, Kelson Zawack
> wrote:
>> I did not bother to further analyze or benchmark it. Since the answers
>> in the above forums do not seem very definitive I thought I would
>> inquire here about what the reason for this decrease in performance is,
>> and if there is a way, or another data type, to avoid it.
On Feb 22, 12:57 pm, Kelson Zawack
wrote:
> I did not bother to further analyze or benchmark it. Since the answers
> in the above forums do not seem very definitive I thought I would
> inquire here about what the reason for this decrease in performance is,
> and if there is a way, or another data type, to avoid it.
I have a large (10gb) data file for which I want to parse each line into
an object and then append this object to a list for sorting and further
processing. I have noticed however that as the length of the list
increases the rate at which objects are added to it decreases
dramatically. My first ...
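A quick way to reproduce the measurement described above: time the
appends in fixed-size batches and watch whether the per-batch cost
grows (batch size and payload are arbitrary; rerun with the gc line
uncommented to see the collector's share of the slowdown):

    import gc
    import time

    # gc.disable()   # uncomment to compare against the collector's cost
    items = []
    for batch in range(5):
        start = time.time()
        for i in range(1000000):
            items.append((i, "payload"))   # stand-in for a parsed object
        print("batch %d: %.2fs" % (batch, time.time() - start))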
On May 7, 10:21 pm, "Gabriel Genellina" <[EMAIL PROTECTED]>
wrote:
> On Mon, 07 May 2007 09:14:34 -0300, Merrigan <[EMAIL PROTECTED]>
> wrote:
>
> > The Script is available at this URL:
> > http://www.lewendewoord.co.za/theScript.py
>
> I understand this as a learning exercise, since there are a lot of
> utilities for remote syncing.
On Mon, 07 May 2007 09:14:34 -0300, Merrigan <[EMAIL PROTECTED]>
wrote:
> The Script is available at this URL:
> http://www.lewendewoord.co.za/theScript.py
I understand this as a learning exercise, since there are a lot of
utilities for remote syncing.
Some comments:
- use os.path.join to build file paths ...
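What the first comment means, with made-up path components:

    import os.path

    base = "/var/backups"
    name = "theScript.py"

    # os.path.join inserts the platform's separator and avoids the bugs
    # that come from gluing strings together with "/" by hand.
    path = os.path.join(base, "staging", name)
    print(path)   # /var/backups/staging/theScript.py on POSIX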
On May 7, 5:14 am, Merrigan <[EMAIL PROTECTED]> wrote:
> On May 7, 10:18 am, Steven D'Aprano <[EMAIL PROTECTED]> wrote:
> > On Mon, 07 May 2007 00:28:14 -0700, Merrigan wrote:
> > > 1. I have the script popping all the files that need to be checked into
> > > a list, and have it parsing the list for everything ...
In <[EMAIL PROTECTED]>, Merrigan wrote:
> The Script is available at this URL:
> http://www.lewendewoord.co.za/theScript.py
>
> P.S. I know it looks like crap, but I'm a n00b, and not yet through
> the OOP part of the tutorial.
One spot of really horrible runtime is the `comp_are()` function, ...
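The message is cut off here, so the actual complaint about comp_are()
is lost. As a hedged guess at the usual fix for this kind of job,
comparing two file lists with set arithmetic replaces a nested O(n*m)
scan with O(n+m) hashing (every name below is invented, not taken from
theScript.py):

    # Hypothetical sketch: classify files as local-only, remote-only,
    # or present on both sides. Each membership test is O(1) with sets,
    # instead of a scan over the other list for every file.
    local_files = {"a.txt", "b.txt", "c.txt"}
    remote_files = {"b.txt", "c.txt", "d.txt"}

    only_local = local_files - remote_files    # candidates for upload
    only_remote = remote_files - local_files   # gone locally
    on_both = local_files & remote_files       # compare sizes/hashes next

    print(sorted(only_local), sorted(only_remote), sorted(on_both))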
On May 7, 10:18 am, Steven D'Aprano
<[EMAIL PROTECTED]> wrote:
> On Mon, 07 May 2007 00:28:14 -0700, Merrigan wrote:
> > 1. I have the script popping all the files that need to be checked into
> > a list, and have it parsing the list for everything... Now the problem is
> > this: the server needs to check (at the moment) 375 files and eliminate
> > those that don't need to be updated.
On Mon, 07 May 2007 00:28:14 -0700, Merrigan wrote:
> 1. I have the script popping all the files that need to be checked into
> a list, and have it parsing the list for everything... Now the problem is
> this: the server needs to check (at the moment) 375 files and eliminate
> those that don't need to be updated.
Hi All,
Firstly - thank you Sean for the help and the guideline to get the
size comparison, I will definitely look into this.
At the moment I actually have 2 bigger issues that need sorting out...
1. I have the script popping all the files that need to be checked
into a list, and have it parsing the list for everything... Now the
problem is this: the server needs to check (at the moment) 375 files
and eliminate those that don't need to be updated.