On Wed, 2011-02-23 at 13:57, Jorgen Grahn wrote:
> If that's the *only* such use, I'd experiment with writing them as
> sortable text to file, and run GNU sort (the Unix utility) on the file.
> It seems to have a clever file-backed sort algorithm.
+1 - and experiment with the different flags
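For reference, a minimal sketch of that approach, assuming each record can be serialised as one tab-separated line with a numeric sort key in the second field (the file names, the parser, and the key position are made up for illustration):

    import subprocess

    def parse(line):
        # hypothetical parser; stands in for the real record layout
        name, value = line.rstrip("\n").split("\t")
        return name, int(value)

    # 1. Dump the records as plain, sortable lines.
    with open("records.dat") as src:
        with open("sortable.txt", "w") as dst:
            for line in src:
                name, value = parse(line)
                dst.write("%s\t%d\n" % (name, value))

    # 2. Let GNU sort do the heavy lifting: -t sets the field separator,
    #    -k 2,2n sorts numerically on field 2, -S gives it a larger buffer
    #    before it spills to temporary files, -o names the output file.
    subprocess.check_call(["sort", "-t", "\t", "-k", "2,2n",
                           "-S", "1G", "-o", "sorted.txt", "sortable.txt"])

    # 3. Stream the sorted result back in for further processing.
    with open("sorted.txt") as result:
        for line in result:
            pass  # process each sorted record here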
On Tue, 2011-02-22, Ben Finney wrote:
> Kelson Zawack writes:
>
>> I have a large (10gb) data file for which I want to parse each line
>> into an object and then append this object to a list for sorting and
>> further processing.
>
> What is the nature of the further processing?
>
> Does that further processing access the items sequentially?
On 2/22/2011 4:40 AM, Kelson Zawack wrote:
The answer, it turns out, is the garbage collector. When I disable the
garbage collector before the loop that loads the data into the list
and then enable it after the loop, the program runs without issue.
This raises a question though: can the logic of the garbage collector
be changed so that it is not triggered in instances like this where you
really do want to put lots and lots of stuff in memory?
I am using Python 2.6.2, so it may no longer be a problem.
I am open to using another data type, but the way I read the
documentation, array.array only supports numeric types, not arbitrary
objects. I also tried playing around with numpy arrays, albeit for
only a short time, and it seems that alth
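The array.array limitation is easy to confirm interactively: an array holds only values of its declared numeric typecode, so arbitrary objects are rejected.

    import array

    a = array.array("i")   # 'i' = signed int typecode; numeric values only
    a.append(42)           # fine
    a.append("spam")       # raises TypeError: only the declared numeric
                           # type is accepted, not arbitrary objects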
Kelson Zawack wrote:
> The answer, it turns out, is the garbage collector. When I disable the
> garbage collector before the loop that loads the data into the list
> and then enable it after the loop, the program runs without issue.
> This raises a question though: can the logic of the garbage collector
> be changed so that it is not triggered in instances like this where you
> really do want to put lots and lots of stuff in memory?
Kelson Zawack writes:
> This raises a question though: can the logic of the garbage collector
> be changed so that it is not triggered in instances like this where you
> really do want to put lots and lots of stuff in memory?
Have you considered using a more specialised data type for such large
d
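On the question of changing the collector's behaviour: its trigger points can at least be tuned rather than switching it off entirely. A sketch using gc.get_threshold / gc.set_threshold (the threshold values and load_everything are made up):

    import gc

    # The default thresholds are (700, 10, 10): a generation-0 collection
    # runs roughly every 700 net container allocations.  Raising the first
    # value makes a bulk load trigger far fewer collections.
    old = gc.get_threshold()
    gc.set_threshold(100000, 50, 50)   # arbitrary, much lazier values
    try:
        data = load_everything()       # hypothetical bulk-loading function
    finally:
        gc.set_threshold(*old)         # restore the previous settings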
The answer, it turns out, is the garbage collector. When I disable the
garbage collector before the loop that loads the data into the list
and then enable it after the loop, the program runs without issue.
This raises a question though: can the logic of the garbage collector
be changed so that it is not triggered in instances like this where you
really do want to put lots and lots of stuff in memory?
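For anyone skimming the thread, the pattern described above looks roughly like this (parse_line and the file name are placeholders, not code from the original post):

    import gc

    def parse_line(line):
        # hypothetical parser for one record
        return line.rstrip("\n").split("\t")

    records = []
    gc.disable()          # the cyclic collector otherwise runs periodically
    try:                  # and rescans the ever-growing set of objects
        with open("data.txt") as f:
            for line in f:
                records.append(parse_line(line))
    finally:
        gc.enable()       # restore normal collection once loading is done

    records.sort()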
On Mon, Feb 21, 2011 at 7:24 PM, Dan Stromberg wrote:
>
> On Mon, Feb 21, 2011 at 6:57 PM, Kelson Zawack <zawack...@gis.a-star.edu.sg> wrote:
>
>> I have a large (10gb) data file for which I want to parse each line into
>> an object and then append this object to a list for sorting and further
>> processing.
Kelson Zawack writes:
> I have a large (10gb) data file for which I want to parse each line
> into an object and then append this object to a list for sorting and
> further processing.
What is the nature of the further processing?
Does that further processing access the items sequentially? If s
On Mon, Feb 21, 2011 at 6:57 PM, Kelson Zawack wrote:
> I have a large (10gb) data file for which I want to parse each line into an
> object and then append this object to a list for sorting and further
> processing. I have noticed, however, that as the length of the list increases
> the rate at wh
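As a point of comparison with the GNU sort suggestion elsewhere in the thread, the same external-sort idea can be done in pure Python: sort fixed-size chunks, spill each sorted run to a temporary file, then merge the runs lazily with heapq.merge. Everything below (file names, chunk size, whole-line keys) is illustrative only:

    import heapq
    import tempfile

    def dump_run(lines):
        # write one sorted chunk to a temporary file and return its name
        f = tempfile.NamedTemporaryFile("w", suffix=".run", delete=False)
        f.writelines(lines)
        f.close()
        return f.name

    def make_runs(src, chunk_size=1000000):
        runs, chunk = [], []
        for line in src:
            chunk.append(line)
            if len(chunk) >= chunk_size:
                runs.append(dump_run(sorted(chunk)))
                chunk = []
        if chunk:
            runs.append(dump_run(sorted(chunk)))
        return runs

    with open("data.txt") as src:
        runs = make_runs(src)

    # heapq.merge consumes the already-sorted runs lazily, so only one line
    # per run is in memory at a time.  Sorting here is by whole line; a real
    # key would need to be encoded into the line, as with GNU sort.
    with open("sorted.txt", "w") as out:
        out.writelines(heapq.merge(*[open(name) for name in runs]))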
alex23 writes:
> On Feb 22, 12:57 pm, Kelson Zawack wrote:
>> I did not bother to further analyze or benchmark it. Since the answers
> in the above forums do not seem very definitive, I thought I would
>> inquire here about what the reason for this decrease in performance is,
>> and if ther
On Feb 22, 12:57 pm, Kelson Zawack wrote:
> I did not bother to further analyze or benchmark it. Since the answers
in the above forums do not seem very definitive, I thought I would
> inquire here about what the reason for this decrease in performance is,
> and if there is a way, or another da