> "h" == hsfrey writes:
h> I rewrote the thing to use arrays instead of hashes, updating in place
h> instead of moving stuff around, and generally using brute force
h> instead of cleverness.
h> I didn't want to do it that way before, because I was afraid it would
h> take too much
OK!
I rewrote the thing to use arrays instead of hashes, updating in place
instead of moving stuff around, and generally using brute force
instead of cleverness.
I didn't want to do it that way before, because I was afraid it would
take too much time.
Anyway, now it runs to completion, and takes
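The "arrays instead of hashes, updating in place" approach described above can be sketched as follows. This is a minimal illustration, not the poster's actual code; the record fields (`key`, `done`) and the search condition are hypothetical:

```perl
#!/usr/bin/perl
# Sketch of the brute-force layout: keep all records in one flat array
# and flag finished entries in place, instead of deleting hash keys and
# moving data between structures.
use strict;
use warnings;

# +{ ... } forces a hashref (not a block) inside map
my @records = map { +{ key => "item$_", done => 0 } } 1 .. 7000;

for my $rec (@records) {
    next if $rec->{done};            # skip entries already handled
    if ( $rec->{key} eq 'item42' ) { # hypothetical match condition
        $rec->{done} = 1;            # mark in place; nothing is copied or freed
    }
}
```

Because nothing is ever deleted or reallocated, memory use stays flat for the life of the run, at the cost of scanning past finished entries.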
> "h" == hsfrey writes:
h> I have a big data file - about 7000 entries, each about 100 bytes.
h> I have to search it on the order of 7000 times, so I need to keep it
h> in memory.
just fyi, that is a tiny data file today. less than 1MB.
uri
--
Uri Guttman -- u...@stemsystems
hsfrey wrote:
I have a big data file - about 7000 entries, each about 100 bytes.
700,000 bytes is *not* a big file these days.
I have to search it on the order of 7000 times, so I need to keep it
in memory.
As I search, some items no longer need to be searched, but they still
need to be saved.
hsfrey wrote:
> Does anyone have any suggestion about how I could retrieve the lost
> memory?
You know it's really, really difficult to give meaningful advice without
seeing the code.
--
Just my 0.0002 million dollars worth,
Shawn
Programming is as much about organization and communication
I have a big data file - about 7000 entries, each about 100 bytes.
I have to search it on the order of 7000 times, so I need to keep it
in memory.
As I search, some items no longer need to be searched, but they still
need to be saved. To save search time, I store the data as a set of
parallel hashes
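The parallel-hash scheme the poster describes, where items drop out of the search set but their data must survive, can be sketched like this. The hash names and match test are assumptions for illustration:

```perl
#!/usr/bin/perl
# Sketch: %data holds every record permanently; %active holds only the
# keys still worth searching. Deleting from %active shrinks the search
# space without losing the saved data.
use strict;
use warnings;

my %data   = ( a => 'payload-a', b => 'payload-b' );
my %active = map { $_ => 1 } keys %data;

# keys() returns a snapshot list, so deleting the current key is safe
for my $key ( keys %active ) {
    if ( $data{$key} =~ /payload-a/ ) {
        delete $active{$key};   # no longer searched, but %data still has it
    }
}
```

Note that `delete` returns the hash's memory to perl's internal allocator for reuse, not to the operating system, which is why `top` may not show the process shrinking.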
> >> The perlref docs state "Hard references are smart--they keep track of
> >> reference counts for you, automatically freeing the thing referred to
> >> when its reference count goes to zero." My interpretation of this is
> >> that when a reference goes out of scope the memory used by
> >> the
>> The perlref docs state "Hard references are smart--they keep track of
>> reference counts for you, automatically freeing the thing referred to
>> when its reference count goes to zero." My interpretation of this is
>> that when a reference goes out of scope the memory used by
>> the referent is freed
your program, but cannot be used by other programs.
-----Original Message-----
From: Freimuth, Robert [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 03, 2004 1:25 PM
To: [EMAIL PROTECTED]
Subject: references and freeing memory
Hi all,
The perlref docs state "Hard references are smart--they keep
Hi all,
The perlref docs state "Hard references are smart--they keep track of
reference counts for you, automatically freeing the thing referred to when
its reference count goes to zero." My interpretation of this is that when a
reference goes out of scope the memory used by the referent is freed
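The reference-counting behaviour perlref describes can be observed directly with a `DESTROY` hook. This is a minimal sketch (the `Tracker` package is invented for the demonstration):

```perl
#!/usr/bin/perl
# Demonstrates that a referent is freed the moment its last reference
# goes out of scope: DESTROY fires at the closing brace, not at some
# later garbage-collection pass.
use strict;
use warnings;

package Tracker;
sub new     { my $class = shift; return bless {}, $class }
sub DESTROY { $main::destroyed = 1 }

package main;
our $destroyed = 0;
{
    my $ref  = Tracker->new;   # refcount 1
    my $copy = $ref;           # refcount 2
    # $destroyed is still 0 here: two references keep the object alive
}
# both references are gone: refcount hit 0 and DESTROY ran immediately
print "freed\n" if $destroyed;   # prints "freed"
```

As the reply below points out, though, "freed" means returned to perl's allocator for reuse within the process, not handed back to the operating system.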
On Wed, Sep 04, 2002 at 05:07:41PM +0100, Jeff AA wrote:
> >
> > It could be garbage collection, but it shouldn't take 2 minutes to
> > free() 700MBs of data. Could be that your code is written in such
> > a way that it is having to back out of lots of subroutines and
> > free'ing
>
> It could be garbage collection, but it shouldn't take 2 minutes to
> free() 700MBs of data. Could be that your code is written in such
> a way that it is having to back out of lots of subroutines and
> free'ing things as it goes? Such as with recursion?
no recursion, and on
.--[ Jeff AA wrote (2002/09/04 at 09:34:56) ]--
|
| I have a Perl script that creates a large hash, from a collection of
| files. I am running Perl 5.6.1 on SMP Linux 2.4.18, top shows my process
| using up to 700MB of memory (which is fine on our servers). I have
| noticed though
Hi folks,
I have a Perl script that creates a large hash, from a collection of
files. I am running Perl 5.6.1 on SMP Linux 2.4.18, top shows my process
using up to 700MB of memory (which is fine on our servers). I have
noticed though that when perl hits my
exit 0;
line, the process pauses for
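A workaround sometimes suggested for exactly this symptom (a long pause at `exit` while perl tears down a huge structure) is to bypass global destruction with `POSIX::_exit`. This is offered as a general technique, not as the resolution the thread reached:

```perl
#!/usr/bin/perl
# Sketch: skip perl's per-structure cleanup on exit. POSIX::_exit ends
# the process immediately and lets the OS reclaim all memory at once,
# instead of spending minutes freeing a 700MB hash element by element.
use strict;
use warnings;
use POSIX ();

my %big = map { $_ => 'x' x 100 } 1 .. 100_000;   # stand-in for the real data

# ... do the real work with %big here ...

POSIX::_exit(0);   # no END blocks, no DESTROY methods, no global destruction
```

The trade-off is that `END` blocks, object destructors, and buffered-output flushing are all skipped, so it is only safe once all real cleanup (closing files, flushing handles) has been done explicitly.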