Hi Jess,
Tied variables might work, but I was also browsing CPAN this afternoon, and
I noticed that there is a whole SAS module under "Commercial Software". I
have no idea what it does, but you might find it interesting to look at.
Thanx,
Smiddy
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
On Fri, Feb 15, 2002 at 02:34:41PM -0500, Balint, Jess wrote:
> Would it be possible to use 'tie' to operate on a large complex data
> structure from disk?
Certainly. The MLDBM module found on CPAN is good for this. It has a few
caveats on usage, which are described in the documentation.
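For what it's worth, a minimal sketch of the MLDBM approach. It assumes
MLDBM itself is installed from CPAN (SDBM_File and Storable ship with
Perl); the filename and record contents are invented for illustration.
It also shows the main caveat the docs describe: you can't write through
a nested reference, you have to fetch the whole value, modify the copy,
and store it back.

```perl
use strict;
use warnings;
use Fcntl qw(O_CREAT O_RDWR);
use MLDBM qw(SDBM_File Storable);   # dbm back end + serializer

# Tie a hash to disk; values are serialized complex structures.
tie my %cache, 'MLDBM', 'cache', O_CREAT | O_RDWR, 0640
    or die "Cannot tie cache: $!";

$cache{jess} = { files => ['big.dat'], size_gb => 1 };

# Caveat: $cache{jess}{size_gb} = 2 would NOT persist, because the
# nested hash is a fresh deserialized copy. Fetch, modify, store back.
my $rec = $cache{jess};
$rec->{size_gb} = 2;
$cache{jess} = $rec;
```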
Mich
On Thu, 2002-02-14 at 16:35, Balint, Jess wrote:
> Is there a dbm file size limit that I have to worry about?
>
> -Original Message-
> From: Chas Owens [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, February 14, 2002 4:30 PM
> To: Balint, Jess
> Subject: RE: Caching Large Data Structures To Disk
> -Original Message-
> From: Balint, Jess [mailto:[EMAIL PROTECTED]]
> Sent: Thursday, February 14, 2002 3:11 PM
> To: '[EMAIL PROTECTED]'
> Subject: Caching Large Data Structures To Disk
>
>
> Hello all. First off, I want to thank everybody on this list who has
> helped me with my previous questions. Thank you.
Hashes can be stored in dbm files. Also, you can use Data::Dumper to
create a string that contains valid Perl syntax to recreate a given data
structure. You can then write that string to a data file. When you
need that data again, you can read the file back into a string and eval
it. See perldoc -f eval.
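The round trip described above might look like this, using only core
modules (the filename cache.pl and the sample structure are just for
the sketch):

```perl
use strict;
use warnings;
use Data::Dumper;

my $data = { label => 'run1', totals => [1, 2, 3] };

# Serialize: Data::Dumper emits valid Perl source that rebuilds $data.
open my $out, '>', 'cache.pl' or die "write cache.pl: $!";
print {$out} Data::Dumper->Dump([$data], ['data']);
close $out;

# Restore: slurp the file back and eval it into a structure.
my $code = do {
    open my $in, '<', 'cache.pl' or die "read cache.pl: $!";
    local $/;            # slurp mode: read the whole file at once
    <$in>;
};
my $restored = eval 'my ' . $code
    or die "eval failed: $@";
```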
while (<FILE>) {
    # do something to each line ($_)...
}
would be better.
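Putting the two suggestions in this thread together, a gig-sized file
can be processed one line at a time while the accumulated results live
in a disk-backed dbm hash instead of memory. The input file, its
whitespace-separated format, and the counting task are all made up for
this sketch; only core modules are used:

```perl
use strict;
use warnings;
use Fcntl qw(O_CREAT O_RDWR);
use SDBM_File;

# Tiny stand-in for the real gigabyte input file.
open my $mk, '>', 'big_input.dat' or die "write big_input.dat: $!";
print {$mk} "apple 1\nbanana 2\napple 3\n";
close $mk;

# Counts go into a disk-backed hash, so memory use stays flat no
# matter how many distinct keys the input contains.
tie my %count, 'SDBM_File', 'counts', O_CREAT | O_RDWR, 0640
    or die "Cannot tie counts: $!";

open my $in, '<', 'big_input.dat' or die "read big_input.dat: $!";
while (<$in>) {              # one line at a time, never a slurp
    my ($key) = split;       # first whitespace-separated field of $_
    $count{$key}++;
}
close $in;
```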
-Original Message-
From: Balint, Jess [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 14, 2002 12:11 PM
To: [EMAIL PROTECTED]
Subject: Caching Large Data Structures To Disk
Hello all. First off, I want to thank everybody on this list who has helped
me with my previous questions. Thank you.
I am working with an input file of very large size, sometimes up to and
greater than a gig of data. I have previously gotten out of memory errors
(people at work don't like when