> OK, I had to try the two ways again to see how much difference it made. I
> created a random contents fixed field file 14500 lines long X 80 columns
> wide, and tried processing the lines (using substr($_,) to
> break lines up into 4 sections, substitute based on a few patterns, and
> change a
Steve Howard
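Howard's benchmark can be sketched roughly as follows. This is an assumption-laden reconstruction, not his script: the field widths, the file size (reduced from his 14500 x 80 so it runs quickly), and the in-memory filehandle are all invented stand-ins.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a small fixed-width data set in memory (stand-in for his test file).
my @lines = map { sprintf "%-80s", "record $_" } 1 .. 100;
my $data  = join "\n", @lines;

# Approach 1: slurp everything into an array, then process.
my @all = split /\n/, $data;
my $count_slurp = 0;
for my $line (@all) {
    # substr() carves each fixed-width line into 4 sections, as the
    # post describes (these particular widths are invented).
    my ($a, $b, $c, $d) = (substr($line, 0, 20),  substr($line, 20, 20),
                           substr($line, 40, 20), substr($line, 60, 20));
    $count_slurp++;
}

# Approach 2: stream one line at a time from a filehandle.
open my $fh, '<', \$data or die "can't open in-memory file: $!";
my $count_stream = 0;
while (my $line = <$fh>) {
    chomp $line;
    my ($a, $b, $c, $d) = (substr($line, 0, 20),  substr($line, 20, 20),
                           substr($line, 40, 20), substr($line, 60, 20));
    $count_stream++;
}
close $fh;

print "slurp: $count_slurp lines, stream: $count_stream lines\n";
```

Both loops do the same per-line work; any speed difference comes only from how the lines reach the loop.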
-----Original Message-----
From: Steve Howard [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 13, 2001 5:46 PM
To: Jos Boumans
Cc: Stephen Henderson; [EMAIL PROTECTED]
Subject: RE: use of split command
The reason I preferred to read a file into an array (when it is manageable)
and then process it is be
pe of the question asked when responding here.
Steve Howard
-----Original Message-----
From: Jos Boumans [mailto:[EMAIL PROTECTED]]
Sent: Wednesday, June 13, 2001 6:55 AM
To: Steve Howard
Cc: Stephen Henderson; [EMAIL PROTECTED]
Subject: Re: use of split command
I'd like to make a few adjustments on this code, and give you some things you
might want to consider using:
open (ACCS, "C:\\Perl\\BioPerl-0.7\\Seqdata\\Accession.txt") or die "can't open Accessions file ", $!;
# this will also produce the error returned by the system call... useful for figuri
if this is what you want:
while (<HANDLE>) { $string .= $_ }
you'll want to consider using this instead:
{ local $/; $in = <HANDLE> }
which will put the entire contents of HANDLE into $in (by undeffing $/, which
holds the line separator character - it will be restored once the block
exits)
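A self-contained sketch of the $/ slurp idiom described above. The demo file name and its contents are invented here so the snippet runs anywhere; in the thread the file would be Accession.txt.

```perl
use strict;
use warnings;

# Create a small stand-in data file so the sketch is self-contained.
my $file = 'accession_demo.txt';
open my $out, '>', $file or die "can't create $file: $!";
print $out "AB000001\t42\nAB000002\t7\n";
close $out;

open my $fh, '<', $file
    or die "can't open Accessions file: $!";   # $! reports the OS error

# The slurp idiom: localize $/ (the input record separator) inside a
# block; undef'ing it makes a single <$fh> read return the whole file,
# and $/ is restored automatically when the block exits.
my $in;
{
    local $/;
    $in = <$fh>;
}
close $fh;
unlink $file;

my $lines = () = $in =~ /\n/g;     # count newlines in the slurped text
print "slurped $lines lines\n";
```

The `local $/` block is what keeps the rest of the program's line-by-line reads unaffected.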
but like genie s
Cc: Stephen Henderson
Subject: Re: use of split command
From: Stephen Henderson [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 12, 2001 9:59 AM
To: '[EMAIL PROTECTED]'
Subject: use of split command
I am trying to read a quite large file (ca 13000 lines) directly into an
array (for speed)
like this
open (ACCS, "C:\\Perl\\BioPerl-0.7\\Seqdata\\Accession.txt") or die "can't open Accessions file";
> I am trying to read a quite large file (ca 13000 lines) directly into an
> array (for speed)
Sorry, it's a bad idea.
One day your file will be 1 GB in size and @ets = <ACCS> will kill your PC
trying to load the whole gig into the memory ..
while (<ACCS>) {..}
is the best way for reading large files, I think
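The while(<FH>) advice in this thread can be turned into a runnable sketch. The questioner ultimately wanted a line count ($ets_count), which streaming gives without ever holding the whole file in memory; the demo file name and contents are invented here.

```perl
use strict;
use warnings;

# Stand-in data file (two tab-separated columns, as in the question).
my $file = 'accession_demo.txt';
open my $out, '>', $file or die "can't create $file: $!";
print $out "AB00000$_\t$_\n" for 1 .. 5;
close $out;

# Count lines without slurping: only one line is in memory at a time,
# so this behaves the same for 13,000 lines or a gigabyte.
open my $fh, '<', $file or die "can't open $file: $!";
my $ets_count = 0;
$ets_count++ while <$fh>;
close $fh;
unlink $file;

print "lines: $ets_count\n";   # lines: 5
```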
--- Stephen Henderson <[EMAIL PROTECTED]> wrote:
> I am trying to read a quite large file (ca 13000 lines) directly into
> an array (for speed)
> like this
>
> open (ACCS, "C:\\Perl\\BioPerl-0.7\\Seqdata\\Accession.txt") or die "can't open Accessions file";
> @ets = <ACCS>;
> $ets_count=@ets;
>
> the
I don't imagine you pick up all that much extra speed by reading the whole
file into an array first (anyone?).
I would do something like
open(ACCS, "Accession.txt") || die "blah";
my %values;
while (<ACCS>)
{
    chomp $_;
    my ($col1, $col2) = split(/\t/, $_); # you better be sure
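A self-contained version of the hash-building loop sketched above, with an invented demo file standing in for Accession.txt. The column meanings (accession => value) are an assumption, and the truncated comment's warning stands: split /\t/ as used here trusts that each line has exactly one tab.

```perl
use strict;
use warnings;

# Invented stand-in for Accession.txt (two tab-separated columns).
my $file = 'accession_demo.txt';
open my $out, '>', $file or die "can't create $file: $!";
print $out "AB000001\t42\nAB000002\t7\n";
close $out;

open my $accs, '<', $file or die "can't open $file: $!";
my %values;
while (my $line = <$accs>) {
    chomp $line;
    # Assumes exactly one tab per line; extra tabs would silently
    # truncate, missing tabs would leave $col2 undef.
    my ($col1, $col2) = split /\t/, $line;
    $values{$col1} = $col2;
}
close $accs;
unlink $file;

print "AB000002 => $values{AB000002}\n";   # AB000002 => 7
```

Building the hash while streaming gives keyed lookup afterwards without ever materializing the whole file as an array.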
I am trying to read a quite large file (ca 13000 lines) directly into an
array (for speed)
like this
open (ACCS, "C:\\Perl\\BioPerl-0.7\\Seqdata\\Accession.txt") or die "can't open Accessions file";
@ets = <ACCS>;
$ets_count = @ets;
the actual data is a 2 column tab delimited file like:
[sample data was embedded as an OLE object; not preserved in the archive]
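For a file that genuinely fits in memory, the approach in the question does work: reading the filehandle in list context slurps every line into the array, and assigning the array to a scalar yields the element count. A runnable sketch with an invented stand-in file (the real path in the question is Windows-specific):

```perl
use strict;
use warnings;

# Invented stand-in for Accession.txt.
my $file = 'accession_demo.txt';
open my $out, '>', $file or die "can't create $file: $!";
print $out "AB00000$_\tdata$_\n" for 1 .. 3;
close $out;

open my $accs, '<', $file or die "can't open Accessions file: $!";
my @ets = <$accs>;            # list context: slurp all lines into the array
close $accs;
unlink $file;

my $ets_count = @ets;         # array in scalar context: number of lines
print "ets_count: $ets_count\n";   # ets_count: 3
```

The trade-off debated in this thread is exactly this: the slurp is convenient and fast for 13,000 lines, but memory use grows with the file, which is why the streaming replies recommend while(<FH>) for anything that might get large.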