----- Original Message ----
From: JBallinger <[EMAIL PROTECTED]>
To: [email protected]
Sent: Monday, March 17, 2008 9:18:12 PM
Subject: Re: Hash & CSV
On Mar 14, 3:26 pm, [EMAIL PROTECTED] (Manoj) wrote:
> Using Data::Dumper is taking more time on my 10000-line CSV file.
> This cleared up a few of my questions, and the benchmark was a
> valuable addition for me. Thanks.
> > 2) Is there an optimal method for reading a CSV file into a
> > hash table?
>
> > There may be better approaches, however this reads the CSV into a hash:
>
> > use Data::Dumper;
> > open(INFILE, "<", "sample.csv") or die $!;
> > my %hsh;
> > %hsh = ( %hsh, (split(/,/, $_))[1,2] ) while ( <INFILE> );
>
> That is a *very* inefficient way to populate a hash as you are copying
> the entire hash for every record in the file. Better to add the keys
> and values individually:
>
> my %hsh;
> while ( <INFILE> ) {
>     chomp;
>     my ( $key, $value ) = split /,/;
>     $hsh{$key} = $value;
> }
>
> > print Dumper \%hsh;
>
> John
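Filling in John's loop end to end, here is a minimal runnable sketch. It writes a tiny sample.csv first so it is self-contained (the actual thread concerns a 10000-line file; the sample data is made up):

```perl
use strict;
use warnings;

# Write a tiny sample.csv so the sketch runs standalone.
open my $out, '>', 'sample.csv' or die "write: $!";
print {$out} "alpha,1\nbeta,2\ngamma,3\n";
close $out;

# Build the hash one pair at a time: constant work per record,
# rather than copying the whole hash on every line as the
# %hsh = ( %hsh, ... ) version does.
my %hsh;
open my $in, '<', 'sample.csv' or die "read: $!";
while ( my $line = <$in> ) {
    chomp $line;
    my ( $key, $value ) = split /,/, $line;
    $hsh{$key} = $value;
}
close $in;

print "$_ => $hsh{$_}\n" for sort keys %hsh;
```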
Suppose my CSV file has 5 columns: f id fa mo ge
Will my ( $key, $value ) = split /,/; still work?
Is there any other method that is more efficient?
JB
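For reference, split itself handles any number of columns: a two-element assignment simply keeps the first two fields and discards the rest, and a list slice can pick out any columns you want. A small sketch (the field values are invented for illustration, following the f/id/fa/mo/ge column names above):

```perl
use strict;
use warnings;

# A made-up record with the five columns f, id, fa, mo, ge.
my $line = "f0,id0,fa0,mo0,ge0";

# split returns all five fields; a two-element assignment
# keeps the first two and silently discards the rest.
my ( $key, $value ) = split /,/, $line;    # 'f0', 'id0'

# A list slice selects arbitrary columns, e.g. the 2nd and 5th:
my ( $k, $v ) = ( split /,/, $line )[ 1, 4 ];    # 'id0', 'ge0'

print "$key=$value, $k=$v\n";
```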
Hi,
There is a module, AnyData, which can be used with several file formats,
including CSV. Here is a sample piece of code:
use AnyData;
use Data::Dumper;

my $format    = 'CSV';
my $data      = 'sample.csv';
my $open_mode = 'r';    # 'r' opens the file read-only

my $table = adTie( $format, $data, $open_mode );
print Dumper $table;
Hope this helps.
Best Regards,
Prabu