I need to read in from a temp file that is about 10 megs big in 1.5 meg increments and write results to a database. I can't slurp up the whole temp file because I'm only allowed 2 megs of memory. I was hoping to read in only 1.5 megs per pass of the file but the read seems to be slurping up the whole thing. Do I need sysread?

Anyway, here's my attempt:


open (TEMP, "dataTemp.txt") or die "Can't open dataTemp for reading:$!"; my $offset = 0; while (read TEMP, my $tempbuf, 1500000, $offset) { my @temp = split/[\r\n\f]/, $tempbuf;

foreach my $temp(@temp)
{ my ($label,
$genre,
$catalogue_num,
$upc,
$artist,
$title,
$description,
$format,
$shop_price,
$dist_price,
$dist_1,
$dist_2,
$vendor,
$cost,
$release_date,
$special_buy,
$items_buy_unit,
$items_sell_unit,
$quant_break,
$asset_acct,
$buy,
$buy_unit_measure,
$expense_cos_account,
$inactive_item,
$income_acct,
$inventory,
$item_picture,
$minimum_level,
$selling_base_price,
$price_level_b,
$price_level_c,
$price_level_e,
$price_level_f,
$sell,
$sell_unit_measure,
$quantity_break) = split/\t/, $temp; next if $genre eq "";
my $inserts = "INSERT INTO bigdata VALUES (
'$label',
'$genre',
'$catalogue_num',
'$upc',
'$artist',
'$title',
'$description',
'$format',
'$shop_price',
'$dist_price',
'$dist_1',
'$dist_2',
'$vendor',
'$cost',
'$release_date',
'$special_buy',
'$items_buy_unit',
'$items_sell_unit',
'$quant_break',
'$asset_acct',
'$buy',
'$buy_unit_measure',
'$expense_cos_account',
'$inactive_item',
'$income_acct',
'$inventory',
'$item_picture',
'$minimum_level',
'$selling_base_price',
'$price_level_b',
'$price_level_c',
'$price_level_e',
'$price_level_f',
'$sell',
'$sell_unit_measure',
'$quantity_break')";


my $sth = $dbh->do($inserts) or warn "Can't load <TEMP> into bigdata: $!:$@";


}
$offset += 1500000;
}
close (TEMP);
















-- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]



Reply via email to