I've realized that, for each link, I spend most of the time in the following
Perl script:

foreach my $url (@lines) {       # read my 1-row URL file
    $contador = 2;
    $test     = 0;
    while (!$test) {

        $browser2->get($url);
        $content = $browser2->content();

        # ... rest of the loop omitted
In those two calls (get and content) I spend 6 seconds for an 86 KB HTML
page. Is that normal? Can I perform these two steps faster?
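
A quick way to see where those 6 seconds actually go (just a sketch; I'm
assuming $browser2 is a WWW::Mechanize object and the URL below is a
placeholder) is to time the two calls separately with Time::HiRes:

use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);
use WWW::Mechanize;

my $browser2 = WWW::Mechanize->new();
my $url = 'http://example.com/some-page';    # placeholder URL

my $t0 = [gettimeofday];
$browser2->get($url);                        # network fetch
my $t1 = [gettimeofday];
my $content = $browser2->content();          # copy of the response body
my $t2 = [gettimeofday];

printf "get():     %.3f s\n", tv_interval($t0, $t1);
printf "content(): %.3f s\n", tv_interval($t1, $t2);

If almost all of the time is spent in get(), the bottleneck is the network
or the server rather than the Perl code itself.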

Thanks!

T



On 1/22/07, Igor Sutton <[EMAIL PROTECTED]> wrote:

Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> I do it from Java and not from Perl because I need to perform an
> insert into the database each time I process a link, and I also have to
> report the progress of the global download process via RSS (23,343 out
> of 70,000 files have been downloaded) ....
>

You can use the awesome XML::RSS module to create the RSS feed, and for
the database inserts you have the excellent DBI module.
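
Something like the following could tie the two together (an untested
sketch; the DSN, credentials, table name and feed URLs below are
placeholders you would replace with your own):

use strict;
use warnings;
use DBI;
use XML::RSS;

# Placeholders: adjust the DSN, credentials, table and URLs to your setup.
my $dbh = DBI->connect('dbi:mysql:dbname=downloads', 'user', 'password',
                       { RaiseError => 1 });
my $sth = $dbh->prepare('INSERT INTO links (url) VALUES (?)');

my ($url, $done, $total) = ('http://example.com/file/1', 23_343, 70_000);

# Record the processed link.
$sth->execute($url);

# Publish the progress via RSS.
my $rss = XML::RSS->new(version => '2.0');
$rss->channel(
    title       => 'Download progress',
    link        => 'http://example.com/progress',
    description => 'Status of the bulk download',
);
$rss->add_item(
    title => "$done out of $total files have been downloaded",
    link  => 'http://example.com/progress',
);
$rss->save('progress.rss');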

I bet Java is your problem there (no, I'm not initiating a language war
here).

--
Igor Sutton Lopes <[EMAIL PROTECTED]>
