Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> I've executed the script using the IP number instead of the domain name, but it
> takes more or less the same time in these 2 steps:
>
> $browser2->get($url);
> $content = $browser2->content();
>
> Honestly, I don't know if these 9 seconds can be improved ...
> Regards,
> T

You also have to consider the network transfer ...
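For reference, one way to see where those 9 seconds actually go is to time the two calls separately with Time::HiRes. A minimal sketch (the URL is a placeholder, and $browser2 is assumed to be the same WWW::Mechanize object as in the script above):

#!/usr/bin/env perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);
use WWW::Mechanize;

my $url      = 'http://www.example.com/';   # placeholder URL
my $browser2 = WWW::Mechanize->new();

my $t0 = [gettimeofday];
$browser2->get($url);                  # the network fetch happens here
my $t1 = [gettimeofday];
my $content = $browser2->content();    # only returns the already-fetched body
my $t2 = [gettimeofday];

printf "get():     %.3f s\n", tv_interval( $t0, $t1 );
printf "content(): %.3f s\n", tv_interval( $t1, $t2 );

If nearly all of the time shows up in get(), the bottleneck is the network and the remote server, not Perl.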
Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> Does it work for this kind of URLs?
> http://patft.uspto.gov/netacgi/nph-Parser?
> Thanks!
> T.

For this kind of problem you always have URI:

#!env perl
use strict;
use warnings;
use URI;

my $url = "http://patft.uspto.gov/net ...
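A sketch of how URI is typically used to build such a query URL (the parameter names and values below are placeholders, not the real USPTO ones):

use strict;
use warnings;
use URI;

my $url = URI->new('http://patft.uspto.gov/netacgi/nph-Parser');
$url->query_form(
    Sect1 => 'PTO2',    # placeholder parameter
    TERM1 => 'perl',    # placeholder parameter
);

print $url->as_string, "\n";
# prints: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&TERM1=perl

query_form() takes care of escaping the values, so the query string does not have to be built by hand.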
Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> I've realized that for each link, I spend most of the time in the following
> Perl script:
>
> foreach my $url (@lines) {    # I read my 1-row URL file
>     $contador = 2;
>     $test     = 0;
>     while ( !$test ) {
>         $browser2->get($url);
>         $content = $browser2->content();
>         # <-- in these 2 steps
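The loop is truncated above; a sketch of one plausible completion, just to make the shape of the code concrete (the success test, the reading of $contador as a retry counter, and the placeholder URL are assumptions, not the original code):

use strict;
use warnings;
use WWW::Mechanize;

my $browser2 = WWW::Mechanize->new( autocheck => 0 );   # don't die on HTTP errors
my @lines    = ('http://www.example.com/');             # placeholder URL list

foreach my $url (@lines) {
    my $contador = 2;                     # assumed: number of attempts left
    my $test     = 0;
    while ( !$test && $contador-- > 0 ) {
        $browser2->get($url);
        if ( $browser2->success ) {
            my $content = $browser2->content();
            # ... process $content here ...
            $test = 1;
        }
    }
}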
Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> I do it from Java and not from Perl because I need to perform an
> insert into the database each time I process a link, and I also have to
> inform via RSS about the progress of the global download process (23.343 out
> of 70.000 files have been downloaded).
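For comparison, a per-link insert can also be done directly from Perl with DBI; a minimal sketch (the driver, connection string, table and column names are all placeholders):

use strict;
use warnings;
use DBI;

# All connection details and names below are placeholders.
my $dbh = DBI->connect(
    'dbi:mysql:database=patents;host=localhost',
    'user', 'password',
    { RaiseError => 1, AutoCommit => 1 },
);

my $sth = $dbh->prepare(
    'INSERT INTO downloads (url, downloaded_at) VALUES (?, NOW())'
);

# Inside the download loop, after each link has been processed:
my $url = 'http://www.example.com/some/link';    # placeholder
$sth->execute($url);

That keeps the bookkeeping in the same process that does the downloading, so Java does not have to drive each fetch.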
Hi Tatiana,

2007/1/22, Tatiana Lloret Iglesias <[EMAIL PROTECTED]>:
> Regarding the performance problem, the schema of my application is:
>
> 1. I execute a Perl script which performs a search in a public database. It
> gets the total results in *several pages*. Pressing the "Next Page" button
> (with the Perl script) I get a list of all the links related to my query
> (70.000 more or less) ...
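A sketch of how that first step usually looks with WWW::Mechanize, paging through the results and collecting links (the search URL, the "Next Page" link text and the link filter are assumptions about the target site):

use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();
$mech->get('http://www.example.com/search');   # placeholder search URL

my @links;
while (1) {
    # Collect the result links on the current page (the regex is a placeholder).
    push @links, map { $_->url_abs } $mech->find_all_links( url_regex => qr/detail/ );

    # Follow "Next Page" until there is no such link any more.
    my $next = $mech->find_link( text => 'Next Page' );
    last unless $next;
    $mech->get( $next->url_abs );
}

print scalar(@links), " links collected\n";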
Tatiana Lloret Iglesias wrote:
> Hi all,
> From my Java application I invoke a Perl script which downloads a huge
> quantity of files from an external database using the WWW::Mechanize library,
> and my problem is that I have big CPU performance problems ... can you give
> me any advice to avoid this?

Hi Tati ...
Hi all,

From my Java application I invoke a Perl script which downloads a huge
quantity of files from an external database using the WWW::Mechanize library,
and my problem is that I have big CPU performance problems ... can you give me
any advice to avoid this?

Thanks!
T.
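One common way to keep CPU and memory usage down when downloading many large files is to stream each response straight to disk instead of holding it in a Perl string; a sketch using the :content_file option that WWW::Mechanize inherits from LWP::UserAgent (URL and file name are placeholders):

use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new();

my $url  = 'http://www.example.com/bigfile.pdf';   # placeholder
my $file = 'bigfile.pdf';                          # placeholder

# The response body is written directly to $file instead of being
# accumulated in memory, so content() is not needed for the download itself.
$mech->get( $url, ':content_file' => $file );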