Hi,

I have a bunch of news websites stored in my MySQL database, and each
morning a script visits each site and downloads the top stories (this
is for personal use, not commercial). My problem is that sometimes
WWW::Mechanize fails to get a website because the server is busy, or
for whatever other reason. How do I get it so that instead of dying
right then and there it just ignores that site and moves on to the
next page?

Partial code below:

........
while ( my $ref = $st->fetchrow_arrayref() )
{
    print "Retrieving news stories from $page\n";
    my $mech = WWW::Mechanize->new( autocheck => 1 );
    # this should just throw a warning and then continue,
    # but instead the script dies here
    $mech->get($page) or warn "Cannot get page" and next;
    my $content = $mech->content();
    print "$content\n\n\n";
}
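For reference, the skip-on-failure behavior I'm after looks like this
eval-based sketch. Here fetch_page() is just a hypothetical stand-in
for the $mech->get + $mech->content calls (which die when
autocheck => 1 is on), so the skip logic is visible on its own:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for $mech->get($page) + $mech->content():
# it dies on failure, the same way autocheck => 1 makes get() die.
sub fetch_page {
    my ($url) = @_;
    die "Error GETing $url: 500 Internal Server Error\n" if $url =~ /bad/;
    return "<html>stories from $url</html>";
}

my @pages = ( 'http://good.example', 'http://bad.example', 'http://also-good.example' );
my @fetched;

for my $page (@pages) {
    # eval traps the die; on failure it returns undef and sets $@
    my $content = eval { fetch_page($page) };
    if ( !defined $content ) {
        warn "Cannot get $page: $@";    # $@ holds the die message
        next;                           # move on to the next site
    }
    push @fetched, $page;
    print "$content\n";
}

print scalar(@fetched), " of ", scalar(@pages), " pages fetched\n";
```

Or would passing autocheck => 0 to the constructor and then checking
$mech->success after each get() be the cleaner way to do this?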

Thanks


-- 
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/
