On Sat, 17 Feb 2018 18:21:26 +0100
Manfred Lotz wrote:
> Thanks. The attached program does better as https://notabug.org
> works. Only http://scripts.sil.org doesn't work. It seems there are
> special checks active on that site.
Yeah, some sites block user-agents recognised as robots, scripts
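A rough sketch (not from the thread) of working around that by sending a browser-like agent string with LWP::UserAgent; the URL and agent value are just examples:

#!/usr/bin/perl
# Sketch: some sites reject the default libwww-perl agent string,
# so pass a browser-like one via LWP::UserAgent's agent option.
use strict;
use warnings;
use LWP::UserAgent;

my $url = "http://scripts.sil.org/OFL";
my $ua  = LWP::UserAgent->new( agent => "Mozilla/5.0" );   # example value
my $res = $ua->get($url);
print "$url => ", $res->status_line, "\n";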
On Thu, 15 Feb 2018 05:46:33 -0600
Mike Flannigan wrote:
> See if some version of the attached program
> gives the results you expect.
>
>
> Mike
>
>
Thanks. The attached program does better as https://notabug.org works.
Only http://scripts.sil.org doesn't work. It seems there are special
checks active on that site.
See if some version of the attached program
gives the results you expect.
Mike
On 2/13/2018 8:33 PM, beginners-digest-h...@perl.org wrote:
I tried WWW::Mechanize, and (of course) got also 403.
Really strange.
Is there another tool I could use for checking? I mean some tool in the
Perl universe?
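For reference, a minimal WWW::Mechanize check along those lines might look like the sketch below; autocheck => 0 keeps Mechanize from dying on the 403 so the status can be printed, and the URL is only an example:

#!/usr/bin/perl
# Sketch: fetch a page with WWW::Mechanize but report the status
# instead of dying on errors such as 403.
use strict;
use warnings;
use WWW::Mechanize;

my $url  = "http://scripts.sil.org/OFL";
my $mech = WWW::Mechanize->new( autocheck => 0 );
$mech->get($url);
print "$url => ", $mech->status, " ", $mech->res->message, "\n";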
> Is there another tool I could use for checking? I mean some tool in the
Perl universe?
Well, lwp-dump is a perl util - comes w/ LWP I believe. The sil.org, for
one, just returns forbidden/403 for their own policy reasons, but as far as
your "is it up?" question, that should be answer enough. It
On Tue, 13 Feb 2018 13:50:55 -0600
Andy Bach wrote:
> $ wget http://scripts.sil.org/OFL
> --2018-02-13 13:42:50-- http://scripts.sil.org/OFL
> Resolving scripts.sil.org (scripts.sil.org)... 209.12.63.143
> Connecting to scripts.sil.org (scripts.sil.org)|209.12.63.143|:80...
> connected.
> HTTP request sent, awaiting response... 302 Found
$ wget http://scripts.sil.org/OFL
--2018-02-13 13:42:50-- http://scripts.sil.org/OFL
Resolving scripts.sil.org (scripts.sil.org)... 209.12.63.143
Connecting to scripts.sil.org (scripts.sil.org)|209.12.63.143|:80...
connected.
HTTP request sent, awaiting response... 302 Found
Location: http://scrip
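That 302 can be followed and inspected from Perl as well; roughly as below (LWP follows redirects for GET by default, and the chain is available from the final response):

#!/usr/bin/perl
# Sketch: let LWP follow the redirect(s), then print each hop
# plus the final URL and status.
use strict;
use warnings;
use LWP::UserAgent;

my $url = "http://scripts.sil.org/OFL";
my $res = LWP::UserAgent->new->get($url);

for my $hop ($res->redirects) {
    print $hop->code, " -> ", scalar $hop->header('Location'), "\n";
}
print "final: ", $res->request->uri, " (", $res->status_line, ")\n";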
On Tue, 13 Feb 2018 10:47:42 -0600
Andy Bach wrote:
> The site doesn't like 'head' requests? get works
> #!/usr/bin/perl
>
> use strict;
> use warnings;
>
> use LWP::Simple;
> # my $url="https://shlomif.github.io/";
> my $url="http://www.notabug.org/";
> print "$url is ", (
>
The site doesn't like 'head' requests? get works
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple;
# my $url="https://shlomif.github.io/";
my $url="http://www.notabug.org/";
print "$url is ", (
    (! get($url)) ? "DOWN"
                  : "up"
), "\n";
Hi Manfred!
On Tue, 13 Feb 2018 12:25:31 +0100
Manfred Lotz wrote:
> Hi there,
> Somewhere I found an example how to check if a website is up.
>
> Here my sample:
>
> #! /usr/bin/perl
>
> use strict;
>
> use LWP::Simple;
> my $url="https://notabug.org";
> if (! head($url)) {
>     die "$url is DOWN"
> }
Hi there,
Somewhere I found an example how to check if a website is up.
Here my sample:
#! /usr/bin/perl
use strict;
use LWP::Simple;
my $url="https://notabug.org";
if (! head($url)) {
    die "$url is DOWN"
}
Running the above code I get
https://notabug.org is DOWN at ./check_url.pl l
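Since LWP::Simple's head() only reports success or failure, a slightly lower-level sketch like the one below (same idea, just with LWP::UserAgent) would show the actual status line behind that DOWN:

#!/usr/bin/perl
# Sketch: same up/down check, but report why it failed by printing
# the status line from the HEAD response.
use strict;
use warnings;
use LWP::UserAgent;

my $url = "https://notabug.org";
my $res = LWP::UserAgent->new->head($url);
die "$url is DOWN (", $res->status_line, ")\n" unless $res->is_success;
print "$url is up\n";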