> "Dan" == Dan Anderson <[EMAIL PROTECTED]> writes:
Dan> I guess I should stop then, but I was looking at O'Reilly's
Dan> robots.txt file (http://safari.oreilly.com/robots.txt):
Dan> User-Agent: *
Dan> Allow: /
Dan> Which made me think spidering was alright.
That's for spiders
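For what it's worth, the robots.txt record quoted above can be checked programmatically. A minimal sketch using WWW::RobotRules (shipped with libwww-perl) -- the agent name "MyBot/1.0" and the book URL are made up for illustration; note that classic WWW::RobotRules only honours Disallow lines, so a record with no Disallow leaves everything permitted:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::RobotRules;

# The robots.txt quoted in the thread.
my $robots_url = 'http://safari.oreilly.com/robots.txt';
my $robots_txt = "User-Agent: *\nAllow: /\n";

# Hypothetical agent name; rules are keyed by the robots.txt host.
my $rules = WWW::RobotRules->new('MyBot/1.0');
$rules->parse($robots_url, $robots_txt);

# With no Disallow lines, every URL on that host is permitted.
print $rules->allowed('http://safari.oreilly.com/somebook')
    ? "allowed\n"
    : "disallowed\n";
```

Of course, as the rest of the thread points out, robots.txt only speaks for crawlers; it says nothing about the site's Terms of Service.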
> "Fair use" is copyright law -- I don't know whether you're infringing
> anybody's copyright, but you're certainly violating O'Reilly's Terms of
> Service, which requires that you agree:
>
> not to use "Web spiders" or any other automated retrieval
> mechanisms when using the Service other
On Fri, Dec 26, 2003 at 12:52:06PM -0500, Dan Anderson wrote:
> So, all in all, I think that my usage falls under the term fair use.
> I have no desire to circumvent Safari's security -- I'm just looking
> to speed up something I do which conforms to the TOS of the web site.
"Fair use" is copyright law -- I don't know whether you're infringing
anybody's copyright, but you're certainly violating O'Reilly's Terms of
Service.
> Call me an old fogy, but I think that some of the mechanization of Web
> communications has gone too far. Providing interactive features in the CGI
> is one thing. It provides services for both sides of any transaction
> involved. Batch harvesting of pages meant for human perusal, like batch
>
zentara wrote:
> On 24 Dec 2003 16:05:16 -0500, [EMAIL PROTECTED] (Dan Anderson) wrote:
> >
> >I am trying to create a spider to grab my books off of Safari
> >for a batch printing job so I don't need to go through each chapter
> >myself and hit the Print button. So I used this script
I am trying to create a spider to grab my books off of Safari
for a batch printing job so I don't need to go through each chapter
myself and hit the Print button. So I used this script to try and log
myself in to the safari site:
# BEGIN CODE
#!/usr/bin/perl
use strict;
use warnings;
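The script is cut off above, but a cookie-based form login with LWP::UserAgent generally looks like the sketch below. The login URL and the field names "usr" and "pwd" are guesses for illustration, not Safari's actual form -- view the login page's HTML to find the real ones:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;
use HTTP::Request::Common qw(POST);

my $ua = LWP::UserAgent->new(agent => 'Mozilla/5.0');

# Sessions are usually cookie-based, so keep a cookie jar
# that persists between runs.
$ua->cookie_jar(HTTP::Cookies->new(file => 'cookies.txt', autosave => 1));

# Build the login POST (hypothetical URL and field names).
my $req = POST 'http://safari.oreilly.com/login',
    [ usr => 'my-username', pwd => 'my-password' ];

# my $res = $ua->request($req);                 # send it when ready
# die $res->status_line unless $res->is_success;

print $req->method, ' ', $req->uri, "\n";
```

After a successful login the cookie jar carries the session, so subsequent `$ua->get(...)` calls are authenticated.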
> > -----Original Message-----
> > From: Tim Keefer [mailto:[EMAIL PROTECTED]]
> > Sent: Monday, 18 June 2001 15:46
> > To: Ela Jarecka; Beginners list (E-Mail)
> > Subject: Re: Problems with LWP::UserAgent and HTTP::Response
Hi Ela,
The documentation for the Perl LWP user agent seems sparse. I had a difficult
time figuring out how to send multipart form-data. I'll share the code with
you that someone shared with me. Hope it helps.
require LWP;
use LWP::UserAgent;
use HTTP::Request::Common;
# Create a user agent object
my $ua = LWP::UserAgent->new;
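Tim's snippet is truncated, but the multipart piece he mentions is the `Content_Type => 'form-data'` switch on HTTP::Request::Common's POST. A sketch, with a placeholder upload URL; a real upload would normally attach a file by path with `file => ['output.xml']`, but passing `undef` as the path and supplying the bytes inline keeps the example self-contained:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Request::Common qw(POST);

# 'form-data' tells HTTP::Request::Common to encode the body
# as multipart/form-data instead of the urlencoded default.
my $req = POST 'http://example.com/upload',          # placeholder URL
    Content_Type => 'form-data',
    Content      => [
        # [ $path, $filename, @part_headers ] describes a file part;
        # an undef path means "use the Content header below as the bytes".
        file => [ undef, 'output.xml',
                  Content_Type => 'text/xml',
                  Content      => '<doc/>' ],
    ];

print $req->header('Content-Type'), "\n";   # multipart/form-data; boundary=...
```

The request can then be sent with `$ua->request($req)` as usual.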
Hi,
I am using the following code to send an XML document ( output.xml ) to a
remote server:
use strict;
use LWP::Debug qw(+);
use LWP::UserAgent;
use IO;
my $resp;
$resp = 'response.xml';
my $FILEH;
open($FILEH, '<', 'output.xml') or die "Can't open file output.xml: $!\n";
my $ua = LWP::UserAgent->new;
#another
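Ela's snippet breaks off, but one common way to ship an XML document is to make it the raw body of a POST with an explicit `text/xml` content type. A minimal sketch; the receiving URL is a placeholder, and a real run would slurp output.xml rather than use an inline string:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request;

# In practice, read this from output.xml.
my $xml = '<?xml version="1.0"?><doc/>';

my $req = HTTP::Request->new(POST => 'http://example.com/receive');
$req->content_type('text/xml');
$req->content($xml);

my $ua = LWP::UserAgent->new;
# my $res = $ua->request($req);                  # send when ready
# open my $out, '>', 'response.xml' or die $!;   # then save $res->content

print length($req->content), " bytes ready\n";
```

If the server instead expects a form upload, the multipart/form-data approach from earlier in the thread applies.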