On 2008-03-21 13:58:59, Wolf wrote:
> Both are pretty effective and give pretty much the same results,
> however with CURL you can pass other things along (user:pass)
> which with wget you can not do.
???
wget http://${USER}:[EMAIL PROTECTED]/
is working and
wget --http-user="${US
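For comparison, the same user:pass trick from PHP's cURL extension is a
one-option job. A minimal sketch; the URL and credentials are placeholders,
not anything posted in the thread:

<?php
// Sketch: HTTP Basic auth via ext/curl (placeholder host and credentials).
$ch = curl_init('http://example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);      // return body as a string
curl_setopt($ch, CURLOPT_USERPWD, 'user:password');  // the user:pass being discussed
$page = curl_exec($ch);
curl_close($ch);
echo $page;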
On 2008-03-21 14:12:04, Wolf wrote:
> OK, so I stand corrected there... But has anyone seen a PHP port of
> wget, or is curl the only one of the two that does it natively in a
> version of PHP compiled with curl? :)
AFAIK, there is no native port. But why do you want one? -- "wget is
working p
On 2008-03-21 19:15:13, Børge Holen wrote:
> wget is fast and easy though... umm I'm on a direct 100mbit connection...
> wget does it brute
Sometimes it is too fast for me... :-)
Especially if I work in Paris on my Dual-STM-4 network...
Then, --limit-rate= is my friend.
Thanks, Greetings and
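PHP's cURL binding has a comparable throttle if the fetch happens inside a
script. A sketch with a placeholder URL; CURLOPT_MAX_RECV_SPEED_LARGE caps
the transfer in bytes per second:

// Sketch: limit a download to ~100 KB/s, analogous to wget --limit-rate.
$ch = curl_init('http://example.com/big-file');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_MAX_RECV_SPEED_LARGE, 102400); // bytes/sec cap
$data = curl_exec($ch);
curl_close($ch);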
To get directory permissions, you'd really have to go through either FTP or a
terminal connection (telnet, etc).
Yes, but that's doing it programmatically; what about this?
What if you knew that a directory on a site was set to 0777 -- what
damage could you cause?
I have seen scripts that cl[...]ation type link sources.
-TG
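If you do take the FTP route mentioned above, ext/ftp will hand you the
permission bits from a raw listing. A sketch with placeholder host and
credentials, not code from the thread:

// Sketch: list world-writable directories over FTP via ftp_rawlist().
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'pass');
foreach (ftp_rawlist($conn, '/') as $line) {
    // Typical LIST output starts "drwxrwxrwx ..."; the 9th permission
    // character is the world-write bit.
    if (preg_match('/^d.{7}w/', $line)) {
        echo "World-writable: $line\n";
    }
}
ftp_close($conn);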
- Original Message -
From: Ray Hauge <[EMAIL PROTECTED]>
To: tedd <[EMAIL PROTECTED]>
Cc: php-general@lists.php.net
Date: Fri, 21 Mar 2008 13:45:35 -0500
Subject: Re: [PHP] spider
> Have a look at something like this:
>
> http://simplehtmld
Ok, so the CURL and WGET stuff has been mentioned, but I don't think that
really addresses your question. You didn't ask what the "best way" to do
this is; you asked how you would do it in PHP.
Here's what I would consider to be the 'theory' of the exercise:
* Do we obey robots.txt? If so, get t
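The preview cuts off there, but the robots.txt step alone is easy to sketch.
An illustration, not the poster's code; the parsing is deliberately naive
(it ignores per-agent groups and Allow lines):

// Naive sketch: collect Disallow path prefixes from a site's robots.txt.
function disallowed_prefixes($base)
{
    $rules = array();
    $txt = @file_get_contents(rtrim($base, '/') . '/robots.txt');
    if ($txt === false) {
        return $rules;  // no robots.txt means nothing is disallowed
    }
    foreach (preg_split('/\r?\n/', $txt) as $line) {
        if (preg_match('/^Disallow:\s*(\S+)/i', $line, $m)) {
            $rules[] = $m[1];
        }
    }
    return $rules;
}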
On Fri, Mar 21, 2008 at 1:52 PM, tedd <[EMAIL PROTECTED]> wrote:
> Also, is there a way to spider through a remote web site gathering
> directory permissions?
I should hope not.
>
> If not, can one attempt to write a file and record the
> failures/successes (0777 directories)?
I don't know i
> I knew sometime I would have to figure out what CURL is, but now WGET
> (WETFTI) as well. I was hoping for something simple that wouldn't
> hurt my brain.
>
> Thanks a lot guys! :-)
For hijacking the thread? No Problem!
For making your brain hurt? Anytime!!
We're just here to help! ;)
tedd wrote:
Hi gang:
How do you spider a remote web site in php?
I get the general idea, which is to take the root page, strip out the
links and repeat the process on those links. But, what's the code? Does
anyone have an example they can share or a direction for me to take?
Also, is there a way to spider through a remote web site gathering
directory permissions?
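To make "strip out the links and repeat" concrete, here is a minimal
breadth-first sketch built on DOMDocument. It is an illustration only: it
keeps a seen-list, follows absolute http(s) hrefs, and ignores robots.txt
and relative-URL resolution entirely:

// Minimal spider sketch: fetch, extract <a href>, enqueue, repeat.
function spider($start, $max = 50)
{
    $queue = array($start);
    $seen  = array($start => true);
    while ($queue && count($seen) < $max) {
        $url  = array_shift($queue);
        $html = @file_get_contents($url);
        if ($html === false) {
            continue;  // skip pages that fail to load
        }
        $doc = new DOMDocument();
        @$doc->loadHTML($html);  // @ silences tag-soup warnings
        foreach ($doc->getElementsByTagName('a') as $a) {
            $href = $a->getAttribute('href');
            if (preg_match('#^https?://#', $href) && !isset($seen[$href])) {
                $seen[$href] = true;
                $queue[]     = $href;
            }
        }
    }
    return array_keys($seen);
}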
On Fri, Mar 21, 2008 at 1:52 PM, tedd <[EMAIL PROTECTED]> wrote:
> Hi gang:
>
> How do you spider a remote web site in php?
>
> I get the general idea, which is to take the root page, strip out the
> links and repeat the process on those links. But, what's the code?
I make absolutely no war
At 1:52 PM -0400 3/21/08, tedd wrote:
Hi gang:
How do you spider a remote web site in php?
You guys are always doing this. I ask a simple question like what's a
+ b and you guys hijack the thread and start discussing the
quadratic equation. :-)
I knew sometime I would have to figure out what CURL is, but now WGET
(WETFTI) as well. I was hoping for something simple that wouldn't
hurt my brain.
Thanks a lot guys! :-)
"Børge Holen" <[EMAIL PROTECTED]> wrote:
> On Friday 21 March 2008 19:12:04 Wolf wrote:
> > Daniel Brown <[EMAIL PROTECTED]> wrote:
> > > On Fri, Mar 21, 2008 at 1:58 PM, Wolf <[EMAIL PROTECTED]> wrote:
> > > > In one word: CURL
> > > >
> > > > In another word: WGET
> > > >
> > > >
On Friday 21 March 2008 19:12:04 Wolf wrote:
> Daniel Brown <[EMAIL PROTECTED]> wrote:
> > On Fri, Mar 21, 2008 at 1:58 PM, Wolf <[EMAIL PROTECTED]> wrote:
> > > In one word: CURL
> > >
> > > In another word: WGET
> > >
> > > Both are pretty effective and give pretty much the same results,
On Friday 21 March 2008 18:58:59 Wolf wrote:
> tedd <[EMAIL PROTECTED]> wrote:
> > Hi gang:
> >
> > How do you spider a remote web site in php?
> >
> > I get the general idea, which is to take the root page, strip out the
> > links and repeat the process on those links. But, what's the code?
>
Daniel Brown <[EMAIL PROTECTED]> wrote:
> On Fri, Mar 21, 2008 at 1:58 PM, Wolf <[EMAIL PROTECTED]> wrote:
> >
> > In one word: CURL
> >
> > In another word: WGET
> >
> > Both are pretty effective and give pretty much the same results, however
> > with CURL you can pass other things
On Fri, Mar 21, 2008 at 1:58 PM, Wolf <[EMAIL PROTECTED]> wrote:
>
> In one word: CURL
>
> In another word: WGET
>
> Both are pretty effective and give pretty much the same results, however
> with CURL you can pass other things along (user:pass) which with wget you
> can not do.
[EMAIL PROTECTED]
On Fri, 2008-03-21 at 13:58 -0400, Wolf wrote:
> tedd <[EMAIL PROTECTED]> wrote:
>
> In one word: CURL
>
> In another word: WGET
>
> Both are pretty effective and give pretty much the same results, however
> with CURL you can pass other things along (user:pass) which with
> wget you can not do.
tedd <[EMAIL PROTECTED]> wrote:
> Hi gang:
>
> How do you spider a remote web site in php?
>
> I get the general idea, which is to take the root page, strip out the
> links and repeat the process on those links. But, what's the code?
> Does anyone have an example they can share or a direction for me to take?
On Friday 09 February 2001 21:28, Angerer, Chad wrote:
maybe late and all, this pieca code was lying around somewhere in my socks ;)
see if ya can use it,
hrishi
begin code snippet
// First line truncated in the preview; presumably it pulled the
// directory handle from the same object as the path below:
$handle = $fromdir_class->handle;
$currpath = $fromdir_class->path;
chdir($currpath);
$dirs = array();
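// The preview truncates here. A plausible continuation (a guess, not
// hrishi's actual code) would walk the handle with readdir() to fill $dirs:
while (($entry = readdir($handle)) !== false) {
    if ($entry != '.' && $entry != '..' && is_dir($entry)) {
        $dirs[] = $entry;  // collect subdirectory names for later recursion
    }
}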
- Original Message -
From: "Angerer, Chad" <[EMAIL PROTECTED]>
> I am not sure if this is the correct wording, but I am wondering about a
> good tutorial on writing a PHP script that will spider a directory
> structure, extract the file names, and insert them into a database.
Also
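A present-day sketch of what Chad describes, using SPL's recursive iterators
and PDO; the DSN, credentials, and "files" table are invented for the
example:

// Sketch: walk a directory tree and insert each file name into a table.
$pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO files (path, name) VALUES (?, ?)');
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/var/www', FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $file) {
    if ($file->isFile()) {
        $stmt->execute(array($file->getPath(), $file->getFilename()));
    }
}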