On Fri, 2009-01-16 at 16:03 -0800, Sam Smith wrote:
> I need a script that will crawl a list of websites and download all .jpg,
> .gif, .png files.
> 
> I can think of a few ways to start, like fopen() or maybe cURL. And it
> dawned on me that I'd need to handle files overwriting each other when
> they share the same name. It would also be nice to save each file's full
> URL in a database along with the path on the local server where I'm
> saving it.
> 
> I was hoping someone might say, "Dude, that's simple, just do this..."
> before I spent hours guessing.
> 
> Anyone?

Use wget to crawl the sites and save the images for you.
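
If you only need the images, wget can restrict the download to certain
extensions, and with -nd it should handle duplicate names for you (it
appends .1, .2, ... rather than overwriting). A rough example, assuming
GNU wget and a sites.txt with one URL per line:

  wget -r -l 5 -nd -A jpg,jpeg,gif,png -P images/ -i sites.txt

wget won't do the database part of your question, so here's a minimal
PHP sketch of the record-keeping side only. It assumes a MySQL table
named images with url and local_path columns, and that $imageUrls
already holds the image URLs you collected; those names are made up for
illustration, not a recipe:

  <?php
  // Assumed connection details and table layout -- adjust to your setup.
  $pdo  = new PDO('mysql:host=localhost;dbname=crawler', 'user', 'pass');
  $stmt = $pdo->prepare('INSERT INTO images (url, local_path) VALUES (?, ?)');

  $saveDir = '/var/www/images/';
  foreach ($imageUrls as $url) {
      $name = basename(parse_url($url, PHP_URL_PATH));
      $path = $saveDir . $name;
      // Avoid overwriting files that share a name by prefixing a counter.
      for ($i = 1; file_exists($path); $i++) {
          $path = $saveDir . $i . '_' . $name;
      }
      // Download the image and record where it came from.
      file_put_contents($path, file_get_contents($url));
      $stmt->execute(array($url, $path));
  }
  ?>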

Cheers,
Rob.
-- 
http://www.interjinn.com
Application and Templating Framework for PHP

