Just fork the web application. That would probably be a much better
solution...
On Thursday 28 August 2003 04:13 pm, Robert Cummings wrote:
> This isn't the best solution but it might help bring down the total
> time. Can you set up a shell script to retrieve the content from a URL
> (in PHP if
hi,
David is right, you will not find an equivalent. My advice is to mix
PHP and Perl. You can get a Perl script to handle the URL retrieval
stuff and pass handling back to PHP when you're done. It's this
approach that I took when creating the mega upload progress bar for PHP.
all the best
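Not from the thread, but roughly what the PHP side of that PHP/Perl mix
could look like. fetch_urls.pl is a hypothetical helper (built on
LWP::Parallel, say) that is assumed to print one "url<TAB>file" line for
each page it has saved; everything else is standard PHP:

<?php
// Sketch of the PHP/Perl mix: hand the parallel fetching to a Perl helper
// (hypothetical fetch_urls.pl) and pick the results back up once it exits.

$urls = array(
    'http://example.com/a',
    'http://example.com/b',
    'http://example.com/c',
);

// Assumed contract: the helper prints one "url<TAB>outputfile" line per URL.
$cmd = 'perl fetch_urls.pl ' . implode(' ', array_map('escapeshellarg', $urls));

exec($cmd, $lines, $status);
if ($status !== 0) {
    die("fetch_urls.pl failed with status $status\n");
}

$pages = array();
foreach ($lines as $line) {
    list($url, $file) = explode("\t", $line, 2);
    $pages[$url] = file_get_contents($file);   // read what Perl downloaded
    unlink($file);                             // clean up the temp file
}

// $pages now maps each URL to its body; PHP takes over from here.
?>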
On Thu, 28 Aug 2003 15:49:15 -0700, you wrote:
>> Think you're out of luck. Yes, it's a problem I've run up against more than
>> once. There's no thread support in PHP [4. Anyone know if it's in 5?].
>php.net/pcntl_fork
Interesting - that's new.
Unix-only and not in the default install. Not ava
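For what it's worth, a rough sketch (not anyone's code from the thread) of
how pcntl_fork could be applied to the original problem. It assumes the
pcntl extension is compiled in and that this runs as a command-line
script, with plain file_get_contents doing each fetch:

<?php
// Fork one child per URL; each child writes its page to a temp file and
// exits, the parent waits for all children and reads the results back.
// Requires the pcntl extension (Unix only, not in the default build) and
// is meant for a CLI script, not a web server SAPI.

$urls = array(
    'http://example.com/a',
    'http://example.com/b',
    'http://example.com/c',
);

$children = array();   // pid => array(url, temp file)

foreach ($urls as $url) {
    $tmp = tempnam(sys_get_temp_dir(), 'fetch');
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed\n");
    } elseif ($pid == 0) {
        // Child: do the (blocking) fetch, then exit.
        file_put_contents($tmp, file_get_contents($url));
        exit(0);
    }
    // Parent: remember the child and start the next one right away.
    $children[$pid] = array($url, $tmp);
}

$pages = array();
foreach ($children as $pid => $info) {
    pcntl_waitpid($pid, $status);      // block until this child has exited
    list($url, $tmp) = $info;
    $pages[$url] = file_get_contents($tmp);
    unlink($tmp);
}

// Wall time is now roughly the slowest single fetch, not the sum of all.
?>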
This isn't the best solution but it might help bring down the total
time. Can you set up a shell script to retrieve the content from a URL
(in PHP if you wish) and then from your web app spawn 5 processes, with
destination temporary files for the data which you can then poll for
completion (microsl
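A rough sketch of that spawn-and-poll idea. fetch.php is a hypothetical
helper that takes a URL and an output file, writes the page to the file
and creates "<file>.done" when it has finished; the php CLI binary is
assumed to be in the PATH:

<?php
// Kick off one background fetcher per URL, then poll until every ".done"
// marker exists (or a deadline passes) before reading the results.

$urls = array(
    'http://example.com/a',
    'http://example.com/b',
    'http://example.com/c',
);

$jobs = array();   // url => output file

foreach ($urls as $url) {
    $out = tempnam(sys_get_temp_dir(), 'fetch');
    $cmd = sprintf('php fetch.php %s %s > /dev/null 2>&1 &',
                   escapeshellarg($url), escapeshellarg($out));
    exec($cmd);                 // redirect + trailing & => returns immediately
    $jobs[$url] = $out;
}

// Poll for the ".done" markers, sleeping briefly between passes.
$deadline = time() + 30;
do {
    clearstatcache();           // don't let the stat cache hide new files
    $pending = 0;
    foreach ($jobs as $out) {
        if (!file_exists($out . '.done')) {
            $pending++;
        }
    }
    if ($pending > 0) {
        usleep(100000);         // 100 ms between polls
    }
} while ($pending > 0 && time() < $deadline);

$pages = array();
foreach ($jobs as $url => $out) {
    $pages[$url] = file_exists($out . '.done') ? file_get_contents($out) : false;
    @unlink($out);
    @unlink($out . '.done');
}
?>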
php.net/pcntl_fork
On Thursday 28 August 2003 11:09 am, David Otton wrote:
> On Thu, 28 Aug 2003 20:25:05 +0300, you wrote:
> >I am looking for a PHP analog of Perl's LWP::Parallel.
> >I need to fetch several URLs (pages) from PHP at the same time.
> >I have a script which fetches 5-10 URLs, each URL fet
On Thu, 28 Aug 2003 20:25:05 +0300, you wrote:
>I am looking for a PHP analog of Perl's LWP::Parallel.
>I need to fetch several URLs (pages) from PHP at the same time.
>I have a script which fetches 5-10 URLs; each URL is fetched in 0.5 - 2 sec.
>In total it's about 10 seconds on average.
>I suppose if I fetched