I agree with Chris's method, but if you don't have cron, what I do is a
page-based cache.

Since the weather does not change with every page hit, you could store the
parsed page in a database, or even write it to a flat file. If the page is
hit 100 times an hour and you refresh the cached version hourly, you only
get 1 "slow" page per hour instead of 100. If the page gets hit 1000
times... you get the idea.

You could check the timestamp of the file (with file-based caching) or add
a timestamp field to your SQL table.
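A minimal sketch of the file-based version, checking the cache file's
mtime -- the cache path, refresh interval, and fetch callback are all
placeholders for your own scraping code:

```php
<?php
// Sketch of the file-based cache described above. $fetch_fn is a
// placeholder for whatever function does your slow scrape/parse.
function get_weather_cached($cache_file, $max_age, $fetch_fn)
{
    // Serve the cached copy if it exists and is still fresh.
    if (file_exists($cache_file)
        && (time() - filemtime($cache_file)) < $max_age) {
        return file_get_contents($cache_file);
    }

    // Otherwise do the one "slow" fetch/parse and cache the result.
    $html = call_user_func($fetch_fn);
    $fp = fopen($cache_file, 'w');
    fwrite($fp, $html);
    fclose($fp);
    return $html;
}
?>
```

Every hit after the first within $max_age seconds is just a cheap file
read.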

This sort of caching mechanism is used quite extensively on large sites, but
it's easy enough to implement for smaller sites too.

Of course, what you're doing is totally illegal unless you have an
agreement with MSN to scrape their site <g> I know we don't like Microsoft,
but the law's the law.... ;-)

"Chris W. Parker" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
DougD <mailto:[EMAIL PROTECTED]>
    on Thursday, July 31, 2003 11:18 AM said:

> If it were possible I want the include to occur after the rest of the
> page is loaded.

Maybe instead of including the file that does the processing and waiting
forever for it to finish, you might consider setting up that same page
as a cron job and have it create another page on your server that you can
quickly read/parse into your web page.

You could have the cron job run every 1/2/3/4/5/10/15 minutes depending
on how accurate it needs to be. Each time the page is loaded by the
browser the server will grab the latest version and display it.
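Concretely, that setup might look something like this (the paths and
script name are made up for illustration):

```
# crontab entry: regenerate the cached weather page every 5 minutes
*/5 * * * * /usr/bin/php /home/you/fetch_weather.php > /home/you/htdocs/weather.html
```

Then all the page itself has to do is a cheap read:

```
<?php readfile('/home/you/htdocs/weather.html'); ?>
```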


hth,
Chris.


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
