Just a thought -

A web page (say www.yahoo.com) may contain several links, some of which
point to non-HTML resources (e.g. media files, Flash files). Do you want
"fetch_a_page" to just record the URI of each hyperlink, or actually
"consume" (fetch, parse, read) the resource it points to?
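
For the "consume" case, something like this might do. It's only a minimal
sketch, assuming LWP::UserAgent and HTML::LinkExtor are installed, and it
only collects embedded assets (img/script/link tags), not the links you
would follow:

use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;
use URI;

# Pull the URIs of embedded assets (images, scripts, stylesheets)
# out of an HTTP::Response; plain <a href> links are skipped.
sub extract_assets {
    my ($res) = @_;
    my @assets;
    my $parser = HTML::LinkExtor->new(sub {
        my ($tag, %attr) = @_;
        push @assets, $attr{src}  if $attr{src};                    # img, script, (i)frame
        push @assets, $attr{href} if $tag eq 'link' && $attr{href}; # stylesheets
    });
    $parser->parse($res->decoded_content);
    $parser->eof;
    # Resolve relative URIs against the page's base URL
    return map { URI->new_abs($_, $res->base)->as_string } @assets;
}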

Also, since fetching the links serially is often not fast enough,
browsers typically download several resources in parallel.
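
Continuing the sketch above, a crude fetch_a_page could fork a few
children to fetch the assets in parallel and return the wall-clock time.
This assumes Parallel::ForkManager and Time::HiRes are available and
reuses extract_assets() from the previous snippet:

use Time::HiRes qw(time);
use Parallel::ForkManager;

sub fetch_a_page {
    my ($url) = @_;
    my $start = time;                        # hi-res clock via Time::HiRes

    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($url);                # the page itself
    die "GET $url failed: ", $res->status_line unless $res->is_success;

    my $pm = Parallel::ForkManager->new(5);  # at most 5 children at once
    for my $asset (extract_assets($res)) {
        $pm->start and next;                 # parent forks a child and moves on
        $ua->get($asset);                    # child fetches one asset, then exits
        $pm->finish;
    }
    $pm->wait_all_children;                  # wait for every child to finish

    return time - $start;
}

my $dl_time = fetch_a_page('http://www.yahoo.com');
printf "%.2f seconds\n", $dl_time;

Bear in mind this measures elapsed wall-clock time over your own
connection, so the number will vary from run to run.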

-- Jayesh



----- Original Message ----
From: "practicalp...@gmail.com" <practicalp...@gmail.com>
To: Perl Beginners <beginners@perl.org>
Sent: Wednesday, April 15, 2009 8:38:33 AM
Subject: calc page's downloading time

Greetings,


What's an easy way to calculate a web page's download time?
(Not only the page itself, but all the elements in the page: images,
JS, CSS, etc.)
For example, I want a function like:

my $dl_time = fetch_a_page("www.yahoo.com");


Thanks in advance.

Regards.

-- 
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/



