On Wed, Apr 15, 2009 at 09:38,  <practicalp...@gmail.com> wrote:
> Greetings,
>
>
> What's an easy way to calculate a web page's download time?
> (not only the page itself, but all of the elements in the page,
> like images, JS, CSS, etc.)
> For example, I want to get a function:
>
> my $dl_time = fetch_a_page("www.yahoo.com");
snip

A page's download time will differ from machine to machine
on the net.  You are better off figuring out how much data needs
to be downloaded (the sum of all of the HTML, CSS, JavaScript,
image, etc. files).  If some of this data can be cached (such as
an external CSS file that all of the pages share), you will want
to compute two sizes: first fetch and subsequent fetches.  Once
you have the total size you can calculate the download times for
a variety of download speeds.
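
For instance, a 500 KB page is about 4,096,000 bits, so it would
take roughly 73 seconds over a 56 kbit/s modem, but only about 4
seconds over a 1 Mbit/s link.  The math (ignoring latency, TCP
overhead, parallel connections, and so on) is just

    # seconds = bytes * 8 / link speed in bits per second
    my $seconds = $total_bytes * 8 / $bits_per_second;

where the variable names are, of course, just illustrative.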

If this is someone else's page you can probably fetch the
page with LWP::Simple[1], then parse it with HTML::Parser[2],
and then download each of the items that would normally be
downloaded (like external CSS and JavaScript files and images).

Get the file sizes and add them all up.
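
Untested, but a rough sketch along those lines might look like this
(it only follows src attributes and <link href>, and ignores things
like CSS @import rules and duplicate URLs):

#!/usr/bin/perl
use strict;
use warnings;
use LWP::Simple qw(get);
use HTML::Parser;
use URI;

my $base = shift || 'http://www.yahoo.com/';
my $html = get($base) or die "could not fetch $base\n";
my $total = length $html;    # the page itself

# collect the URLs of the things a browser would also fetch
my @assets;
my $parser = HTML::Parser->new(
    api_version => 3,
    start_h     => [
        sub {
            my ($tag, $attr) = @_;
            # img, script, iframe, etc. use src; external CSS uses
            # href on a link tag
            push @assets, $attr->{src}  if $attr->{src};
            push @assets, $attr->{href}
                if $tag eq 'link' and $attr->{href};
        },
        'tagname, attr',
    ],
);
$parser->parse($html);
$parser->eof;

# fetch each asset and add its size to the total; using the length
# of the body is an approximation, a HEAD request for Content-Length
# would be more exact
for my $url (@assets) {
    my $abs     = URI->new_abs($url, $base);
    my $content = get($abs);
    $total += length $content if defined $content;
}

print "first-fetch size of $base: $total bytes\n";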

1. http://search.cpan.org/dist/libwww-perl/lib/LWP/Simple.pm
2. http://search.cpan.org/dist/HTML-Parser/Parser.pm

-- 
Chas. Owens
wonkden.net
The most important skill a programmer can have is the ability to read.
