Naslund, Steve wrote:
> Average != Peak.
> What is peak? There is a question for you. If we get all the way down to the
> fundamentals of any network, peak is always 100%. There is either a bit on the
> wire or not. Your network is either 100% busy or 100% idle at any
> instantaneous moment in time. What matters is average transfer rate to the
> user experience and even that varies a lot depending on the app in question and
> how that app tolerates things like jitter, loss, and latency.
That's simply wrong - at least for folks who do any work-related stuff
at home.
Consider: I've just edited a large sales presentation - say a PPT deck
with some embedded video, totaling maybe 250 MB (2 Gbit) - and I want to
upload it to the company server. And let's say I want to do that 5
times during a 12-hour day (it's crunch time, we're doing lots of edits).
On average, we're talking 10 Gbit over 12 hours, or roughly 230 kbps.
On the other hand, if I try to push a 2 Gbit file through even a
500 kbps pipe (about twice that average), it's going to take 4000
seconds (67 minutes) -- that's rather painful, and it inserts a LOT of
delay into the cycle of getting reviews and comments and doing the next
round of edits.
At 50 Mbps it takes only 40 seconds - annoying, but acceptable - and at
a gig, only 2 seconds.
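For anyone who wants to check the arithmetic, here's a quick
back-of-the-envelope script (Python; the file size, upload count, and
link rates are just the hypothetical numbers from my example above):

  # transfer-time arithmetic for the scenario above
  FILE_BITS = 250 * 8 * 10**6        # 250 MB presentation ~= 2 Gbit
  UPLOADS_PER_DAY = 5                # hypothetical crunch-time workload
  DAY_SECONDS = 12 * 3600            # a 12-hour work day

  # average demand over the whole day
  avg_bps = FILE_BITS * UPLOADS_PER_DAY / DAY_SECONDS
  print("average demand: %.0f kbps" % (avg_bps / 1e3))   # ~231 kbps

  # time to move one file at each link speed
  for rate_bps, label in [(500e3, "500 kbps"), (50e6, "50 Mbps"), (1e9, "1 Gbps")]:
      seconds = FILE_BITS / rate_bps
      print("%s: %.0f s (%.1f min)" % (label, seconds, seconds / 60))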
So, tell me, with a straight face, that "what matters is average
transfer rate to the user experience."
Miles Fidelman
--
In theory, there is no difference between theory and practice.
In practice, there is. .... Yogi Berra