one person's 'wasteful resolution' is another person's 'large enhancement'
going from 1080p to 4k video is not being wasteful, it's opting to use the
bandwidth in a different way.
saying that it's wasteful for someone to choose to do something is saying that
you know better what their priorities should be.
I agree that increasing resources allow programmers to be lazier and write apps
that are bigger, but they are also writing them in less time.
What right do you have to say that the programmer's time is less important than
the ram/bandwidth used?
I agree that it would be nice to have more people write better code, but
everything, including this, has trade-offs.
David Lang
On Fri, 15 Mar 2024, Spencer Sevilla via Starlink wrote:
Your comment about 4k HDR TVs got me thinking about the bandwidth “arms race”
between infrastructure and its clients. It’s a particular pet peeve of mine
that as any resource (bandwidth in this case, but the same can be said for
memory) becomes more plentiful, software engineers respond by wasting it until
it’s scarce enough to require optimization again. Feels like an awkward kind of
Malthusian inflation that ends up forcing us to buy newer/faster/better devices
to perform the same basic functions, which have barely changed at all.
I completely agree that no one “needs” 4K UHDR, but when we say this I think we
generally mean as opposed to a slightly lower quality tier, like regular HDR or 1080p. In
practice, I’d be willing to bet that there’s at least one poorly programmed TV out
there that doesn’t downgrade well or at all, so the tradeoff becomes "4K UHDR
or endless stuttering/buffering.” Under this (totally unnecessarily forced upon us!)
paradigm, 4K UHDR feels a lot more necessary, or we’ve otherwise arms raced
ourselves into a TV that can’t really stream anything. A technical downgrade from
literally the 1960s.
See also: The endless march of “smart appliances” and TVs/gaming systems that
require endless humongous software updates. My stove requires natural gas and
120VAC, and I like it that way. Other stoves require… how many Mbps to function
regularly? Other food for thought, I wonder how increasing minimum broadband
speed requirements across the country will encourage or discourage this
behavior among network engineers. I sincerely don’t look forward to a future in
which we all require 10Gbps to the house but can’t do much with it because it’s
all taken up by lightbulb software updates every evening. /rant
On Mar 15, 2024, at 11:41, Colin_Higbie via Starlink
<starlink@lists.bufferbloat.net> wrote:
I have now been trying, for over 14 years, to break the common conflation
that download "speed" means anything at all for day-to-day, minute-to-minute,
second-to-second use once you crack 10 Mbit. Am I succeeding? I lost the
25/10 battle, and keep pointing at really terrible latency under load and
wifi weirdnesses for many existing 100/20 services today.
While I completely agree that latency has a bigger impact on how responsive
the Internet feels to use, I do think that 10 Mbit is too low for some
standard applications regardless of latency: with the more recent
availability of 4K and higher streaming, that does require a higher minimum
bandwidth to work at all. One could argue that no one NEEDS 4K streaming,
but many families would view this as an important part of what they do with
their Internet (Starlink makes this reliably possible at our farmhouse). 4K
HDR-supporting TVs are among the most popular TVs being purchased in the
U.S. today. Netflix, Amazon, Max, Disney and other streaming services
provide a substantial amount of 4K HDR content.
So, I agree that 25/10 is sufficient for up to 4K HDR streaming. 100/20
would provide plenty of bandwidth for multiple concurrent 4K users or 1-2
8K streams.
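As a rough back-of-envelope check on those figures (the per-stream bitrates
below are my own assumptions based on commonly cited streaming
recommendations, not numbers from this thread), a few lines of Python show
how many concurrent streams fit in a 25 or 100 Mbps downlink with some
headroom left over for other household traffic:

    def max_streams(downlink_mbps, per_stream_mbps, headroom=0.8):
        # Leave ~20% of the link for everything else in the house.
        return int((downlink_mbps * headroom) // per_stream_mbps)

    # Assumed, illustrative per-stream bitrates in Mbps (not from this thread).
    bitrates = {"1080p": 5, "4K HDR": 20, "8K": 60}

    for link in (25, 100):
        for name, rate in bitrates.items():
            print(f"{link} Mbps down: {max_streams(link, rate)} x {name}")

With ~20 Mbps assumed per 4K HDR stream, 25 Mbps carries one such stream,
while 100 Mbps carries roughly four, or a single ~60 Mbps 8K stream, which
lines up with the 25/10 and 100/20 figures above.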
For me, not claiming any special expertise on market needs, just my own
personal assessment on what typical families will need and care about:
Latency: below 50ms under load always feels good except for some intensive
gaming (I don't see any benefit to getting loaded latency further below
~20ms for typical applications, with an exception for cloud-based gaming
that benefits from lower latency all the way down to about 5ms for young,
really fast players; the rest of us won't be able to tell the difference)
Download Bandwidth: 10Mbps good enough if not doing UHD video streaming
Download Bandwidth: 25-100Mbps if doing UHD video streaming, depending on
# of streams or if wanting to be ready for 8K
Upload Bandwidth: 10Mbps good enough for quality video conferencing, higher
only needed for multiple concurrent outbound streams
So, for example (and ignoring upload for this), I would rather have latency
of 50ms (under load) and DL bandwidth of 25Mbps than latency of 1ms with a
max bandwidth of 10Mbps, because the super-low latency doesn't solve the
problem of insufficient bandwidth to watch 4K HDR content. But I'd also
rather have latency of 20ms with 100Mbps DL than latency that exceeds 100ms
under load with 1Gbps DL bandwidth. I think the important thing is to reach
"good enough" on both, not just excel at one while falling short of "good
enough" on the other.
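To make that "good enough on both" rule concrete, here is a minimal sketch
(thresholds taken from the list above and connection examples from this
paragraph; both are illustrative, not measurements) that accepts a link only
when it clears both the loaded-latency bar and the bandwidth bar:

    # Thresholds from the assessment above; purely illustrative.
    GOOD_LATENCY_MS = 50        # latency under load (working latency)
    GOOD_DOWNLOAD_MBPS = 25     # enough for one 4K HDR stream

    def good_enough(loaded_latency_ms, download_mbps):
        # Both bars must be cleared; excelling at one doesn't offset the other.
        return (loaded_latency_ms <= GOOD_LATENCY_MS
                and download_mbps >= GOOD_DOWNLOAD_MBPS)

    examples = [
        ("50 ms loaded / 25 Mbps", 50, 25),
        ("1 ms loaded / 10 Mbps", 1, 10),
        ("20 ms loaded / 100 Mbps", 20, 100),
        ("100+ ms loaded / 1 Gbps", 120, 1000),
    ]
    for label, lat, bw in examples:
        print(label, "->",
              "good enough" if good_enough(lat, bw) else "falls short")

Only the 50ms/25Mbps and 20ms/100Mbps links pass; the 1ms/10Mbps and
100ms/1Gbps links each excel on one axis while falling short on the other.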
Note that Starlink handles all of this well, including kids watching
YouTube while my wife and I watch 4K UHD Netflix, except that the upload
speed occasionally tops out at under 3Mbps for me, causing quality
degradation for outbound video calls (or used to; it seems to have gotten
better in recent months – no problems since sometime in 2023).
Cheers,
Colin
_______________________________________________
Starlink mailing list
Starlink@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/starlink