Hi Alex...

> On 16. Mar 2024, at 18:18, Alexandre Petrescu via Starlink 
> <starlink@lists.bufferbloat.net> wrote:
> 
> 
> On 15/03/2024 at 21:31, Colin_Higbie via Starlink wrote:
>> Spencer, great point. We certainly see that with RAM, CPU, and graphics 
>> power that the software just grows to fill up the space. I do think that 
>> there are still enough users with bandwidth constraints (millions of users 
>> limited to DSL and 7Mbps DL speeds) that it provides some pressure against 
>> streaming and other services requiring huge swaths of data for basic 
>> functions, but, to your point, if there were a mandate that everyone would 
>> have 100Mbps connection, I agree that would then quickly become saturated so 
>> everyone would need more.
>> 
>> Fortunately, the video compression codecs have improved dramatically over 
>> the past couple of decades from MPEG-1 to MPEG-2 to H.264 to VP9 and H.265. 
>> There's still room for further improvements, but I think we're probably 
>> getting to a point of diminishing returns on further compression 
>> improvements. Even with further improvements, I don't think we'll see 
>> bandwidth needs drop so much as improved quality at the same bandwidth, but 
>> this does offset the natural bloat-to-fill-available-capacity movement we 
>> see.
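As a rough illustration of that codec progress (a Python sketch; the halving-per-generation rule of thumb and the baseline bitrate are assumptions for illustration, not measured data):

    # Often-quoted rule of thumb: each codec generation needs roughly half
    # the bitrate of its predecessor for comparable quality. Factors and
    # baseline below are illustrative assumptions, not measurements.
    RELATIVE_BITRATE = {"MPEG-2": 1.00, "H.264": 0.50, "VP9": 0.30, "H.265": 0.25}
    HD_MPEG2_MBPS = 15.0  # assumed MPEG-2 baseline for 1080p broadcast quality

    for codec, factor in RELATIVE_BITRATE.items():
        print(f"{codec:>6}: ~{HD_MPEG2_MBPS * factor:4.1f} Mbps for comparable 1080p quality")

Even under this optimistic halving, two further generations would save at most another factor of four, which supports the diminishing-returns point.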
> 
> I think the 4K-latency discussion is a bit difficult, regardless of how great 
> the codecs are.
> 
> For one, 4K can be considered outdated by those who look forward to 8K (and 
> why not 16K?); so we should forget 4K.

[SM] Mmmh, numerically that might make sense; however, increasing the resolution 
of video material brings diminishing returns in perceived quality (the human 
visual system has limits...). I remember well how the steps from QVGA to 
VGA/SD to HD (720) to FullHD (1080) each resulted in an easily noticeable 
improvement in quality. However, now I have a hard time seeing an improvement 
(heck, even just noticing a difference) between FullHD and 4K material on our 
43" screen from a normal distance (I need to do immediate A/B comparisons from 
a short distance)...
        I am certainly not super sensitive/picky, but I guess others will reach 
the same point, maybe after 4K or after 8K. My point is that the potential for 
growth in resolution is limited by psychophysics (ultimately driven by the 
visual angle covered by individual photoreceptors in the fovea). And I am not 
sure whether, for normal screen sizes and distances, we are not already past 
that point at 4K...
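A quick back-of-the-envelope check of that psychophysics limit (a Python sketch; the ~1 arcminute acuity figure is the standard 20/20 value and individual thresholds vary, the 43" size is from above, a 16:9 panel is assumed):

    import math

    def max_useful_distance_m(diag_in, h_pixels, acuity_arcmin=1.0):
        """Viewing distance beyond which a single pixel subtends less than
        the acuity limit, i.e. extra resolution stops being visible."""
        width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 panel width
        pitch_m = width_m / h_pixels                         # pixel pitch
        acuity_rad = math.radians(acuity_arcmin / 60.0)
        return pitch_m / acuity_rad                          # small-angle approx.

    for name, px in [("FullHD", 1920), ("4K", 3840), ("8K", 7680)]:
        d = max_useful_distance_m(43, px)
        print(f'{name}: pixels resolvable only within ~{d:.2f} m on a 43" panel')

This puts the 4K limit at roughly 0.85 m on a 43" panel, so from a normal couch distance of 2-3 m even FullHD pixels are already below the acuity limit, consistent with the A/B experience described above.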

> 8K is already delivered from space by a Japanese provider, but not over IP.  
> So, if we discuss TV resolutions, we should look at these (8K, 16K, and why 
> not 3D 16K, for ever more demanding stress testing).

[SM] This might work as a marketing gimmick, but if 16K does not deliver a 
clearly superior experience, why bother putting in the extra effort and energy 
(and storage size and network capacity) to actually deliver something like that?
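For scale (a sketch; raw pixel count is only an upper bound on bitrate, since codecs compress higher resolutions more efficiently):

    # Pixel count, and hence the pressure on storage and network capacity,
    # grows with the square of the linear resolution.
    resolutions = {"FullHD": (1920, 1080), "4K": (3840, 2160),
                   "8K": (7680, 4320), "16K": (15360, 8640)}
    base = resolutions["4K"][0] * resolutions["4K"][1]
    for name, (w, h) in resolutions.items():
        print(f"{name:>6}: {w * h / 1e6:6.1f} Mpx, {w * h / base:5.2f}x the pixels of 4K")

16K carries 16x the pixels of 4K, so even with generous codec gains the delivery cost grows much faster than any perceptible benefit.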

> 
> Second, 4K etc. are for TV.  In TV, latency is rarely if ever an issue.  
> There are some rare cases where latency is very important in TV (I can think 
> of betting on sports, or time synchronization of clocks), but they don't 
> require latency as low as our typical videoconference or remote surgery

[SM] Can we please bury this example? "Remote surgery" over the best-effort 
Internet is a really, really bad idea, or something that should only ever be 
attempted as a very last resort. As a society we have already failed if we 
need to rely on something like that.


> or group music playing use-cases

[SM] That IMHO is a great example of a realistic low-latency use-case (exactly 
because the failure mode is not as catastrophic as in tele-surgery, this seems 
acceptable for a best-effort network).
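A minimal sketch of the latency budget involved (the ~25-30 ms one-way tolerance often cited for musicians keeping a common tempo is an assumption here, not from this thread):

    SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

    def equivalent_stage_distance_m(one_way_latency_ms):
        """Separation between two musicians whose natural acoustic delay
        equals the given one-way network latency."""
        return SPEED_OF_SOUND_M_S * one_way_latency_ms / 1000.0

    for ms in (5, 15, 25, 50):
        print(f"{ms:>3} ms one-way ~ {equivalent_stage_distance_m(ms):4.1f} m apart on stage")

At 25 ms one-way the network delay already mimics musicians standing ~8.6 m apart, about the limit for holding a common tempo, which is what makes group music playing such a demanding yet forgiving low-latency use-case.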


> on the Internet over Starlink.
> 
> So, I don't know how much 4K, 8K, or 16K might impose any new latency 
> requirement on Starlink.
> 
> Alex
> 
>> 
>> 
>> -----Original Message-----
>> From: Spencer Sevilla
>> Sent: Friday, March 15, 2024 3:54 PM
>> To: Colin_Higbie
>> Cc: Dave Taht via Starlink <starlink@lists.bufferbloat.net>
>> Subject: Re: [Starlink] It’s the Latency, FCC
>> 
>> Your comment about 4k HDR TVs got me thinking about the bandwidth “arms 
>> race” between infrastructure and its clients. It’s a particular pet peeve of 
>> mine that as any resource (bandwidth in this case, but the same can be said 
>> for memory) becomes more plentiful, software engineers respond by wasting it 
>> until it’s scarce enough to require optimization again. It feels like an 
>> awkward kind of Malthusian inflation that ends up forcing us to buy 
>> newer/faster/better devices to perform the same basic functions, which 
>> have hardly changed at all.
>> 
>> I completely agree that no one “needs” 4K UHDR, but when we say this I think 
>> we generally mean as opposed to a slightly lower-quality format, like regular HDR or 
>> 1080p. In practice, I’d be willing to bet that there’s at least one poorly 
>> programmed TV out there that doesn’t downgrade well or at all, so the 
>> tradeoff becomes "4K UHDR or endless stuttering/buffering.” Under this 
>> (totally unnecessarily forced upon us!) paradigm, 4K UHDR feels a lot more 
>> necessary, or we’ve otherwise arms raced ourselves into a TV that can’t 
>> really stream anything. A technical downgrade from literally the 1960s.
>> 
>> See also: The endless march of “smart appliances” and TVs/gaming systems 
>> that require endless humongous software updates. My stove requires natural 
>> gas and 120VAC, and I like it that way. Other stoves require… how many Mbps 
>> to function regularly? As other food for thought, I wonder how increasing 
>> minimum broadband speed requirements across the country will encourage or 
>> discourage this behavior among network engineers. I sincerely don’t look 
>> forward to a future in which we all require 10Gbps to the house but can’t do 
>> much with it because it’s all taken up by lightbulb software updates every 
>> evening. /rant
>> 
>>> On Mar 15, 2024, at 11:41, Colin_Higbie via Starlink 
>>> <starlink@lists.bufferbloat.net> wrote:
>>> 
>>>> I have now been trying, for over 14 years, to break the common conflation
>>>> that download "speed" means anything at all for day-to-day,
>>>> minute-to-minute, second-to-second use once you crack 10 Mbit. Am I
>>>> succeeding? I lost the 25/10 battle, and keep pointing at really
>>>> terrible latency under load and wifi weirdnesses for many existing 100/20 
>>>> services today.
>>> While I completely agree that latency has a bigger impact on how responsive 
>>> the Internet feels to use, I do think that 10Mbit is too low for some 
>>> standard applications regardless of latency: the more recent availability 
>>> of 4K and higher streaming does require a higher minimum bandwidth to work 
>>> at all. One could argue that no one NEEDS 4K streaming, but many families 
>>> would view it as an important part of what they do with their Internet 
>>> (Starlink makes this reliably possible at our farmhouse). 4K HDR-supporting 
>>> TVs are among the most popular TVs being purchased in the U.S. today. 
>>> Netflix, Amazon, Max, Disney and other streaming services provide a 
>>> substantial portion of their content in 4K HDR.
>>> 
>>> So, I agree that 25/10 is sufficient for up to 4K HDR streaming. 100/20 
>>> would provide plenty of bandwidth for multiple concurrent 4K users or 1-2 
>>> 8K streams.
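A rough check of those figures (a sketch; the per-stream bitrates are assumptions in the ballpark of what streaming services cite, and actual numbers depend on codec and service):

    STREAM_MBPS = {"1080p": 5, "4K HDR": 20, "8K": 70}  # assumed per-stream bitrates

    def streams_supported(link_mbps, kind):
        """Concurrent streams of a given kind that fit in the downlink."""
        return int(link_mbps // STREAM_MBPS[kind])

    for link in (10, 25, 100):
        print(f"{link:>3} Mbps down: {streams_supported(link, '4K HDR')} x 4K HDR, "
              f"{streams_supported(link, '8K')} x 8K")

With these assumptions, 25 Mbps carries one 4K HDR stream while 100 Mbps carries several, or one 8K stream, matching the estimate above.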
>>> 
>>> For me, not claiming any special expertise on market needs, just my own 
>>> personal assessment on what typical families will need and care about:
>>> 
>>> Latency: below 50ms under load always feels good except for some
>>> intensive gaming (I don't see any benefit to getting loaded latency
>>> further below ~20ms for typical applications, with an exception for
>>> cloud-based gaming, which benefits from lower latency all the way down
>>> to about 5ms for young, really fast players; the rest of us won't be
>>> able to tell the difference)
>>> 
>>> Download Bandwidth: 10Mbps good enough if not doing UHD video
>>> streaming
>>> 
>>> Download Bandwidth: 25 - 100Mbps if doing UHD video streaming,
>>> depending on # of streams or if wanting to be ready for 8k
>>> 
>>> Upload Bandwidth: 10Mbps good enough for quality video conferencing,
>>> higher only needed for multiple concurrent outbound streams
>>> 
>>> So, for example (and ignoring upload for this), I would rather have latency 
>>> at 50ms (under load) and DL bandwidth of 25Mbps than latency of 1ms with a 
>>> max bandwidth of 10Mbps, because the super-low latency doesn't solve the 
>>> problem of insufficient bandwidth to watch 4K HDR content. But I'd also 
>>> rather have latency of 20ms with 100Mbps DL than latency that exceeds 
>>> 100ms under load with 1Gbps DL bandwidth. I think the important thing is to 
>>> reach "good enough" on both, not just excel at one while falling short of 
>>> "good enough" on the other.
>>> 
>>> Note that Starlink handles all of this well, including kids watching 
>>> YouTube while my wife and I watch 4K UHD Netflix, except that the upload 
>>> speed occasionally tops out at under 3Mbps for me, causing quality 
>>> degradation for outbound video calls (or used to; it seems to have gotten 
>>> better in recent months – no problems since sometime in 2023).
>>> 
>>> Cheers,
>>> Colin
>>> 

_______________________________________________
Starlink mailing list
Starlink@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/starlink
