I am curious what the real-world bandwidth requirements are for live
sports streaming. I imagine that during periods of high motion, encoders
struggle.
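As a rough sense of scale (my own back-of-envelope arithmetic, not from any encoder spec), here is the average per-pixel bit budget at a few common 4K bitrates; high-motion sports eat into that budget because less of each frame can be predicted from the previous one:

```python
# Rough bits-per-pixel budget for a 4K60 stream at a few bitrates.
# Illustrative arithmetic only; real encoders (HEVC/AV1) vary bitrate
# with scene complexity, and high-motion content leaves fewer bits
# per pixel for detail unless the bitrate is raised.

WIDTH, HEIGHT, FPS = 3840, 2160, 60
pixels_per_second = WIDTH * HEIGHT * FPS

def bits_per_pixel(mbps: float) -> float:
    """Average bits available per pixel at a given stream bitrate."""
    return mbps * 1_000_000 / pixels_per_second

for mbps in (15, 25, 40):
    print(f"{mbps} Mbps -> {bits_per_pixel(mbps):.3f} bits/pixel")
```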

On Mon, Mar 18, 2024 at 12:42 PM Colin_Higbie via Starlink
<starlink@lists.bufferbloat.net> wrote:
>
> To the comments and question from Dave Collier-Brown in response to my saying 
> that we test latency for UX and Alex on 8K screens, both of these seem to 
> take a more academic view than I can address on what I view as commercial 
> subjects. By that, I mean that they seem to assume budget and market 
> preferences are secondary considerations rather than the primary driving 
> forces they are to me.
>
> From my perspective, end user/customer experience is ultimately the only 
> important metric, where all others are just tools to help convert UX into 
> something more measurable and quantifiable. To be clear, I fully respect the 
> importance of being able to quantify these things, so those metrics have 
> value, but they should always serve as ways to proxy the UX, not a target 
> unto themselves. If you're designing a system that needs minimal lag for 
> testing your new quantum computer or to use in place of synchronized clocks 
> for those amazing radio-telescope images of black holes, then your needs may be 
> different, but if you're talking about how Internet providers measure their 
> latency and bandwidth for sales to millions or billions of homes and 
> businesses, then UX based on mainstream applications is what matters.
>
> To the specifics:
>
> No, we (our company) don't have a detailed latency testing method. We test 
> purely for UX. If users or our QA team report a lag, that's bad and we work 
> to fix it. If QA and users are happy with that and negative feedback is 
> in other areas unrelated to lag (typically the case), then we deem our 
> handling of latency as "good enough" and focus our engineering efforts on the 
> problem areas or on adding new features. Now, I should acknowledge, this is 
> largely because our application is not particularly latency-sensitive. If it 
> were, we probably would have a lag check as part of our standard automated 
> test bed. For us, as long as our application starts to provide our users with 
> streaming access to our data within a second or so, that's good enough.
>
> I realize good-enough is not a hard metric by itself, but it's ultimately the 
> only factor that matters to most users. The exception would be some very 
> specific use cases where 1ms of latency delta makes a difference, like for 
> some stock market transactions and competitive e-sports.
>
> To convert the nebulous term "good enough" into actual metrics that ISPs and 
> other providers can use to quantify their service, I stand by my prior point 
> that the industry could establish needed metrics per application. VoIP has 
> stricter latency needs than web browsing. Cloud-based gaming has still 
> stricter latency requirements. There would be some disagreement on what 
> exactly is "good enough" for each of those, but I'm confident we could reach 
> numbers for them, whether by survey and selecting the median, by reported 
> complaints based on service to establish a minimum acceptable level, or by 
> some other method. I doubt there's significant variance on what qualifies as 
> good-enough for each application.
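
The per-application idea above could be sketched as something like the following. The threshold numbers are placeholders for discussion, not settled industry figures (the VoIP value is roughly in line with ITU-T G.114's commonly cited 150 ms one-way target for voice):

```python
# Illustrative sketch of per-application "good enough" latency checks.
# Thresholds are placeholders, not agreed industry numbers.

GOOD_ENOUGH_LATENCY_MS = {
    "web_browsing": 500,   # placeholder
    "voip": 150,           # commonly cited one-way voice target
    "cloud_gaming": 50,    # placeholder; stricter than VoIP
}

def meets_good_enough(app: str, measured_ms: float) -> bool:
    """True if a measured latency satisfies the application's threshold."""
    return measured_ms <= GOOD_ENOUGH_LATENCY_MS[app]

print(meets_good_enough("voip", 120))         # True
print(meets_good_enough("cloud_gaming", 80))  # False
```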
>
> 4K vs Higher Resolution as Standard
> And regarding 4K TV as a standard, I'm surprised this is controversial. 4K is 
> THE high-end standard that defines bandwidth needs today. It is NOT 8K or 
> anything higher (similarly, in spite of those other capabilities you 
> mentioned, CDs are also still 44.1kHz (48kHz is for DVD), with musical 
> fidelity at a commercial level having DECREASED, not increased, since most 
> sales and streaming occur using lower-quality MP3 files). That's not a 
> subjective statement; that is a fact. By "fact" I don't mean that no one 
> thinks 8K is nice or that higher isn't better, but that there is an 
> established industry standard that has already settled this. Netflix defines 
> it as 25Mbps. The other big streamers, Disney+, Max, and Paramount+ all 
> agree. 25Mbps is higher than is usually needed for 4K HDR content (10-15Mbps 
> can generally hit it, depending on the nature of the scenes where slow scenes 
> with a lot of solid background color like cartoons compress into less data 
> than fast-moving, visually complex scenes), but it's a good figure to use 
> because it includes a safety margin and, more importantly, it's what the 
> industry has already defined as the requirement. To me, this one is very 
> black and white and clear-cut, even more so than latency. If you're an 
> Internet provider and want to claim that your Internet supports modern 
> viewing standards for streaming, you must provide 25Mbps. I'm generally happy 
> to debate anything and acknowledge other points of view are just as valid as 
> my own, but I don't see this particular point as debatable, because it's a 
> defined fact by the industry. It's effectively too late to challenge this. At 
> best, you'd be fighting customers and content providers alike and to what 
> purpose?
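
A quick sanity check of those figures (my own arithmetic, illustrative only): raw uncompressed 4K60 HDR is enormous, which is why the 25Mbps delivery figure already implies heavy compression with room to spare over the 10-15Mbps typical case:

```python
# Raw 4K60 HDR (10-bit samples, 4:2:0 chroma subsampling = 1.5
# samples/pixel) versus the 25 Mbps streaming figure.
# Back-of-envelope arithmetic, not a published spec.

WIDTH, HEIGHT, FPS = 3840, 2160, 60
BITS_PER_SAMPLE, SAMPLES_PER_PIXEL = 10, 1.5

raw_bps = WIDTH * HEIGHT * FPS * BITS_PER_SAMPLE * SAMPLES_PER_PIXEL
stream_bps = 25_000_000

print(f"raw: {raw_bps / 1e9:.2f} Gbps")                   # ~7.46 Gbps
print(f"compression ratio: {raw_bps / stream_bps:.0f}:1")  # ~299:1
```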
>
> Will that 25Mbps requirement change in the future? Probably. It will probably 
> go up even though 4K HDR streaming will probably be achievable with less 
> bandwidth in the future due to further improvements in compression 
> algorithms. This is because, yeah, eventually maybe 8K or higher resolutions 
> will be a standard, or maybe there will be a higher bit depth HDR (that seems 
> slightly more likely to me). It's not at all clear though that's the case. At 
> some point, you reach a state where there is no benefit to higher 
> resolutions. Phones hit that point a few years ago and have stopped moving to 
> higher resolution displays. Currently, no major provider offers any 8K 
> content (just some experimental YouTube videos), and a person viewing 8K 
> would be unlikely to report any visual advantage over 4K (SD -> HD is 
> huge, HD -> 4K is noticeable, 4K -> 8K is imperceptible for 
> camera-recorded scenes at any standard viewing size).
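
The imperceptibility claim can be roughly checked with the standard 20/20-acuity figure of about 1 arcminute per resolvable detail. This is back-of-envelope geometry under assumed viewing conditions, not a vision-science result:

```python
# For a 65" 16:9 screen, find the viewing distance beyond which single
# 4K pixels can no longer be resolved by a 20/20 eye (~1 arcminute),
# so 8K's smaller pixels would add nothing.

import math

DIAGONAL_IN = 65.0
ASPECT_W, ASPECT_H = 16, 9
H_PIXELS_4K = 3840
ARCMIN_RAD = math.radians(1 / 60)  # ~0.000291 rad

diag_units = math.hypot(ASPECT_W, ASPECT_H)
width_in = DIAGONAL_IN * ASPECT_W / diag_units
pixel_in = width_in / H_PIXELS_4K

# Small-angle approximation: a pixel subtends pixel_in / distance radians.
distance_in = pixel_in / ARCMIN_RAD
print(f"4K pixels blur together beyond ~{distance_in / 12:.1f} feet")
```

Typical living-room distances (8-10 feet) are well past that point, which is consistent with the claim above.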
>
> Where 8K+ could make a difference would primarily be in rendered content (and 
> the handful of 8K sets sold today play to this market). Standard camera 
> lenses just don't capture a sharp enough picture to benefit from the extra 
> pixels (they can in some cases, but depth of field and human error render 
> these successes isolated to specific kinds of largely static landscape 
> scenes). If the innate fuzziness or blurriness in the image exceeds the size 
> of a pixel, then more pixels don't add any value. However, in a rendered 
> image, like in a video game, those are pixel perfect, so at least there it's 
> possible to benefit from a higher resolution display. But for that, even the 
> top of the line graphics today (Nvidia RTX 4090, now over a year old) can 
> barely generate 4K HDR content with path tracing active at reasonable 
> framerates (60 frames per second), and because of their high cost, those make 
> up only 0.23% of the market as of the most recent data I've seen (this will 
> obviously increase over time).
>
> I could also imagine AI may be able to reduce blurriness in captured video in 
> the future and sharpen it before sending it out to viewers, but we're not 
> there yet. For all these reasons, 8K will remain niche for the time being. 
> There's just no good reason for it. When the Super Bowl (one of the first to 
> offer 4K viewing) advertises that it can be viewed in 8K, that's when you 
> know it's approaching a mainstream option.
>
> On OLED screens and upcoming microLED displays that can achieve higher 
> contrast ratios than LCD, HDR is far more impactful to the viewing 
> experience than further pixel density increases. Current iterations of LCD 
> can't handle this, even though they claim to support HDR, which has given 
> many consumers the wrong impression that HDR is not a big deal. It is not on 
> LCDs because they cannot achieve the contrast ratios needed for impactful 
> HDR. At least not with today's technology, and probably never, just because 
> the advantages to microLED outweigh the benefits I would expect you could get 
> by improving LCD.
>
> So maybe we go from the current 10-bit/color HDR to something like 12 or 16 
> bit HDR. That could also increase bandwidth needs at the same 4K display 
> size. Or, maybe the next generation displays won't be screens but will be 
> entire walls built of microLED fabric that justify going to 16K displays at 
> hundreds of inches. At this point, you'd be close to displays that duplicate 
> a window to the outside world (but still far from the brightness of the sun 
> shining through). But there is nothing at that size that will be at consumer 
> scale in the next 10 years. It's at least that far out (12-bit or higher HDR 
> might land before that on 80-110" screens), and I suspect quite a bit further. It's 
> one thing to move to a larger TV, because there's already infrastructure for 
> that. On the other hand, to go to entire walls made of a display material 
> would need an entirely different supply chain, different manufacturers, 
> installers, cultural change in how we watch and use it, etc. Those kinds of 
> changes take decades.
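
To first order, raw sample data scales linearly with bit depth, so the bandwidth impact of deeper HDR can be sketched naively (compressed bitrates scale less simply, but directionally the same; illustrative arithmetic only):

```python
# Naive linear scaling of bitrate with HDR bit depth, applied to the
# 25 Mbps 4K figure discussed above. Real codec behavior will differ.

def scaled_bitrate(base_mbps: float, base_bits: int, new_bits: int) -> float:
    """Scale a bitrate proportionally to the change in bit depth."""
    return base_mbps * new_bits / base_bits

base = 25.0  # Mbps, today's 10-bit 4K HDR streaming figure
for bits in (12, 16):
    print(f"{bits}-bit: ~{scaled_bitrate(base, 10, bits):.0f} Mbps")
```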
>
> Cheers,
> Colin
>
>
> Date: Sun, 17 Mar 2024 12:17:11 -0400
> From: Dave Collier-Brown <dave.collier-br...@indexexchange.com>
> To: starlink@lists.bufferbloat.net
> Subject: [Starlink] Sidebar to It’s the Latency, FCC: Measure it?
> Message-ID: <e0f9affe-f205-4f01-9ff5-3dc93abc3...@indexexchange.com>
> Content-Type: text/plain; charset=UTF-8; format=flowed
>
> On 2024-03-17 11:47, Colin_Higbie via Starlink wrote:
>
> > Fortunately, in our case, even high latency shouldn't be too terrible, but 
> > as you rightly point out, if there are many iterations, 1s minimum latency 
> > could yield a several second lag, which would be poor UX for almost any 
> > application. Since we're no longer testing for that on the premise that 1s 
> > minimum latency is no longer a common real-world scenario, it's possible 
> > those painful lags could creep into our system without our knowledge.
>
> Does that suggest that you should have an easy way to see if you're 
> unexpectedly delivering a slow service? A tool that reports your RTT to 
> customers and an alert on it being high for a significant period might be 
> something all ISPs want, even ones like mine, who just want it to be able to 
> tell a customer "you don't have a network problem" (;-))
>
> And the FCC might find the data illuminating
>
> --dave
>
> --
> David Collier-Brown,         | Always do right. This will gratify
> System Programmer and Author | some people and astonish the rest
> dave.collier-br...@indexexchange.com |              -- Mark Twain
>
>
> ------------------------------
>
> Message: 2
> Date: Sun, 17 Mar 2024 18:00:42 +0100
> From: Alexandre Petrescu <alexandre.petre...@gmail.com>
> To: starlink@lists.bufferbloat.net
> Subject: Re: [Starlink] It’s the Latency, FCC
> Message-ID: <b0b5db3c-baf4-425a-a2c6-38ebc4296...@gmail.com>
> Content-Type: text/plain; charset=UTF-8; format=flowed
>
>
> Le 16/03/2024 à 20:10, Colin_Higbie via Starlink a écrit :
> > Just to be clear: 4K is absolutely a standard in streaming, with that being 
> > the most popular TV being sold today. 8K is not and likely won't be until 
> > 80+" TVs become the norm.
>
> I can agree that screen size is one aspect pushing higher resolutions toward 
> acceptance, but there are further signs indicating that 8K is just around 
> the corner, with 16K right after it.
>
> Consumer recording devices (cameras) have already been doing 8K recording 
> cheaply for a couple of years.
>
> New acronyms beyond simple resolutions are always ready to come up. HDR 
> (high dynamic range) was such an acronym accompanying 4K, so for 8K there 
> might be another, bringing more than just resolution: even more dynamic 
> range, blacker blacks, a wider gamut, goggles, etc., for the same screen size.
>
> 8K and 16K playback devices might not yet have a surface to exhibit their 
> full power, but when such surfaces become available, these 8K and 16K 
> devices will be ready for them, whereas 4K devices will not.
>
> A similar evolution can be seen in sound and in crypto: the 44.1kHz CD was 
> enough for all, until 88kHz SACD came about, then DSD64, DSD128, and today 
> DSD1024, which suggests DSD2048 tomorrow. Likewise Dolby Atmos and 
> 11.1 outputs. These too don't yet have the speakers, nor the ears, to take 
> advantage of them, but in the future they might. In crypto, 'post-quantum' 
> algorithms are designed to resist brute force by computers that don't yet 
> exist publicly (machines in the few-hundred-qubit range exist, but a 
> 20,000-qubit-range computer would be needed), but when they do, these 
> crypto algorithms will be ready.
>
> Given that, one could imagine the bandwidth and latency required by a 3D 16K 
> DSD1024 quantum-resistant-ciphered multi-party videoconference with gloves, 
> goggles, and other interacting devices, running at low latency over Starlink.
>
> The growth trends (4K...) can be identified and the needed latency numbers 
> can be projected.
>
> Alex
> _______________________________________________
> Starlink mailing list
> Starlink@lists.bufferbloat.net
> https://lists.bufferbloat.net/listinfo/starlink



--
https://www.youtube.com/watch?v=N0Tmvv5jJKs Epik Mellon Podcast
Dave Täht CSO, LibreQos
