David Fernández, those bitrates are safe numbers, but many streams could get by
with less at those resolutions. H.265 compresses at a variable bit rate, with
simpler scenes requiring less bandwidth. Note that 4K with HDR (30 bits per
pixel rather than 24) also consistently fits within 25Mbps.
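As a rough sanity check on the 30-vs-24-bit point, here is a minimal arithmetic
sketch in Python; the 20 Mbit/s SDR stream figure is an illustrative assumption,
not a measured value, and linear scaling is a worst case (real encoders do better
with 10-bit profiles and chroma subsampling):

# HDR raises raw bits per pixel from 24 to 30, a 1.25x increase.
# If a typical 4K SDR stream needs ~20 Mbit/s (illustrative assumption),
# even worst-case linear scaling still fits within 25 Mbit/s.
sdr_bits_per_pixel = 24
hdr_bits_per_pixel = 30
assumed_sdr_stream_mbps = 20.0  # hypothetical 4K SDR encode

hdr_overhead = hdr_bits_per_pixel / sdr_bits_per_pixel  # 1.25
estimated_hdr_mbps = assumed_sdr_stream_mbps * hdr_overhead

print(f"HDR overhead factor: {hdr_overhead:.2f}x")
print(f"Estimated 4K HDR stream: {estimated_hdr_mbps:.1f} Mbit/s (<= 25)")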
David Lang, HDR must be accommodated in any bandwidth requirement for 4K
programming. That is not to say that all 4K streams are in HDR, but because 4K
signals can include HDR, a required bandwidth for 4K must allow for it. That
said, I believe all modern 4K programming on Netflix and Amazon Prime is HDR.
Note David Fernández's point that Spain independently reached the same
conclusion as the US streaming services: 25Mbps as the requirement for 4K.
Visually, to a person watching on an OLED (or microLED) display capable of
showing the full color and contrast gamut of HDR (LCD can't really do it
justice, even with miniLED backlighting), the move from SDR to HDR is more
meaningful in most situations than the move from 1080p to 4K. I don't believe
resolutions beyond 4K (e.g., 8K) will add anything meaningful for a movie or
television viewer. Video games could benefit from the added resolution, but
lens aberration in cameras, along with focal length and limited depth of field,
means that even a sharp picture is blurrier than the pixel size in most scenes
beyond about 4K - 5.5K. Video games don't suffer this problem because their
scenes are rendered, eliminating camera lenses entirely. So video games may
still benefit from 8K resolution, but streamed programming won't.
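To make the optics argument concrete, here is a hedged back-of-the-envelope
sketch comparing sensor pixel pitch against the diffraction (Airy disk) blur
for a cine-style camera. The sensor width, aperture, and wavelength are
illustrative assumptions, not measurements of any particular camera:

# At what horizontal resolution does lens diffraction blur exceed the
# sensor's pixel pitch? All parameters are illustrative assumptions.
AIRY_FACTOR = 2.44       # Airy disk diameter ~ 2.44 * wavelength * f-number
WAVELENGTH_UM = 0.55     # green light, micrometers
F_NUMBER = 4.0           # common working aperture (assumption)
SENSOR_WIDTH_MM = 24.9   # Super 35-style sensor width (approximate)

airy_um = AIRY_FACTOR * WAVELENGTH_UM * F_NUMBER  # ~5.4 um blur spot

for name, h_pixels in [("1080p", 1920), ("4K", 3840), ("5.5K", 5472), ("8K", 7680)]:
    pitch_um = SENSOR_WIDTH_MM * 1000 / h_pixels
    limited = "lens-limited" if airy_um > pitch_um else "pixel-limited"
    print(f"{name}: pixel pitch {pitch_um:.1f} um vs blur {airy_um:.1f} um -> {limited}")

Under these assumptions the crossover lands between 4K and 5.5K, consistent
with the range above; a faster lens or larger sensor shifts it somewhat.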
There is precedent for this in the audio streaming world: audio streaming
bitrates have retreated from prior peaks. Even though the 48kHz, higher-bit-depth
audio available on DVD is superior to the quality of 44.1kHz CDs, Spotify,
Apple, and most other streaming services stream music at LOWER quality than
CD. It's good enough that most people don't notice the difference. I don't see
much push in the foreseeable future for programming beyond UHD (4K + HDR).
That's not to say never, but there's no real benefit to it with current camera
tech and screen sizes.
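For reference, the arithmetic behind the audio comparison, as a quick sketch
(the streaming-tier bitrates are typical published figures, cited from memory):

# CD audio is uncompressed PCM: sample rate * bit depth * channels.
cd_kbps = 44_100 * 16 * 2 / 1000    # ~1411 kbit/s
dvd_kbps = 48_000 * 24 * 2 / 1000   # ~2304 kbit/s (24-bit/48kHz stereo PCM)

# Typical lossy streaming tiers (approximate, from memory):
spotify_high_kbps = 320             # Ogg Vorbis "Very High"
apple_aac_kbps = 256                # AAC

print(f"CD: {cd_kbps:.0f} kbit/s, DVD-quality PCM: {dvd_kbps:.0f} kbit/s")
print(f"Streaming tiers: {apple_aac_kbps}-{spotify_high_kbps} kbit/s "
      f"(~{cd_kbps / spotify_high_kbps:.0f}x less than CD)")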
Conclusion: for video streaming needs over the next decade or so, 25Mbps should
be appropriate. As David Fernández rightly points out, H.266 and other future
codecs will improve compression and reduce bandwidth needs at any given
resolution and color bit depth, adding a bit more headroom for small
improvements.
Cheers,
Colin
-----Original Message-----
From: Starlink <starlink-boun...@lists.bufferbloat.net> On Behalf of starlink-requ...@lists.bufferbloat.net
Sent: Tuesday, April 30, 2024 9:31 AM
To: starlink@lists.bufferbloat.net
Subject: Starlink Digest, Vol 37, Issue 9
Message: 2
Date: Tue, 30 Apr 2024 11:54:20 +0200
From: David Fernández <davidf...@gmail.com>
To: starlink <starlink@lists.bufferbloat.net>
Subject: Re: [Starlink] It’s the Latency, FCC
Message-ID:
<CAC=tz0rrmwjunlvgupw6k8ogadcylq-eyw7bjb209ondwgf...@mail.gmail.com>
Content-Type: text/plain; charset="utf-8"
Last February, TV broadcasting in Spain definitively left SD behind and moved
to HD as the standard quality, also beginning to regularly broadcast a channel
in 4K quality.
A 4K video (2160p) at 30 frames per second, handled with the HEVC compression
codec (H.265), and using 24 bits per pixel, requires 25 Mbit/s.
Full HD video (1080p) requires 10 Mbit/s.
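Those figures imply very aggressive compression. A quick calculation of the raw
(uncompressed) rates, using exactly the parameters quoted above, shows the
ratio:

# Raw bitrate = width * height * fps * bits_per_pixel.
def raw_mbps(width, height, fps, bpp):
    return width * height * fps * bpp / 1e6

raw_4k = raw_mbps(3840, 2160, 30, 24)  # ~5972 Mbit/s
raw_hd = raw_mbps(1920, 1080, 30, 24)  # ~1493 Mbit/s

print(f"4K raw: {raw_4k:.0f} Mbit/s -> 25 Mbit/s is ~{raw_4k / 25:.0f}:1 compression")
print(f"1080p raw: {raw_hd:.0f} Mbit/s -> 10 Mbit/s is ~{raw_hd / 10:.0f}:1 compression")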
For lots of 4K video encoded at < 20 Mbit/s, it may be hard to distinguish
visually from the HD version of the same video (this was also confirmed by
SBTVD Forum tests).
Then, 8K will come, eventually, requiring a minimum of ~32 Mbit/s:
https://dvb.org/news/new-generation-of-terrestrial-services-taking-shape-in-europe
The latest codec, VVC (H.266), may reduce the required data rates by at least
27%, at the expense of more computing power, though it is nonetheless claimed
that it will be more energy efficient.
https://dvb.org/news/dvb-prepares-the-way-for-advanced-4k-and-8k-broadcast-and-broadband-television
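Applying the claimed >= 27% saving to the rates above gives a rough picture of
what H.266 could mean (a sketch; the 27% is the quoted figure, not a
measurement):

# Apply the quoted ">= 27%" VVC saving to the HEVC rates above.
vvc_saving = 0.27
for label, hevc_mbps in [("4K", 25), ("8K", 32)]:
    vvc_mbps = hevc_mbps * (1 - vvc_saving)
    print(f"{label}: {hevc_mbps} Mbit/s (HEVC) -> ~{vvc_mbps:.1f} Mbit/s (VVC)")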
Regards,
David
Date: Mon, 29 Apr 2024 19:16:27 -0700 (PDT)
From: David Lang <da...@lang.hm>
To: Colin_Higbie <chigb...@higbie.name>
Cc: David Lang <da...@lang.hm>, "starlink@lists.bufferbloat.net"
<starlink@lists.bufferbloat.net>
Subject: Re: [Starlink] Itʼs the Latency, FCC
Message-ID: <srss5qrq-7973-5q87-823p-30pn7o308...@ynat.uz>
Content-Type: text/plain; charset="utf-8"; Format="flowed"
Amazon, YouTube set explicitly to 4K (I didn't say HDR)
David Lang
On Tue, 30 Apr 2024, Colin_Higbie wrote:
Date: Tue, 30 Apr 2024 01:30:21 +0000
From: Colin_Higbie <chigb...@higbie.name>
To: David Lang <da...@lang.hm>
Cc: "starlink@lists.bufferbloat.net" <starlink@lists.bufferbloat.net>
Subject: RE: [Starlink] Itʼs the Latency, FCC
Was that 4K HDR (not SDR) using the standard protocols that streaming services
use (Netflix, Amazon Prime, Disney+, etc.), or was it just some YouTube 4K SDR
videos? YouTube will show "HDR" on the gear icon for content that's 4K HDR. If
it only shows "4K" instead of "HDR," then it's SDR.
Note that YouTube, if left at the default of Auto for streaming resolution,
will automatically drop the quality to something that fits within the
available bandwidth, and most of the "4K" content on YouTube is low-quality and
not true UHD content (even beyond missing HDR). For example, many smartphones
will record 4K video, but their optics are not sufficient to actually capture
distinct per-pixel image detail, meaning the video compresses down to a smaller
stream with no real additional loss in picture quality, but only because it
isn't truly 4K-detail content to begin with.
Note that 4K video compression codecs are lossy, so the lower the quality of
the initial image, the lower the bandwidth needed to convey the stream without
additional quality loss. The needed bandwidth also changes with scene
complexity. Falling confetti, like on New Year's Eve or at the Super Bowl,
makes for one of the most demanding scenes. Lots of detailed fire and
explosions with fast-moving, fast-panning, fully dynamic backgrounds are also
tough for a compressed signal to preserve (but not as hard as a screen full of
falling confetti).
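One way to see this scene-dependence directly is to bucket a stream's packet
sizes per second of presentation time. A sketch using ffprobe (part of FFmpeg);
it assumes ffprobe is on your PATH, and "input.mp4" is a hypothetical local
file:

# Sum a video stream's bytes per second of presentation time to see
# how the bitrate swings with scene complexity.
import json
import subprocess
from collections import defaultdict

out = subprocess.run(
    ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
     "-show_packets", "-of", "json", "input.mp4"],
    capture_output=True, text=True, check=True,
).stdout

bytes_per_second = defaultdict(int)
for pkt in json.loads(out)["packets"]:
    second = int(float(pkt.get("pts_time") or pkt.get("dts_time") or 0))
    bytes_per_second[second] += int(pkt["size"])

for second in sorted(bytes_per_second):
    mbps = bytes_per_second[second] * 8 / 1e6
    print(f"t={second:4d}s  {mbps:6.2f} Mbit/s")

On a constant-quality (e.g., CRF) encode, the per-second rate visibly spikes on
confetti-like scenes and sags on static ones.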
I'm dubious that 8Mbps can handle that except for some of the simplest
video, like cartoons or fairly static scenes like the news. Those scenes don't
require much data, but that's not the case for all 4K HDR scenes by any means.
It's obviously in Netflix and the other streaming services' interest to be
able to sell their more expensive 4K HDR service to as many people as
possible. There's a reason they won't offer it to anyone with less than 25Mbps
– they don't want the complaints and service calls. Now, to be fair, 4K HDR
typically doesn't require a full 25Mbps, but it's to their credit that they
include a small bandwidth buffer. In my experience monitoring bandwidth usage
for 4K HDR streaming, 15Mbps is the minimum if doing nothing else, and even
that will frequently fall short, depending on the 4K HDR content.
Cheers,
Colin
-----Original Message-----
From: David Lang <da...@lang.hm>
Sent: Monday, April 29, 2024 8:40 PM
To: Colin Higbie <colin.hig...@scribl.com>
Cc: starlink@lists.bufferbloat.net
Subject: Re: [Starlink] Itʼs the Latency, FCC
hmm, before my DSL got disconnected (the carrier decided they didn't want to
support it any more), I could stream 4K at 8Mb down if there wasn't too much
other activity on the network (doing so at 2x speed was a problem)
David Lang
On Fri, 15 Mar 2024, Colin Higbie via Starlink wrote:
Date: Fri, 15 Mar 2024 18:32:36 +0000
From: Colin Higbie via Starlink <starlink@lists.bufferbloat.net>
Reply-To: Colin Higbie <colin.hig...@scribl.com>
To: "starlink@lists.bufferbloat.net" <starlink@lists.bufferbloat.net>
Subject: Re: [Starlink] It’s the Latency, FCC
I have now been trying, for over 14 years, to break the common conflation that
download "speed" means anything at all for day-to-day, minute-to-minute,
second-to-second use once you crack 10mbit. Am I succeeding? I lost the 25/10
battle, and keep pointing at really terrible latency under load and wifi
weirdness in many existing 100/20 services today.
While I completely agree that latency has the bigger impact on how responsive
the Internet feels to use, I do think that 10Mbit is too low for some standard
applications regardless of latency: with the more recent availability of 4K
and higher streaming, a higher minimum bandwidth is required for those to work
at all. One could argue that no one NEEDS 4K streaming, but many families
would view it as an important part of what they do with their Internet
(Starlink makes this reliably possible at our farmhouse). 4K HDR-supporting
TVs are among the most popular TVs being purchased in the U.S. today, and
Netflix, Amazon, Max, Disney and other streaming services provide a
substantial portion of 4K HDR content.
So, I agree that 25/10 is sufficient for up to 4K HDR streaming. 100/20 would
provide plenty of bandwidth for multiple concurrent 4K users or one or two 8K
streams.
For me, not claiming any special expertise on market needs, just my own
personal assessment of what typical families will need and care about (a small
sketch encoding these thresholds follows the list):
- Latency: below 50ms under load always feels good except for some intensive
gaming (I don't see any benefit to getting loaded latency further below ~20ms
for typical applications, with an exception for cloud-based gaming, which
benefits from lower latency all the way down to about 5ms for young, really
fast players; the rest of us won't be able to tell the difference)
- Download Bandwidth: 10Mbps is good enough if not doing UHD video streaming
- Download Bandwidth: 25 - 100Mbps if doing UHD video streaming, depending on
# of streams or if wanting to be ready for 8K
- Upload Bandwidth: 10Mbps is good enough for quality video conferencing;
higher is only needed for multiple concurrent outbound streams
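As promised, a quick sketch encoding those thresholds; the numbers are just my
personal figures above, and the function name and parameters are hypothetical:

# A quick encoding of the personal "good enough" thresholds above.
def good_enough(loaded_latency_ms: float, down_mbps: float, up_mbps: float,
                uhd_streams: int = 1) -> bool:
    latency_ok = loaded_latency_ms <= 50
    # ~25 Mbit/s per concurrent UHD stream, 10 Mbit/s floor otherwise.
    down_needed = 25 * uhd_streams if uhd_streams else 10
    up_ok = up_mbps >= 10  # quality video conferencing
    return latency_ok and down_mbps >= down_needed and up_ok

print(good_enough(50, 25, 10))  # True: meets all three for one UHD stream
print(good_enough(1, 10, 10))   # False: super-low latency can't fix bandwidth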
So, for example (and ignoring upload for this), I would rather have latency of
50ms (under load) and DL bandwidth of 25Mbps than latency of 1ms with a max
bandwidth of 10Mbps, because the super-low latency doesn't solve the problem
of insufficient bandwidth to watch 4K HDR content. But I'd also rather have
latency of 20ms with 100Mbps DL than latency that exceeds 100ms under load
with 1Gbps DL bandwidth. I think the important thing is to reach "good enough"
on both, not just excel at one while falling short of "good enough" on the
other.
Note that Starlink handles all of this well, including kids watching YouTube
while my wife and I watch 4K UHD Netflix, except that the upload speed
occasionally tops out at under 3Mbps for me, causing quality degradation for
outbound video calls (or used to – it seems to have gotten better in recent
months, with no problems since sometime in 2023).
Cheers,
Colin
_______________________________________________
Starlink mailing list
Starlink@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/starlink