On 16/03/2024 at 20:10, Colin_Higbie via Starlink wrote:
Just to be clear: 4K is absolutely a standard in streaming, with 4K sets being the
most popular TVs sold today. 8K is not and likely won't be until 80+" TVs
become the norm.
I agree that screen size is one factor pushing higher resolutions toward
acceptance, but there are other signs indicating that 8K is just around
the corner, with 16K right behind it.
Consumer recording devices (cameras) have already been doing 8K recording
cheaply for a couple of years now.
New acronyms beyond simple resolution numbers always come along. HDR
(high dynamic range) was such an acronym accompanying 4K, so for 8K
there might be another, bringing more than just resolution: perhaps even
more dynamic range, blacker blacks, a wider gamut, support for goggles, etc., for
the same screen size.
8K and 16K playback devices might not yet have a display surface that can exhibit
their full capability, but when such surfaces become available, those 8K and 16K
devices will be ready for them, whereas 4K devices will not.
A similar evolution can be seen in sound and in crypto: the 44.1 kHz CD was
enough for everyone, until 88 kHz SACD came along, then DSD64, DSD128 and
today DSD1024, which suggests DSD2048 tomorrow. Add Dolby Atmos and
11.1-channel outputs. These, too, don't yet have the speakers, or the ears, to
take full advantage of them, but in the future they might. In crypto, the
'post-quantum' algorithms are designed to resist attacks by quantum
computers that don't yet exist publicly (machines in the few-hundred-qubit
range exist, but something in the 20,000-qubit range would be needed), but when
they do exist, these crypto algorithms will be ready.
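
For a sense of scale, here is a minimal sketch of the per-channel data rates
behind those audio formats (the only inputs are the standard definitions:
CD is 44.1 kHz PCM at 16 bits per sample, and DSDn runs at n x 44.1 kHz with
1-bit samples; the code is just arithmetic):

# Per-channel data rates for the audio formats mentioned above.
CD_RATE_HZ = 44_100

def pcm_kbps(sample_rate_hz, bits_per_sample):
    return sample_rate_hz * bits_per_sample / 1e3

def dsd_kbps(multiple_of_cd):
    # DSD uses 1-bit samples at a multiple of the CD sample rate
    return CD_RATE_HZ * multiple_of_cd / 1e3

print(f"CD (44.1 kHz, 16-bit PCM): {pcm_kbps(CD_RATE_HZ, 16):.1f} kbit/s per channel")
for n in (64, 128, 1024, 2048):
    print(f"DSD{n}: {dsd_kbps(n):.1f} kbit/s per channel")

DSD1024 already works out to roughly 45 Mbit/s per channel, so the growth
curve in audio alone is steep.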
Given that, one could imagine the bandwidth and latency required by a 3D 16K,
DSD1024, quantum-resistant-encrypted, multi-party videoconference with
gloves, goggles and other interactive devices, running at low latency over
Starlink.
The growth trends (4K and beyond) can be identified and the needed bandwidth
and latency numbers can be projected, as sketched below.
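
A rough, purely illustrative sketch of such a projection, assuming (these are
assumptions, not measurements) that bitrate scales roughly with pixel count at
constant codec efficiency, that 3D doubles it, and taking the ~20 Mbps 4K HDR
figure discussed later in this thread as the baseline:

# Hypothetical bandwidth projection from the 4K HDR baseline.
# Assumptions (not measurements): bitrate scales roughly linearly with pixel
# count at constant codec efficiency; 3D (stereoscopic) roughly doubles it.
BASELINE_4K_MBPS = 20  # effective minimum for a 4K HDR stream (see below)

pixel_factor = {"4K": 1, "8K": 4, "16K": 16}  # pixel count relative to 4K

for name, factor in pixel_factor.items():
    mono = BASELINE_4K_MBPS * factor
    stereo_3d = 2 * mono
    print(f"{name}: ~{mono} Mbps per stream, ~{stereo_3d} Mbps for 3D (assumed)")

In practice newer codecs spend fewer bits per pixel, so these numbers are an
upper-end guess, not a forecast.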
Alex
The few 8K streaming videos that exist are available primarily as YouTube
curiosities, with virtually no displays on the market that support it yet, and
none of the big content providers like Netflix or Disney+ providing 8K streams.
Virtually all modern streaming programming on Netflix and Disney+ is 4K HDR.
That is the standard to support.
The objective quality difference to the average human eye between SD and HD is
huge and changes whether you see the shape of a face or the detailed expression
on a face. Completely different viewing experience. The difference between HD
and 4K is significant on today's larger TV displays (not so visible on the
smaller displays that populated living rooms in prior decades). On an OLED TV
(not so much on an LCD) the difference between SDR and HDR is bigger than the
difference between HD and 4K. But because HDR generally comes with 4K and tends
not to be used much on HD streams, the real standards to contrast are HD (in
SDR) and 4K (in HDR).
The minimum bandwidth needed to reliably provide a 4K HDR stream is about
15Mbps. Because of the way video compression works, a simpler scene may get by
with less than 10Mbps. A complex scene (fire, falling confetti like at the end
of the Super Bowl) can push this up to near 20Mbps. Assuming some background
activity on a typical network, safest is to think of 20Mbps as the effective
minimum for 4K. Netflix says 25Mbps to add an extra safety margin.
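
To see how much the codec is doing here, a small back-of-the-envelope
comparison of raw 4K HDR frame data against those stream rates (the frame
geometry and bit depth are standard; the 15-25 Mbps figures are the ones
above; chroma subsampling is ignored, which overstates the raw figure):

# Raw vs. compressed 4K HDR bitrate, to show the compression ratio implied
# by the ~15-25 Mbps figures above. Ignores chroma subsampling, which would
# roughly halve the raw number.
width, height = 3840, 2160   # UHD "4K"
bits_per_pixel = 3 * 10      # three colour components at 10 bits (HDR)
fps = 60

raw_mbps = width * height * bits_per_pixel * fps / 1e6
for stream_mbps in (15, 20, 25):
    print(f"raw ~{raw_mbps:,.0f} Mbps -> {stream_mbps} Mbps "
          f"(~{raw_mbps / stream_mbps:,.0f}:1 compression)")

That ratio is only achievable because the encoder exploits similarity between
frames, which is also why complex scenes (fire, confetti) push the bitrate up.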
True, latency doesn't matter much for streaming. For streaming, unlike
VoIP, video conferencing, and gaming, bandwidth is the more important factor.
VoIP, video conferencing, and gaming drive low-latency use cases (web browsing
is also affected, but as long as the page starts to appear within about 1s and
has mostly completed within about 5s, users don't notice the lag, which is why
even geosync satellite Internet with its several hundred ms of latency can be
acceptable for browsing).
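
A toy illustration of why that is (the round-trip counts and the gaming budget
are assumptions for illustration, not figures from this thread):

# Why a few hundred ms of RTT is tolerable for browsing but not for gaming:
# a page pays the RTT a handful of times against a multi-second budget,
# while a game pays it on every single action against a ~100 ms budget.
HANDSHAKE_ROUND_TRIPS = 4   # assumed: DNS + TCP + TLS + first HTTP request
PAGE_BUDGET_MS = 5000       # "mostly completed within about 5 s"
ACTION_BUDGET_MS = 100      # rough interactive-gaming tolerance (assumption)

for rtt_ms in (50, 600):    # roughly terrestrial vs. geosync-like RTT
    page_overhead_ms = HANDSHAKE_ROUND_TRIPS * rtt_ms
    print(f"RTT {rtt_ms} ms: ~{page_overhead_ms} ms of handshake overhead "
          f"against a {PAGE_BUDGET_MS} ms page budget; "
          f"{rtt_ms} ms per action against a {ACTION_BUDGET_MS} ms gaming budget")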
Video conferencing drives high-upload (5Mbps minimum) use cases.
4K streaming drives high-download (20Mbps per user or per stream with some
safety and overhead) use cases.
These are all valid and important overall in architecting needs for an ISP, but
not all will necessarily be important to every user.
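
Put together, one way an ISP (or a user) might sanity-check a link against
those per-use-case figures, as a hedged sketch (the streaming and conferencing
thresholds are the ones from this thread; the gaming entry, the latency
ceilings and the example link are illustrative assumptions only):

# Sanity-check an access link against the per-use-case figures above
# (units: Mbps for down/up, ms for latency).
REQUIREMENTS = {
    "4K HDR streaming":   {"down": 20, "up": 1, "latency_ms": None},
    "video conferencing": {"down": 5,  "up": 5, "latency_ms": 150},  # latency ceiling assumed
    "gaming":             {"down": 3,  "up": 1, "latency_ms": 100},  # illustrative figures
}

def supports(link, req):
    if link["down"] < req["down"] or link["up"] < req["up"]:
        return False
    return req["latency_ms"] is None or link["latency_ms"] <= req["latency_ms"]

example_link = {"down": 100, "up": 10, "latency_ms": 40}  # illustrative only
for use_case, req in REQUIREMENTS.items():
    status = "ok" if supports(example_link, req) else "not supported"
    print(f"{use_case}: {status}")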
Cheers,
Colin
-----Original Message-----
From: Starlink <starlink-boun...@lists.bufferbloat.net> On Behalf Of
starlink-requ...@lists.bufferbloat.net
Sent: Saturday, March 16, 2024 1:37 PM
To: starlink@lists.bufferbloat.net
Subject: Starlink Digest, Vol 36, Issue 20
...
I think the 4K-latency discussion is a bit difficult, regardless of how good
the codecs are.
For one, 4K can be considered outdated by those who look forward to 8K, and why
not 16K; so we should move past 4K. 8K is already delivered from space by a
Japanese provider, but not over IP. So, if we discuss TV resolutions we should
look at these (8K, 16K, and why not 3D 16K, for ever more demanding stress testing).
Second, 4K etc. are for TV, and in TV latency is rarely, if ever, an issue.
There are some rare cases where latency is very important in TV (I could think
of sports betting, or time synchronization of clocks), but they don't require
latency as low as our typical videoconferencing, remote surgery, or group music
playing use cases on Internet over Starlink.
So I don't know how much 4K, 8K, or 16K might impose any new latency
requirement on Starlink.
Alex
Date: Sat, 16 Mar 2024 18:21:48 +0100
From: Alexandre Petrescu <alexandre.petre...@gmail.com>
To: starlink@lists.bufferbloat.net
Subject: Re: [Starlink] It’s the Latency, FCC
Message-ID: <d04bf060-54e2-4828-854e-29c7f3e3d...@gmail.com>
Content-Type: text/plain; charset=UTF-8; format=flowed
I retract the message, sorry; it is true that some teleoperation and
videoconferencing also use 4K, so latency is important there too.
Videoconferencing at 8K or 3D 16K might need latency requirements as well.
_______________________________________________
Starlink mailing list
Starlink@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/starlink