On 03/05/2024 at 03:48, Ulrich Speidel via Starlink wrote:
There's also the not-so-minor issue of video compression, which
generally has the effect of removing largely imperceptible detail from
your video frames so your high-res video will fit through the pipeline
you've got to squeeze it through.
But this is a bit of a snag in its own right, as I found out about two
decades ago when I was still amazed at the fact that you could use the
parsing algorithms underpinning universal data compression to get an
estimate of how much information a digital object (say, a CCD image
frame) contained. So I set about with two Japanese colleagues to look
at the reference image sequences that pretty much everyone used to
benchmark their video compressors against. One of the surprising finds
was that the odd-numbered frames in the sequences had a distinctly
different amount of information in them than the even-numbered ones,
yet you couldn't tell from looking at the frames.
We more or less came to the conclusion that the camera that had been
used to record the world's most commonly used reference video
sequences had added a small amount of random noise to every second
image: the effect (and the estimated information content) dropped
noticeably when we progressively dropped the least significant bits of
the pixels. We published this:
KAWAHARADA, K., OHZEKI, K., SPEIDEL, U. 'Information and Entropy
Measurements on Video Sequences', 5th International Conference on
Information, Communications and Signal Processing (ICICS2005),
Bangkok, 6-9 December 2005, p.1150-1154, DOI 10.1109/ICICS.2005.1689234
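For the curious, here is a minimal sketch of what such an estimate can
look like in practice. It uses zlib's LZ parser as a crude stand-in for
the parsing-based estimators described above, and the frames and added
noise are synthetic placeholders, not the actual reference sequences:

import zlib
import numpy as np

def info_estimate(frame, strip_lsbs=0):
    """Crude information estimate: length of the LZ-compressed pixel data.
    Optionally masks out the lowest `strip_lsbs` bits of each 8-bit pixel,
    which is where sensor noise tends to live."""
    mask = 0xFF & ~((1 << strip_lsbs) - 1)
    data = (frame & mask).astype(np.uint8).tobytes()
    return len(zlib.compress(data, level=9))

def compare_even_odd(frames, strip_lsbs=0):
    """Mean estimate over even-numbered vs odd-numbered frames."""
    even = [info_estimate(f, strip_lsbs) for f in frames[0::2]]
    odd = [info_estimate(f, strip_lsbs) for f in frames[1::2]]
    return float(np.mean(even)), float(np.mean(odd))

# Synthetic stand-in: a smooth CIF-sized frame, with every second frame
# getting a little extra random noise, mimicking the effect described above.
rng = np.random.default_rng(0)
base = np.tile(np.arange(352, dtype=np.uint8) // 2, (288, 1))
frames = [base if i % 2 == 0 else
          np.clip(base.astype(np.int16) + rng.integers(0, 4, base.shape), 0, 255)
          for i in range(20)]

for k in range(4):
    even_mean, odd_mean = compare_even_odd(frames, strip_lsbs=k)
    print(f"strip {k} LSBs: even ~{even_mean:.0f} B, odd ~{odd_mean:.0f} B")

With this toy data the noisy frames come out measurably "richer" than
the clean ones, and the gap shrinks as the low-order bits are masked,
which is the same qualitative behaviour as in the paper.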
Did the world take notice? Of course not. But it still amuses me no
end that some people spent entire careers trying to optimise the
compression of these image sequences - and all because of an obscure
hardware flaw that the cameras their algorithms eventually ran on may
never have suffered from.
Which brings me back to the question of how important bandwidth is.
The answer is: probably more important in the future. We're currently
relying mostly on CDNs for video delivery, but I can't help but notice
the progress that's being made by AI-based video generation. Four or
five years ago, Gen-AI could barely compose a credible image. A couple
of years ago, it could do video sequences of a few seconds. Now we're
up to videos several minutes long.
If that development is sustained, you'll be able to tell your personal
electronic assistant / spy to dream up a personalised movie, say an
operatic sci-fi Western with car chases on the Titanic floating in
space, and it'll have it generated in no time starring the actors you
like. ETA: Around 2030 maybe?
For that ETA (estimated time of arrival), I would be prepared to see
such easy access to personalised movies much sooner, within the next
couple of years rather than around 2030. It is already easy to generate
persuasive-looking photos from keywords; further, some of the
organisations behind text-generation tools announced just recently
that they are working towards exactly this kind of video generation.
One great advantage of these generated pictures, videos and texts is
that many of us like them easily: the colour balance, the softness of
the sound and the smooth flow of the text make me, too, very inclined
to consume them readily, and even to like them more than content that
records reality, to the point of preferring them to reality, or even
believing that the generated content is a genuine record of reality.
Distinguishing what is generated from what is recorded from reality
will become more and more of a challenge, and it is a challenge for
everyone in the distribution chain, not just for consumers.
To give an example of how immediate this is, drawn from this Starlink
list, sorry if I digress: recently there was a discussion on this list
about clock synchronisation; in other news at the same time, a Nature
article by a well-known author announced new methods for even more
precise clocks. Now, even in such a context (Nature, a famous author),
one can find AI-generated text. Should that be trusted? Is it right or
wrong, ethical or unethical? To complicate matters further: is the
detection tool correct when it says that the introductory paragraph is
100% generated? Because I know that detecting AI-generated text is
computationally hard. It will probably be even harder to detect
whether a video is AI-generated or recorded from reality.
Alex
But these things will be (a) data-heavy and (b) not well suited for
CDN delivery, because you may be the only one to ever see a particular
movie, so you'll either need to move the movie generation to the edge,
or you'll need to build bigger pipes across the world. I'm not sure how
feasible either option is.
On 3/05/2024 2:47 am, Colin_Higbie via Starlink wrote:
Alex, fortunately, we are not bound to use personal experiences and
observations on this. We have real market data that can provide an
objective, data-supported conclusion. No need for a
chocolate-or-vanilla-ice-cream-tastes-better discussion on this.
Yes, cameras can film at 8K (and higher in some cases). However, at
those resolutions (with exceptions for ultra-high end cameras, such
as those used by multi-million dollar telescopes), except under very
specific conditions, the actual picture quality doesn't improve past
about 5.5K. The loss of detail simply shifts from being a consequence of too
few pixels to optical and focus limits of the lenses. Neighboring
pixels simply hold a blurry image, meaning they don't actually carry
any usable information. A still shot with 1/8 of a second exposure
can easily benefit from an 8K or higher sensor. Video sometimes can
too, under bright lights with a relatively still or slow-moving scene.
Neither of these requirements lends itself to typical home video at
30 (or 24) frames per second – that's only about 0.03s per frame. We
can imagine AI getting to the point where it can compensate for the
lack of clarity, and this is already being used for game rendering
(e.g., Nvidia's DLSS and Intel's XeSS), but that requires training per
scene in those games, and there hasn't been much development work done
on this for filming, at least not yet.
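To put rough numbers on that shutter-speed point, here is a
back-of-the-envelope sketch (an illustrative model, not a measurement):
for a fixed sensor size, lens and scene brightness, the light collected
per pixel scales with exposure time and inversely with the pixel count.

# Relative light (photons) per pixel ~ exposure_time / pixel_count,
# for a fixed sensor size, lens and scene brightness (illustrative model only).
PIXELS_4K = 3840 * 2160        # ~8.3 Mpixels
PIXELS_8K = 7680 * 4320        # ~33.2 Mpixels

def relative_light_per_pixel(exposure_s, pixel_count):
    return exposure_s / pixel_count

still_8k = relative_light_per_pixel(1 / 8,  PIXELS_8K)   # 1/8 s still photo
video_8k = relative_light_per_pixel(1 / 30, PIXELS_8K)   # 30 fps video frame
video_4k = relative_light_per_pixel(1 / 30, PIXELS_4K)

print(f"8K still vs 8K video frame: {still_8k / video_8k:.1f}x more light per pixel")
print(f"4K video vs 8K video:       {video_4k / video_8k:.1f}x more light per pixel")
# ~3.8x and ~4.0x respectively: the 8K video pixel is starved of light
# compared with either a still shot or a 4K frame, hence the noise/blur limit.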
Will sensors (or AI) improve to capture images faster per amount of
incoming photons so that effective digital shutter speeds can get
faster at lower light levels? No doubt. Will it materially change
video quality so that 8K is a similar step up from 4K as 4K is from
HD (or as HD was from SD)? No, at least not in the next several
years. Read on for why.
So far that was all on the production side. But what about the
consumer side? Mass market TV sizes max out below about 100" (83"
seems to be a fairly common large size, but some stores carry larger
models). Even those large sizes that do reach mass-market locations
and are available on Amazon, still comprise a very small % of total
TV sales. The vast, vast majority of TV sales are of sub 70" models.
This is not just because of pricing, though that's a factor. It's also
because home architecture has not considered screens this big. At
these sizes, it's not just a matter of upgrading the entertainment
console furniture; it's a matter of building a different room with a
dedicated entertainment wall. There is a lot of inertia in
architecture and building that prevents this from being a sudden
change, not to mention the hundreds of millions of existing homes
that are already sized for TVs below 100".
And important to this discussion, at several feet from even a 70" -
90" screen, most people can't see the difference between 4K and 8K
anyway. The pixels are too small at that distance to make a
difference in the User Experience. This contrasts with the step from
HD to 4K, which many people (not all) can see, or from SD to HD, an
improvement virtually everyone can see (to the point that news
broadcasts now blur the faces of their anchors to remove wrinkles
that weren't visible back in the SD days).
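As a rough sanity check on that viewing-distance point, here is a small
sketch (the screen size and distance are illustrative figures, and ~1
arcminute is the commonly cited 20/20 visual acuity limit):

import math

def pixel_arcmin(diagonal_in, horiz_pixels, distance_m):
    """Angular size of one pixel, in arcminutes, for a 16:9 screen."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # screen width
    pitch_m = width_m / horiz_pixels                          # pixel pitch
    return math.degrees(math.atan2(pitch_m, distance_m)) * 60

DIAG_IN, DIST_M = 83, 3.0   # example: 83" TV viewed from ~3 m (10 ft)
print(f"4K pixel: {pixel_arcmin(DIAG_IN, 3840, DIST_M):.2f} arcmin")
print(f"8K pixel: {pixel_arcmin(DIAG_IN, 7680, DIST_M):.2f} arcmin")
# Both come out below the ~1 arcmin acuity limit (~0.55 and ~0.27 arcmin here),
# so the extra 8K detail is not resolvable at this size and distance.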
For another real-world example of this curtailing resolution growth:
smartphones raced to higher and higher resolutions, until they
reached about 4K, then started pulling back. Some are slightly
higher, but as often as not, even at the flagship level, many
smartphones fall slightly below 4K, reflecting the recognition that
customers got wise to screens all being effectively perfect, so that
higher resolutions no longer mattered.
Currently, the leading contender for anything appearing in 8K is
games, not streaming video. That's because games don't require camera
lenses and light sensors that don't yet exist. They can render dimly
lit, fast moving scenes in 8K just as easily as brightly lit scenes.
BUT (huge but here), GPUs aren't powerful enough to do that yet
either at good framerates, and for most gamers (not all, but a
significant majority), framerate is more important than resolution.
Top-of-the-line graphics cards (the ones that run about $1,000, so not
mainstream yet) of the current generation are just hitting 120fps at
4K in top modern games. From a pixel-moving perspective, that would
translate to 30fps at 8K (4x the number of pixels, 120/4 = 30). 30fps
is good enough for streaming video, but not good enough for a gamer,
who would rather have 4K at 120fps. Still, I anticipate (this part is
just my opinion, not
a fact) that graphics cards on high-end gaming PCs will be the first
to drive 8K experiences for gamers before 8K streaming becomes an
in-demand feature. Games have HUDs and are often played on monitors
just a couple of feet from the gamer where ultra-fine details would
be visible and relevant.
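The pixel-throughput arithmetic above, as a trivial sketch (the
4K/120fps figure is the rough capability quoted above for ~$1,000
cards, not a benchmark result):

# Pixel throughput at 4K/120 fps, and the frame rate that same throughput buys at 8K.
PIXELS_4K = 3840 * 2160
PIXELS_8K = 7680 * 4320   # exactly 4x the 4K pixel count

throughput = PIXELS_4K * 120        # pixels/second a top card shades at 4K/120
fps_8k = throughput / PIXELS_8K     # same throughput spread over 8K frames

print(f"{throughput / 1e9:.2f} Gpixels/s -> {fps_8k:.0f} fps at 8K")  # ~1.0 Gpixel/s -> 30 fps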
Having said all of that, does this mean that I don't think 8K and
higher will eventually replace 4K for mass market consumer streaming?
No, I suspect that in the long-run you're right that they will.
That's a reasonable conclusion based on the history of screen and TV
programming resolutions, but that timeframe is likely more than 10
years off, and planning bandwidth requirements for the needs 10 years
from now does not require any assumptions about the standard video
resolutions people will be watching then: we can all assume with
reasonable confidence, based on the history of Internet bandwidth
usage, that bandwidth needs and desires will continue to increase over
time. The point for this group is that you lose credibility with the
audience if you base your reasoning on future video resolutions that
the market is currently rejecting, without at least acknowledging that
those are projected future needs rather than present-day needs.
At the same time, 4K is indeed a market standard TODAY. That's not an
opinion; it's a data point and a fact. As I've said multiple times in
this discussion, what makes this a fact and not an opinion is that
millions of people choose to pay for access to 4K content and to the
television programs and movies that are stored and distributed in 4K.
All the popular TV devices and gaming consoles support 4K HDR content
in at least some versions of the product (they may also offer
discounted versions that don't do HDR or only go to 1080p or 1440p).
The market has spoken and delivered us that data. 4K HDR is the
standard for videophiles and popular enough that the top video
streaming services all offer it. It is also not in a chaotic state,
with suppliers providing different technologies until the market
sorts out a winner (like the old Blu-ray vs. HD-DVD fight 15 years
ago, or VHS vs. Beta before that). Yes, there are some variants on
HDR (Dolby Vision vs. HDR10), but as TVs are manufactured today,
Dolby Vision is effectively just a superset of HDR10, like G-Sync is
a superset of Adaptive Sync for the variable refresh rate displays
needed for gaming. So, yes, 4K HDR is a standard, whether you buy a Blu-ray
UHD movie at Walmart or Best Buy or stream your programming from
Netflix, Disney+, Max, or Amazon Prime.
So again, this is why the minimum rational top bandwidth any new ISP
should be developing (at least in developed countries – I think it's
fair to say that if people have no Internet access within hundreds of
miles, even slow Internet connectivity to a local library within
travel distance from home is far better than nothing) is 25Mbps as
the established bandwidth required by the 4K providers to stream 4K
HDR content. This does not mean more would not be better or that more
won't be needed in the future. But if you are endorsing ISP buildout
focused around low-latency under load at anything LESS THAN 25Mbps,
you have simply shifted the problem for customers and users of the
new service from poor latency (this group's focus) to poor bandwidth
incapable of providing modern services.
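For context on where that 25Mbps sits, here is a quick sketch of the
raw data rate of a 4K HDR stream versus the delivered rate (frame rate,
bit depth and chroma format are assumed typical values, not any
provider's specification):

# Raw 4K HDR data rate vs the ~25 Mbps that 4K streaming services ask for.
WIDTH, HEIGHT = 3840, 2160
FPS = 30                      # assumed frame rate
BITS_PER_PIXEL = 3 * 10       # HDR: 10 bits per colour channel (assumed 4:4:4)

raw_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL
stream_bps = 25e6

print(f"Raw 4K HDR: {raw_bps / 1e9:.2f} Gbps")
print(f"Streamed:   {stream_bps / 1e6:.0f} Mbps")
print(f"Implied compression ratio: ~{raw_bps / stream_bps:.0f}:1")
# Roughly 7.5 Gbps raw against 25 Mbps delivered, i.e. a ~300:1 reduction,
# which is roughly what modern codecs (e.g., HEVC/AV1) plus chroma
# subsampling deliver in practice.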
To be taken seriously and to maximize your chances of success at
influencing policy, I urge this group's members to use that 25Mbps
top bandwidth as a floor. And to clarify my meaning, I don't mean
ISPs shouldn't also offer less expensive tiers of service with
bandwidth at only, say, 3 or 10Mbps. Those are fine and will be
plenty for many users, and a lower cost option with less capability
is a good thing. What I mean is that if they are building out new
service, the infrastructure needs to support and they need to OFFER a
level of at least 25Mbps. Higher is fine too (better, even), but where
cost collides with technical capability, 25Mbps is the market
requirement; below that, the service offering fails to provide a
fully functional Internet connection.
Sorry for the long message, but I keep seeing a lot of these same
subjective responses to objective data, which concern me. I hope this
long version finally addresses all of those and I can now return to
just reading the brilliant posts of the latency and TCP/IP experts
who normally drive these discussions. You are all far more
knowledgeable than I in those areas. My expertise is in what the
market needs from its Internet connectivity and why.
Cheers,
Colin
-----Original Message-----
From: Starlink <starlink-boun...@lists.bufferbloat.net> On Behalf Of
starlink-requ...@lists.bufferbloat.net
Sent: Thursday, May 2, 2024 5:22 AM
To: starlink@lists.bufferbloat.net
Subject: Starlink Digest, Vol 38, Issue 13
Today's Topics:
1. Re: It’s the Latency, FCC (Alexandre Petrescu)
Message: 1
Date: Thu, 2 May 2024 11:21:44 +0200
From: Alexandre Petrescu <alexandre.petre...@gmail.com>
To: starlink@lists.bufferbloat.net
Subject: Re: [Starlink] It’s the Latency, FCC
On 30/04/2024 at 22:05, Sebastian Moeller via Starlink wrote:
> Hi Colin,
> [...]
>
>> A lot of responses like "but 8K is coming" (it's not, only
>> experimental YouTube videos showcase these resolutions to the general
>> public, no studio is making 8K content and no streaming service
>> offers anything in 8K or higher)
> [SM] Not my claim.
Right, it is my claim. '8K is coming' comes from the observation that
consumer cameras have been able to film in 8K for a few years now.
The SD-HD-4K-8K-16K consumer market trend can be evaluated. One
could draw a parallel with the evolution of megapixel counts in photo
cameras, or with microprocessor feature sizes. There might be a
levelling-off, but I am not sure it is at 4K.
What I would be interested to look at is the next acronym that
requires high bandwidth and low latency and that is not in the
SD-HD-4K-8K-16K series. This series did not exist in the days of
analog TV ('SD' only appeared when digital 'HD' TV appeared), so a new
series describing TV features will probably appear.
Alex
>
>> and "I don't need to watch 4K, 1080p is sufficient for me,
> [SM] That however is my claim ;)
>
>> so it should be for everyone else too"
_______________________________________________
Starlink mailing list
Starlink@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/starlink
--
****************************************************************
Dr. Ulrich Speidel
School of Computer Science
Room 303S.594 (City Campus)
The University of Auckland
u.spei...@auckland.ac.nz
http://www.cs.auckland.ac.nz/~ulrich/
****************************************************************