On Tue, 21 Jan 2003, Andreas Beck wrote:
> > Why not? If there's enough space in the gfx board memory then the
> > offscreen buffer should be allocated there.
>
> And not be available for another application I start on the switched-to
> console?

Why not? Just make the bg app go back to using its own offscreen buffer,
one that resides in normal memory. The rule should be: use as much gfx
memory as possible.
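
Something like this is what I have in mind - a minimal sketch only, and
vram_alloc()/vram_free() are made-up stand-ins for the driver hooks, not
real KGI/GGI API:

#include <stdlib.h>
#include <string.h>

struct offscreen {
    void  *pixels;
    size_t size;
    int    in_vram;   /* 1 if the buffer lives in gfx memory */
};

/* Stand-ins for the driver hooks; here they simulate a full card. */
static void *vram_alloc(size_t size) { (void)size; return NULL; }
static void  vram_free(void *p)      { (void)p; }

static int offscreen_init(struct offscreen *b, size_t size)
{
    b->size    = size;
    b->pixels  = vram_alloc(size);    /* use as much gfx memory... */
    b->in_vram = (b->pixels != NULL);
    if (!b->in_vram)
        b->pixels = malloc(size);     /* ...fall back to normal memory */
    return b->pixels ? 0 : -1;
}

/* On console switch-away: give the gfx memory back to whoever owns the
 * console now, and keep working from a system-memory copy. */
static int offscreen_evict(struct offscreen *b)
{
    void *sysmem;

    if (!b->in_vram)
        return 0;
    sysmem = malloc(b->size);
    if (!sysmem)
        return -1;
    memcpy(sysmem, b->pixels, b->size);
    vram_free(b->pixels);
    b->pixels  = sysmem;
    b->in_vram = 0;
    return 0;
}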

> > > That sounds like a good idea. But then a) the memory-target must be extended
> > > to "support" overlay resources
> > Not a big deal, is it? Ideally a memory target shouldn't care at all about
> > what it contains, it's just a framebuffer after all.
>
> Umm - an overlay does _not_ influence the underlying FB content (that's
> what Sprites are all about when compared to BOBs). Thus the memtarget
> would have to understand that it has to allocate extra memory for the
> Sprite and store whatever you send there.

Maybe I haven't explained myself very well: ONE offscreen buffer PER
independent visible buffer. That means the overlay would go in a buffer
of its own, the sprites would simply not be rendered at all, and the
standard display would get a buffer of its own as well.
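
Illustrative only (none of these names exist anywhere yet):

struct backing_store {
    void  *pixels;    /* offscreen copy of this visible buffer */
    size_t size;
};

struct mem_target {
    struct backing_store display;  /* the standard framebuffer */
    struct backing_store overlay;  /* overlay content, stored apart since
                                    * it never touches the FB below it */
    /* sprites: no backing store at all - simply not rendered */
};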

> > If there's any support for applications to get notified when they get
> > "iconified", then the kgi case should be aliased to it; that is, for the
> > application it's actually the same thing. If there's no such support, then
> > it's time to implement it :)
>
> You are mixing up GUI issues and fullscreen graphics issues here.

Not really; I'm just generalizing and reusing the same semantics.

> Yes there is such a mechanism for X. But it is totally X-specific and
> cannot be generalized.

If it _can't_, then I'm afraid the implementation sucks quite a bit
(no offence intended): the application doesn't have to care whether
it's in X or not, it just has to care whether it's still visible or
not.
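
To make it concrete, here's the kind of interface I'm thinking of -
set_visibility_callback() is purely hypothetical, just to show the
shape of the thing:

/* The app registers ONE callback and never needs to know whether the
 * event came from an X iconify or a kgi console switch. */
typedef void (*visibility_cb)(int visible, void *user);

/* Hypothetical: 'cb' gets visible=0 on iconify OR switch-away, and
 * visible=1 when the app becomes visible again. */
int set_visibility_callback(void *visual, visibility_cb cb, void *user);

static void on_visibility(int visible, void *user)
{
    (void)user;
    if (visible) {
        /* re-acquire gfx memory, redraw everything */
    } else {
        /* evict buffers to system memory, stop rendering */
    }
}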

Fabio Alemagna
