On 5/15/05, Juergen Dankoweit <[EMAIL PROTECTED]> wrote:
> Hello,
> 
> On Sunday, 15.05.2005, at 10:50 +0300, [EMAIL PROTECTED] wrote:
> > Hi.
> > Some time ago I wrote a program (under DOS) to display a
> > signal collected from an ADC. I used two video pages - one
> > visible, the other hidden (for drawing) - and switched them
> > very quickly. The result was a very smooth animated graphic.
> > Now I am trying to port the program to Linux, but the picture
> > isn't very good: the graphic flickers. Because my program is
> > too large, I wrote a small example to show the idea:
> >
> [code...]
> >       gtk_widget_show_all (window);
> >
> >       /* set timeout function - 20 times per second */
> >       handler = g_timeout_add(1000/20, timeout_func, draw_area);
> >
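The flicker usually means that frames are painted straight onto the
window. Here is a sketch of the usual GTK+ 2 fix, which is the same
trick as your two DOS video pages: render into an off-screen GdkPixmap
(the hidden page) and only copy it to the screen in the expose handler.
draw_frame() below is a placeholder for your plotting code, not
something taken from your program:

#include <gtk/gtk.h>

static GdkPixmap *backbuffer = NULL;    /* the hidden "page" */

/* (Re)create the off-screen pixmap whenever the widget is resized. */
static gboolean
on_configure (GtkWidget *widget, GdkEventConfigure *event, gpointer data)
{
    if (backbuffer)
        g_object_unref (backbuffer);
    backbuffer = gdk_pixmap_new (widget->window,
                                 widget->allocation.width,
                                 widget->allocation.height, -1);
    return TRUE;
}

/* Copy the finished frame to the screen; never draw here directly. */
static gboolean
on_expose (GtkWidget *widget, GdkEventExpose *event, gpointer data)
{
    gdk_draw_drawable (widget->window,
                       widget->style->fg_gc[GTK_WIDGET_STATE (widget)],
                       backbuffer,
                       event->area.x, event->area.y,
                       event->area.x, event->area.y,
                       event->area.width, event->area.height);
    return FALSE;
}

static gboolean
timeout_func (gpointer data)
{
    /* draw_frame (backbuffer); */      /* render into the hidden page */
    gtk_widget_queue_draw (GTK_WIDGET (data)); /* triggers on_expose() */
    return TRUE;                        /* keep the timeout running */
}

Connect on_configure() and on_expose() to the drawing area's
"configure_event" and "expose_event" signals before showing the window.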
> 
> If you use this function you have to keep a few things in mind:
> (1) You are in a multitasking environment; that means it is not
> guaranteed that the callback is called every 50 ms: it is called
> after roughly 50 ms, plus or minus some jitter.
> (2) Your application is not the only client talking to Xlib.
> That means drawing widgets or bitmaps can be delayed; it does not
> happen "just in time".
> 
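One way to live with (1) is to advance the animation by the time that
actually passed, instead of assuming exactly 50 ms per tick. A rough
sketch using a GTimer; update_signal() is a made-up name for whatever
steps your data forward:

static GTimer *timer = NULL;

static gboolean
timeout_func (gpointer data)
{
    gdouble dt;

    if (!timer)
        timer = g_timer_new ();

    dt = g_timer_elapsed (timer, NULL); /* seconds since last reset */
    g_timer_reset (timer);

    /* update_signal (dt); */           /* placeholder: step by dt */
    gtk_widget_queue_draw (GTK_WIDGET (data));
    return TRUE;
}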
> To solve such a problem, your application should be multithreaded: the
> thread that does the calculation and the thread that does the output
> should run at "realtime" priority. Then it should work better.
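As a sketch of that layout (assuming GLib's thread API; the function
calculate_samples() is made up): the worker thread does the number
crunching and only asks the main loop to redraw, because GTK+ itself is
best driven from a single thread. Realtime scheduling on Linux would
additionally need something like pthread_setschedparam() with
SCHED_FIFO, which normally requires root privileges.

static gboolean
request_redraw (gpointer data)
{
    gtk_widget_queue_draw (GTK_WIDGET (data)); /* runs in GUI thread */
    return FALSE;                           /* one-shot idle handler */
}

static gpointer
worker (gpointer data)
{
    GtkWidget *draw_area = data;

    for (;;) {
        /* calculate_samples (); */ /* placeholder: heavy computation */
        g_idle_add (request_redraw, draw_area);
    }
    return NULL;
}

/* In main(): call g_thread_init(NULL) before gtk_init(), then
 * g_thread_create(worker, draw_area, FALSE, NULL) once the
 * widgets exist. */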

Or you could try something like this:
http://trific.ath.cx/toys/gtkanim/
This app renders as many frames of some "random art" as the processor
can handle. For example, if I get 25 fps with one instance running and
then fire up a second one, both run at about 13 fps.
You probably also want to base your app not on a fixed timeout, but on
the free time of your CPU.
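A minimal sketch of that idea: replace the g_timeout_add() call with an
idle handler, so a new frame is rendered whenever the main loop has
nothing else to do (render_frame() is again just a placeholder):

static gboolean
idle_render (gpointer data)
{
    /* render_frame (backbuffer); */    /* placeholder drawing step */
    gtk_widget_queue_draw (GTK_WIDGET (data));
    return TRUE;  /* stay installed: render again as soon as possible */
}

/* instead of the timeout: */
/* handler = g_idle_add (idle_render, draw_area); */

Redraws still get through because GTK+ processes exposes at a higher
priority than default idle callbacks.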

mati