> On 2015-12-18 11:34, Peter Bogdanoff wrote:
> > When I said "millisecond," I meant precision to a millisecond. 
> > Otherwise timers would be a second or two or longer. That is possible?
> 
> I would be very surprised if you need millisecond precision for anything UI 
> related - just a guarantee that your timers will trigger in order at the 
> closest time they can to your requested time (which is no different than you 
> get in the Desktop engines). People with the very best eyesight *might* be 
> able to detect changes at 60fps, but mostly it is 15-40fps I believe. So I 
> suspect an error in the region of +/-20ms would not be noticeable in any 
> use-case involving sound playback being tied to visual feedback.
> 
> Warmest Regards,
> Mark.

I would like to second this.
Time measurements of screen displays below 1 tick (= 1/60 second) are said to 
make pretty good random-number generators. Yet I have heard of people who 
believe in the 'exactness' of sub-tick data measured on a Raspberry Pi (which 
doesn't even have a hardware real-time clock).
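
For anyone curious how large that timer error actually is on their own machine, here is a minimal sketch — in Python rather than LiveCode, purely as an illustration, and `timer_jitter` is a made-up helper name — that requests a short sleep and records how far the actual delay strays from the request. On typical desktop systems the error is a few milliseconds, well inside one 60 fps frame (~16.7 ms):

```python
import time

def timer_jitter(requested_ms=10, trials=20):
    """Measure how far actual sleep durations stray from the requested delay.

    Desktop OS schedulers usually wake a sleeping process a little late,
    so each error is typically a few milliseconds -- small compared with
    one 60 fps frame and so invisible in UI work.
    """
    errors_ms = []
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(requested_ms / 1000.0)           # ask for the delay
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        errors_ms.append(elapsed_ms - requested_ms)  # actual minus requested
    return errors_ms

if __name__ == "__main__":
    errs = timer_jitter()
    print(f"max error: {max(errs):.2f} ms, mean: {sum(errs)/len(errs):.2f} ms")
```

Note that `time.sleep` guarantees only a *minimum* delay, so the errors are essentially always positive — which is exactly the "trigger at the closest time they can to your requested time" behaviour described above.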

IMHO, there are easier ways to generate randomness ...
_______________________________________________
use-livecode mailing list
use-livecode@lists.runrev.com
