On 08/09/2019 02:07, Alexander Grotewohl wrote:
but the resolution is not a ms at all. Every call to GetTickCount is something like 10-15ms or so off. You'd be lucky to call it a ms after it was updated by the OS. Do we document that too?
As I said: "The minimum resolution may vary, and may be more than one tick."
I have no crystal ball, but I would guess that when it was first introduced, it aimed to mimic the Windows function. The point is, it is not very exact. And it may on some platforms already have completely different units.
But that does not mean it needs to be changed on existing targets. And if it is not going to be changed, then it can be documented.
While I can only speak from my experience, a typical usage would be a "timeout" (or timer) in a calculation loop. The loop utilizes the CPU the whole time, so a TTimer would not be all that practical. But roughly every 100ms you want to call ProcessMessages, which is likely more expensive than GetTickCount. (OK, move it to a thread, but that is not the point.) So you check the diff between two GetTickCount calls. A diff of 100 means around 100ms +/- something (in that case even +/- 50% would be OK). But make one GetTickCount unit equal to 0.0001ms, and it matters (too much time spent in ProcessMessages / slowdown of the calculation).
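To make that concrete, here is a minimal self-contained sketch of the pattern, assuming GetTickCount64 from SysUtils; the Inc(i) loop body and the WriteLn are stand-ins for the real calculation step and for Application.ProcessMessages in a GUI program:

program TickDemo;
{$mode objfpc}

uses
  SysUtils;

var
  Start, LastTick: QWord;
  i: Integer;
begin
  Start := GetTickCount64;
  LastTick := Start;
  i := 0;
  // Busy calculation loop: check the tick difference on every iteration,
  // but only do the "expensive" periodic work about every 100 ms.
  while GetTickCount64 - Start < 1000 do   // run for roughly one second
  begin
    Inc(i);                                // stand-in for the real calculation step
    if GetTickCount64 - LastTick >= 100 then
    begin
      // In a GUI application, Application.ProcessMessages (or a timeout
      // check) would go here instead of the WriteLn.
      WriteLn('~100 ms elapsed after ', i, ' iterations');
      LastTick := GetTickCount64;
    end;
  end;
end.

Whether the diff of 100 means 90ms or 115ms is irrelevant here; what matters is that the unit stays roughly a millisecond, so the periodic work is not triggered orders of magnitude too often.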
In some unit tests, I use it for timeouts of 5 or 10 seconds. It does not matter if that turns out to be 4 or 6 (or 8 to 12) seconds; it just needs to be approximately right.
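A rough sketch of such a timeout wait; WaitFor, TCondition and NeverDone are hypothetical names for illustration, not from any FPC unit, and GetTickCount64 from SysUtils is assumed:

program WaitDemo;
{$mode objfpc}

uses
  SysUtils;

type
  TCondition = function: Boolean;

// Poll Cond until it returns True or roughly TimeoutMs milliseconds elapse.
// The exact resolution of the tick counter is irrelevant at this scale.
function WaitFor(Cond: TCondition; TimeoutMs: QWord): Boolean;
var
  Start: QWord;
begin
  Start := GetTickCount64;
  repeat
    if Cond() then
      Exit(True);
    Sleep(50);                             // yield instead of spinning
  until GetTickCount64 - Start >= TimeoutMs;
  Result := False;
end;

// Trivial stand-in so the sketch compiles; a real test would check
// whatever condition the test is actually waiting for.
function NeverDone: Boolean;
begin
  Result := False;
end;

begin
  if not WaitFor(@NeverDone, 5000) then
    WriteLn('timed out after about 5 seconds');
end.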