Bas Nieuwenhuizen <b...@basnieuwenhuizen.nl> writes:

> Well, the complication here is that in the MONOTONIC (not
> MONOTONIC_RAW) case, the CPU measurement can happen at the end of the
> MONOTONIC_RAW interval (since the order of measurements follows the
> argument order), so you can get a tick that started `period` (5 in
> this case) monotonic ticks before the start of the interval and a CPU
> measurement at the end of the interval.

Ah, that's an excellent point. Let's split out raw and monotonic and
take a look. You want the GPU sampled at the start of the raw interval
and monotonic sampled at the end, I think?

                 w x y z 0 1 2 3 4 5 6 7 8 9 a b c d e f
Raw              -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-

          0         1         2         3
GPU       -----_____-----_____-----_____-----_____

                                    x y z 0 1 2 3 4 5 6 7 8 9 a b c
Monotonic                           -_-_-_-_-_-_-_-_-_-_-_-_-_-_-_-

Interval                     <----------------->
Deviation           <-------------------------->

        start = read(raw)       2
        gpu   = read(GPU)       1
        mono  = read(monotonic) 2
        end   = read(raw)       b

In this case, the worst-case error between the monotonic sample and the
GPU sample is interval + gpu_period (probably plus one raw tick to
include the measurement error of the raw clock itself).

Thanks for finding this case.

Now, I guess the question is whether we want to try to find the
smallest maxDeviation possible for each query. For instance, if the
application asks only for raw and gpu, the max_deviation could be
max2(interval + 1, gpu_period), but if it asks for monotonic and gpu,
it would be interval + 1 + gpu_period. I'm not seeing a simple
definition here...
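
One possible shape, sketched rather than asserted: track the largest
tick period among the domains the application actually asked for and
fold it into a single conservative bound. time_domain_period() is made
up and gpu_period is the placeholder from the sketch above; the
VK_TIME_DOMAIN_* names and VkCalibratedTimestampInfoEXT are from
VK_EXT_calibrated_timestamps:

        #include <stdint.h>
        #include <vulkan/vulkan.h>

        #define MAX2(a, b) ((a) > (b) ? (a) : (b))

        extern uint64_t gpu_period;

        /* Hypothetical per-domain tick period, in nanoseconds. */
        static uint64_t
        time_domain_period(VkTimeDomainEXT domain)
        {
                switch (domain) {
                case VK_TIME_DOMAIN_DEVICE_EXT:              return gpu_period;
                case VK_TIME_DOMAIN_CLOCK_MONOTONIC_EXT:     return 1;
                case VK_TIME_DOMAIN_CLOCK_MONOTONIC_RAW_EXT: return 1;
                default:                                     return 0;
                }
        }

        static uint64_t
        max_deviation_for_query(const VkCalibratedTimestampInfoEXT *infos,
                                uint32_t count, uint64_t interval)
        {
                uint64_t max_period = 0;

                for (uint32_t i = 0; i < count; i++)
                        max_period = MAX2(max_period,
                                          time_domain_period(infos[i].timeDomain));

                /* interval + 1 covers the raw sampling error;
                 * max_period adds the coarsest clock actually
                 * requested. */
                return interval + 1 + max_period;
        }

That gives up the tighter max2(interval + 1, gpu_period) bound in the
raw-plus-gpu case in exchange for one formula, so it's conservative
rather than minimal.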

-- 
-keith
