Re: jitterentropy vs. simulation

2023-12-01 Thread Johannes Berg
On Fri, 2023-12-01 at 14:25 -0500, Simo Sorce wrote:
> Doesn't this imply the simulation is not complete

Kind of?

> and you need to add clock jitter for the simulation to be more useful?

No, it's more _intentionally_ incomplete. This works fine in normal ARCH=um, but with time-travel va…

Re: jitterentropy vs. simulation

2023-12-01 Thread Simo Sorce
On Fri, 2023-12-01 at 19:35 +0100, Johannes Berg wrote:
> [I guess we should keep the CCs so others see it]
>
> > Looking at the stuck check it will be bogus in simulations.
>
> True.
>
> > You might as well ifdef that instead.
> >
> > If a simulation is running insert the entropy regardless and…

Re: jitterentropy vs. simulation

2023-12-01 Thread Johannes Berg
[I guess we should keep the CCs so others see it]

> Looking at the stuck check it will be bogus in simulations.

True.

> You might as well ifdef that instead.
>
> If a simulation is running insert the entropy regardless and do not compute the derivatives used in the check.

Actually you mostly…
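For context on the derivative test being discussed here: jitterentropy's stuck check rejects a timer sample when the delta itself, its second difference, or its third difference is zero. A minimal standalone sketch (simplified names and types, not the kernel's crypto/jitterentropy.c code) of why a constant-step simulated clock trips it on every sample:

    /* Sketch: a derivative-based "stuck" test against a simulated clock.
     * A sample is treated as stuck if the timer delta, its second
     * difference, or its third difference is zero.
     */
    #include <stdio.h>
    #include <stdint.h>

    static uint64_t last_delta, last_delta2;

    static int stuck(uint64_t delta)
    {
        uint64_t delta2 = delta - last_delta;    /* second difference */
        uint64_t delta3 = delta2 - last_delta2;  /* third difference  */

        last_delta  = delta;
        last_delta2 = delta2;

        return (!delta || !delta2 || !delta3);
    }

    int main(void)
    {
        /* A fully simulated clock: advances by a fixed amount per read,
         * regardless of how much work the CPU actually did.
         */
        uint64_t now = 0, prev = 0;

        for (int i = 0; i < 5; i++) {
            now += 1000;                         /* perfectly predictable */
            uint64_t delta = now - prev;
            prev = now;
            printf("sample %d: delta=%llu stuck=%d\n",
                   i, (unsigned long long)delta, stuck(delta));
        }
        return 0;
    }

Every sample after the first is reported as stuck, because the second and third differences collapse to zero; with every sample rejected, jitterentropy collects nothing under the simulated clock, which is the failure being discussed in this thread.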

Re: jitterentropy vs. simulation

2023-12-01 Thread Anton Ivanov
On 01/12/2023 10:21, Johannes Berg wrote:
Hi, In ARCH=um, we have a mode where we simulate clocks completely, and even simulate that the CPU is infinitely fast. Thus, reading the clock will return completely predictable values regardless of the work happening. This is clearly incompatible wi…

jitterentropy vs. simulation

2023-12-01 Thread Johannes Berg
Hi,

In ARCH=um, we have a mode where we simulate clocks completely, and even simulate that the CPU is infinitely fast. Thus, reading the clock will return completely predictable values regardless of the work happening. This is clearly incompatible with jitterentropy, but now jitterentropy seems t…
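To make the "infinitely fast CPU" point concrete: under such a simulation, two back-to-back clock reads around an arbitrary amount of work return values that carry no information about the work, which is exactly the information jitterentropy needs. A rough standalone sketch of the effect (simulated_clock() and sim_time_ns are illustrative stand-ins, not UML internals):

    /* Sketch (not UML code): what a fully simulated, time-travel style
     * clock looks like to a consumer that tries to measure work.
     */
    #include <stdio.h>
    #include <stdint.h>

    static uint64_t sim_time_ns;   /* advanced only by simulated events */

    static uint64_t simulated_clock(void)
    {
        /* No time passes while the (virtual) CPU runs, so back-to-back
         * reads return the same value no matter how much work happened.
         */
        return sim_time_ns;
    }

    int main(void)
    {
        uint64_t t0 = simulated_clock();

        /* An arbitrary amount of CPU work... */
        volatile uint64_t x = 0;
        for (int i = 0; i < 1000000; i++)
            x += i;

        uint64_t t1 = simulated_clock();

        /* Prints elapsed=0: the measurement is fully predictable and
         * independent of the work performed.
         */
        printf("elapsed=%llu ns (x=%llu)\n",
               (unsigned long long)(t1 - t0), (unsigned long long)x);
        return 0;
    }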