Hello Dean,

> The implementation looks plausible too, though it adds quite a large
> amount of new code.

A significant part of this new code is the multiply-modulo implementation, which could be dropped if we assume that int128 is available on the target, accepting that the feature is simply not available otherwise. Also, there are quite a lot of comments, which add to the code length.
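
For reference, here is a minimal Python sketch of one standard fallback when no 128-bit type is available, shift-and-add (Russian-peasant) modular multiplication; the patch's actual C code may well do it differently:

    def mul_mod(x, y, m):
        # Shift-and-add multiplication: add x for each set bit of y,
        # reducing modulo m at every step so intermediates stay below
        # 2*m -- the property a C version without int128 relies on.
        r = 0
        x %= m
        y %= m
        while y:
            if y & 1:
                r = (r + x) % m
            x = (x + x) % m
            y >>= 1
        return r

In C this costs up to 64 iterations per multiplication, which is part of why simply requiring int128 is tempting.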

> The main thing that concerns me is justifying the code. With this
> kind of thing, it's all too easy to overlook corner cases and end up
> with trivial sequences in certain special cases. I'd feel better
> about that if we were implementing a standard algorithm with known
> pedigree.

Yep. I did not find anything convincing that meets all the requirements: generate a permutation, be parametric, have a low constant cost, be of good quality, and work on arbitrary sizes…

> Thinking about the use case for this, it seems that it's basically
> designed to turn a set of non-uniform random numbers (produced by
> random_exponential() et al.) into another set of non-uniform random
> numbers, where the non-uniformity is scattered so that the more/less
> common values aren't all clumped together.

Yes.
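
For illustration, a toy sketch of the goal (a hypothetical affine map, definitely not the patch's algorithm): i -> (a*i + b) mod size is a permutation whenever gcd(a, size) == 1, and already spreads clumped values around:

    import math
    import random

    def toy_permute(i, size, seed):
        # Derive the permutation parameters from the seed.
        rng = random.Random(seed)
        a = rng.randrange(1, size)
        while math.gcd(a, size) != 1:   # a must be coprime to size
            a = rng.randrange(1, size)
        b = rng.randrange(size)
        return (a * i + b) % size       # bijective on [0, size)

    # Exponentially distributed keys clump near 0; permuting scatters them.
    size = 1000
    keys = [min(int(random.expovariate(0.05)), size - 1) for _ in range(10)]
    print(sorted(keys))
    print(sorted(toy_permute(k, size, 42) for k in keys))

Of course an affine map preserves far too much structure (consecutive keys stay evenly spaced), which is exactly the kind of trivial special case the concern above is about, hence the more elaborate construction in the patch.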

> I'm wondering if that's something that can't be done more simply by
> passing the non-uniform random numbers through the uniform random
> number generator to scatter them uniformly across some range -- e.g.,
> given an integer n, return the n'th value from the sequence produced
> by random(), starting from some initial seed -- i.e., implement
> nth_random(lb, ub, seed, n). That would actually be pretty
> straightforward to implement in O(log(n)) time (see the attached
> python example), though it wouldn't generate a permutation, so it'd
> need a bit of thought to see if it met the requirements.

Indeed, this violates two of the requirements: constant cost, and being a permutation.
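
Still, for the record, here is a minimal sketch of how an O(log(n)) nth_random could work, assuming a plain LCG underneath (the attached example may use a different generator): skipping ahead amounts to raising the affine map x -> a*x + c (mod m) to the n'th power by square-and-multiply:

    # Illustrative 64-bit LCG constants (Knuth's MMIX values); the
    # generator choice here is an assumption, not Dean's attachment.
    A = 6364136223846793005
    C = 1442695040888963407
    M = 1 << 64

    def nth_random(lb, ub, seed, n):
        # Compose x -> A*x + C with itself n times in O(log(n)) steps.
        a, c = 1, 0        # accumulated map, starts as the identity
        pa, pc = A, C      # successive squarings of the base map
        while n > 0:
            if n & 1:
                a, c = (pa * a) % M, (pa * c + pc) % M
            pa, pc = (pa * pa) % M, (pa * pc + pc) % M
            n >>= 1
        x = (a * seed + c) % M
        return lb + x % (ub - lb + 1)   # naive range reduction, slightly biased

It scatters values uniformly, but distinct inputs can collide and the cost grows with log(n), so it is indeed neither a permutation nor constant-cost.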

--
Fabien.
