On 9/17/2023 11:48 AM, AVI GROSS via Python-list wrote:
Timing things that are fairly simple is hard enough to do repeatably, but when
it involves access to slower media, and especially network connections to
servers, the number of things that can change is enormous. There are all kinds
of caching at various levels, depending on your hardware, plus resource
contention with other programs running here and there, as well as on various
network-like structures and buses or just hard disks. As a general rule, asking
for anything to be repeated multiple times in a row can make your results seem
slower or faster depending on too many factors, including what else is running
on your machine.
I am wondering if an approach to running something N times that averages things out a bit is to simply put in a pause. Have your program wait a few minutes between attempts, and perhaps even do other things within your loop, so that the resources you want uncached have a chance to be flushed as other work takes their place.
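One way to sketch that idea: time each attempt individually and sleep between attempts so that any warm caches have a chance to be displaced by other work. The helper below is hypothetical (not from any posted code), and the workload is a cheap stand-in for whatever you are actually measuring:

```python
import statistics
import time

def timed_runs(fn, n=5, pause=0.0):
    """Time fn() n times, sleeping `pause` seconds between attempts
    so cached state has a chance to be evicted (hypothetical helper)."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
        time.sleep(pause)  # give other work a chance to displace caches/queues
    return samples

# Example: time a cheap stand-in workload with short pauses.
samples = timed_runs(lambda: sum(range(10_000)), n=3, pause=0.01)
print(f"median: {statistics.median(samples):.6f}s over {len(samples)} runs")
```

In real use the pause would be minutes rather than milliseconds, and reporting the median (or the full distribution) is usually more robust than the mean when the occasional run hits contention.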
One thing I have done for timing queries is to construct a series of
test queries with the query parameters drawn randomly from a large set
of values. The hope is that random combinations will defeat caching and
provide a reasonably realistic view of the times.
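The random-parameter idea can be sketched as follows. `run_query` here is a hypothetical stand-in for a real database or network query; the point is that drawing each parameter from a large pool makes it unlikely that successive runs hit the same cached result:

```python
import random
import time

def run_query(param):
    """Stand-in for a real database/network query (hypothetical)."""
    return sum(range(param))

# Draw query parameters randomly from a large pool so repeated runs
# rarely reuse a value, which helps defeat result caching.
pool = range(1_000, 50_000)
params = random.sample(pool, 20)

timings = []
for p in params:
    start = time.perf_counter()
    run_query(p)
    timings.append(time.perf_counter() - start)

print(f"{len(timings)} queries, mean {sum(timings) / len(timings):.6f}s")
```

Because the parameters vary, the individual timings are not directly comparable to each other, but their aggregate gives a more realistic picture of typical query cost than hammering one cached value.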
--
https://mail.python.org/mailman/listinfo/python-list