On Mon, Oct 10, 2016 at 4:57 AM, BartC <b...@freeuk.com> wrote:
> On 09/10/2016 18:33, Jussi Piitulainen wrote:
>>
>> Chris Angelico writes:
>>
>>> On Mon, Oct 10, 2016 at 12:26 AM, Dennis Lee Bieber wrote:
>>>>
>>>> {This response is delayed as I'm waiting for the program to complete
>>>> so I can get the run time}
>>>> {Well... it's been near 24 hours and still merrily scrolling sums on
>>>> my console -- so I'm going to kill the running program}
>>>
>>> Eight BILLION of them. I wouldn't bother waiting for that.
>>
>> Eight billion is a big number. I know from experience that computers
>> can compute a big number of computations in a short time. Therefore, I
>> would expect the computer to compute eight billion computations in a
>> short time. Maybe.
>
> Compute, maybe. But the OP is printing something out at each step.
> Printing to a console is very, very slow, compared even with running
> CPython.
>
> On my machine, CPython might run through an empty loop at 10 to 20M
> iterations per second, but is some 4000 times slower if it has to
> print one "A" per line at each step.
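For anyone curious, the print overhead is easy to demonstrate yourself. A minimal sketch (timing figures will vary by machine; output is captured into a StringIO buffer here so the demo doesn't flood your terminal, which actually *understates* the cost of writing to a real console):

```python
import io
import time
from contextlib import redirect_stdout

N = 1_000_000  # far fewer iterations than the OP's eight billion

# Baseline: an empty loop, just the cost of iterating.
start = time.perf_counter()
for _ in range(N):
    pass
empty = time.perf_counter() - start

# Same loop, but printing one line per step. Captured to an in-memory
# buffer; a real terminal would be slower still.
buf = io.StringIO()
start = time.perf_counter()
with redirect_stdout(buf):
    for _ in range(N):
        print("A")
printing = time.perf_counter() - start

print(f"empty loop:    {empty:.3f}s")
print(f"printing loop: {printing:.3f}s ({printing / empty:.0f}x slower)")
```

Even with the output going nowhere near a console, the print loop is many times slower than the empty one.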
Yeah. Quickest way to slow down a calculation is to naively add a
progress indicator to it. (Actually, that's not quite fair. The
quickest way is to add a time-of-day call every iteration, in a
misguided attempt to reduce console output to "one every 0.1 seconds".
That's crazy slow.)

> The OP might try redirecting the output to a file (assuming he doesn't
> want to read every single line of output as it's produced). That will
> be much faster, although it depends on the workload in the run()
> function being executed.
>
> Or only print every 1000th line or something if progress has to be
> monitored.

Yeah, if it's just for progress status. Of course, that does assume
that the run() function doesn't have anything particularly costly in
it. If it does, well, dis gonna take a while....

ChrisA
--
https://mail.python.org/mailman/listinfo/python-list
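A minimal sketch of the every-1000th-iteration approach. Note that run() here is just a placeholder for the OP's actual per-step work (which we haven't seen), and the modulo check itself costs almost nothing compared to a print:

```python
def run(i):
    """Placeholder for the OP's per-step computation (hypothetical)."""
    return i * i

N = 100_000
total = 0
for i in range(N):
    total += run(i)
    # Only emit a progress line every 1000th step, not on every step.
    if i % 1000 == 0:
        print(f"progress: {i}/{N}")
print("done, total =", total)
```

Same idea works with any stride; pick one big enough that the console stops being the bottleneck.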