Tim Peters <t...@python.org> added the comment:
Henry, no, I see no problem while running your example. It's been running on my box for over 5 minutes now, and memory use remains trivial.

Note that in the code I posted for you, instead of [1, 2] I used range(100), and instead of 50 I used a million: the same kind of thing, but with _far_ larger numbers. And still no problem, although, yes, it did consume about a gigabyte to materialize a million instances of tuple(range(100)) under the covers.

The code you posted also works fine for me, with minor memory consumption, if I replace your "50" with "1000000":

>>> many_arguments = [[1, 2] for i in range(1000000)]
>>> for term in product(*many_arguments):
...     pass

100% of a CPU is used for as long as I let it run, but memory use jumps just a little at the start. Which is what I expected: it just doesn't take all that much memory to create a million 2-tuples, which is what the product() implementation does.

I believe you saw a MemoryError, but at this point I have to guess you misdiagnosed the cause. In any case, if you can't supply an example that reproduces the problem, we're going to have to close this.

Perhaps there's some weird memory-allocation flaw on the _platform_ you're using? Extreme fragmentation? Without an example that exhibits a problem, there's just no way to guess from here :-(

----------
_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue40230>
_______________________________________
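For what it's worth, here's a small sketch (the variable names are just illustrative) of why the setup cost is bounded: product() copies each input pool into a tuple up front, but the cartesian product itself is generated lazily, one result at a time:

```python
from itertools import product

# A million 2-element pools, as in the session above.
many_arguments = [[1, 2] for i in range(1000000)]

# product() eagerly converts each pool into a tuple - about a million
# small 2-tuples here, which is the only up-front memory cost.  The
# product itself (2**1000000 results!) is never materialized; results
# are built lazily as you iterate.
it = product(*many_arguments)

# Pulling a single result is cheap: one million-element tuple, taking
# the first element from each pool.
first = next(it)
print(len(first))   # 1000000 - one element per pool
print(first[:5])    # (1, 1, 1, 1, 1)
```

So memory use is dominated by the input pools, not by the number of results, which is consistent with the small, constant footprint seen above.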