On 10/7/2014 1:01 PM, Ned Batchelder wrote:
On 10/7/14 2:10 AM, Gelonida N wrote:
A disadvantage of itertools.product() is that it makes a copy in memory.
The reason is that itertools also builds products of generators (that is,
of objects that cannot be iterated over more than once).
There are two use cases that I occasionally stumble over:
One is making the product over a list of lists (note that product() takes
the iterables as separate arguments, so the list has to be unpacked),
product(*list_of_lists)
ex:
product(*[[1, 2, 3], ['A', 'B'], ['a', 'b', 'c']])
the other is making a product over a list of functions, each of which
creates a generator,
ex (the desired usage; itertools.product does not support this directly,
see the sketch below):
product([lambda: ['A', 'B'], lambda: xrange(3)])
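For illustration, here is a minimal sketch of how the factory case can be fed
to the real itertools.product (my own example, assuming Python 3's range in
place of xrange): each factory is called once and the resulting iterables are
unpacked, although product() will still take its own shallow copy of each of
them.

  import itertools

  # hypothetical list of generator/iterable factories, as in the example above
  factories = [lambda: ['A', 'B'], lambda: range(3)]

  # call each factory once and unpack the results; product() still makes
  # its own shallow copy of each resulting iterable
  for combo in itertools.product(*(factory() for factory in factories)):
      print(combo)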
I personally would also be interested in a fast, generic solution that can
iterate through an N-dimensional array without duplicating its memory, or
that can iterate through a list of generator factories (or whatever this
would be called).
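For the N-dimensional case, one possible reading (a sketch only, assuming a
NumPy array is meant) is to take the product over the index ranges rather
than over the data itself; product() then stores only the small range
objects, never a copy of the array's contents.

  import itertools
  import numpy as np

  arr = np.arange(24).reshape(2, 3, 4)   # example array; only its shape matters here

  # iterate over every index tuple; product() copies the small range()
  # inputs, not the array's data
  for idx in itertools.product(*(range(n) for n in arr.shape)):
      value = arr[idx]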
itertools.product makes a copy of the sequences passed in, but it is a
shallow copy. It doesn't copy the objects in the sequences. It also
doesn't store the entire product.
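A small example of that behaviour (my own illustration): the inputs are read
into internal tuples as soon as product() is called, but the combinations
themselves are generated one at a time.

  import itertools

  def noisy():
      # a one-shot generator, to show when product() reads its input
      for x in [1, 2, 3]:
          print("consuming", x)
          yield x

  p = itertools.product(noisy(), 'ab')   # "consuming 1/2/3" prints here:
                                         # the inputs are shallow-copied up front
  print(next(p))                         # the 3*2 combinations are produced lazily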
If you are calling product(j, k, l, m, n), where len(j) == J, len(k) == K,
and so on, the extra memory is J+K+L+M+N, which is much smaller than the
J*K*L*M*N combinations product will produce. Are you sure that much extra
memory use is a problem? How large are the lists that you are product'ing
together?
Thanks for the clarification.
You are right. The memory consumed by a shallow copy of each iterable should
not be a problem in the cases I have encountered so far.
I don't understand your point about a list of functions that create
generators. What is the problem there?
The idea was to avoid even the shallow copy, by having a function that can
produce the generator again whenever it is needed (so no shallow copy is
required at all).
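A minimal sketch of that idea (my own code, with a hypothetical helper name):
each factory is simply called again whenever its dimension has to start over,
so no input is ever copied or stored.

  def product_from_factories(factories):
      # Cartesian product over callables that each return a fresh iterable.
      # Nothing is copied; the factories are re-invoked whenever their
      # dimension needs to restart.
      if not factories:
          yield ()
          return
      first, rest = factories[0], factories[1:]
      for item in first():
          for tail in product_from_factories(rest):
              yield (item,) + tail

  # example usage, mirroring the factories from earlier in the thread
  for combo in product_from_factories([lambda: ['A', 'B'], lambda: range(3)]):
      print(combo)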