On 12/9/2009 3:52 AM, Jorge Cardona wrote:
2009/12/8 Lie Ryan <lie.1...@gmail.com>:
First, I apologize for rearranging your message out of order.
Theoretically yes, but the semantics of generators in Python is that they
work on an Iterable (i.e. objects that have __iter__), not on a Sequence
(i.e. objects that have __getitem__). That means, semantically, a generator
calls obj.__iter__() and then calls the iterator's __next__(), performing
its operation on each item the iterator returns.
The lazy semantic would be hard to fit into the current generator model
without changing the semantics of generators to require a Sequence that
supports indexing.
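(A minimal sketch of the distinction, with a made-up class name for
illustration: a generator only ever uses the iteration protocol, so there
is no index it could slice on.)

class Countdown:
    # An Iterable: it has __iter__ but no __getitem__.
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        n = self.n
        while n > 0:
            yield n
            n -= 1

g = (x * 2 for x in Countdown(3))   # only __iter__/__next__ are used
print(list(g))                      # [6, 4, 2]
# Countdown(3)[1] would raise TypeError: there is no indexing to be lazy on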
Why?
The goal is to add a formal way to separate the transformations of a
generator into those that act on the indexing set and those that act on
the result set.
<snip code>
Well, that's just a pretty convoluted way to reverse the order of the
operations (hey, why not reverse it altogether?). Nevertheless, it still
requires a semantic change: every generator must be able to produce the
underlying stream, which is not always the case. Take this small example:
def foo(x):
    # an infinite counter; there is no finite indexing set behind it
    i = 0
    while True:
        yield i
        i += 1
What would you propose the .indexing_set to be? Surely not the same as
.result_set? How would you propose to skip over and islice such a
generator without executing the in-betweens?
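(To make the point concrete, a small sketch; the print call is only there
to make each evaluation visible:)

from itertools import islice

def count_up():
    i = 0
    while True:
        print("computed: %d" % i)   # side effect marks evaluation
        yield i
        i += 1

# islice can only skip by consuming, so 1 and 3 are still
# computed even though only 0, 2 and 4 are returned:
print(list(islice(count_up(), 0, 5, 2)))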
from functools import partial
from itertools import islice

def f(x):
    print("eval: %d" % x)
    return x

X = range(10)
g = (partial(f, x) for x in X)   # yields thunks, not evaluated values
print(list(x() for x in islice(g, 0, None, 2)))
# # or without partial (bind x early with a default argument):
# g = ((lambda x=x: f(x)) for x in X)
# print(list(f() for f in islice(g, 0, None, 2)))
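The point of both variants is that the generator yields callables rather
than values: islice still consumes every thunk, but the skipped ones are
never called, so f only runs for the items that are actually kept.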
I still have the problem here that I can't define the original generator
that way, because my function receives an already-defined generator.
In a default-strict language, you have to say explicitly when you want
lazy execution.
What I want to do is write a function that receives any kind of
generator, executes it on several cores (after a fork), and returns the
data, so I can't slice the set X before creating the generator.
Beware that a generator's contract is to return a valid iterator *once*
only. You can use itertools.tee() to create more independent iterators,
but tee buffers the results internally.
Oh, yes, I used tee at first, but then I noticed that I wasn't using the
same iterator in the same process: once the fork is made, I can use the
initial generator in the different processes without this problem, so tee
is not necessary in this case.
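(A rough, Unix-only sketch of that fork-based striding; the function name
and the print-instead-of-return simplification are mine, not from the
thread:)

import os
from itertools import islice

def run_strided(gen, nworkers):
    # Each child inherits its own copy of the generator's
    # state at fork time, so no tee() is needed.
    for rank in range(nworkers):
        if os.fork() == 0:
            # Caveat: islice still *consumes* the skipped items,
            # so each child executes the in-betweens anyway.
            for item in islice(gen, rank, None, nworkers):
                print("worker %d got %r" % (rank, item))
            os._exit(0)
    for _ in range(nworkers):
        os.wait()   # returning data to the parent would need a pipe

run_strided((x * x for x in range(10)), 2)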
You would have to change tee as well:
>>> import itertools
>>> def foo(x):
... print('eval: %s' % x)
... return x + 1
...
>>> l = [1, 2, 3, 4, 5, 6, 7]
>>> it = iter(l)
>>> it = (foo(x) for x in it)
>>> a, b = itertools.tee(it)
>>> # now on to you
>>> it_a = itertools.islice(a, 0, None, 2)
>>> it_b = itertools.islice(b, 1, None, 2)
>>> next(it_b)
eval: 1
eval: 2
3
>>> next(it_b)
eval: 3
eval: 4
5
>>> next(it_b)
eval: 5
eval: 6
7
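(For what it's worth, combining tee with the earlier partial trick
sidesteps the eager evaluation, since tee then buffers unevaluated
thunks; a small sketch:)

from functools import partial
from itertools import islice, tee

def foo(x):
    print('eval: %s' % x)
    return x + 1

l = [1, 2, 3, 4, 5, 6, 7]
thunks = (partial(foo, x) for x in l)   # nothing evaluated yet
a, b = tee(thunks)                      # tee buffers thunks, not results
it_b = islice(b, 1, None, 2)
print(next(it_b)())   # eval: 2 -> 3; foo(1) is never run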