ot;"
linked_list = LinkedList()
for i in range(1, 11):
    linked_list.append(i)
a = linked_list.index(1)
print(a.value)
b = linked_list.index(5)
print(b.value)
a.insert_after(27)
b.insert_after(45)
print(','.join(str(x) for x in linked_list))
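The LinkedList class used above is not part of the excerpt; a minimal sketch that supports the calls made here (append, index returning a node, insert_after on a node, and iteration over values) could look like the following. The implementation is an assumption for illustration, not the poster's code.

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

    def insert_after(self, value):
        node = Node(value)
        node.next = self.next
        self.next = node

class LinkedList:
    def __init__(self):
        self.head = None
        self.tail = None

    def append(self, value):
        node = Node(value)
        if self.head is None:
            self.head = self.tail = node
        else:
            self.tail.next = node
            self.tail = node

    def index(self, value):
        # Return the first node holding `value`; the node acts as a stable
        # handle that survives later insertions elsewhere in the list.
        node = self.head
        while node is not None:
            if node.value == value:
                return node
            node = node.next
        raise ValueError(f"{value} not in list")

    def __iter__(self):
        node = self.head
        while node is not None:
            yield node.value
            node = node.next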
Guenther Sohler writes:
Hi Python community,
I have a got an example list like
1, 2, 3, 4, 5, 6, 7, 8, 9, 10
     A           B
and i eventually want to insert items in the given locations
(A shall go between 2 and 3, B shall go between 6 and 7)
Right now i just use indexes (but an index only marks a
position relative to the beginning/end).
No, it's not an option to sort the indexes and start inserting from the
back.
Without explaining this criterion (and any others), suggestions can only
be guesses!
The most elegant option is not to store indexes, but list iterators, which
attach to the list element and would automatically move, especially if an element is inserted before.
Subject: Re: Python list insert iterators
On 3/3/2023 3:22 AM, Guenther Sohler wrote:
> Hi Python community,
>
> I have a got an example list like
>
> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
>      A           B
>
> and i eventually want to insert items in the given locations
back.
The most elegant option is not to store indexes, but list iterators, which
attach to the list element
and would automatically move, especially if an element is inserted before.
I could not find such functionality in python lists of [ 1,2,3 ]
Does python have such functionality ?
if yes, where can i find it ?
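For a plain Python list, which has no element-attached iterators, one workaround in the spirit of the question is to locate each insertion point by value at insert time, so the earlier insert cannot invalidate the later one. A sketch, assuming the values are unique:

lst = list(range(1, 11))
# Insert A between 2 and 3, and B between 6 and 7, finding each insertion
# point by value rather than by a stored index.
lst.insert(lst.index(3), 'A')
lst.insert(lst.index(7), 'B')
print(lst)   # [1, 2, 'A', 3, 4, 5, 6, 'B', 7, 8, 9, 10]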
Hi!
I've written a python package that lets you combine python code and shell
pipelines:
Pieshell is a Python shell environment that combines the expressiveness of
shell pipelines with the power of Python iterators.
It can be used in two major ways:
As an interactive shell replacing e.g. bash, or as an ordinary Python module.
On 25/07/2020 06:35, Random832 wrote:
On Tue, Jul 21, 2020, at 15:54, Terry Reedy wrote:
The transformers should be once-through iterators because they can be
passed once-through iterators. I suppose one could make them iterables
and add an attribute 'pristine' set to True in __init__
t; whatever, then build it - it isn't hard.
Asking to be able to restart the iteration is hardly the same thing as asking
to pass through subscripts etc... C#'s Linq functions do fine with restartable
iterables and hardly ever expose Iterators [well, IEnumerators, as they're
called]
On Thu, Jul 23, 2020, at 05:14, Peter Slížik wrote:
> > Works in what way? You can't use it in a 'for' loop if it doesn't
> > define __iter__.
> >
>
> class Iterable:
>     def __iter__(self):
>         return Iterator(...)
>
> class Iterator:
>     def __next__(self):
>         return
>
>
On Sat, Jul 25, 2020 at 4:37 AM Random832 wrote:
>
> On Tue, Jul 21, 2020, at 15:54, Terry Reedy wrote:
> > The transformers should be once-through iterators because they can be
> > passed once-through iterators. I suppose one could make them iterables
> > and add an attribute 'pristine'
On Tue, Jul 21, 2020, at 15:54, Terry Reedy wrote:
> The transformers should be once-through iterators because they can be
> passed once-through iterators. I suppose one could make them iterables
> and add an attribute 'pristine' set to True in __init__ and False in
> __
The way to get an iterator from an iterable is to call iter(iterable). If the
iterable is an iterator, iter(iterator) returns iterator. Open files
are *iterators*.
>>> file.__iter__() is file
True
This means that one can do things like
file = open(...)
for line in file:  # Section 1 of file.
    if separator(line): break
    process(line)
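A runnable version of that pattern (the file name and the blank-line separator are assumptions): because an open file is its own iterator, a second for loop resumes where the first one stopped.

with open('data.txt') as file:
    assert iter(file) is file
    for line in file:               # section 1 of the file
        if line.strip() == '':      # assumed separator: a blank line
            break
        print('part 1:', line.rstrip())
    for line in file:               # section 2 resumes after the separator
        print('part 2:', line.rstrip())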
> Works in what way? You can't use it in a 'for' loop if it doesn't
> define __iter__.
>
class Iterable:
    def __iter__(self):
        return Iterator(...)

class Iterator:
    def __next__(self):
        return
    # No __iter__ here.
    # I've just forgotten to def it.
With this setup, usi
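A runnable version of that two-class setup (names and bodies filled in as assumptions): looping over the Iterable works because the for loop calls iter() on it and then only needs __next__ on the object it gets back; looping over the bare iterator object directly would additionally require __iter__.

class CountdownIterator:
    def __init__(self, n):
        self.n = n
    def __next__(self):
        if self.n <= 0:
            raise StopIteration
        self.n -= 1
        return self.n + 1
    # Deliberately no __iter__ here.

class Countdown:
    def __init__(self, n):
        self.n = n
    def __iter__(self):
        return CountdownIterator(self.n)

for x in Countdown(3):
    print(x)                            # 3, 2, 1 -- works without __iter__ on the iterator

# for x in CountdownIterator(3): ...   # would raise TypeError: not iterable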
On Thu, Jul 23, 2020 at 5:55 PM Peter Slížik wrote:
> Moreover, some tutorial authors make it even more difficult with using the
> terms iterator and iterable interchangeably. A notorious example is this
> wiki:
> https://wiki.python.org/moin/Iterator
>
> It says:
>
> *Here is an iterator that returns a random number of 1's:*
>
> class RandomIterable:
>     def __iter__(self):
>         return self
Yes? It is indeed an iterator, since its iter method returns itself.
It is also iterable, since it has an iter method. The article goes on
to explain this. I don't think they're being
> The views are iterables. They can be iterated more than once and used in
> other operations.
>
> The transformers should be once-through iterators because they can be
> passed once-through iterators.
This is important, thank you for pointing it out.
> Python's design
A generator does not reserve storage resources for all of
the values it will return, instead performing in a 'lazy' fashion or JiT
(Just-in-time) delivery. Similarly, once a generator is "exhausted", it
terminates. It cannot be re-used, without being re-computed.
For your reading pleasure: PEP 3106 -- Revamping dict.keys(), .values() and .items()
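A small illustration of that point: a generator produces values lazily and cannot be rewound once exhausted.

def squares(n):
    for i in range(n):
        yield i * i

g = squares(3)
print(list(g))   # [0, 1, 4]
print(list(g))   # [] -- already exhausted; build a new generator to iterate again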
One could
instead pass a function to a generic 'transformer' class, but the
indirection would just make execution slower and hide the specific info
as to what the iterator is doing.
2. Why do these functions return iterators instead of iterables?
The views are iterables. They can be iterated more than once and used in other operations.
Do they differ so much from one another that it makes
sense to have individual implementations?
2. Why do these functions return iterators instead of iterables? First, I
find it confusing - to me, it is the loop's job to create an iterator from
the supplied iterable, and not the object that is being iterated over. A
On Fri, Nov 16, 2018 at 8:01 AM Steve Keller wrote:
>
> I wonder why iterators do have an __iter__() method? I thought
> iterable objects would have an __iter__() method (but no __next__())
> to create an iterator for it, and that would have the __next__()
> method
I wonder why iterators do have an __iter__() method? I thought
iterable objects would have an __iter__() method (but no __next__())
to create an iterator for it, and that would have the __next__()
method but no __iter__().
$ python3
Python 3.5.2 (default, Nov 12 2018, 13:43:14)
[GCC
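The behaviour being asked about, in a short session: iter() applied to an iterator returns the same object, which is what lets an iterator be used directly in a for loop.

nums = [1, 2, 3]
it = iter(nums)          # a list_iterator: has both __iter__ and __next__
assert iter(it) is it    # iter() on an iterator is the identity
for x in it:             # works because the for loop calls iter(it) first
    print(x)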
Hi,
I have just released Pyro4 version 4.49; https://pypi.python.org/pypi/Pyro4
(super short description: it is a library that allows you to transparently call
methods
on objects that are running on other machines, as if they were local)
Pyro now also supports remote iterators. This means you
On Thu, 29 Sep 2016 11:38 am, Tim Chase wrote:
> This seems to discard the data's origin (data1/data2/data3) which is
> how I determine whether to use process_a(), process_b(), or
> process_c() in my original example where N iterators were returned,
> one for each input iterator
>
> merged = [(1, "one A"), (1, "one B"), (1, "uno"),
> (2, "two"), (2, "dos"), (2, ("ii", "extra alpha")),
> (3, "tres x"), (3, "tres y"), (3, "tres z"),
>
On Thu, 29 Sep 2016 05:10 am, Tim Chase wrote:
> I've got several iterators sharing a common key in the same order and
> would like to iterate over them in parallel, operating on all items
> with the same key. I've simplified the data a bit here, but it would
> be something like
Tim Chase wrote:
> I've got several iterators sharing a common key in the same order and
> would like to iterate over them in parallel, operating on all items
> with the same key. I've simplified the data a bit here, but it would
> be something like
>
> data1 = [ #
Here is a slight variation of Chris A's code that does not require
more than a single look-ahead per generator. It may be better
depending on the exact data passed in.
Chris A's version will store all of the items for each output that
have a matching key, which, depending on the expected data, could be significant.
On Thu, Sep 29, 2016 at 5:10 AM, Tim Chase
wrote:
> And I'd like to do something like
>
> for common_key, d1, d2, d3 in magic_happens_here(data1, data2, data3):
>     for row in d1:
>         process_a(common_key, row)
>     for thing in d2:
>         process_b(common_key, thing)
>     for thing in d3:
>         process_c(common_key, thing)
On 9/28/2016 3:10 PM, Tim Chase wrote:
I've got several iterators sharing a common key in the same order and
would like to iterate over them in parallel, operating on all items
with the same key. I've simplified the data a bit here, but it would
be something like
data1 = [ #
I've got several iterators sharing a common key in the same order and
would like to iterate over them in parallel, operating on all items
with the same key. I've simplified the data a bit here, but it would
be something like
data1 = [ # key, data1
(1, "one A"),
On 2015-08-09 19:24, Chris Angelico wrote:
> That's exactly right. The only way for the interpreter to handle
> 'in' on an iterator is something like this:
>
> def contains(iter, obj):
>     for val in iter:
>         if val == obj: return True
>     return False
Which can nicely be written as
On 09/08/2015 12:30, Laura Creighton wrote:
Maybe add something about this here?
https://docs.python.org/2/tutorial/classes.html#iterators
Laura
Better still https://docs.python.org/3/tutorial/classes.html#iterators
--
My fellow Pythonistas, ask not what our language can do for you, ask
what you can do for our language.
On 09/08/2015 14:11, Chris Angelico wrote:
On Sun, Aug 9, 2015 at 11:09 PM, Tim Chase wrote:
On 2015-08-09 19:24, Chris Angelico wrote:
That's exactly right. The only way for the interpreter to handle
'in' on an iterator is something like this:
def contains(iter, obj):
for val in iter:
On Sun, Aug 9, 2015 at 11:09 PM, Tim Chase wrote:
> On 2015-08-09 19:24, Chris Angelico wrote:
>> That's exactly right. The only way for the interpreter to handle
>> 'in' on an iterator is something like this:
>>
>> def contains(iter, obj):
>>     for val in iter:
>>         if val == obj: return
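What 'in' on an iterator amounts to, and its one-line equivalent with any(); note that the membership test consumes the iterator up to the match.

def contains(it, obj):
    return any(val == obj for val in it)

it = iter(range(10))
print(contains(it, 3))   # True, but 0 through 3 have now been consumed
print(list(it))          # [4, 5, 6, 7, 8, 9]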
Maybe add something about this here?
https://docs.python.org/2/tutorial/classes.html#iterators
Laura
> "x in y is true if some value z with x == z is produced while
> iterating over y" led me to think that "0 in y" would be true.
You're almost right. The significance here is that once you've
partially iterated over y, the value z will not be produced while
iterating over y - so *at that point in time*, y does not contain 0.
> The trap you're seeing here is that iterating over an iterator always
> consumes it, but mentally, you're expecting this to be iterating over
> a new instance of the same sequence.
No, I just tried to apply what I read in the docs :
1. I have y = A(10) which is an instance of a class which does not define __contains__() but does define __iter__().
You call next() on it, which
consumes values. But the description covers all *iterables*, and the
caveat applies only to *iterators*. Compare:
class A:
    def __init__(self, x):
        self.x = x
    def __iter__(self):
        counter = -1
        while counter < self.x:
            counter += 1
            yield counter
; assert False, '%s not found' %i
>
> You're dragging values from the same iterator, so you're consuming it
> as part of your membership test. You can do this kind of thing:
>
> >>> 5 in A(10)
> True
>
> but if you've already consumed a few values, those won't be in the
> iterator any more:
True
but if you've already consumed a few values, those won't be in the
iterator any more:
>>> x = A(10)
>>> next(x)
0
>>> next(x)
1
>>> next(x)
2
>>> next(x)
3
>>> 2 in x
False
This is simply how iterators work. They're very different from sequences like lists.
The documentation at
https://docs.python.org/3.5/reference/expressions.html#not-in says :
"For user-defined classes which do not define __contains__() but do define
__iter__(), x in y is true if some value z with x == z is produced while
iterating over y. If an exception is raised during the iteration, it is as if in raised that exception.
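The documented behaviour in a small example: a class that defines __iter__ but not __contains__ supports 'in' by iterating a fresh iterator each time, so repeated membership tests work.

class Evens:
    def __init__(self, limit):
        self.limit = limit
    def __iter__(self):
        return iter(range(0, self.limit, 2))

print(4 in Evens(10))   # True  -- found while iterating
print(5 in Evens(10))   # False -- iteration finished without a match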
Hi again,
Thanks for your input; I'm starting to use generators to some extent.
Say we have a series of numbers:
x = randn(100)
and values beyond some criterion should be considered outliers, but
only where there are at most 3 (or some other integer) consecutive values
beyond the criterion.
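One way to express that rule with itertools.groupby: flag values beyond a threshold, then keep only runs of consecutive flagged values no longer than the chosen limit. The threshold, the run limit and the use of numpy's randn are assumptions for illustration.

from itertools import groupby
from numpy.random import randn

x = randn(100)
threshold = 2.0
max_run = 3

flags = [abs(v) > threshold for v in x]
outlier_idx = []
pos = 0
for is_outlier, run in groupby(flags):
    run = list(run)
    if is_outlier and len(run) <= max_run:
        outlier_idx.extend(range(pos, pos + len(run)))
    pos += len(run)

print(outlier_idx)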
Ian Kelly wrote:
> On Wed, Dec 24, 2014 at 1:34 PM, Rustom Mody
> wrote:
>> +1 for the slice in succinct form
>
> Not only more succinct but also more correct. The purpose of islice is to
> slice arbitrary iterables as opposed to just sequences. But this function
> requires a reentrant iterable
On Wed, Dec 24, 2014 at 1:34 PM, Rustom Mody wrote:
> +1 for the slice in succinct form
Not only more succinct but also more correct. The purpose of islice is to
slice arbitrary iterables as opposed to just sequences. But this function
requires a reentrant iterable anyway and returns garbage if you pass it an iterator.
On Wednesday, December 24, 2014 8:42:32 PM UTC+5:30, Vito De Tullio wrote:
> Seb wrote:
>
> > >>> def n_grams(a, n):
> > ...     z = (islice(a, i, None) for i in range(n))
> > ...     return zip(*z)
> > ...
> >
> > I'm impressed at how succinctly this islice helps to build a list of
> > tuples with indices for all the required windows.
Terry Reedy writes:
> On 12/23/2014 4:25 PM, Ben Finney wrote:
> > To be clear: there's nothing about parentheses that produce a
> > generator expression.
>
> Incorrect; parentheses *are* a part of 'generator expression'.
> From the doc:
> generator_expression ::= "(" expression comp_for ")"
Seb wrote:
> >>> def n_grams(a, n):
> ...     z = (islice(a, i, None) for i in range(n))
> ...     return zip(*z)
> ...
>
> I'm impressed at how succinctly this islice helps to build a list of
> tuples with indices for all the required windows.
If you want it succinctly, there is this variation o
On 12/23/2014 4:25 PM, Ben Finney wrote:
Ian Kelly writes:
On Tue, Dec 23, 2014 at 11:55 AM, Seb wrote:
Particulary, what do the parentheses do there?
The parentheses enclose a generator expression, which is similar to a
list comprehension [1] but produce a generator, which is a type of
iterator, rather than a list.
Ian Kelly writes:
> On Tue, Dec 23, 2014 at 11:55 AM, Seb wrote:
> > Particulary, what do the parentheses do there?
>
> The parentheses enclose a generator expression, which is similar to a
> list comprehension [1] but produce a generator, which is a type of
> iterator, rather than a list.
To be clear: there's nothing about parentheses that produce a
generator expression.
On 12/23/2014 1:55 PM, Seb wrote:
>>> def n_grams(a, n):
...     z = (islice(a, i, None) for i in range(n))
...     return zip(*z)
I'm impressed at how succinctly this islice helps to build a list of
tuples with indices for all the required windows. However, I'm not
quite following what goes on
On Tue, 23 Dec 2014 12:23:45 -0700,
Ian Kelly wrote:
> The parentheses enclose a generator expression, which is similar to a
> list comprehension [1] but produce a generator, which is a type of
> iterator, rather than a list.
> In much the same way that a list comprehension can be expanded out to a for loop
On Tue, Dec 23, 2014 at 11:55 AM, Seb wrote:
>
> Hi,
>
> I'm fairly new to Python, and while trying to implement a custom sliding
> window operation for a pandas Series, I came across a great piece of
> code¹:
>
> >>> def n_grams(a, n):
> ...     z = (islice(a, i, None) for i in range(n))
> ...
on-language-features-and-tricks-you-may-not-know.html#sliding-windows-n-grams-using-zip-and-iterators
--
Seb
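The n_grams trick from the thread, together with the caveat raised above: it works on a re-iterable sequence but not on a one-shot iterator, because the islices then all consume the same underlying iterator.

from itertools import islice

def n_grams(a, n):
    z = (islice(a, i, None) for i in range(n))
    return zip(*z)

print(list(n_grams([1, 2, 3, 4, 5], 3)))
# [(1, 2, 3), (2, 3, 4), (3, 4, 5)]

print(list(n_grams(iter([1, 2, 3, 4, 5]), 3)))
# [] -- the three islices all pulled from the same one-shot iterator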
Excellent, thank you.
http://bugs.python.org/issue14573
-alfred
On Thu, Oct 2, 2014 at 4:05 PM, Alfred Morgan wrote:
> On Wednesday, October 1, 2014 3:55:19 PM UTC-7, Chris Angelico wrote:
>> At some point, you'll have to port your patch to the latest codebase
>
> Okay, done.
>
> https://github.com/Zectbumo/cpython/compare/master
>
On Wednesday, October 1, 2014 3:55:19 PM UTC-7, Chris Angelico wrote:
> At some point, you'll have to port your patch to the latest codebase
Okay, done.
https://github.com/Zectbumo/cpython/compare/master
Iterators for JSON is now Python 3 ready.
On Thu, Oct 2, 2014 at 8:01 AM, Alfred Morgan wrote:
> On Wednesday, October 1, 2014 6:07:23 AM UTC-7, Chris Angelico wrote:
>> On Wed, Oct 1, 2014 at 8:13 PM, Alfred Morgan wrote:
>> > What do you think now?
>>
>> I think that you're adding features to Python 2.7, which isn't getting
>> new features.
On Wednesday, October 1, 2014 6:07:23 AM UTC-7, Chris Angelico wrote:
> On Wed, Oct 1, 2014 at 8:13 PM, Alfred Morgan wrote:
> > What do you think now?
>
> I think that you're adding features to Python 2.7, which isn't getting
> new features. That won't be merged into trunk. Does your patch apply
On Wed, Oct 1, 2014 at 8:13 PM, Alfred Morgan wrote:
> I added a stream flag (off by default) and also added file streaming (thanks
> for the idea).
>
> https://github.com/Zectbumo/cpython/compare/2.7
>
> What do you think now?
I think that you're adding features to Python 2.7, which isn't getting new features.
> It would be better to raise an error than to silently encode the file as if it were a
> list of strings. So it should not be the default behavior. That said,
> it sounds like it could be made easier to enable streaming from
> iterators as an option for those cases where it's desired.
I added a stream flag (off by default) and also added file streaming (thanks for the idea).
On Mon, Sep 29, 2014 at 7:19 PM, wrote:
> I would like to add the ability to JSONEncode large iterators. Right now
> there is no way to do this without modifying the code.
>
> The JSONEncoder.default() doc string suggests to do this:
> For example, to support arbitrary
I would like to add the ability to JSONEncode large iterators. Right now there
is no way to do this without modifying the code.
The JSONEncoder.default() doc string suggests to do this:
For example, to support arbitrary iterators, you could
implement default like this
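For reference, the approach the JSONEncoder.default() docstring describes amounts to something like the following (the subclass name is illustrative); note that it materializes the whole iterator in memory, which is exactly why a streaming option is being requested.

import json

class IterEncoder(json.JSONEncoder):
    def default(self, o):
        try:
            iterable = iter(o)
        except TypeError:
            pass
        else:
            return list(iterable)   # buffers the entire iterator
        return super().default(o)

print(json.dumps({'squares': (i * i for i in range(3))}, cls=IterEncoder))
# {"squares": [0, 1, 4]}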
Ned Batchelder wrote:
> On 1/25/14 1:37 AM, seasp...@gmail.com wrote:
>> take the following as an example, which could work well.
>> But my concern is, will list 'l' be deconstructed after function return?
>> and then iterator point to nowhere?
>>
>> def test():
>> l = [1, 2, 3, 4, 5, 6, 7, 8
On 1/25/14 1:37 AM, seasp...@gmail.com wrote:
take the following as an example, which could work well.
But my concern is, will list 'l' be deconstructed after function return? and
then iterator point to nowhere?
def test():
    l = [1, 2, 3, 4, 5, 6, 7, 8]
    return iter(l)
def main():
On Fri, 24 Jan 2014 22:37:37 -0800, seaspeak wrote:
> take the following as an example, which could work well. But my concern
> is, will list 'l' be deconstructed after function return? and then
> iterator point to nowhere?
That would be a pretty awful bug for Python, since it would likely lead to
On Sat, Jan 25, 2014 at 5:37 PM, wrote:
> take the following as an example, which could work well.
> But my concern is, will list 'l' be deconstructed after function return? and
> then iterator point to nowhere?
>
> def test():
>     l = [1, 2, 3, 4, 5, 6, 7, 8]
>     return iter(l)
> def main()
take the following as an example, which could work well.
But my concern is, will list 'l' be deconstructed after function return? and
then iterator point to nowhere?
def test():
    l = [1, 2, 3, 4, 5, 6, 7, 8]
    return iter(l)

def main():
    for i in test():
        print(i)
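The reason this works: the iterator returned by iter(l) holds a reference to the list, so the list is kept alive for as long as the iterator exists, regardless of the function having returned. A small check (the exact reference count is implementation-dependent):

import sys

def test():
    l = [1, 2, 3, 4, 5, 6, 7, 8]
    it = iter(l)
    print(sys.getrefcount(l))   # the iterator contributes one of these references
    return it

it = test()
print(list(it))                 # [1, 2, 3, 4, 5, 6, 7, 8] -- the list was kept alive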
> I think there are two aspects to your idea:
> 1. collections that share a single type
> 2. accessing multiple elements via a common interface
You are correct, and I now regret posing them in a coupled manner.
> Both are things that should be considered and I think both are useful in
> some contexts.
port both). Additionally, "typed" lists/iterators will allow
improved code analysis and optimization. The PyPy people have
already stated that they are working on implementing different
strategies for lists composed of a single type, so clearly there is
already community movement in this direction.
On 12/16/2011 1:05 PM, Roy Smith wrote:
A common pattern in the code I'm working with now is functions that end in:
return [Foo(x) for x in bunch_of_x_thingies]
When something goes amiss and I want to debug the problem, I often
transform that into:
temp = [Foo(x) for x in bunch_of_x_thingies]
logger.debug(temp)
Roy Smith writes:
> When something goes amiss and I want to debug the problem, I often
> transform that into:
>
> temp = [Foo(x) for x in bunch_of_x_thingies]
> logger.debug(temp)
> return temp
>
> It would be convenient to be able to get at and log the intermediate
> value without
Nathan Rice, 16.12.2011 19:51:
Nothing stops me from implementing it, in fact it is VERY trivial to
wrap member class methods onto a list subclass, and wrap functions to
support vectorized behavior. The problem is that as soon as you hit
anything outside your code that returns a list or iterator
Chris Angelico wrote:
> It's no more strange than the way some people omit the u from colour. :)
Bonum Petronio Arbiteri, bonum mihi.
Mel.
I don't know that it belongs as a method on lists, especially since
that means you can't do the same thing on, for example, generators.
Or I guess, the easy answer to that lattermost objection is to make it
typed iterators rather than typed lists.
-- Devin
On Fri, Dec 16, 2011 a
't give you access to all the
>> intermediate values, and tends to be pretty awful to read.
>>
>> Having "typed" lists let you take preexisting string/int/etc methods
>> and expose them in a vectorized context and provides an easy way for
>> developers to support both vectors and scalars in a single function.
On Sat, Dec 17, 2011 at 5:38 AM, Arnaud Delobelle wrote:
> On 16 December 2011 18:25, Chris Angelico wrote:
>
>> tee = lambda func,arg: (func(arg),arg)[1]
>
> What a strange way to spell it!
>
> def tee(func, arg):
>     func(arg)
>     return arg
I started with that version and moved to the lambda.
On 16 December 2011 18:25, Chris Angelico wrote:
> tee = lambda func,arg: (func(arg),arg)[1]
What a strange way to spell it!
def tee(func, arg):
    func(arg)
    return arg
--
Arnaud
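Either spelling of tee() supports the pattern Roy describes: log an intermediate value without breaking the expression into a temporary plus a return. The build() function and the doubling transform below are illustrative stand-ins for the Foo(x) list comprehension in the thread.

import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def tee(func, arg):
    func(arg)
    return arg

def build(bunch_of_x_thingies):
    # logs the list and still returns it in a single expression
    return tee(logger.debug, [x * 2 for x in bunch_of_x_thingies])

print(build([1, 2, 3]))         # [2, 4, 6]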
On Sat, Dec 17, 2011 at 5:05 AM, Roy Smith wrote:
> Most of this was TL:DNR, but I will admit I often wish for a better way
> to log intermediate values. For example, a common pattern in the code
> I'm working with now is functions that end in:
>
> return [Foo(x) for x in bunch_of_x_thingies]
>
(you could easily "fix" other people's functions dynamically to
support both). Additionally, "typed" lists/iterators will allow
improved code analysis and optimization. The PyPy people have already
stated that they are working on implementing different strategies for
lists composed of a single type, so clearly there is already community
movement in this direction.
In article ,
Nathan Rice wrote:
> I'd like to hear people's thoughts on the subject. Currently we are
> throwing away useful information in many cases that could be used for
> code analysis, optimization and simpler interfaces.
Most of this was TL:DNR, but I will admit I often wish for a better way
to log intermediate values.
yped" lists let you take preexisting string/int/etc methods
and expose them in a vectorized context and provides an easy way for
developers to support both vectors and scalars in a single function
(you could easily "fix" other people's functions dynamically to
support both). Additionally, "typed" lists/iterators will allow improved code analysis and optimization.
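A tiny sketch of the "vectorized methods on a typed list" idea being discussed: attribute access on the wrapper broadcasts a method call over every element. This is only an illustration of the concept, not a proposal-quality implementation.

class Vectorized(list):
    def __getattr__(self, name):
        def broadcast(*args, **kwargs):
            return Vectorized(getattr(item, name)(*args, **kwargs) for item in self)
        return broadcast

words = Vectorized(['alpha', 'beta', 'gamma'])
print(words.upper())            # ['ALPHA', 'BETA', 'GAMMA']
print(words.upper().title())    # chained: ['Alpha', 'Beta', 'Gamma']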
On Thu, Oct 20, 2011 at 3:22 AM, Stefan Behnel wrote:
> Steven D'Aprano, 20.10.2011 10:04:
>>
>> Using Python 3, are range_iterator objects thread-safe?
>>
>> I have tried this, and it seems to be safe:
>>
>> >>> from threading import Thread
>> >>> x = iter(range(4))
>> >>> def doit(x):
>> ...
Ben Finney, 20.10.2011 13:23:
Stefan Behnel writes:
Steven D'Aprano, 20.10.2011 10:04:
Using Python 3, are range_iterator objects thread-safe?
The GIL ensures it's thread safe.
The GIL applies only to CPython.
and PyPy.
What is the answer for other Python
implementations which don't have a GIL?
Stefan Behnel writes:
> Steven D'Aprano, 20.10.2011 10:04:
> > Using Python 3, are range_iterator objects thread-safe?
> The GIL ensures it's thread safe.
The GIL applies only to CPython. What is the answer for other Python
implementations which don't have a GIL?
--
\ Eccles: “I just sa
Steven D'Aprano, 20.10.2011 10:04:
Using Python 3, are range_iterator objects thread-safe?
I have tried this, and it seems to be safe:
>>> from threading import Thread
>>> x = iter(range(4))
>>> def doit(x):
... print("result =", next(x))
...
>>> threads = [Thread(target=doit, args=(x,)) for i in range(4)]
Using Python 3, are range_iterator objects thread-safe?
I have tried this, and it seems to be safe:
>>> from threading import Thread
>>> x = iter(range(4))
>>> def doit(x):
... print("result =", next(x))
...
>>> threads = [Thread(target=doit, args=(x,)) for i in range(4)]
>>> for t in threads:
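A runnable version of the experiment (with joins added so the output appears before the interpreter exits): four threads each pull one value from a shared range iterator. In CPython each next() call is atomic under the GIL, so every value is delivered exactly once; as the thread notes, other implementations may behave differently.

from threading import Thread

x = iter(range(4))

def doit(it):
    print('result =', next(it))

threads = [Thread(target=doit, args=(x,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()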
On Wed, Jun 22, 2011 at 12:28 PM, Neal Becker wrote:
> AFAICT, the python iterator concept only supports readable iterators, not
> write.
> Is this true?
>
> for example:
>
> for e in sequence:
>     do something that reads e
>     e = blah  # will do nothing
>
> I be
>> list-ish or dictionary-ish items". Then ... the OP would write, e.g.:
>>
>> for elem in sequence with index:
>>     ...
>>     sequence[index] = newvalue
>>
>>which of course calls the usual container.__setitem__. In this
>>case the "new p
the OP would write, e.g.:
>
> for elem in sequence with index:
>     ...
>     sequence[index] = newvalue
>
>which of course calls the usual container.__setitem__. In this
>case the "new protocol" is to have iterators define a function
>that returns not j
(I apologize for the length of this article -- if I had more time,
I could write something shorter...)
In article
Neal Becker wrote:
>AFAICT, the python iterator concept only supports readable iterators,
>not write.
>Is this true?
>
>for example:
>
>for e in sequence:
>> All sorts of bugs would occur from people reassigning to the loop variable,
>> forgetting that it had a side-effect of also reassigning to the iterable.
>> Fortunately, Python is not that badly designed.
>
> The example syntax is a non-starter, but there's nothing wrong with
> the basic idea
e loop variable,
> forgetting that it had a side-effect of also reassigning to the iterable.
> Fortunately, Python is not that badly designed.
The example syntax is a non-starter, but there's nothing wrong with
the basic idea. The STL of C++ uses output iterators and a quick
Google search
[Sorry for over-quoting, I am not sure how to trim this properly]
Steven D'Aprano wrote:
> On Thu, 23 Jun 2011 09:30 am Thomas 'PointedEars' Lahn wrote:
>> Mel wrote:
>>> Steven D'Aprano wrote:
>>>> I *guess* that what you mean by "writabl
I wouldn't relate it to Perl's foreach, I would say, although the behaviour
may be the same in some respects.
On Wednesday, June 22, 2011 4:10:39 PM UTC-7, Neal Becker wrote:
> AFAIK, the above is the only python idiom that allows iteration over a
> sequence
> such that you can write to the sequence. And THAT is the problem. In many
> cases, indexing is much less efficient than iteration.
Well, if yo
On Thu, 23 Jun 2011 09:30 am Thomas 'PointedEars' Lahn wrote:
> Mel wrote:
>
>> Steven D'Aprano wrote:
>>> I *guess* that what you mean by "writable iterators" is that rebinding e
>>> should change seq in place, i.e. you would expect that s
On Thu, 23 Jun 2011 09:10 am Neal Becker wrote:
> Steven D'Aprano wrote:
>
>> On Wed, 22 Jun 2011 15:28:23 -0400, Neal Becker wrote:
>>
>>> AFAICT, the python iterator concept only supports readable iterators,
>>> not write. Is this true?
>>
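Python's stock answer to "writable iterators" is to iterate with enumerate() and assign through the index, since rebinding the loop variable never touches the sequence:

seq = [1, 2, 3, 4, 5]

for e in seq:
    e = e * 10              # rebinds the name e only; seq is unchanged

for i, e in enumerate(seq):
    seq[i] = e * 10         # writes back into the sequence

print(seq)                  # [10, 20, 30, 40, 50]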