On 16.07.2011 05:42, Steven D'Aprano wrote:
You are right - it is a very big step for a very small functionality.
Or you can look at the various recipes on the Internet for writing tail-like
file viewers in Python, and solve the problem the boring old fashioned way.
It is not only about thi
On 16Jul2011 13:42, Steven D'Aprano wrote:
| Billy Mays wrote:
| > I was thinking that a convenient solution to this problem would be to
| > introduce a new Exception call PauseIteration, which would signal to the
| > caller that there is no more data for now, but not to close down the
| > generator entirely.
On Sat, Jul 16, 2011 at 1:42 PM, Steven D'Aprano wrote:
> Okay, we've come up with the solution of a new exception, PauseIteration,
> that the iterator protocol will recognise. Now we have to:
>
> - write an implementation or patch adding that functionality;
> - and add it to our own personal build
Billy Mays wrote:
> I was thinking that a convenient solution to this problem would be to
> introduce a new Exception call PauseIteration, which would signal to the
> caller that there is no more data for now, but not to close down the
> generator entirely.
It never fails to amuse me how often pe
On 7/15/2011 10:42 AM, Billy Mays wrote:
On 07/15/2011 10:28 AM, Thomas Rachel wrote:
On 15.07.2011 14:52, Billy Mays wrote:
Really what would be useful is some sort of PauseIteration Exception
which doesn't close the generator when raised, but indicates to the
looping header that there is no more data for now.
On 7/15/2011 8:26 AM, Billy Mays wrote:
On 07/15/2011 04:01 AM, bruno.desthuilli...@gmail.com wrote:
On Jul 14, 9:46 pm, Billy Mays wrote:
I noticed that if a file is being continuously written to, the file
generator does not notice it:
def getLines(f):
    lines = []
    for line in f:
        lines.append(line)
    return lines
Billy Mays wrote:
A sentinel does provide a workaround, but it also passes the problem
onto the caller rather than the callee.
The callee can easily take care of it -- just block until more is ready.
If blocking is not an option, then the caller has to deal with it no
matter how the callee is implemented.
On 15.07.2011 16:42, Billy Mays wrote:
A sentinel does provide a workaround, but it also passes the problem
onto the caller rather than the callee:
That is right.
BTW, there is another, maybe easier way to do this:
for line in iter(f.readline, ''):
    do_stuff(line)
This provides an iterator.
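The two-argument form iter(callable, sentinel) calls the callable repeatedly and stops as soon as it returns the sentinel. It is easy to try on an in-memory file (io.StringIO here is just for demonstration):

```python
import io

buf = io.StringIO("one\ntwo\nthree\n")
# iter(callable, sentinel): call buf.readline() repeatedly and stop
# as soon as it returns the sentinel '' (i.e. EOF).
lines = list(iter(buf.readline, ''))
print(lines)  # -> ['one\n', 'two\n', 'three\n']
```

Unlike iterating the file object directly, this goes through readline() on every step, so a later call can pick up data appended after the first EOF.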
On 07/15/2011 10:28 AM, Thomas Rachel wrote:
On 15.07.2011 14:52, Billy Mays wrote:
Also, in the python docs, file.next() mentions there
being a performance gain for using the file generator (iterator?) over
the readline function.
Here, the question is if this performance gain is really relevant AKA "feelable".
On 15.07.2011 14:52, Billy Mays wrote:
Also, in the python docs, file.next() mentions there
being a performance gain for using the file generator (iterator?) over
the readline function.
Here, the question is if this performance gain is really relevant AKA
"feelable". The file object seems to
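Whether the gain is "feelable" can be checked with a quick micro-benchmark. This is only a sketch using an in-memory file; the line length and repeat counts are arbitrary:

```python
import io
import timeit

text = ("x" * 60 + "\n") * 10000

def iterate():
    # iterate the file object directly (uses internal read-ahead buffering)
    for line in io.StringIO(text):
        pass

def readline_loop():
    # call readline() explicitly until the empty-string EOF sentinel
    f = io.StringIO(text)
    while f.readline():
        pass

# Absolute numbers are machine-dependent; only the comparison matters.
print(timeit.timeit(iterate, number=10))
print(timeit.timeit(readline_loop, number=10))
```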
On 15.07.2011 14:26, Billy Mays wrote:
I was thinking that a convenient solution to this problem would be to
introduce a new Exception call PauseIteration, which would signal to the
caller that there is no more data for now, but not to close down the
generator entirely.
Alas, an exception thrown from inside a generator closes it.
On Fri, Jul 15, 2011 at 10:52 PM, Billy Mays wrote:
> Really what would be useful is some sort of PauseIteration Exception which
> doesn't close the generator when raised, but indicates to the looping header
> that there is no more data for now.
>
All you need is a sentinel yielded value (eg None).
On 07/15/2011 08:39 AM, Thomas Rachel wrote:
On 14.07.2011 21:46, Billy Mays wrote:
I noticed that if a file is being continuously written to, the file
generator does not notice it:
Yes. That's why there were alternative suggestions in your last thread
"How to write a file generator".
To repeat mine: an object which is not an iterator
On 14.07.2011 21:46, Billy Mays wrote:
I noticed that if a file is being continuously written to, the file
generator does not notice it:
Yes. That's why there were alternative suggestions in your last thread
"How to write a file generator".
To repeat mine: an object which is not an iterator
On 07/15/2011 04:01 AM, bruno.desthuilli...@gmail.com wrote:
On Jul 14, 9:46 pm, Billy Mays wrote:
I noticed that if a file is being continuously written to, the file
generator does not notice it:
def getLines(f):
    lines = []
    for line in f:
        lines.append(line)
    return lines
On Jul 14, 9:46 pm, Billy Mays wrote:
> I noticed that if a file is being continuously written to, the file
> generator does not notice it:
>
> def getLines(f):
>     lines = []
>     for line in f:
>         lines.append(line)
>     return lines
what's wrong with file.readlines()?
Billy Mays writes:
> Is there any way to just create a new generator that clears its
> `closed` status?
You can define getLines in terms of the readline file method, which does
return new data when it is available.
def getLines(f):
    lines = []
    while True:
        line = f.readline()
        if not line:
            break
        lines.append(line)
    return lines
On 7/14/2011 3:46 PM, Billy Mays wrote:
I noticed that if a file is being continuously written to, the file
generator does not notice it:
Because it does not look, as Ian explained.
def getLines(f):
    lines = []
    for line in f:
        lines.append(line)
    return lines
This nearly duplicates .readlines,
On 07/14/2011 04:00 PM, Ian Kelly wrote:
On Thu, Jul 14, 2011 at 1:46 PM, Billy Mays wrote:
def getLines(f):
    lines = []
    for line in f:
        lines.append(line)
    return lines

with open('/var/log/syslog', 'rb') as f:
    lines = getLines(f)
    # do some processing with lines
#
On Thu, Jul 14, 2011 at 1:46 PM, Billy Mays wrote:
> def getLines(f):
>     lines = []
>     for line in f:
>         lines.append(line)
>     return lines
>
> with open('/var/log/syslog', 'rb') as f:
>     lines = getLines(f)
>     # do some processing with lines
>     # /var/log/syslog gets updated in
I noticed that if a file is being continuously written to, the file
generator does not notice it:
def getLines(f):
    lines = []
    for line in f:
        lines.append(line)
    return lines

with open('/var/log/syslog', 'rb') as f:
    lines = getLines(f)
    # do some processing with lines