Re: closure = decorator?

2013-10-11 Thread Jussi Piitulainen
Roy Smith writes:
> In article ,
>  Piet van Oostrum wrote:
> 
> > I usually say that a closure is a package, containing a function
> > with some additional data it needs. The data usually is in the
> > form of name bindings.
> 
> That's pretty close to the way I think about it.  The way it was
> originally described to me is, "A closure is a function bundled up
> with its arguments".

Really? It should be more like "a function bundled up with some other
function's arguments" and even more like "a function bundled up with
bindings for its free variables".

And the data that makes a function a closure is bindings always, by
definition, not just usually.
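A minimal sketch of that last definition (the names here are illustrative):

```python
def make_adder(n):
    # n is a free variable of add: it is neither a local nor an
    # argument of add, so the closure carries a binding for it.
    def add(x):
        return x + n
    return add

add_five = make_adder(5)
print(add_five(10))                            # 15
# The binding lives in __closure__ (func_closure in Python 2):
print(add_five.__closure__[0].cell_contents)   # 5
```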

> To make a real-life analogy, let's say you're modeling a parking
> garage.  I want to be able to walk up to the attendant and say,
> "Please bring my car around front at 5 O'Clock.  It's that one"
> (pointing to the slightly dented green Ford in spot 37).  So, you've
> got a class:
> 
> class DeliveryRequest:
>def __init__(self, spot, time):
>   self.spot = spot
>   self.time = time
> 
> Now, over the course of the day, the garage attendants shuffle cars
> around to make room and retrieve vehicles that packed in the back.
> Comes 5 O'Clock, what vehicle do you want the attendant to deliver
> to the front?  The one that was in spot 37 at the time you made the
> request, or the one that's in spot 37 at 5 O'Clock?
> 
> Unless you want somebody else's car (perhaps you'd like something
> better than a slightly dented Ford), you want the attendant to
> capture the current state of spot 37 and remember that until 5
> O'Clock when it's time to go get the car, no matter where it happens
> to be right now.
> 
> That's a closure.

I fail to see a closure here. I see a class. I see an implied object
that could as well be dict(spot=37, time=5). Other entities (garage
and attendants) are not made sufficiently explicit.

There's another, more widely used word that means object. There's
another, more widely used word that means state. (I'm referring to the
words "object" and "state".) I see no need to use "closure" to mean
object or state, especially when the term has a more specific
established meaning.
-- 
https://mail.python.org/mailman/listinfo/python-list


Unicode Objects in Tuples

2013-10-11 Thread Stephen Tucker
I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.

I have four questions:

1. Why is it that
 print unicode_object
displays non-ASCII characters in the unicode object correctly, whereas
 print (unicode_object, another_unicode_object)
displays non-ASCII characters in the unicode objects as escape sequences
(as repr() does)?

2. Given that this is actually *deliberately *the case (which I, at the
moment, am finding difficult to accept), what is the neatest (that is, the
most Pythonic) way to get non-ASCII characters in unicode objects in tuples
displayed correctly?

3. A similar thing happens when I write such objects and tuples to a file
opened by
 codecs.open ( ..., "utf-8")
I have also found that, even though I use  write  to send the text to the
file, unicode objects not in tuples get their non-ASCII characters sent to
the file correctly, whereas, unicode objects in tuples get their characters
sent to the file as escape sequences. Why is this the case?

4. As for question 1 above, I ask here also: What is the neatest way to get
round this?

Stephen Tucker.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: I am never going to complain about Python again

2013-10-11 Thread Joshua Landau
On 11 October 2013 03:08, Steven D'Aprano
 wrote:
> Your mistake here seems to be that you're assuming that if two numbers
> are equal, they must be in the same domain, but that's not the case.
> (Perhaps you think that 0.0 == 0+0j should return False?) It's certainly
> not the case when it comes to types in Python, and it's not even the case
> in mathematics. Given:
>
> x ∈ ℝ, x = 2  (reals)
> y ∈ ℕ, y = 2  (natural numbers)
>
> we have x = y, but since 1/y is undefined (there is no Natural number
> 1/2), 1/x != 1/y.

Surely 1/y is perfectly well defined, as only y, not 1/y, is
constrained to the natural numbers.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Terry Reedy

On 10/10/2013 11:13 PM, Cameron Simpson wrote:

On 11Oct2013 02:55, Steven D'Aprano  
wrote:



def undecorate(f):
 """Return the undecorated inner function from function f."""
 return f.func_closure[0].cell_contents


Whereas this feels like black magic. Is this portable to any decorated
function? If so, I'd have hoped it was in the stdlib. If not: black magic.


And in use:

py> f(100)
201
py> undecorate(f)(100)
200


All lovely, provided you can convince me that undecorate() is robust.
(And if you can, I'll certainly be filing it away in my funcutils
module for later use.)


It only works if the decorator returns a closure with the original 
function as the first member (of func_closure). Often true, but not at 
all a requirement.
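A sketch of the failure mode (decorator and function names invented for illustration). The wrapper below closes over two names; in CPython the closure cells are stored in alphabetical order of variable name, so cell 0 holds the call counter, not the original function:

```python
def undecorate(f):
    """Return the undecorated inner function from f (fragile!)."""
    return f.__closure__[0].cell_contents   # func_closure in Python 2

def counting(func):
    calls = [0]                  # extra free variable of wrapper
    def wrapper(x):
        calls[0] += 1
        return func(x)
    return wrapper

@counting
def double(x):
    return 2 * x

print(double(21))                # 42
print(undecorate(double))        # [1] -- the counter, not double!
```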


--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Terry Reedy

On 10/11/2013 12:36 AM, Steven D'Aprano wrote:


I also like Terry Reedy's suggestion of having the decorator
automatically add the unwrapped function to the wrapped function as an
attribute:

def decorate(func):
 @functools.wraps(func)
 def inner(arg):
 blah blah
 inner._unwrapped = func  # make it public if you prefer
 return inner

which makes it all nice and clean and above board. (I seem to recall a
proposal to have functools.wraps do this automatically...)


The explicit attribute can also be set by class rather than closure 
based decorators.


--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list



Re: Unicode Objects in Tuples

2013-10-11 Thread Ben Finney
Stephen Tucker  writes:

> I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.

Python 2 is not as good at Unicode as Python 3. In fact, one of the
major reasons to switch to Python 3 is that it fixes Unicode behaviour
that was worse in Python 2.

> I have four questions:
>
> 1. Why is it that
[…]
>  print (unicode_object, another_unicode_object)
> displays non-ASCII characters in the unicode objects as escape sequences
> (as repr() does)?

Python 3 behaves correctly for that::

>>> foo = "I ♡ Unicode"
>>> bar = "I ♥ Unicode"
>>> (type(foo), type(bar))
(<class 'str'>, <class 'str'>)
>>> print(foo)
I ♡ Unicode
>>> print(bar)
I ♥ Unicode
>>> print((foo, bar))
('I ♡ Unicode', 'I ♥ Unicode')

> 2. Given that this is actually *deliberately *the case (which I, at
> the moment, am finding difficult to accept)

I'm pretty sure it is not, since this is corrected in Python 3.

> what is the neatest (that is, the most Pythonic) way to get non-ASCII
> characters in unicode objects in tuples displayed correctly?

Switch to Python 3 :-)

-- 
 \   “Timid men prefer the calm of despotism to the boisterous sea |
  `\of liberty.” —Thomas Jefferson |
_o__)  |
Ben Finney

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Terry Reedy

On 10/11/2013 4:17 AM, Terry Reedy wrote:

On 10/10/2013 11:13 PM, Cameron Simpson wrote:

On 11Oct2013 02:55, Steven D'Aprano
 wrote:



def undecorate(f):
 """Return the undecorated inner function from function f."""
 return f.func_closure[0].cell_contents


Whereas this feels like black magic. Is this portable to any decorated
function? If so, I'd have hoped it was in the stdlib. If not: black
magic.


And in use:

py> f(100)
201
py> undecorate(f)(100)
200


All lovely, provided you can convince me that undecorate() is robust.
(And if you can, I'll certainly be filing it away in my funcutils
module for later use.)


It only works if the decorator returns a closure with the original
function as the first member (of func_closure). Often true, but not at
all a requirement.


Another standard decorator method is to write a class with a .__call__ 
method and attach the original function to instances as an attribute. 
(Indeed, decorators were borrowed from class-happy Java ;-). But there 
is no standard as to what the function attribute of instances is called. 
The OP's request for accessing the function without modifying the tested 
code cannot be met in general. One must have access to the tested code.
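A sketch of that class-based style; since there is no standard name, the attribute here is arbitrarily called `original`:

```python
import functools

class Deco:
    """Class-based decorator: instances are callable and carry the
    original function as a plain attribute."""
    def __init__(self, func):
        functools.update_wrapper(self, func)  # copy __name__, __doc__, etc.
        self.original = func                  # attribute name is arbitrary
    def __call__(self, x):
        return self.original(x) + 1

@Deco
def f(x):
    return 2 * x

print(f(100))            # 201
print(f.original(100))   # 200
```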



--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Peter Cacioppi
On Thursday, October 10, 2013 11:01:25 PM UTC-7, Peter Cacioppi wrote:
> Could someone give me a brief thumbnail sketch of the difference between
> multi-threaded programming in Python and Java.
>
> I have a fairly sophisticated algorithm that I developed as both a single
> threaded and multi-threaded Java application. The multi-threading port was
> fairly simple, partly because Java has a rich library of thread safe data
> structures (AtomicInteger, BlockingQueue, PriorityBlockingQueue, etc).
>
> There is quite a significant performance improvement when multithreading here.
>
> I'd like to port the project to Python, partly because Python is a better
> language (IMHO) and partly because Python plays well with Amazon Web
> Services.
>
> But I'm a little leery that things like the Global Interpreter Lock will block
> the multithreading efficiency, or that a relative lack of concurrent off the
> shelf data structures will make things much harder.
>
> Any advice much appreciated. Thanks.

I should add that the computational heavy lifting is done in a third party 
library. So a worker thread looks roughly like this (there is a subtle race 
condition I'm glossing over).

while len(jobs):
    job = jobs.pop()
    model = Model(job)      # Model is the py interface for a lib written in C
    newJobs = model.solve() # This will take a long time
    for newJob in newJobs:
        jobs.add(newJob)

Here jobs is a thread safe object that is shared across each worker thread. It 
holds a priority queue of jobs that can be solved in parallel. 

Model is a py class that provides the API to a 3rd party library written in C. 
I know model.solve() will be the bottleneck operation for all but trivial 
problems. 

So, my hope is that the GIL restrictions won't be problematic here. That is to 
say, I don't need **Python** code to ever run concurrently. I just need Python 
to allow a different Python worker thread to execute when all the other worker 
threads are blocking on the model.solve() task. Once the algorithm is in full 
swing, it is typical for all the worker threads to be blocking on 
model.solve() at the same time. 

It's a nice algorithm for high level languages. Java worked well here; I'm 
hoping py can be nearly as fast with much more elegant and readable code.
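A runnable sketch of the worker pattern above using only the stdlib. `Model` here is a toy stand-in for the C-backed solver (a real extension would release the GIL inside solve()); the job values are invented for illustration:

```python
import threading
import queue

jobs = queue.PriorityQueue()   # thread-safe, like Java's PriorityBlockingQueue
solved = []                    # list.append is atomic in CPython

class Model:
    """Stand-in for the C-library interface; solve() spawns smaller jobs."""
    def __init__(self, job):
        self.job = job
    def solve(self):
        prio, n = self.job
        return [(prio + 1, n - 1)] if n > 0 else []

def worker():
    # Note the race mentioned above: a thread may see an empty queue and
    # exit while another thread is still generating jobs. It is benign
    # here only because the generating thread keeps looping itself.
    while True:
        try:
            job = jobs.get_nowait()
        except queue.Empty:
            return
        for new_job in Model(job).solve():
            jobs.put(new_job)
        solved.append(job)
        jobs.task_done()

jobs.put((0, 3))
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(solved))   # 4 jobs solved: (0,3), (1,2), (2,1), (3,0)
```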





-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Chris Angelico
On Fri, Oct 11, 2013 at 7:41 PM, Peter Cacioppi
 wrote:
> So, my hope is that the GIL restrictions won't be problematic here. That is 
> to say, I don't need **Python** code to ever run concurrently. I just need 
> Python to allow a different Python worker thread to execute when all the 
> other worker threads are blocking on the model.solve() task. Once the 
> algorithm is in full swing, it is typical for all the worker threads should 
> be blocking on model.Solve() at the same time.

Sounds like Python will serve you just fine! Check out the threading
module, knock together a quick test, and spin it up!

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Peter Otten
Stephen Tucker wrote:

> I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.
> 
> I have four questions:
> 
> 1. Why is it that
>  print unicode_object
> displays non-ASCII characters in the unicode object correctly, whereas
>  print (unicode_object, another_unicode_object)
> displays non-ASCII characters in the unicode objects as escape sequences
> (as repr() does)?
> 
> 2. Given that this is actually *deliberately *the case (which I, at the
> moment, am finding difficult to accept), what is the neatest (that is, the
> most Pythonic) way to get non-ASCII characters in unicode objects in
> tuples displayed correctly?

"correct" being a synonym for "as I expect" ;)
 
> 3. A similar thing happens when I write such objects and tuples to a file
> opened by
>  codecs.open ( ..., "utf-8")
> I have also found that, even though I use  write  to send the text to the
> file, unicode objects not in tuples get their non-ASCII characters sent to
> the file correctly, whereas, unicode objects in tuples get their
> characters sent to the file as escape sequences. Why is this the case?
> 
> 4. As for question 1 above, I ask here also: What is the neatest way to
> get round this?

I'll second Ben's recommendation of Python 3:

>>> t = "mäßig", "müßig", "nötig"
>>> print(t)
('mäßig', 'müßig', 'nötig')
>>> print(*t)
mäßig müßig nötig
>>> print(*t, sep=", ")
mäßig, müßig, nötig

All three variants also work with files. Example:

>>> with open("tmp.txt", "w") as outstream:
... print(*t, file=outstream)
... 
>>> open("tmp.txt").read()
'mäßig müßig nötig\n'


Should you decide to stick with Python 2 you can use a helper function:

>>> t = u"mäßig", u"müßig", u"nötig"
>>> def pretty_tuple(t, sep=u", "):
... return sep.join(map(unicode, t))
... 
>>> print pretty_tuple(t)
mäßig, müßig, nötig
>>> print pretty_tuple(t, sep=u" ")
mäßig müßig nötig


-- 
https://mail.python.org/mailman/listinfo/python-list


Re: I am never going to complain about Python again

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 09:17:37 +0100, Joshua Landau wrote:

> On 11 October 2013 03:08, Steven D'Aprano
>  wrote:
>> Your mistake here seems to be that you're assuming that if two numbers
>> are equal, they must be in the same domain, but that's not the case.
>> (Perhaps you think that 0.0 == 0+0j should return False?) It's
>> certainly not the case when it comes to types in Python, and it's not
>> even the case in mathematics. Given:
>>
>> x ∈ ℝ, x = 2  (reals)
>> y ∈ ℕ, y = 2  (natural numbers)
>>
>> we have x = y, but since 1/y is undefined (there is no Natural number
>> 1/2), 1/x != 1/y.
> 
> Surely 1/y is perfectly well defined, as only y, not 1/y, is constrained
> to the natural numbers.

Context is important, and usually implied. 1/y within the natural numbers 
is treated in the same way as sqrt(-1) within the reals. Try it on your 
calculator, and chances are very good you'll get an error. Try it in 
Python 2, or nearly any other programming language (but not Python 3), 
and again, chances are you'll get an error.

If you implicitly decide to promote entities, then of course you can 
promote y to a real then take the invoice. But that trick still doesn't 
work for the original example, int(0.0) == int(0+0j) because promoting 0 
to complex doesn't help, you have to demote 0+0j to real and that's 
ambiguous.
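In Python terms the ambiguity shows up as an outright error; a quick sketch:

```python
print(int(0.0))    # 0 -- float to int is well defined

try:
    int(0 + 0j)    # complex to int: which part? Python refuses to guess
except TypeError as e:
    print("refused:", type(e).__name__)   # refused: TypeError
```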


-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Ned Batchelder

On 10/11/13 4:16 AM, Stephen Tucker wrote:

I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.

I have four questions:

1. Why is it that
 print unicode_object
displays non-ASCII characters in the unicode object correctly, whereas
 print (unicode_object, another_unicode_object)
displays non-ASCII characters in the unicode objects as escape 
sequences (as repr() does)?


2. Given that this is actually /deliberately /the case (which I, at 
the moment, am finding difficult to accept), what is the neatest (that 
is, the most Pythonic) way to get non-ASCII characters in unicode 
objects in tuples displayed correctly?


3. A similar thing happens when I write such objects and tuples to a 
file opened by

 codecs.open ( ..., "utf-8")
I have also found that, even though I use  write  to send the text to 
the file, unicode objects not in tuples get their non-ASCII characters 
sent to the file correctly, whereas, unicode objects in tuples get 
their characters sent to the file as escape sequences. Why is this the 
case?


4. As for question 1 above, I ask here also: What is the neatest way 
to get round this?


Stephen Tucker.



Although Python 3 is better than Python 2 at Unicode, as the others have 
said, the most important point is one that you hit upon yourself.


When you print an object x, you are actually printing str(x).  The str() 
of a tuple is a paren, followed by the repr()'s of its elements, 
separated by commas, then a closing paren.  Tuples and lists use the 
repr() of their elements when producing either their own str() or their 
own repr().


Python 3 does better at this because repr() in Python 3 will gladly 
include non-ASCII characters in its output, while Python 2 will only 
include ASCII characters, and so must resort to escape sequences. (BTW: 
if you like the ASCII-only idea from Python 2, Python 3 has the ascii() 
function and the %a string formatting directive for that very purpose.)


The two string representation alternatives str() and repr() can be 
confusing.  Think of it as: str() is for customers, repr() is for 
developers, or: str() is for humans, repr() is for geeks.   The reason 
tuples use the repr() of their elements is that the parens+commas 
representation of a tuple is geeky to begin with, so it uses repr() of 
its elements, even for str(tuple).


The way to avoid repr() for the elements is to format the tuple yourself.
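A quick Python 3 illustration of that split, with ascii() playing the role of Python 2's ASCII-only repr():

```python
s = "caf\u00e9"          # "café"
print(str(s))            # café
print(repr(s))           # 'café'   (Python 3 repr keeps non-ASCII)
print(ascii(s))          # 'caf\xe9' (what Python 2 repr would show)
print((s,))              # ('café',) -- the tuple uses repr() of its elements
print(", ".join((s, s))) # café, café -- formatting it yourself avoids repr()
```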

--Ned.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 10:14:29 +0300, Jussi Piitulainen wrote:

> Roy Smith writes:
>> In article ,
>>  Piet van Oostrum wrote:
>> 
>> > I usually say that a closure is a package, containing a function with
>> > some additional data it needs. The data usually is in the form of
>> > name bindings.
>> 
>> That's pretty close to the way I think about it.  The way it was
>> originally described to me is, "A closure is a function bundled up with
>> its arguments".
> 
> Really? It should be more like "a function bundled up with some other
> function's arguments" and even more like "a function bundled up with
> bindings for its free variables".

Closures have nothing to do with *arguments*. A better definition of a 
closure is that it is a function together with a snapshot of the 
environment it was called from.

def func(arg):
y = arg + 1
def inner():
return y + 1000
return inner

f = func(1)

At this point, f is a closure. It needs to know the value of y (not the 
argument to func) in order to work, and the implementation is to store 
that information inside f.func_closure (or f.__closure__ in Python 3). 
The part of the calling environment which is saved is y:

py> f.func_closure[0].cell_contents
2


> And the data that makes a function a closure is bindings always, by
> definition, not just usually.

Its not just *any* bindings though, it is specifically bindings to 
variables in the environment from which it was called.


[...]
>> That's a closure.
> 
> I fail to see a closure here. I see a class. I see an implied object
> that could as well be dict(spot=37, time=5). Other entities (garage and
> attendants) are not made sufficiently explicit.

In general, anything you can do with a closure, you can do with an object 
explicitly recording whatever state you want. A closure is just one 
implementation of "callable object with state that can be set when you 
create it". The closure f defined above could instead be written as:

class Func:
def __init__(self, arg):
self.y = arg + 1
def __call__(self):
return self.y + 1000

f = Func(1)


Which is better? If you want to expose the value of y to the outside 
world to modify, the class solution is better. If you don't, the closure 
is better. Closures tend to be more compact, and I suspect more 
efficient, but there's nothing you can do with one you can't do with the 
other.



-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 17:53:02 +1100, Cameron Simpson wrote:

> Other Python implementations may be more aggressive. I'd suppose Jython
> could multithread like Java, but really I have no experience with them.

Neither Jython nor IronPython have a GIL.


> The standard answer with CPython is that if you want to use multiple
> cores to run Python code (versus using Python code to orchestrate native
> code) you should use the multiprocessing stuff to fork the interpreter,
> and then farm out jobs using queues.

Note that this really only applies to CPU-bound tasks. For tasks that 
depend on file IO (reading and writing files), CPython threads will 
operate in parallel as independently and (almost) as efficiently as those 
in other languages. That is to say, they will be constrained by the 
underlying operating system's ability to do file IO, not by the number of 
cores in your CPU.
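A minimal sketch of that point: CPython releases the GIL around blocking file I/O, so threads doing reads overlap (the temp files below are just scaffolding for the demo):

```python
import os
import tempfile
import threading

def read_file(path, results, i):
    with open(path, "rb") as f:
        results[i] = len(f.read())   # the read releases the GIL

# Create four small files to read concurrently.
paths = []
for _ in range(4):
    fd, p = tempfile.mkstemp()
    os.write(fd, b"x" * 1000)
    os.close(fd)
    paths.append(p)

results = [0] * len(paths)
threads = [threading.Thread(target=read_file, args=(p, results, i))
           for i, p in enumerate(paths)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)   # [1000, 1000, 1000, 1000]

for p in paths:
    os.remove(p)
```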


-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list



Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread David
On 11 October 2013 12:27, Steven D'Aprano
 wrote:
> On Fri, 11 Oct 2013 00:25:27 +1100, Chris Angelico wrote:
>
>> On Fri, Oct 11, 2013 at 12:09 AM, Roy Smith  wrote:
>>> BTW, one of the earliest things that turned me on to Python was when I
>>> discovered that it uses j as the imaginary unit, not i.  All
>>> right-thinking people will agree with me on this.
>>
>> I've never been well-up on complex numbers; can you elaborate on this,
>> please? All I know is that I was taught that the square root of -1 is
>> called i, and that hypercomplex numbers include i, j, k, and maybe even
>> other terms, and I never understood where j comes from. Why is Python
>> better for using j?
>
> Being simple souls and not Real Mathematicians, electrical engineers get
> confused by the similarity between I (current) and i (square root of -1),
> so they used j instead.
[...]
> 

No, electrical engineers need many symbols for current for the same reason
that eskimos need many words for snow :) [*]

[*] https://en.wikipedia.org/wiki/Eskimo_words_for_snow
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Gilles Lenfant
Cameron, Steven, Ben, Ned, Terry, Roy. Many thanks for this interesting 
discussion.

I ended up... mixing some of the solutions provided by your hints:

* Adding an "__original__" attribute to the wrapper func in the decorators of 
my own
* Playing with "func_closure" to test functions/methods provided by 3rd party 
tools

Cheers and thanks again for taking time to help me.

-- 
Gilles Lenfant
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: I am never going to complain about Python again

2013-10-11 Thread Chris Angelico
On Fri, Oct 11, 2013 at 8:11 PM, Steven D'Aprano
 wrote:
> If you implicitly decide to promote entities, then of course you can
> promote y to a real then take the invoice.

Either you're channelling Bugs Bunny or you're trying to sell me
something... you mean "take the inverse", I assume, here :)

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Nobody
On Thu, 10 Oct 2013 14:12:36 +, Grant Edwards wrote:

> Nope.  "i" is electical current (though it's more customary to use upper
> case).

"I" is steady-state current (either AC or DC), "i" is small-signal
current.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Oscar Benjamin
On 11 October 2013 10:35, David  wrote:
> On 11 October 2013 12:27, Steven D'Aprano
>  wrote:
>> On Fri, 11 Oct 2013 00:25:27 +1100, Chris Angelico wrote:
>>
>>> On Fri, Oct 11, 2013 at 12:09 AM, Roy Smith  wrote:
 BTW, one of the earliest things that turned me on to Python was when I
 discovered that it uses j as the imaginary unit, not i.  All
 right-thinking people will agree with me on this.
>>>
>>> I've never been well-up on complex numbers; can you elaborate on this,
>>> please? All I know is that I was taught that the square root of -1 is
>>> called i, and that hypercomplex numbers include i, j, k, and maybe even
>>> other terms, and I never understood where j comes from. Why is Python
>>> better for using j?
>>
>> Being simple souls and not Real Mathematicians, electrical engineers get
>> confused by the similarity between I (current) and i (square root of -1),
>> so they used j instead.
> [...]
>> 
>
> No, electrical engineers need many symbols for current for the same reason
> that eskimos need many words for snow :) [*]

There are many other letters in the Roman alphabet to choose from
though. In particular the study of complex numbers and the choice of i
for sqrt(-1) predates most of the study of electricity and the use of
I to denote current (it was previously called C in English texts).
Obviously I understand that that's all history and once conventions
are so widely adopted it's pointless to change them but it's good to
have common notation for the elementary parts of maths. If someone
tried to explain why their field couldn't use π for the circumference
of a unit circle I would suggest that they adjust the other parts of
their notation, not π (there are other uses of π).

Truthfully I've now spent more time with engineers than
physicists/mathematicians and find it natural to switch between i and
j depending on who I'm talking to and what I'm talking about. It's
still confusing for students though when I switch between conventions
to use whichever is standard for a given subject.


Oscar
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Jussi Piitulainen
Steven D'Aprano writes:
> On Fri, 11 Oct 2013 10:14:29 +0300, Jussi Piitulainen wrote:
> > Roy Smith writes:
> >> In article ,
> >>  Piet van Oostrum wrote:
> >> 
> >> > I usually say that a closure is a package, containing a
> >> > function with some additional data it needs. The data usually
> >> > is in the form of name bindings.
> >> 
> >> That's pretty close to the way I think about it.  The way it was
> >> originally described to me is, "A closure is a function bundled
> >> up with its arguments".
> > 
> > Really? It should be more like "a function bundled up with some
> > other function's arguments" and even more like "a function bundled
> > up with bindings for its free variables".
> 
> Closures have nothing to do with *arguments*. A better definition of
> a closure is that it is a function together with a snapshot of the
> environment it was called from.

Well, first, I was only trying to see something good in Piet's and
Roy's formulations.

Second, it's precisely not (a snapshot of) the environment where the
function is *called* from, it's (a snapshot of) the environment where
the function was *created* in. This is the whole *point*.

Third, to be even more pedantic, in the context where I think closures
originally appeared as an innovation, all local variables are bound by
a lambda. There the (non-global) free variables of a function *are*
arguments of *another* function. I can expand on this if you like, but
it will be in terms of another language, and not terribly relevant to
this discussion anyway.

> def func(arg):
> y = arg + 1
> def inner():
> return y + 1000
> return inner
> 
> f = func(1)
> 
> At this point, f is a closure. It needs to know the value of y (not
> the argument to func) in order to work, and the implementation is to
> store that information inside f.func_closure (or f.__closure__ in
> Python 3).  The part of the calling environment which is saved is y:
> 
> py> f.func_closure[0].cell_contents
> 2

Whether there is a y in the *calling* environment or not is
*irrelevant*.

   >>> (lambda y : func(1))('whatever')()
   1002

> > And the data that makes a function a closure is bindings always,
> > by definition, not just usually.
> 
> Its not just *any* bindings though, it is specifically bindings to
> variables in the environment from which it was called.

In the environment where it was created.

> [...]
> >> That's a closure.
> > 
> > I fail to see a closure here. I see a class. I see an implied
> > object that could as well be dict(spot=37, time=5). Other entities
> > (garage and attendants) are not made sufficiently explicit.
> 
> In general, anything you can do with a closure, you can do with an
> object explicitly recording whatever state you want. A closure is
> just one implementation of "callable object with state that can be
> set when you create it". The closure f defined above could instead
> be written as:
> 
> class Func:
> def __init__(self, arg):
> self.y = arg + 1
> def __call__(self):
> return self.y + 1000
> 
> f = Func(1)
> 
> Which is better? If you want to expose the value of y to the outside
> world to modify, the class solution is better. If you don't, the
> closure is better. Closures tend to be more compact, and I suspect
> more efficient, but there's nothing you can do with one you can't do
> with the other.

Sure.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Franck Ditter
In article <5257c3dd$0$29984$c3e8da3$54964...@news.astraweb.com>,
 Steven D'Aprano  wrote:

> On Fri, 11 Oct 2013 10:14:29 +0300, Jussi Piitulainen wrote:
> 
> > Roy Smith writes:
> >> In article ,
> >>  Piet van Oostrum wrote:
> >> 
> >> > I usually say that a closure is a package, containing a function with
> >> > some additional data it needs. The data usually is in the form of
> >> > name bindings.
> >> 
> >> That's pretty close to the way I think about it.  The way it was
> >> originally described to me is, "A closure is a function bundled up with
> >> its arguments".
> > 
> > Really? It should be more like "a function bundled up with some other
> > function's arguments" and even more like "a function bundled up with
> > bindings for its free variables".
> 
> Closures have nothing to do with *arguments*. A better definition of a 
> closure is that it is a function together with a snapshot of the 
> environment it was called from.
> 
> def func(arg):
> y = arg + 1
> def inner():
> return y + 1000
> return inner
> 
> f = func(1)

Maybe a better example of closure would be (just for the nonlocal) :

def fib() :
(a,b) = (0,1)
def producer() :
nonlocal a,b # Python 3
old = a
(a,b) = (b,a+b)
return old
return producer

>>> f = fib()
>>> [f() for i in range(10)]
[0, 1, 1, 2, 3, 5, 8, 13, 21, 34]

> At this point, f is a closure. It needs to know the value of y (not the 
> argument to func) in order to work, and the implementation is to store 
> that information inside f.func_closure (or f.__closure__ in Python 3). 
> The part of the calling environment which is saved is y

Shouldn't it be the (a,b) pair here? But:

>>> f.__closure__[0].cell_contents  # access to what?
55

Shouldn't cell_contents keep the current (a,b) pair, as part of the snapshot of
the creation environment (the private variables of the closure)?
Instead it seems to return only a (which is the next production)...
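A quick experiment (Python 3) suggests what is going on: the closure holds one cell per free variable, so both a and b are in __closure__ — index 0 just happens to be a's cell on its own, not an (a,b) pair:

```python
def fib():
    a, b = 0, 1
    def producer():
        nonlocal a, b
        old = a
        a, b = b, a + b
        return old
    return producer

f = fib()
print([f() for i in range(10)])                  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(f.__code__.co_freevars)                    # ('a', 'b') -- two free variables
print([c.cell_contents for c in f.__closure__])  # one cell each: current a and b
```

After ten calls the cells hold 55 and 89, so the "55" seen above is simply a's cell, with b's sitting next to it in `__closure__[1]`.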

franck
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Jussi Piitulainen
Oscar Benjamin writes:

> tried to explain why their field couldn't use π for the
> circumference of a unit circle I would suggest that they adjust the
> other parts of their notation, not π (there are other uses of π).

There's τ for the full circle; π is used for half the circumference.


-- 
https://mail.python.org/mailman/listinfo/python-list


Re: I am never going to complain about Python again

2013-10-11 Thread Neil Cerutti
On 2013-10-11, Steven D'Aprano  wrote:
> On Thu, 10 Oct 2013 17:48:16 +, Neil Cerutti wrote:
>
>> >>> 5.0 == abs(3 + 4j)
>>  False
>
> Did you maybe accidentally rebind abs? If not, what version of
> Python are you using?

Honestly, I think I got my Python term and my Vim term mixed up.
I shall not post technical stuff while working on other things.

-- 
Neil Cerutti
-- 
https://mail.python.org/mailman/listinfo/python-list


Process pending Tk events from GObject main loop?

2013-10-11 Thread Skip Montanaro
I know I have things bassackwards, but trying to process Gtk events
from Tkinter's main loop using after() isn't working. (I suspect our
underlying C++ (ab)use of Gtk may require a Gtk main loop). I'd like
to process Tk events periodically from a GObject main loop. I know I
want to call gobject.idle_add(), but can't find Tkinter equivalent to
gobject.MainLoop().get_context().iteration(). That is, how do you
process a single event (or all pending events)? It looks like
_tkinter.dooneevent() might be the right call to process a single
event, but how do I tell no more events are pending?

Thx,

Skip
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Is this the room for an argument?

2013-10-11 Thread Roy Smith
In article <5223ac4a-783e-405d-84a4-239070b66...@googlegroups.com>,
 John Ladasky  wrote:

> On Thursday, October 10, 2013 5:07:11 PM UTC-7, Roy Smith wrote:
> > I'd like an argument, please.
> 
> 'Receptionist' (Rita Davies) - Yes, sir?
> 'Man' (Michael Palin) - I'd like to have an argument please.
> 'Receptionist' - Certainly sir, have you been here before...?
> 'Man' - No, this is my first time.
> 'Receptionist' - I see. Do you want to have the full argument, or were you 
> thinking of taking the course?
> 'Man' - Well, what would be the cost?
> 'Receptionist' - Yes, it's one pound for a five-minute argument, but only 
> eight pounds for a course of ten.
> 'Man' - Well, I think it's probably best if I start with the five-minute one 
> and see how it goes from there. OK?
> 'Receptionist' - Fine - I'll see who's free at the moment...Mr.Du-Bakey's 
> free, but he's a little bit concilliatory...Yes, try Mr.Barnard - Room 12.
> 'Man' - Thank you.
> 
> :^)

Well, that was half of the joke.  I'm waiting to see if anybody gets the 
other half.
-- 
https://mail.python.org/mailman/listinfo/python-list


ANN: CUI text editor Kaa 0.0.4

2013-10-11 Thread Atsuo Ishimoto
Hi,

I've just released Kaa 0.0.4 to PyPI.

https://pypi.python.org/pypi/kaaedit/

Kaa is an easy yet powerful text editor for the console user interface,
providing numerous features like

- Macro recording.
- Undo/Redo.
- Multiple windows/frames.
- Syntax highlighting.
- Open source software (MIT).

Kaa is written in Python 3.3, so you can easily customize many
aspects of Kaa with simple Python scripts.

Please take a look at http://kaaedit.github.io for screenshots and
installation instructions.

Regards,
-- 
Atsuo Ishimoto
Mail: ishim...@gembook.org
Twitter: atsuoishimoto
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Stephen Tucker
A quick reply to all you contributors (by the way, I was not expecting to
get so many responses so quickly - I am, as you probably realise, new to
this kind of thing).

I am stuck with Python 2.X because ESRI's ArcGIS system uses it -
otherwise, yes, you're all right, I would be in Python 3.X like a shot! So
that rules out any answers to my questions that involve Python 3.X. (Sorry,
perhaps I should have mentioned that at the outset - as I say, I'm new to
all this.)

ESRI compound the problem, actually, by making all the strings that the
ArcGIS Python interface delivers (from MS SQLServer) Unicode! (I suppose,
on reflection, they have no choice.) So I am stuck with the worst of both
worlds - a generation of Python (2.X) that is inept at handling Unicode,
on an operating system (MS Windows 7) that is not much better, and being
flooded with Unicode strings from my users' databases! Anything you can
come up with to ease all this (like, "convert *all* your strings to unicode
as soon as you can and render them as ASCII as late as you can") has
already been of help.

On the original question, well, I accept Ned's answer (at 10.22). I also
like the idea of a helper function given by Peter Otten at 09.51. It still
seems like a crutch to help poor old Python 2.X to do what any programmer
(or, at least, programmers like me :-) ) thinks it ought to be able to do
by itself. The distinction between the "geekiness" of a tuple compared with
the "non-geekiness" of a string is, itself, far too geeky for my liking.
The distinction seems an utterly spurious - even artificial or arbitrary -
one to me. (Sorry about the rant.)




On Fri, Oct 11, 2013 at 10:22 AM, Ned Batchelder wrote:

>  On 10/11/13 4:16 AM, Stephen Tucker wrote:
>
> I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.
>
> I have four questions:
>
> 1. Why is it that
>  print unicode_object
>  displays non-ASCII characters in the unicode object correctly, whereas
>   print (unicode_object, another_unicode_object)
>  displays non-ASCII characters in the unicode objects as escape sequences
> (as repr() does)?
>
>  2. Given that this is actually *deliberately *the case (which I, at the
> moment, am finding difficult to accept), what is the neatest (that is, the
> most Pythonic) way to get non-ASCII characters in unicode objects in tuples
> displayed correctly?
>
>  3. A similar thing happens when I write such objects and tuples to a file
> opened by
>  codecs.open ( ..., "utf-8")
>  I have also found that, even though I use  write  to send the text to the
> file, unicode objects not in tuples get their non-ASCII characters sent to
> the file correctly, whereas, unicode objects in tuples get their characters
> sent to the file as escape sequences. Why is this the case?
>
>  4. As for question 1 above, I ask here also: What is the neatest way to
> get round this?
>
>  Stephen Tucker.
>
>
> Although Python 3 is better than Python 2 at Unicode, as the others have
> said, the most important point is one that you hit upon yourself.
>
> When you print an object x, you are actually printing str(x).  The str()
> of a tuple is a paren, followed by the repr()'s of its elements, separated
> by commas, then a closing paren.  Tuples and lists use the repr() of their
> elements when producing either their own str() or their own repr().
>
> Python 3 does better at this because repr() in Python 3 will gladly
> include non-ASCII characters in its output, while Python 2 will only
> include ASCII characters, and so must resort to escape sequences.  (BTW: if
> you like the ASCII-only idea from Python 2, Python 3 has the ascii()
> function and the %a string formatting directive for that very purpose.)
>
> The two string representation alternatives str() and repr() can be
> confusing.  Think of it as: str() is for customers, repr() is for
> developers, or: str() is for humans, repr() is for geeks.   The reason
> tuples use the repr() of their elements is that the parens+commas
> representation of a tuple is geeky to begin with, so it uses repr() of its
> elements, even for str(tuple).
>
> The way to avoid repr() for the elements is to format the tuple yourself.
>
> --Ned.
>
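Ned's str()/repr() distinction is easy to check directly. A small sketch (Python 3 shown, since its repr keeps non-ASCII characters where Python 2's escapes them; ascii() reproduces the Python 2-style form):

```python
t = ("23", 23)
print(str(t))     # ('23', 23) -- the string element keeps its repr() quotes
print(repr("é"))  # 'é'        -- Python 3 repr keeps the non-ASCII character
print(ascii("é")) # '\xe9'     -- ascii() gives the escaped, Python 2-style form
```

The quotes around '23' inside the tuple are exactly the "repr of the elements" behaviour Ned describes: without them you could not tell ("23", 23) from (23, 23).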
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Roy Smith
In article ,
 Oscar Benjamin  wrote:

> If someone tried to explain why their field couldn't use ð for the 
> circumference of a unit circle I would suggest that they adjust the 
> other parts of their notation, not ð (there are other uses of ð).

Pi is wrong:

http://www.youtube.com/watch?v=jG7vhMMXagQ
-- 
https://mail.python.org/mailman/listinfo/python-list


calculating download speed from .pcap file

2013-10-11 Thread patrick

hi,

I'm looking for a way to calculate the download speed for an HTTP
connection inside my .pcap file,

but doing even a simple read with dpkt doesn't really work.

import dpkt

pcapReader = dpkt.pcap.Reader(open("http-download.pcap", "rb"))
for ts, data in pcapReader:
    print ts, len(data)
    eth = dpkt.ethernet.Ethernet(data)
    print eth


According to this howto: 
http://jon.oberheide.org/blog/2008/10/15/dpkt-tutorial-2-parsing-a-pcap-file/
it should output something readable, but instead I get ASCII art, nothing 
readable.


ts and len(data) work as expected: the first is the timestamp and the 
second the packet length.


Any idea what's wrong?



I've had some progress with scapy when working with ICMP, but when 
reading the TCP sequence numbers the output differs from wireshark/tcpdump. 
Posted it here: 
http://thread.gmane.org/gmane.comp.security.scapy.general/4952
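On the download-speed part of the question, ts and len(data) alone get you most of the way. A rough sketch, with synthetic (timestamp, frame) pairs standing in for the pcap reader's output:

```python
def throughput_bps(packets):
    """Rough throughput in bits/second from (timestamp, frame) pairs."""
    times = [ts for ts, data in packets]
    total_bytes = sum(len(data) for ts, data in packets)
    duration = max(times) - min(times)
    return total_bytes * 8 / duration if duration else float("inf")

# synthetic stand-in for what dpkt.pcap.Reader yields:
pkts = [(0.0, b"x" * 1500), (0.5, b"x" * 1500), (1.0, b"x" * 1000)]
print(throughput_bps(pkts))  # 32000.0 bits/s over the 1-second capture
```

For a real capture you would first filter the pairs down to the frames belonging to the HTTP connection of interest (e.g. by parsing the TCP ports with dpkt) before summing.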



greets
--
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Piet van Oostrum
Chris Angelico  writes:

> On Fri, Oct 11, 2013 at 7:41 PM, Peter Cacioppi
>  wrote:
>> So, my hope is that the GIL restrictions won't be problematic here. That is 
>> to say, I don't need **Python** code to ever run concurrently. I just need 
>> Python to allow a different Python worker thread to execute when all the 
>> other worker threads are blocking on the model.solve() task. Once the 
>> algorithm is in full swing, it is typical for all the worker threads to 
>> be blocking on model.Solve() at the same time.
>
> Sounds like Python will serve you just fine! Check out the threading
> module, knock together a quick test, and spin it up!

But it only works if the external C library has been written to release
the GIL around the long computations. If not, then the OP could try to
write a wrapper around them that does this.
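A minimal sketch of why the GIL release matters, using time.sleep() (which does release the GIL) as a stand-in for the long-running C solve call; the names here are illustrative, not the OP's API:

```python
import threading
import time

def worker(results, i):
    time.sleep(0.2)            # stands in for a C call that releases the GIL
    results[i] = i

results = {}
threads = [threading.Thread(target=worker, args=(results, i)) for i in range(4)]
start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.time() - start
print(round(elapsed, 1))       # roughly 0.2, not 0.8: the four waits overlap
```

If the external library held the GIL for the whole call, the four workers would instead run one after another and the total would approach four times the single-call time.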
-- 
Piet van Oostrum 
WWW: http://pietvanoostrum.com/
PGP key: [8DAE142BE17999C4]
-- 
https://mail.python.org/mailman/listinfo/python-list


Consolidate several lines of a CSV file with firewall rules

2013-10-11 Thread juanscopp
Hi guys.
I have a CSV file, which I created using an HTML export from a Check Point 
firewall policy.
Each rule is represented as several lines, in some cases. That occurs when a 
rule has several address sources, destinations or services.
I need the output to have each rule described in only one line.
It's easy to distinguish when each rule begins. In the first column, there's 
the rule ID, which is a number.

Let me show you an example. The strings that should be moved are in bold:

[code]NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
ON;TIME;COMMENT
1;;fwxcluster;mcast_vrrp;;vrrp;accept;Log;fwxcluster;Any;"VRRP;;*Comment 
suppressed*
;[b]igmp**;
2;;fwxcluster;fwxcluster;;FireWall;accept;Log;fwxcluster;Any;"Management 
FWg;*Comment suppressed*
;;[b]fwmgmpe**;[b]fwmgmpe**;;[b]ssh**;
;;[b]fwmgm**;[b]fwmgm**;;;
3;NTP;G_NTP_Clients;cmm_ntpserver_pe01;;ntp;accept;None;fwxcluster;Any;*Comment 
suppressed*
;;;[b]cmm_ntpserver_pe02**;;;[/code]

What I need ,explained in pseudo code, is this:

Read the first column of the next line. If there's a number:
Evaluate the first column of the next line. If there's no number there, 
concatenate (separating with a comma) \
the strings in the columns of this line with the last one and eliminate 
the text in the current one

The output should be something like this. The strings in bold are the ones that 
were moved:

[code]NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
ON;TIME;COMMENT
1;;fwxcluster,[b]fwmgmpe**,[b]fwmgm**;mcast_vrrp,[b]fwmgmpe**,[b]fwmgm**;;vrrp,[b]ssh**;accept;Log;fwxcluster;Any;*Comment
 suppressed*
;;
;;
3;NTP;G_NTP_Clients;cmm_ntpserver_pe01,[b]cmm_ntpserver_pe02**;;ntp;accept;None;fwxcluster;Any;*Comment
 suppressed*
;;[/code]

The empty lines are there only to be more clear, I don't actually need them.

Thanks!
-- 
https://mail.python.org/mailman/listinfo/python-list


Consolidate several lines of a CSV file with firewall rules

2013-10-11 Thread Starriol
Hi guys.
I have a CSV file, which I created using an HTML export from a Check Point 
firewall policy.
Each rule is represented as several lines, in some cases. That occurs when a 
rule has several address sources, destinations or services.
I need the output to have each rule described in only one line.
It's easy to distinguish when each rule begins. In the first column, there's 
the rule ID, which is a number.

Let me show you an example:

NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
ON;TIME;COMMENT
1;;fwxcluster;mcast_vrrp;;vrrp;accept;Log;fwxcluster;Any;"VRRP;;*Comment 
suppressed*
;igmp**;
2;;fwxcluster;fwxcluster;;FireWall;accept;Log;fwxcluster;Any;"Management 
FWg;*Comment suppressed*
;;fwmgmpe**;fwmgmpe**;;ssh**;
;;fwmgm**;fwmgm**;;;
3;NTP;G_NTP_Clients;cmm_ntpserver_pe01;;ntp;accept;None;fwxcluster;Any;*Comment 
suppressed*
;;;cmm_ntpserver_pe02**;;;

What I need ,explained in pseudo code, is this:

Read the first column of the next line. If there's a number:
Evaluate the first column of the next line. If there's no number there, 
concatenate (separating with a comma) \
the strings in the columns of this line with the last one and eliminate 
the text in the current one

The output should be something like this:

NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
ON;TIME;COMMENT
1;;fwxcluster,fwmgmpe**,fwmgm**;mcast_vrrp,fwmgmpe**,fwmgm**;;vrrp,ssh**;accept;Log;fwxcluster;Any;*Comment
 suppressed*
;;
;;
3;NTP;G_NTP_Clients;cmm_ntpserver_pe01,cmm_ntpserver_pe02**;;ntp;accept;None;fwxcluster;Any;*Comment
 suppressed*
;;

The empty lines are there only to be more clear, I don't actually need them.

Thanks!
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Consolidate several lines of a CSV file with firewall rules

2013-10-11 Thread Joel Goldstick
On Fri, Oct 11, 2013 at 11:01 AM, Starriol  wrote:
> Hi guys.
> I have a CSV file, which I created using an HTML export from a Check Point 
> firewall policy.
> Each rule is represented as several lines, in some cases. That occurs when a 
> rule has several address sources, destinations or services.
> I need the output to have each rule described in only one line.
> It's easy to distinguish when each rule begins. In the first column, there's 
> the rule ID, which is a number.
>
> Let me show you an example:
>
> NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
> ON;TIME;COMMENT
> 1;;fwxcluster;mcast_vrrp;;vrrp;accept;Log;fwxcluster;Any;"VRRP;;*Comment 
> suppressed*
> ;igmp**;
> 2;;fwxcluster;fwxcluster;;FireWall;accept;Log;fwxcluster;Any;"Management 
> FWg;*Comment suppressed*
> ;;fwmgmpe**;fwmgmpe**;;ssh**;
> ;;fwmgm**;fwmgm**;;;
> 3;NTP;G_NTP_Clients;cmm_ntpserver_pe01;;ntp;accept;None;fwxcluster;Any;*Comment
>  suppressed*
> ;;;cmm_ntpserver_pe02**;;;
>
> What I need ,explained in pseudo code, is this:
>
> Read the first column of the next line. If there's a number:
> Evaluate the first column of the next line. If there's no number 
> there, concatenate (separating with a comma) \
> the strings in the columns of this line with the last one and 
> eliminate the text in the current one
>
> The output should be something like this:
>
> NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL 
> ON;TIME;COMMENT
> 1;;fwxcluster,fwmgmpe**,fwmgm**;mcast_vrrp,fwmgmpe**,fwmgm**;;vrrp,ssh**;accept;Log;fwxcluster;Any;*Comment
>  suppressed*
> ;;
> ;;
> 3;NTP;G_NTP_Clients;cmm_ntpserver_pe01,cmm_ntpserver_pe02**;;ntp;accept;None;fwxcluster;Any;*Comment
>  suppressed*
> ;;
>
> The empty lines are there only to be more clear, I don't actually need them.
>
> Thanks!
> --
> https://mail.python.org/mailman/listinfo/python-list

I think you posted twice, and perhaps in HTML?  It's hard to read.

At any rate, there is a csv module in python that will let you gather
your data in a list of lists.  With that you can iterate through the
csv rows, saving rows with a number in the first position.  Iterate
and append the rows below that until you run into another row with a
number in the first position.
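The approach Joel describes fits in a short sketch. This is only one possible shape of the merge (column alignment in the real export may need more care), shown here with tiny inline rows rather than the real file:

```python
import csv  # csv.reader(open(...), delimiter=";") would feed real rows

def consolidate(rows):
    """Fold continuation rows (first field not a number) into the last rule."""
    rules = []
    for row in rows:
        if row and row[0].strip().isdigit():
            rules.append(list(row))          # a new rule starts here
        elif rules:
            last = rules[-1]
            for i, cell in enumerate(row):
                if cell and i < len(last):
                    # append with a comma, or just take the cell if empty
                    last[i] = last[i] + "," + cell if last[i] else cell
    return rules

rows = [["1", "", "a"], ["", "", "b"], ["2", "", "c"]]
print(consolidate(rows))  # [['1', '', 'a,b'], ['2', '', 'c']]
```

The OP can feed it `csv.reader(f, delimiter=";")` and write the merged rules back out with `csv.writer`.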

Why don't you write some code, see how it goes, copy and paste the
code back here with full traceback if you get an error or with your
results if you have some.  Do it for a subset of a couple of rows of
input data.

-- 
Joel Goldstick
http://joelgoldstick.com
-- 
https://mail.python.org/mailman/listinfo/python-list


Problem in Multiprocessing module

2013-10-11 Thread William Ray Wing
I'm running into a problem in the multiprocessing module.

My code is running four parallel processes which are doing network access 
completely independently of each other (gathering data from different remote 
sources).  On rare occasions, the code blows up when one of my processes 
has to start doing some error recovery.  I strongly suspect it is because there 
is a time-out that isn't being caught in the multiprocessing lib, and that in 
turn is exposing the TypeError.  Note that the error is "cannot concatenate 
'str' and 'NoneType' objects" and it is occurring way down in the 
multiprocessing library.

I'd really appreciate it if someone more knowledgeable about multiprocessing 
could confirm (or refute) my suspicion and then tell me how to fix things up. 

I'm running python 2.7.5 on a Mac OS-X 10.8.5

The traceback I get is:

TypeError: cannot concatenate 'str' and 'NoneType' objects
File "/Users/wrw/Dev/Python/Connection_Monitor/Version3.0/CM_Harness.py", line 
20, in 
 my_pool = pool.map(monitor, targets)# and hands off to four targets
File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/pool.py",
 line 250, in map
 return self.map_async(func, iterable, chunksize).get()
File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/pool.py",
 line 554, in get
 raise self._value

To save you-all some time:

The "get" function at line 554 in pool.py (which is in the multiprocessing lib) 
is:

    def get(self, timeout=None):
        self.wait(timeout)
        if not self._ready:
            raise TimeoutError
        if self._success:
            return self._value
        else:
            raise self._value

And the map function (also in pool in the multiprocessing lib) is:

    def map(self, func, iterable, chunksize=None):
        '''
        Equivalent of `map()` builtin
        '''
        assert self._state == RUN
        return self.map_async(func, iterable, chunksize).get()

Finally, my code that calls all this is pretty simple (note that the targets 
are dummies here):

#!/usr/bin/env python

""" Harness to call multiple parallel copies
    of the basic monitor program
"""

from multiprocessing import Pool
# monitor() is the real worker function, defined elsewhere

targets = ["www.sdsc.edu", "www.ncsa.edu", "www.uiuc.edu", "www.berkeley.edu"]

pool = Pool(processes=4)              # start 4 worker processes
my_pool = pool.map(monitor, targets)  # and hand off to four targets

TiA
Bill

-- 
https://mail.python.org/mailman/listinfo/python-list


Regarding URL Shortener

2013-10-11 Thread Datin Farah Natasha
Hello guys, I want to make a simple script that can automatically turn 
normal links into adf.ly links using my account. Is it possible for me to do 
this using Python, and to use it as a Python command-line tool? If it's 
possible, what library do I need to use? Any help will be appreciated.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 15:01:40 +0300, Jussi Piitulainen wrote:

> Steven D'Aprano writes:
>> Closures have nothing to do with *arguments*. A better definition of a
>> closure is that it is a function together with a snapshot of the
>> environment it was called from.
[...]
> Second, it's precisely not (a snapshot of) the environment where the
> function is *called* from, it's (a snapshot of) the environment where
> the function was *created* in. This is the whole *point*.

Ah yes, of course you are right. I actually knew that, it was a slip of 
the brain that I wrote it wrong :-(

Thanks for the correction.



-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: I am never going to complain about Python again

2013-10-11 Thread Joshua Landau
On 11 October 2013 10:11, Steven D'Aprano
 wrote:
> On Fri, 11 Oct 2013 09:17:37 +0100, Joshua Landau wrote:
>
>> On 11 October 2013 03:08, Steven D'Aprano
>>  wrote:
>>>
>>> Given:
>>>
>>> x ∈ ℝ, x = 2  (reals)
>>> y ∈ ℕ, y = 2  (natural numbers)
>>>
>>> we have x = y, but since 1/y is undefined (there is no Natural number
>>> 1/2), 1/x != 1/y.
>>
>> Surely 1/y is perfectly well defined, as only y, not 1/y, is constrained
>> to the natural numbers.
>
> Context is important, and usually implied. 1/y within the natural numbers
> is treated in the same way as sqrt(-1) within the reals.

I don't know; a rational tends to be described as any number of the
form x/y where x, y ∈ ℕ. Hence I don't agree that it's reasonable to
ever assume that 1/y has to exist in the same space as y unless
explicitly stated or generally working within, say, the integers.
Neither of those are remotely true of Python so I don't see how this
point is relevant when discussing Python's concept of equality.

> Try it on your
> calculator, and chances are very good you'll get an error. Try it in
> Python 2, or nearly any other programming language (but not Python 3),
> and again, chances are you'll get an error.

*Remains unconvinced.* None of that seems to actually matter.

> If you implicitly decide to promote entities, then of course you can
> promote y to a real then take the invoice.

I'm not. I'm just not applying the restrictions on y to the function it's in.

> But that trick still doesn't
> work for the original example, int(0.0) == int(0+0j) because promoting 0
> to complex doesn't help, you have to demote 0+0j to real and that's
> ambiguous.

I agree on this. The correct interpretation of

0.0 == 0 + 0j

is, of course

complex(0.0) == 0 + 0j
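Python itself bears this out: comparison promotes the float to complex, while the reverse demotion is simply refused. A quick check:

```python
print(0.0 == 0 + 0j)   # True: the float side is promoted to complex for ==
try:
    int(0 + 0j)        # but demoting a complex to an integer is refused
except TypeError as exc:
    print("TypeError:", exc)
```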
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Regarding URL Shortener

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 09:14:08 -0700, Datin Farah Natasha wrote:

> Hello guys, i want to make a simple sript that can automatically
> generate normal to adf.ly links using my accounts? it is possible for me
> to do this using python and use it as python command line. if it's
> possible what library do i need to use? any help will be appreciated.

Yes, it is possible. Start by going to adf.ly. You will see they claim to 
have an "easy-to-use API". Follow the instructions there.



-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 10:05:03 -0400, Roy Smith wrote:

> In article ,
>  Oscar Benjamin  wrote:
> 
>> If someone tried to explain why their field couldn't use ð for the
>> circumference of a unit circle I would suggest that they adjust the
>> other parts of their notation, not ð (there are other uses of ð).
> 
> Pi is wrong:

Pi is right, your newsreader is wrong. Oscar's post included the header:

Content-Type: text/plain; charset=ISO-8859-7

Your newsreader ignores the charset header and just assumes it is 
Latin-1. Since pi (π) in ISO-8859-7 is byte \xF0, your newsreader wrongly 
treats it as ð (LATIN SMALL LETTER ETH).



-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Steven D'Aprano
On Fri, 11 Oct 2013 09:16:36 +0100, Stephen Tucker wrote:

> I am using IDLE, Python 2.7.2 on Windows 7, 64-bit.
> 
> I have four questions:
> 
> 1. Why is it that
>  print unicode_object
> displays non-ASCII characters in the unicode object correctly, whereas
>  print (unicode_object, another_unicode_object)
> displays non-ASCII characters in the unicode objects as escape sequences
> (as repr() does)?

Because that is the design of Python. Printing compound objects like 
tuples, lists and dicts always uses the repr of the components. 
Otherwise, you couldn't tell the difference between (say) (23, 42) and 
("23", "42").

If you want something different, you have to do it yourself.

However, having said that, it is true that the repr() of Unicode strings 
in Python 2 is rather lame. Python 3 is much better:

[steve@ando ~]$ python2.7 -c "print repr(u'∫ßδЛ')"
u'\xe2\x88\xab\xc3\x9f\xce\xb4\xd0\x9b'

[steve@ando ~]$ python3.3 -c "print(repr('∫ßδЛ'))"
'∫ßδЛ'

So if you have the opportunity to upgrade to Python 3.3, I recommend it.


> 2. Given that this is actually *deliberately *the case (which I, at the
> moment, am finding difficult to accept), what is the neatest (that is,
> the most Pythonic) way to get non-ASCII characters in unicode objects in
> tuples displayed correctly?

I'd go with something like this helper function:

def print_unicode(obj):
    if isinstance(obj, (tuple, list, set, frozenset)):
        print u', '.join(unicode(item) for item in obj)
    else:
        print unicode(obj)


Adjust to taste :-)


> 3. A similar thing happens when I write such objects and tuples to a
> file opened by
>  codecs.open ( ..., "utf-8")
> I have also found that, even though I use  write  to send the text to
> the file, unicode objects not in tuples get their non-ASCII characters
> sent to the file correctly, whereas, unicode objects in tuples get their
> characters sent to the file as escape sequences. Why is this the case?

Same reason. The default string converter for tuples uses the repr, which 
intentionally uses escape sequences. If you want something different, you 
can program it yourself.


-- 
Steven
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: datetime.timedelta.replace?

2013-10-11 Thread Joshua Landau
On 9 October 2013 16:15, Skip Montanaro  wrote:
> Datetime objects have a replace method, but timedelta objects don't.
> If I take the diff of two datetimes and want to zero out the
> microseconds field, is there some way to do it more cleanly than this?
>
> delta = dt1 - dt2
> zero_delta = datetime.timedelta(days=delta.days, seconds=delta.seconds)
>
> I guess that's not bad, but replace() seems cleaner (or at least more
> congruent with datetime objects).

Maybe one of

delta - datetime.timedelta(0, 0, delta.microseconds)

or

delta - delta % datetime.timedelta(seconds=1)

is clearer.
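Both alternatives check out on a concrete pair of datetimes (the modulo form needs Python 3, where timedelta supports %):

```python
import datetime

dt1 = datetime.datetime(2013, 10, 11, 12, 0, 1, 250)
dt2 = datetime.datetime(2013, 10, 11, 11, 59, 59, 500)
delta = dt1 - dt2                                    # 0:00:01.999750

a = delta - datetime.timedelta(0, 0, delta.microseconds)
b = delta - delta % datetime.timedelta(seconds=1)    # Python 3 only
print(a, b)  # 0:00:01 0:00:01 -- both drop the microseconds
```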
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Complex literals (was Re: I am never going to complain about Python again)

2013-10-11 Thread Gene Heskett
On Friday 11 October 2013 12:49:40 Roy Smith did opine:

> In article ,
> 
>  Oscar Benjamin  wrote:
> > If someone tried to explain why their field couldn't use ً for the
> > circumference of a unit circle I would suggest that they adjust the
> > other parts of their notation not ً (there are other uses of ً.
> 
> Pi is wrong:
> 
> http://www.youtube.com/watch?v=jG7vhMMXagQ

The funny/serious part of this current "comedy central session" is that, 
speaking as someone who was too busy fixing TVs for a living in the 1950 
era to go far enough in school to get any really higher math (algebra 
enough to solve Ohm's law etc. was all I usually needed), the above argument 
has always made perfect sense to me. I have often arrived at the 
correct answer to some problem by using 2*Pi, but usually without calling it 
Tau, and even that wasn't needed often enough to keep my mind fresh about 
it. But I managed to get the job done anyway; those two TV cameras that 
were on the Trieste when it went into the Challenger Deep in 1960 had 
traces of my fingerprints in them.

Cheers, Gene
-- 
"There are four boxes to be used in defense of liberty:
 soap, ballot, jury, and ammo. Please use in that order."
-Ed Howdershelt (Author)

Linux poses a real challenge for those with a taste for late-night
hacking (and/or conversations with God).
-- Matt Welsh
A pen in the hand of this president is far more
dangerous than 200 million guns in the hands of
 law-abiding citizens.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Ethan Furman

On 10/10/2013 08:01 PM, Roy Smith wrote:

On 10Oct2013 19:44, Ned Batchelder  wrote:

I have to admit I'm having a hard time understanding why you'd need
to test the undecorated functions.  After all, the undecorated
functions aren't available to anyone.  All that matters is how they
behave with the decorators.


In article ,
  Cameron Simpson  wrote:

If the undecorated function is buggy, the decorated function will
be buggy. But the bug will be harder to resolve, and if you're
especially lucky the decorator will often-but-not-always conceal
the bug in the inner function.


And there lies the fundamental white-box vs. black-box testing conundrum.

The black-box camp (whose flag Ned is flying) says, "There is an exposed
interface which accepts certain inputs and promises certain outputs.
That's all you know, that's all you ever can know, and that's all you
should ever want to know.  The interface is constant.  The guts can
change without notice".  That's a perfectly valid philosophy.

The white-box camp (under which banner Cameron rides) says, "There's a
lot of neat stuff under the covers, and I can do a better, faster, and
more complete testing job if I take advantage of my knowledge of what's
under the kimono".  That, too, is a valid philosophy.


Some tests can also be done much more easily white-box.  Imagine an 
edge case which requires an exact, seldom-used sequence of events 
because an internal bug is usually counteracted by the rest of the 
system, except in some small number of cases.  Directly testing the 
internal piece for the bug can be much easier than setting up the long 
and involved test.


--
~Ethan~
--
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Piet van Oostrum
Stephen Tucker  writes:

> ESRI compound the problem, actually, by making all the strings that the 
> ArcGIS Python interface
> delivers (from MS SQLServer) Unicode! (I suppose, on reflection, they have no 
> choice.) So I am
> stuck with the worst of both worlds - a generation of Python (2.X) that is 
> inept at handling
> unicode, on an operating system (MS Windows 7) that is not much better, and 
> being flooded with
> unicode strings from my users' databases! Anything you can come up with to 
> ease all this (like,
> "convert all your strings to unicode as soon as you can and render them as 
> ASCII as late as you
> can") has already been of help.

I wouldn't say that Python 2.x is inept at handling Unicode. You just have to 
know what you are doing. But that's also true for Python 3.x, although that 
gives you a bit more help. But you can do everything you want with 2.x, I think.
-- 
Piet van Oostrum 
WWW: http://pietvanoostrum.com/
PGP key: [8DAE142BE17999C4]
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Ethan Furman

On 10/10/2013 08:13 PM, Cameron Simpson wrote:

On 11Oct2013 02:55, Steven D'Aprano  
wrote:

On Fri, 11 Oct 2013 09:12:38 +1100, Cameron Simpson wrote:

Speaking for myself, I would be inclined to recast this code:

   @absolutize
   def addition(a, b):
   return a + b

into:

   def _addition(a, b):
   return a + b
   addition = absolutize(_addition)

Then you can unit test both _addition() and addition().


*shudders*
Ew ew ew ew.


Care to provide some technical discourse here? Aside from losing the neat
and evocative @decorator syntax, the above is simple and overt.


And completely dismisses the whole point of adding @decorator to the 
language:  easy to use, easy to see == folks will actually use it.



I would much rather do something like this:

def undecorate(f):
 """Return the undecorated inner function from function f."""
 # Python 2 spelling; in Python 3 use f.__closure__[0].cell_contents
 return f.func_closure[0].cell_contents


Whereas this feels like black magic. Is this portable to any decorated
function? If so, I'd have hoped it was in the stdlib. If not: black magic.


Probably black magic.  But you can go with the decorator.wrapped route; 
after all, you're testing your own stuff so you should have control of 
your own decorators (okay, you may have to adapt a few others ;) .
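As a sketch of that route: in Python 3.2+, functools.wraps already records the inner function on the wrapper as __wrapped__, so no closure spelunking is needed (the absolutize/addition names are borrowed from the example upthread).

```python
import functools

def absolutize(func):
    @functools.wraps(func)  # also sets wrapper.__wrapped__ = func (3.2+)
    def wrapper(*args, **kwargs):
        return abs(func(*args, **kwargs))
    return wrapper

@absolutize
def addition(a, b):
    return a + b

# Test the decorated behaviour...
assert addition(2, -5) == 3
# ...and reach the undecorated function without any black magic:
assert addition.__wrapped__(2, -5) == -3
```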


--
~Ethan~
--
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Ned Batchelder

On 10/10/13 10:22 PM, Cameron Simpson wrote:

On 10Oct2013 19:44, Ned Batchelder  wrote:

On 10/10/13 6:12 PM, Cameron Simpson wrote:

Speaking for myself, I would be inclined to recast this code:

   @absolutize
   def addition(a, b):
   return a + b

into:

   def _addition(a, b):
   return a + b

   addition = absolutize(_addition)

Then you can unit test both _addition() and addition(). [...]

I have to admit I'm having a hard time understanding why you'd need
to test the undecorated functions.  After all, the undecorated
functions aren't available to anyone.  All that matters is how they
behave with the decorators.

If the undecorated function is buggy, the decorated function will
be buggy. But the bug will be harder to resolve, and if you're
especially lucky the decorator will often-but-not-always conceal
the bug in the inner function.

Wanting to test the core function is perfectly reasonable. You can in
principle write simpler and more direct tests of the core function.

Having an error report that points directly at an error instead of
an error report that points at some outer dysfunction (i.e. "somewhere
deep inside here something is broken") is highly desirable in
general, and therefore also in a test suite.

Cheers,


I understand the desire to test the inner function.  But the OP said "I 
need to...", which makes me think he's dealing with some kind of "mock a 
service, but the service is in the decorator, so what should I do?" 
situation.  In which case, there might be a better solution than 
undecorating the function.


--Ned.
--
https://mail.python.org/mailman/listinfo/python-list


Re: Regarding URL Shortener

2013-10-11 Thread Datin Farah Natasha
On Saturday, October 12, 2013 1:02:30 AM UTC+8, Steven D'Aprano wrote:
> On Fri, 11 Oct 2013 09:14:08 -0700, Datin Farah Natasha wrote:
> 
> 
> 
> > Hello guys, i want to make a simple script that can automatically
> 
> > generate normal to adf.ly links using my accounts? it is possible for me
> 
> > to do this using python and use it as python command line. if it's
> 
> > possible what library do i need to use? any help will be appreciated.
> 
> 
> 
> Yes, it is possible. Start by going to adf.ly. You will see they claim to 
> 
> have an "easy-to-use API". Follow the instructions there.
> 
> 
> 
> 
> 
> 
> 
> -- 
> 
> Steven

thanks steven. i'll check it out there. 
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Terry Reedy

On 10/11/2013 4:41 AM, Peter Cacioppi wrote:


I should add that the computational heavy lifting is done in a third party 
library. So a worker thread looks roughly like this (there is a subtle race 
condition I'm glossing over).

while len(jobs):
    job = jobs.pop()
    model = Model(job)      # Model is py interface for a lib written in C
    newJobs = model.solve() # This will take a long time
    for newJob in newJobs:
        jobs.add(newJob)

Here jobs is a thread safe object that is shared across each worker thread. It 
holds a priority queue of jobs that can be solved in parallel.

Model is a py class that provides the API to a 3rd party library written in C. 
I know model.solve() will be the bottleneck operation for all but trivial 
problems.

So, my hope is that the GIL restrictions won't be problematic here. That is to 
say, I don't need **Python** code to ever run concurrently. I just need Python 
to allow a different Python worker thread to execute when all the other worker 
threads are blocking on the model.solve() task. Once the algorithm is in full 
swing, it is typical for all the worker threads to be blocking on 
model.solve() at the same time.

It's a nice algorithm for high level languages. Java worked well here, and I'm 
hoping py can be nearly as fast with much more elegant and readable code.


Given that model.solve takes a 'long time' (seconds, at least), the 
extra time to start a process over the time to start a thread will be 
inconsequential. I would therefore look at the multiprocessing module.


--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Process pending Tk events from GObject main loop?

2013-10-11 Thread Christian Gollwitzer

Am 11.10.13 14:52, schrieb Skip Montanaro:

I know I have things bassackwards, but trying to process Gtk events
from Tkinter's main loop using after() isn't working. (I suspect our
underlying C++ (ab)use of Gtk may require a Gtk main loop). I'd like
to process Tk events periodically from a GObject main loop. I know I
want to call gobject.idle_add(), but can't find Tkinter equivalent to
gobject.MainLoop().get_context().iteration(). That is, how do you
process a single event (or all pending events)? It looks like
_tkinter.dooneevent() might be the right call to process a single
event, but how do I tell no more events are pending?


To process all pending events in Tcl/Tk, you use update (Tcl function). 
So you just grab a Tkinter object and execute its .update() function, or 
you manually invoke something like


root.tk.eval('update')

Christian
--
https://mail.python.org/mailman/listinfo/python-list


Re: Unicode Objects in Tuples

2013-10-11 Thread Terry Reedy

On 10/11/2013 9:31 AM, Stephen Tucker wrote:


to be able to by itself. The distinction between the "geekiness" of a
tuple compared with the "non-geekiness" of a string is, itself, far too
geeky for my liking. The distinction seems to be an utterly spurious -
even artificial or arbitrary one to me. (Sorry about the rant.)


There is a better reason for collections using repr for their items, but 
I forget the details.
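One concrete illustration (my own, not necessarily the reason Terry has in mind): if containers showed their items with str() instead of repr(), distinct containers could print identically.

```python
# Two different tuples whose items happen to join to the same text:
pair_a = ('a, b', 'c')
pair_b = ('a', 'b, c')

# With repr() for the items, the printed forms stay distinct:
assert str(pair_a) == "('a, b', 'c')"
assert str(pair_b) == "('a', 'b, c')"

# Rendering the items with plain str() would be ambiguous:
assert ", ".join(pair_a) == ", ".join(pair_b) == "a, b, c"
```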


--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Peter Cacioppi
On Thursday, October 10, 2013 11:01:25 PM UTC-7, Peter Cacioppi wrote:
> Could someone give me a brief thumbnail sketch of the difference between 
> multi-threaded programming in Java and Python.
> 
> 
> 
> I have a fairly sophisticated algorithm that I developed as both a single 
> threaded and multi-threaded Java application. The multi-threading port was 
> fairly simple, partly because Java has a rich library of thread safe data 
> structures (Atomic Integer, Blocking Queue, Priority Blocking Queue, etc). 
> 
> 
> 
> There is quite a significant performance improvement when multithreading here.
> 
> 
> 
> I'd like to port the project to Python, partly because Python is a better 
> language (IMHO) and partly because Python plays well with Amazon Web 
> Services. 
> 
> 
> 
> But I'm a little leery that things like the Global Interpreter Lock will block 
> the multithreading efficiency, or that a relative lack of concurrent 
> off-the-shelf data structures will make things much harder.
> 
> 
> 
> Any advice much appreciated. Thanks.

"Sounds like Python will serve you just fine! Check out the threading
module, knock together a quick test, and spin it up!"

Thanks, that was my assessment as well, just wanted a double check. At the time 
of posting I was mentally blocked on how to set up a quick proof of concept, 
but of course writing the post cleared that up ;)

Along with "batteries included" and "we're all adults", I think Python needs a 
pithy phrase summarizing how well thought out it is. That is to say, the major 
design decisions were all carefully considered, and as a result things that 
might appear to be problematic are actually not barriers in practice. My 
suggestion for this phrase is "Guido was here". 

So in this case, I thought the GIL would be a fly in the ointment, but on 
reflection it turned out not to be the case. Guido was here.

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Consolidate several lines of a CSV file with firewall rules

2013-10-11 Thread Tim Chase
On 2013-10-11 08:01, Starriol wrote:
> NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL
> ON;TIME;COMMENT
> 1;;fwxcluster;mcast_vrrp;;vrrp;accept;Log;fwxcluster;Any;"VRRP;;*Comment
> suppressed* ;igmp**;
> 2;;fwxcluster;fwxcluster;;FireWall;accept;Log;fwxcluster;Any;"Management
> FWg;*Comment
> suppressed* ;;fwmgmpe**;fwmgmpe**;;ssh**; ;;fwmgm**;fwmgm**;;;
> 3;NTP;G_NTP_Clients;cmm_ntpserver_pe01;;ntp;accept;None;fwxcluster;Any;*Comment
> suppressed* ;;;cmm_ntpserver_pe02**;;; 
> What I need ,explained in pseudo code, is this:
> 
> Read the first column of the next line. If there's a number:
>   Evaluate the first column of the next line. If there's no
> number there, concatenate (separating with a comma) \ the strings
> in the columns of this line with the last one and eliminate the
> text in the current one
> 
> The output should be something like this:
> 
> NO.;NAME;SOURCE;DESTINATION;VPN  ;SERVICE;ACTION;TRACK;INSTALL
> ON;TIME;COMMENT
> 1;;fwxcluster,fwmgmpe**,fwmgm**;mcast_vrrp,fwmgmpe**,fwmgm**;;vrrp,ssh**;accept;Log;fwxcluster;Any;*Comment
> suppressed* ;; ;;
> 3;NTP;G_NTP_Clients;cmm_ntpserver_pe01,cmm_ntpserver_pe02**;;ntp;accept;None;fwxcluster;Any;*Comment
> suppressed* ;;
>
> The empty lines are there only to be more clear, I don't actually
> need them.

Though there are a couple oddities in your source data, I'll take a
crack at it.  First, there are the dangling open-quotes on #1 that
cause most CSV parsers (I tested both Gnumeric and Python's csv
module) to read until the subsequent line is read.  If this is
intentional, then it's all good.  If not, you get weird behaviors.
That said, you can try the code below (adjusting the 3 lines for
your desired filenames and whether you *want* to write emptied rows)
to see if it gets you what you want:

##

import csv

# adjust these 3 lines
WRITE_EMPTIES = True
INFILE = "input.txt"
OUTFILE = "output.txt"

with open(INFILE, "r") as in_file:
    r = csv.reader(in_file, delimiter=";")
    with open(OUTFILE, "wb") as out_file:
        previous = None
        empties_to_write = 0
        out_writer = csv.writer(out_file, delimiter=";")
        for i, row in enumerate(r):
            first_val = row[0].strip()
            if first_val:
                if previous:
                    out_writer.writerow(previous)
                    if WRITE_EMPTIES and empties_to_write:
                        out_writer.writerows(
                            [["" for _ in previous]] * empties_to_write
                            )
                    empties_to_write = 0
                previous = row
            else:  # append sub-portions to each other
                previous = [
                    ",".join(
                        subitem
                        for subitem in existing.split(",") + [new]
                        if subitem
                        )
                    for existing, new in zip(previous, row)
                    ]
                empties_to_write += 1
        if previous:  # take care of the last row
            out_writer.writerow(previous)
            if WRITE_EMPTIES and empties_to_write:
                out_writer.writerows(
                    [["" for _ in previous]] * empties_to_write
                    )


Hope this helps,

-tkc




-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Consolidate several lines of a CSV file with firewall rules [PS]

2013-10-11 Thread Tim Chase
On 2013-10-11 15:40, Tim Chase wrote:
> the dangling open-quotes on #1 that cause most CSV parsers to read
> until the subsequent line is read.

And by "subsequent line", I mean "subsequent closing-quote" of
course. :-)

-tkc



-- 
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Terry Reedy

On 10/11/2013 12:44 PM, Steven D'Aprano wrote:

On Fri, 11 Oct 2013 15:01:40 +0300, Jussi Piitulainen wrote:


Steven D'Aprano writes:

Closures have nothing to do with *arguments*. A better definition of a
closure is that it is a function together with a snapshot of the
environment it was called from.

[...]

Second, it's precisely not (a snapshot of) the environment where the
function is *called* from, it's (a snapshot of) the environment where
the function was *created* in. This is the whole *point*.


Ah yes, of course you are right. I actually knew that, it was a slip of
the brain that I wrote it wrong :-(

Thanks for the correction.


The closure is also not a 'snapshot' but a reference to (or preservation 
of) (relevant parts of) the environment. A snapshot of the environment 
at the time of definition would have been much easier to implement.


x = 1
def outer():
y = 1
def inner():
return x + y
y = 2
return inner
x = 2
print(outer()())
# 4

In a sense, all user functions are closures in that they have, and have 
always had, a reference to their definition module environment -- the 
readonly .__globals__ attribute (probably .func_globals in 2.x).


This lexical, as opposed to dynamic, scoping becomes noticeable when one 
imports a function from another module, as for testing. Because 
.__globals__ is read-only, one must monkey-patch the module of 
definition to change the function's global (module-level) environment, as 
when replacing an object it uses with a mock. I ran into this when 
testing Idle methods that use a tk message box to display a message and 
wait for input (sometimes text, always a mouse click) from a human user.
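A sketch of that kind of monkey-patching, using unittest.mock and a synthetic stand-in module (the "widgets"/askyesno names are invented for illustration):

```python
import sys
import types
from unittest import mock

# A stand-in "module of definition"; imagine tkinter.messagebox.askyesno.
mod = types.ModuleType("widgets")
mod.askyesno = lambda title, msg: True
exec("def confirm_delete():\n    return askyesno('Delete?', 'Really?')",
     mod.__dict__)
sys.modules["widgets"] = mod

# confirm_delete.__globals__ IS mod.__dict__ and cannot be reassigned,
# so the mock must be patched into the defining module, not the caller:
with mock.patch.object(mod, "askyesno", return_value=False):
    assert mod.confirm_delete() is False   # the mock answers "no"
assert mod.confirm_delete() is True        # original behaviour restored
```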


What is relatively new (and tricky) is capturing local names of 
surrounding functions while maintaining both late binding and 
independent writability for each closure. This last means that the 
following works:


def account():
balance = 0
def trans(amt):
nonlocal balance
balance += amt
return balance
return trans

xmasfund = account()
pettycash = account()
print(xmasfund(100))
# 100
print(pettycash(50))
# 50
print(xmasfund(-100))
# 0
print(pettycash(-25))
# 25

Closures and decorators are *really* two different subjects.

--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: closure = decorator?

2013-10-11 Thread Terry Reedy

On 10/11/2013 8:08 AM, Franck Ditter wrote:

In article <5257c3dd$0$29984$c3e8da3$54964...@news.astraweb.com>,
  Steven D'Aprano  wrote:


On Fri, 11 Oct 2013 10:14:29 +0300, Jussi Piitulainen wrote:


Roy Smith writes:

In article ,
  Piet van Oostrum wrote:


I usually say that a closure is a package, containing a function with
some additional data it needs. The data usually is in the form of
name bindings.


That's pretty close to the way I think about it.  The way it was
originally described to me is, "A closure is a function bundled up with
it's arguments".


Really? It should be more like "a function bundled up with some other
function's arguments" and even more like "a function bundled up with
bindings for its free variables".


Closures have nothing to do with *arguments*. A better definition of a
closure is that it is a function together with a snapshot of the
environment it was called from.

def func(arg):
 y = arg + 1
 def inner():
 return y + 1000
 return inner

f = func(1)


Maybe a better example of closure would be (just for the nonlocal) :

def fib() :
 (a,b) = (0,1)


a,b = 0,1 is the same thing.

a and b are separate local names and are in no sense a 'pair'.


 def producer() :
 nonlocal a,b # Python 3
 old = a
 (a,b) = (b,a+b)
 return old
 return producer


f = fib()
[f() for i in range(10)]

[0, 1, 1, 2, 3, 5, 8, 13, 21, 34]


At this point, f is a closure. It needs to know the value of y (not the
argument to func) in order to work, and the implementation is to store
that information inside f.func_closure (or f.__closure__ in Python 3).
The part of the calling environment which is saved is y


Shouldn't it be the (a,b) pair here ? But :


f.__closure__[0].cell_contents# access to what ?

55

Shouldn't cell_contents keep the current (a,b) pair, a part of the snapshot of
the creation environment (private variables of the closure)?
Instead it seems to return only a (which is the next production)...


Look at f.__closure__[1] (.cell_contents) for b.
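The cell order can be read off f.__code__.co_freevars, which is parallel to f.__closure__ (re-creating Franck's fib closure so this stands alone):

```python
def fib():
    a, b = 0, 1
    def producer():
        nonlocal a, b
        old = a
        a, b = b, a + b
        return old
    return producer

f = fib()
[f() for i in range(10)]        # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]

# co_freevars names the captured variables in the same order as the
# cells in __closure__, so you can see which cell is which:
assert f.__code__.co_freevars == ('a', 'b')
state = dict(zip(f.__code__.co_freevars,
                 (c.cell_contents for c in f.__closure__)))
assert state == {'a': 55, 'b': 89}   # a is the next production
```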

--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list


Re: Consolidate several lines of a CSV file with firewall rules [PS]

2013-10-11 Thread Starriol
On Friday, October 11, 2013 5:50:06 PM UTC-3, Tim Chase wrote:
> On 2013-10-11 15:40, Tim Chase wrote:
> 
> > the dangling open-quotes on #1 that cause most CSV parsers to read
> 
> > until the subsequent line is read.
> 
> 
> 
> And by "subsequent line", I mean "subsequent closing-quote" of
> 
> course. :-)
> 
> 
> 
> -tkc

Ha, thanks a million, Tim!
That worked great!
Hopefully soon enough I'm going to be able to write it on my own without having 
to bother you guys!
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threading in Python vs Java

2013-10-11 Thread Cameron Simpson
On 11Oct2013 15:53, Terry Reedy  wrote:
> On 10/11/2013 4:41 AM, Peter Cacioppi wrote:
> >I should add that the computational heavy lifting is done in a third party 
> >library. So a worker thread looks roughly like this (there is a subtle race 
> >condition I'm glossing over).
> >
> >while len(jobs):
> >    job = jobs.pop()
> >    model = Model(job)  # Model is py interface for a lib written in C
> >    newJobs = model.solve() # This will take a long time
> >    for newJob in newJobs:
> >        jobs.add(newJob)
> >
> >Here jobs is a thread safe object that is shared across each worker thread. 
> >It holds a priority queue of jobs that can be solved in parallel.
> >
> >Model is a py class that provides the API to a 3rd party library written in 
> >C. I know model.solve() will be the bottleneck operation for all but trivial 
> >problems.
[...]
> Given that model.solve takes a 'long time' (seconds, at least), the
> extra time to start a process over the time to start a thread will
> be inconsequential. I would therefore look at the multiprocessing
> module.

And, for contrast, I would not. Threads are my friends and Python
threads seem eminently suited to the above scenario.
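A minimal sketch of that thread-based scheme (solve() here is a trivial stand-in for the GIL-releasing C call; the "subtle race" Peter mentions shows up as idle workers exiting while a peer is still generating jobs, which is harmless here because the generating worker keeps draining the queue itself):

```python
import queue
import threading

jobs = queue.PriorityQueue()   # thread-safe, shared by all workers

def solve(job):
    # Stand-in for model.solve(): the real call is a long-running C
    # routine that releases the GIL while it works.
    return [job + 1] if job < 3 else []

results = []
results_lock = threading.Lock()

def worker():
    while True:
        try:
            _, job = jobs.get_nowait()
        except queue.Empty:
            # A peer may still be about to add jobs; it will drain them.
            return
        for new_job in solve(job):
            jobs.put((new_job, new_job))   # (priority, job) pairs
        with results_lock:
            results.append(job)

jobs.put((0, 0))
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Every job in the chain 0 -> 1 -> 2 -> 3 is processed exactly once.
```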

Cheers,
-- 
Cameron Simpson 

[Alain] had been looking at his dashboard, and had not seen me, so I
ran into him. - Jean Alesi on his qualifying prang at Imola '93
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Skipping decorators in unit tests

2013-10-11 Thread Cameron Simpson
On 11Oct2013 02:37, Gilles Lenfant  wrote:
> * Adding an "__original__" attribute to the wrapper func in the decorators of 
> my own

Just one remark: Call this __original or _original (or even original).
The __x__ names are reserved for python operations (like __add__, supporting 
"+").

Cheers,
-- 
Cameron Simpson 

I very strongly suggest that you periodically place ice packs over the abused
areas.  - Steve Garnier
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: ANN: CUI text editor Kaa 0.0.4

2013-10-11 Thread Dave Angel
On 11/10/2013 07:09, Atsuo Ishimoto wrote:

> Hi,
>
> I've just released Kaa 0.0.4 to PyPI.
>
> https://pypi.python.org/pypi/kaaedit/
>
> Kaa is an easy yet powerful text editor for console user interface,

What's a "console user interface?"  That's what Windows calls a "DOS
box".  But otherwise you seem to imply it runs on Linux.

So, what OS's does it support?  Which terminal programs? Does it only
work locally, or can it work from an ssh terminal? Can it run within
screen?

> providing numerous features like
>
> - Macro recording.
> - Undo/Redo.
> - Multiple windows/frames.
> - Syntax highlighting.
> - Open source software(MIT)
>
> Kaa is written in Python 3.3. So, you can easily customize many
> aspects of Kaa with simple Python scripts.
>
> Please take a look at http://kaaedit.github.io for screen shots and
> installation instruction.
>
> Regards,

-- 
DaveA


-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Consolidate several lines of a CSV file with firewall rules [PS]

2013-10-11 Thread Mark Lawrence

On 11/10/2013 22:22, Starriol wrote:

On Friday, October 11, 2013 5:50:06 PM UTC-3, Tim Chase wrote:

On 2013-10-11 15:40, Tim Chase wrote:


the dangling open-quotes on #1 that cause most CSV parsers to read



until the subsequent line is read.




And by "subsequent line", I mean "subsequent closing-quote" of

course. :-)



-tkc


Ha, thanks a million, Tim!
That worked great!
Hopefully soon enough I'm going to be able to write it on my own without having 
to bother you guys!



If in future you do have to bother us we'll not object, but you'll 
certainly be less bother if you read and digest this first 
https://wiki.python.org/moin/GoogleGroupsPython :)


--
Roses are red,
Violets are blue,
Most poems rhyme,
But this one doesn't.

Mark Lawrence

--
https://mail.python.org/mailman/listinfo/python-list


Re: Problem in Multiprocessing module

2013-10-11 Thread Terry Reedy

On 10/11/2013 10:53 AM, William Ray Wing wrote:

I'm running into a problem in the multiprocessing module.

My code is running four parallel processes which are doing network access completely 
independently of each other (gathering data from different remote sources).  On rare 
occasions, the code blows up when one of my processes has to start doing some 
error recovery.  I strongly suspect it is because there is a time-out that isn't 
being caught in the multiprocessing lib, and that in turn is exposing the TypeError. 
 Note that the error is "cannot concatenate 'str' and 'NoneType' objects" and it 
is occurring way down in the multiprocessing library.

I'd really appreciate it if someone more knowledgeable about multiprocessing 
could confirm (or refute) my suspicion and then tell me how to fix things up.

I'm running python 2.7.5 on a Mac OS-X 10.8.5


The version is important, see below.


The traceback I get is:


After moving the last line to the top. Better to cut and paste as is.


TypeError: cannot concatenate 'str' and 'NoneType' objects


To understand an exception, you must know what sort of expression could 
cause it. In 2.7, this arises from something like

>>> 'a'+None

Traceback (most recent call last):
  File "", line 1, in 
'a'+None
TypeError: cannot concatenate 'str' and 'NoneType' objects

In 3.x, the same expression generates
TypeError: Can't convert 'NoneType' object to str implicitly

The equivalent join expression gives a different message in 2.7 (and 
nearly the same in 3.3):


>>> ''.join(('a', None))

Traceback (most recent call last):
  File "", line 1, in 
''.join(('a', None))
TypeError: sequence item 1: expected string, NoneType found


File "/Users/wrw/Dev/Python/Connection_Monitor/Version3.0/CM_Harness.py", line 20, in 

  my_pool = pool.map(monitor, targets)# and hands off to four targets
File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/pool.py",
 line 250, in map
  return self.map_async(func, iterable, chunksize).get()
File 
"/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/multiprocessing/pool.py",
 line 554, in get
  raise self._value

To save you-all some time:

The "get" function at line 554 in pool.py (which is in the multiprocessing lib) 
is:


class ApplyResult(object):


def get(self, timeout=None):
self.wait(timeout)


This must set self._ready, self._success, and self._value


if not self._ready:
raise TimeoutError


This did not happen, so self._ready must be True


if self._success:
return self._value


This did not happen, so self._success must be False


else:
raise self._value


This did, and self._value is the TypeError reported.
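A tiny demonstration of the mechanism traced so far: the worker's exception is stored as self._value and re-raised by get() in the parent. multiprocessing.dummy (the thread-backed Pool with the same API) is used here so the sketch runs without forking:

```python
from multiprocessing.dummy import Pool   # thread-backed, same Pool API

def bad_concat(x):
    return 'a' + x          # TypeError when x is None

pool = Pool(2)
try:
    pool.map(bad_concat, ['b', None])
except TypeError as exc:    # the worker's exception, re-raised by get()
    caught = str(exc)
pool.close()
pool.join()
```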
Let us look into self.wait and see if we can find where there is a 
string1 + string2 expression and then figure out how string2 might be None.


class ApplyResult(object):
def __init__(self, cache, callback):
self._cond = threading.Condition(threading.Lock())
...

threading.Condition is a (useless) function that returns a class 
_Condition object.


def wait(self, timeout=None):
self._cond.acquire()
try:
if not self._ready:
self._cond.wait(timeout)
finally:
self._cond.release()

so it seems that we need to look at the _Condition methods acquire, 
release, and wait. (The first two are lock methods

self.acquire = lock.acquire
self.release = lock.release
). However, this seems wrong because self._cond has no reference to self 
and hence cannot set self attributes.  The problem must be in some 
callback that is called while waiting. Async is terrible to debug 
because the call stack in the traceback ends with wait and does not tell 
us what function was called during the wait.


After the .get method is ._set, which starts

def _set(self, i, obj):
self._success, self._value = obj
# and goes on to set
This is the only place where self._value is set, so it must have been 
called during the wait.


It is only used in Pool._handle_results where the relevant lines are
@staticmethod
def _handle_results(outqueue, get, cache):
...
task = get()
...
job, i, obj = task
...
cache[job]._set(i, obj)

We need to find out what get is. _handle_results is only used in 
Pool.__init__:


self._result_handler = threading.Thread(
target=Pool._handle_results,
args=(self._outqueue, self._quick_get, self._cache)
)

so when _handle_results is called, get = self._quick_get, and what is 
that? .__init__ starts with self._setup_queues(self) and that has

from .queues import SimpleQueue  # relative import
self._outqueue = SimpleQueue()
self._quick_get = self._outqueue._reader.recv

SimpleQueue._reader is the first member of the pair returned by 
multiprocessing.Pipe(duplex=False), whi

Inter-process locking

2013-10-11 Thread Jason Friedman
I have a 3rd-party process that runs for about a minute and supports
only a single execution at a time.

$ deploy

If I want to launch a second process I have to wait until the first
finishes.  Having two users wanting to run at the same time might
happen a few times a day.  But, these users will not have the
skills/patience to check whether someone else is currently running.
I'd like my program to be able to detect that "deploy" is already
running, tell the user, wait a minute, try again, repeat.

I do not know whether anyone has had success with
http://pythonhosted.org/lockfile/lockfile.html.

I supose I could use http://code.google.com/p/psutil/ to check for a
process with a particular name.
-- 
https://mail.python.org/mailman/listinfo/python-list


Metaclass/abc hackery

2013-10-11 Thread Demian Brecht
As with most I'm sure, short of using abc's, I've had very little exposure
to metaclasses. So, when digging into abc implementation, I figured it
would be a good idea to dig into metaclasses, their usage and actually try
writing one.

What I did may be contrived, but it was fun nonetheless and a good
introduction to how metaclasses can be used (of course, they should only be
used when absolutely required and no other solution is readily available
due to the black magic that happens under the hood): I ended up writing a
proof of concept that's abc-like in nature. However, it doesn't depend on
inheritance. It allows you to say: "I want to make sure that this object
/look/ like this type when instantiated".

Again, simple proof of concept that has holes in it and is likely
altogether a bad idea, but it was fun to throw together nonetheless, so I
thought I'd share: https://gist.github.com/demianbrecht/6944269 (check out
the tests at the bottom for usage).

Working on this though brought up a question: Is there anything in the
data model that acts like "__setattr__" but when operating on a class
definition instead of an instance? I'd be able to get rid of the late_bind
function if something like that's available... Not likely something that
would be used very often, but would likely sometimes be useful.

Thanks,

-- 

*Demian Brecht
*http://demianbrecht.github.com
-- 
https://mail.python.org/mailman/listinfo/python-list


OT: looking for best solutions for tracking projects and skills

2013-10-11 Thread Jason Hsu
I realize this is off-topic, but I'm not sure what forum is best for asking 
about this.  I figure that at least a few of you are involved in civic hacking 
groups.

I recently joined a group that does civic hacking. (Adopt-A-Hydrant is an 
example of civic hacking.)

We need a solution for tracking projects and the skills needed for the projects 
(such as Ruby on Rails, Python, Drupal, Javascript, etc.).

I'd like to hear from those of you in similar groups that have a great system 
for tracking projects. Is there an in-house solution you use, or is there 
something else available?
-- 
https://mail.python.org/mailman/listinfo/python-list


job openings @ CompSoft India

2013-10-11 Thread swetha N
job openings @ CompSoft India

Job Title: Software Engineer

Qualification: Any Graduate

Experience: Fresher

Location: Chennai
for more details & apply click here:

http://referenceglobe.com/Postings_Internal/view_job_details_encoded.php?postid=MTM5MQ==
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Inter-process locking

2013-10-11 Thread Piet van Oostrum
Jason Friedman  writes:

> I have a 3rd-party process that runs for about a minute and supports
> only a single execution at a time.
>
> $ deploy
>
> If I want to launch a second process I have to wait until the first
> finishes.  Having two users wanting to run at the same time might
> happen a few times a day.  But, these users will not have the
> skills/patience to check whether someone else is currently running.
> I'd like my program to be able to detect that "deploy" is already
> running, tell the user, wait a minute, try again, repeat.
>
> I do not know whether anyone has had success with
> http://pythonhosted.org/lockfile/lockfile.html.

It seems to work on Mac OS X.

> I supose I could use http://code.google.com/p/psutil/ to check for a
> process with a particular name.

That will quite probably give you race conditions. 

File locking is generally the best solution for this kind of problems, unless 
you can make use of OS level semaphores.
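A minimal POSIX-only sketch of the file-locking approach (the lock file path and retry delay are arbitrary; on Windows one would reach for msvcrt.locking instead):

```python
import fcntl
import time

LOCKFILE = "/tmp/deploy.lock"     # arbitrary path, shared by all users

def run_exclusively(job, retry_delay=60, notify=print):
    with open(LOCKFILE, "w") as fp:
        while True:
            try:
                fcntl.flock(fp, fcntl.LOCK_EX | fcntl.LOCK_NB)
                break
            except BlockingIOError:
                notify("deploy is already running; waiting...")
                time.sleep(retry_delay)
        try:
            return job()        # the lock dies with this process, so a
        finally:                # crash can never leave it stuck
            fcntl.flock(fp, fcntl.LOCK_UN)

result = run_exclusively(lambda: "deployed", retry_delay=1)
```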
-- 
Piet van Oostrum 
WWW: http://pietvanoostrum.com/
PGP key: [8DAE142BE17999C4]
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Metaclass/abc hackery

2013-10-11 Thread Marco Buttu

On 10/12/2013 04:47 AM, Demian Brecht wrote:


Working on this though brought up a question: Is there anything in the
data model that acts like "__setattr__" but when operating on a class
definition instead of an instance? I'd be able to get rid of the
late_bind function if something like that's available... Not likely
something that would be used very often, but would likely sometimes be
useful.

Thanks,


I am not sure about your question, but I will try to explain some of the 
possibilities. If you define a __setattr__ method in the metaclass, then 
you can intercept attribute assignments only after class creation:


>>> class MetaFoo(type):
... def __setattr__(cls, name, value):
... print("in __setattr__(): ", name, value)
... super().__setattr__(name, value)
...
>>>
>>> class Foo(metaclass=MetaFoo):
... a = 33

As you can see, the above code does not print the message. But after 
class creation it does:


>>> Foo.a = 33
in __setattr__():  a 33

This is because during class creation there is no class yet, so it is 
not possible to intercept attribute assignment via __setattr__.
If you want to intercept the assignments during class creation too, you 
can intercept the assignments into the class attribute dictionary. In that 
case you write a dictionary object that overrides __setitem__, and 
then override the __prepare__ metaclass method so that it returns 
this dictionary:


>>> class Namespace(dict):
... def __setitem__(self, name, value):
... print('In Namespace.__setitem__():', name, value)
... super().__setitem__(name, value)
...
>>> class MetaFoo(type):
... def __prepare__(clsname, bases):
... return Namespace()
... def __setattr__(cls, name, value):
... print("In MetaFoo.__setattr__(): ", name, value)
... super().__setattr__(name, value)
...
>>> class Foo(metaclass=MetaFoo):
... a = 33
...
In Namespace.__setitem__(): __module__ __main__
In Namespace.__setitem__(): __qualname__ Foo
In Namespace.__setitem__(): a 33
>>> Foo.a = 33
In MetaFoo.__setattr__():  a 33

Of course, this is not an ideal solution, because if you need to handle 
the attributes the same way both before and after class creation, you have 
to factor the shared logic out into a function outside the methods:


>>> def manage(name, value):
... print('do something with', name, value)
...
>>> class Namespace(dict):
... def __setitem__(self, name, value):
... print('In Namespace.__setitem__():', name, value)
... manage(name, value)
... super().__setitem__(name, value)
...
>>> class MetaFoo(type):
... def __prepare__(clsname, bases):
... return Namespace()
... def __setattr__(cls, name, value):
... print("In MetaFoo.__setattr__(): ", name, value)
... manage(name, value)
... super().__setattr__(name, value)
...
>>> class Foo(metaclass=MetaFoo):
... a = 33
...
In Namespace.__setitem__(): __module__ __main__
do something with __module__ __main__
In Namespace.__setitem__(): __qualname__ Foo
do something with __qualname__ Foo
In Namespace.__setitem__(): a 33
do something with a 33
>>> Foo.a = 33
In MetaFoo.__setattr__():  a 33
do something with a 33

--
Marco Buttu


Re: Unicode Objects in Tuples

2013-10-11 Thread Ian Kelly
On Fri, Oct 11, 2013 at 7:31 AM, Stephen Tucker  wrote:
> On the original question, well, I accept Ned's answer (at 10.22). I also
> like the idea of a helper function given by Peter Otten at 09.51. It still
> seems like a crutch to help poor old Python 2.X to do what any programmer
> (or, at least, programmers like me :-) ) think it ought to be able to do by
> itself. The distinction between the "geekiness" of a tuple compared with the
> "non-geekiness" of a string is, itself, far too geeky for my liking. The
> distinction seems to be an utterly spurious - even artificial or arbitrary
> one to me. (Sorry about the rant.)

I agree, and that's not how I would explain the distinction.  The str
of an object is meant to be human-readable, while the repr of an
object is meant to be something that could be pasted into the
interpreter to reconstruct the object.  In the case of tuples, the
repr of the tuple uses the reprs of the components because the
resulting string will more likely be acceptable to the interpreter,
and the str of the tuple is the same as the repr because there is no
convincing reason why it should be different.
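
To make the distinction concrete (a Python 3 sketch; the original
discussion concerns Python 2, where the same rules apply to unicode
objects):

```python
s = "café"
print(str(s))    # human-readable, no quotes: café
print(repr(s))   # quoted, pasteable into the interpreter: 'café'

t = (s, 3)
# A tuple's str is the same as its repr, and both use the
# components' reprs, so the string keeps its quotes:
print(str(t))    # ('café', 3)
assert str(t) == repr(t)
```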


Re: ANN: CUI text editor Kaa 0.0.4

2013-10-11 Thread Atsuo Ishimoto
Hi,

Thank you for your question.

> What's a "console user interface?" That's what Windows calls a "DOS
> box".  But otherwise you seem to imply it runs on Linux.

I meant something like a DOS box or a Linux terminal.
Unfortunately, Kaa does not work in a Windows DOS box, since Kaa requires
the curses library to run.

> So, what OS's does it support? Which terminal programs?

Currently, Kaa is tested on Mac OS X 10.8.5 and Ubuntu 13.04 box. I usually use
iTerm2 on Mac and Gnome Terminal on Ubuntu.

> Does it only work locally, or can it work from an ssh terminal?

It works both locally and over an ssh terminal.

> Can it run within screen?

Yes, but as with Emacs or bash, screen's default escape character (^A)
conflicts with Kaa, so I recommend assigning a different escape character
in screen.

Regards,


What version of glibc is Python using?

2013-10-11 Thread John Nagle
I'm trying to find out which version of glibc Python is using.
I need a fix that went into glibc 2.10 back in 2009.
(http://udrepper.livejournal.com/20948.html)

So I try the recommended way to do this, on a CentOS server:

/usr/local/bin/python2.7
Python 2.7.2 (default, Jan 18 2012, 10:47:23)
[GCC 4.4.6 20110731 (Red Hat 4.4.6-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import platform
>>> platform.libc_ver()
('glibc', '2.3')

This is telling me that the Python distribution built in 2012,
with a version of GCC released April 16, 2011, is using
glibc 2.3, released in October 2002.  That can't be right.

I tried this on a different Linux machine, a desktop running
Ubuntu 12.04 LTS:

Python 2.7.3 (default, April 10 2013, 06:20:15)
[GCC 4.6.3] on linux2
('glibc', '2.7')

That version of glibc is from October 2007.

Where are these ancient versions coming from?  They're
way out of sync with the GCC version.

John Nagle


Re: What version of glibc is Python using?

2013-10-11 Thread Christian Gollwitzer

Am 12.10.13 08:34, schrieb John Nagle:

I'm trying to find out which version of glibc Python is using.
I need a fix that went into glibc 2.10 back in 2009.
(http://udrepper.livejournal.com/20948.html)

So I try the recommended way to do this, on a CentOS server:

/usr/local/bin/python2.7
Python 2.7.2 (default, Jan 18 2012, 10:47:23)
[GCC 4.4.6 20110731 (Red Hat 4.4.6-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.

import platform
platform.libc_ver()

('glibc', '2.3')


Try

ldd /usr/local/bin/python2.7

Then execute the reported libc.so, which gives you some information.
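
Alternatively (a sketch assuming a Linux system whose libc.so.6 is glibc),
you can ask the running C library directly via ctypes, which sidesteps
platform.libc_ver()'s heuristic of scanning the executable file for
version strings:

```python
import ctypes

# gnu_get_libc_version() is a GNU extension; this only works
# where libc.so.6 is glibc (i.e. most Linux systems).
libc = ctypes.CDLL("libc.so.6")
libc.gnu_get_libc_version.restype = ctypes.c_char_p
print(libc.gnu_get_libc_version().decode())
```

This reports the glibc version actually loaded into the process, which is
the one that matters for a runtime fix like the one in glibc 2.10.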

Christian

