Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Gisle Vanem

Chris Angelico wrote:


> There's a specific search order. Back in the days of DOS, it was
> simply "com, then exe, then bat", but on modern Windowses, I think
> it's governed by an environment variable.


You probably mean '%PATHEXT%'. Mine is:
 .COM;.EXE;.BAT;.BTM;.CMD;.JS;.JSE;.WSF;.WSH;.MSC;.tcl;.py;.pyw;.pl;.htm;.html
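
For illustration, a rough sketch of that kind of %PATHEXT% lookup in Python
(the helper name and the fallback extension list are only assumptions;
cmd.exe's real rules differ in detail):

import os
import os.path

def find_executable(name):
    # Try each directory on PATH, first with the bare name, then with
    # each extension listed in PATHEXT.
    exts = os.environ.get('PATHEXT', '.COM;.EXE;.BAT;.CMD').split(';')
    for directory in os.environ.get('PATH', '').split(os.pathsep):
        for ext in [''] + exts:
            candidate = os.path.join(directory, name + ext)
            if os.path.isfile(candidate):
                return candidate
    return None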

In my favourite shell, 4NT, I can simply have:
  set .py=python

Instead of the Explorer associations that the Python installer puts
in my registry. As revealed from my shell:
  c:\> assoc .py
   .py=py_auto_file
  c:\> ftype py_auto_file
   py_auto_file="F:\ProgramFiler\Python27\python.exe" "%1"

In ShellExecuteEx(), what program gets launched for "py_auto_file" in this
case seems to be determined by the 'SHELLEXECUTEINFO.lpClass' member.
I don't see Python using this structure anywhere.

--
--gv
--
https://mail.python.org/mailman/listinfo/python-list


Re: Converting 5.223701009526849e-05 to 5e-05

2015-05-07 Thread Alexander Blinne
Am 03.05.2015 um 10:48 schrieb Ben Finney:
> That's not as clear as it could be. Better is to be explicit about
> choosing “exponential” format::
> 
> >>> foo = 5.223701009526849e-05
> >>> "{foo:5.0e}".format(foo=foo)
> '5e-05'
> 

Or, even better, the "general" format, which also works for 0.0:

>>> "{foo:.1g}".format(foo=5.223701009526849e-5)
'5e-05'
>>> "{foo:.1g}".format(foo=0.)
'0'

I guess all roads lead to Rome...

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Throw the cat among the pigeons

2015-05-07 Thread Alain Ketterlin
Dave Angel  writes:

> On 05/06/2015 11:36 AM, Alain Ketterlin wrote:
>> Yes, plus the time for memory allocation. Since the code uses "r *=
>> ...", space is reallocated when the result doesn't fit. The new size is
>> probably proportional to the current (insufficient) size. This means
>> that overall, you'll need fewer reallocations, because allocations are
>> made in bigger chunks.
>
> That sounds plausible, but  a=5; a*=4  does not update in place. It
> calculates and creates a new object.  Updating lists can work as you
> say, but an int is immutable.

Ah, okay. Even for big ints? If that is the case, my suggestion doesn't
explain anything. Anyway, with so many allocations for so little
arithmetic, the difference is probably due to the behavior of the
allocator (which maybe always finds blocks big enough, since one was
released after the previous multiplication, or something like that). The
only way to know would be to profile the VM.

> It's an optimization that might be applied if the code generator were
> a lot smarter, (and if the ref count is exactly 1), but it would then
> be confusing to anyone who used id().

"Abandon all hope, ye [optimizer] who enter here."

Thanks for the clarification.

-- Alain.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Throw the cat among the pigeons

2015-05-07 Thread Chris Angelico
On Thu, May 7, 2015 at 7:14 PM, Alain Ketterlin
 wrote:
> Dave Angel  writes:
>
>> On 05/06/2015 11:36 AM, Alain Ketterlin wrote:
>>> Yes, plus the time for memory allocation. Since the code uses "r *=
>>> ...", space is reallocated when the result doesn't fit. The new size is
>>> probably proportional to the current (insufficient) size. This means
>>> that overall, you'll need fewer reallocations, because allocations are
>>> made in bigger chunks.
>>
>> That sounds plausible, but  a=5; a*=4  does not update in place. It
>> calculates and creates a new object.  Updating lists can work as you
>> say, but an int is immutable.
>
> Ah, okay. Even for big ints? If that is the case, my suggestion doesn't
> explain anything. Anyway, with so many allocations for so little
> arithmetic, the difference is probably due to the behavior of the
> allocator (which maybe always finds blocks big enough, since one was
> released after the previous multiplication, or something like that). The
> only way to know would be to profile the VM.

Yes, all integers are immutable. This is true regardless of the size
of the integer, because:

x = some_big_long_calculation()
y = x
y += 1

should never change the value of x.
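
A quick interactive check with a big int (illustrative CPython session):

>>> x = 10**100
>>> y = x
>>> y += 1
>>> x == y
False
>>> y is x
False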

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Stefan Zimmermann
Nice to see that my topic gains such interest :)
And I see that I should have gone into more detail about what I'm actually 
trying to point out.

Chris Angelico wrote:
> Hmm... hm... Ha! Found the difference. I had an explicit shebang on my
> script; yours just starts out with shell commands. That means that
> your shell script wasn't truly executable, and thus requires a shell
> to execute it. Try adding "#!/bin/sh" to the top and rerun that - at
> that point, it becomes kernel-executable instead of just
> shell-executable.

That's the big advantage of Unix. You can write a kernel-executable script 
without any file extension, just by putting a shebang at the beginning of that 
file. And for the caller it makes no difference whether 'command' is a binary or a 
script; Popen('command') works in both cases, without the shell=True 
overhead.

Steven D'Aprano wrote:
> Apart from any other number of problems, surely having "foo" alone run 
> foo.exe, foo.bat etc. is at best confusing and at worst a security risk? 
> What if you have *both* foo.exe and foo.bat in the same directory?

On Unix you can shadow any binary with a wrapper script of the same name 
located in a directory appearing earlier in $PATH. Any caller will automatically run 
your script instead of the original binary. And that's usually seen as a big 
advantage on Unix.
On Windows executability depends on the file extension, and if you want to wrap 
some command.exe you usually write a command.bat in a path with higher 
precedence. And on Windows it's standard that .exe, .com, .bat and .cmd files 
should be callable without writing the file extension.
And as already mentioned, there is a defined precedence order if they are in 
the same directory.
That's no more or less of a security risk than shadowing binaries with scripts on 
Unix.

My point is that, compared to Unix, it's just a big disadvantage on Windows that 
subprocess.Popen(['command']) can only call command.exe implicitly, which 
makes it impossible to work with custom wrapper .bat or .cmd scripts without 
the shell=True overhead.
And this is actually confusing for a Windows user.
You write a wrapper .bat for some .exe and wonder why your Python script 
doesn't use it.

And the FindExecutable() function from the win32 API would just be the perfect 
solution for implementing this.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Stefan Zimmermann
And last but not least, Popen behavior on Windows makes it difficult to write 
OS-independent Python code which calls external commands that are not binary by 
default:

Two examples:

1. I wrote a coffeetools package which wraps the CoffeeScript compiler in a 
Python API. The 'coffee' command is a Node.js script, and under Windows it is 
installed with a 'coffee.cmd' wrapper to make it callable. So to make Popen 
work you have to switch and call 'coffee' on Unix and 'coffee.cmd' on Windows 
(see the sketch after these examples). But from the Windows shell you can just 
call 'coffee'. Maybe in the future the .cmd changes to .bat ...

2. I use the embedded portable git from SourceTree instead of the standard Windows 
git installation. It has a git.bat wrapper which calls the internal git.exe 
(which must sit in the same dir as a lot of other bundled ported Unix tools, so 
it's not recommended to add that dir to PATH). That made the dulwich 
package unworkable for me because it just tries to Popen(['git', ...]). And I 
am currently trying to get the dulwich developers to accept my pull request with 
a workaround...
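
A minimal sketch of the per-OS switch both examples force you to write 
(the command names and flags are only illustrative):

import os
import subprocess

# Pick the wrapper name by hand, because Popen won't do it for us on Windows.
coffee = 'coffee.cmd' if os.name == 'nt' else 'coffee'
git = 'git.bat' if os.name == 'nt' else 'git'

subprocess.check_call([coffee, '--version'])
subprocess.check_call([git, '--version'])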
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Stefan Zimmermann :

> And last but not least, Popen behavior on Windows makes it difficult
> to write OS-independent Python code which calls external commands that
> are not binary by default:

Then, write OS-dependent Python code.

I don't think it's Python's job to pave over OS differences. Java does
that by not offering precious system facilities -- very painful. Python
is taking steps in that direction, but I hope it won't go too far.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Thu, May 7, 2015 at 8:10 PM, Marko Rauhamaa  wrote:
> Stefan Zimmermann :
>
>> And last but not least, Popen behavior on Windows makes it difficult
>> to write OS-independent Python code which calls external commands that
>> are not binary by default:
>
> Then, write OS-dependent Python code.
>
> I don't think it's Python's job to pave over OS differences. Java does
> that by not offering precious system facilities -- very painful. Python
> is taking steps in that direction, but I hope it won't go too far.

On the contrary, I think it *is* a high level language's job to pave
over those differences. Portable C code generally has to have a
whopping 'configure' script that digs into your hardware, OS, library,
etc availabilities, and lets you figure out which way to do things.
Python code shouldn't need to worry about that. You don't need to care
whether you're on a 32-bit or 64-bit computer; you don't need to care
whether it's an Intel chip or a RISCy one; you shouldn't have to
concern yourself with the difference between BSD networking and
WinSock. There'll be a handful of times when you do care, and for
those, it's nice to have some facilities exposed; but the bulk of code
shouldn't need to know about the platform it's running on.

Java went for a philosophy of "write once, run anywhere" in its early
days, and while that hasn't exactly been stuck to completely, it's
still the reasoning behind the omission of certain system facilities.
Python accepts and understands that there will be differences, so you
can't call os.getuid() on Windows, and there are a few restrictions on
the subprocess module if you want maximum portability, but the bulk of
your code won't be any different on Linux, Windows, Mac OS, OS/2,
Amiga, OS/400, Solaris, or a MicroPython board.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Dave Angel

On 05/07/2015 06:24 AM, Chris Angelico wrote:

> On Thu, May 7, 2015 at 8:10 PM, Marko Rauhamaa  wrote:
>> Stefan Zimmermann :
>>
>>> And last but not least, Popen behavior on Windows makes it difficult
>>> to write OS-independent Python code which calls external commands that
>>> are not binary by default:
>>
>> Then, write OS-dependent Python code.
>>
>> I don't think it's Python's job to pave over OS differences. Java does
>> that by not offering precious system facilities -- very painful. Python
>> is taking steps in that direction, but I hope it won't go too far.
>
> On the contrary, I think it *is* a high level language's job to pave
> over those differences. Portable C code generally has to have a
> whopping 'configure' script that digs into your hardware, OS, library,
> etc availabilities, and lets you figure out which way to do things.
> Python code shouldn't need to worry about that. You don't need to care
> whether you're on a 32-bit or 64-bit computer; you don't need to care
> whether it's an Intel chip or a RISCy one; you shouldn't have to
> concern yourself with the difference between BSD networking and
> WinSock. There'll be a handful of times when you do care, and for
> those, it's nice to have some facilities exposed; but the bulk of code
> shouldn't need to know about the platform it's running on.
>
> Java went for a philosophy of "write once, run anywhere" in its early
> days, and while that hasn't exactly been stuck to completely, it's
> still the reasoning behind the omission of certain system facilities.
> Python accepts and understands that there will be differences, so you
> can't call os.getuid() on Windows, and there are a few restrictions on
> the subprocess module if you want maximum portability, but the bulk of
> your code won't be any different on Linux, Windows, Mac OS, OS/2,
> Amiga, OS/400, Solaris, or a MicroPython board.
>
> ChrisA



It's a nice goal.  But these aren't OS features in Windows, they're 
shell features.  And there are several shells.  If the user has 
installed a different shell, is it Python's job to ignore it and 
simulate what cmd.exe does?


Seems to me that's what shell=True is for.  It signals Python that we're 
willing to trust the shell to do whatever magic it chooses, from adding 
extensions, to calling interpreters, to changing search order, to 
parsing the line in strange ways, to setting up temporary environment 
contexts, etc.


If there were just one shell, it might make sense to emulate its 
features.  Or it might make sense to contort its features to look like a 
Unix shell.  But with multiple possibilities, it seems that's more like 
space for a 3rd party library.


--
DaveA
--
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Thu, May 7, 2015 at 9:28 PM, Dave Angel  wrote:
> It's a nice goal.  But these aren't OS features in Windows, they're shell
> features.  And there are several shells.  If the user has installed a
> different shell, is it Python's job to ignore it and simulate what cmd.exe
> does?

It might be an unattainable goal (in fact, it almost certainly is),
but I was specifically disagreeing with the notion that it's right and
normal to write a bunch of platform-specific code in Python. That
should be the rarity.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Chris Angelico :

> I was specifically disagreeing with the notion that it's right and
> normal to write a bunch of platform-specific code in Python. That
> should be the rarity.

Why is that?

Code is written for a specific need and environment. Often trying to
write generic solutions leads to cumbersome and clunky results on *all*
platforms.

A software system is defined through its interfaces. Natural system
interfaces are very different under different operating systems. The
chosen programming language for whatever component is often an
afterthought. I'm glad I can still write native Linux code using Python.
I couldn't do that with Java, which doesn't have things like os.fork(),
file descriptors or process ids.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Thu, May 7, 2015 at 10:41 PM, Marko Rauhamaa  wrote:
> Chris Angelico :
>
>> I was specifically disagreeing with the notion that it's right and
>> normal to write a bunch of platform-specific code in Python. That
>> should be the rarity.
>
> Why is that?
>
> Code is written for a specific need and environment. Often trying to
> write generic solutions leads to cumbersome and clunky results on *all*
> platforms.
>
> A software system is defined through its interfaces.

And the most important interface is with a human. Humans are the same
whether you're running under Windows, Linux, or anything else. If you
want to write single-platform code, go for it; but if you want to
write cross-platform code, the best way is to let someone else take
care of the differences, abstracting them away into a nice tidy thing
that we call a high-level language.

I don't need forking, file descriptors, or process IDs, to describe
how a person uses my code. Those are *implementation details*. Now, it
might be that I have to concern myself with some of them. Maybe I want
to get optimal performance out of something, and that means using
multiple processes and managing them properly. Maybe I need to
interface with systemd, respond to dozens of different process-level
signals, use directory notifications, and do a bunch of other
Linux-only things, so maybe it's just completely impractical to
consider supporting even BSD-based Unixes, much less Windows. So be
it. But to the greatest extent possible, Python should let me write
code that doesn't care about any of that.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Chris Angelico :

>> A software system is defined through its interfaces.
>
> And the most important interface is with a human.

I barely ever program anything for the human interface.

> If you want to write single-platform code, go for it; but if you want
> to write cross-platform code, the best way is to let someone else take
> care of the differences, abstracting them away into a nice tidy thing
> that we call a high-level language.

You suggested most software should be platform-agnostic. Now you are
qualifying the statement.

But still, I challenge the notion that you could write a web site, game
or application that feels natural on the XBox, iPhone, Windows PC and
LXDE at the same time without significant amounts of
platform-conditioned parts.

> I don't need forking, file descriptors, or process IDs, to describe
> how a person uses my code. Those are *implementation details*.

Even if I programmed for the human and the UI experience were
more-or-less identical between platforms, the system interfaces can be
conceptually quite different. Heroic attempts have been made to overcome
those differences with generic APIs. However, Python should stay out of
that crusade.

Whole programming cultures, idioms and "right ways" differ between
platforms. What's the right way to write a service (daemon)? That's
probably completely different between Windows and Linux. Linux itself is
undergoing a biggish transformation there: an exemplary daemon of last
year will likely be deprecated within a few years.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Thu, May 7, 2015 at 11:44 PM, Marko Rauhamaa  wrote:
> Chris Angelico :
>
>>> A software system is defined through its interfaces.
>>
>> And the most important interface is with a human.
>
> I barely ever program anything for the human interface.
>
>> If you want to write single-platform code, go for it; but if you want
>> to write cross-platform code, the best way is to let someone else take
>> care of the differences, abstracting them away into a nice tidy thing
>> that we call a high-level language.
>
> You suggested most software should be platform-agnostic. Now you are
> qualifying the statement.

I'm qualifying it because it's impossible to write 100%
platform-agnostic code without restricting yourself far too much; but
that doesn't mean that it isn't a worthwhile aim.

> But still, I challenge the notion that you could write a web site, game
> or application that feels natural on the XBox, iPhone, Windows PC and
> LXDE at the same time without significant amounts of
> platform-conditioned parts.

Hmm, you're picking up some very different things there. When a human
picks up an iPhone, s/he expects to use it with a touch-based
interface; I don't know what the normal UI for an Xbox is, but Xbox
users would; and the most normal interface for LXDE would be a
mouse+keyboard. The ideal UI for each of them will differ. This is the
same as coding your application differently if you expect a blind
person to use it, or if you want to make it possible to use your
program in a theatre without disturbing the audience, or any other UI
constraint you wish to concoct. That's nothing to do with platform. If
you write a program for Ubuntu, it might go onto a tablet or a
desktop, and the ideal UI for those is (in my opinion, though not
apparently in Unity's) different.

But if you design your program to be used with the same fundamental
human interface - say, a mouse and a keyboard - then you should be
able to do that the same way on many platforms. I've seen libraries
that let you build an ncurses-like interface or a full GUI window,
using exactly the same application code. It's not difficult.

>> I don't need forking, file descriptors, or process IDs, to describe
>> how a person uses my code. Those are *implementation details*.
>
> Even if I programmed for the human and the UI experience were
> more-or-less identical between platforms, the system interfaces can be
> conceptually quite different. Heroic attempts have been made to overcome
> those differences with generic APIs. However, Python should stay out of
> that crusade.
>
> Whole programming cultures, idioms and "right ways" differ between
> platforms. What's the right way to write a service (daemon)? That's
> probably completely different between Windows and Linux. Linux itself is
> undergoing a biggish transformation there: an exemplary daemon of last
> year will likely be deprecated within a few years.

And that's where a library function can be really awesome. What's the
right way to daemonize? "import daemonize; daemonize.daemonize()"
seems good to me. Maybe there's platform-specific code in the
*implementation* of that, but in your application, no. That's the job
of a layer underneath you.

Incidentally, the way I'm seeing things shift these days is mainly
toward *not* daemonizing your services at all. That makes life a lot
easier; instead of writing special code to put yourself in the
background, you just write your code to the standard basic "glass
teletype" model, and then add a little config file that makes it run
in the background. But a Python module could provide a generic
"install as service" function, which will create a systemd config
file, or a Windows service whatever-it-is, or the equivalent on a Mac,
or an Upstart job file, or whatever it detects. Same difference. A
library takes care of all of that.

In Python, we have the 'subprocess' module. Due to Windows
limitations, you have to restrict yourself to having an importable
main file if you want perfect cross-platform compatibility, but that
doesn't affect how your code runs on Linux or Mac OS. What's the best
way to farm work off to a bunch of processes and have them communicate
their results back? You use the subprocess module, and then it doesn't
matter whether they use Unix sockets, named pipes, physical files on
the disk, or miniature nuclear explosions, they'll communicate back
just fine. And when someone develops a new platform that uses nuclear
fusion instead of fission for interprocess communication, Python's
standard library gets enhanced, and your code instantly works - you
don't have to specifically handle the new case.

That's Python's job. Abstracting away all those differences so you
don't have to look at them.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Ian Kelly
On Thu, May 7, 2015 at 8:03 AM, Chris Angelico  wrote:
> On Thu, May 7, 2015 at 11:44 PM, Marko Rauhamaa  wrote:
>> Whole programming cultures, idioms and "right ways" differ between
>> platforms. What's the right way to write a service (daemon)? That's
>> probably completely different between Windows and Linux. Linux itself is
>> undergoing a biggish transformation there: an exemplary daemon of last
>> year will likely be deprecated within a few years.
>
> And that's where a library function can be really awesome. What's the
> right way to daemonize? "import daemonize; daemonize.daemonize()"
> seems good to me. Maybe there's platform-specific code in the
> *implementation* of that, but in your application, no. That's the job
> of a layer underneath you.

https://www.python.org/dev/peps/pep-3143/
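
For reference, the interface that PEP specifies is a context manager; a minimal
sketch (assuming the reference 'python-daemon' package, with a hypothetical
main loop):

import daemon

def run_the_service():
    ...   # hypothetical main loop

with daemon.DaemonContext():
    run_the_service()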
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Chris Angelico :

> What's the best way to farm work off to a bunch of processes and have
> them communicate their results back? You use the subprocess module,
> and then it doesn't matter whether they use Unix sockets, named pipes,
> physical files on the disk, or miniature nuclear explosions, they'll
> communicate back just fine. And when someone develops a new platform
> that uses nuclear fusion instead of fission for interprocess
> communication, Python's standard library gets enhanced, and your code
> instantly works - you don't have to specifically handle the new case.
>
> That's Python's job. Abstracting away all those differences so you
> don't have to look at them.

That's the difference between our opinions: you want Python to work the
same on different OS's. I want Python's system programming facilities to
closely mirror those of C.

Take your example of subprocess.Popen. It may be essential to know that
the communication channel is a pipe with standard pipe semantics. The
child program might not be written in Python, after all. In fact, at
system design level you shouldn't care what language you use as long as
the communication interfaces are specified.

Java is lousy at system programming (while excellent in many other
respects). It has traditionally tried to avoid the associated issues by
effectively mandating that all parts of a system be written in Java.
Gladly, Python hasn't (yet) made the same mistake.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Stefan Zimmermann
This discussion is getting really interesting and far beyond the actual topic :)
I want to add some additional thoughts on Popen:

Marko Rauhamaa wrote:
> Stefan Zimmermann:
> 
> > And last but not least, Popen behavior on Windows makes it difficult
> > to write OS-independent Python code which calls external commands that
> > are not binary by default:
> 
> Then, write OS-dependent Python code.

Then you even have to write tool-distribution-dependent code.
Especially Unix tools are often distributed in many different variants for 
Windows. Some installers expose .exe, some .bat, some .cmd files to the user. 
So you always have to explicitly support every variant. Or run everything through 
'cmd /c ...', which, as mentioned, is a real overhead for .exe files. Or you 
have to manually use the win32 API, just to call a simple external tool.

Calling an external command should be one of the simplest tasks in a high level 
scripting language like Python. And that should not involve any OS-specific 
differences, unless you want to use some advanced process handling features 
which are only supported by some specific OS.

Meanwhile I checked Ruby and Perl regarding this feature. Both support it. In 
both languages every standard function that calls external commands (like Perl's 
exec() or system(), or Ruby's exec(), system() or IO.popen()), whether it 
invokes a shell or calls the command directly, supports running 'tool.bat' or 
'tool.cmd' by just writing 'tool'. Python almost seems to be the only major 
scripting language which does not support this implicitly.

Dave Angel wrote:
> It's a nice goal.  But these aren't OS features in Windows, they're 
> shell features.  And there are several shells.  If the user has 
> installed a different shell, is it Python's job to ignore it and 
> simulate what cmd.exe does?

In fact, it's something between OS and shell. Yes, .bat and .cmd files are 
always run through cmd.exe. But on the OS level they are also considered 
executable files. And that doesn't depend on %PATHEXT% or any registered 
applications for file extensions on the Explorer or shell level. On the OS 
level .exe, .com, .bat and .cmd are the exclusive set of extensions which are 
considered executable. Not more and not less. When you search for the path 
of an executable with the win32 API, you can call FindExecutable with only 
'tool' and you will get '...\tool.exe', '.com', '.bat' or '.cmd'. Whatever is 
found first according to PATH and precedence. Not more and not less. You can 
try that via pywin32 using win32api.FindExecutable().
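
A minimal sketch of that check (assumes pywin32 is installed; 'tool' is just a
placeholder for something actually on PATH):

import win32api

# FindExecutable returns a (handle, path) pair; index 1 is the resolved path,
# e.g. '...\tool.exe', '.com', '.bat' or '.cmd', whichever is found first.
handle, path = win32api.FindExecutable('tool')
print(path)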

And interestingly, Popen can call .bat and .cmd scripts directly if you 
explicitly specify the extension, even with shell=False. But it can't call any 
other file types -- even ones that work in the shell because some application is 
registered for them at the Explorer level -- unless shell=True.
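
Roughly, the behaviour described above looks like this (hypothetical 'tool'
names, Windows only):

import subprocess

subprocess.Popen(['tool.bat'])        # works even with shell=False, extension given
subprocess.Popen(['tool'])            # only finds tool.exe; a tool.bat wrapper is missed
subprocess.Popen('tool', shell=True)  # finds tool.bat too, via cmd.exe (extra overhead)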

On Windows, .bat and .cmd scripts have a special status beginning at the lowest 
OS level, and a Windows user normally expects that any scripting language should 
be able to run them without an explicit extension. Other major languages do it. 
Why not Python, too?
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Stefan Zimmermann :

> Calling an external command should be one of the simplest tasks in a
> high level scripting language like Python.

Actually, that's quite a tricky operation in any OS. For example, bash's
simplicity is a trap that claims a lot of victims.

Anyway, Python has os.system() that does the quick and dirty thing you
might be looking for.

> And that should not involve any OS-specific differences, unless you
> want to use some advanced process handling features which are only
> supported by some specific OS.

I can't speak for Windows, but under Linux, it gets advanced pretty
quickly (what gets inherited, how about zombies, how about signals, etc
etc).

> Meanwhile I checked Ruby and Perl regarding this feature. Both support
> it. In both langs every standard function that calls external commands
> (like Perl's exec() or system() or Ruby's exec() or system() or
> IO.popen()), whether they invoke a shell or call it directly, support
> running 'tool.bat' or 'tool.cmd' by just writing 'tool'. Python almost
> seems to be the only major scripting language which does not support
> this implicitly.

I'm not against subprocess.Popen() doing its work under Windows the way
Windows system programmers would expect. I'm against trying to force
Windows and Linux into the same mold where there are genuine
differences.

If I were a Windows developer, I'd expect Python to support something
analogous to what I'd have in C++ or C#. If pipes are natural IPC
channels under Windows, then subprocess.Popen() is probably pretty close
to its mark. However, most of the IPC facilities listed here:

   https://msdn.microsoft.com/en-us/library/windows/desktop/aa365574%28v=vs.85%29.aspx

seem to be absent in Python (clipboard, COM, data copy, DDE, file
mapping, mailslots, rpc).


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Multi-threaded TCP server class for general use

2015-05-07 Thread mark . r . bannister
Hi,

I needed to develop a highly scalable multi-threaded TCP server in Python, and 
when I started writing it in 2013 I could not find a suitable library that 
would scale the way I needed but was also easy to use.

So I invented one - it's called Pyloom.  If you want to take a look, it's part 
of my DBIS project at the moment: 
https://sourceforge.net/p/dbis/code/ci/default/tree/src/pyloom

Question: does anyone see the value of this library?  Would you like me to fork 
it as a separate project?  Is SF.net good enough or do people prefer Python 
libraries to be hosted somewhere else?

In a nutshell, Pyloom is a multi-threaded TCP server class which you overload 
in your own program.  The library provides:

* Connection management.

* 1 or more marshal threads that listen for and pick up new connections, passing 
them to a pool of dedicated worker threads, each of which can manage multiple sessions.

* Methods that you override for handling various states: new connection, data 
received, send data, close connection.

* Can track custom sockets and wake up a session when there is I/O on the 
socket.

* Has a notification service so that one session can wait for data to be 
processed/collected by a different session, and can be woken up when the data 
is ready.

Let me know if anyone is interested in re-using this library.  Tested ok on 
Linux and Solaris, not tried on Windows yet.

Best regards,
Mark.
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Stefan Zimmermann
Marko Rauhamaa wrote:
> Anyway, Python has os.system() that does the quick and dirty thing you
> might be looking for.

Always invokes shell ==> overhead for .exe files
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 1:14 AM, Ian Kelly  wrote:
> On Thu, May 7, 2015 at 8:03 AM, Chris Angelico  wrote:
>> On Thu, May 7, 2015 at 11:44 PM, Marko Rauhamaa  wrote:
>>> Whole programming cultures, idioms and "right ways" differ between
>>> platforms. What's the right way to write a service (daemon)? That's
>>> probably completely different between Windows and Linux. Linux itself is
>>> undergoing a biggish transformation there: an exemplary daemon of last
>>> year will likely be deprecated within a few years.
>>
>> And that's where a library function can be really awesome. What's the
>> right way to daemonize? "import daemonize; daemonize.daemonize()"
>> seems good to me. Maybe there's platform-specific code in the
>> *implementation* of that, but in your application, no. That's the job
>> of a layer underneath you.
>
> https://www.python.org/dev/peps/pep-3143/

Precisely. It's definitely within the language's purview; that that
PEP is deferred is not due to it being a bad idea for the
language/stdlib to deal with these differences.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 1:24 AM, Marko Rauhamaa  wrote:
>> That's Python's job. Abstracting away all those differences so you
>> don't have to look at them.
>
> That's the difference between our opinions: you want Python to work the
> same on different OS's. I want Python's system programming facilities to
> closely mirror those of C.

In that case, what you should do is devise an alternative language
that compiles as a thin layer over C. Have a simple translation script
that turns your code into actual C, then passes it along for
compilation. You could add whatever Python-like features you want
(memory management, maybe), but still be executing as C code. But you
don't want a high level language, if your greatest goal is "closely
mirror C".

> Take your example of subprocess.Popen. It may be essential to know that
> the communication channel is a pipe with standard pipe semantics. The
> child program might not be written in Python, after all. In fact, at
> system design level you shouldn't care what language you use as long as
> the communication interfaces are specified.

Ah, but that's a completely different thing. If you're working with a
child that isn't written in Python, then you're not working at the
level of the multiprocessing library - you're working with "I need to
give this something on its stdin and get the result back from its
stdout". For that, yes, you need something a bit more concrete; but
now it's become part of the API for that child process, whereas the
example I was giving was about the multiprocessing library, where it's
part of the implementation.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: Multi-threaded TCP server class for general use

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 8:20 AM,   wrote:
> I needed to develop a highly scalable multi-threaded TCP server in Python and 
> when I started writing it in 2013 I could not find a suitable library that 
> would scale the way I needed but also easy to use.
>
> So I invented one - it's called Pyloom.  If you want to take a look, it's 
> part of my DBIS project at the moment: 
> https://sourceforge.net/p/dbis/code/ci/default/tree/src/pyloom
>
> Question: does anyone see the value of this library?

I haven't looked at your actual code, but one thing I would suggest
considering is the new asyncio facilities that are coming in Python
3.5 - or the existing asyncio that came in provisionally with 3.4. For
ultimate scalability, you may find yourself wanting fewer threads and
more asynchronicity, and this is something that's looking pretty cool
and awesome.
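
For a rough idea of the shape of that approach, a minimal asyncio echo server
in the provisional 3.4 style (handler name and port are only illustrative):

import asyncio

@asyncio.coroutine
def handle(reader, writer):
    # Echo one chunk back to the client, then close the connection.
    data = yield from reader.read(1024)
    writer.write(data)
    yield from writer.drain()
    writer.close()

loop = asyncio.get_event_loop()
server = loop.run_until_complete(asyncio.start_server(handle, '127.0.0.1', 8888))
loop.run_forever()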

(It's also looking like over a thousand, maybe over two thousand,
posts on python-ideas. I have not, I regret to say, been reading every
single post.)

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Ben Finney
Chris Angelico  writes:

> On Fri, May 8, 2015 at 1:24 AM, Marko Rauhamaa  wrote:
> >> That's Python's job. Abstracting away all those differences so you
> >> don't have to look at them.
> >
> > That's the difference between our opinions: you want Python to work
> > the same on different OS's. I want Python's system programming
> > facilities to closely mirror those of C.
>
> In that case, what you should do is devise an alternative language
> that compiles as a thin layer over C. […] But you don't want a high
> level language, if your greatest goal is "closely mirror C".

+1.

Marko, you have many times criticised Python on the basis, essentially,
that it is not some other platform. It's quite unproductive, and leads
only to discussions that are at best frustrating for all involved.

If you want a platform that is fundamentally different from Python,
there are plenty available for you. Arguing that Python should be
fundamentally different will not avail us anything good.

-- 
 \  “Anyone who believes exponential growth can go on forever in a |
  `\finite world is either a madman or an economist.” —Kenneth |
_o__) Boulding |
Ben Finney

-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Rustom Mody
On Wednesday, May 6, 2015 at 11:19:07 AM UTC+5:30, Steven D'Aprano wrote:
> On Wednesday 06 May 2015 14:47, Rustom Mody wrote:
> 
> > It strikes me that the FP crowd has stretched the notion of function
> > beyond recognition And the imperative/OO folks have distorted it beyond
> > redemption.
> 
> In what way?
> 
> 
> > And the middle road shown by Pascal has been overgrown with weeds for some
> > 40 years...
> 
> As much as I like Pascal, and am pleased to see someone defending it, I'm 
> afraid I have no idea what you mean by this.

There are many hats...

As a programmer what you say is fine.
As a language-designer maybe even required -- one would expect a language 
designer to invoke Occam's razor and throw away needless distinctions (read: 
syntax categories)

But there are other hats.
Even if you disregard the teacher-hat as irrelevant, the learner-hat is 
universal
-- whatever you are master of now is because at some past point you walked its
learning curve.

> 
> 
> > If the classic Pascal (or Fortran or Basic) sibling balanced abstractions
> > of function-for-value procedure-for-effect were more in the collective
> > consciousness rather than C's travesty of function, things might not have
> > been so messy.
> 
> I'm not really sure that having distinct procedures, as opposed to functions 
> that you just ignore their return result, makes *such* a big difference. Can 
> you explain what is the difference between these?
> 
> sort(stuff)  # A procedure.
> sort(stuff)  # ignore the function return result
> 
> And why the first is so much better than the second?

Here are 3 None-returning functions/methods in python.
ie semantically the returns are identical. Are they conceptually identical?

>>> x=print(1)
1
>>> x
>>> ["hello",None].__getitem__(1)
>>> {"a":1, "b":2}.get("c")
>>> 



> 
> 
> 
> > Well... Dont feel right bashing C without some history...
> > 
> > C didn't start the mess of mixing procedure and function -- Lisp/Apl did.
> > Nor the confusion of = for assignment; Fortran did that.
> 
> Pardon, but = has been used for "assignment" for centuries, long before 
> George Boole and Ada Lovelace even started working on computer theory. Just 
> ask mathematicians:
> 
> "let y = 23"
> 
> Is that not a form of assignment?

Truth?? Let's see...

Here is a set of assignments (as you call them) that could occur in a school text
teaching business-math, viz how to calculate 'simple interest':

amt = prin + si
si = prin * n * rate/100.0 
# for instance
prin = 1000.0 
n = 4.0
rate = 12.0

Put it into python and you get Name errors.

Put it into Haskell or some such (after changing the comment char from '#' to 
'--') and you'll get amt and si.

Now you can view this operationally (which in some cases even the Haskell docs 
do) and say a bunch of ='s in Python is a sequence whereas in Haskell it's 
'simultaneous', i.e. a set.

But this would miss the point, viz. that in Python
amt = prin + si
denotes an *action*,
whereas in Haskell it denotes a *fact* -- and a bunch of facts whose meaning 
changed when permuted would be a strange bunch!

Note I am not saying Haskell is right; particularly in its semantics of '='
there are serious issues:
http://blog.languager.org/2012/08/functional-programming-philosophical.html

Just that
x = y
in a functional/declarative/math framework means something quite different 
than in an imperative one
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Rustom Mody
On Wednesday, May 6, 2015 at 6:41:38 PM UTC+5:30, Dennis Lee Bieber wrote:
> On Tue, 5 May 2015 21:47:17 -0700 (PDT), Rustom Mody declaimed the following:
> 
> >If the classic Pascal (or Fortran or Basic) sibling balanced abstractions of 
> >function-for-value
> >procedure-for-effect were more in the collective consciousness rather than 
> >C's
> >travesty of function, things might not have been so messy.
> >
>   I suspect just the term "subprogram" (as in "function subprogram" and
> "subroutine subprogram") would confuse a lot of cubs these days...
> 
> >C didn't start the mess of mixing procedure and function -- Lisp/Apl did.
> >Nor the confusion of = for assignment; Fortran did that.
> 
>   I don't think you can blame FORTRAN for that, given that it was one of
> the first of the higher level languages, and had no confusion internally...

BLAME?? Ha! You are being funny, Dennis!

There are fat regulation-books that pilots need to follow. Breaking them can
make one culpable all the way to homicide.
The Wright brothers probably broke them all. Should we call them homicidal 
maniacs?

Shakespeare sometimes has funny spellings. I guess he's illiterate for not 
turning on the spell-checker in Word?

Or [my favorite] Abraham Lincoln used the word 'negro'. So he's a racist?

Speaking more conceptually, there are pioneers and us ordinary folks.¹
The world as we know it is largely a creation of these pioneers.
And if you take them and stuff them into the statistically ordinary mold, they 
fit badly.

That puts people, especially teachers, into a bind.
If I expected my students to be 1/100th as pioneering as Backus, Thompson, Ritchie 
etc., I would be foolish.
And if I don't spell out all their mistakes in minute detail and pull them up
for repeating them, I'd not be doing due diligence.


I guess this is nowadays called the 'romantic' view.
Ask the family-members of any of these greats for the 'other' view  :-)
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 2:06 PM, Rustom Mody  wrote:
>> > If the classic Pascal (or Fortran or Basic) sibling balanced abstractions
>> > of function-for-value procedure-for-effect were more in the collective
>> > consciousness rather than C's travesty of function, things might not have
>> > been so messy.
>>
>> I'm not really sure that having distinct procedures, as opposed to functions
>> that you just ignore their return result, makes *such* a big difference. Can
>> you explain what is the difference between these?
>>
>> sort(stuff)  # A procedure.
>> sort(stuff)  # ignore the function return result
>>
>> And why the first is so much better than the second?
>
> Here are 3 None-returning functions/methods in python.
> ie semantically the returns are identical. Are they conceptually identical?
>
> >>> x=print(1)
> 1
> >>> x
> >>> ["hello",None].__getitem__(1)
> >>> {"a":1, "b":2}.get("c")


Your second example is a poor one, as it involves calling a dunder
method. But all you've proven with that is that the empty return value
of Python is a real value, and can thus be stored and retrieved.

With print(), you have a conceptual procedure - it invariably returns
None, so it'll normally be called in contexts that don't care about
the return value. What about sys.stdout.write(), though? That's most
often going to be called procedurally - you just write your piece and
move on - but it has a meaningful return value. In normal usage, both
are procedures. The ability to upgrade a function from "always returns
None" to "returns some occasionally-useful information" is one of the
strengths of a unified function/procedure system.
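
For instance (illustrative Python 3 session; the interactive interpreter echoes
the return value after the output):

>>> import sys
>>> sys.stdout.write("hi\n")
hi
3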

With .get(), it's actually nothing to do with procedures vs functions,
but more to do with Python's use of None where C would most likely use
NULL. By default, .get() uses None as its default return value, so
something that isn't there will be treated as None. In the same way as
your second example, that's just a result of None being a real value
that you can pass around - nothing more special than that.

The other thing you're seeing here is that the *interactive
interpreter* treats None specially. In a program, an expression
statement simply discards its result, whether it's None or 42 or
[1,2,3] or anything else. You could write an interactive interpreter
that has some magic that recognizes that certain functions always
return None (maybe by checking their annotations), and omits printing
their return values, while still printing the return values of other
functions. I'm not sure what it'd gain you, but it wouldn't change the
concepts or semantics surrounding None returns.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Rustom Mody
On Friday, May 8, 2015 at 10:04:02 AM UTC+5:30, Chris Angelico wrote:
> On Fri, May 8, 2015 at 2:06 PM, Rustom Mody wrote:
> >> > If the classic Pascal (or Fortran or Basic) sibling balanced abstractions
> >> > of function-for-value procedure-for-effect were more in the collective
> >> > consciousness rather than C's travesty of function, things might not have
> >> > been so messy.
> >>
> >> I'm not really sure that having distinct procedures, as opposed to 
> >> functions
> >> that you just ignore their return result, makes *such* a big difference. 
> >> Can
> >> you explain what is the difference between these?
> >>
> >> sort(stuff)  # A procedure.
> >> sort(stuff)  # ignore the function return result
> >>
> >> And why the first is so much better than the second?
> >
> > Here are 3 None-returning functions/methods in python.
> > ie semantically the returns are identical. Are they conceptually identical?
> >
> > >>> x=print(1)
> > 1
> > >>> x
> > >>> ["hello",None].__getitem__(1)
> > >>> {"a":1, "b":2}.get("c")
> 
> 
> Your second example is a poor one, as it involves calling a dunder
> method. But all you've proven with that is that the empty return value
> of Python is a real value, and can thus be stored and retrieved.
> 
> With print(), you have a conceptual procedure - it invariably returns
> None, so it'll normally be called in contexts that don't care about
> the return value. What about sys.stdout.write(), though? That's most
> often going to be called procedurally - you just write your piece and
> move on - but it has a meaningful return value. In normal usage, both
> are procedures. The ability to upgrade a function from "always returns
> None" to "returns some occasionally-useful information" is one of the
> strengths of a unified function/procedure system.
> 
> With .get(), it's actually nothing to do with procedures vs functions,

That's backwards.

get is very much a function and the None return is semantically significant.
print is just a round peg -- what you call a conceptual function -- stuffed into
a square hole -- function being the only available syntax-category.

> but more to do with Python's use of None where C would most likely use
> NULL. By default, .get() uses None as its default return value, so
> something that isn't there will be treated as None. In the same way as
> your second example, that's just a result of None being a real value
> that you can pass around - nothing more special than that.
> 
> The other thing you're seeing here is that the *interactive
> interpreter* treats None specially. 


Yeah I know
And if python did not try to be so clever, I'd save some time with
student-surprises

> In a program, an expression
> statement simply discards its result, whether it's None or 42 or
> [1,2,3] or anything else. You could write an interactive interpreter
> that has some magic that recognizes that certain functions always
> return None (maybe by checking their annotations), and omits printing
> their return values, while still printing the return values of other
> functions.


Hoo Boy!  You seem to be in the 'the-more-the-better' (of magic) camp


> I'm not sure what it'd gain you, but it wouldn't change the
> concepts or semantics surrounding None returns.

It would sure help teachers who get paid by the hour and would rather spend
time on technical irrelevantia than grapple with significant concepts
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Rustom Mody
On Friday, May 8, 2015 at 10:24:06 AM UTC+5:30, Rustom Mody wrote:

> get is very much a function and the None return is semantically significant.
> print is just round peg -- what you call conceptual function -- stuffed into
> square hole -- function the only available syntax-category

Sorry "Conceptual procedure" of course
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 2:53 PM, Rustom Mody  wrote:
> Yeah I know
> And if python did not try to be so clever, I'd save some time with
> student-surprises
>
>> In a program, an expression
>> statement simply discards its result, whether it's None or 42 or
>> [1,2,3] or anything else. You could write an interactive interpreter
>> that has some magic that recognizes that certain functions always
>> return None (maybe by checking their annotations), and omits printing
>> their return values, while still printing the return values of other
>> functions.
>
>
> Hoo Boy!  You seem to be in the 'the-more-the-better' (of magic) camp

No way! I wouldn't want the interactive interpreter to go too magical.
I'm just saying that it wouldn't break Python to have it do things
differently.

>> I'm not sure what it'd gain you, but it wouldn't change the
>> concepts or semantics surrounding None returns.
>
> It would sure help teachers who get paid by the hour and would rather spend
> time on technical irrelevantia than grapple with significant concepts

Why have the concept of a procedure? Python's rule is simple: Every
function either returns a value or raises an exception. (Even
generators. When you call a generator function, you get back a return
value which is the generator state object.) The procedure/function
distinction is irrelevant.

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: PEP idea: On Windows, subprocess should implicitly support .bat and .cmd scripts by using FindExecutable from win32 API

2015-05-07 Thread Marko Rauhamaa
Ben Finney :

> Chris Angelico  writes:
>
>> On Fri, May 8, 2015 at 1:24 AM, Marko Rauhamaa  wrote:
>> >> That's Python's job. Abstracting away all those differences so you
>> >> don't have to look at them.
>> >
>> > That's the difference between our opinions: you want Python to work
>> > the same on different OS's. I want Python's system programming
>> > facilities to closely mirror those of C.
>>
>> In that case, what you should do is devise an alternative language
>> that compiles as a thin layer over C. […] But you don't want a high
>> level language, if your greatest goal is "closely mirror C".
>
> +1.
>
> Marko, you have many times criticised Python on the basis,
> essentially, that it is not some other platform. It's quite
> unproductive, and leads only to discussions that are at best
> frustrating for all involved.
>
> If you want a platform that is fundamentally different from Python,
> there are plenty available for you. Arguing that Python should be
> fundamentally different will not avail us anything good.

I don't. Python *does* provide OS-dependent facilities. Somebody
complained about that. I said Python was the way it should be, even
though there are signs Python "wants" to become more Java-esque.

For example, Python provides the errno module. Its use and meanings can
be looked up with man pages.
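
For example, a small sketch (the path is just a placeholder):

import errno, os

try:
    os.open('/no/such/file', os.O_RDONLY)
except OSError as e:
    # The numeric code matches what open(2) and errno(3) document.
    assert e.errno == errno.ENOENT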

So Python has the great advantage that it *can* be used as a Linux
system programming language. And I am.


Marko
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Rustom Mody
On Friday, May 8, 2015 at 10:39:38 AM UTC+5:30, Chris Angelico wrote:
> Why have the concept of a procedure? 

On Friday, Chris Angelico ALSO wrote:
> With print(), you have a conceptual procedure...

So which do you want to stand by?


Just to be clear I am not saying python should be any different on this front.

Gödel's (2nd) theorem guarantees that no formalism (aka programming language in 
our case) can ever be complete, and so informal borrowing is inevitable.
It's just that Pascal, Fortran, Basic, by ingesting this informal requirement into 
the formal language, make THIS aspect easier to learn/teach...
... at the cost of eleventeen others

[Regular expressions in Fortran anyone?]
-- 
https://mail.python.org/mailman/listinfo/python-list


Re: asyncio: What is the difference between tasks, futures, and coroutines?

2015-05-07 Thread Chris Angelico
On Fri, May 8, 2015 at 4:36 PM, Rustom Mody  wrote:
> On Friday, May 8, 2015 at 10:39:38 AM UTC+5:30, Chris Angelico wrote:
>> Why have the concept of a procedure?
>
> On Friday, Chris Angelico ALSO wrote:
>> With print(), you have a conceptual procedure...
>
> So which do you want to stand by?

A procedure, in Python, is simply a function which returns None.
That's all. It's not any sort of special concept. It doesn't need to
be taught. If your students are getting confused by it, stop teaching
it!

ChrisA
-- 
https://mail.python.org/mailman/listinfo/python-list