Re: [Python-Dev] Cross compiling Python (for Android)

2014-10-26 Thread Stefan Krah
Frank, Matthew I  intel.com> writes:
  
> 4. Module _decimal is failing to compile.  The problem is that it has
>    a header called memory.h.  Android's libc has the problem that
>    /usr/include/stdlib.h includes <memory.h>.  But the build system
>    puts -I. on the include path before the system dirs (as it should)
>    so when compiling _decimal, Modules/_decimal/libmpdec/memory.h gets
>    found instead of /usr/include/memory.h.  Shiz has a patch here:
>   
> https://github.com/rave-engine/python3-android/blob/master/mk/python/3.3.5/python-3.3.5-android-libmpdec.patch
>    (which renames memory.h -> mpmemory.h) but I don't know
>  
>    a.  Is there a tracker for this yet?  and
>    b.  Is Shiz's fix the desired one or should I be looking for
>    another approach?  (Maybe modifying the -I flags for just the build
>    of _decimal or something?)

I think using "memory.h" in an application is standard-conforming.
Since _decimal compiles on all other Linux platforms, it may be worth
reporting this to the Android developers and seeing if they can fix it
(possibly by not including memory.h in stdlib.h).

FWIW, OCaml also has a "memory.h" header.


Stefan Krah



Re: [Python-Dev] XP buildbot problem cloning from hg.python.org

2014-10-26 Thread M.-A. Lemburg
On 26.10.2014 00:14, Ned Deily wrote:
> In article ,
>  David Bolen  wrote:
> 
>> David Bolen  writes:
>>
>>> which appears to die mid-stream while receiving the manifests.
>>>
>>> So I'm sort of hoping there might be some record server-side as to why
>>> things are falling apart mid-way.
>>
>> Just to follow up to myself, I get the same error trying to do a
>> clone from my own personal XP machine rather than the buildbot (which
>> is a VM).  I've had the issue with hg 1.6.2, 2.5.2 and 3.1.2.
>>
>> However, the same clones complete successfully under OSX and Linux.
>>
>> So that's sort of strange.
> 
> Very interesting!  I had been doing some housekeeping on some of my 
> older OS X build systems over the past few days and I've run into the 
> same problem.  In particular, I am seeing this failure on an OS X 10.5.8 
> system (running in a Fusion VM) which I've used for years and from which 
> I have regularly cloned repos from hg.python.org.  I spent some time 
> yesterday trying to isolate it.  I came to the conclusion that it was 
> independent of the version of OpenSSL (identical failures occurred with 
> the system's ancient Apple 0.9.7 as well as a newly-built 1.0.1j) and 
> independent of the version of hg (at least with two data points, current 
> and a year-old version) and seemingly independent of the network 
> connection.  I was not able to reproduce the failure on the host OS X 
> system (10.10) and I didn't have problems a few days earlier with 
> various other OS X releases (10.6.x through 10.9.x) also running in VMs 
> on the same host.  I stumbled across a workaround for the problem as I 
> was experiencing it:  adding --uncompressed to hg clone eliminated 
> failures.  You can get more info on the hg failures by adding 
> --traceback and --debugger to the clone command.  After spending way too 
> much time on the issue, I was not in the mood to spend more time 
> isolating the problem after finding a workaround but if others are also 
> seeing it, it might be worth doing.  Sigh.
> 
>   $ hg --version
>   Mercurial Distributed SCM (version 3.1.2)
>   $ hg clone -U http://hg.python.org/cpython cpython
>   real URL is https://hg.python.org/cpython
>   requesting all changes
>   adding changesets
>   adding manifests
>   transaction abort!
>   rollback completed
>   abort: connection ended unexpectedly
>   $ hg clone --uncompressed -U https://hg.python.org/cpython cpython
>   streaming all changes
>   10404 files to transfer, 248 MB of data
>   transferred 248 MB in 44.4 seconds (5.58 MB/sec)

If compression is causing the problem, perhaps there's an incompatibility
between the zlib version used on the host and the one on your client system.

hg.python.org was recently updated to a new Ubuntu version.

-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Oct 26 2014)
>>> Python Projects, Consulting and Support ...   http://www.egenix.com/
>>> mxODBC.Zope/Plone.Database.Adapter ...   http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...http://python.egenix.com/

2014-10-24: Released eGenix pyOpenSSL 0.13.5 ...  http://egenix.com/go63

: Try our mxODBC.Connect Python Database Interface for free ! ::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
   Registered at Amtsgericht Duesseldorf: HRB 46611
   http://www.egenix.com/company/contact/


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Tony Kelman
Thanks all for the responses. Clearly this is a subject about which
people feel strongly, so that's good at least. David Murray's guidance
in particular points to the most likely path to get improvements to
really happen.

Steve Dower:
> Building CPython for Windows is not something that needs solving.

Not in your opinion, but numerous packagers of MinGW-based native or
cross-compiled package sets would love to include Python. The fact
that they currently can't, without many patches, is a problem.

> The culture on Windows is to redistribute binaries, not source,

There are many cultures using Windows. Including open-source ones.

> and both the core team and a number of redistributors have this figured
> out (and it will only become easier with VC14 and Python 3.5).

With MSVC. It doesn't work with MinGW, it likely doesn't work with Clang.
MSVC is not the only compiler on Windows. There are many use cases for
preferring other compilers. Have you read this wiki page for example?
https://github.com/numpy/numpy/wiki/Numerical-software-on-Windows

In my personal experience, having recently gotten Julia to compile using
MSVC for the first time, MSVC as a compiler is highly deficient for many
needs especially in the scientific software community:
- C99 (getting better recently, but still not done)
- AT&T syntax assembly
- C++11 features (also mostly okay now, but not if you're using an older
  MSVC version with Python 2.7, which many people still have to do)
- 128-bit integer intrinsics
- cannot cross-compile from anything that isn't Windows
- build systems foreign relative to shell/makefile systems used by most
  open-source projects, few projects have time to maintain 2 separate build
  systems (cmake helps but takes a lot of initial effort to convert to)
- no free-as-in-beer Fortran compiler available

I have none of these problems when I use MinGW-w64. Hence the desire to
be able to curate an all-MinGW software stack. It's not a matter of open-
source ideology for me, it's brass tacks "can I do the work I need to do."
With MSVC I can't, with MinGW-w64 I can. Not being able to include CPython
in an all-MinGW stack hurts, a lot.

Only cross-compilation and the build system in the above list are relevant
to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
real reasons for some groups of users and developers to prefer MinGW-w64
over MSVC.

> I'd rather see this effort thrown behind compiling extensions,
> including cross compilation.

There are patches awaiting review that improve this as well. Efforts to
improve CPython's build system and the handling of extensions are not
completely independent, in many cases the patches are written by the same
set of MinGW users. One of these sets of patches is not inherently evil,
you understandably have less interest in them but it's still disappointing
to see so little movement on either.

> Having different builds of CPython out there will only fragment the
> community and hurt extension authors far more than it may seem to help.

The community of people developing and using open-source projects, either
CPython or otherwise, is already highly fragmented. Ignoring it makes it
worse. python.org does not have to distribute or endorse MinGW-compiled
builds of CPython. If the build option never gets incorporated, then it
will continue to be reverse-engineered.

Guido van Rossum:
> Here's the crux of the matter. We want compiled extension modules
> distributed via PyPI to work with the binaries distributed from python.org.

Absolutely. I don't think additional options in the build system would
change this.

R. David Murray:
> And, at this point, we would NEED A BUILDBOT.  That is, a machine that
> has whatever tools are required installed such that tests added to the
> test suite to test MinGW support in distutils would run, so we can be
> sure we don't break anything when making other changes.

That's not too hard. I've done this for other projects. AppVeyor works if
your build is short enough, and I've done cross-compilation from Travis
CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's
infrastructure, but I can offer guidance if it would help.

Steve Dower:
> I'm afraid of users having numpy crash because they're using an MSVC
> CPython instead of a mingw CPython. I'm afraid of users not being able
> to use library A and library B at the same time because A requires MSVC
> CPython and B requires mingw CPython. (I can produce more examples if you
> like, but the general concern is having a fragmented community, as I said
> in my previous post.)

A valid fear. Mixing C runtimes can cause problems, I've seen this myself.
Correct me if I'm wrong, but this is nearly as much of an issue if someone
wants to use a different version of MSVC to compile CPython than the version
used to build the official binaries. It requires care, but you can't deny
that there are use cases where people will want and need to do such things.
Is possible fr

Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-26 Thread Stefan Richthofer

>You shouldn't have to emulate that. The exact behavior of GC is allowed to vary between systems.

Yes, of course. I am looking into this for JyNI, which in contrast should emulate CPython behavior as closely as possible.

And for such details, one by one, I am currently weighing up whether it's easier to support it already in Jython or in the JyNI layer.

And for aspects where it is feasible, I see nothing wrong with getting as close as possible to the reference implementation (and the persistence of weakrefs on resurrection indeed seems not to be one of these).
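
For reference, a minimal sketch of the CPython 2.7 behavior in question (the
ordering is an implementation detail, not a language guarantee): on a plain
refcount drop the weakref is cleared, and its callback invoked, before __del__
runs, even when __del__ resurrects the object:

    import weakref

    class X(object):
        def __del__(self):
            print(ref())          # None: the weakref was cleared before __del__
            global survivor
            survivor = self       # resurrect the object

    def callback(r):
        print("weakref callback fired")   # fires before __del__ runs

    x = X()
    ref = weakref.ref(x, callback)
    del x                         # plain refcount drop, no cyclic gc involved
    print(survivor)               # the resurrected object is still alive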

 

Sent: Sunday, 26 October 2014 at 06:53
From: "Guido van Rossum" 
To: "Stefan Richthofer" 
Cc: "[email protected]" 
Subject: Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection



On Saturday, October 25, 2014, Stefan Richthofer  wrote:

Okay, sorry, I was thinking too Jython-like. I fixed runGC() just to
see now that it does not even trigger resurrection, since under
CPython there are no finalizers executed in ref cycles (i.e. I find my
objects in gc.garbage).
So I realize, my xy_cyclic tests are pointless anyway since in cyclic
gc no resurrection can happen.

> The second problem (with weakref) is different: weakrefs are cleared
> before __del__ is called, so resurrection doesn't affect the whole
> process.
It appears weakrefs are only cleared if this is done by gc (where no
resurrection can happen anyway). If a resurrection-performing-__del__ is
just called by ref-count-drop-to-0, weakrefs persist - a behavior that is
very difficult and inefficient to emulate in Jython, but I'll give it
some more thoughts...
 

You shouldn't have to emulate that. The exact behavior of GC is allowed to vary between systems.

 

However thanks for the help!

-Stefan


> Sent: Sunday, 26 October 2014 at 01:22
> From: "Antoine Pitrou" 
> To: [email protected]
> Subject: Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection
>
>
> Hello Stefan,
>
> On Sun, 26 Oct 2014 00:20:47 +0200
> "Stefan Richthofer"  wrote:
> > Hello developers,
> >
> > I observed strange behaviour in CPython (tested in 2.7.5 and 3.3.3)
> > regarding object resurrection.
>
> Your runGC() function is buggy, it does not run the GC under CPython.
> Fix it and the first problem (with id()) disappears.
>
> The second problem (with weakref) is different: weakrefs are cleared
> before __del__ is called, so resurrection doesn't affect the whole
> process. Add a callback to the weakref and you'll see it is getting
> called.
>
> In other words, CPython behaves as expected. Your concern is
> appreciated, though.
>
> Regards
>
> Antoine.
>
>


--
--Guido van Rossum (on iPad)





Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Ray Donnelly
On Sun, Oct 26, 2014 at 1:12 PM, Tony Kelman  wrote:
> Thanks all for the responses. Clearly this is a subject about which
> people feel strongly, so that's good at least. David Murray's guidance
> in particular points to the most likely path to get improvements to
> really happen.
>
> Steve Dower:
>> Building CPython for Windows is not something that needs solving.
>
> Not in your opinion, but numerous packagers of MinGW-based native or
> cross-compiled package sets would love to include Python. The fact
> that they currently can't, without many patches, is a problem.
>
>> The culture on Windows is to redistribute binaries, not source,
>
> There are many cultures using Windows. Including open-source ones.
>
>> and both the core team and a number of redistributors have this figured
>> out (and it will only become easier with VC14 and Python 3.5).
>
> With MSVC. It doesn't work with MinGW, it likely doesn't work with Clang.
> MSVC is not the only compiler on Windows. There are many use cases for
> preferring other compilers. Have you read this wiki page for example?
> https://github.com/numpy/numpy/wiki/Numerical-software-on-Windows
>
> In my personal experience, having recently gotten Julia to compile using
> MSVC for the first time, MSVC as a compiler is highly deficient for many
> needs especially in the scientific software community:
> - C99 (getting better recently, but still not done)
> - AT&T syntax assembly
> - C++11 features (also mostly okay now, but not if you're using an older
>   MSVC version with Python 2.7, which many people still have to do)
> - 128-bit integer intrinsics
> - cannot cross-compile from anything that isn't Windows
> - build systems foreign relative to shell/makefile systems used by most
>   open-source projects, few projects have time to maintain 2 separate build
>   systems (cmake helps but takes a lot of initial effort to convert to)
> - no free-as-in-beer Fortran compiler available
>
> I have none of these problems when I use MinGW-w64. Hence the desire to
> be able to curate an all-MinGW software stack. It's not a matter of open-
> source ideology for me, it's brass tacks "can I do the work I need to do."
> With MSVC I can't, with MinGW-w64 I can. Not being able to include CPython
> in an all-MinGW stack hurts, a lot.
>
> Only cross-compilation and the build system in the above list are relevant
> to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
> real reasons for some groups of users and developers to prefer MinGW-w64
> over MSVC.
>
>> I'd rather see this effort thrown behind compiling extensions,
>> including cross compilation.
>
> There are patches awaiting review that improve this as well. Efforts to
> improve CPython's build system and the handling of extensions are not
> completely independent, in many cases the patches are written by the same
> set of MinGW users. One of these sets of patches is not inherently evil,
> you understandably have less interest in them but it's still disappointing
> to see so little movement on either.
>
>> Having different builds of CPython out there will only fragment the
>> community and hurt extension authors far more than it may seem to help.
>
> The community of people developing and using open-source projects, either
> CPython or otherwise, is already highly fragmented. Ignoring it makes it
> worse. python.org does not have to distribute or endorse MinGW-compiled
> builds of CPython. If the build option never gets incorporated, then it
> will continue to be reverse-engineered.
>
> Guido van Rossum:
>> Here's the crux of the matter. We want compiled extension modules
>> distributed via PyPI to work with the binaries distributed from
>> python.org.
>
> Absolutely. I don't think additional options in the build system would
> change this.
>
> R. David Murray:
>> And, at this point, we would NEED A BUILDBOT.  That is, a machine that
>> has whatever tools are required installed such that tests added to the
>> test suite to test MinGW support in distutils would run, so we can be
>> sure we don't break anything when making other changes.
>
> That's not too hard. I've done this for other projects. AppVeyor works if
> your build is short enough, and I've done cross-compilation from Travis
> CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's
> infrastructure, but I can offer guidance if it would help.
>
> Steve Dower:
>> I'm afraid of users having numpy crash because they're using an MSVC
>> CPython instead of a mingw CPython. I'm afraid of users not being able
>> to use library A and library B at the same time because A requires MSVC
>> CPython and B requires mingw CPython. (I can produce more examples if you
>> like, but the general concern is having a fragmented community, as I said
>> in my previous post.)
>
> A valid fear. Mixing C runtimes can cause problems, I've seen this myself.
> Correct me if I'm wrong, but this is nearly as much of an issue if someone
> wants to use a different versi

Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread R. David Murray
On Sun, 26 Oct 2014 06:12:45 -0700, "Tony Kelman"  wrote:
> Steve Dower:
> > Building CPython for Windows is not something that needs solving.
> 
> Not in your opinion, but numerous packagers of MinGW-based native or
> cross-compiled package sets would love to include Python. The fact
> that they currently can't, without many patches, is a problem.

If this includes (or would likely include) a significant portion of the
Scientific Computing community, I would think that would be a compelling
use case.  We'd need to be educated more about the reasons why this
approach works better than remaining compatible with MSVC CPython so we
could evaluate the risks and reward intelligently.  (I wonder..."things
are going to fragment anyway if you (cpython) don't do anything" might
be an argument, if true...but wouldn't make the consequences any
easier to deal with :(

But as has been discussed, it seems better to focus first on fixing the
issues on which we are all in agreement (building extensions with MinGW).

> R. David Murray:
> > And, at this point, we would NEED A BUILDBOT.  That is, a machine that
> > has whatever tools are required installed such that tests added to the
> > test suite to test MinGW support in distutils would run, so we can be
> > sure we don't break anything when making other changes.
> 
> That's not too hard. I've done this for other projects. AppVeyor works if
> your build is short enough, and I've done cross-compilation from Travis
> CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's
> infrastructure, but I can offer guidance if it would help.

When I say "we need a buildbot", what I mean is that we need someone
willing to donate the resources and the *time and expertise* to setting
up and maintaining something that integrates with our existing buildbot
setup.  You set up a buildbot slave, request an id and password from
Antoine, keep the thing running, and respond in a timely fashion to
requests for help resolving issues that arise on the buildbot (both
buildbot issues and help-me-diagnose-this-failure issues).  After the
initial setup the load isn't generally heavy (I haven't had to do
anything with the OSX buildbot running on the machine in my dining room
for months and months now, for example).

So your guidance would have to go to someone who was volunteering to
take on this task...there isn't anyone on the existing core team who
would have time to do it (if I'm wrong, I'm sure someone will speak up).
On the other hand, you don't have to be a committer to run a buildbot,
and there *are* people on the core-mentorship list who have expressed
interest in helping out with our automated testing infrastructure,
including (if I understand correctly) adding some level of integration
to other CI systems (which might just be messages to the IRC
channel)[*].  So that could be a fruitful avenue to explore.

--David

[*] This is an area in which I have an interest, but it hasn't gotten
high enough on my todo list yet for me to figure out exactly what the
current state of things is so I can help it along.


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Ray Donnelly
On Sun, Oct 26, 2014 at 2:28 PM, Ray Donnelly  wrote:
> On Sun, Oct 26, 2014 at 1:12 PM, Tony Kelman  wrote:
>> Thanks all for the responses. Clearly this is a subject about which
>> people feel strongly, so that's good at least. David Murray's guidance
>> in particular points to the most likely path to get improvements to
>> really happen.
>>
>> Steve Dower:
>>> Building CPython for Windows is not something that needs solving.
>>
>> Not in your opinion, but numerous packagers of MinGW-based native or
>> cross-compiled package sets would love to include Python. The fact
>> that they currently can't, without many patches, is a problem.
>>
>>> The culture on Windows is to redistribute binaries, not source,
>>
>> There are many cultures using Windows. Including open-source ones.
>>
>>> and both the core team and a number of redistributors have this figured
>>> out (and it will only become easier with VC14 and Python 3.5).
>>
>> With MSVC. It doesn't work with MinGW, it likely doesn't work with Clang.
>> MSVC is not the only compiler on Windows. There are many use cases for
>> preferring other compilers. Have you read this wiki page for example?
>> https://github.com/numpy/numpy/wiki/Numerical-software-on-Windows
>>
>> In my personal experience, having recently gotten Julia to compile using
>> MSVC for the first time, MSVC as a compiler is highly deficient for many
>> needs especially in the scientific software community:
>> - C99 (getting better recently, but still not done)
>> - AT&T syntax assembly
>> - C++11 features (also mostly okay now, but not if you're using an older
>>   MSVC version with Python 2.7, which many people still have to do)
>> - 128-bit integer intrinsics
>> - cannot cross-compile from anything that isn't Windows
>> - build systems foreign relative to shell/makefile systems used by most
>>   open-source projects, few projects have time to maintain 2 separate build
>>   systems (cmake helps but takes a lot of initial effort to convert to)
>> - no free-as-in-beer Fortran compiler available
>>
>> I have none of these problems when I use MinGW-w64. Hence the desire to
>> be able to curate an all-MinGW software stack. It's not a matter of open-
>> source ideology for me, it's brass tacks "can I do the work I need to do."
>> With MSVC I can't, with MinGW-w64 I can. Not being able to include CPython
>> in an all-MinGW stack hurts, a lot.
>>
>> Only cross-compilation and the build system in the above list are relevant
>> to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
>> real reasons for some groups of users and developers to prefer MinGW-w64
>> over MSVC.
>>
>>> I'd rather see this effort thrown behind compiling extensions,
>>> including cross compilation.
>>
>> There are patches awaiting review that improve this as well. Efforts to
>> improve CPython's build system and the handling of extensions are not
>> completely independent, in many cases the patches are written by the same
>> set of MinGW users. One of these sets of patches is not inherently evil,
>> you understandably have less interest in them but it's still disappointing
>> to see so little movement on either.
>>
>>> Having different builds of CPython out there will only fragment the
>>> community and hurt extension authors far more than it may seem to help.
>>
>> The community of people developing and using open-source projects, either
>> CPython or otherwise, is already highly fragmented. Ignoring it makes it
>> worse. python.org does not have to distribute or endorse MinGW-compiled
>> builds of CPython. If the build option never gets incorporated, then it
>> will continue to be reverse-engineered.
>>
>> Guido van Rossum:
>>> Here's the crux of the matter. We want compiled extension modules
>>> distributed via PyPI to work with the binaries distributed from
>>> python.org.
>>
>> Absolutely. I don't think additional options in the build system would
>> change this.
>>
>> R. David Murray:
>>> And, at this point, we would NEED A BUILDBOT.  That is, a machine that
>>> has whatever tools are required installed such that tests added to the
>>> test suite to test MinGW support in distutils would run, so we can be
>>> sure we don't break anything when making other changes.
>>
>> That's not too hard. I've done this for other projects. AppVeyor works if
>> your build is short enough, and I've done cross-compilation from Travis
>> CI for other projects. Or Jenkins, or a Vagrant VM. I don't know PSF's
>> infrastructure, but I can offer guidance if it would help.
>>
>> Steve Dower:
>>> I'm afraid of users having numpy crash because they're using an MSVC
>>> CPython instead of a mingw CPython. I'm afraid of users not being able
>>> to use library A and library B at the same time because A requires MSVC
>>> CPython and B requires mingw CPython. (I can produce more examples if you
>>> like, but the general concern is having a fragmented community, as I said
>>> in my previous post.)
>>
>> A valid fear. Mixing C runtimes 

Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-26 Thread Armin Rigo
Hi Stefan,

On 26 October 2014 02:50, Stefan Richthofer  wrote:
> It appears weakrefs are only cleared if this is done by gc (where no
> resurrection can happen anyway). If a resurrection-performing-__del__ is
> just called by ref-count-drop-to-0, weakrefs persist -

How do you reach this conclusion?  The following test program seems to
show the opposite, by printing None on Python 2.7.6:

import weakref

class X(object):
    def __del__(self):
        print ref()

x = X()
ref = weakref.ref(x)
del x


A bientôt,

Armin.


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Tony Kelman

If this includes (or would likely include) a significant portion of the
Scientific Computing community, I would think that would be a compelling
use case.


I can't speak for any of the scientific computing community besides myself,
but my thoughts: much of the development, as you know, happens on Linux
with GCC (or OSX with clang). But it's important for users across all
platforms to be able to install binaries with a minimum of fuss.
Limitations of MSVC have already led the numpy/scipy community to
investigate building with MinGW-w64. (See several related threads from
April on the numpy-discussion list.) Ensuring compatibility with CPython's
chosen msvcrt has made that work even more difficult for them.

And Julia is not yet a significant portion of anything, but our community
is growing rapidly. See https://github.com/JuliaLang/IJulia.jl/pull/211 -
with respect to IJulia, "Python is just an implementation detail." Even
such a silly thing as automating the execution of the Python installer, to
set up a private only-used-by-IJulia copy, is needlessly difficult to do.
The work on Jupyter will hopefully help this specific situation sooner or
later, but there are other cases where CPython needs to serve as part of
the infrastructure, and the status quo makes that harder to automate.


We'd need to be educated more about the reasons why this approach works
better than remaining compatible with MSVC CPython so we could evaluate
the risks and reward intelligently.


Ideally, we can pursue an approach that would be able to remain compatible
with MSVC CPython. Even if this needs involvement from MinGW-w64 to make
happen, I don't think it's intractable. But I know less about the inner
details of CPython than you do so I could be wrong.


But as has been discussed, it seems better to focus first on fixing the
issues on which we are all in agreement (building extensions with MinGW).


Yes. We'll look into how much of the work has already been done on this.


there *are* people on the core-mentorship list who have expressed
interest in helping out with our automated testing infrastructure,
including (if I understand correctly) adding some level of integration
to other CI systems (which might just be messages to the IRC
channel)[*].  So that could be a fruitful avenue to explore.


If we pursue a fork (which not everyone will like but might happen anyway)
then we likely would do this type of CI integration along the way as Ray
suggested. So even if it turns out to fail as an endeavor, some good may
come of it.

Sincerely,
Tony



Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Paul Moore
On 26 October 2014 13:12, Tony Kelman  wrote:
> Only cross-compilation and the build system in the above list are relevant
> to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
> real reasons for some groups of users and developers to prefer MinGW-w64
> over MSVC.

Not really, to be honest. I still don't understand why anyone not
directly involved in CPython development would need to build their own
Python executable on Windows. Can you explain a single specific
situation where installing and using the python.org executable is not
possible (on the assumption that the mingw build is functionally
identical and ABI compatible with the CPython build, the claim being
made here)? Note that "not possible" is different from "I don't want
to" or "it doesn't fit my views about free software" or similar. Also
note that building extensions is different - you have to assume that
building extensions using mingw with a mingw-built CPython is just as
hard as building them with a MSVC-built CPython (otherwise you've made
changes to extension building and you should contribute them
independently so that everyone can benefit, not just those who build
their own Python with mingw!)

> Paul Moore:
>> If it were possible to cross-compile compatible extensions on Linux,
>> projects developed on Linux could supply Windows binaries much more
>> easily, which would be a huge benefit to the whole Windows Python
>> community.
>
> I want to do exactly this in an automated repeatable way, preferably on
> a build service. This seems harder to do when CPython cannot itself be
> built and handled as a dependency by that same automated, repeatable
> build service.

I cannot see why you would need to build Python in order to build
extensions. After all, if your build service is on Linux, it couldn't
run a mingw-built Python anyway. If your build service is a Windows
machine, just install the python.org binaries (which is a simple
download and install, that can be fully automated, but which is a
one-off process anyway).

> Unless it becomes possible to cross-compile extensions
> using the build machine's own version of Python, which might be the right
> approach.

This may be where we are getting confused. I see this as the only
practical way of cross-compiling Windows extensions on Linux, by using
the Linux Python. So being able to cross-compile Python is not
relevant.

On a tangential note, any work on supporting mingw builds and
cross-compilation should probably be done using setuptools, so that it
is external to the CPython code. That way (a) it isn't constrained by
the CPython release schedules and backward compatibility constraints,
and (b) it can be used in older versions of Python (which is pretty
much essential if it's to be useful, TBH).
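
For context, the hook such work would build on already exists: distutils'
build_ext command accepts a compiler selection, so the MinGW toolchain can be
requested per build. A minimal sketch (module and source names are
hypothetical; whether the build then succeeds against a python.org interpreter
is exactly what the patches under discussion are about):

    # setup.py -- run with:  python setup.py build_ext --compiler=mingw32
    from distutils.core import setup, Extension

    setup(name="demo",
          version="0.1",
          ext_modules=[Extension("demo", sources=["demo.c"])])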

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Paul Moore
On 26 October 2014 17:59, Tony Kelman  wrote:
> Ensuring compatibility with CPython's
> chosen msvcrt has made that work even more difficult for them.

Ensuring compatibility with CPython's msvcrt is mandatory unless you
want to create a split in the community over which extensions work
with which builds. That's precisely the scenario Steve Dower and
myself (among others) fear, and want to avoid at all cost.

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Paul Moore
On 26 October 2014 14:28, Ray Donnelly  wrote:
> I like this idea. To reduce the workload, we should probably pick
> Python3 (at least initially)?

Aren't the existing patches on the tracker already for Python 3.5+?
They should be, as that's the only version that's likely to be a
possible target (unless you get someone to agree to allow a change
like this as in scope for Python 2.7, which I've seen no indication
of). Maybe I'm confused here.

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Ray Donnelly
On Sun, Oct 26, 2014 at 10:41 PM, Paul Moore  wrote:
> On 26 October 2014 13:12, Tony Kelman  wrote:
>> Only cross-compilation and the build system in the above list are relevant
>> to CPython, but I hope I have convinced you, Paul Moore, etc. that there are
>> real reasons for some groups of users and developers to prefer MinGW-w64
>> over MSVC.
>
> Not really, to be honest. I still don't understand why anyone not
> directly involved in CPython development would need to build their own
> Python executable on Windows. Can you explain a single specific
> situation where installing and using the python.org executable is not
> possible (on the assumption that the mingw build is functionally
> identical and ABI compatible with the CPython build, the claim being
> made here)?

I don't know where this "ABI compatible" thing came into being; I
think Steve Dower alluded to it by stating that we should focus on
enabling MinGW-w64 as an extension-building compiler, using a core
interpreter built with MSVC, and that by limiting the interfaces to
the Windows C calling conventions everything would be OK.
Unfortunately this is not possible. MinGW-w64-built extensions need to
link to msvcrt.dll to do anything useful and you cannot mix two
different msvcr??.dlls in one application. Please see
http://msdn.microsoft.com/en-us/library/abx4dbyh%28v=VS.100%29.aspx
and http://msdn.microsoft.com/en-us/library/ms235460%28v=VS.100%29.aspx
for the details. MinGW-w64 assumes the very old msvcrt.dll files from
Windows XP SP3 and XP64 specifically to avoid this mess. The rest of
your reply assumes that this ABI compatibility is a given so I'll stop
at this point.
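
As a quick illustration of what has to line up, you can ask a running
interpreter which compiler (and therefore which CRT) it was built with; that
is what any binary extension has to match. A small sketch (the MSC v.1500 /
msvcr90.dll pairing in the comment is the usual Python 2.7 case):

    import platform
    import sys

    print(sys.version)                  # e.g. '2.7.8 ... [MSC v.1500 64 bit (AMD64)]'
    print(platform.python_compiler())   # MSC v.1500 is Visual Studio 2008, i.e. msvcr90.dll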

> Note that "not possible" is different from "I don't want
> to" or "it doesn't fit my views about free software" or similar. Also
> note that building extensions is different - you have to assume that
> building extensions using mingw with a mingw-built CPython is just as
> hard as building them with a MSVC-built CPython (otherwise you've made
> changes to extension building and you should contribute them
> independently so that everyone can benefit, not just those who build
> their own Python with mingw!)
>
>> Paul Moore:
>>> If it were possible to cross-compile compatible extensions on Linux,
>>> projects developed on Linux could supply Windows binaries much more
>>> easily, which would be a huge benefit to the whole Windows Python
>>> community.
>>
>> I want to do exactly this in an automated repeatable way, preferably on
>> a build service. This seems harder to do when CPython cannot itself be
>> built and handled as a dependency by that same automated, repeatable
>> build service.
>
> I cannot see why you would need to build Python in order to build
> extensions. After all, if your build service is on Linux, it couldn't
> run a mingw-built Python anyway. If your build service is a Windows
> machine, just install the python.org binaries (which is a simple
> download and install, that can be fully automated, but which is a
> one-off process anyway).
>
>> Unless it becomes possible to cross-compile extensions
>> using the build machine's own version of Python, which might be the right
>> approach.
>
> This may be where we are getting confused. I see this as the only
> practical way of cross-compiling Windows extensions on Linux, by using
> the Linux Python. So being able to cross-compile Python is not
> relevant.
>
> On a tangential note, any work on supporting mingw builds and
> cross-compilation should probably be done using setuptools, so that it
> is external to the CPython code. That way (a) it isn't constrained by
> the CPython release schedules and backward compatibility constraints,
> and (b) it can be used in older versions of Python (which is pretty
> much essential if it's to be useful, TBH).
>
> Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Tony Kelman

Not really, to be honest. I still don't understand why anyone not
directly involved in CPython development would need to build their own
Python executable on Windows. Can you explain a single specific
situation where installing and using the python.org executable is not
possible


I want, and in many places *need*, an all-MinGW stack. For a great deal
of software that is not Python, I can do this today. I can use build
services, package management, and dependency resolution tools that work
very well together with this all-MinGW software stack. These are problems
that Python has notoriously struggled with on Windows for a long time.
It's not "my views on free software," it's the reality of MSVC being a
near-useless compiler for scientific software. (And I don't see that
changing much.) Do my requirements conflict with many non-scientific
Python users on Windows? Probably. So you're welcome to ignore my
requirements and I'll do my own thing, but I don't think I'm alone.
There's likely no desire from the scientific Python community to branch
off and separate in quite the way I'm willing to do from non-scientific
Python, but it would solve some of their problems (while introducing many
others). I suspect a MinGW-w64-oriented equivalent to Conda would be
attractive to many. That's approximately what I'm aiming for.

There are some ways in which I can use the python.org MSVC executable and
installer. But it is nearly impossible for me to integrate it into the rest
of the tools and stack that I am using; it sticks out like a sore thumb.
Likewise MinGW-compiled CPython may stick out like a sore thumb relative
to the existing way things work with Python on Windows. I'm okay with that,
you probably aren't.


changes to extension building and you should contribute them
independently so that everyone can benefit


Noted.


I cannot see why you would need to build Python in order to build
extensions.


No, of course they are separate. CPython is one of my dependencies.
Compiled extensions are other dependencies. Software completely unrelated
to Python is yet another set of dependencies. It's not a very coherent
stack if I can't handle all of these dependencies in a uniform way.


On a tangential note, any work on supporting mingw builds and
cross-compilation should probably be done using setuptools, so that it
is external to the CPython code.


Noted.

Sincerely,
Tony



Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Paul Moore
On 26 October 2014 23:24, Tony Kelman  wrote:
> I want, and in many places *need*, an all-MinGW stack.

OK, I'm willing to accept that statement. But I don't understand it,
and I don't think you've explained why you *need* your CPython
interpreter to be compiled with mingw (as opposed to a number of other
things you might need around building extensions). You may well "need"
a mingw-compiled CPython because no-one has yet fixed the issues
around using mingw to build extensions for the python.org python
build. But that's my point - I'd rather "they" fixed that issue,
rather than perpetuating your need for a non-standard compiler that
uses extensions no-one else can use.

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread Paul Moore
On 26 October 2014 23:11, Ray Donnelly  wrote:
> I don't know where this "ABI compatible" thing came into being;

Simple. If a mingw-built CPython doesn't work with the same extensions
as a MSVC-built CPython, then the community gets fragmented (because
you can only use the extensions built for your stack). Assuming numpy
needs mingw and ultimately only gets built for a mingw-compiled Python
(because the issues building for MSVC-built Python are too hard) and
assuming that nobody wants to make the effort to build pywin32 under
mingw, then what does someone who needs both numpy and pywin32 do?

Avoiding that issue is what I mean by ABI-compatible. (And that's all
I mean by it, nothing more subtle or controversial).

I view it as critical (because availability of binaries is *already*
enough of a problem in the Windows world, without making it worse)
that we avoid this sort of fragmentation. I'm not seeing an
acknowledgement from the mingw side that they agree. That's my
concern. If we both agree, there's nothing to argue about.

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-26 Thread martin


Quoting Tony Kelman:


A maintainer has volunteered. Others will help. Can any core developers
please begin reviewing some of his patches?


Unfortunately, every attempt to review these patches has failed for me,
every time. In the last iteration of an attempt to add mingw64 support,
I had asked contributors to also provide instructions on how to use these
patches, and haven't received any instructions that actually worked.

I'm hesitant to add code that I cannot verify as actually working.

I guess anybody else reviewing these patches ran into similar problems
(I know some other core developers have tried reviewing them as well,
others have stated here that they are unable to review the patches).

Regards,
Martin

