Re: Tool for finding external dependencies

2007-07-13 Thread syt
On Jul 9, 3:39 am, Rob Cakebread <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I need to find external dependencies for modules (not Python standard
> library imports).
>
> Currently I use pylint and manually scan the output, which is very
> nice, or use pylint's --ext-import-graph option to create a .dot file
> and extract the info from it, but either way can take a very long
> time.
>
> I'm aware of Python's modulefinder.py, but it doesn't find external
> dependencies (or at least I don't know how to make it do so).

note that you can launch pylint in the following way to disable
everything but the dependencies analysis::

  pylint --enable-checker=imports yourproject

this will disable all other checkers and you may gain a significant
speedup.
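If pylint is still too slow, a rough alternative is to drive the stdlib's
modulefinder directly and filter out standard-library modules. This is only a
sketch: modulefinder works from a script entry point rather than a package,
and the "site-packages" path test below is a heuristic, not an exact
standard-library check.

```python
# Sketch: list non-stdlib dependencies of a script with modulefinder.
# Assumption: third-party modules live under a "site-packages" directory;
# that is a heuristic, not an exact standard-library test.
from modulefinder import ModuleFinder

def external_deps(script_path):
    finder = ModuleFinder()
    finder.run_script(script_path)
    deps = set()
    for name, mod in finder.modules.items():
        path = getattr(mod, "__file__", None)
        if path and "site-packages" in path:
            deps.add(name.split(".")[0])  # keep the top-level package name
    return sorted(deps)
```

Running this over a project's entry scripts gives a quick external-imports
list without the full pylint pass, at the cost of missing dynamic imports.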

-- Sylvain

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: patching pylint.el

2007-07-13 Thread syt
On Jul 9, 4:13 pm, lgfang <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I think this is a bug of pylint.el, but I couldn't find a way to
> submit it, either on its official site or via Google.  So I'm posting
> it here hoping it may be useful to someone.
>
> The bug is that it uses "compile-internal" from "compile" without
> requiring compile.  So "M-x pylint" will fail if compile hasn't been
> loaded in advance by some other means.
>
> My fix is rather straightforward: add "(require 'compile)" in the code.
>
> -- begin diff output --
> 2a3
> >   (require 'compile)
> -- end diff output --

fyi, pylint-related bugs should be reported on the python-
[EMAIL PROTECTED] mailing list.
I've opened a ticket for your bug/patch: http://www.logilab.org/bug/eid/4026

cheers,
Sylvain

-- 
http://mail.python.org/mailman/listinfo/python-list


Re: PEP 382: Namespace Packages

2009-04-16 Thread syt
On Apr 14, 6:27 pm, "P.J. Eby"  wrote:
> I think you've misunderstood something about the use case.  System
> packaging tools don't like separate packages to contain the *same
> file*.  That means that they *can't* split a larger package up with
> your proposal, because every one of those packages would have to
> contain a __pkg__.py -- and thus be in conflict with each
> other.  Either that, or they would have to make a separate system
> package containing *only* the __pkg__.py, and then make all packages
> using the namespace depend on it -- which is more work and requires
> greater co-ordination among packagers.

I may have missed some point, but doesn't the PEP require coordination
anyway, so that *.pkg files have different names in each portion, and
likewise if one wants to provide a non-empty __init__.py?

Also, providing a main package with a non-empty __init__.py and others
with no __init__.py turns all this into the "base package" scenario
described later in this discussion, right?

BTW, it's unclear to me what difference you see between this usage and
zope's or peak's.

> Allowing each system package to contain its own .pkg or .nsp or
> whatever files, on the other hand, allows each system package to be
> built independently, without conflict between contents (i.e., having
> the same file), and without requiring a special pseudo-package to
> contain the additional file.

As said above, provided some conventions are respected...

What worries me is that, as time goes by, the import mechanism becomes
more and more complicated, with more and more tricks involved. Of
course I agree we should unify the way namespace packages are handled,
and that this should live in the Python stdlib. What I like in MAL's
proposal is that it makes things simpler... Another point: I don't
like .pth and .pkg files. Isn't this PEP an opportunity to at least
unify them?
--
Sylvain Thénault
--
http://mail.python.org/mailman/listinfo/python-list