yagyala <[EMAIL PROTECTED]> writes:

> I recently started working for a company that has just implemented its
> first set of software standards. So far, so good. Here's the problem:
> one of those standards is that the comments for each routine must
> indicate every other routine that it calls. As I try to keep my
> routines small and factor out methods a lot, this can lead to an
> enormous amount of extra typing. I really, really, really don't want
> to do this by hand. Does anyone know of a tool that could do this for
> me, or at least a tool that can tell what other routines a given
> routine calls that I could program against? (Preferably something that
> works under pydev, but I'm not going to be choosy.)
import tokenize

# generate_tokens() expects a readline callable, not the file object itself
for token in tokenize.generate_tokens(open("foo.py").readline):
    print token

Note module token, also. Module inspect may also be of use:

http://docs.python.org/lib/inspect-source.html

Also, Python 2.5 introduced a new AST, which I haven't had occasion to use
yet (I'm not even certain whether it ended up getting exposed to Python, or
just staying at the C level...). Possibly you could even persuade some
existing source-code analysis tool to do exactly the job you want for
you -- but it's probably much easier just to write what you need yourself;
a rough sketch along those lines follows below.

John
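For the original question (listing, for each routine, the other routines it
calls), here is a minimal sketch of that do-it-yourself approach. It assumes
Python 2.6 or later, where the ast helper module is exposed (in 2.5 you can
get the same tree from compile() with the PyCF_ONLY_AST flag); the
calls_per_function() helper below is my own name, not an existing tool.

import ast

def calls_per_function(path):
    """Map each function defined in the file to the names it appears to call.

    Hypothetical helper, not part of the stdlib.
    """
    tree = ast.parse(open(path).read(), filename=path)
    result = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            called = set()
            # Walk the body of this def and collect every call site in it.
            for sub in ast.walk(node):
                if isinstance(sub, ast.Call):
                    func = sub.func
                    if isinstance(func, ast.Name):        # plain call: foo(...)
                        called.add(func.id)
                    elif isinstance(func, ast.Attribute): # attribute call: obj.foo(...)
                        called.add(func.attr)
            result[node.name] = sorted(called)
    return result

for name, callees in sorted(calls_per_function("foo.py").items()):
    print "%s calls: %s" % (name, ", ".join(callees) or "(nothing)")

Being a purely syntactic pass, it only reports the names that appear at call
sites; it cannot tell which object an attribute call actually resolves to,
and calls inside nested defs are attributed to the enclosing function as
well. That is roughly the limitation any such tool will share, given how
dynamic Python is.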