output buffering
Hello, When reading a large datafile, I want to print a '.' to show the progress. This fails, I get the series of '.'s after the data has been read. Is there a trick to fix this? Thanks -- http://mail.python.org/mailman/listinfo/python-list
Re: output buffering
On 2005-11-11, Fredrik Lundh <[EMAIL PROTECTED]> wrote: > "JD" <[EMAIL PROTECTED]> wrote: > >> When reading a large datafile, I want to print a '.' to show the >> progress. This fails, I get the series of '.'s after the data has been >> read. Is there a trick to fix this? > > assuming that you're printing to stdout, > > sys.stdout.flush() > > should do the trick. It does, Thanks! -- http://mail.python.org/mailman/listinfo/python-list
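For readers of the archive, a minimal sketch of that pattern (not from the original thread); the file name and chunk size are arbitrary:

import sys

def read_with_progress(path, chunk_size=1024 * 1024):
    f = open(path, 'rb')
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        sys.stdout.write('.')
        sys.stdout.flush()   # force the dot out now, not when the buffer fills
    f.close()
    sys.stdout.write('\n')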
Re: output buffering
On 2005-11-11, Larry Bates <[EMAIL PROTECTED]> wrote: > This is something I wrote that might help. > > http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/299207 > The solutions become better and better. Thanks. > -Larry Bates > > JD wrote: >> Hello, >> >> When reading a large datafile, I want to print a '.' to show the >> progress. This fails, I get the series of '.'s after the data has been >> read. Is there a trick to fix this? >> >> Thanks -- http://mail.python.org/mailman/listinfo/python-list
removing dictionary key-pair
Hello, I try to remove a dictionary key-pair (remove an entry), but I'm unsuccessful. Does anyone know how to achieve this? Thanks -- http://mail.python.org/mailman/listinfo/python-list
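No answer to this one appears in the archive; for reference, the usual ways to remove an entry from a dict (the dict below is just an example):

d = {'a': 1, 'b': 2, 'c': 3}
del d['b']                 # raises KeyError if 'b' is missing
value = d.pop('c')         # removes 'c' and returns its value
value = d.pop('x', None)   # returns None instead of raising if 'x' is missing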
changing python script on-the-fly for ActiveX/COM object (win32com)
I have implemented a COM object in Python and I would like to be able to change the script without stopping and restarting the application that's using the COM object. Is there a way to do this? (I can change the program that calls the COM object if needed.) Thanks... -- jeff -- http://mail.python.org/mailman/listinfo/python-list
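No answer appears in the archive. One generic approach (a sketch only, not specific to win32com's registration details; the module name worker_impl and its do_work function are hypothetical) is to keep the COM-facing class thin and delegate to a module that can be reloaded on demand:

import worker_impl   # hypothetical module holding the real logic

class Dispatcher:
    def Reload(self):
        # Re-import the implementation so script edits take effect
        # without restarting the process hosting the COM object.
        reload(worker_impl)          # importlib.reload() on Python 3

    def DoWork(self, arg):
        return worker_impl.do_work(arg)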
File access
Hi, What I am trying to do is to run a subprocess on another machine using subprocess.Popen; this subprocess continues writing something into a file while it is running. After submitting this subprocess, I tried to open the file and readlines() in a loop (with a delay) while the subprocess was running. The problem is I could not get anything until the subprocess finished. I also tried to run another python program while the subprocess was running, and I could get what I want. Does anyone know why? Thanks! JD -- http://mail.python.org/mailman/listinfo/python-list
Re: File access
Thanks for answering. No, the data was being written into the file while the subprocess was running. For example, every second it will write something into the file. I tried to run another python program alongside and it successfully read the file while the subprocess was running. JD On Aug 2, 11:00 am, Adrian Petrescu <[EMAIL PROTECTED]> wrote: > On Aug 2, 12:41 pm, JD <[EMAIL PROTECTED]> wrote: > > > > > Hi, > > > What I am trying to do is to run a subprocess on another machine using > > subprocess.Popen; this subprocess continues writing something into a > > file while it is running. > > > After submitting this subprocess, I tried to open the file and readlines() > > in a loop (with a delay) while the subprocess was > > running. > > > The problem is I could not get anything until the subprocess > > finished. > > > I also tried to run another python program while the subprocess was > > running, and I could get what I want. > > > Does anyone know why? Thanks! > > > JD > > Could the problem be that the subprocess only flushes the output > buffer when it terminates, and so until the subprocess "finishes", as > you say, the file is empty because the data is still in the buffer? > Try throwing some flushes into the code and see if it helps. > > Or am I misunderstanding your question? -- http://mail.python.org/mailman/listinfo/python-list
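A minimal sketch of the flushing idea from the reply above (not from the original thread), assuming you control the script that writes the file: flush after each write so readers see the data promptly instead of only when the buffer fills or the process exits.

import os
import time

log = open('progress.log', 'a')      # 'progress.log' is a made-up name
for i in range(60):
    log.write('tick %d\n' % i)
    log.flush()                      # push Python's buffer to the OS
    os.fsync(log.fileno())           # optionally ask the OS to hit the disk too
    time.sleep(1)
log.close()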
Re: File access
Thanks for the suggestion; I am thinking of implementing a database system for that. JD On Aug 2, 12:11 pm, Larry Bates <[EMAIL PROTECTED]> wrote: > JD wrote: > > Hi, > > > What I am trying to do is to run a subprocess on another machine using > > subprocess.Popen; this subprocess continues writing something into a > > file while it is running. > > > After submitting this subprocess, I tried to open the file and readlines() > > in a loop (with a delay) while the subprocess was > > running. > > > The problem is I could not get anything until the subprocess > > finished. > > > I also tried to run another python program while the subprocess was > > running, and I could get what I want. > > > Does anyone know why? Thanks! > > > JD > > I believe you are approaching this incorrectly. You should probably be using > a socket server/socket client to communicate between these two. Or perhaps you > could use a multi-user database table. Writing/reading to files from two > different workstations and expecting them to be synchronized most likely > won't work. > > -Larry -- http://mail.python.org/mailman/listinfo/python-list
hang in multithreaded program / python and gdb.
Hi I want to debug a locking situation in my program. http://wiki.python.org/moin/DebuggingWithGdb Where do I get binaries for the following? - a debug python 2.5 binary/rpm for FC7 Also, the description of the hang is as follows: 2 threads waiting in [EMAIL PROTECTED] ... from PyThread_acquire_lock(), 1 thread in _poll from _socketmodule.so. Any ideas what might be going on? (Isn't PyThread_acquire_lock() trying to get the global interpreter lock?) Any help / pointers are welcome. Thanks /Jd -- http://mail.python.org/mailman/listinfo/python-list
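No resolution appears in the archive. For what it's worth, a sketch of a pure-Python way to see where each thread is blocked, without gdb, assuming you can add a small hook (such as a signal handler) to the hung program; sys._current_frames() is available from Python 2.5:

import sys
import traceback

def dump_all_threads():
    # Print a Python-level traceback for every running thread,
    # which usually shows which acquire() call each thread is stuck in.
    for thread_id, frame in sys._current_frames().items():
        print '--- thread %s ---' % thread_id
        traceback.print_stack(frame)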
creating "jsp-like" tool with python
I'd like to create a program that takes files with "jsp-like" markup and processes the embedded code (which would be python) to produce the output file. There would be two kinds of sections in the markup file: python code to be evaluated, and python code that returns a value that would be inserted into the output. This seems like it would be straightforward in python, and maybe there's even a library that I could use for this, but as a newbie to Python, I don't know the landscape very well. I am not looking for a big framework, just something small and simple that will do just this job. Suggestions or pointers would be greatly appreciated. Thanks... -- jeff -- http://mail.python.org/mailman/listinfo/python-list
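No replies are archived. As a very rough sketch of the idea (delimiters <% ... %> for statements and <%= ... %> for expressions are chosen arbitrarily, and statement blocks here cannot span output, so there are no template-level loops), the core is just a regex split plus eval/exec:

import re

TOKEN = re.compile(r'<%=(.*?)%>|<%(.*?)%>', re.S)

def render(template, namespace):
    # <% ... %> blocks are executed, <%= ... %> expressions are
    # evaluated and substituted into the output.
    out = []
    pos = 0
    for match in TOKEN.finditer(template):
        out.append(template[pos:match.start()])
        expr, stmt = match.groups()
        if expr is not None:
            out.append(str(eval(expr.strip(), namespace)))
        else:
            exec stmt.strip() in namespace
        pos = match.end()
    out.append(template[pos:])
    return ''.join(out)

page = "<% user = 'world' %>Hello, <%= user.title() %>!"
print render(page, {})   # -> Hello, World!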
looking for RSS generator
I'm looking for sample code to generate an RSS feed. I can annotate my html code with comments/tags to mark the articles, all I need is an "engine" to process the pages and generate the rss file. If I was a python wiz I'm sure I could do this quickly but as a newbie, an assist here would certainly make this task go faster. Thanks... -- jeff -- http://mail.python.org/mailman/listinfo/python-list
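No replies are archived. A bare-bones sketch of the generation side using only the standard library (the feed metadata and file name below are placeholders); the part that scans your annotated pages would produce the items list:

from xml.sax.saxutils import escape

def make_rss(channel_title, channel_link, channel_desc, items):
    # items: list of (title, link, description) tuples
    parts = ['<?xml version="1.0" encoding="utf-8"?>',
             '<rss version="2.0"><channel>',
             '<title>%s</title>' % escape(channel_title),
             '<link>%s</link>' % escape(channel_link),
             '<description>%s</description>' % escape(channel_desc)]
    for title, link, desc in items:
        parts.append('<item><title>%s</title><link>%s</link>'
                     '<description>%s</description></item>'
                     % (escape(title), escape(link), escape(desc)))
    parts.append('</channel></rss>')
    return '\n'.join(parts)

open('feed.rss', 'w').write(make_rss(
    'My site', 'http://example.com/', 'Recent articles',
    [('First post', 'http://example.com/first.html', 'Hello world')]))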
clean up html document created by Word
I am looking for python code (working or sample code) that can take an html document created by Microsoft Word and clean it up (if you've never had to look at a Word-generated html document, consider yourself lucky ;-) Alternatively, if you know of a non-python solution, I'd like to hear about it. Thanks... -- jeff -- http://mail.python.org/mailman/listinfo/python-list
Re: clean up html document created by Word
Wow, thanks for all the great responses! Here's my summary:

- demoronizer (from John Walker) is designed to solve some very particular problems that could be considered bugs. However, it doesn't remove the unnecessary html generated by Word. http://www.fourmilab.ch/webtools/demoroniser/

- The tool from Microsoft can be used in two ways: you can copy html to the clipboard or export to "compact html". The former results in slightly cleaner html but doesn't include the style sheet and so the rendering isn't as nice; the latter does include the style sheet but it's got slightly more junk in it. Both approaches preserve the "blank" paragraphs (basically, ) for spacing, which is unnecessary and clutters up the html. This tool did properly preserve the footnotes in my test document. http://www.microsoft.com/downloads/details.aspx?FamilyID=209ADBEE-3FBD-482C-83B0-96FB79B74DED&displaylang=EN BTW, I didn't know this, but much of the extra html was added by Microsoft to allow round-tripping between html and Word.

- Tidy with the Word 2000 configuration: It's already bundled in with my editor (PSPad) so this was a nice surprise (I guess I never explored that submenu -- that's the "problem" with modern editors and their zillions of features). The tidy output could use more whitespace to improve html readability, but I assume I can change the config file to do this. No "blank paragraphs" (better than the Microsoft tool) but footnotes were messed up. http://www.w3.org/People/Raggett/tidy/

-- jeff -- http://mail.python.org/mailman/listinfo/python-list
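For scripting the Tidy route from Python, a rough sketch that shells out to the tidy command line (option names taken from HTML Tidy's documentation; check them against your tidy version, and the file names are placeholders):

import subprocess

def tidy_word_html(src, dst):
    # --word-2000 strips Word-specific markup, --clean replaces
    # presentational clutter with CSS, and the drop-* options remove
    # proprietary attributes and empty paragraphs.
    cmd = ['tidy', '-asxhtml', '-utf8',
           '--word-2000', 'yes',
           '--clean', 'yes',
           '--drop-proprietary-attributes', 'yes',
           '--drop-empty-paras', 'yes',
           '-o', dst, src]
    return subprocess.call(cmd)   # tidy exits 0/1/2 for ok/warnings/errors

tidy_word_html('word_export.html', 'clean.html')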
xml-rpc timeout
Hi I have a multi-threaded application. For certain operations to the server, I would like to explicitly set a timeout so that I get a correct status from the call rather than a timed-out exception. Does anyone know how to go about doing it? /Jd -- http://mail.python.org/mailman/listinfo/python-list
threading.local _threading_local problems
Hi I have the following situation: I have a worker thread that does the "work" given to it. While doing the work, some of the objects use thread-local storage for state that requires an explicit close, e.g. connection handles. These objects are long living. The worker does not have any direct access to the objects. I would like to clean up the thread-local area explicitly so that I do not run out of connection handles. Any ideas on how to access the local storage from the thread? (threading.local() gives a new object every time; I am looking for something like singleton / global access from within the thread.) Or is there any way to notify the object when the thread for which it set the local storage is going away? This looks to me a bit like a design shortcoming, or I have missed something completely. Thanks /Jd -- http://mail.python.org/mailman/listinfo/python-list
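One approach (a sketch only, with hypothetical names like factory and job): keep a single module-level threading.local() instance that every component imports -- calling threading.local() again creates a new, empty store, which is the behaviour described above -- and have the worker close its per-thread handles in a finally block before the thread goes away.

import threading

_local = threading.local()   # module-level: import and share this one object

def get_connection(factory):
    # Lazily create one handle per thread and cache it in the shared local.
    if not hasattr(_local, 'conn'):
        _local.conn = factory()          # factory() is a hypothetical constructor
    return _local.conn

def worker(jobs, factory):
    try:
        for job in jobs:
            job(get_connection(factory))
    finally:
        # The owning thread cleans up its own storage before exiting.
        conn = getattr(_local, 'conn', None)
        if conn is not None:
            conn.close()
            del _local.conn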
Re: xml-rpc timeout
Steve Holden wrote: > Jd wrote: >> Hi >> I have a multi-threaded application. For certain operations to the >> server, I would like to explicitly set timeout so that I get correct >> status from the call and not timed out exception. >> Does anyone know how to go about doing it ? >> > The easiest way is to use socket.setdefaulttimeout() to establish a > longer timeout period for all sockets, I guess. It's difficult to > establish different timeouts for individual sockets when they aren't > opened directly by your own code (though each socket does also have a > method to set its timeout period). > > regards > Steve Yes, the problem here is that I do not have access to the socket. I have written my own transport etc., but at the point where the socket is getting created there is no context for the method, and at the point where I know which method I am going to call, I do not have access to the socket. I would have thought this to be an easy thing to achieve. In making xml-rpc easy to use, it has become difficult to control. Anyone have any other ideas? /Jd -- http://mail.python.org/mailman/listinfo/python-list
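The thread ends there. One common workaround (a sketch, assuming Python 2.7's xmlrpclib, where Transport.make_connection returns an httplib.HTTPConnection; the URL below is hypothetical) is to give each ServerProxy its own transport with its own timeout, so calls that need a different limit just use a different proxy:

import xmlrpclib

class TimeoutTransport(xmlrpclib.Transport):
    def __init__(self, timeout=30, *args, **kwargs):
        xmlrpclib.Transport.__init__(self, *args, **kwargs)
        self._timeout = timeout

    def make_connection(self, host):
        # Setting .timeout on the HTTPConnection before it connects makes
        # its underlying socket use that timeout.
        conn = xmlrpclib.Transport.make_connection(self, host)
        conn.timeout = self._timeout
        return conn

quick = xmlrpclib.ServerProxy('http://localhost:8000/', transport=TimeoutTransport(timeout=5))
slow = xmlrpclib.ServerProxy('http://localhost:8000/', transport=TimeoutTransport(timeout=300))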
A question about subprocess
Hi, I want to send my jobs over a whole bunch of machines (using ssh). The jobs will need to be run in the following pattern:

(Machine A)   (Machine B)   (Machine C)
Job A1        Job B1        Job C1
Job A2        Job B2        etc
Job A3        etc
etc

Jobs running on machines A, B, C should be in parallel; however, for each machine, jobs should run one after another. How can I do it with subprocess? Thanks, JD -- http://mail.python.org/mailman/listinfo/python-list
Re: A question about subprocess
Thanks very much for all the answers. JD On Oct 3, 6:24 pm, Dan Stromberg <[EMAIL PROTECTED]> wrote: > You don't necessarily need the subprocess module to do this, though you > could use it. > > I've done this sort of thing in the past with fork and exec. > > To serialize the jobs on the machines, the easiest thing is to just send > the commands all at once to a given machine, like "command1; command2; > command3". > > You can use waitpid or similar to check if a series of jobs has finished > on a particular machine. > > An example of something similar can be found > at http://stromberg.dnsalias.org/~strombrg/loop.html > > (If you look at the code, be kind. I wrote it long ago :) > > There's a benefit to saving the output from each machine into a single > file for that machine. If you think some machines will produce the same > output, and you don't want to see it over and over, you can analyze the > files with something > like http://stromberg.dnsalias.org/~strombrg/equivalence-classes.html. > > On Wed, 03 Oct 2007 16:46:20 +, JD wrote: > > > Hi, > > > I want to send my jobs over a whole bunch of machines (using ssh). The > > jobs will need to be run in the following pattern: > > > (Machine A) (Machine B) (Machine C) > > > Job A1 Job B1 Job C1 > > > Job A2 Job B2 etc > > > Job A3 etc > > > etc > > > Jobs running on machines A, B, C should be in parallel; however, for > > each machine, jobs should run one after another. > > > How can I do it with the subprocess module? > > > Thanks, > > > JD -- http://mail.python.org/mailman/listinfo/python-list
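A sketch of the subprocess-based version of that advice (hostnames and job commands below are made up): one ssh per machine so the machines run in parallel, with each machine's commands chained so they run one after another.

import subprocess

jobs = {
    'machineA': ['jobA1', 'jobA2', 'jobA3'],
    'machineB': ['jobB1', 'jobB2'],
    'machineC': ['jobC1'],
}

procs = []
for host, commands in jobs.items():
    # '&&' chains the jobs so each one starts only after the previous
    # one succeeds on that machine (use ';' to run them regardless).
    cmd = ' && '.join(commands)
    procs.append(subprocess.Popen(['ssh', host, cmd]))

for p in procs:
    p.wait()   # block until every machine has finished its sequence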
pygresql
Hi there. I'm trying to use python with postgresql. I decided to use psycopg to interact with the postgresql server. When installing psycopg it appeared that I needed mxDateTime. So I decided to install the mxbase package. I received the following error message (the interesting bit seems to be at the end):

[EMAIL PROTECTED]:/var/lib/postgresql/mxbase$ sudo python setup.py install
running install
running build
running mx_autoconf
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC -D_GNU_SOURCE=1 -I/usr/local/include -I/usr/include -c _configtest.c -o _configtest.o
_configtest.c: In function 'main':
_configtest.c:4: warning: statement with no effect
gcc -pthread _configtest.o -L/usr/local/lib -o _configtest
success!
removing: _configtest.c _configtest.o _configtest
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC -D_GNU_SOURCE=1 -I/usr/include/python2.5 -I/usr/local/include -I/usr/include -c _configtest.c -o _configtest.o
success!
removing: _configtest.c _configtest.o
macros to define: [('HAVE_STRPTIME', '1')]
macros to undefine: []
running build_ext
building extension "mx.DateTime.mxDateTime.mxDateTime" (required)
building 'mx.DateTime.mxDateTime.mxDateTime' extension
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC -DUSE_FAST_GETCURRENTTIME -DHAVE_STRPTIME=1 -Imx/DateTime/mxDateTime -I/usr/include/python2.5 -I/usr/local/include -I/usr/include -c mx/DateTime/mxDateTime/mxDateTime.c -o build/temp.linux-i686-2.5_ucs4/mx-DateTime-mxDateTime-mxDateTime/mx/DateTime/mxDateTime/mxDateTime.o
gcc: mx/DateTime/mxDateTime/mxDateTime.c: No such file or directory
gcc: no input files
error: command 'gcc' failed with exit status 1

I googled "error: command 'gcc' failed with exit status 1" and interestingly a lot of the results seemed to be linked with python. I can confirm that I do have gcc installed. One post seemed to suggest that I may be using too new a version of gcc. Do you think this is the problem or am I going astray somewhere else? Thank you very much in advance for any assistance, James. -- http://mail.python.org/mailman/listinfo/python-list
Re: pygresql
Btw apologies for naming the post 'pygresql'! That was the module I was attempting to use before. -- http://mail.python.org/mailman/listinfo/python-list
Re: pygresql
Apologies for essentially talking to myself out loud! I've switched back to pygresql. I think a lot of my problems were caused by not having installed postgresql-server-dev-8.2, which contains a lot of header files etc. I'm sure this was part of the problem with the psycopg modules as well. postgresql-server-dev can easily be installed of course by using: sudo apt-get install postgresql-server-dev I hope my ramblings have been of help to someone! -- http://mail.python.org/mailman/listinfo/python-list
How to pass out the result from iterated function
I got a recursive function like this:

def iterSomething(list):
    has_something = False
    for cell in list:
        if something in cell:
            has_something = True
            output = something
    if has_something:
        iterSomething(output)
    else:
        final_out = output

The problem is how can I read this final_out outside of the function? I tried the global statement; it seems not to work. Any idea? JD -- http://mail.python.org/mailman/listinfo/python-list
Re: How to pass out the result from iterated function
On Dec 10, 2:25 pm, eric <[EMAIL PROTECTED]> wrote: > On Dec 10, 9:16 pm, JD <[EMAIL PROTECTED]> wrote:
> > I got a recursive function like this:
> >
> > def iterSomething(list):
> >     has_something = False
> >     for cell in list:
> >         if something in cell:
> >             has_something = True
> >             output = something
> >     if has_something:
> >         iterSomething(output)
> >     else:
> >         final_out = output
> >
> > The problem is how can I read this final_out outside of the function.
> > I tried the global statement, it seems not to work. Any idea?
> >
> > JD

Thanks, I had tried returning the last result myself and it wasn't working; your code works, thanks.

> why don't you just return it?
>
> def iterSomething(list):
>     has_something = False
>     for cell in list:
>         if something in cell:
>             has_something = True
>             output = something
>     if has_something:
>         return iterSomething(output)
>     else:
>         return output
>
> ? -- http://mail.python.org/mailman/listinfo/python-list
algorithm to merge nodes
Hi, I need help for a task that looks very simple: I got a python list like: [['a', 'b'], ['c', 'd'], ['e', 'f'], ['a', 'g'], ['e', 'k'], ['c', 'u'], ['b', 'p']] The items in the list need to be merged: for example, 'a', 'b' will be merged, and 'c', 'd' will be merged. Also, if nodes in different items share the same name, all of those nodes need to be merged. For example, ['a', 'b'], ['a', 'g'], ['b', 'p'] will be merged to ['a', 'b', 'g', 'p']. The answer should be: [['a', 'b', 'g', 'p'], ['c', 'd', 'u'], ['e', 'f', 'k']] Does anyone have a solution? Thanks, JD -- http://mail.python.org/mailman/listinfo/python-list
Re: algorithm to merge nodes
Hi, Thanks for the help, but the result is not quite right: [['a', 'b', 'g'], ['c', 'd', 'u'], ['b', 'p'], ['e', 'f', 'k']] the ['b', 'p'] is not merged. JD On Oct 17, 2:35 pm, "Chris Rebert" <[EMAIL PROTECTED]> wrote: > (Disclaimer: completely untested) > > from collections import defaultdict > > merged = defaultdict(list) > for key, val in your_list_of_pairs: > merged[key].append(val) > > result = [[key]+vals for key, vals in merged.items()] > > Cheers, > Chris > -- > Follow the path of the Iguana...http://rebertia.com > > On Fri, Oct 17, 2008 at 1:20 PM, JD <[EMAIL PROTECTED]> wrote: > > Hi, > > > I need help for a task looks very simple: > > > I got a python list like: > > > [['a', 'b'], ['c', 'd'], ['e', 'f'], ['a', 'g'], ['e', 'k'], ['c', > > 'u'], ['b', 'p']] > > > Each item in the list need to be merged. > > > For example, 'a', 'b' will be merged, 'c', 'd' will be merged. > > > Also if the node in the list share the same name, all these nodes need > > be merged. > > > For example, ['a', 'b'], ['a', 'g'] ['b', 'p'] will be merged to ['a', > > 'b', 'g', 'p'] > > > The answer should be: > > > [['a', 'b', 'g', 'p'], ['c', 'd', 'u'], ['e', 'f', 'k']] > > > Anyone has a solution? > > > Thanks, > > > JD > > -- > >http://mail.python.org/mailman/listinfo/python-list -- http://mail.python.org/mailman/listinfo/python-list
Re: algorithm to merge nodes
Hi, Thanks, It works for this example, but if I add another item ['e', 'd']: [['a', 'b'], \ ['c', 'd'], \ ['e', 'f'], \ ['a', 'g'], \ ['e', 'k'], \ ['c', 'u'], \ ['b', 'p'],\ ['e', 'd']] The result is set(['a', 'p', 'b', 'g']), set(['e', 'c', 'u', 'd']), set(['k', 'e', 'd', 'f']) The right result should be: ['a', 'p', 'b', 'g'], ['c', 'u', 'e', 'd', 'k', 'f'] JD On Oct 17, 3:00 pm, Mensanator <[EMAIL PROTECTED]> wrote: > On Oct 17, 3:20 pm, JD <[EMAIL PROTECTED]> wrote: > > > > > Hi, > > > I need help for a task looks very simple: > > > I got a python list like: > > > [['a', 'b'], ['c', 'd'], ['e', 'f'], ['a', 'g'], ['e', 'k'], ['c', > > 'u'], ['b', 'p']] > > > Each item in the list need to be merged. > > > For example, 'a', 'b' will be merged, 'c', 'd' will be merged. > > > Also if the node in the list share the same name, all these nodes need > > be merged. > > > For example, ['a', 'b'], ['a', 'g'] ['b', 'p'] will be merged to ['a', > > 'b', 'g', 'p'] > > > The answer should be: > > > [['a', 'b', 'g', 'p'], ['c', 'd', 'u'], ['e', 'f', 'k']] > > > Anyone has a solution? > > A = [['a', 'b'], \ > ['c', 'd'], \ > ['e', 'f'], \ > ['a', 'g'], \ > ['e', 'k'], \ > ['c', 'u'], \ > ['b', 'p']] > > merged = [] > > for i in A: > if len(merged)==0: > merged.append(set(i)) > else: > gotit = False > for k,j in enumerate(merged): > u = j.intersection(set(i)) > if len(u): > merged[k] = j.union(set(i)) > gotit = True > if not gotit: > merged.append(set(i)) > > print merged > > ## > ##[set(['a', 'p', 'b', 'g']), set(['c', 'u', 'd']), set(['k', 'e', > 'f'])] > ## > > > > > Thanks, > > > JD -- http://mail.python.org/mailman/listinfo/python-list
Re: algorithm to merge nodes
Thanks, This one really works. JD On Oct 17, 3:17 pm, [EMAIL PROTECTED] wrote: > JD, you probably need the algorithm for connected components of an > undirected graph. > > For example you can do that with my graph > lib:http://sourceforge.net/projects/pynetwork/ > > from graph import Graph > g = Graph() > data = [['a', 'b'], ['c', 'd'], ['e', 'f'], ['a', 'g'], ['e', 'k'], > ['c', 'u'], ['b', 'p']] > g.addArcs(data, bi=True) > > print g.connectedComponents() > # Output: [['a', 'b', 'g', 'p'], ['c', 'u', 'd'], ['e', 'k', 'f']] > > Bye, > bearophile -- http://mail.python.org/mailman/listinfo/python-list
Re: algorithm to merge nodes
It could be a very good "homework assignment", but this is for a real application: each item in the list represents the two terminals of a resistor. All the resistors in the list are shorted together, and I need to figure out how many independent nets there are. JD On Oct 17, 4:16 pm, Paul McGuire <[EMAIL PROTECTED]> wrote: > On Oct 17, 3:20 pm, JD <[EMAIL PROTECTED]> wrote:> Hi, > > > I need help for a task looks very simple: > > > > I smell "homework assignment". > > -- Paul -- http://mail.python.org/mailman/listinfo/python-list
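For readers of the archive, a self-contained sketch of the connected-components idea without the external graph library (plain union-find over the terminal names); the extra pair ['e', 'd'] from the follow-up is included to show the transitive merge:

def merge_nets(pairs):
    # Union-find: every pair shorts two terminals together, and
    # terminals that end up with the same root form one net.
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for a, b in pairs:
        union(a, b)

    nets = {}
    for node in parent:
        nets.setdefault(find(node), []).append(node)
    return [sorted(net) for net in nets.values()]

pairs = [['a', 'b'], ['c', 'd'], ['e', 'f'], ['a', 'g'],
         ['e', 'k'], ['c', 'u'], ['b', 'p'], ['e', 'd']]
print merge_nets(pairs)
# -> [['a', 'b', 'g', 'p'], ['c', 'd', 'e', 'f', 'k', 'u']] (order may vary)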
issue with cookielib.LWPCookieJar
Greetings: My cookiejar contains the cookie that I need; however, when I do cj.save(file) it does not actually save out to cookies.lwp. Does anyone have any clue what would keep this from saving? It CREATED my cookies.lwp file so I know it's not permissions.

cookies.lwp:
#LWP-Cookies-2.0

test.py:

def requestXML(frag, server='US', data=None):
    import urllib
    import urllib2
    import os.path
    import cookielib

    base_urls = {
        "US": "http://www.wowarmory.com/",
        "EU": "http://eu.wowarmory.com/",
        "US_SECURE": "https://www.wowarmory.com/",
        "EU_SECURE": "https://eu.wowarmory.com/"
    }

    COOKIEFILE = 'cookies.lwp'
    cj = cookielib.LWPCookieJar()
    if os.path.isfile(COOKIEFILE):
        cj.load(COOKIEFILE)
    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
    try:
        if data is not None:
            data = urllib.urlencode(data)
        req = urllib2.Request(base_urls[server] + frag, data)
        req.add_header('User-agent', 'Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.1.10) Gecko/20071115 Firefox/2.0.0.10')
        handle = opener.open(req)
    except IOError, e:
        if hasattr(e, 'code'):
            return 'We failed to open "%s".' % (base_urls[server] + frag)
        elif hasattr(e, 'reason'):
            return "The error object has the following 'reason' attribute: %s" % e.reason
    headers = handle.info()
    xml = handle.read()
    print xml
    print headers
    print data
    for index, cookie in enumerate(cj):
        print index, ' : ', cookie

-- http://mail.python.org/mailman/listinfo/python-list
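No reply is archived. Two things worth noting about the snippet (the second is an assumption about the cookie in question): the code shown never actually calls cj.save(), and cookielib's save() silently skips session cookies -- ones with no explicit expiry, which login cookies often are -- unless ignore_discard is set. A minimal sketch of the save step:

# after opener.open(req) has run and the cookie is in cj:
cj.save(COOKIEFILE, ignore_discard=True, ignore_expires=True)
# ignore_discard=True is the usual gotcha: without it, session cookies
# are treated as discardable and never written to the file.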