1) Review your design - you say you are processing a large data set, so just make sure you are not trying to store three versions of it at once. Working through the data one record at a time (something like the sketch below) keeps only one piece in memory. If you are missing a design, create a flow chart or something that is true to the code you have produced. You could probably even post the design here if you are brave enough.
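A minimal sketch of that idea - the file name and the per-record work are made up, since we have not seen your code - reads the data through a generator instead of loading it all into a list:

def records(path):
    # yield one record at a time instead of building the whole list in memory
    for line in open(path):
        yield line.rstrip("\n")

def handle(record):
    # stands in for whatever you actually do per record
    return len(record)

total = 0
for rec in records("huge_data.txt"):   # only one record alive at a time
    total += handle(rec)
print(total)

If each result is small (a count, a sum), accumulate that instead of keeping the records themselves around.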
2) Check your implementation - make sure you manage lists, arrays, etc. correctly. You need to sever links (references) to objects before they can be swept up. I know it is obvious, but it is easy to get wrong in a hasty implementation. (There is a rough sketch of this at the end of the message.)

3) Verify and test the problem characteristics - profilers, top, etc. It is hard for us to help you much without more info. Test your assumptions; there is a small sketch of that at the end as well. Problem solving and debugging is a process, not some mystic art. Though sometimes the gremlins disappear after a pint or two :-)

p

[EMAIL PROTECTED] wrote:
> I have a Python program which is running on a huge data set. After
> starting the program the computer becomes unstable and it gets very
> difficult to even open a konsole to kill that process. What I am
> assuming is that I am running out of memory.
>
> What should I do to make sure that my code runs fine without becoming
> unstable? How should I address the memory leak problem, if any? I have
> a gig of RAM.
>
> Any help is appreciated.
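To illustrate point 2 above - made-up data, not your code - a long-lived container keeps everything it references alive, so memory grows until the references are severed:

import gc

cache = {}

def process(i):
    data = list(range(10000))   # stands in for a big intermediate result
    cache[i] = data             # storing it here keeps it alive indefinitely
    return sum(data)

for i in range(100):
    process(i)

# Sever the references once the results are no longer needed so the
# memory can be reclaimed.
cache.clear()

# gc.collect() only matters when objects form reference cycles; CPython
# frees everything else as soon as its last reference disappears.
gc.collect()

The same trap hides in module-level lists, caches, and anything else that quietly hangs on to old results.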
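And for point 3, one quick way to test the "running out of memory" assumption from inside the program (Unix only; on Linux ru_maxrss is in kilobytes), alongside watching the process in top:

import gc
import resource

def report(label):
    # peak resident set size is a high-water mark, so it only ever grows
    peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print("%s: peak RSS %s kB, %d objects tracked by gc"
          % (label, peak, len(gc.get_objects())))

report("before")
data = [list(range(1000)) for i in range(1000)]   # made-up workload
report("after building data")

If those numbers climb toward your gig of RAM while the real program runs, the theory is confirmed and you know where to dig.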