Greets,

Sorry for my late answer; Google Groups lost my post... First, thank you for your explanations about memory handling in the OS and Python.

I've tried with Python 2.5 under Linux, parsing a 66 MB XML file with cElementTree:

When starting Python: 2.1 MB private memory used

    import xml.etree.cElementTree as ElementTree   # 3.4 MB used
    et = ElementTree.parse('otherdata.xml')        # 218.6 MB used
    del et                                         # 43.3 MB used
    et = ElementTree.parse('otherdata.xml')        # 218.6 MB used
    del et                                         # 60.6 MB used
    et = ElementTree.parse('otherdata.xml')        # 218.6 MB used
    del et                                         # 54.1 MB used
    et = ElementTree.parse('otherdata.xml')        # 218.6 MB used
    del et                                         # 54.1 MB used
    et = ElementTree.parse('otherdata.xml')        # 218.6 MB used
    del et                                         # 54.1 MB used
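In case it helps, here is roughly how the test can be reproduced as a standalone script. This is only a sketch: it reads VmRSS from the Linux /proc filesystem, which is the resident set size and only an approximation of the private-memory figures I quoted (I measured those from outside the process), and it adds gc.collect() to rule out objects waiting for the cycle collector:

    import gc
    import xml.etree.cElementTree as ElementTree

    def rss_kb():
        # Resident set size of the current process, in kB, read from
        # Linux /proc; the line looks like "VmRSS:     2148 kB".
        for line in open('/proc/self/status'):
            if line.startswith('VmRSS:'):
                return int(line.split()[1])

    print 'at start:', rss_kb(), 'kB'
    for i in range(5):
        et = ElementTree.parse('otherdata.xml')
        print 'after parse:', rss_kb(), 'kB'
        del et
        gc.collect()  # make sure nothing is pending in the cycle collector
        print 'after del:', rss_kb(), 'kB'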
Why do I have such erratic memory freeing? I've run the same test many times with a fresh interpreter, and I get 43.3 MB after the first free and 54.1 MB after the others. If there is a memory pool limit for list and dict, why can't I get back to 43.3 MB or 54.1 MB every time?

I've tried using readlines():

When starting Python: 2.1 MB private memory used

    f = open('otherdata.xml')   # 2.2 MB used
    data = f.readlines()        # 113 MB used
    del data                    # 2.7 MB used
    f.seek(0)                   # 2.7 MB used
    data = f.readlines()        # 113 MB used
    del data                    # 2.7 MB used

This time the memory handling is good (by my definition of memory handling). So is there a problem with cElementTree?

I've done a last test with ElementTree:

When starting Python: 2.1 MB private memory used

    import xml.etree.ElementTree as ElementTree   # 3.2 MB used
    et = ElementTree.parse('otherdata.xml')       # 211.4 MB used (but very slow :p)
    del et                                        # 21.4 MB used
    et = ElementTree.parse('otherdata.xml')       # 211.4 MB used
    del et                                        # 29.8 MB used

So why do I see such differences in memory freeing? Is it only due to fragmentation? Anyway, Python 2.5 has better memory handling than 2.4, but it is still not perfect for me. I think I haven't really understood the problem with the use of malloc (fragmentation, ...).

Thanks for your help.

Regards,
FP
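P.S. If the growth really is fragmentation from the many small node objects, one workaround I plan to try is not building the whole tree at all: iterparse() lets you clear elements as soon as they are complete. A rough sketch (the 'record' tag is a placeholder for whatever element actually repeats in otherdata.xml):

    import xml.etree.cElementTree as ElementTree

    # Ask for start events too, so we can grab the root element and
    # detach finished children from it; otherwise processed elements
    # stay reachable and the tree still grows.
    context = iter(ElementTree.iterparse('otherdata.xml',
                                         events=('start', 'end')))
    event, root = context.next()

    for event, elem in context:
        if event == 'end' and elem.tag == 'record':  # placeholder tag
            # ... process elem here ...
            root.clear()  # throw away children parsed so far, incl. elem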