On Tue, Jan 20, 2009 at 7:27 PM, Tim Arnold wrote:
> I had the same problem you did, but then I changed the code to create a new
> soup object for each file. That drastically increased the speed. I don't
> know why, but it looks like the soup object just keeps getting bigger with
> each feed.
>
>
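(For reference, a minimal sketch of the per-file approach Tim describes, assuming BeautifulSoup 3.x and a local directory of .html files; the directory argument and the process() handler are placeholders, not taken from the original code.)

import os
from BeautifulSoup import BeautifulSoup

def process(soup):
    # Placeholder for whatever per-file handling the real script does.
    return soup.title

def parse_all(html_dir):
    for name in os.listdir(html_dir):
        if not name.endswith('.html'):
            continue
        f = open(os.path.join(html_dir, name))
        try:
            # Build a fresh soup for each file instead of reusing one,
            # so the previous parse tree can be garbage collected.
            soup = BeautifulSoup(f.read())
        finally:
            f.close()
        process(soup)
        soup = None   # drop the reference before the next iteration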
Hi all,
I have found the actual solution for this problem.
I tried using BeautifulSoup.SoupStrainer() and it improved memory usage
to the greatest extent. Now it uses a maximum of 20 MB (earlier
it was >800 MB on a 1 GB RAM system).
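(A minimal sketch of the SoupStrainer usage, assuming BeautifulSoup 3.x; straining for <a> tags is only an example here, the actual tags of interest are not stated in the thread.)

from BeautifulSoup import BeautifulSoup, SoupStrainer

# Only the tags matched by the strainer are parsed into tree nodes;
# the rest of each document is skipped, which keeps memory use low.
only_links = SoupStrainer('a')

def extract_links(html_text):
    soup = BeautifulSoup(html_text, parseOnlyThese=only_links)
    return [tag.get('href') for tag in soup.findAll('a')]

Since only the matched tags are built into the tree, each soup stays small no matter how large the source file is.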
thanks all.
--
Yours,
S.Selvam
"Philip Semanchuk" wrote in message
news:mailman.7530.1232375454.3487.python-l...@python.org...
>
> On Jan 19, 2009, at 3:12 AM, S.Selvam Siva wrote:
>
>> Hi all,
>>
>> I am running a python script which parses nearly 22,000 html files
>> locally
>> stored using BeautifulSoup.
>> The problem is
S.Selvam Siva wrote:
Hi all,
I am running a python script which parses nearly 22,000 html files stored
locally, using BeautifulSoup.
The problem is that the memory usage increases linearly as the files are
being parsed.
When the script has crossed parsing 200 files or so, it consumes all the
available RAM and the CPU usage