I have a database of board games; its data is based on the XML feed
from:

http://boardgamegeek.com/xmlapi/boardgame/1,2,3?stats=1

Presently it has more than 50,000 game entries, along with associated
publishers, artists, designers, and so forth.  In other words--a
massive amount of information.

Everything is structured well with proper model associations, I think,
so pulling the data out of the MySQL database is a fairly trivial
task.  The issue is that the resulting data array is huge... and I'm
running out of memory.

I can sometimes work around this by pulling, say, 1000 records at a
time, but then I run into the next issue: actually doing all the
reading and writing to XML (via the SimpleXML class) takes around 4 or
5 minutes.  In my local environment I can make the maximum execution
time limitless, so that's not a huge problem there.  On my Web host,
though, that's not such an easy task--I'm not even sure it can be
done.
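To make it concrete, here's roughly what my batching approach looks like--a simplified sketch, not my actual code, with illustrative model and field names (and it assumes the Containable behavior for the associations):

```php
<?php
// Simplified sketch of the batched pull-and-write loop.
// Model/field names ("Game", "Publisher", etc.) are illustrative.
$xml  = new SimpleXMLElement('<boardgames/>');
$page = 1;
do {
    // Pull 1000 games at a time so each find() result stays small.
    $games = $this->Game->find('all', array(
        'limit'   => 1000,
        'page'    => $page++,
        'contain' => array('Publisher', 'Artist', 'Designer'),
    ));
    foreach ($games as $game) {
        $node = $xml->addChild('boardgame');
        $node->addAttribute('objectid', $game['Game']['id']);
        $node->addChild('name', htmlspecialchars($game['Game']['name']));
        // ...publishers, artists, designers appended similarly...
    }
} while (!empty($games));

// The entire SimpleXML tree still lives in memory until this point,
// which is where the ~800MB output becomes a problem.
file_put_contents('boardgames.xml', $xml->asXML());
```

So the batching helps on the database side, but the SimpleXML document itself still accumulates in memory until the final write.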

So, that brings me to the main question: what's the best way to
handle processing tens of thousands of records, reading them from a
MySQL database and then writing them out as XML?  (Incidentally, the
XML file that gets written is some 800MB in size!)

Any and all suggestions are most welcome.

--Carl

--

You received this message because you are subscribed to the Google Groups 
"CakePHP" group.
