I believe PHP should be able to handle it, but it's a bad idea, because your app will not scale: if your script consumes 2 MB of memory on average, 100 users accessing it at the same time will consume 200 MB. Of course, if you expect only a small number of users it does not matter.
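You can get a rough figure for your own script with memory_get_usage() (on older builds it needs PHP compiled with --enable-memory-limit). A quick untested sketch:

    <?php
    // build a moderately large array and see what it costs
    $records = array();
    for ($i = 0; $i < 10000; $i++) {
        $records[$i] = array('id' => $i, 'name' => "record $i");
    }
    echo memory_get_usage() . " bytes\n";
    ?>

Multiply whatever that prints by the number of concurrent users you expect and you have your worst case.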
The biggest XML job I have handled with PHP is parsing the ODP RDF dump, which is around 700 MB and has a couple of million records. Obviously arrays are out of the question in such a scenario, even though only one user will be accessing the script at a given moment.
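For a job that size the trick is to use the expat (SAX) functions and feed the file to the parser a chunk at a time, so memory use stays flat regardless of the file size. Roughly like this (untested; the handler bodies are placeholders and the filename is illustrative):

    <?php
    function startElement($parser, $name, $attrs) {
        // process one record at a time here instead of
        // accumulating everything in an array
    }
    function endElement($parser, $name) { }

    $parser = xml_parser_create();
    xml_set_element_handler($parser, 'startElement', 'endElement');

    $fp = fopen('content.rdf.u8', 'r');
    while (!feof($fp)) {
        // feed the parser 8 KB at a time; the final arg marks EOF
        $data = fread($fp, 8192);
        if (!xml_parse($parser, $data, feof($fp))) {
            die(xml_error_string(xml_get_error_code($parser)));
        }
    }
    fclose($fp);
    xml_parser_free($parser);
    ?>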
Kim Steinhaug wrote:
Something I've wondered about as I started working with XML. Importing huge XML files and converting these to arrays works very well indeed. But I am wondering if there are any limits to how many arrays PHP can handle once performance is accounted for.
Say I create an array from an XML file with lots of children, say up to 10 levels of nesting, which would give 10-dimensional arrays. Say we then have 10,000 records, or even 100,000 records.
Will this be a problem for PHP to handle, or should I break such a process into smaller workloads (meaning less depth in the array)?
Anyone with some experience on this?
-- Raditha Dissanayake. ------------------------------------------------------------------------ http://www.radinks.com/sftp/ | http://www.raditha.com/megaupload Lean and mean Secure FTP applet with | Mega Upload - PHP file uploader Graphical User Interface. Just 150 KB | with progress bar.