Thanks for your reply! I'm going to use this for a backup system for our webstore, where some of our customers have *a lot* of products. Given the structure of the database with categories and images, 5000 unique products quickly gives 3x = 15000 arrays. Then again, how often would a client actually need to revert to a backup? I've never needed to do it so far, but I still want to build this backup system on XML (it looks kinda up to date :) and at the same time know that it will still work, even for the customers that really do have a lot of products in their system.
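
To get a rough feel for what PHP actually does with that much data, I was thinking of a quick test along these lines. This is only a sketch, and the filename and output are placeholders, nothing from the real system yet:

<?php
// Quick and dirty benchmark sketch, not production code: read one XML
// backup file, turn it into arrays with the expat parser, and report
// how much memory and time that took.

$file = 'products-backup.xml';   // placeholder, not the real dump

$start_time = microtime(true);
$start_mem  = memory_get_usage();

$data = file_get_contents($file);          // whole file in memory at once

$parser = xml_parser_create();
xml_parse_into_struct($parser, $data, $values, $index);
xml_parser_free($parser);

printf("elements parsed: %d\n", count($values));
printf("memory used    : %.1f MB\n", (memory_get_usage() - $start_mem) / 1048576);
printf("time taken     : %.2f sec\n", microtime(true) - $start_time);
?>

Running that against dumps of different sizes should show roughly where the memory use and the 30 second limit start to hurt.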
The alternative would be that customers with more than X products only get access to a MySQL dump, or to have a script that evaluates the XML file and warns the user that the file is too large for automatic import / revert.

a) From what I understand of your answer, you're telling me that PHP itself will not have any "problems" working with 1.000.000.000 arrays, as long as there is enough RAM and processing power available and the timeout is extended far enough. Or would PHP die a painful death as soon as it gets "too much" data? Could we conclude the following: XML file > 200 MB = no PHP?

b) I should do some benchmarking to find the "magic" marker for how much XML data is "affordable" to grab on a webserver shared with other clients, to prevent angry customers or failures caused by the default timeout setting of 30 seconds.

--
Kim Steinhaug
---------------------------------------------------------------
There are 10 types of people when it comes to binary numbers:
those who understand them, and those who don't.
---------------------------------------------------------------

"Raditha Dissanayake" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> Hi,
>
> I believe PHP should be able to handle it, but it's a bad idea. The
> reason being your app will not scale: if your script consumes 2 MB of
> memory on average, 100 users accessing it at the same time will be
> 200 MB. Of course if you expect only a small number of users it does
> not matter.
>
> The biggest XML job I have handled with PHP is parsing the ODP RDF
> dump, which is around 700 MB. Obviously arrays are out of the question
> in such a scenario, even though only one user will be accessing the
> script at a given moment. The ODP dump has a couple of million records.
>
>
> Kim Steinhaug wrote:
>
> >Something I've wondered about as I started working with XML.
> >Importing huge XML files and converting these to arrays works
> >very well indeed. But I am wondering if there are any limits to
> >how many arrays PHP can handle when performance is accounted for.
> >
> >Say I create an array from an XML file with lots of children, say
> >we are talking of up to 10 levels of children, which would give
> >10-dimensional arrays. Say we then have 10.000 records, or even
> >100.000 records.
> >
> >Will this be a problem for PHP to handle, or should I break such
> >a process into smaller workloads (meaning less depth in the array)?
> >
> >Anyone with some experience on this?
>
> --
> Raditha Dissanayake.
> ------------------------------------------------------------------------
> http://www.radinks.com/sftp/          | http://www.raditha.com/megaupload
> Lean and mean Secure FTP applet with  | Mega Upload - PHP file uploader
> Graphical User Interface. Just 150 KB | with progress bar.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php