It is not a database problem; it seems to me the problem is in the reading
of the XML file. This is the relevant part of my code:

$parser = xml_parser_create();

$data = implode("", file("example.xml")); // whole file read into one string

xml_parse_into_struct($parser, $data, $structure, $index);

xml_parser_free($parser);

It could be that building $data fails because example.xml is too big; in
fact, if I decrease the size of the XML file the script works (even if it
takes several minutes). Unfortunately the xml_parse_into_struct() function
needs the whole document in the $data string as an argument.
My PHP version is 4.0.6.
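
A possible workaround, sketched below and untested against your data: feed
the expat parser in small chunks with xml_parse() and element handlers,
instead of loading the whole 15 MB file into $data for
xml_parse_into_struct(), so memory use stays roughly constant. The element
name PRODUCT and the function handle_product() are only placeholders for
whatever your real XML and insert logic look like:

<?php
// Sketch: stream the XML through expat instead of slurping the whole file.
// PRODUCT and handle_product() are hypothetical names.

$product = array();
$currentTag = "";

function start_element($parser, $name, $attrs)
{
    global $currentTag;
    $currentTag = $name;
}

function end_element($parser, $name)
{
    global $currentTag, $product;
    if ($name == "PRODUCT") {          // assumed per-record element
        handle_product($product);      // e.g. run one INSERT for this record
        $product = array();
    }
    $currentTag = "";
}

function char_data($parser, $data)
{
    global $currentTag, $product;
    if ($currentTag != "") {
        if (!isset($product[$currentTag])) {
            $product[$currentTag] = "";
        }
        $product[$currentTag] .= $data; // handler may fire more than once per node
    }
}

$parser = xml_parser_create();
xml_set_element_handler($parser, "start_element", "end_element");
xml_set_character_data_handler($parser, "char_data");

$fp = fopen("example.xml", "r");
while (!feof($fp)) {
    $chunk = fread($fp, 8192);                    // 8 KB at a time
    if (!xml_parse($parser, $chunk, feof($fp))) { // third arg: final chunk?
        die("XML error at line " . xml_get_current_line_number($parser));
    }
}
fclose($fp);
xml_parser_free($parser);
?>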



"Raditha Dissanayake" <[EMAIL PROTECTED]> ha scritto nel messaggio
news:[EMAIL PROTECTED]
> Ago wrote:
>
> >I have a PHP script that reads data from an XML file uploaded via an
> >HTTP form. This file is about 15 MB. The script builds a query of
> >thousands of inserts, as many as the products the XML file contains
> >(more or less 40000 inserts). After 4 minutes the httpd process aborts
> >by itself. What can this depend on? I have also increased the
> >memory_limit parameter in php.ini to 32 MB and set set_time_limit to 0.
> >What else can I do?
> >
> >
> Use a single insert statement instead of 40,000 - or -
> Drop the primary key before the insert and recreate it afterwards - or -
> Save the data into a CSV file and use 'LOAD DATA INFILE'.
>
>
> Having said that, the page shouldn't stop loading after 4 minutes if you
> use set_time_limit; please check whether you have some other setting that
> may be killing it.
>
>
> >Thanks.
> >
> >
> >
>
>
> -- 
> Raditha Dissanayake.
> ---------------------------------------------
> http://www.raditha.com/megaupload/upload.php
> Sneak past the PHP file upload limits.
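
Regarding the first suggestion above, here is a rough sketch of what
batching the inserts could look like on the PHP side (the table and column
names "products", "name" and "price" are made up, the batch size of 500
rows is an arbitrary guess, and an open MySQL connection is assumed):

<?php
// Sketch only: send many rows per INSERT instead of 40,000 single queries.
// Table and column names are placeholders; a mysql_connect() connection
// is assumed to already exist.

function flush_batch($values)
{
    if (count($values) == 0) {
        return;
    }
    $sql = "INSERT INTO products (name, price) VALUES " . implode(", ", $values);
    mysql_query($sql) or die(mysql_error());
}

$values = array();
foreach ($rows as $row) {           // $rows: records extracted from the XML
    $values[] = "('" . mysql_escape_string($row['name']) . "', '"
              . mysql_escape_string($row['price']) . "')";
    if (count($values) >= 500) {    // arbitrary batch size
        flush_batch($values);
        $values = array();
    }
}
flush_batch($values);               // insert whatever is left over
?>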
