""foster dot graeme at gmail dot com"" <php-bugs@lists.php.net> wrote in 
message news:[EMAIL PROTECTED]
> ID:               40494
> User updated by:  foster dot graeme at gmail dot com
> Reported By:      foster dot graeme at gmail dot com
> Status:           Bogus
> Bug Type:         Zip Related
> Operating System: Linux
> PHP Version:      5.2.1
> Assigned To:      pajoye
> New Comment:
>
> Would it be possible to add a brief description of this situation to
> the documentation? For example, the following could be added to the
> description of ZipArchive::addFile:
>
> Description
>
> bool ZipArchive::addFile ( string filename [, string localname] )
>
> Registers a file, by path, to be added to the ZIP archive. The file
> is not compressed immediately; when the archive is closed, each
> registered file is checked to ensure it still exists and is then
> compressed and added to the archive. If a large number of files are
> being added, the number of file handles permitted by the OS may be
> exceeded; if that occurs, the status will be set to
> ZIPARCHIVE::ER_OPEN. This can be avoided by closing and then
> reopening the archive before the limit is reached.
>
> for example:
>
> if ($zip->numFiles % $limit == 0)
> {
>   $zip->close();
>   $zip->open($filename, ZIPARCHIVE::CREATE);
> }
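>
> A slightly fuller sketch of the same pattern (the paths, the $limit
> value and the error handling here are only illustrative assumptions,
> not part of this report):
>
> <?php
> $filename = '/tmp/backup.zip';   // hypothetical target archive
> $source   = '/path/to/folder';   // hypothetical source directory
> $limit    = 250;                 // conservative handle limit
>
> $zip = new ZipArchive();
> if ($zip->open($filename, ZIPARCHIVE::CREATE) !== true) {
>     die("Cannot create $filename\n");
> }
>
> $it = new RecursiveIteratorIterator(
>     new RecursiveDirectoryIterator($source)
> );
>
> foreach ($it as $file) {
>     if (!$file->isFile()) {
>         continue;    // skip directories and dot entries
>     }
>     $zip->addFile($file->getPathname());
>
>     // Flush pending entries before the OS file handle limit is hit.
>     if ($zip->numFiles % $limit == 0) {
>         $zip->close();
>         if ($zip->open($filename) !== true) {
>             die("Cannot reopen $filename\n");
>         }
>     }
> }
>
> $zip->close();
> ?>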
>
>
> Previous Comments:
> ------------------------------------------------------------------------
>
> [2007-02-15 14:35:34] [EMAIL PROTECTED]
>
> "I still think that it would be nice if there was some way for the
> system to manage this."
>
> It is on the TODO list, as I have already said three times in this
> discussion. The solution is to add different modes:
> - commit at the end, when the archive is closed
> - immediate addition (will be much slower)
>
> And again, it is in my TODOs already. I cannot tell when they will be
> available (I do this in my free time).
>
> In the meantime a simple:
>
> if (($zip->numFiles % $yourlimit) == 0) { $zip->close(); $zip->open($filename); }
>
> will do it.
>
>
>
> "the archive can be partially built prior to the ulimit being reached.
> This could be set as 250, with the ability to overload it. Maybe this
> would only be triggered if a flag was set when the archive was
> opened."
>
> This solution does not work. The limit is arbitrary. There is no way
> to get an exact value (and I doubt PHP is the only running process).
>
>
> ------------------------------------------------------------------------
>
> [2007-02-15 14:02:51] foster dot graeme at gmail dot com
>
> Okay, thanks for the explanation; I understand the problem a little
> better. I still think that it would be nice if there was some way for
> the system to manage this.
>
> I was thinking along the lines of a function to flush the files so
> that the archive can be partially built prior to the ulimit being
> reached. This could be set to 250, with the ability to override it.
> Maybe this would only be triggered if a flag was set when the archive
> was opened.
>
> ------------------------------------------------------------------------
>
> [2007-02-15 13:23:36] [EMAIL PROTECTED]
>
> See:
>
> http://pecl.php.net/bugs/bug.php?id=9443
>
> "it would be good if this wasn't necessary, in thatthe code could catch
> the problem and allocate extra file handles if that is the problem."
>
> This is not something I can control. The operating system defines it
> and there is no way for me to increase this value.
>
> I suggest you close and reopen it every 1000 files or so (or even
> every 255 if you want to take the safest route, i.e. old Windows).
>
> Future releases will have a different mode, where the checks will be
> done only when you close the archive.
>
> ------------------------------------------------------------------------
>
> [2007-02-15 13:14:57] foster dot graeme at gmail dot com
>
> Maybe I need to explain this problem a little more.
>
> I am trying to archive a folder on the server; at the moment it
> contains 5,609 folders and 11,221 files. The script loops through the
> files, adding them to the archive using the addFile() method. After
> the first 1002 files I get a ZIPARCHIVE::ER_OPEN error. If I close
> the archive and then open it again, I still have that error. However,
> if I close and reopen the archive before I get that error, then I can
> archive all 11,221 files.
>
> Since closing the archive and re-opening it fixes the problem (so
> long as I do that before I get the error), may I suggest that closing
> an archive should clear the status. Obviously, it would be good if
> this wasn't necessary, in that the code could catch the problem and
> allocate extra file handles if that is the problem.
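>
> As a rough sketch of the situation described above (the archive path
> and the $files array are hypothetical, and exactly where the ER_OPEN
> status surfaces is an assumption based on this report, not something
> verified here):
>
> <?php
> $zip = new ZipArchive();
> $zip->open('/tmp/archive.zip', ZIPARCHIVE::CREATE);
>
> foreach ($files as $path) {   // $files: hypothetical list of paths
>     $zip->addFile($path);
>
>     // Per this report, once the OS file handle limit is exhausted
>     // the archive status becomes ZIPARCHIVE::ER_OPEN and the
>     // remaining additions fail.
>     if ($zip->status == ZIPARCHIVE::ER_OPEN) {
>         echo "Handle limit hit after {$zip->numFiles} files\n";
>         break;
>     }
> }
>
> $zip->close();
> ?>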
>
> ------------------------------------------------------------------------
>
> [2007-02-15 11:41:24] [EMAIL PROTECTED]
>
> "When adding files to an archive, (using successive
> ZipArchive::addFile()
> commands) the compression doesn't happen until the file is closed. "
>
> Yes, we do it while finalizing the archive.
>
> " This can result in an out of memory error, "
>
> You will run out of file handles before running out of memory. It
> does not really use much memory, only the file names and the file
> handles.
>
> I suppose you are talking about the file handles?
>
> "It would certainly require a rewrite of the ugly function
> zip_close()"
>
> What is ugly in this function? Or do you have a portable way to lock a
> file until the archive creation is done?
>
> I think you are referring to the file handle limitation. There is
> already a bug about it and I plan to add a special (less safe) mode.
> This mode will allow one to add only the paths without checks;
> errors will occur only when the archive is closed. But that's a
> feature addition, not a bug fix.
>
> I close this bug (not a bug > bogus).
>
> Thanks for your report!
>
> ------------------------------------------------------------------------
>
> The remainder of the comments for this report are too long. To view
> the rest of the comments, please view the bug report online at
>    http://bugs.php.net/40494
>
> -- 
> Edit this bug report at http://bugs.php.net/?id=40494&edit=1 

-- 
PHP Internals - PHP Runtime Development Mailing List
To unsubscribe, visit: http://www.php.net/unsub.php
