And even "worse":
-8<-
#!/usr/bin/perl
use Benchmark;
$t0 = new Benchmark;
&bla();
$t1 = new Benchmark;
# Memory has grown on my machine to 110 MB
#sleep 20;
$t2 = new Benchmark;
&bla();
$t3 = new Benchmark;
# Memory still resides at 110 MB on my machine
print "First run:  ", timestr(timediff($t1, $t0)), "\n";
print "Second run: ", timestr(timediff($t3, $t2)), "\n";
sub bla {
    my $var = "";
    for ( 1 .. 10_000_000 ) {
        $var .= "xxx";
    }
}
Well, this example does not demonstrate the problem; it demonstrates the
solution: let perl free memory it allocated once by setting the variable
to undef ;-)
---8<---
#!/usr/bin/perl
&bla();
print "Done";
my $bla = <STDIN>;   # keep the script alive so you can inspect its memory usage
sub bla {
    my $var = "";
    for ( 1 .. 10_000_000 ) {
        $var .= "xxx";
    }
    undef $var;      # hand the ~30 MB string back to perl
}
You can try memory management yourself and see that the memory allocated
is not wiped until the script is finished.
8<
#!/usr/bin/perl
&bla();
print "Done";
sub bla {
    my $var = "";
    for ( 1 .. 10_000_000 ) {
        $var .= "xxx";
    }
}
my $bla = <STDIN>;   # keep the script alive so you can inspect its memory usage
Hello Carl,
Nope, that's right. So you load up one image; the perl process
allocates itself 100MB of memory for it from the OS, then doesn't
release it back to the OS once it's finished with it.
The perl process will re-use this memory, so if you process another
image you don't grab another 100MB.
In (e.g.) the worker MPM, each process contains its own perl interpreter,
so if each process handles one image once in its lifetime, there is a
lot of memory that has been grabbed by perl which is not available for
creating more perl processes.
... is what makes sense to me but may be utterly meaningless.
> This revelation of how Perl does not free up memory it allocates is
> worrying, especially as I do process large documents regularly.
>
> If I read you right, you are saying that $r->child_terminate will force
> the current thread to terminate, causing Apache to create a new thread.
> Is that right?
Hey Carl,
The only place where forking is useful is where you want something to
continue processing after sending the response back to the client.
You can achieve the same effective result by calling
$r->child_terminate()
(assuming you're using pre-fork). The currently running child exits at
the end of the current request.
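As a sketch of that approach (hypothetical package name, not from this thread; assumes the mod_perl 1.x pre-fork API, where child_terminate() is a method on the Apache request object), such a handler might look like:

```perl
package My::ImageHandler;           # hypothetical name, for illustration only
use strict;
use warnings;
use Apache::Constants qw(OK);

sub handler {
    my $r = shift;

    # ... generate and send the large image here ...

    # Ask Apache to retire this child once the response is complete;
    # the pre-fork parent spawns a fresh, small child to replace it,
    # so the memory grabbed for the image goes back to the OS.
    $r->child_terminate();
    return OK;
}

1;
```

The client still gets the full response first; only afterwards does the child exit, so there is no visible latency cost for the user.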
I use Image::Imlib2 for on the fly image creation and it works fine for me.
After the thumbnails are created, memory is restored to normal size.
Image::Imlib2 is also very fast and easy to code.
For filetypes that are not supported by imlib2, I use ImageMagick, which
uses some more memory.
i generally don't like to do that in modperl unless i have enough
webservers. i think its a bit of a waste of the mp resource.
i do this:
on upload, generate a thumbnail w/Image::Epeg (which is a super fast
perl binding for epeg; previously i compiled my own epeg tools and
the perl module myself).
Can you fork off a separate process to do this (that will die once it is
completed)?
On 3/27/06 6:21 AM, "Tom Schindl" <[EMAIL PROTECTED]> wrote:
Please note that this is not only true for image creation but also if
you are storing large string contents in a variable (e.g. after
processing a file-upload).
Tom
Frank Maas wrote:
> Tom,
>
>
>> As a sidenote, often it is not really desired/dangerous to run image
>> creation as a mod_perl handler
Well, image creation in mod_perl is not a bad idea if you ensure that the
process is killed after it exceeds a certain memory size. The problem in
the case of mod_perl/httpd is that every httpd process ends up eating 20
MB of space only because it once created an image. You must ensure
that only v
Tom,
> As a sidenote, often it is not really desired/dangerous to run image
> creation as a mod_perl handler because of the nature of perl: memory
> allocated once is never freed until the process shuts down or is killed
> (by your Apache::SizeLimit handler).
Ah, you got me worried there... How in
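For reference, the Apache::SizeLimit handler mentioned above is typically wired up roughly like this (mod_perl 1 style; the size threshold here is an example value, tune it to your workload):

```perl
# In startup.pl:
use Apache::SizeLimit;

# Retire a child once it grows beyond ~64 MB (value is in KB),
# so one big image request cannot pin memory forever.
$Apache::SizeLimit::MAX_PROCESS_SIZE = 64 * 1024;

# And in httpd.conf:
#   PerlCleanupHandler Apache::SizeLimit
```

The check runs after each response, so a child that balloons while building an image serves its request normally and is then replaced by a fresh, small child.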