Re: Please help: perl run out of memory

2022-04-27 Thread David Emanuel da Costa Santiago
At 11:33 on 17/04/22, wilson wrote: hello the experts, can you help check my script for how to optimize it? currently it was going as "run out of memory". $ perl count.pl Out of memory! Killed My script: use strict; my %hash; my %stat; To be honest you don't need the

Re: Please help: perl run out of memory

2022-04-26 Thread hw
On Sun, 2022-04-17 at 17:33 +0800, wilson wrote: > hello the experts, > > can you help check my script for how to optimize it? > currently it was going as "run out of memory". > > $ perl count.pl > Out of memory! > Killed I would use a database like Mari

Re: Please help: perl run out of memory

2022-04-22 Thread David Precious
On Thu, 21 Apr 2022 07:12:07 -0700 al...@coakmail.com wrote: > OP maybe need the streaming IO for reading files. Which is what they were already doing - they used: while () { ... } Which, under the hood, uses readline, to read a line at a time. (where "HD" is their global fileh
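
A minimal sketch of the line-at-a-time pattern described here, using a lexical filehandle in place of the OP's global HD; the input file name and the counting logic are assumptions, since count.pl is only partly quoted:

    use strict;
    use warnings;

    my $file = 'items.txt';   # hypothetical input; the OP's real file isn't shown
    open my $fh, '<', $file or die "Cannot open $file: $!";

    my %count;
    while (my $line = <$fh>) {   # readline under the hood: one line in memory at a time
        chomp $line;
        $count{$line}++;         # only the distinct keys accumulate, not the whole file
    }
    close $fh;

    printf "%s\t%d\n", $_, $count{$_} for sort keys %count;

Note that streaming reads alone don't bound memory: with tens of millions of distinct keys the hash itself can still exhaust RAM, which is the direction the rest of this thread takes.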

Re: Please help: perl run out of memory

2022-04-21 Thread alice
OP maybe need the streaming IO for reading files. Thanks On 2022-04-21 21:56, David Precious wrote: > On Thu, 21 Apr 2022 17:26:15 +0530 > "M.N Thanishka sree Manikandan" wrote: > >> Hi wilson >> Try this module file::slurp > > Given that the OP is running into memory issues processing an 80

Re: Please help: perl run out of memory

2022-04-21 Thread David Precious
On Thu, 21 Apr 2022 17:26:15 +0530 "M.N Thanishka sree Manikandan" wrote: > Hi wilson > Try this module file::slurp Given that the OP is running into memory issues processing an 80+ million line file, I don't think suggesting a CPAN module designed to read the entire contents of a file into mem

Re: Please help: perl run out of memory

2022-04-21 Thread M.N Thanishka sree Manikandan
Hi wilson Try this module file::slurp Regards, Manikandan On Sun, 17 Apr, 2022, 15:03 wilson, wrote: > hello the experts, > > can you help check my script for how to optimize it? > currently it was going as "run out of memory". > > $ perl count.pl > Out of m

Re: Please help: perl run out of memory

2022-04-21 Thread Adriel Peng
I am not sure, but can Tie::Hash etc be used by tying hash to a local file to reduce the memory use? regards.
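
A sketch of the tie-to-disk idea being asked about, using DB_File (one of several modules that can back a tied hash with a file); file names here are hypothetical:

    use strict;
    use warnings;
    use DB_File;
    use Fcntl;

    # Keys and values live in a Berkeley DB file on disk,
    # so only a cache stays in RAM instead of the whole hash.
    tie my %count, 'DB_File', 'count.db', O_CREAT | O_RDWR, 0644, $DB_HASH
        or die "Cannot tie count.db: $!";

    open my $fh, '<', 'items.txt' or die "Cannot open items.txt: $!";
    while (my $line = <$fh>) {
        chomp $line;
        $count{$line}++;    # each update goes through the tie to disk
    }
    close $fh;

    untie %count;           # flush and close the DB file

The trade-off is speed: every hash access becomes a disk operation, which is usually still far faster than swapping or being killed by the OOM killer.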

Re: Please help: perl run out of memory

2022-04-18 Thread Rob Coops
>> hello the experts, >> >> can you help check my script for how to optimize it? >> currently it was going as "run out of memory". >> >> $ perl count.pl >> Out of memory! >> Killed >> >> >> My script: >> use strict; >

Re: Please help: perl run out of memory

2022-04-17 Thread David Mertens
make sense? I could bang out some code illustrating what I mean if that would help. David On Sun, Apr 17, 2022, 5:33 AM wilson wrote: > hello the experts, > > can you help check my script for how to optimize it? > currently it was going as "run out of memory". > > $ p

Re: DBD::mysqlPP is giving out of memory exception and ODBC is working fine.

2013-05-19 Thread Ganesh Babu N
Dear Shlomi Fish, . Thank you for pointing to the right website I will follow the rules. The above code is only part of the large program. I have posted only the Database related portion only. Regards, Ganesh On Sun, May 19, 2013 at 1:38 PM, Shlomi Fish wrote: > Hi Ganesh, > > On Sat, 18 May 2

Re: DBD::mysqlPP is giving out of memory exception and ODBC is working fine.

2013-05-19 Thread Shlomi Fish
Hi Ganesh, On Sat, 18 May 2013 20:11:09 +0530 Ganesh Babu N wrote: > $dbh = DBI->connect("dbi:mysqlPP:$dsn;host=$host", $user, $pw, > {PrintError => 1, RaiseError => 1}); > if (!$dbh) { > print "error: connection: $DBI::err\n$DBI::errstr\n$DBI::state\n"; > } > $drh = DBI->install_driver("mys

DBD::mysqlPP is giving out of memory exception and ODBC is working fine.

2013-05-18 Thread Ganesh Babu N
$dbh = DBI->connect("dbi:mysqlPP:$dsn;host=$host", $user, $pw, {PrintError => 1, RaiseError => 1}); if (!$dbh) { print "error: connection: $DBI::err\n$DBI::errstr\n$DBI::state\n"; } $drh = DBI->install_driver("mysqlPP"); $ary_ref = $dbh->selectcol_arrayref("SELECT pui,spuid FROM inven WHERE dis

Re: Out of memory, HTML::TableExtract

2011-01-27 Thread C.DeRykus
On Jan 27, 3:29 am, jinstho...@gmail.com (Jins Thomas) wrote: > On Thu, Jan 27, 2011 at 4:44 PM, C.DeRykus wrote: > > On Jan 26, 11:28 pm, jinstho...@gmail.com (Jins Thomas) wrote: > > > > Hi DeRykus > > > > Sorry for replying late. > > > > I was able to  test DB_File with your example, thanks. Bu

Re: Out of memory, HTML::TableExtract

2011-01-27 Thread Jins Thomas
On Thu, Jan 27, 2011 at 4:44 PM, C.DeRykus wrote: > On Jan 26, 11:28 pm, jinstho...@gmail.com (Jins Thomas) wrote: >> > > Hi DeRykus > > > > Sorry for replying late. > > > > I was able to test DB_File with your example, thanks. But i'm facing > > a problem. I'm not able to access multi dimension

Re: Out of memory, HTML::TableExtract

2011-01-27 Thread C.DeRykus
On Jan 26, 11:28 pm, jinstho...@gmail.com (Jins Thomas) wrote: > Hi DeRykus > > Sorry for replying late. > > I was able to  test DB_File with your example, thanks. But i'm facing > a problem. I'm not able to access multi dimensional array with this > DB_File. Address is being stored just a string.
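
The reply is cut off here, but one common way around the "stored as just a string" problem is to layer a serializer over the DBM tie, for example MLDBM; the module choice is an assumption, not necessarily what was suggested downthread:

    use strict;
    use warnings;
    use MLDBM qw(DB_File Storable);   # serialize values with Storable into a DB_File store
    use Fcntl;

    tie my %db, 'MLDBM', 'tables.db', O_CREAT | O_RDWR, 0644
        or die "Cannot tie tables.db: $!";

    # Store a nested structure (e.g. one row of an HTML table) under a key.
    $db{row_1} = [ 'Alice', '42 Main St', 'x1234' ];

    # MLDBM caveat: fetch, modify, and store the whole value back;
    # in-place modification of the nested data is not seen by the tie.
    my $row = $db{row_1};
    push @$row, 'extra column';
    $db{row_1} = $row;

    untie %db;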

Re: Out of memory, HTML::TableExtract

2011-01-26 Thread Jins Thomas
nal arrays (like two dimensional array from html tables) Thanks Jins Thomas On Sat, Jan 8, 2011 at 10:45 AM, C.DeRykus wrote: > On Jan 5, 10:56 pm, jinstho...@gmail.com (Jins Thomas) wrote: >> Hi experts, >> >> Have you ever experienced Out of memory problem while using &g

Re: How to avoid Out of Memory Errors when dealing with a large XML file?

2011-01-15 Thread Saqib Ali
Thanks! This workaround worked for me. :) :) - Saqib On Tue, Jan 11, 2011 at 8:26 AM, Bob McConnell wrote: > From: Saqib Ali > > > I'm reading a large (57 MB) XML file Using XML::XPath::XMLParser() > > > > I keep getting this error: > > > > "Callback called exit at XML/XPath/Node/Element.pm

Re: How to avoid Out of Memory Errors when dealing with a large XML file?

2011-01-14 Thread Jenda Krynicky
From: Saqib Ali > I'm reading a large (57 MB) XML file Using XML::XPath::XMLParser() > > I keep getting this error: > > "Callback called exit at XML/XPath/Node/Element.pm at line 144 during > global destruction." > > I'm using Windows XP. So I watched the task-management memory meter > during t

RE: How to avoid Out of Memory Errors when dealing with a large XML file?

2011-01-11 Thread Bob McConnell
From: Saqib Ali > I'm reading a large (57 MB) XML file Using XML::XPath::XMLParser() > > I keep getting this error: > > "Callback called exit at XML/XPath/Node/Element.pm at line 144 during > global destruction." > > I'm using Windows XP. So I watched the task-management memory meter > during t

How to avoid Out of Memory Errors when dealing with a large XML file?

2011-01-11 Thread Saqib Ali
Hi. I'm reading a large (57 MB) XML file Using XML::XPath::XMLParser() I keep getting this error: "Callback called exit at XML/XPath/Node/Element.pm at line 144 during global destruction." I'm using Windows XP. So I watched the task-management memory meter during the execution of this process.
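
The workaround that eventually worked is not visible in these snippets; one commonly used alternative for files this size is an incremental parser that discards each record after handling it, such as XML::Twig. A sketch under the assumption that the document contains repeated <record> elements (the element and field names are hypothetical):

    use strict;
    use warnings;
    use XML::Twig;

    my $twig = XML::Twig->new(
        twig_handlers => {
            record => sub {                       # called for every <record> as it is parsed
                my ($t, $elt) = @_;
                print $elt->field('title'), "\n"; # hypothetical child element
                $t->purge;                        # free everything parsed so far
            },
        },
    );
    $twig->parsefile('big.xml');                  # memory stays roughly one record's worth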

Re: Out of memory, HTML::TableExtract

2011-01-07 Thread C.DeRykus
On Jan 5, 10:56 pm, jinstho...@gmail.com (Jins Thomas) wrote: > Hi experts, > > Have you ever experienced Out of memory problem while using > HTML::TableExtract. I'm having little large html files, still i didn't > expect this to happen > > Would you be able to su

Re: Out of memory, HTML::TableExtract

2011-01-06 Thread C.DeRykus
On Jan 5, 10:56 pm, jinstho...@gmail.com (Jins Thomas) wrote: > Hi experts, > > Have you ever experienced Out of memory problem while using > HTML::TableExtract. I'm having little large html files, still i didn't > expect this to happen > If the html files are really b

Re: Out of memory, HTML::TableExtract

2011-01-06 Thread Parag Kalra
ne your thing > and it remains in memory? > > On 2011-01-06 02:26:13 -0500, Jins Thomas said: > >> Hi experts, >> >>

Re: Out of memory, HTML::TableExtract

2011-01-06 Thread Robert
Maybe because you aren't closing each file after you have done your thing and it remains in memory? On 2011-01-06 02:26:13 -0500, Jins Thomas said: Hi experts, Have you ever experienced Out of memory pr

Out of memory, HTML::TableExtract

2011-01-05 Thread Jins Thomas
Hi experts, Have you ever experienced Out of memory problem while using HTML::TableExtract. I'm having little large html files, still i didn't expect this to happen Would you be able to suggest some workarounds for this. I'm using this subroutine in another for loop. sub
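
Whether unreleased parser objects are the cause here can't be told from the snippet, but scoping one HTML::TableExtract per file keeps each parse from accumulating across the outer loop; a sketch with hypothetical file names and headers:

    use strict;
    use warnings;
    use HTML::TableExtract;

    for my $file (glob 'reports/*.html') {          # hypothetical input files
        my $te = HTML::TableExtract->new( headers => [ 'Name', 'Count' ] );
        $te->parse_file($file);                     # parse this one file only

        for my $table ($te->tables) {
            for my $row ($table->rows) {
                print join(',', map { defined $_ ? $_ : '' } @$row), "\n";
            }
        }
        # $te goes out of scope here, so the parsed tables can be freed
    }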

Re: Out of Memory!

2010-10-13 Thread Chas. Owens
On Wed, Oct 13, 2010 at 04:41, Panda-X wrote: snip > From my observation, the Out of Memory error could happen anytime, where > the "MEM" size was never over 10K. > > Any clues ? snip Then the problem is unlikely to be that message. The two most common things that will c

Re: Out of Memory!

2010-10-13 Thread Rob Coops
On Wed, Oct 13, 2010 at 10:41 AM, Panda-X wrote: > > > 2010/10/13 Rob Coops > > >> >> On Wed, Oct 13, 2010 at 8:42 AM, Panda-X wrote: >> >>> Hi List, >>> >>> My script is running on WinXP, Dos shell, which is a server program. &g

Re: Out of Memory!

2010-10-13 Thread Panda-X
2010/10/13 Rob Coops > > > On Wed, Oct 13, 2010 at 8:42 AM, Panda-X wrote: > >> Hi List, >> >> My script is running on WinXP, Dos shell, which is a server program. >> >> It works fine, unless it will pops-up a "Out of Memory!" and stop runni

Re: Out of Memory!

2010-10-13 Thread Rob Coops
On Wed, Oct 13, 2010 at 8:42 AM, Panda-X wrote: > Hi List, > > My script is running on WinXP, Dos shell, which is a server program. > > It works fine, unless it will pops-up a "Out of Memory!" and stop running > per each few days! > > Could someone tell me this

Out of Memory!

2010-10-12 Thread Panda-X
Hi List, My script is running on WinXP, Dos shell, which is a server program. It works fine, unless it will pops-up a "Out of Memory!" and stop running per each few days! Could someone tell me this error message is coming from Perl ? or from the Dos shell ? Thanks! ex

Re: Out of Memory error with large Oracle DB extraction

2009-11-25 Thread Rene Schickbauer
Hi! I've done a lot of DBD::Oracle programming for over 10 years and never met a situation where I needed to do that. I can safely say that your program will be better off for using some sort of while ( ...->fetch ) loop. I also use DBD::Oracle in one of my projects (ActivePerl 5.10). I get
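
A minimal sketch of the row-at-a-time fetch loop being recommended, as opposed to pulling all ~2 million rows into one structure; the connection details, table, and columns are placeholders:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Oracle:MYDB', 'user', 'password',
                           { RaiseError => 1 })
        or die $DBI::errstr;

    my $sth = $dbh->prepare('SELECT id, payload FROM big_table');
    $sth->execute;

    while (my ($id, $payload) = $sth->fetchrow_array) {
        # handle one row at a time; only the current row is held in Perl
        print "$id\n";
    }

    $sth->finish;
    $dbh->disconnect;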

Re: Out of Memory error with large Oracle DB extraction

2009-11-19 Thread Peter Scott
On Thu, 19 Nov 2009 01:07:25 -0700, Dan Fish wrote: > I'm working on a project that requires some rather large extractions > from an Oracle DB (about 2 million rows) and while monitoring the task > manager, I get an "Out of memory error" at about 2GB of mem usage. [...] &

Re: Out of Memory error with large Oracle DB extraction

2009-11-19 Thread Rob Coops
On Thu, Nov 19, 2009 at 9:07 AM, Dan Fish wrote: > I'm working on a project that requires some rather large extractions from > an > Oracle DB (about 2 million rows) and while monitoring the task manager, I > get an "Out of memory error" at about 2GB of mem usage. >

Out of Memory error with large Oracle DB extraction

2009-11-19 Thread Dan Fish
I'm working on a project that requires some rather large extractions from an Oracle DB (about 2 million rows) and while monitoring the task manager, I get an "Out of memory error" at about 2GB of mem usage. Client System Specifics: Win 2K3 Server 32-Bit SP2 Intel Xeon 2.5

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Robert H
On 9/20/09 12:17 PM, Shawn H Corey wrote: Rodrick Brown wrote: These disclaimers are requirements for anyone working in the securities industry. There isn't much the poster can do about this and shouldn't be bashed for this. Many of these disclaimers are automatically appended to everyones out g

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Shawn H Corey
Shlomi Fish wrote: I'm not sure whether we should discuss it so exhaustively. Many workplaces add it to their employee's E-mails, and it is a common practice, and it gets posted to public mailing lists with publicly accessible archives, and it may be legally iffy - but that's life. And it's mor

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Shlomi Fish
On Sunday 20 Sep 2009 19:12:31 Shawn H Corey wrote: > Telemachus wrote: > > On Sun Sep 20 2009 @ 10:13, Shawn H Corey wrote: > >> Telemachus wrote: > >>> Ok, I'll bite: do you really mean to say that it's a crime somewhere to > >>> put this bullshit drivel into an email and then send that mail to a

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Telemachus
On Sun Sep 20 2009 @ 12:17, Shawn H Corey wrote: > Rodrick Brown wrote: > >These disclaimers are requirements for anyone working in the > >securities industry. There isn't much the poster can do about this and > >shouldn't be bashed for this. Many of these disclaimers are > >automatically appended

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread jm
On Sun, Sep 20, 2009 at 11:12 AM, Shawn H Corey wrote: > Telemachus wrote: > >> On Sun Sep 20 2009 @ 10:13, Shawn H Corey wrote: >> >>> Telemachus wrote: >>> Ok, I'll bite: do you really mean to say that it's a crime somewhere to put this bullshit drivel into an email and then send

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Shawn H Corey
Rodrick Brown wrote: These disclaimers are requirements for anyone working in the securities industry. There isn't much the poster can do about this and shouldn't be bashed for this. Many of these disclaimers are automatically appended to everyones out going emails so there isn't really anything

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Shawn H Corey
Telemachus wrote: On Sun Sep 20 2009 @ 10:13, Shawn H Corey wrote: Telemachus wrote: Ok, I'll bite: do you really mean to say that it's a crime somewhere to put this bullshit drivel into an email and then send that mail to a public list? Annoying, sure. Pointless, sure. Non-binding, sure. But a

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Rodrick Brown
These disclaimers are requirements for anyone working in the securities industry. There isn't much the poster can do about this and shouldn't be bashed for this. Many of these disclaimers are automatically appended to everyones out going emails so there isn't really anything he could have done. No

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Telemachus
On Sun Sep 20 2009 @ 10:13, Shawn H Corey wrote: > Telemachus wrote: > >Ok, I'll bite: do you really mean to say that it's a crime somewhere to put > >this bullshit drivel into an email and then send that mail to a public > >list? Annoying, sure. Pointless, sure. Non-binding, sure. But a crime? > >

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Shawn H Corey
Telemachus wrote: Ok, I'll bite: do you really mean to say that it's a crime somewhere to put this bullshit drivel into an email and then send that mail to a public list? Annoying, sure. Pointless, sure. Non-binding, sure. But a crime? Speeding is illegal but not a crime. Not all laws are par

Re: "Tell your lawyers" [Was - Re: "Out Of Memory"]

2009-09-20 Thread Telemachus
On Sun Sep 20 2009 @ 9:01, Shawn H Corey wrote: > Ajay Kumar wrote: > >__ > >This communication contains information which is confidential. It is for the > >exclusive use of the intended recipient(s). If you are not the intended >

Re: "Out Of Memory"

2009-09-20 Thread Shawn H Corey
Ajay Kumar wrote: can anybody tell me what is the issues Please How big are the files? How much RAM do you have? How big is your swap? __ This communication contains information which is confidential. It is for the exclusiv

"Out Of Memory"

2009-09-20 Thread Ajay Kumar
Hi guys, I am working on a script that parses Excel sheets and writes the data into another Excel sheet. We have more than 40 Excel files to parse. When I am parsing the Excel sheets I am getting an out of memory error; the out of memory issue comes after some files have been parsed, if I have parsed 5 or 6

Re: out of memory

2008-11-01 Thread Dr.Ruud
Alert: this is also posted (and being replied on) in news:comp.lang.perl.misc article: news:[EMAIL PROTECTED] -- Affijn, Ruud "Gewoon is een tijger." -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED] http://learn.perl.org/

out of memory

2008-10-31 Thread [EMAIL PROTECTED]
Hi, I want to parse large log file (in GBs) and I am readin 2-3 such files in hash array. But since it will very big hash array it is going out of memory. what are the other approach I can take. Example code: open ($INFO, '<', $file) or die "Cannot open $file :$!
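
A sketch of the usual alternative: keep the per-line loop but store only the aggregate you need (here a count per key) rather than the lines themselves. The field layout is an assumption, since the log format isn't shown:

    use strict;
    use warnings;

    my %seen_count;
    for my $file (@ARGV) {                       # the 2-3 large log files
        open my $info, '<', $file or die "Cannot open $file: $!";
        while (my $line = <$info>) {
            my ($key) = split /\s+/, $line;      # assumed: key is the first whitespace-separated field
            $seen_count{$key}++;                 # aggregate instead of storing every line
        }
        close $info;
    }

    print scalar(keys %seen_count), " distinct keys\n";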

Out of memory! Callback called exit. END failed--call queue aborted

2008-06-22 Thread Ravi Malghan
Hi: I have a simple script which connects to a few databases. When I run the script I get the "Out of memory! Callback called exit. END failed--call queue aborted." error. It runs fine on other similar machines. Below is an example. As soon as I add the 3rd database connection it sta

Re: out of memory problem

2007-02-14 Thread John W. Krahn
Arantxa Otegi wrote: > I have memory problems programming with perl: "out of memory!" > > I have to process a lot of xml files which are in different directories > (more than 2 files in 110 directories). The files are quite small > (almost all of them are smaller th

Re: out of memory problem

2007-02-14 Thread Rob Dixon
Arantxa Otegi wrote: I have memory problems programming with perl: "out of memory!" I have to process a lot of xml files which are in different directories (more than 2 files in 110 directories). The files are quite small (almost all of them are smaller than 100KB). Here is

Re: out of memory problem

2007-02-14 Thread Rob Dixon
Ken Foskey wrote: On Wed, 2007-02-14 at 12:37 +0100, Arantxa Otegi wrote: I have memory problems programming with perl: "out of memory!" I have to process a lot of xml files which are in different directories (more than 2 files in 110 directories). The files are quite small (

Re: out of memory problem

2007-02-14 Thread D. Bolliger
Arantxa Otegi am Mittwoch, 14. Februar 2007 12:37: > I have memory problems programming with perl: "out of memory!" > > I have to process a lot of xml files which are in different directories > (more than 2 files in 110 directories). The files are quite small > (almos

Re: out of memory problem

2007-02-14 Thread Ken Foskey
On Wed, 2007-02-14 at 12:37 +0100, Arantxa Otegi wrote: > I have memory problems programming with perl: "out of memory!" > > I have to process a lot of xml files which are in different directories > (more than 2 files in 110 directories). The files are quite small >

out of memory problem

2007-02-14 Thread Arantxa Otegi
I have memory problems programming with perl: "out of memory!" I have to process a lot of xml files which are in different directories (more than 2 files in 110 directories). The files are quite small (almost all of them are smaller than 100KB). Here is some code:

Re: Out of memory!, while extending scalar with vec()

2006-12-04 Thread Jenda Krynicky
From: "kyle cronan" <[EMAIL PROTECTED]> > On 12/3/06, Bill Jones <[EMAIL PROTECTED]> wrote: > > On 12/3/06, kyle cronan <[EMAIL PROTECTED]> wrote: > > > (1<<$ARGV[0]) > > > > Just a thought - > > > > The argument you are passing is really the two's complement; so you > > are really passing 256M (no

Re: Out of memory!, while extending scalar with vec()

2006-12-04 Thread Paul Johnson
my $foo = ''; > vec($foo, (1<<$ARGV[0])-1, 8)=1; > > I can run this with the argument 27, and it works just fine. Perl > uses a bit more than 128 MB virtual memory before exiting, just as one > would expect. When I run the program with n=28 (perl 5.8.8, linux), I > ge
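
For reference, the quoted test case in runnable form; with BITS=8 the offset is a byte index, so writing element (1<<$n)-1 forces the scalar to grow to 2**$n bytes (128 MB at n=27, 256 MB at n=28), which is where the allocation fails for the original poster:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $n = defined $ARGV[0] ? $ARGV[0] : 27;   # 27 works for the OP; 28 triggers "Out of memory!"
    my $foo = '';
    vec($foo, (1 << $n) - 1, 8) = 1;            # write the last byte, allocating the whole string
    printf "allocated %d bytes\n", length $foo;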

Re: Out of memory!, while extending scalar with vec()

2006-12-03 Thread Tom Phoenix
On 12/3/06, kyle cronan <[EMAIL PROTECTED]> wrote: I have plenty of virtual memory, so there's no reason a malloc would fail that I can think of. Should I submit this with perlbug? Yes. Cheers! --Tom Phoenix Stonehenge Perl Training -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additio

Re: Out of memory!, while extending scalar with vec()

2006-12-03 Thread kyle cronan
On 12/3/06, Bill Jones <[EMAIL PROTECTED]> wrote: On 12/3/06, kyle cronan <[EMAIL PROTECTED]> wrote: > (1<<$ARGV[0]) Just a thought - The argument you are passing is really the two's complement; so you are really passing 256M (not 28) to the vec statement. So what you're saying is I should be

Re: Out of memory!, while extending scalar with vec()

2006-12-03 Thread Bill Jones
On 12/3/06, kyle cronan <[EMAIL PROTECTED]> wrote: (1<<$ARGV[0]) Just a thought - The argument you are passing is really the two's complement; so you are really passing 256M (not 28) to the vec statement. -- WC (Bill) Jones -- http://youve-reached-the.endoftheinternet.org/ -- To unsubscribe,

Out of memory!, while extending scalar with vec()

2006-12-03 Thread kyle cronan
nd it works just fine. Perl uses a bit more than 128 MB virtual memory before exiting, just as one would expect. When I run the program with n=28 (perl 5.8.8, linux), I get: Out of memory! I have plenty of virtual memory, so there's no reason a malloc would fail that I can think of. From

SOLVED - Re: Out of memory! - from CPAN

2006-03-16 Thread Jerry K
n the Changes files. I would like to say thank you again to everyone who has replied to my questions and offered suggestions. Jerry Tom Phoenix wrote: On 3/15/06, Jerry Kemp <[EMAIL PROTECTED]> wrote: # perl -MCPAN -e shell cpan> reload index Out of memory! Was your perl compil

Re: Out of memory! - from CPAN

2006-03-16 Thread Jerry K
5/06, Jerry Kemp <[EMAIL PROTECTED]> wrote: # perl -MCPAN -e shell cpan> reload index Out of memory! Was your perl compiled to use your system's malloc(), or Perl's own? You can find out with a command like this one: perl -MConfig -lwe 'print $Config{usemymalloc}'

Re: Out of memory! - from CPAN

2006-03-16 Thread Jay Savage
On 3/15/06, Tom Phoenix <[EMAIL PROTECTED]> wrote: > On 3/15/06, Jerry Kemp <[EMAIL PROTECTED]> wrote: > > > # perl -MCPAN -e shell > > cpan> reload index > > Out of memory! > > Was your perl compiled to use your system's malloc(), or Perl'

Re: Out of memory! - from CPAN

2006-03-15 Thread Tom Phoenix
On 3/15/06, Jerry Kemp <[EMAIL PROTECTED]> wrote: > # perl -MCPAN -e shell > cpan> reload index > Out of memory! Was your perl compiled to use your system's malloc(), or Perl's own? You can find out with a command like this one: perl -MConfig -lwe 'print $Co

RE: Out of memory! - from CPAN

2006-03-15 Thread Keenan, Greg John (Greg)** CTR **
ctions are in place then all those resources may not be available. Check your syslogs and dmesg for entries when you get the "out of memory" errors. Good luck, Greg. -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED] <http://learn.perl.or

Re: Out of memory! - from CPAN

2006-03-15 Thread Jerry Kemp
etty standard practice. I have cleaned out my .cpan subdirectory in the hopes that it would be "just that simple", unfortunately, it wasn't. > > 5) Look into other changes you've made to your system lately. Are you > having trouble building anything else? Have you upgra

Re: Out of memory! - from CPAN

2006-03-15 Thread Jerry Kemp
Hello Shawn, Thank you for your reply. Mr. Shawn H. Corey wrote: I can do this also. And for modules with none to just a few dependencies, this is an acceptable work-around. For modules with deep dependencies, it takes a long time to get everything worked out. Hopefully, this explains why I

Re: Out of memory! - from CPAN

2006-03-15 Thread Jerry Kemp
Hello John, Thank you for the correction, Jerry John W. Krahn wrote: Owen Cook wrote: You could try comp.lang.misc.perl That should be comp.lang.perl.misc :-) John -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]

Re: Out of memory! - from CPAN

2006-03-15 Thread Jerry Kemp
grows, it starts with just the big ones, then I can't do the small ones, then at some point, I can't even do a 'reload index' without the out of memory error. What does 'top' say before you start cpan, what does it say after starting cpan? Here is a 'CPAN

Re: Out of memory! - from CPAN

2006-03-15 Thread Jay Savage
On 3/14/06, Jerry K <[EMAIL PROTECTED]> wrote: > Hello, > > I asked this question, and other than one response (thanks Owen!), > things have been pretty silent on this issue. > > As I am a recent subscriber, maybe this is not the place to post this > type of a request. > > Is there a better place t

Re: Out of memory! - from CPAN

2006-03-15 Thread Mr. Shawn H. Corey
Jerry K wrote: Hello, I asked this question, and other than one response (thanks Owen!), things have been pretty silent on this issue. As I am a recent subscriber, maybe this is not the place to post this type of a request. Is there a better place that someone might aim me to resolve this

Re: Out of memory! - from CPAN

2006-03-14 Thread John W. Krahn
Owen Cook wrote: > > You could try comp.lang.misc.perl That should be comp.lang.perl.misc :-) John -- use Perl; program fulfillment -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]

Re: Out of memory! - from CPAN

2006-03-14 Thread Owen Cook
On Tue, 14 Mar 2006, Jerry K wrote: > Hello, > > I asked this question, and other than one response (thanks Owen!), > things have been pretty silent on this issue. > > As I am a recent subscriber, maybe this is not the place to post this > type of a request. > > Is there a better place that

Re: Out of memory! - from CPAN

2006-03-14 Thread Jerry K
for any replies, Jerry K Jerry K wrote: Hello Owen, thank you for your reply. When I started over again, yes, I did remove .cpan . I have set .cpan to 20 Mb. My .cpan/build directory was at 21 Mb. I cleaned that directory out. I still get the "Out of memory!" error. Jerry

Re: Out of memory! - from CPAN

2006-03-12 Thread Jerry K
Hello Owen, thank you for your reply. When I started over again, yes, I did remove .cpan . I have set .cpan to 20 Mb. My .cpan/build directory was at 21 Mb. I cleaned that directory out. I still get the "Out of memory!" error. Jerry K Owen Cook wrote: On Sun, 12 Mar 200

Re: Out of memory! - from CPAN

2006-03-12 Thread Owen Cook
ploration and modules installation (v1.86) > ReadLine support enabled > > cpan> reload index > CPAN: Storable loaded ok > Going to read /usr/local/.cpan/Metadata >Database was generated on Wed, 22 Feb 2006 09:18:20 GMT > CPAN: LWP::UserAgent loaded ok > Out of memor

Out of memory! - from CPAN

2006-03-12 Thread Jerry K
rence. In my attempt, I yahoo'ed and googled my error message, and also searched the archives at CPAN. I found several others who were experiencing similar issues, but no one had resolved this issue, or at least if they had, they didn't post their fix on line. Posted below is my

Re: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread David Van Ginneken
to generate results file > # > my $oBook = new Spreadsheet::ParseExcel::Workbook->Parse($localfile); > my $oWkS = ${$oBook->{Worksheet}}[0]; > > print "-- SHEET: ".$oWkS->{Name}. "\n"; > print "Row:

Re: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread Craig Moynes
> > > $oBook = new > > > Spreadsheet::ParseExcel::Workbook->Parse($localfile); my > > > ($iR, $iC, $oWkS, $oWkC); $oWkS = ${$oBook->{Worksheet}}[0]; > > > > > > print "-- SHEET: ".$oWkS->{Name

Re: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread Craig Moynes
print "-- SHEET: ".$oWkS->{Name}. "\n"; print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n"; $resultMessage.=sprintf("%s,%s\n",basename($localfile),$oWkS->{MaxRow}); } But I still get an out of memory er

RE: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread Bakken, Luke
> print "-- SHEET: ".$oWkS->{Name}. "\n"; > print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n"; > > $resultMessage.=basename($localfile).",".$oWkS->{MaxRow}."\n"; } >

Re: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread Jay Savage
t;{Name}. "\n"; > print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n"; > $resultMessage.=basename($localfile).",".$oWkS->{MaxRow}."\n"; > } > > The problem I am running into is after 10 files (i

Re: Spreadsheet::ParseExcel - Out of memory error

2005-04-27 Thread Craig Moynes
On 4/26/05, Wagner, David --- Senior Programmer Analyst --- WGO <[EMAIL PROTECTED]> wrote: > Craig Moynes wrote: > > Hi All, > > I am using the spreadsheet::parseexcel module to open up a series (31) > > spreadsheets and grab the row counts. > > > > Here is an excerpt with the ParseExcel Code. > >

RE: Spreadsheet::ParseExcel - Out of memory error

2005-04-26 Thread Wagner, David --- Senior Programmer Analyst --- WGO
print "Row: ".$oWkS->{MinRow}." v ".$oWkS->{MaxRow}."\n"; > > $resultMessage.=basename($localfile).",".$oWkS->{MaxRow}."\n"; } > > The problem I am running into is after 10 files (in testing all 31 > files are

Spreadsheet::ParseExcel - Out of memory error

2005-04-26 Thread Craig Moynes
axRow}."\n"; } The problem I am running into is after 10 files (in testing all 31 files are the same source file with different names), and then I get an out of memory error. Anyone have any idea how I can clean out the memory. I have a feeling it might be some autocaching or somethin

Spreadsheet::ParseExcel - Out of memory error

2005-04-26 Thread Craig Moynes
I have a piece of code that needs to open 31 excel spreadsheets, find out the row count for each. When it runs it gets about 10 files in and crashes with an Out of memory error. Here is an excerpt from the code: my $oBook; my $oWks; my ($iR, $iC, $oWkS, $oWkC); foreach $hashEntry ( @LOGS
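
A sketch of one way to keep the loop from accumulating workbooks: drop the references to each workbook before moving on to the next file, following the old-style API quoted in this thread. Whether lingering workbook objects are the actual leak here is an assumption; Spreadsheet::ParseExcel is known to use a lot of memory per workbook:

    use strict;
    use warnings;
    use Spreadsheet::ParseExcel;
    use File::Basename;

    my $result = '';
    for my $localfile (glob '*.xls') {                      # hypothetical file list
        my $oBook = Spreadsheet::ParseExcel::Workbook->Parse($localfile);
        my $oWkS  = ${ $oBook->{Worksheet} }[0];

        $result .= sprintf "%s,%s\n", basename($localfile), $oWkS->{MaxRow};

        undef $oWkS;                                        # drop references so the workbook
        undef $oBook;                                       # can be freed before the next file
    }
    print $result;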

Signal Handler for Out of Memory Problem

2004-11-29 Thread Joshua Berry
I have written a program that appears to have a memory leak or is using a module with a memory leak. Eventually the system runs out of memory and dies. I need to write a signal handler that intercepts the signal that stops the program and flushes everything in use to free up memory. Does anyone

Re: Need Help with 'Out of Memory!' Error message situation

2004-10-04 Thread Jenda Krynicky
From: Tony Frasketi <[EMAIL PROTECTED]> > Hello group > I'm getting the following error message when running my Perl program: > > Out of Memory! > > The program reads in an ASCII file into a string variable $lines. Then > parses $lines looking for cert

Need Help with 'Out of Memory!' Error message situation

2004-10-03 Thread Tony Frasketi
Hello group I'm getting the following error message when running my Perl program: Out of Memory! The program reads in an ASCII file into a string variable $lines. Then parses $lines looking for certain types of entities as follows... while ($lines =~ s/(INSERT INTO.+?)\#(.+)/
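
If the INSERT statements don't span lines (an assumption; the real data isn't shown), the same extraction can be done one line at a time instead of holding the whole file in $lines and repeatedly rewriting it with s///:

    use strict;
    use warnings;

    open my $fh, '<', 'dump.sql' or die "Cannot open dump.sql: $!";   # hypothetical file name
    while (my $line = <$fh>) {
        if ($line =~ /(INSERT INTO.+?)\#/) {     # same kind of pattern, applied per line
            my $stmt = $1;
            print "$stmt\n";                     # handle one entity at a time
        }
    }
    close $fh;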

Re: Out of Memory! error on large tied hash

2004-09-27 Thread Rob Benton
Chris Devers wrote: This may be the sort of problem that would best be handled by a proper database server. The folks writing RDBMSes have been tackling problems like this for decades now, and have some useful techniques; rather than reinvent the wheel, you can just leverage their efforts by put

Re: Out of memory error problem

2003-12-16 Thread drieux
On Dec 16, 2003, at 11:15 AM, Perl wrote: [..] The script works fine but when it runs against a very large file (2GB+) I receive an out of memory error. was the perl that you are using built to work with large datafiles? There is a USE_LARGE_FILE that is normally set, and you would see it with

Re: Out of memory error problem

2003-12-16 Thread James Edward Gray II
very large file (2GB+) I receive an out of memory error. Is there a more efficient way of handling the hash portion that is less memory intense and preferably faster? Sure is. # Tracking log parser use strict; my $recips; my %event_id; my $counter; my $total_recips; my $count; # Get log file die

Out of memory error problem

2003-12-16 Thread Perl
I wrote a small script that uses message ID's as unique values and extracts recipient address info. The goal is to count 1019 events per message ID. It also gets the sum of recipients per message ID. The script works fine but when it runs against a very large file (2GB+) I receive an out of m

RE: Out of memory while finding duplicate rows

2003-02-24 Thread Madhu Reddy
Thanks Peter - good point > > > -Original Message- > > From: Peter Scott [mailto:[EMAIL PROTECTED] > > Sent: Sunday, February 23, 2003 5:17 AM > > To: [EMAIL PROTECTED] > > Subject: RE: Out of memory while finding duplicate > rows > > > > > > In artic

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Beau E. Cox
Thanks Peter - good point > -Original Message- > From: Peter Scott [mailto:[EMAIL PROTECTED] > Sent: Sunday, February 23, 2003 5:17 AM > To: [EMAIL PROTECTED] > Subject: RE: Out of memory while finding duplicate rows > > > In article <[EMAIL PROTECTED]>

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Peter Scott
In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Beau E. Cox) writes: >Hi - > >Wait! If you are going to load the data into a database anyway, >why not use the existing database (or the one being created) to >remove duplicates. You don't even have to have an index on the >column you are making u
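
A sketch of the approach being quoted here: let the target database enforce uniqueness while loading, instead of tracking seen keys in a Perl hash. The syntax shown is MySQL's INSERT IGNORE with a UNIQUE key; table and column names are placeholders:

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:mysql:database=mydb', 'user', 'password',
                           { RaiseError => 1 });

    # A UNIQUE key on the dedup column makes the database reject repeats;
    # INSERT IGNORE silently skips the duplicate rows.
    $dbh->do('CREATE TABLE IF NOT EXISTS rows_loaded (
                  rec_key VARCHAR(64) NOT NULL,
                  payload TEXT,
                  UNIQUE KEY (rec_key)
              )');

    my $ins = $dbh->prepare('INSERT IGNORE INTO rows_loaded (rec_key, payload) VALUES (?, ?)');

    open my $fh, '<', 'data.txt' or die "Cannot open data.txt: $!";
    while (my $line = <$fh>) {
        chomp $line;
        my ($key, $payload) = split /\t/, $line, 2;   # assumed tab-separated input
        $ins->execute($key, $payload);
    }
    close $fh;
    $dbh->disconnect;

As the quoted post notes, an index isn't strictly required either: loading everything into a staging table and then selecting DISTINCT rows into the final table achieves the same result.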

RE: Out of memory while finding duplicate rows

2003-02-23 Thread Beau E. Cox
PROTECTED] > Subject: RE: Out of memory while finding duplicate rows > > > Hi, >those data finally have to load into database.. > before loading into dabase,,we need to do some > validations like remove duplicate etc. > that is why i am doing... > > i have ano

RE: Out of memory while finding duplicate rows

2003-02-22 Thread Madhu Reddy
gt; > From: Madhu Reddy [mailto:[EMAIL PROTECTED] > > Sent: Saturday, February 22, 2003 11:12 AM > > To: [EMAIL PROTECTED] > > Subject: Out of memory while finding duplicate > rows > > > > > > Hi, > > I have a script that will find out duplicate &g
