Hi,
Of course, today, after a reboot to update to the newest kernel,
everything works without crashing...
I imagine yesterday's problem was that I had forgotten I had a Windows
virtual machine running on the server that was eating a good chunk of
memory. Still, using a cursor to p
Hi Joe,
Thanks for responding; you would clearly be the expert on this sort of
problem. My current function does page through the data using a cursor
precisely to avoid out-of-memory problems, which is why I am somewhat
surprised and stumped as to how this can be happening. It does return
all
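The paging strategy described above can be sketched in Python. This is
purely illustrative, not the original PL/R function: the
`fetch_in_batches` helper and the stand-in `make_fetcher` cursor are
invented names, and a real implementation would fetch from a database
cursor (e.g. `FETCH n FROM cur`) instead of an in-memory iterator.

```python
# Sketch of cursor-style paging: only batch_size rows are held in
# memory at a time, regardless of the total result size.

def fetch_in_batches(fetch, batch_size=10000):
    """Yield rows one batch at a time until fetch() returns nothing."""
    while True:
        batch = fetch(batch_size)
        if not batch:
            break
        yield from batch

def make_fetcher(rows):
    """Stand-in for a database cursor over an iterable of rows."""
    it = iter(rows)
    def fetch(n):
        out = []
        for _ in range(n):
            try:
                out.append(next(it))
            except StopIteration:
                break
        return out
    return fetch

fetch = make_fetcher(range(25))
total = sum(1 for _ in fetch_in_batches(fetch, batch_size=10))
print(total)  # 25
```

The point of the pattern is that peak memory is bounded by the batch
size, not by the result set; the question in the thread is why memory
still blows up despite it.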
On 05/16/2013 08:40 AM, Tom Lane wrote:
> "David M. Kaplan" writes:
>> Thanks for the help. You have definitely identified the problem,
>> but I am still looking for a solution that works for me. I tried
>> setting vm.overcommit_memory=2, but this just made the query crash
>> quicker than before, though without killing the entire connection
>> to the database.
"David M. Kaplan" writes:
> Thanks for the help. You have definitely identified the problem, but I
> am still looking for a solution that works for me. I tried setting
> vm.overcommit_memory=2, but this just made the query crash quicker than
> before, though without killing the entire connection to the database.
Hi,
Thanks for the help. You have definitely identified the problem, but I
am still looking for a solution that works for me. I tried setting
vm.overcommit_memory=2, but this just made the query crash quicker than
before, though without killing the entire connection to the database. I
imag
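For readers following along, `vm.overcommit_memory` is a Linux kernel
sysctl. A typical way to set it (requires root) is sketched below; this
only mirrors the setting discussed above, and whether mode 2 is
appropriate depends on the machine's RAM and swap:

```shell
# Strict overcommit accounting (mode 2); requires root.
sysctl -w vm.overcommit_memory=2

# Persist across reboots by adding to /etc/sysctl.conf:
#   vm.overcommit_memory = 2

# In mode 2, the companion knob vm.overcommit_ratio (default 50)
# controls what fraction of physical RAM counts toward the commit limit.
cat /proc/sys/vm/overcommit_memory
```

Mode 2 makes allocations fail up front instead of letting the OOM
killer pick a victim later, which is consistent with the query crashing
sooner but the backend surviving, as reported above.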
"David M. Kaplan" writes:
> I have a query that uses a PL/R function to run a statistical model on
> data in a postgresql table. The query runs the function 4 times, each
> of which generates about 2 million lines of results, generating a final
> table that has about 8 million lines. Each time the function is
> called, it
On Thu, May 16, 2013 at 02:47:28PM +0200, David M. Kaplan wrote:
> Hi,
>
> I have a query that uses a PL/R function to run a statistical model
> on data in a postgresql table. The query runs the function 4 times,
> each of which generates about 2 million lines of results, generating
> a final table that has about 8 million lines.
Hi,
I have a query that uses a PL/R function to run a statistical model on
data in a postgresql table. The query runs the function 4 times, each
of which generates about 2 million lines of results, generating a final
table that has about 8 million lines. Each time the function is called,
it
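The sizes quoted above (4 calls of roughly 2 million rows each, about 8
million lines total) suggest why memory is the constraint. A hedged,
scaled-down Python sketch of the general pattern follows; none of these
names come from the original query, and `model_run` merely stands in
for one statistical-model call:

```python
# Illustration only: materializing every row from several calls holds
# all of them in memory at once, while streaming keeps only the row
# currently being consumed alive.

def model_run(n):
    """Stand-in for one model call yielding n result rows."""
    for i in range(n):
        yield i

CALLS, ROWS_PER_CALL = 4, 2_000  # scaled-down stand-ins for 4 x ~2 million

# Materialized: all CALLS * ROWS_PER_CALL rows live in memory together.
materialized = [row for _ in range(CALLS) for row in model_run(ROWS_PER_CALL)]

# Streamed: rows are consumed one at a time and never accumulated.
streamed_count = sum(1 for _ in range(CALLS) for _ in model_run(ROWS_PER_CALL))

print(len(materialized), streamed_count)  # 8000 8000
```

If any layer between the function and the final table materializes the
full result set rather than streaming it, the cursor-based paging in
the function itself cannot keep peak memory down.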