On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
<tomas.von...@2ndquadrant.com> wrote:
> Hi,
>
> On 11/28/2017 06:17 PM, Ted Toth wrote:
>> I'm writing a migration utility to move data from a non-RDBMS data
>> source to a Postgres DB. Currently I'm generating SQL INSERT
>> statements involving 6 related tables for each 'thing'. With 100k or
>> more 'things' to migrate I'm generating a lot of statements, and when
>> I try to import using psql, postgres fails with 'out of memory' when
>> running on a Linux VM with 4G of memory. If I break it into smaller
>> chunks, say ~50K statements, then the import succeeds. I can change
>> my migration utility to generate multiple files, each with a limited
>> number of INSERTs, to get around this issue, but maybe there's
>> another/better way?
>>
>
> The question is what exactly runs out of memory, and how you modified
> the configuration (particularly memory-related settings).
>
> regards
>
> --
> Tomas Vondra                  http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

I'm pretty new to postgres, so I haven't changed any configuration
settings, and the log is a bit hard for me to make sense of :(
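
For what it's worth, this is roughly how I was planning to do the
chunking on my side. It's just a sketch in Python; the file names and
the 50K chunk size are placeholders, and it assumes my utility writes
one INSERT statement per line, in dependency order:

    # Sketch: split a big file of INSERT statements into smaller .sql
    # files so each psql run stays small. Assumes one statement per
    # line; chunk size and file names are placeholders.
    CHUNK_SIZE = 50000  # statements per output file

    def split_inserts(infile_path, out_prefix, chunk_size=CHUNK_SIZE):
        chunk_no = 0
        out = None
        with open(infile_path) as infile:
            for i, line in enumerate(infile):
                if i % chunk_size == 0:
                    if out:
                        out.write("COMMIT;\n")
                        out.close()
                    chunk_no += 1
                    out = open(f"{out_prefix}_{chunk_no:04d}.sql", "w")
                    # one transaction per chunk keeps each psql run bounded
                    out.write("BEGIN;\n")
                out.write(line)
        if out:
            out.write("COMMIT;\n")
            out.close()

    if __name__ == "__main__":
        split_inserts("things.sql", "things_chunk")

Since the original statement order is preserved and earlier chunks are
committed before later ones run, rows referenced by foreign keys should
already be in place when the child rows load.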

Attachment: psql.outofmem.log
