On 11/05/12 9:27 AM, Robert Haas wrote:
> That is, if we have a large datum that we're trying to
> send back to the client, could we perhaps chop off the first 50MB or
> so, do the encoding on that amount of data, send the data to the
> client, lather, rinse, repeat?
I'd suggest work_mem sized chunks for that.
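
For illustration, a minimal sketch of that chop-convert-send loop,
using POSIX iconv rather than the backend's converter (the chunk
size, the encodings, and send_to_client() are placeholders, not
anything in the tree). The fiddly part is a multibyte character
straddling a chunk boundary; iconv reports it as EINVAL, and the
leftover bytes are simply carried into the next chunk:

#include <errno.h>
#include <iconv.h>
#include <stdio.h>
#include <stdlib.h>

#define CHUNK_SIZE (50 * 1024 * 1024)   /* stand-in for a work_mem sized chunk */

/* Placeholder for handing converted bytes to the client. */
static void send_to_client(const char *buf, size_t len)
{
    fwrite(buf, 1, len, stdout);
}

/* Convert src (src_len bytes) in chunks instead of in one allocation. */
static int convert_in_chunks(const char *src, size_t src_len)
{
    iconv_t cd = iconv_open("UTF-8", "LATIN1");  /* example encodings */
    char *out;
    char *in = (char *) src;
    size_t in_left = src_len;

    if (cd == (iconv_t) -1)
        return -1;
    out = malloc((size_t) CHUNK_SIZE * 4);       /* worst-case growth */
    if (out == NULL)
    {
        iconv_close(cd);
        return -1;
    }

    while (in_left > 0)
    {
        size_t take = in_left < CHUNK_SIZE ? in_left : CHUNK_SIZE;
        char *inp = in;
        size_t chunk_left = take;
        char *outp = out;
        size_t out_left = (size_t) CHUNK_SIZE * 4;
        size_t consumed;

        if (iconv(cd, &inp, &chunk_left, &outp, &out_left) == (size_t) -1 &&
            errno != EINVAL)        /* EINVAL: sequence split at the boundary */
        {
            free(out);
            iconv_close(cd);
            return -1;
        }

        send_to_client(out, (size_t) (outp - out));

        /* Bytes iconv left unconsumed (a split character) are re-presented
         * at the start of the next chunk. */
        consumed = take - chunk_left;
        if (consumed == 0)
            break;                  /* don't spin on a stuck sequence */
        in += consumed;
        in_left -= consumed;
    }

    free(out);
    iconv_close(cd);
    return 0;
}

In the backend the shape would be the same, just calling the
pg_do_encoding_conversion machinery once per chunk, so no single
palloc ever has to cover the whole datum.
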
On Tue, Oct 30, 2012 at 6:08 AM, Tatsuo Ishii wrote:
>> I have an SQL file (its size is 1GB).
>> When I execute it, a "String of 987098801 bytes is too long for encoding
>> conversion" error occurs.
>> Please give me a solution for this.
>
> You hit the upper limit on internal memory allocation in
> PostgreSQL. IMO, there's no way to avoid the error except you split
> the string into smaller pieces.
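
For the archives, the arithmetic behind that limit, as I understand
it: the conversion output buffer is sized for worst-case growth of
the input in a single allocation, and one palloc request is capped,
so the input is rejected well before 1GB. A tiny standalone check
(the constant values are from my reading of the source, so treat
them as assumptions):

#include <stdio.h>

/* Assumed values: MaxAllocSize from src/include/utils/memutils.h,
 * MAX_CONVERSION_GROWTH from src/include/mb/pg_wchar.h. */
#define MaxAllocSize ((size_t) 0x3fffffff)  /* ~1GB palloc cap */
#define MAX_CONVERSION_GROWTH 4             /* worst-case output expansion */

int main(void)
{
    /* The output buffer must hold len * MAX_CONVERSION_GROWTH bytes in a
     * single allocation, so inputs of MaxAllocSize / MAX_CONVERSION_GROWTH
     * bytes or more are rejected up front. */
    size_t limit = MaxAllocSize / MAX_CONVERSION_GROWTH;

    printf("conversion input limit: %zu bytes (~256MB)\n", limit);
    printf("reported string size:   987098801 bytes -- far past it\n");
    return 0;
}

So even splitting the file roughly in half wouldn't be enough here;
the pieces need to stay under about 256MB each.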