Hi team,

I have learned that the field size for a bytea column in PostgreSQL is limited to 1 GB. Is there any way to increase this? We need to store a large volume of data in each row of a table.
https://www.postgresql.org/docs/current/limits.html

Any suggestions would be appreciated.

Thanks & Regards,
Sai

On Tue, 15 Aug, 2023, 8:10 am Sai Teja, <saitejasaichintalap...@gmail.com> wrote:

> By default, bytea_output is already in hex format.
>
> On Tue, 15 Aug, 2023, 12:44 am Ron, <ronljohnso...@gmail.com> wrote:
>
>> Did you *try* changing bytea_output to hex?
>>
>> On 8/14/23 12:31, Sai Teja wrote:
>>
>> I am just running a select query to fetch the result:
>>
>> Query: select id, content_data, name from table_name
>>
>> Here, content_data is a bytea value of more than 700 MB. I face the
>> same error if I run this query in any DB client, such as pgAdmin,
>> DBeaver, etc. The query is also called from Java, but I don't think
>> Java is the issue, since I can insert the data successfully. The
>> problem is only with fetching the data, and only for the specific
>> rows that hold a huge volume of data.
>>
>> Thanks,
>> Sai
>>
>> On Mon, 14 Aug, 2023, 10:55 pm Rob Sargent, <robjsarg...@gmail.com>
>> wrote:
>>
>>> On 8/14/23 09:29, Sai Teja wrote:
>>> > Could anyone please suggest any ideas to resolve this issue?
>>> >
>>> > I have increased the parameters below, but I'm still getting the
>>> > same error:
>>> >
>>> > work_mem, shared_buffers
>>> >
>>> > Out of 70k rows in the table, the issue occurs only for the few
>>> > rows that are large (700 MB); I am unable to fetch the data for
>>> > those particular rows.
>>> >
>>> > I would appreciate it if anyone could share any insights.
>>> >
>>> > Thanks,
>>> > Sai
>>>
>>> Are you using Java? There's an upper limit on array size, hence also
>>> on String length. You'll likely need to process the output in chunks.
>>
>> --
>> Born in Arizona, moved to Babylonia.
>> >
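For anyone hitting the same thing: Rob's advice to process the output in chunks could look like the minimal sketch below. This is an illustration, not code from the thread. It assumes the PostgreSQL JDBC driver, the table and column names from the query above (table_name, id, content_data), and a hypothetical 32 MB chunk size. It pulls the bytea value slice by slice with substring(), so no single fetch has to materialize the full 700 MB as one Java byte array. (Java arrays are capped at roughly 2 GB, and text-mode results arrive hex-encoded, roughly doubling the value on the wire, which is likely why the single-shot select fails even though the insert succeeded.)

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class ByteaChunkFetch {

    // Hypothetical chunk size: 32 MB per round trip.
    private static final int CHUNK_SIZE = 32 * 1024 * 1024;

    // Streams one row's content_data to `out` in CHUNK_SIZE slices,
    // so the full value is never held in a single Java array.
    static void fetchInChunks(Connection conn, long rowId, OutputStream out)
            throws SQLException, IOException {
        // substring(bytea from <start> for <count>) is 1-based and
        // returns an empty value once <start> runs past the end.
        String sql = "select substring(content_data from ? for ?) "
                   + "from table_name where id = ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int offset = 1; // fits in int: a bytea field is capped at 1 GB
            while (true) {
                ps.setInt(1, offset);
                ps.setInt(2, CHUNK_SIZE);
                ps.setLong(3, rowId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        return; // no such row
                    }
                    byte[] chunk = rs.getBytes(1);
                    if (chunk == null || chunk.length == 0) {
                        return; // past the end of the value
                    }
                    out.write(chunk);
                    offset += chunk.length;
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             OutputStream out = new FileOutputStream("content_data.bin")) {
            fetchInChunks(conn, 42L, out);
        }
    }
}

A larger chunk means fewer round trips; a smaller one keeps peak memory down on both client and server. As for the original question: the 1 GB bytea field size is a hard limit and cannot be raised by configuration, so chunked access like this (or splitting the payload) is the usual way to work within it.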