You know, the CSV files were exported from the database of another machine, so 
I really don't want to break them; they represent a lot of hard work. Every CSV 
file contains headers and values. If I redesign the table, I will have to split 
all the CSV files into pieces one by one. 





---- On Monday, 02 January 2017 08:21:29 -0800 Tom Lane <t...@sss.pgh.pa.us> 
wrote ----




vod vos <vod...@zoho.com> writes: 

> When I COPY data from a CSV file, there are very long values for many 
> columns (about 1100 columns). This error appears: 

> ERROR: row is too big: size 11808, maximum size 8160 

 

You need to rethink your table schema so you have fewer columns. 

Perhaps you can combine some of them into arrays, for example. 

JSON might be a useful option, too. 
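
A hedged sketch of what that redesign could look like (the table and column 
names below are illustrative, not from the original schema): folding the many 
machine readings into one array or jsonb column keeps the row itself small, 
since large values in a single column can be compressed and stored out of line 
by TOAST, while 1100 separate columns cannot. 

```sql
-- Hypothetical redesign; "machine_data", "recorded_at", and "readings"
-- are made-up names for illustration.

-- Instead of ~1100 individual columns, store all values in one array:
CREATE TABLE machine_data (
    recorded_at timestamptz NOT NULL,
    readings    numeric[]            -- all values in one TOAST-able column
);

-- Or use jsonb, which also preserves the CSV header names:
CREATE TABLE machine_data_json (
    recorded_at timestamptz NOT NULL,
    readings    jsonb                -- {"header1": "value1", ...}
);

-- Individual values remain accessible:
--   SELECT readings[5]          FROM machine_data;
--   SELECT readings->>'header5' FROM machine_data_json;
```

The CSV files would then need a one-time transformation from wide rows into 
this shape before loading, but COPY itself would no longer hit the page-size 
limit. 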

 

            regards, tom lane 

 

 

-- 

Sent via pgsql-general mailing list (pgsql-general@postgresql.org) 






