On 01/02/2017 09:03 AM, vod vos wrote:
> You know, the csv file was exported from another machine's database, so
> I really don't want to break it, because it was hard work. Every csv file
> contains headers and values. If I redesign the table, then I have to cut
> all the csv files into pieces one by one.
If it helps:
http://csvkit.readthedocs.io/en/latest/tutorial/1_getting_started.html#csvcut-data-scalpel
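If you'd rather script the cutting yourself, here is a minimal sketch using only Python's stdlib csv module that splits a wide CSV into narrower files (the 500-column chunk size and the `.partN.csv` naming are assumptions, not anything from your setup):

```python
import csv

def split_csv(path, chunk_size=500):
    """Split a wide CSV into several narrower files of chunk_size columns each,
    repeating no data: each output file gets its own slice of the header row."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    header, data = rows[0], rows[1:]
    for i, start in enumerate(range(0, len(header), chunk_size)):
        cols = slice(start, start + chunk_size)
        with open(f"{path}.part{i}.csv", "w", newline="") as out:
            w = csv.writer(out)
            w.writerow(header[cols])
            for row in data:
                w.writerow(row[cols])
```

Note that splitting by column position loses the row association between the pieces; you would want to carry a key column into every chunk before loading them into separate tables.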
>
>
> ---- On Monday, 02 January 2017 08:21:29 -0800 *Tom Lane
> <tgl@sss.pgh.pa.us>* wrote ----
>
> vod vos <vodvos@zoho.com <mailto:vodvos@zoho.com>> writes:
> > When I copy data from a csv file with very long values in many
> > columns (about 1100 columns), this error appears:
> > ERROR: row is too big: size 11808, maximum size 8160
>
> You need to rethink your table schema so you have fewer columns.
> Perhaps you can combine some of them into arrays, for example.
> JSON might be a useful option, too.
>
> regards, tom lane
>
>
> --
> Sent via pgsql-general mailing list (pgsql-general@postgresql.org
> <mailto:pgsql-general@postgresql.org>)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-general
>
>
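To sketch Tom's JSON suggestion without hand-editing the files: the stdlib csv and json modules can fold all of a row's columns into a single JSON document, which fits in one jsonb column instead of ~1100 separate ones. The key column name and the target table/column in the comment are hypothetical; adjust to your schema.

```python
import csv
import json

def rows_as_json(path, key_column):
    """Yield (key, json_document) pairs: one JSON object per CSV row,
    with every column except key_column folded into the document."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = row.pop(key_column)
            yield key, json.dumps(row)

# Each pair could then be loaded with something like:
#   INSERT INTO wide_data (id, payload) VALUES (%s, %s::jsonb)
# (wide_data / id / payload are assumed names, not from the original post.)
```

A jsonb document is TOASTed when large, so it sidesteps the 8160-byte in-page row limit that 1100 plain columns run into.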
--
Adrian Klaver
adrian.klaver@aklaver.com