Re: [GENERAL] COPY: row is too big

From: Adrian Klaver
Subject: Re: [GENERAL] COPY: row is too big
Date:
Msg-id: 6b0a0f3e-cd0d-1927-6c4e-ba32cc24a0a3@aklaver.com
In response to: Re: [GENERAL] COPY: row is too big  (Rob Sargent <robjsargent@gmail.com>)
Responses: Re: [GENERAL] COPY: row is too big  (Rob Sargent <robjsargent@gmail.com>)
List: pgsql-general
On 01/05/2017 08:31 AM, Rob Sargent wrote:
>
>
> On 01/05/2017 05:44 AM, vod vos wrote:
>> I finally figured it out as follows:
>>
>> 1. modified the corresponding data types of the columns to match the csv file
>>
>> 2. if null values existed, defined the data type as varchar. The null
>> values cause problems too.
>>
>> so 1100 columns work well now.
>>
>> This problem cost me three days. I have lots of csv data to COPY.
>>
>>
> Yes, you cost yourself a lot of time by not showing the original table
> definition into which you were trying to insert data.

Given that the table had 1100 columns, I am not sure I wanted to see it :)

Still, the OP did give it to us in the description:

https://www.postgresql.org/message-id/15969913dd3.ea2ff58529997.7460368287916683127%40zoho.com
"I create a table with 1100 columns with data type of varchar, and hope
the COPY command will auto transfer the csv data that contains some
character and date, most of which are numeric."
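
For illustration, a typed definition would look something along these
lines (a sketch only; the table and column names below are made up, and
the actual types would have to match the real csv contents):

-- Hypothetical sketch: give each column a type that matches its csv data
-- instead of declaring all 1100 columns as varchar.
CREATE TABLE wide_csv_import (
    sample_id    integer,
    sample_date  date,
    label        varchar(50),
    value_0001   smallint,
    value_0002   smallint
    -- ... and so on for the remaining columns
);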

In retrospect, what I should have pressed for was a more complete
description of the data. I underestimated this description:

"And some the values in the csv file contain nulls, do this null values
matter? "
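
For what it's worth, COPY's NULL option is how empty csv fields get
mapped to SQL NULLs (a sketch only; the table name and file path are
made up, and NULL '' is already the default for csv format, spelled out
here just to make the behavior explicit):

-- Hypothetical sketch: load the csv, treating unquoted empty fields as NULL.
COPY wide_csv_import
FROM '/path/to/data.csv'
WITH (FORMAT csv, HEADER true, NULL '');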


--
Adrian Klaver
adrian.klaver@aklaver.com

