Re: Using COPY to import large xml file

From Tim Cross
Subject Re: Using COPY to import large xml file
Date
Msg-id 8736xaseju.fsf@gmail.com
Whole thread Raw
In reply to Re: Using COPY to import large xml file  (Anto Aravinth <anto.aravinth.cse@gmail.com>)
Responses Re: Using COPY to import large xml file
List pgsql-general
Anto Aravinth <anto.aravinth.cse@gmail.com> writes:

> Thanks a lot. But I've hit a lot of challenges! Looks like the SO data
> contains a lot of tabs within itself, so the tab delimiter didn't work for
> me. I thought I could give a special delimiter, but it looks like
> PostgreSQL's COPY allows only one character as the delimiter :(
>
> Sad, I guess the only way is to insert or do a thorough serialization of
> my data into something that COPY can understand.
>

The COPY command has a number of options, including setting what is used
as the delimiter - it doesn't have to be a tab. You also need to look at
the logs/output to see exactly why the copy fails.
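
For example, something along these lines (the table and file names are
just placeholders for your own): CSV format with quoting copes with tabs
and newlines embedded inside fields, or you can pick a single character
that never occurs in the data:

  -- CSV format: quoted fields may contain tabs, commas and newlines
  COPY posts FROM '/path/to/posts.csv'
       WITH (FORMAT csv, DELIMITER ',', QUOTE '"', ESCAPE '"');

  -- or plain text format with a different single-character delimiter
  COPY posts FROM '/path/to/posts.txt' WITH (DELIMITER '|');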

I'd recommend first pre-processing your input data to make sure it is
'clean' and all the fields actually match the DDL you have used to define
your tables. I'd then select a small subset and try different parameters
to the COPY command until you get the right combination of data format
and COPY options.
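
As a rough sketch (the staging table and sample file name here are
assumptions, not taken from your data), the client-side \copy in psql is
handy for that kind of iteration, and a failed load reports the offending
input line in its CONTEXT message:

  -- run from psql; \copy reads the file on the client side, and on
  -- failure the error's CONTEXT line tells you which row broke the load
  \copy posts_staging FROM 'posts_sample.csv' WITH (FORMAT csv, DELIMITER ',', QUOTE '"')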

It may take some effort to get the right combination, but the result is
probably worth it given your data set size, i.e. the difference between
hours and days.

--
Tim Cross

