Re: Using COPY to import large xml file

From: Anto Aravinth
Subject: Re: Using COPY to import large xml file
Date:
Msg-id: CANtp6R+VFj2=qD6+DR7Hpzg1c_cQRHG-W5kENsre3yQesJHHxw@mail.gmail.com
In reply to: Re: Using COPY to import large xml file (Adrien Nayrat <adrien.nayrat@anayrat.info>)
Replies: Re: Using COPY to import large xml file
List: pgsql-general
Thanks for the response. I'm not sure how long this tool would take on the 70GB of data.

I used Node.js to stream the XML file into INSERTs, which was very slow. The XML actually contains 40 million records, of which 10 million took around 2 hours with Node.js. Hence I thought I would use the COPY command, as suggested on the internet.
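
For what it's worth, here is a minimal sketch of the COPY route I have in mind, in Python with psycopg2 and lxml rather than Node.js. The `posts` table and its three columns are just placeholders (real Posts.xml rows carry many more attributes), so treat this as an outline, not working import code for the full dump:

# Sketch: stream Stack Overflow's Posts.xml into PostgreSQL via COPY.
# Assumes a hypothetical table: posts(id int, post_type_id int, body text).
import io

import psycopg2
from lxml import etree

def rows(path):
    # iterparse keeps memory flat even for a 70GB file: each <row .../>
    # element is handled and then cleared.
    for _, elem in etree.iterparse(path, tag="row"):
        yield (elem.get("Id"), elem.get("PostTypeId"), elem.get("Body"))
        elem.clear()

def copy_line(row):
    # COPY's text format: \N for NULL; escape backslash, tab, newline, CR.
    def esc(v):
        if v is None:
            return r"\N"
        return (v.replace("\\", "\\\\").replace("\t", "\\t")
                 .replace("\n", "\\n").replace("\r", "\\r"))
    return "\t".join(esc(v) for v in row) + "\n"

def flush(cur, buf):
    buf.seek(0)
    cur.copy_expert("COPY posts (id, post_type_id, body) FROM STDIN", buf)

conn = psycopg2.connect("dbname=so")
with conn, conn.cursor() as cur:
    buf, n = io.StringIO(), 0
    for row in rows("Posts.xml"):
        buf.write(copy_line(row))
        n += 1
        if n % 100000 == 0:  # flush in chunks to bound client memory
            flush(cur, buf)
            buf = io.StringIO()
    flush(cur, buf)

Even with the XML parsing still on the client, COPY sends the rows as one stream instead of one statement per row, which is presumably where the Node.js INSERT loop was losing most of its time.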

I'll definitely try the code and let you know. But it looks like it uses the same INSERT approach, not COPY; it will be interesting to see whether it runs quickly on my machine.

On Sun, Jun 24, 2018 at 9:23 PM, Adrien Nayrat <adrien.nayrat@anayrat.info> wrote:
On 06/24/2018 05:25 PM, Anto Aravinth wrote:
> Hello Everyone,
>
> I have downloaded the Stack Overflow posts XML (it contains all SO questions to
> date). The file is around 70GB. I want to import the data from that XML into my
> table. Is there a way to do so in Postgres?
>
>
> Thanks, 
> Anto.

Hello Anto,

I used this tool:
https://github.com/Networks-Learning/stackexchange-dump-to-postgres

Regards,

--
Adrien NAYRAT
https://blog.anayrat.info

