Importing a Large .ndjson file

From: Sankar P
Subject: Importing a Large .ndjson file
Date:
Msg-id: CAMSEaH5SfyfXN_rSah41dOOA_aAik4hZED0qp52=1wqzjz-pMA@mail.gmail.com
Responses: Re: Importing a Large .ndjson file
List: pgsql-general
Hi

I have a .ndjson (newline-delimited JSON) file. It is about 10 GB and
has about 100,000 records.

Some sample records:
```
{ "key11": "value11", "key12": [ "value12.1", "value12.2"], "key13": {
"k111": "v111" } } \n\r
{ "key21": "value21", "key22": [ "value22.1", "value22.2"] }
```
Now I want to INSERT these JSON records into my Postgres table with
the following schema:

```
CREATE TABLE myTable (id BIGSERIAL, content JSONB);
```

Each record should go into the `content` field of the table.

What is the best way to do this on a PostgreSQL database deployed in
Kubernetes, with 1 GB of RAM allocated?

I can probably write a script that would read this file line-by-line
and INSERT each record into the database, in a transaction. But I
believe that would generate a lot of network traffic, and I want to
know if there is a better way to do this.

Thanks.

-- 
Sankar P
http://psankar.blogspot.com


