[NOVICE] Bulk load billions of records into Postgres cluster

From: balasubramaniam
Subject: [NOVICE] Bulk load billions of records into Postgres cluster
Date:
Msg-id: CACFhHyuehAgUm6cQ4RbELZC5HSnc9Zsi9hpQjo+g2q+kVW1i-Q@mail.gmail.com
Replies: Re: [NOVICE] Bulk load billions of records into Postgres cluster
List: pgsql-novice
Hi All,

We have a proven NoSQL production setup with a few billion rows. We are planning to move towards a more structured data model with a few tables.

I am looking for a completely open-source and battle-tested database and Postgres seems to be the right start.

Due to our increasing scale demands, I am planning to start with a PostgreSQL cluster. The ability to ingest data at scale, around a few TBs, in the shortest possible time is critical for our use case. I have read through the official documentation, including the COPY FROM command, but neither discusses a cluster setup specifically.

1) What is the standard and fastest way to ingest billions of records into Postgres at scale?
2) Is there a ready-to-use tool to generate the SQL script for the COPY FROM command? I want to avoid writing and maintaining yet another custom tool.
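For question 2, a small generator is often all that is needed: scan the exported files and emit one COPY statement per file into a psql script. A minimal sketch, assuming the data has been exported as CSV files with headers; the directory `/data/export` and table name `events` are hypothetical placeholders, not anything from this thread.

```python
#!/usr/bin/env python3
"""Generate a psql script of COPY ... FROM statements for a directory of CSVs."""
from pathlib import Path


def generate_copy_script(csv_dir: str, table: str) -> str:
    """Emit one server-side COPY statement per *.csv file, sorted by name."""
    lines = ["\\timing on"]  # psql meta-command: report per-statement timing
    for csv_file in sorted(Path(csv_dir).glob("*.csv")):
        # COPY ... FROM reads the file on the *server*; use psql's \copy
        # instead if the files live on the client machine.
        lines.append(
            f"COPY {table} FROM '{csv_file.resolve()}' WITH (FORMAT csv, HEADER true);"
        )
    return "\n".join(lines) + "\n"


if __name__ == "__main__":
    # Hypothetical example: chunks exported from the NoSQL store into /data/export
    print(generate_copy_script("/data/export", "events"))
```

The resulting script can be run with `psql -f load.sql`; splitting the files across several such scripts and running them in parallel sessions is a common way to use more than one CPU during the load.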

Thanks in advance,
bala
