Discussion: [NOVICE] Bulk load billions of records into Postgres cluster


[NOVICE] Bulk load billions of records into Postgres cluster

From
balasubramaniam
Date:
Hi All,

We have a proven NoSQL production setup with a few billion rows. We are planning to move towards a more structured data model with a few tables.

I am looking for a completely open-source, battle-tested database, and Postgres seems like the right place to start.

Due to our increasing scale demands, I am planning to start with a PostgreSQL cluster. The ability to ingest data at scale, around a few TBs, as quickly as possible is highly critical for our use case. I have read through the official documentation, including the COPY FROM command, but none of it talks specifically about a cluster setup.

1) What is the standard and fastest way to ingest billions of records into Postgres at scale?
2) Is there a tool that generates the SQL script for the COPY FROM command, ready for use? I want to avoid writing and maintaining yet another custom tool.

Thanks in advance,
bala

Re: [NOVICE] Bulk load billions of records into Postgres cluster

From
Aleksey Tsalolikhin
Date:
Hi Bala.  What are you going to export from?  If that NoSQL database can dump its data in CSV format, Postgres can read it in using COPY FROM or \copy (e.g., https://stackoverflow.com/questions/2987433/how-to-import-csv-file-data-into-a-postgresql-table).  In other words, you don't need an SQL script for this data transfer.
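As a minimal sketch of what the load looks like (the table definition and file names below are only placeholders for whatever your export actually produces):

    -- Target table; the schema here is just an example.
    CREATE TABLE events (
        id         bigint PRIMARY KEY,
        payload    jsonb,
        created_at timestamptz
    );

    -- Server-side load: the path is resolved on the database server,
    -- so the file must be readable by the server process.
    COPY events FROM '/path/to/export.csv' WITH (FORMAT csv, HEADER true);

    -- Client-side alternative from psql: \copy streams the file from
    -- the machine where psql runs, over the existing connection.
    -- \copy events FROM 'export.csv' WITH (FORMAT csv, HEADER true)

For a one-off migration of billions of rows, a common approach is to split the export into multiple CSV files and run several COPY sessions in parallel, one per file.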

Aleksey
