Re: Using Postgres to store high volume streams of sensor readings
| From | Ciprian Dorin Craciun |
|---|---|
| Subject | Re: Using Postgres to store high volume streams of sensor readings |
| Date | |
| Msg-id | 8e04b5820811221354j4a19b6ddk9b9ba60e3a6bb2a4@mail.gmail.com |
| In response to | Re: Using Postgres to store high volume streams of sensor readings ("Scott Marlowe" <scott.marlowe@gmail.com>) |
| Responses | Re: Using Postgres to store high volume streams of sensor readings; Re: Using Postgres to store high volume streams of sensor readings |
| List | pgsql-general |
On Sat, Nov 22, 2008 at 11:51 PM, Scott Marlowe <scott.marlowe@gmail.com> wrote:
> On Sat, Nov 22, 2008 at 2:37 PM, Ciprian Dorin Craciun
> <ciprian.craciun@gmail.com> wrote:
>>
>> Hello all!
> SNIP
>> So I would conclude that relational stores will not make it for
>> this use case...
>
> I was wondering if you guys are having to do all individual inserts
> or if you can batch some number together into a transaction. Being
> able to put more than one into a single transaction is a huge win
> for pgsql.
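(For reference, a minimal sketch of that kind of transaction batching with psycopg2; the "readings" table, its columns, and the sample data are made-up placeholders, not anything from this thread:)

    import psycopg2

    conn = psycopg2.connect("dbname=sensors")
    # Hypothetical batch of 5000 (client, sensor, ts, value) rows.
    rows_batch = [(1, 7, 1227400000 + i, 42.0) for i in range(5000)]
    with conn:  # the with-block commits once on exit -> one transaction
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO readings (client, sensor, ts, value)"
                " VALUES (%s, %s, %s, %s)",
                rows_batch,
            )
    conn.close()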
I'm aware of the performance difference between one insert at a
time and x inserts batched into one operation / transaction. That is
why, in the case of Postgres, I am using COPY <table> FROM STDIN with
batches of 5k rows... (I've also tried 10k, 15k, 25k, 50k, 500k, and
1m inserts per batch, with no improvement...)
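(For concreteness, a minimal sketch of that COPY <table> FROM STDIN batching via psycopg2's copy_from; the "readings" table, its columns, and the stream_of_readings() generator are assumptions for the example, not code from this thread:)

    import io
    import psycopg2

    BATCH = 5000  # the 5k batch size mentioned above

    def stream_of_readings():
        # Hypothetical stand-in for the real sensor stream.
        for i in range(12500):
            yield (1, 7, 1227400000 + i, 42.0)

    def flush(cur, buf):
        # Send one buffered batch through COPY readings FROM STDIN.
        buf.seek(0)
        cur.copy_from(buf, "readings",
                      columns=("client", "sensor", "ts", "value"))

    conn = psycopg2.connect("dbname=sensors")
    cur = conn.cursor()
    buf, count = io.StringIO(), 0
    for client, sensor, ts, value in stream_of_readings():
        # copy_from expects tab-separated text rows.
        buf.write("%s\t%s\t%s\t%s\n" % (client, sensor, ts, value))
        count += 1
        if count == BATCH:
            flush(cur, buf)
            conn.commit()  # one transaction per 5k-row batch
            buf, count = io.StringIO(), 0
    if count:
        flush(cur, buf)  # final partial batch
        conn.commit()
    cur.close()
    conn.close()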
Ciprian Craciun.