Re: Need to tune for Heavy Write

From: Samuel Gendler
Subject: Re: Need to tune for Heavy Write
Date:
Msg-id: CAEV0TzD7==4mmNXA2a_pM3o2aw1BmD2pNg+OFkM356webohPQw@mail.gmail.com
In reply to: Need to tune for Heavy Write  (Adarsh Sharma <adarsh.sharma@orkash.com>)
Responses: Re: Need to tune for Heavy Write  (Mark Kirkwood <mark.kirkwood@catalyst.net.nz>)
List: pgsql-performance


On Wed, Aug 3, 2011 at 9:56 PM, Adarsh Sharma <adarsh.sharma@orkash.com> wrote:
Dear all,

For the last few days I have been researching PostgreSQL performance tuning because of the slow speed of my server.
My application selects about 100,000 rows from a MySQL database, processes them, and inserts them into 2 Postgres tables using about 45 connections.

It's already been mentioned, but is worth reinforcing, that if you are inserting 100,000 rows in 100,000 transactions, you'll see a huge performance improvement by doing many more inserts per transaction.  Try doing at least 500 inserts in each transaction (though you can possibly go quite a bit higher than that without any issues, depending upon what other traffic the database is handling in parallel).  You almost certainly don't need 45 connections in order to insert only 100,000 rows.  I've got a crappy VM with 2GB of RAM in which inserting 100,000 relatively narrow rows requires less than 10 seconds if I do it in a single transaction on a single connection.  Probably much less than 10 seconds, but the code I just tested with does other work while doing the inserts, so I don't have a pure test at hand.
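To illustrate the batching advice above, here is a minimal sketch (not from the original thread) of committing once per batch of 500 rows instead of once per row. It assumes Python with psycopg2; the connection string, table name, column names, and sample data are all hypothetical stand-ins for the MySQL-sourced rows described earlier.

import psycopg2

BATCH_SIZE = 500  # at least 500 inserts per transaction, as suggested above


def insert_batched(conn, rows):
    """Insert rows in batches, committing once per batch rather than per row."""
    with conn.cursor() as cur:
        for i in range(0, len(rows), BATCH_SIZE):
            cur.executemany(
                "INSERT INTO target_table (col_a, col_b) VALUES (%s, %s)",  # hypothetical table
                rows[i:i + BATCH_SIZE],
            )
            conn.commit()  # one commit per batch of 500, not per inserted row


if __name__ == "__main__":
    conn = psycopg2.connect("dbname=target user=app")  # hypothetical DSN
    sample_rows = [(n, "row-%d" % n) for n in range(100000)]  # stand-in for processed MySQL data
    insert_batched(conn, sample_rows)
    conn.close()

A single connection driving this loop replaces the 45 connections in the original setup; the batch size can be raised well above 500 depending on what other traffic the database is handling in parallel.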
