Re: How import big amounts of data?
| From | Teemu Torma |
|---|---|
| Subject | Re: How import big amounts of data? |
| Date | |
| Msg-id | 200512291541.05739.teemu@torma.org |
| In reply to | How import big amounts of data? (Arnau <arnaulist@andromeiberica.com>) |
| Responses | Re: How import big amounts of data? |
| List | pgsql-performance |
On Thursday 29 December 2005 10:48, Arnau wrote:

> Which is the best way to import data to tables? I have to import
> 90000 rows into a column and doing it as inserts takes ages. Would it be
> faster with copy? Is there any other alternative to insert/copy?

I am doing imports twice as big daily, and found the following method most efficient (other than using copy):

- Use a plpgsql function to do the actual insert (or update/insert if needed).
- Inside a transaction, execute SELECT statements with the maximum possible number of insert function calls in one go.

This minimizes the number of round trips between the client and the server.

Teemu
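The approach described above can be sketched as follows. This is a minimal illustration, not Teemu's actual code: the table `items`, its columns, and the function name `insert_item` are all hypothetical.

```sql
-- Hypothetical target table for the import.
CREATE TABLE items (id integer, name text);

-- A plpgsql function that performs the actual insert; it could
-- instead attempt an UPDATE and fall back to INSERT if needed.
CREATE OR REPLACE FUNCTION insert_item(p_id integer, p_name text)
RETURNS void AS $$
BEGIN
    INSERT INTO items (id, name) VALUES (p_id, p_name);
END;
$$ LANGUAGE plpgsql;

-- Inside one transaction, pack many function calls into a single
-- SELECT so each round trip carries a large batch of rows.
BEGIN;
SELECT insert_item(1, 'first'),
       insert_item(2, 'second'),
       insert_item(3, 'third');  -- ...continue for the whole batch
COMMIT;
```

Batching many calls into one SELECT amortizes the client-server round trip across the batch; when the data can be expressed as a flat file, COPY remains the fastest bulk-load path.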