execute many for each commit

From: Alessandro Gagliardi
Subject: execute many for each commit
Date:
Msg-id: CAAB3BBJRcHi-mCHwOTuQwDR32xwe-NkZQghhZ4mtYZxhHF0L-w@mail.gmail.com
Responses: Re: execute many for each commit (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-novice
This is really more of a psycopg2 question than a PostgreSQL question per se, but hopefully there are a few Pythonistas on this list who can help me out. At a recent PUG meeting I was admonished about the folly of committing after every execute statement (especially when I'm executing hundreds of inserts per second). I was thinking of batching a bunch of execute statements (say, 1000) before running a commit, but the problem is that if any one of those inserts fails (say, because of a unique_violation, which happens quite frequently) then I have to roll back the whole batch. Then I'd have to come up with some logic to retry each one individually, or something similarly complicated.
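[Editor's note: a minimal sketch of the batch-then-commit pattern described above might look like the following. The table, columns, sample data, and connection string are made-up placeholders, not taken from the original message.]

import psycopg2

BATCH_SIZE = 1000  # hypothetical batch size from the question

def flush(conn, cur, batch):
    """Execute a batch of inserts, committing once at the end."""
    try:
        for params in batch:
            # "events" and its columns are hypothetical stand-ins.
            cur.execute("INSERT INTO events (id, payload) VALUES (%s, %s)", params)
        conn.commit()
    except psycopg2.IntegrityError:
        # A single unique_violation aborts the transaction, so the whole
        # batch is lost and has to be rolled back and retried somehow.
        conn.rollback()

conn = psycopg2.connect("dbname=mydb")  # hypothetical DSN
cur = conn.cursor()
rows = [(1, "a"), (2, "b"), (1, "duplicate")]  # sample data; the last id collides
batch = []
for row in rows:
    batch.append(row)
    if len(batch) >= BATCH_SIZE:
        flush(conn, cur, batch)
        batch = []
if batch:
    flush(conn, cur, batch)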

I look at this problem and I think, "I must be doing something wrong." But I can't figure out what it is. The closest thing to an answer I could find using Google was http://stackoverflow.com/questions/396455/python-postgresql-psycopg2-interface-executemany which didn't really provide any good solution at all. Perhaps someone here knows better? Or perhaps that advice was wrong, and I simply do have to do a commit after each insert?
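[Editor's note: one workaround sometimes suggested for this situation, not something proposed in the message itself, is to wrap each insert in a savepoint so that a unique_violation discards only that one row while the rest of the batch survives to be committed together. A sketch, again with hypothetical table and column names:]

import psycopg2

def insert_batch(conn, batch):
    """Insert a batch inside one transaction, skipping duplicate rows."""
    cur = conn.cursor()
    for params in batch:
        cur.execute("SAVEPOINT sp")
        try:
            cur.execute("INSERT INTO events (id, payload) VALUES (%s, %s)", params)
        except psycopg2.IntegrityError:
            # Duplicate key: roll back just this row, keep the rest of the batch.
            cur.execute("ROLLBACK TO SAVEPOINT sp")
        else:
            cur.execute("RELEASE SAVEPOINT sp")
    conn.commit()  # one commit per batch instead of one per insert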
