Re: Duplicate Unique Key constraint error
| From | Harpreet Dhaliwal |
|---|---|
| Subject | Re: Duplicate Unique Key constraint error |
| Date | |
| Msg-id | d86a77ef0707101233w4e5e68cbx2e6840410b935d43@mail.gmail.com |
| In reply to | Re: Duplicate Unique Key constraint error (Tom Lane <tgl@sss.pgh.pa.us>) |
| List | pgsql-general |
Thanks a lot for all your suggestions, gentlemen.
I changed it to a SERIAL column and all the pain has been automatically alleviated :)
Thanks a ton.
~Harpreet
On 7/10/07, Tom Lane <tgl@sss.pgh.pa.us> wrote:
"Harpreet Dhaliwal" < harpreet.dhaliwal01@gmail.com> writes:
> Transaction 1 started, saw max(dig_id) = 30 and inserted new dig_id=31.
> Now the time when Transaction 2 started and read max(dig_id) it was still 30
> and by the time it tried to insert 31, 31 was already inserted by
> Transaction 1 and hence the unique key constraint error.
This is exactly why you're recommended to use sequences (ie serial
columns) for generating IDs. Taking max()+1 does not work, unless
you're willing to lock the whole table and throw away vast amounts of
concurrency.
regards, tom lane
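The race described above, and the serial-column fix, can be sketched roughly as follows (the table and column names here are guesses based on the thread, not the poster's actual schema):

```sql
-- Race-prone pattern: two concurrent transactions can both read the
-- same max(dig_id) and then both try to insert the same value,
-- triggering the unique-key violation from the thread.
INSERT INTO digs (dig_id, name)
SELECT max(dig_id) + 1, 'example' FROM digs;   -- unsafe under concurrency

-- Safe alternative: let a sequence hand out IDs atomically.
CREATE TABLE digs (
    dig_id  SERIAL PRIMARY KEY,   -- backed by a sequence; nextval() never
                                  -- returns the same value twice
    name    text
);

INSERT INTO digs (name) VALUES ('example');    -- dig_id assigned automatically
```

Note that sequence values are handed out outside transactional rollback, so gaps can appear in the ID series; that is the usual trade-off for not serializing all inserts on a table lock.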