Re: how to make duplicate finding query faster?

From: Holger Jakobs
Subject: Re: how to make duplicate finding query faster?
Date:
Msg-id: 5bf642e4-eb22-8eba-e96d-5b33b5010b03@jakobs.com
In response to: how to make duplicate finding query faster? (Sachin Kumar <sachinkumaras@gmail.com>)
Responses: Re: how to make duplicate finding query faster?
List: pgsql-admin
On 30.12.20 at 08:36, Sachin Kumar wrote:
Hi All,

I am uploading data into PostgreSQL from a CSV file and checking whether a value already exists in the DB; if it does, a duplicate error should be returned. I am using the query mentioned below.

if Card_Bank.objects.filter(Q(ACCOUNT_NUMBER=card_number)).exists():
    flag = 2
else:
    flag = 1
It is taking too much time; the CSV contains 600k cards.

Kindly help me make the query faster.

I am using Python, Django & PostgreSQL.
--

Best Regards,
Sachin Kumar

I think it would be easier not to check for duplicates beforehand, but to let the DB complain about duplicates.

That would roughly cut the round trips to the DB in half. Instead of a check plus an insert there would be only an insert, which might fail every now and then.
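A minimal sketch of that approach, assuming ACCOUNT_NUMBER carries a UNIQUE constraint and using the Card_Bank model from your post (the function name insert_card is just for illustration):

from django.db import IntegrityError, transaction

def insert_card(card_number):
    try:
        # Attempt the insert directly; no prior exists() check.
        with transaction.atomic():
            Card_Bank.objects.create(ACCOUNT_NUMBER=card_number)
        return 1  # inserted
    except IntegrityError:
        return 2  # the UNIQUE constraint rejected the row as a duplicate

With 600k rows you could also let PostgreSQL skip duplicates in bulk, e.g. INSERT ... ON CONFLICT DO NOTHING, which avoids one round trip per card altogether.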

Regards,

Holger

-- 
Holger Jakobs, Bergisch Gladbach, Tel. +49-178-9759012