Discussion: bulk insert unique constraint


bulk insert unique constraint

From:
"Janek Sendrowski"
Date:
Hi,
 
I want to insert data into my table, but only the rows that don't violate the unique constraint on my id.
I'm using a 64-bit hash for my id.
If I do one INSERT statement that inserts many rows, it doesn't do anything if one row violates the unique constraint.
Is there a faster way than using multiple INSERT statements?
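
For illustration, a minimal sketch of the behaviour described (the table and column names are assumptions, not from the original setup):

```sql
-- Assumed schema: a 64-bit hash as the primary key.
CREATE TABLE main (uid bigint PRIMARY KEY);

INSERT INTO main VALUES (1);

-- A multi-row INSERT is atomic: if any single row conflicts,
-- the whole statement fails and none of the rows are inserted.
INSERT INTO main VALUES (2), (3), (1);
-- ERROR: duplicate key value violates unique constraint
-- Rows 2 and 3 are NOT inserted either.
```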
 
Janek


Re: bulk insert unique constraint

From:
Andy Colson
Date:
On 12/30/2013 8:14 AM, Janek Sendrowski wrote:
> Hi,
>
> I want to insert data into my table, but only the rows that don't violate the unique constraint on my id.
> I'm using a 64-bit hash for my id.
> If I do one INSERT statement that inserts many rows, it doesn't do anything if one row violates the unique constraint.
> Is there a faster way than using multiple INSERT statements?
>
> Janek
>
>
>

You could:

create table junk (like main);
copy junk from ...;
create index junkpk on junk(uid);
-- _not_ a unique index, optional
analyze junk;
select uid, count(*) from junk group by uid having count(*) > 1;
-- do something about the dups
insert into main select * from junk;
drop table junk;
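
The "do something about the dups" step could, for example, be folded into the final insert: keep one row per uid from the staging table and skip uids already present in main. A sketch, assuming the same junk/main tables as above:

```sql
-- Insert one row per uid from the staging table, skipping any uid
-- that already exists in main. DISTINCT ON (uid) picks an arbitrary
-- row among duplicates unless an ORDER BY is added.
INSERT INTO main
SELECT DISTINCT ON (uid) *
FROM junk j
WHERE NOT EXISTS (SELECT 1 FROM main m WHERE m.uid = j.uid);
```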

-Andy


Re: bulk insert unique constraint

From:
"Janek Sendrowski"
Date:
Hi Salah Jubeh,
 
My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
Or is it also possible to get certain rows from the file with a regexp?
 
Janek


Re: bulk insert unique constraint

From:
salah jubeh
Date:
>>Hello Janek,

>>>which inserts many rows it doesn't do anything if one row violates the unique constraint.
>>You are inserting your data in a single transaction; this is the default behaviour.

>>To work around it, do not insert data like

>>INSERT INTO <TABLE> VALUES (),()...;

>>but
>>INSERT INTO <TABLE> VALUES ();
>>INSERT INTO <TABLE> VALUES ();
>>Also make sure each statement runs in a separate transaction.

>>> Is there a faster way than using multiple insert statements?
>>have a look at
>>COPY

>>Regards
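
The COPY suggestion above, sketched (the file path and format are assumptions for illustration). Note that COPY, like a multi-row INSERT, also aborts entirely on the first constraint violation, which is why loading into a staging table without the constraint is usually combined with it:

```sql
-- Bulk-load rows from a server-side file; much faster than many
-- single-row INSERTs. Path and format are illustrative assumptions.
COPY main (uid) FROM '/tmp/hashes.txt' WITH (FORMAT text);
```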


 
>My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
>Or is it also possible to get certain rows from the file with a regexp?

Sorry for not forwarding my reply earlier to the mailing list.

An easier option would be the one suggested elsewhere in this thread, by Andy Colson. Here is another option:

1. Disable the unique constraint: ALTER TABLE ... DROP CONSTRAINT
2. Remove the duplicates; have a look here: http://wiki.postgresql.org/wiki/Deleting_duplicates
3. Re-enable the unique constraint.
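
The three steps above can be sketched as follows (the table, column, and constraint names are assumptions; the DELETE uses the ctid technique described on the wiki page above):

```sql
-- 1. Drop the unique constraint before loading:
ALTER TABLE main DROP CONSTRAINT main_uid_key;

-- ... bulk-load the data here ...

-- 2. Remove duplicates, keeping one physical row per uid
--    (ctid identifies a row's physical location, so the comparison
--    arbitrarily keeps the "last" copy of each uid):
DELETE FROM main a
USING main b
WHERE a.uid = b.uid
  AND a.ctid < b.ctid;

-- 3. Re-create the unique constraint:
ALTER TABLE main ADD CONSTRAINT main_uid_key UNIQUE (uid);
```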

Regards


On Monday, December 30, 2013 4:47 PM, Janek Sendrowski <janek12@web.de> wrote:
Hi Salah Jubeh,
 
My files don't have this format. Would it make sense to create such a file with PHP and import the data from it?
Or is it also possible to get certain rows from the file with a regexp?

 
Janek


--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general


Re: bulk insert unique constraint

From:
"Janek Sendrowski"
Date:
Hi Salah Jubeh,
 
I'm not in a hurry ;)
Thanks for your answer. That's what I've been searching for.
 
Janek Sendrowski