On Mon, Oct 06, 2003 at 18:30:29 +0200,
papapep <papapep@gmx.net> wrote:
>
> I've got plenty of data files (prepared to be inserted with the \copy
> statement), but I have to filter them to make sure no duplicated rows
> get inserted.
> I know I could do it with a trigger that executes a function before
> inserting each row and, if the row is a duplicate, does something with
> it (inserts it into another table, simply discards it, etc.). The
> theory is clear :-)
> But the practice is not so clear (for me, of course).
> Can anyone give me some guidance on how the function should check for
> duplicated rows?
You might want to consider loading the data into a temporary table first,
and then using a query to handle the duplicates in bulk. That avoids
firing a trigger function once per row.
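A rough sketch of that approach, run from psql. The table name "mytable",
its key column "id", and the file name are made up for illustration; adjust
them to your schema:

    -- Create a temp table with the same structure as the target
    -- (works on older PostgreSQL versions too):
    CREATE TEMP TABLE staging AS SELECT * FROM mytable LIMIT 0;

    -- Load the raw file into the staging table:
    \copy staging FROM 'datafile.txt'

    -- Optionally save the rows that already exist in the target:
    INSERT INTO rejected
    SELECT s.* FROM staging s
    WHERE EXISTS (SELECT 1 FROM mytable t WHERE t.id = s.id);

    -- Insert only the rows not already present, collapsing any
    -- duplicates within the file itself:
    INSERT INTO mytable
    SELECT DISTINCT s.* FROM staging s
    WHERE NOT EXISTS (SELECT 1 FROM mytable t WHERE t.id = s.id);

The temp table is dropped automatically at the end of the session.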