Re: Is there any limit on the number of rows to import using copy command

From Merlin Moncure
Subject Re: Is there any limit on the number of rows to import using copy command
Date
Msg-id CAHyXU0x1uW_349ONE+35KT87Ua-dX8-QaZ5Sj6eENTMwsX=okw@mail.gmail.com
In response to Re: Is there any limit on the number of rows to import using copy command  ("sivapostgres@yahoo.com" <sivapostgres@yahoo.com>)
Responses Re: Is there any limit on the number of rows to import using copy command
List pgsql-general
On Wed, Jul 23, 2025 at 2:51 AM sivapostgres@yahoo.com
<sivapostgres@yahoo.com> wrote:
>
> Tried in PostgreSQL 11.11 and PostgreSQL 15.2 on Windows 10
>
> Here we try to transfer data from one database to another (remote) database.
>
> Tables have row counts ranging from 85000 to 3600000, along with smaller tables.
> No issues while transferring smaller sized tables.
>
> I take here one particular table [table1] which has 85000 records.
> The table has a primary key, foreign key(s), and triggers.  A trigger updates another table [table2].
> Table2 has 2 triggers, one to compute a closing value and the other to delete the row if the closing value is zero.
>
> 1.  Transfer the data from source database to a csv file.  85000 records transferred. No issues.
> 2.  Transfer the file to the remote location.  No issues.
> 3.  Transfer the contents of the file to the table using the COPY FROM command. - Fails when trying to transfer all the 85000 records at once.
>
> Copy from command is
>
> Copy public.table1 From 'E:\temp\file1.csv' (FORMAT CSV, DELIMITER ',', HEADER TRUE)
>
> The above command succeeds, when
> 1.  The trigger on Table1 is disabled, with all other constraints on.
> 2.  The number of rows is 16000 or fewer, with the trigger enabled.  We haven't tried higher numbers of rows.
>
> The above command appears to go into an infinite loop, when
> 1.  We try to transfer all 85000 rows at once, with the trigger and other constraints on table1 enabled.  We waited 1.5 hrs the first time and 2.5 hrs the second time before cancelling the operation.
>
> I read in the documentation that the fastest way to transfer data is to use the COPY command.  And I couldn't find any limit in transferring data using that command.  One should easily be able to transfer millions of rows with it.

Most likely, you are getting yourself into trouble with the trigger
dependencies.  Triggers are powerful, but they can also be dangerous,
and this could be a 'wrong tool for the job' situation.

Here are some general tips:

* pg_trigger_depth(): can tell you whether trigger A calls trigger B,
which calls back to trigger A, and so on.  You can use it with RAISE
NOTICE to trace execution, and also in a WHEN clause on CREATE TRIGGER
to guard execution.
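A minimal sketch of that guard (table, column, and function names here are hypothetical, not taken from your schema):

```sql
CREATE OR REPLACE FUNCTION table1_sync() RETURNS trigger AS $$
BEGIN
  -- trace how deeply nested the trigger call is
  RAISE NOTICE 'trigger depth: %', pg_trigger_depth();
  UPDATE table2 SET closing = closing + NEW.qty WHERE id = NEW.id;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER table1_sync_trg
  AFTER INSERT ON table1
  FOR EACH ROW
  -- guard: fire only when not already inside another trigger,
  -- which breaks table1 -> table2 -> table1 recursion
  WHEN (pg_trigger_depth() = 0)
  EXECUTE FUNCTION table1_sync();
```

If the triggers on table1 and table2 keep re-invoking each other, a guard like this is often the quickest way to confirm (and stop) the loop.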

* Reconfiguring your logic as statement-level triggers can be a good
idea.  This can take some thinking, but it can be much more efficient
for bulk processing, since the work runs once per statement rather
than once per row.  (One trick is to use now() to identify records
inserted during the load, since it is stable through the transaction.)
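As a sketch of the statement-level approach (requires PostgreSQL 10+ for transition tables; the names and the qty/closing logic are assumptions, not your actual schema):

```sql
CREATE OR REPLACE FUNCTION table2_rollup() RETURNS trigger AS $$
BEGIN
  -- process the entire inserted batch in one set-based statement,
  -- instead of once per row
  UPDATE table2 t
     SET closing = t.closing + n.total
    FROM (SELECT id, sum(qty) AS total
            FROM new_rows GROUP BY id) n
   WHERE t.id = n.id;
  RETURN NULL;  -- return value is ignored for AFTER statement triggers
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER table1_rollup_trg
  AFTER INSERT ON table1
  REFERENCING NEW TABLE AS new_rows
  FOR EACH STATEMENT
  EXECUTE FUNCTION table2_rollup();
```

With this shape, a COPY of 85000 rows fires the trigger once, on the whole batch, rather than 85000 times.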

* Reconfiguring your logic as a procedure can be an even better idea:
COPY your data into staging tables (perhaps temporary, and indexed),
then write to the various tables with joins, upserts, etc.
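A rough sketch of the staging-table pattern, assuming "id" is table1's primary key (adjust the conflict target and the table2 merge to your actual schema):

```sql
-- stage the load where no triggers fire
CREATE TEMP TABLE staging_table1 (LIKE public.table1 INCLUDING DEFAULTS);

COPY staging_table1 FROM 'E:\temp\file1.csv'
  (FORMAT CSV, DELIMITER ',', HEADER TRUE);

-- merge into the real table in one set-based statement
INSERT INTO public.table1
SELECT * FROM staging_table1
ON CONFLICT (id) DO NOTHING;  -- assumes "id" is the primary key

-- then update table2's closing values once, set-based, e.g.:
-- UPDATE table2 t SET closing = ...
--   FROM (SELECT ... FROM staging_table1 GROUP BY ...) s
--  WHERE t.id = s.id;
```

This keeps COPY fast (no per-row trigger work) and moves the table2 maintenance into one or two explicit statements you control.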

merlin


