Loading the database, question about my method
| From | John McKown |
|---|---|
| Subject | Loading the database, question about my method |
| Date | |
| Msg-id | Pine.LNX.4.21.0008181840590.6484-100000@linux2.johnmckown.net |
| List | pgsql-general |
I have a sequential file on my work system (OS/390). This file contains multiple record types, which I want to load into multiple tables in a PostgreSQL database. I have written a program that runs on the OS/390 system. It reads the sequential file and reformats it so that a "copy <tablename> from stdin;" is generated, followed by the reformatted records for that table. This works and is quite fast.

However, I was wondering if anybody knows how much slower it would be to use standard SQL INSERT statements instead. Basically, my input file would look something like:

BEGIN;
INSERT .....
INSERT .....
COMMIT;

Again, I kinda like this approach because it is "standard" SQL and not specific to PostgreSQL (not that I plan to use anything else, personally). The reason I'd like a standard method is that there might be some interest from other OS/390 users in this code for other databases they might have.

Just curious,
John McKown
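The two output formats described above can be sketched as follows. This is a minimal illustration, not the author's OS/390 program; the table name, column values, and helper functions are invented for the example. It emits either a PostgreSQL `COPY ... FROM stdin;` block (tab-separated rows terminated by `\.`) or a portable `BEGIN;`/`INSERT`/`COMMIT;` transaction:

```python
def as_copy(table, rows):
    """Emit a COPY ... FROM stdin block (PostgreSQL-specific, fast)."""
    lines = [f"COPY {table} FROM stdin;"]
    for row in rows:
        # COPY's text format is tab-separated, one row per line
        lines.append("\t".join(str(v) for v in row))
    lines.append("\\.")  # end-of-data marker for COPY
    return "\n".join(lines)


def as_inserts(table, rows):
    """Emit standard SQL INSERTs wrapped in one transaction (portable)."""
    def quote(v):
        # minimal quoting: numbers pass through, strings get
        # single quotes with embedded quotes doubled
        if isinstance(v, (int, float)):
            return str(v)
        return "'" + str(v).replace("'", "''") + "'"

    lines = ["BEGIN;"]
    for row in rows:
        values = ", ".join(quote(v) for v in row)
        lines.append(f"INSERT INTO {table} VALUES ({values});")
    lines.append("COMMIT;")
    return "\n".join(lines)


rows = [(1, "alpha"), (2, "o'brien")]
print(as_copy("parts", rows))
print(as_inserts("parts", rows))
```

Wrapping all the INSERTs in a single transaction, as the post suggests, avoids a per-statement commit and is the main thing keeping the portable form from being dramatically slower than COPY.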