Discussion: Insert multiple Rows
Hi, I have a program that needs to send a PostgreSQL database a lot of rows.
I'm doing it with a FOR loop and an INSERT, something like this:

for i = 0 to 1000
{
    insert into table(item1) VALUES (i);
}

As you can imagine, this consumes a lot of resources and runs very slowly,
so I'm looking for a command to send more than one row at a time, something
like:

insert into table (item1) values ('1' / '2' / '3' / '4' ... / '1000');

The data I want to send comes from variables in the program. Copying local
variables is fast, so I can build my query in a variable with all the data
and then send just one query to the database.

I was looking in the PostgreSQL documentation and found nothing. I think
that what I'm looking for doesn't exist; you are my last resort. Is it
possible to do what I want?

BTW: I found a command in Postgres to pass a file with all the info, but
the database is not on the local machine and it's a little hard to send it
a file.
* Saint X <saintxplo@yahoo.com.ar> [03.03.2003 19:21]:
> I was looking in the PostgreSQL documentation and found nothing. I think
> that what I'm looking for doesn't exist; you are my last resort. Is it
> possible to do what I want?

Search the documentation for the COPY command. Note 'stdin' and 'stdout':

...
stdin
    Specifies that input comes from the client application.
stdout
    Specifies that output goes to the client application.
...

--
Victor Yegorov
On Mon, 2003-03-03 at 16:38, Saint X wrote:
> Hi, I have a program that needs to send a PostgreSQL database a lot of
> rows. I'm doing it with a FOR loop and an INSERT, something like this:
>
> for i = 0 to 1000
> {
>     insert into table(item1) VALUES (i);
> }
>
> As you can imagine, this consumes a lot of resources and runs very
> slowly, so I'm looking for a command to send more than one row at a
> time, something like:
>
> insert into table (item1) values ('1' / '2' / '3' / '4' ... / '1000');

Have you started a transaction first? If not, every INSERT will be a
separate transaction, which will take much longer. Do:

BEGIN;
INSERT ...;
INSERT ...;
... many times ...
COMMIT;

> The data I want to send comes from variables in the program. Copying
> local variables is fast, so I can build my query in a variable with all
> the data and then send just one query to the database.
>
> BTW: I found a command in Postgres to pass a file with all the info,
> but the database is not on the local machine and it's a little hard to
> send it a file.

Using COPY will be even faster. To send a file from the remote system,
run psql on the remote system and use \copy rather than COPY.

--
Oliver Elphick                                Oliver.Elphick@lfix.co.uk
Isle of Wight, UK                             http://www.lfix.co.uk/oliver
GPG: 1024D/3E1D0C1C: CA12 09E0 E8D5 8870 5839 932A 614D 4C34 3E1D 0C1C
========================================
"A new commandment I give to you, that you love one another,
 even as I have loved you."                              John 13:34
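[Editor's note: the batching advice above can be sketched as a C helper that
builds one query string containing BEGIN, many INSERTs, and COMMIT, which can
then be sent in a single PQexec call. This is a sketch only: the function name
build_batch and the buffer size are made up for illustration, and the poster's
`table` is a reserved word in SQL, so `table1` is used instead.]

```c
#include <stdio.h>
#include <string.h>

/* Build one query string containing BEGIN, many INSERTs, and COMMIT,
 * so the whole batch runs in a single transaction and can be sent to
 * the server with one call, e.g. PQexec(conn, buf).
 * Returns 0 on success, -1 if the buffer is too small. */
int build_batch(char *buf, size_t buflen, int nrows)
{
    size_t used = 0;
    int n = snprintf(buf, buflen, "BEGIN;\n");
    if (n < 0 || (size_t)n >= buflen)
        return -1;
    used += (size_t)n;

    for (int i = 0; i < nrows; i++) {
        n = snprintf(buf + used, buflen - used,
                     "INSERT INTO table1 (item1) VALUES (%d);\n", i);
        if (n < 0 || (size_t)n >= buflen - used)
            return -1;          /* buffer too small */
        used += (size_t)n;
    }

    n = snprintf(buf + used, buflen - used, "COMMIT;\n");
    if (n < 0 || (size_t)n >= buflen - used)
        return -1;
    return 0;
}
```

Sending the batch as one string also saves a network round trip per row,
which matters here since the database is on a remote machine.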
Hi,

Saint X wrote:
> As you can imagine, this consumes a lot of resources and runs very
> slowly, so I'm looking for a command to send more than one row at a
> time, something like:
>
> insert into table (item1) values ('1' / '2' / '3' / '4' ... / '1000');

You can do this instead:

insert into table (item1) values ('1');
insert into table (item1) values ('2');
....
insert into table (item1) values ('1000');

as a single query. But it is better to use the COPY command or a prepared
statement. Inserts are commonly slow due to indexes and foreign keys.

--
Olleg Samoylov
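[Editor's note: for readers with a newer server, PostgreSQL 8.2 and later
(released after this thread) accept a multi-row VALUES list, which is
essentially the syntax the original poster was asking for. A minimal sketch
of building such a statement in C; build_multirow_insert is a hypothetical
name, and `table1` again stands in for the poster's reserved word `table`.]

```c
#include <stdio.h>
#include <string.h>

/* Build a single multi-row INSERT:
 *   INSERT INTO table1 (item1) VALUES (0),(1),...,(nrows-1);
 * PostgreSQL 8.2+ accepts this syntax.
 * Returns 0 on success, -1 if the buffer is too small. */
int build_multirow_insert(char *buf, size_t buflen, int nrows)
{
    size_t used = 0;
    int n = snprintf(buf, buflen, "INSERT INTO table1 (item1) VALUES ");
    if (n < 0 || (size_t)n >= buflen)
        return -1;
    used += (size_t)n;

    for (int i = 0; i < nrows; i++) {
        n = snprintf(buf + used, buflen - used, "%s(%d)",
                     i == 0 ? "" : ",", i);
        if (n < 0 || (size_t)n >= buflen - used)
            return -1;
        used += (size_t)n;
    }

    n = snprintf(buf + used, buflen - used, ";");
    if (n < 0 || (size_t)n >= buflen - used)
        return -1;
    return 0;
}
```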
On Tuesday 04 Mar 2003 10:54 am, Oleg Samoylov wrote:
> Saint X wrote:
> > [snip]
>
> You can do this instead:
>
> insert into table (item1) values ('1');
> insert into table (item1) values ('2');
> ....
> insert into table (item1) values ('1000');
>
> as a single query. But it is better to use the COPY command or a
> prepared statement. Inserts are commonly slow due to indexes and
> foreign keys.

If you use a 'prepared' insert within a transaction, would that speed
things up - maybe by deferring index updates?

--
Gary Stainburn

This email does not contain private or confidential material as it
may be snooped on by interested government parties for unknown
and undisclosed purposes - Regulation of Investigatory Powers Act, 2000
> If you use a 'prepared' insert within a transaction, would that speed
> things up - maybe by deferring index updates?

I doubt it. From: Neil Conway <neilc@samurai.com>:

... I'd suspect that for most INSERT queries, the
parsing/rewriting/planning stages are relatively inexpensive, and the
bulk of the runtime is spent in the executor (and writing WAL records,
etc). So my guess would be that you won't see that much of a
performance improvement from this...

I've noticed a real performance boost using COPY, something like:

PQexec(conn, "COPY xxx FROM stdin");
for (...)
{
    sprintf(buf, "%d\t%d\t...\n", values[0], values[1], ..., values[n]);
    PQputline(conn, buf);
}
PQputline(conn, "\\.\n");
PQendcopy(conn);

It runs like hell.

Regards, Christoph
Hi

HTH:

COPY [ BINARY ] table [ WITH OIDS ]
    FROM { 'filename' | stdin }
    [ [ USING ] DELIMITERS 'delimiter' ]
    [ WITH NULL AS 'null_string' ]

COPY [ BINARY ] table [ WITH OIDS ]
    TO { 'filename' | stdout }
    [ [ USING ] DELIMITERS 'delimiter' ]
    [ WITH NULL AS 'null_string' ]

Regards
Mark Carew
Brisbane Australia
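[Editor's note: the rows fed to COPY ... FROM stdin (as in Christoph's
PQputline example above) must use COPY's text format: tab-delimited
columns, one row per line, \N for NULL, with backslash, tab, and newline
escaped. A sketch of formatting one field; copy_field is a hypothetical
helper name, and a complete implementation would also escape carriage
returns.]

```c
#include <stdio.h>
#include <string.h>

/* Format one field for COPY's text mode: NULL becomes \N, and
 * backslash, tab, and newline are backslash-escaped.  Returns the
 * number of bytes written (not counting the terminating NUL). */
size_t copy_field(char *out, size_t outlen, const char *val)
{
    size_t used = 0;
    if (val == NULL)
        return (size_t)snprintf(out, outlen, "\\N");
    for (; *val && used + 2 < outlen; val++) {
        switch (*val) {
        case '\\': out[used++] = '\\'; out[used++] = '\\'; break;
        case '\t': out[used++] = '\\'; out[used++] = 't';  break;
        case '\n': out[used++] = '\\'; out[used++] = 'n';  break;
        default:   out[used++] = *val;                     break;
        }
    }
    out[used] = '\0';
    return used;
}
```

Fields formatted this way can be joined with real tab characters and
terminated with a newline before being handed to PQputline.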