Discussion: importing csv files into postgres
Does anyone know if you can import csv text files into postgres? I have tried the COPY command, but it needs an existing table, while what I want to do is create a new table from a csv file. Also, is there any way to define the text separator, a la MSSQL, e.g. field separator = , text separator = " Thanks
mike wrote:
> Does anyone know if you can import csv text files into postgres
A perl script to do that was just posted to the GENERAL List....
Hmmmm....
Tim Uckun just posted the following:
-------
Here are the steps:
1) Create your destination table in postgres
2) Export your data to a comma delimited format
3) Save this perl script to a file
------
#!/usr/bin/perl
$inquotes = 0;
while (<>){
    # Chop the crlf
    chop ($_);
    chop ($_);

    # This first bit goes through and replaces
    # all the commas that are not in quotes with tildes
    for ($i=0 ; $i < length($_) ; $i++){
        $char=substr($_,$i,1);
        if ($char eq '"' ){
            $inquotes = not($inquotes);
        }else{
            if ( (!$inquotes) && ($char eq ",") ){
                substr($_,$i,1)="~";
            }
        }
    }
    # This replaces any quotes
    s/"//g;
    print "$_\n";
}
-----
4) Type this: cat myfile.txt | perl myprog.pl > outfile.dat
5) Go to psql
6) COPY "tablename" FROM '/path/to/outfile.dat' USING DELIMITERS '~';
-----
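For anyone without perl handy, the same conversion could be sketched in Python with the standard csv module (the sample data here is made up), which also copes with commas inside quoted fields:

```python
import csv
import io

# Convert CSV text (quoted fields allowed) into tilde-delimited
# lines with the quotes stripped -- the same transformation the
# perl script above performs.
def csv_to_tildes(text):
    out = []
    for row in csv.reader(io.StringIO(text)):
        out.append("~".join(row))
    return "\n".join(out) + "\n"

print(csv_to_tildes('mike,"hello, world",42\n'))
# mike~hello, world~42
```

The result feeds the same COPY ... USING DELIMITERS '~' command as before.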
See if that helps any.
--
Lamar Owen
WGCR Internet Radio
1 Peter 4:11
Yeah, but he doesn't want to have to create the table beforehand, he wants postgresql to do all the work. ;)

The only way that I know of to do that is to write a program and feed it the CSV file... but in the program it is going to need to create the table. Sometimes you just have to do the work yourself. :)

Adam Lang
Systems Engineer
Rutgers Casualty Insurance Company
http://www.rutgersinsurance.com

----- Original Message -----
From: "Lamar Owen" <lamar.owen@wgcr.org>
To: "mike" <mike@cyborg-group.com>
Cc: <pgsql-interfaces@postgresql.org>
Sent: Thursday, January 18, 2001 3:48 PM
Subject: Re: [INTERFACES] importing csv files into postgres

[snip]
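The kind of program Adam describes -- one that creates the table and loads the rows in one pass -- might look something like this rough Python sketch (the table name and all-text columns are assumptions; real type inference is left to the reader):

```python
import csv
import io

# Read CSV text whose first row is a header, and emit SQL that
# creates the table and COPYs the data in. Hypothetical
# simplification: every column is created as type text.
def csv_to_sql(text, table):
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    cols = ", ".join('"%s" text' % c for c in header)
    sql = ['CREATE TABLE "%s" (%s);' % (table, cols),
           'COPY "%s" FROM stdin;' % table]
    for row in reader:
        sql.append("\t".join(row))   # COPY's default tab delimiter
    sql.append("\\.")                # end-of-data marker for COPY
    return "\n".join(sql)

print(csv_to_sql("id,name\n1,mike\n", "people"))
```

Piping the output into psql would then do both the create and the load in one shot.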
Adam Lang wrote:
> Yeah, but he doesn't want to have to create the table beforehand, he wants
> postgresql to do all the work. ;)
> The only way that I know of to do that is to write a program and feed it the
> CSV file... but in the program it is going to need to create the table.

I have such a script for AOLserver Tcl that does all the work. But it requires a working AOLserver/PostgreSQL installation, which will take longer to set up than a simple perl script would take to be written. The hard part is the CSV parsing -- creating the table and filling in the data is easy in comparison, especially in the Pg perl interface.

--
Lamar Owen
WGCR Internet Radio
1 Peter 4:11
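The parsing difficulty Lamar mentions is mostly the quoted-field handling. A quick Python illustration (sample data made up) of why naive comma-splitting is not enough:

```python
import csv
import io

line = 'name,"Owen, Lamar",radio\n'

# Naive splitting breaks the quoted field in two...
naive = line.strip().split(",")

# ...while a real CSV parser keeps it intact.
parsed = next(csv.reader(io.StringIO(line)))

print(naive)   # 4 pieces
print(parsed)  # 3 fields
```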