Discussion: copy/dump database to text/csv files

copy/dump database to text/csv files

From: William Nolf
Date:

This is probably an easy one for most SQL users but I don't use it very often.

We have a Postgres database that was used for an application we no longer use. However, we would like to copy/dump the tables to files, text or CSV, so we can post them to SharePoint.

COPY seems to be what I want, but it only does one table at a time. There are 204 tables with schemaname = public. I need to copy the tables with data to files. I need something like a for loop which checks whether a table is populated and, if so, copies it to a tablename.csv file.

Any ideas?

Thanks

Bill Nolf | Sr. Linux Engineer (RHCE)


8401 Greensboro Dr. #930
McLean, VA 22102

O: 703-275-8461


Re: copy/dump database to text/csv files

From: John R Pierce
Date:
On 7/24/2014 12:04 PM, William Nolf wrote:
>
> This is probably an easy one for most SQL users but I don't use it
> very often.
>
> We have a Postgres database that was used for an application we no
> longer use. However, we would like to copy/dump the tables to files,
> text or CSV, so we can post them to SharePoint.
>
> COPY seems to be what I want, but it only does one table at a time.
> There are 204 tables with schemaname = public. I need to copy the
> tables with data to files. I need something like a for loop which
> checks whether a table is populated and, if so, copies it to a
> tablename.csv file.
>
> Any ideas?
>

What scripting/programming language are you best in? I'd probably whip
up a Perl script to do that.

Start with something like:

     select table_schema||'.'||table_name
     from information_schema.tables
     where table_schema not in ('pg_catalog','information_schema');

which will output a list of all table names.

I suppose even bash scripting...

     for table in $(psql -tAc "select table_schema||'.'||table_name from
         information_schema.tables where table_schema not in
         ('pg_catalog','information_schema')" dbname); do
         psql -c "\copy $table to '$table.csv' with (format csv, header)" dbname
     done
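
The original question also asked to skip empty tables. A minimal,
untested sketch extending the same loop with a per-table existence
check (it assumes "dbname" is your database, that psql can connect
without a password prompt, and plain lower-case table names):

     #!/usr/bin/env bash
     # Export every populated user table in dbname to a CSV file.
     db=dbname    # assumption: replace with your database name

     for table in $(psql -tAc "select table_schema||'.'||table_name
             from information_schema.tables
             where table_schema not in ('pg_catalog','information_schema')
               and table_type = 'BASE TABLE'" "$db"); do
         # Only export tables that contain at least one row.
         if [ "$(psql -tAc "select exists (select 1 from $table)" "$db")" = "t" ]; then
             psql -c "\copy $table to '$table.csv' with (format csv, header)" "$db"
         fi
     done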



--
john r pierce                                      37N 122W
somewhere on the middle of the left coast



Re: copy/dump database to text/csv files

From: Thomas Kellerer
Date:
William Nolf wrote on 24.07.2014 21:04:
> This is probably an easy one for most SQL users but I don't use it
> very often.
>
> We have a Postgres database that was used for an application we no
> longer use. However, we would like to copy/dump the tables to files,
> text or CSV, so we can post them to SharePoint.
>
> COPY seems to be what I want, but it only does one table at a time.
> There are 204 tables with schemaname = public. I need to copy the
> tables with data to files. I need something like a for loop which
> checks whether a table is populated and, if so, copies it to a
> tablename.csv file.
>
> Any ideas?

If you are not constrained to psql, you might want to have a look at my tool SQL Workbench/J
(http://www.sql-workbench.net/), which contains an export command to do just that: export all tables of a schema (or
multiple schemas) in a defined format. See here: http://www.sql-workbench.net/manual/command-export.html

Using that, this would be as easy as:

WbExport -outputDir=/some/export/directory
         -type=text
         -header=true
         -delimiter=','
         -sourceTable=public.*;

You also have more control over the format of the exported data than you would have with psql, I dare say.

Disclosure: I am the author of that tool.

Regards
Thomas


Re: copy/dump database to text/csv files

From: Francisco Olarte
Date:
Hi William:


On Thu, Jul 24, 2014 at 9:04 PM, William Nolf <bnolf@xceleratesolutions.com> wrote:
> We have a Postgres database that was used for an application we no
> longer use. However, we would like to copy/dump the tables to files,
> text or CSV, so we can post them to SharePoint.
> [...]

How BIG is your public schema? As this is a one-shot, I would recommend first doing a custom-format backup and then working from it (you can produce an SQL-format backup from a custom one with pg_restore).
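
For reference, a minimal sketch of that two-step approach (the names
dbname and dbname.dump are illustrative; pg_dump's -Fc selects the
custom format, and pg_restore with -a and no target database emits a
data-only SQL script):

    # Take a custom-format backup once...
    pg_dump -Fc -f dbname.dump dbname

    # ...then generate a data-only SQL script from it, offline.
    pg_restore -a -f dbname-data.sql dbname.dump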

THEN I would suggest making a mini-script (I'm partial to Perl for that, but that is due to 20+ years of hacking with it) through which to pipe a data-only SQL backup. You just wait for the 'COPY table (comma,separated,field,names) FROM stdin;' line, open the file table.whatever, write a header line if needed, then read, transform and write lines until the '\.' end-of-data marker, close the file, and repeat until the input is exhausted. Something like the following Perl pseudocode:

BEWARE, UNTESTED CODE FOR EXAMPLE PURPOSES:
OUTER:                              # Loop around the whole backup.
while (<>) {
    # pg_dump emits "COPY table (col, ...) FROM stdin;" before each data block.
    if (my ($table, $fields) = /^COPY\s+(.*?)\s*\((.*?)\)\s+FROM stdin;$/i) {
        my $fname = get_filename($table);    # Dequote, add extension.
        open(my $fh, '>', $fname) or die "open problem: $!";
        print $fh generate_header($fields);  # If needed.
      INNER:                        # Loop around a single table's data.
        while (<>) {
            /^\\\.$/ and last INNER;         # '\.' is the end-of-data marker.
            print $fh transform($_);         # Chomp, split, quote, etc.
        }
        close($fh) or die "close problem: $!";  # Disk full, ...
    }
}
# Yeah, I know the prints should be tested too.
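
For completeness, a hypothetical invocation, assuming the script above
is saved as split_copy.pl (both file names are illustrative):

    # Stream a data-only SQL backup straight from the server...
    pg_dump --data-only dbname | perl split_copy.pl

    # ...or replay it from the custom-format dump taken earlier.
    pg_restore --data-only dbname.dump | perl split_copy.pl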

Either of these should give you a file for each table, as fast as your database can send a backup or your machine can run pg_restore.

Regards.
   Francisco Olarte.

Re: copy/dump database to text/csv files

From: Marc Mamin
Date:
> This is probably an easy one for most SQL users but I don't use it
> very often.
>
> We have a Postgres database that was used for an application we no
> longer use. However, we would like to copy/dump the tables to files,
> text or CSV, so we can post them to SharePoint.
>
> COPY seems to be what I want, but it only does one table at a time.
> There are 204 tables with schemaname = public. I need to copy the
> tables with data to files. I need something like a for loop which
> checks whether a table is populated and, if so, copies it to a
> tablename.csv file.
>
> Any ideas?

You could do it with plpgsql or, if your schema is too large, generate a script with it.

Something like:

DO $$
DECLARE
  test boolean;
  rec record;
BEGIN
  FOR rec IN SELECT tablename FROM pg_tables WHERE schemaname = 'public'
  LOOP
    -- Only emit a COPY statement for tables that contain at least one row.
    EXECUTE 'select exists (select * from public.'||rec.tablename||')' INTO test;
    IF test THEN
      RAISE NOTICE 'COPY public.% TO ''%.dump'';', rec.tablename, rec.tablename;
    END IF;
  END LOOP;
END;
$$ LANGUAGE plpgsql;
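
A hedged sketch of how the generated statements could be collected into
a runnable script (file names are illustrative; note that server-side
COPY ... TO writes files as the server user and needs superuser or
similar privileges, so you may prefer to rewrite the output to use
psql's \copy instead):

    # RAISE NOTICE output goes to stderr; strip the prefix to get the script.
    psql -d dbname -f generate_copies.sql 2>&1 | sed -n 's/.*NOTICE:  //p' > copy_all.sql

    # Review copy_all.sql, then run it against the database.
    psql -d dbname -f copy_all.sql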

Regards,

Marc Mamin