Oh heck yeah. So here's a script for you (I wrote it mostly in the mail
client, so you'll have to do your own testing). Schedule it in cron with a
line like:
@daily /....command
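In a crontab that could look something like this (the script path and the
MAILTO address are placeholders, not from the original post; MAILTO only
controls where cron sends the job's own output):

```
# where cron reports errors from the job (hypothetical address)
MAILTO=admin@somewhere
# run the backup script once a day (hypothetical path)
@daily /usr/local/sbin/pg_mail_backup.sh
```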
#!/bin/sh
rm -rf /tmp/pg_data.*
TMPDIR=`mktemp -d /tmp/pg_data.XXX`
SENDTO=someone@somewhere
# Create a subdirectory for the dumps inside the temp dir
mkdir $TMPDIR/postgres
# dump each database schema/data separately
su -l postgresql -c "psql -At -F ' ' -U postgresql -d template1 <<__END__
SELECT datname FROM pg_database WHERE datallowconn;
__END__
" | while read DB; do
echo "PostgreSQL db $DB"
mkdir -p $TMPDIR/postgres/$DB
# schema
su -l postgresql -c "pg_dump -Cs -F c -Z 9 -S postgresql $DB" \
> $TMPDIR/postgres/$DB/schema.pg
# data
su -l postgresql -c "pg_dump -bd -F c -Z 9 -S postgresql $DB" \
> $TMPDIR/postgres/$DB/data.pg
done
# dump all globals (users/groups)
su -l postgresql -c "pg_dumpall -g" \
> $TMPDIR/postgres/globals.sql
# Create an archive, bzip2 it, and mail it
tar cf - $TMPDIR/postgres | \
bzip2 -9 | \
perl -Mstrict -Mwarnings -MMIME::Base64 -e \
'my $buf; binmode STDIN; while (read(STDIN, $buf, 60*57)) { print encode_base64($buf) }' | \
mail -s "Pg backup" $SENDTO
Joshua b. Jore ; http://www.greentechnologist.org
On 7 Jul 2002, tony wrote:
> I have a small database: dump file of about 1.6 Mb
>
> I want to do an offsite backup every weekday. I thought of rsync but the
> client runs only Mac desktops.
>
> I would like to bzip the dump file and send it by mail using cron. Is
> that easy to do?
>
> Cheers
>
> Tony Grant
>
> --
> RedHat Linux on Sony Vaio C1XD/S
> http://www.animaproductions.com/linux2.html
> Macromedia UltraDev with PostgreSQL
> http://www.animaproductions.com/ultra.html