Re: Importing *huge* mysql database into pgsql
| From | Harald Fuchs |
|---|---|
| Subject | Re: Importing *huge* mysql database into pgsql |
| Date | |
| Msg-id | pur6s2mg2w.fsf@srv.protecting.net |
| In reply to | Importing *huge* mysql database into pgsql (".ep" <erick.papa@gmail.com>) |
| List | pgsql-general |
In article <1173191066.416664.320470@n33g2000cwc.googlegroups.com>, ".ep" <erick.papa@gmail.com> writes:

> Hello,
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
>
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.
>
> Is there anything else?

If you really want to convert a *huge* MySQL database (and not your tiny 5M-record thingie), I'd suggest "mysqldump -T". This creates, for each table, an .sql file containing just the DDL and a .txt file containing the data.

Then edit all the .sql files:

* Fix type and index definitions etc.
* Append a "COPY thistbl FROM 'thispath/thistbl.txt';"

Then run all the .sql files with psql, in an order dictated by the foreign keys. A sketch of the whole sequence follows below.
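Here is a minimal sketch of that workflow, assuming a database named mydb, a dump directory /tmp/dump, and the placeholder table name thistbl from the post; adjust names and paths to your setup, and note that `mysqldump --tab` writes the .txt files on the MySQL server host and needs the FILE privilege.

```sh
# 1. Dump each table of mydb as DDL (.sql) plus tab-separated data (.txt).
#    --tab is the long form of -T; the directory must be writable by mysqld.
mysqldump --tab=/tmp/dump mydb

# 2. After hand-editing the type/index definitions in each .sql file,
#    append a COPY command so psql loads the matching data file, e.g.:
echo "COPY thistbl FROM '/tmp/dump/thistbl.txt';" >> /tmp/dump/thistbl.sql

# 3. Run each edited file with psql, ordering them so that referenced
#    tables are created and loaded before the tables whose foreign keys
#    point at them.
psql -d mydb -f /tmp/dump/thistbl.sql
```

Server-side COPY ... FROM 'file' requires a PostgreSQL superuser; otherwise use psql's \copy, which reads the file on the client instead.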