Discussion: Access to postgres conversion
Dear all -
I would like to know if anyone has migrated a database from MS Access to Postgres. We use Postgres 9.0.2 on Solaris. Are there any open source tools that you have used to do this task? Can you please share your experiences?
Regards
On 05/25/11 12:42 PM, akp geek wrote:
> Dear all -
>
> I would like to know if anyone has migrated a database from MS Access to Postgres. We use Postgres 9.0.2 on Solaris. Are there any open source tools that you have used to do this task? Can you please share your experiences?

How many tables are in this database? Is there more to it than just tables? (Access isn't really a database; it's a data-centric rapid application development system.) Postgres won't do the reports, forms, macros, etc. (OK, it has user-definable procedures/functions, but they aren't in VBasic, and they don't work anything like Access programs do.)

If it's just a few tables, it might be easiest to dump those tables as CSV files, then import them one by one into equivalent tables in Postgres via COPY FROM.
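A minimal sketch of that COPY FROM route; the table and file path here are hypothetical, not from the original database:

```sql
-- hypothetical target table mirroring the Access table's columns
CREATE TABLE customers (
    id      integer PRIMARY KEY,
    name    text,
    created date
);

-- server-side load: the path must be readable by the postgres server process
COPY customers FROM '/tmp/customers.csv' WITH CSV HEADER;

-- or, from psql, a client-side load that reads the file on your machine:
-- \copy customers FROM 'customers.csv' WITH CSV HEADER
```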
It's 10 tables, that's all. No reports. I will follow your suggestion. Thanks for the help.
On Wed, May 25, 2011 at 4:25 PM, John R Pierce <pierce@hogranch.com> wrote:
> If it's just a few tables, it might be easiest to dump those tables as CSV files, then import them one by one into equivalent tables in Postgres via COPY FROM.
--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
Hi.
You could try Access to PostgreSQL from Bullzip.com. I liked the dump file that this creates. Small, neat, and it's free!
Tommy.
On 2011-05-25, akp geek <akpgeek@gmail.com> wrote:
> Dear all -
>
> I would like to know if anyone has migrated a database from MS Access to Postgres. We use Postgres 9.0.2 on Solaris. Are there any open source tools that you have used to do this task? Can you please share your experiences?

mdbtools can convert Access to SQL; some scripting may be needed to translate names and type names in the SQL.
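A typical mdbtools workflow looks roughly like this; the file and table names are hypothetical, and the exact options vary between mdbtools versions:

```shell
# dump the schema as PostgreSQL-flavoured DDL
mdb-schema mydb.mdb postgres > schema.sql

# list the tables, one per line
mdb-tables -1 mydb.mdb

# export one table as CSV for COPY, or as INSERT statements with -I
mdb-export mydb.mdb Customers > customers.csv
mdb-export -I postgres mydb.mdb Customers > customers.sql
```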
On 5/25/2011 3:42 PM, akp geek wrote:
> Dear all -
>
> I would like to know if anyone has migrated a database from MS Access to Postgres. We use Postgres 9.0.2 on Solaris. Are there any open source tools that you have used to do this task? Can you please share your experiences?

I rolled my own.

If the number of rows in the MDB table is not that many (under 100k), then I'll create a new table up on pgsql, link to it with the ODBC driver, and append from the source table to the pgsql table. You can get away with larger appends if both systems are on the same network.

If it was a table with a few million rows, then I wrote a little VBA snippet that created a pg_dump-compatible SQL text file from the source data. To figure out the format, I just pg_dump'd an existing table from PostgreSQL, then patterned my SQL file after it. While it was extremely fast at doing the conversion (both generating the SQL file and the time it took for psql to process the SQL file), I only recommend that method for cases where you have millions and millions of rows. Or a lot of identical tables.

(The VBA module was about 100-150 lines of code in total.)
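The pg_dump-style COPY format mentioned above looks roughly like this: tab-separated rows between a COPY header and a `\.` terminator (the table and rows here are hypothetical):

```sql
COPY customers (id, name, created) FROM stdin;
1	Alice	1955-08-01
2	Bob	1960-12-25
\.
```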
Thanks so much. I was using Bullzip. What I felt with Bullzip was that it is good for smaller amounts of data. I have 2 tables, each of which has 2.5 million records. For me it is taking forever; the job that I set up has been running for 12 hours.

I would appreciate it if you could share the VBA code that you were mentioning. I would love to use it to make the data transfer faster.
Regards
On Thu, Jun 2, 2011 at 9:32 AM, Thomas Harold <thomas-lists@nybeta.com> wrote:
On Thu, Jun 2, 2011 at 10:01 AM, akp geek <akpgeek@gmail.com> wrote:
> Thanks so much. I was using Bullzip. What I felt with Bullzip was that it is good for smaller amounts of data. I have 2 tables, each of which has 2.5 million records. For me it is taking forever; the job that I set up has been running for 12 hours.

Export to CSV or a tab-delimited file, then suck it in with a COPY statement in Postgres. Just make sure that there is no "invalid" data like fake dates. 2.5 million rows should take a couple of minutes tops to insert on modern server hardware.
On Thu, Jun 2, 2011 at 12:01 PM, akp geek <akpgeek@gmail.com> wrote:
> The only problem I am seeing is with dates, as you mentioned. When I export the data to CSV, the date gets the format 8/1/1955 0:00:00, but Postgres is not accepting that. Any clues?

Should work:

test=> select '8/1/1955 0:00:00'::date;
    date
------------
 1955-08-01
(1 row)

Time: 0.325 ms

test=> select '8/1/1955 0:00:00'::timestamp;
      timestamp
---------------------
 1955-08-01 00:00:00
(1 row)

At worst, I guess you run the export file through an editing filter that removes the '0:00:00' off the date column.
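One way to write that editing filter, sketched in Python; the column index and the exact Access export format are assumptions about your CSV:

```python
import csv
import io
from datetime import datetime

def clean_dates(rows, date_col):
    """Rewrite 'M/D/YYYY H:MM:SS' values in date_col as ISO 'YYYY-MM-DD'."""
    for row in rows:
        value = row[date_col]
        if value:
            # parse the Access-style export value and drop the time portion
            parsed = datetime.strptime(value, "%m/%d/%Y %H:%M:%S")
            row[date_col] = parsed.strftime("%Y-%m-%d")
        yield row

# example: clean column 2 of an in-memory CSV before feeding it to COPY
src = io.StringIO("1,Alice,8/1/1955 0:00:00\n2,Bob,12/25/1960 0:00:00\n")
out = io.StringIO()
csv.writer(out).writerows(clean_dates(csv.reader(src), date_col=2))
print(out.getvalue())
```

For a real load you would read the export file, write the cleaned rows to a new file, and point COPY at that.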
The only problem I am seeing is with dates, as you mentioned. When I export the data to CSV, the date gets the format 8/1/1955 0:00:00, but Postgres is not accepting that. Any clues?
Regards
On Thu, Jun 2, 2011 at 11:23 AM, Vick Khera <vivek@khera.org> wrote: