Re: large database

From: Nathan Clayton
Subject: Re: large database
Date:
Msg-id: CAKVk3xyRcb8bJJ0vE0_sxGZQQXse_64yGxVfM1N_Tt9KX2vtJQ@mail.gmail.com
In reply to: Re: large database  (Adrian Klaver <adrian.klaver@gmail.com>)
List: pgsql-general


On Dec 11, 2012 2:25 PM, "Adrian Klaver" <adrian.klaver@gmail.com> wrote:
>
> On 12/11/2012 01:58 PM, Mihai Popa wrote:
>>
>> On Tue, 2012-12-11 at 10:00 -0800, Jeff Janes wrote:
>>>
>>> On Mon, Dec 10, 2012 at 12:26 PM, Mihai Popa <mihai@lattica.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I've recently inherited a project that involves importing a large set of
>>>> Access mdb files into a Postgres or MySQL database.
>>>> The process is to export the mdb's to comma separated files, then import
>>>> those into the final database.
>>>> We are now at the point where the csv files are all created and amount
>>>> to some 300 GB of data.
>>>
>>>
>>> Compressed or uncompressed?
>>
>>
>> uncompressed, but that's not much relief...
>> and it's 800GB not 300 anymore. I still can't believe the size of this
>> thing.
>
>
> Are you sure the conversion process is working properly?
>
Another question: is there a particular reason you're converting to CSV before importing the data?

All major ETL tools that I know of, including the major open source ones (Pentaho / Talend), can move data directly from Access databases into PostgreSQL. In addition, provided the table names are the same across the Access files, you can iterate over all of the Access files in a directory in one pass.
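As a rough illustration of that iterate-over-a-directory idea, here is a minimal sketch that generates one load pipeline per (file, table) pair. It assumes the mdb-tools CLI (`mdb-export`) and psql's `\copy`, neither of which is named in this thread (the thread mentions Pentaho/Talend); the directory, table list, and connection string are placeholders.

```python
# Sketch: one "mdb-export | psql \copy" pipeline per (Access file, table).
# Assumes mdb-tools' mdb-export and a psql connection string -- both are
# illustrative choices, not something the thread itself prescribes.
import shlex
from pathlib import Path


def build_load_commands(mdb_dir, tables, dsn):
    """Return shell pipelines that stream each Access table into Postgres."""
    commands = []
    for mdb in sorted(Path(mdb_dir).glob("*.mdb")):
        for table in tables:
            # mdb-export emits CSV with a header row; \copy reads it from stdin.
            copy_sql = f"\\copy {table} FROM STDIN WITH (FORMAT csv, HEADER true)"
            commands.append(
                f"mdb-export {shlex.quote(str(mdb))} {shlex.quote(table)}"
                f" | psql {shlex.quote(dsn)} -c {shlex.quote(copy_sql)}"
            )
    return commands
```

Streaming the export straight into `\copy` like this also avoids materializing hundreds of gigabytes of intermediate CSV on disk, which is the main appeal of skipping the CSV step.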
