Re: Import Database

From Brett W. McCoy
Subject Re: Import Database
Date
Msg-id Pine.LNX.4.30.0102051516040.30791-100000@chapelperilous.net
In response to Import Database  ("Matt" <matthewf9@aol.com (nospam)>)
List pgsql-general
On Mon, 29 Jan 2001, Matt wrote:

> I am trying to find if importing a very large delimited text file is faster
> with postgresql or mysql (with mysqlimport). Each night the transaction
> system we use completes a text file of the days activities, which must be
> loaded into a database, the speed is very important, mysqlimport takes less
> than an hour, however sometimes crashes. Is postgresql likely to be faster
> or slower at importing such vast amounts of data?

How much data are you talking about?  Megabytes?  Gigabytes?

PostgreSQL will load fairly fast if you turn off fsync and delete your
indexes and rebuild them after the import.  I haven't played with large
imports on the newer Postgres, but a couple of years ago I was importing
millions of rows into 6.5 on a lowly Pentium 200, with no indexes and with
fsync turned off.  I had to load each table separately (each one was
several million rows of plain old delimited text), and they loaded
fairly quickly -- maybe 10 or 15 minutes, just using the COPY command
inside of psql.  With fsync on and indexes in place, it took *hours* to
load and basically slowed the server to a crawl because of the I/O
overhead.
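The load procedure described above -- drop the indexes, disable fsync, bulk-load with COPY, then rebuild the indexes -- can be sketched roughly as follows (table, index, and file names here are made up for illustration; disabling fsync is done in the server configuration or at server start, not in SQL):

```sql
-- Hypothetical nightly load of a delimited activity file.

-- 1. Drop indexes first so the load doesn't pay per-row index
--    maintenance costs.
DROP INDEX daily_activity_idx;

-- 2. Bulk-load the delimited text file in one pass.  COPY avoids
--    the per-statement overhead of individual INSERTs.
COPY daily_activity FROM '/data/activity.txt' DELIMITER '|';

-- 3. Rebuild the index once, after all rows are in place.
CREATE INDEX daily_activity_idx ON daily_activity (activity_date);
```

Rebuilding an index once over the finished table is far cheaper than updating it incrementally for every row inserted, which is the main reason this ordering helps.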

-- Brett


