Re: populate table with large csv file

From: Dave [Hawk-Systems]
Subject: Re: populate table with large csv file
Date:
Msg-id: DBEIKNMKGOBGNDHAAKGNIENDFBAC.dave@hawk-systems.com
In reply to: Re: populate table with large csv file  ("P.J. \"Josh\" Rovero" <rovero@sonalysts.com>)
Responses: Re: populate table with large csv file  (Ron Johnson <ron.l.johnson@cox.net>)
List: pgsql-general
>> aside from parsing the csv file through a PHP interface, what is the
>> easiest way to get that csv data imported into the postgres database.
>> thoughts?
>
>Assuming the CSV file data is well formed, use psql and
>the COPY command.
>
>In psql, create the table.  Then issue command:
>
>copy <tablename> from 'filename' using delimiters ',';

perfect solution that was overlooked.
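For reference, the bulk-load idea behind COPY can be sketched in Python against an in-memory SQLite database (SQLite stands in for Postgres here, since COPY runs server-side; the table name, columns, and sample rows are made up for illustration):

```python
import csv
import io
import sqlite3

# Hypothetical CSV contents standing in for the real 143 MB file.
csv_data = "1,alice,30\n2,bob,25\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (id INTEGER, name TEXT, age INTEGER)")

# One bulk insert per file, roughly what COPY does inside the server:
# parse each comma-delimited line into a row and load it into the table.
rows = csv.reader(io.StringIO(csv_data))
conn.executemany("INSERT INTO people VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM people").fetchone()[0])
```

The point is the same as COPY's: hand the whole file to one bulk operation instead of issuing a statement per row from application code.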

Unfortunately, processing the 143 MB file (which would result in a database of
approx. 500 MB) takes an eternity.  As luck would have it we can get away with
just dropping to an exec and doing a cat/grep for any data we need...  takes
2-3 seconds.
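That exec-and-grep approach amounts to scanning the file for matching lines instead of loading it at all; a minimal sketch (the file name, key column, and sample rows are hypothetical):

```python
import csv
import io

# Hypothetical CSV contents; the real case would read the file from disk.
csv_data = "1001,widget,4.50\n1002,gadget,9.99\n1003,widget,2.25\n"

# Equivalent of `grep '^1002,' data.csv`: keep only rows whose first
# field matches, without importing anything into a database.
matches = [row for row in csv.reader(io.StringIO(csv_data)) if row[0] == "1002"]
print(matches)
```

For a handful of lookups against a file this size, a linear scan like this beats a full import; the trade-off flips once you need repeated ad-hoc queries.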

the copy command is definitely a keeper, as I am now looking at replacing code
elsewhere with a simpler model using that.

Thanks

Dave



In pgsql-general by date:

Previous
From: "Nigel J. Andrews"
Date:
Message: Re:
Next
From: Curtis Stanford
Date:
Message: Re: Good way to insert/update when you're not sure of duplicates?