Re: Understanding memory usage

From: Damiano Albani
Subject: Re: Understanding memory usage
Date:
Msg-id: CAKys9514yvi9EcSccpLGtGzETfNE--jCwqBiJBy_6iFJ9-pwsA@mail.gmail.com
In reply to: Re: Understanding memory usage (Daniele Varrazzo <daniele.varrazzo@gmail.com>)
List: psycopg
On Thu, Oct 31, 2013 at 12:01 PM, Daniele Varrazzo <daniele.varrazzo@gmail.com> wrote:

> I easily expect a much bigger overhead in building millions of Python
> objects compared to building 20. Not only for the 37 bytes of overhead
> each string has (sys.getsizeof()), but also because of the cost for the
> GC of managing objects in the millions.

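To put rough numbers on Daniele's point, here is a quick sys.getsizeof
sketch (exact figures vary with the Python version and build; the 37
bytes quoted above matches Python 2, CPython 3 reports more):

    import sys

    # Even an empty string pays a fixed per-object header
    # (type pointer, refcount, hash, length, ...).
    print(sys.getsizeof(""))          # e.g. 49 bytes on CPython 3.x
    print(sys.getsizeof("abcdefgh"))  # payload plus the same header

    # Footprint of a million small strings vs. their raw payload:
    values = ["row-%07d" % i for i in range(1000000)]
    objects = sum(sys.getsizeof(s) for s in values)
    payload = sum(len(s) for s in values)
    print("object footprint: %.1f MB" % (objects / 1e6))
    print("raw payload:      %.1f MB" % (payload / 1e6))

And that still ignores the list holding the million references, plus
the garbage collector bookkeeping that Daniele mentions.
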
For the record, I eventually settled on a solution using pgnumpy.
As far as I could tell, it handles result sets of millions of rows with very little overhead.
Since my original goal was to feed the data into Pandas down the line, pgnumpy seems spot on.
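
In case it helps anyone, a minimal sketch of that pipeline. I am writing
the pgnumpy calls from memory of its README (a connect() / fetchall()
style API returning a NumPy structured array), so treat those names as
assumptions and check the pgnumpy documentation:

    import pandas as pd
    import pgnumpy  # connect()/fetchall() below are assumed names;
                    # verify against your installed pgnumpy version

    # The whole result set lands in one contiguous, typed NumPy buffer
    # instead of millions of per-cell Python objects.
    pg = pgnumpy.connect("dbname=test")  # hypothetical connection string
    arr = pg.fetchall("SELECT id, value FROM big_table")  # hypothetical table

    # pandas builds a DataFrame directly from a structured array;
    # numeric columns stay as typed NumPy data.
    df = pd.DataFrame(arr)
    print(df.dtypes)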

--
Damiano Albani
