Re: large resultset

From: AI Rumman
Subject: Re: large resultset
Msg-id: AANLkTikId6xwBgFhSxQIOpF0rhKps9yCfrUCcdcZeDT4@mail.gmail.com
In reply to: Re: large resultset (Andrew McMillan <andrew@morphoss.com>)
Responses: Re: large resultset (vinny <vinny@xs4all.nl>)
           Re: large resultset (Andrew McMillan <andrew@morphoss.com>)
List: pgsql-php
Thanks a lot.
Actually, I am new to PostgreSQL.
I am using PostgreSQL 8.1.
The application shows the administrator all of their email activities; that's why there are 2 million records. It is a CRM application.

On Tue, Jun 15, 2010 at 4:37 PM, Andrew McMillan <andrew@morphoss.com> wrote:
On Tue, 2010-06-15 at 16:01 +0600, AI Rumman wrote:
> No. I need to send 2 million records. I want to know what is the best
> possible way to send these records?
> How should I write the plpgsql procedure to send records one by one to
> improve the response time for the users?

I don't think you're providing enough information for us to help you.

Your problem with two million records might be:

* But it takes so long to loop through them...
* I run out of memory receiving the resultset from the far end.
* How do I optimise this SQL query that fetches 2 million records?
* Or (likely) something I haven't considered.

Your 'How' question might be:

* Should I be using a cursor to access these efficiently, by sending
data in several chunks?

* How can I write this so I don't waste my time if the person on the far
end gave up waiting?

Etc.
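The cursor idea in the list above can be sketched roughly like this: a loop that pulls the result set in fixed-size batches via the DB-API `fetchmany()` call, so the client never materialises all 2 million rows at once. The `fetch_in_chunks` helper and the `FakeCursor` stand-in are purely illustrative (not from this thread); with a real PostgreSQL driver you would additionally want a *named* (server-side) cursor so the unfetched rows stay on the server between batches.

```python
def fetch_in_chunks(cursor, chunk_size=10000):
    """Yield rows in batches so the client never holds the whole result set.

    Works with any DB-API-style cursor exposing fetchmany(); with
    PostgreSQL a named (server-side) cursor keeps unfetched rows
    on the server between calls.
    """
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            break
        for row in rows:
            yield row


# Stand-in for a real database cursor, so this sketch is self-contained.
class FakeCursor:
    def __init__(self, rows):
        self._rows = list(rows)
        self._pos = 0

    def fetchmany(self, size):
        batch = self._rows[self._pos:self._pos + size]
        self._pos += size
        return batch


cur = FakeCursor(range(25))
streamed = list(fetch_in_chunks(cur, chunk_size=10))  # fetched as 10 + 10 + 5
```

The point of the pattern is that memory use is bounded by `chunk_size`, not by the total row count, and the first rows reach the user before the last ones are even fetched.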


Fundamentally, sending 2 million of anything can get problematic pretty
darn quickly, unless the 'thing' is less than 100 bytes.


My personal favourite would be to write a record somewhere saying 'so
and so wants these 2 million records', and give the user a URL where
they can fetch them from.  Or e-mail them to the user, or... just about
anything, except try and generate them in-line with the page, in a
reasonable time for their browser to not give up, or their proxy to not
give up, or their ISP's transparent proxy to not give up.
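The "record the request, hand back a URL" idea above could look something like this minimal sketch. Everything here is hypothetical (the in-memory `EXPORT_REQUESTS` dict stands in for whatever table or queue the request record lives in, and the names are invented): the web request only records what was asked for and returns a link, while a separate worker or cron job runs the heavy query, writes the file, and marks the request ready.

```python
import uuid

# In-memory stand-in for the "record somewhere" store (in practice,
# a database table or job queue that a background worker polls).
EXPORT_REQUESTS = {}

def request_export(user_id, query_params):
    """Record that a user asked for a big export and hand back a fetch URL.

    The expensive 2-million-row query runs later in a background worker,
    which writes the result to a file and flips status to "ready"; the
    user downloads it from the returned URL once it is.
    """
    token = uuid.uuid4().hex
    EXPORT_REQUESTS[token] = {
        "user": user_id,
        "params": query_params,
        "status": "pending",  # a worker flips this to "ready" when done
    }
    return f"/exports/{token}"

url = request_export("admin", {"mailbox": "all"})
```

This keeps the page response instant regardless of how long the export takes, and sidesteps every timeout in the chain (browser, proxy, transparent proxy).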

Why do they want 2 million records anyway?  2 million of what?  Will
another user drop by 10 seconds later and also want 2 million records?
The same 2 million?  Why does the user want 2 million records?  Is there
something that can be done to the 2 million records to make them a
smaller but more useful set of information?


Hopefully this stream of consciousness has some help buried in it
somewhere :-)


Cheers,
                                       Andrew McMillan.


--
------------------------------------------------------------------------
http://andrew.mcmillan.net.nz/                     Porirua, New Zealand
Twitter: _karora                                  Phone: +64(272)DEBIAN
           Water, taken in moderation cannot hurt anybody.
                            -- Mark Twain

------------------------------------------------------------------------

