Re: Query takes around 15 to 20 min over 20Lakh rows

From: Tom Lane
Subject: Re: Query takes around 15 to 20 min over 20Lakh rows
Date:
Msg-id: 2441707.1630624966@sss.pgh.pa.us
In response to: Re: Query takes around 15 to 20 min over 20Lakh rows  ("David G. Johnston" <david.g.johnston@gmail.com>)
Responses: Re: Query takes around 15 to 20 min over 20Lakh rows  (Shubham Mittal <mittalshubham30@gmail.com>)
List: pgsql-general
"David G. Johnston" <david.g.johnston@gmail.com> writes:
> On Thu, Sep 2, 2021 at 3:16 PM Shubham Mittal <mittalshubham30@gmail.com>
> wrote:
>> *Please help in optimizing this query. I need to actually generate reports
>> daily using this query.. It takes almost 15 to 20 min to execute this query
>> due to joins.. *

> Use jsonb_populate_recordset (or one of its siblings) to get rid of as many
> of these key-based value extraction operations as possible and build a
> table from the contents of the jsonb.
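
[A minimal sketch of that approach, with hypothetical table, column, and key
names since the original query isn't reproduced in this message: instead of
repeating (payload -> 'items' ->> 'qty')::int and similar extractions for
every field, define a row type once and expand the array in a single
lateral call.

    -- hypothetical schema: a jsonb column "payload" whose 'items' key
    -- holds an array of objects like {"item_id": 1, "qty": 2, "price": 9.99}
    CREATE TYPE line_item AS (item_id int, qty int, price numeric);

    SELECT o.order_id, li.item_id, li.qty, li.price
    FROM   orders o
    CROSS  JOIN LATERAL
           jsonb_populate_recordset(NULL::line_item,
                                    o.payload -> 'items') AS li;

The sibling jsonb_to_recordset(...) AS li(item_id int, qty int, price numeric)
works the same way without a separate type definition.]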

While those duplicative extractions sure look inefficient, it's not
clear from the (lack of) given facts whether that's the main cost,
or whether the complicated FROM clause is producing a bad plan.
I'd suggest first looking at EXPLAIN ANALYZE output to verify which
plan step(s) are slow.  If it's the final output step that's expensive,
then yes the next step is to optimize the extractions.
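
[For example, substituting the actual report query and running it with the
same parameters the daily job uses:

    EXPLAIN (ANALYZE, BUFFERS)
    SELECT ... ;  -- the full query, unchanged

The plan node whose "actual time" accounts for most of the runtime is the
one worth attacking first; a row-count estimate that is wildly off from the
actual rows is usually where a bad plan comes from.]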

Otherwise, see

https://wiki.postgresql.org/wiki/Slow_Query_Questions

            regards, tom lane


