Re: Searching in varchar column having 100M records

From: Tomas Vondra
Subject: Re: Searching in varchar column having 100M records
Date:
Msg-id: 20190717124846.xbvcvfjg5hdcnaed@development
In response to: Re: Searching in varchar column having 100M records  (Sergei Kornilov <sk@zsrv.org>)
Responses: Re: Searching in varchar column having 100M records  (Andreas Kretschmer <andreas@a-kretschmer.de>)
List: pgsql-performance
On Wed, Jul 17, 2019 at 02:53:20PM +0300, Sergei Kornilov wrote:
>Hello
>
>Please recheck with track_io_timing = on in the configuration. explain
>(analyze, buffers) with this option will report how much time we spend
>on I/O.
>
>>   Buffers: shared hit=2 read=31492
>
>31492 blocks / 65 sec ~ 480 IOPS, not bad if you are using an HDD
>
>Your query reads table data from disks (well, or from OS cache). You need
>more RAM for shared_buffers or disks with better performance.
>
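
For reference, enabling the timing instrumentation Sergei suggests is a
single setting (it requires superuser, or set it in postgresql.conf); a
minimal sketch, with the table name and predicate as placeholders:

    SET track_io_timing = on;

    EXPLAIN (ANALYZE, BUFFERS)
    SELECT user_id FROM table WHERE field = 'some value';

With that enabled, the buffer statistics in the plan also include "I/O
Timings" lines, which show how much of the runtime was spent waiting on
reads.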

Either that (more RAM, or faster storage), or try creating a covering index,
so that the query can do an index-only scan. That might reduce the amount of
I/O against the table, and in the index the data should be located close
together (on the same page, or on pages close to each other).

So try something like

    CREATE INDEX ios_idx ON table (field, user_id);

and make sure the table is vacuumed often enough (so that the visibility
map is up to date).
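
If the index-only scan kicks in, a repeated EXPLAIN should show it; a
sketch, reusing the placeholder names from above:

    VACUUM (ANALYZE) table;

    EXPLAIN (ANALYZE, BUFFERS)
    SELECT user_id FROM table WHERE field = 'some value';

    -- what you want to see in the plan:
    --   Index Only Scan using ios_idx on table ...
    --     Heap Fetches: 0

A high "Heap Fetches" count means the visibility map is stale and the scan
still has to visit the heap, which defeats the purpose.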


regards

-- 
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services



