Re: Searching in varchar column having 100M records

From: Andreas Kretschmer
Subject: Re: Searching in varchar column having 100M records
Date:
Msg-id: 5322aa5e-9913-5471-7254-c5fff6c09146@a-kretschmer.de
In response to: Re: Searching in varchar column having 100M records  (Tomas Vondra <tomas.vondra@2ndquadrant.com>)
List: pgsql-performance

On 17.07.19 at 14:48, Tomas Vondra wrote:
> Either that, or try creating a covering index, so that the query can 
> do an
> index-only scan. That might reduce the amount of IO against the table, 
> and
> in the index the data should be located close to each other (same page or
> pages close to each other).
>
> So try something like
>
>    CREATE INDEX ios_idx ON table (field, user_id);
>
> and make sure the table is vacuumed often enough (so that the visibility
> map is up to date). 

Yeah, and please don't use varchar(64) for the user_id field; use UUID 
instead, to save space on disk and for faster comparisons.
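A minimal sketch of that type change, assuming the varchar column already holds values in canonical UUID text form (the table name `t` is illustrative):

```sql
-- uuid is stored as 16 bytes, versus 36+ bytes for its textual form in a
-- varchar column, and uuid comparisons are fixed-width binary comparisons
-- rather than collation-aware string comparisons.
ALTER TABLE t
    ALTER COLUMN user_id TYPE uuid
    USING user_id::uuid;  -- fails if any existing value is not a valid UUID
```

Indexes on the column are rebuilt automatically by the `ALTER TABLE`, so the covering index discussed above would need no separate change.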


Regards, Andreas

-- 
2ndQuadrant - The PostgreSQL Support Company.
www.2ndQuadrant.com



