How does full text searching tokenize words? Can it be altered?

From: Jonathan Vanasco
Subject: how does full text searching tokenize words? can it be altered?
Date:
Msg-id: 93848FE9-AECD-4044-AB1E-AD612FCDB09A@2xlp.com
List: pgsql-general
I'm getting a handful of "can not index words longer than 2047 characters" notices on my `gin` indexes.

1. Does this 2047-character count correspond to tokens / indexed words?
2. If so, is there a way to lower this number?
3. Is there a way to profile the index for the frequency of tokens?


(Apologies in advance if this looks familiar; I posted this as part of a larger question last month. Everything but this was answered by the list, and I can't find answers to this online.)
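For what it's worth, a sketch of how I've been poking at this so far. As I understand it, tokenization is done by the parser of the active text search configuration, and `ts_debug` shows the tokens and lexemes it produces; `ts_stat` can profile lexeme frequency over an existing `tsvector` column, which seems relevant to question 3. The table name `docs` and column name `body_tsv` below are hypothetical placeholders:

```sql
-- Show how a configuration's parser tokenizes a sample string:
SELECT alias, token, lexemes
FROM ts_debug('english', 'The quick brown foxes jumped over 42 lazy dogs');

-- Profile lexeme frequency across an existing tsvector column
-- (ndoc = documents containing the lexeme, nentry = total occurrences):
SELECT word, length(word) AS len, ndoc, nentry
FROM ts_stat('SELECT body_tsv FROM docs')
ORDER BY len DESC
LIMIT 20;
```

Sorting by `length(word)` as above at least surfaces the longest lexemes that are approaching the limit, though I haven't found a runtime setting that lowers the limit itself.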



