Discussion: Postgresql 8.2.4 crash with tsearch2
hi, I have compiled PostgreSQL 8.2.4 on Debian etch with the French Snowball stemmer. I applied the latest patch sent by Teodor Sigaev (http://www.sai.msu.su/~megera/postgres/gist/tsearch/V2/tsearch_snowball_82-20070504.gz) and my backend still crashes. I tested on two different servers, both running etch, one i386 and the other amd64. The first crashes in ts_vector when the parameter string is longer than 200 characters; the second crashes in lexize. If you have another patch to try, I will try it :) If on your side there is no bug, I don't understand what is happening on my servers. Any idea? Thanks. Regards
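For anyone trying to reproduce this, the two failing calls described above can be sketched roughly as follows. This is a sketch only: the configuration name 'default' and the to_tsvector spelling are assumptions based on the stock tsearch2 module, the 'fr' dictionary is the one built from gendict later in this thread, and repeat() just builds a string longer than 200 characters.

```sql
-- Sketch: run against an 8.2.4 server with tsearch2 and the custom 'fr'
-- Snowball dictionary installed (names are assumptions, see lead-in).
SELECT to_tsvector('default', repeat('a', 250));  -- reported to crash on the i386 server
SELECT lexize('fr', 'chose');                     -- reported to crash on the amd64 server
```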
hi, OK, it was my mistake: I forgot to pass "-i" to gendict's config.sh. Regards

On Monday, 21 May 2007 at 19:32 +0200, Philippe Amelant wrote:
> I applied the latest patch sent by Teodor Sigaev and my backend still crashes.
> [...]
hmm, not enough testing before sending this mail: the first request

select lexize('fr','chose');

works, but the server crashes on the second request (the same one).
On Tuesday, 22 May 2007 at 12:16 +0200, Philippe Amelant wrote:
> hi,
> OK, it was my mistake: I forgot to pass "-i" to gendict's config.sh.
>
> regards
Pls, check your steps or tell me where I went wrong :)
If you still have a problem, I can fix it if I get access to your
development server...
% cd PGSQL_SRC
% zcat ~/tmp/tsearch_snowball_82-20070504.gz | patch -p0
% cd contrib/tsearch2
% gmake && su -c 'gmake install' && gmake installcheck
% cd gendict
% cp ~/tmp/libstemmer_c/src_c/stem_UTF_8_french.c stem.c
% cp ~/tmp/libstemmer_c/src_c/stem_UTF_8_french.h stem.h
% ./config.sh -n fr -s -p french_UTF_8 -v -C'Snowball stemmer for French - UTF8'
% cd ../../dict_fr
% gmake && su -c 'gmake install'
% psql contrib_regression < dict_fr.sql
contrib_regression=# select lexize('fr', 'sortir'), lexize('fr', 'service'), lexize('fr', 'chose');
lexize | lexize | lexize
--------+----------+--------
{sort} | {servic} | {chos}
(1 row)
contrib_regression=# select lexize('fr',
'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaas');
lexize
----
{aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa}