Re: disfavoring unparameterized nested loops

From: Kenneth Marshall
Subject: Re: disfavoring unparameterized nested loops
Date:
Msg-id: 20210621141541.GZ32646@aart.rice.edu
In reply to: Re: disfavoring unparameterized nested loops  (John Naylor <john.naylor@enterprisedb.com>)
List: pgsql-hackers
> >
> > Most of the time when I see that happen it's down to either the
> > selectivity of some correlated base-quals being multiplied down to a
> > number low enough that we clamp the estimate to be 1 row.   The other
> > case is similar, but with join quals.
> 
> If an estimate is lower than 1, that should be a red flag that Something Is
> Wrong. This is kind of a crazy idea, but what if we threw it back the other
> way by computing 1 / est, and clamping that result to 2 <= res < 10 (or
> 100 or something)? The theory is, the more impossibly low it is, the more
> wrong it is. I'm attracted to the idea of dealing with it as an estimation
> problem and not needing to know about join types. Might have unintended
> consequences, though.
> 
> Long term, it would be great to calculate something about the distribution
> of cardinality estimates, so we can model risk in the estimates.
> 

Hi,

Laurenz suggested clamping to 2 back in 2017, in this thread:

https://www.postgresql.org/message-id/1509611428.3268.5.camel%40cybertec.at
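
For concreteness, a minimal sketch in C of the inverse-clamp idea from
the quoted text; the function name and the [2, 100] bounds are my own
illustrative assumptions, not actual planner code:

static double
clamp_suspicious_estimate(double est)
{
    double res;

    if (est >= 1.0)
        return est;        /* plausible estimate: leave it alone */

    /* The more impossibly low the estimate, the larger the result. */
    res = 1.0 / est;
    if (res < 2.0)
        res = 2.0;
    if (res > 100.0)
        res = 100.0;
    return res;
}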

Having been the victim of this problem in the past, I like the
risk-based approach to this. If the cost of being wrong in the
estimate is high, use a merge join instead. In every case that I have
encountered, that heuristic would have given the correct, performant
plan.
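
To illustrate what I mean, a toy decision rule (every name here,
including the 10x risk penalty, is hypothetical, not a planner field):
when the nested loop's cheapness rests on a clamped, possibly bogus,
low row-count estimate, inflate its cost before comparing it against
the merge join.

#include <stdbool.h>

typedef enum { PLAN_NESTLOOP, PLAN_MERGEJOIN } JoinChoice;

static JoinChoice
choose_join(double nestloop_cost, double mergejoin_cost,
            bool inner_estimate_was_clamped)
{
    /* Penalize the nested loop when its row estimate is suspect. */
    double risk_factor = inner_estimate_was_clamped ? 10.0 : 1.0;

    return (nestloop_cost * risk_factor < mergejoin_cost) ?
        PLAN_NESTLOOP : PLAN_MERGEJOIN;
}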

Regards,
Ken


