How does Numeric division determine precision?

From: Will Pugh
Subject: How does Numeric division determine precision?
Msg-id CAM39vH_0snUv_YRCU11L2iB4zj=Dz7F9B_RhUPtYDqdSKymaDw@mail.gmail.com
Responses: Re: How does Numeric division determine precision? (Tom Lane <tgl@sss.pgh.pa.us>)
List: pgsql-sql
Hi,

I'm using Postgres 9.1, and wanted to understand how some of the
numeric operations work.

It seems that in 9.1, numerics that don't have a specified precision
and scale have arbitrary precision and scale.

For many operations this is straightforward.  However, when doing a
division operation that does not terminate, I'm curious about how the
number of digits is determined.

It seems like there is some minimum precision, e.g.

>select 1/3::numeric
0.33333333333333333333
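
For comparison, a couple more probes with scale-0 operands (the results
shown are inferred from the same pattern, so take the exact digits as a
guess rather than confirmed output):

>select 1/7::numeric
0.14285714285714285714

>select 10/3::numeric
3.3333333333333333

If that's right, the minimum isn't a fixed number of digits after the
decimal point either.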

However, when operating on numbers with larger precision:

>select .5353535353355353535353/74::numeric
0.0072345072342639912640

.5353535353355353535353 has 22 digits
.0072345072342639912640 also has 22 digits, but should the first two
0's after the decimal point count as "precision"?
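
Tallying just the digits after the decimal point with plain string
functions (nothing to do with how the server decides anything):

>select length(split_part('0.0072345072342639912640', '.', 2))
22

so the two leading zeroes are included in that count of 22.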


If I then do the same operation, but move the decimal point on the
divisor, I get a different amount of precision:

>select .5353535353355353535353/.0074::numeric
72.3450723426399126399054

.5353535353355353535353 still has 22 digits
72.3450723426399126399054 now has 24 digits
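
Counting fractional digits the same way:

>select length(split_part('72.3450723426399126399054', '.', 2))
22

so both results actually have 22 digits after the decimal point, the same
as the dividend's scale; the 24 above includes the "72" before the point.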


For the most part, this seems correct, but I'm interested in knowing
how you determine precision and scale for the result of a divide.  Is
there a well known algorithm?
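
My working guess from the results above (purely a guess; I haven't looked
at how it is actually done, and the constant 16 below is my own
assumption) is that the result keeps at least as many fractional digits
as either input, with some minimum kicking in when both input scales are
small, i.e. something shaped like:

>select greatest(22, 0, 16), greatest(22, 4, 16), greatest(0, 0, 16)
22 | 22 | 16

except that 1/3 comes back with 20 fractional digits rather than 16, so
the minimum itself clearly isn't a single fixed constant. That's the part
I'd like to understand.
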
  Thanks,
  --Will

