Discussion: About limit on cube dimensions
Hi

Why is there a limit on the number of cube dimensions?
It is a bit strange because ARRAY, which is similar to cube, has no such
limit.
Does it relate to R-tree?
Can I use a 10000-dimensional cube without R-tree?

---
sato
Shida Sato wrote:
> Hi
>
> Why is there a limit on the number of cube dimensions?
> It is a bit strange because ARRAY, which is similar to cube, has no such
> limit.
> Does it relate to R-tree?
> Can I use a 10000-dimensional cube without R-tree?

From the docs: http://www.postgresql.org/docs/9.4/static/cube.html

"To make it harder for people to break things, there is a limit of 100 on
the number of dimensions of cubes. This is set in cubedata.h if you need
something bigger."

Thus the limit is indeed arbitrary - though if you decide to recompile to
increase it, your expectations should be suitably tempered, since likely
few (if any) people are using cubes with 100 times the default number of
dimensions.

Given that R-tree is 2-dimensional, I'm not sure how it is relevant. The
docs also indicate that GiST effectively supersedes R-tree as an index
method...

David J.
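For scale, a cube value's storage grows linearly with its dimensionality, which is one reason recompiling with a much larger limit in cubedata.h is untested territory. A back-of-the-envelope sketch (the 16-byte header is an assumed round figure, not taken from cubedata.h; 8-byte doubles and two corners per non-point cube follow the cube docs):

```python
def cube_bytes(dims, is_point=False, header=16):
    """Approximate on-disk size of one cube value: one set of 8-byte
    double coordinates per corner, plus an assumed small header."""
    corners = 1 if is_point else 2
    return header + corners * dims * 8

print(cube_bytes(100))    # 1616 bytes at the default 100-dimension limit
print(cube_bytes(10000))  # 160016 bytes for the 10000-dimension case
```

Even at 10000 dimensions a single value is only around 156 KB, so the limit is less about individual value size than about what downstream code does with such values.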
David G Johnston <david.g.johnston@gmail.com> writes:
> Shida Sato wrote:
>> Why is there a limit on the number of cube dimensions?
>
> From the docs: http://www.postgresql.org/docs/9.4/static/cube.html
>
> "To make it harder for people to break things, there is a limit of 100 on
> the number of dimensions of cubes. This is set in cubedata.h if you need
> something bigger."
>
> Thus the limit is indeed arbitrary - though if you decide to recompile to
> increase it, your expectations should be suitably tempered, since likely
> few (if any) people are using cubes with 100 times the default number of
> dimensions.

Just offhand, it seems like that limit is doing a couple of things:

* Protecting against overflow in memory allocation requests. In theory we
  could raise the limit to something near MaxAllocSize/(sizeof(double)*2)
  without breaking this.

* Protecting against locking up the server if there are slow (O(N^2) or
  worse) algorithms in any of the cube functions.

Before considering a proposal to raise the default value I'd want to see
some investigation of the second point.

regards, tom lane
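Tom's first point can be worked out numerically: PostgreSQL's allocator refuses single requests at or above MaxAllocSize (0x3fffffff, i.e. 1 GB minus 1, defined in memutils.h), so the in-theory ceiling he mentions comes out as:

```python
# MaxAllocSize / (sizeof(double) * 2): each dimension needs two
# 8-byte doubles (lower-left and upper-right coordinates).
MAX_ALLOC_SIZE = 0x3FFFFFFF   # PostgreSQL's MaxAllocSize: 1 GB - 1
SIZEOF_DOUBLE = 8

theoretical_max_dims = MAX_ALLOC_SIZE // (SIZEOF_DOUBLE * 2)
print(theoretical_max_dims)   # 67108863, roughly 67 million dimensions
```

That is about five orders of magnitude above the compiled-in default of 100, which underlines that the practical concern is the second point (slow algorithms), not allocation overflow.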
On 01/09/14 09:05, Shida Sato wrote:
> Hi
>
> Why is there a limit on the number of cube dimensions?
> It is a bit strange because ARRAY, which is similar to cube, has no such
> limit.
> Does it relate to R-tree?
> Can I use a 10000-dimensional cube without R-tree?
>
> ---
> sato

Have you calculated how much disc space you would need to store a cube
with 10000 dimensions? Hint: an 8 TB disc would be woefully inadequate,
unless it was very sparsely populated.

Cheers,
Gavin
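Whether 8 TB suffices depends entirely on the row count. A sketch, assuming 8-byte doubles, two corners per cube, 8 TB taken as 8 * 10^12 bytes, and ignoring per-row headers and TOAST compression:

```python
# Bytes for one 10000-dimensional cube, then how many such cubes an
# 8 TB disc could hold under the stated assumptions.
bytes_per_cube = 2 * 10000 * 8        # 160,000 bytes, about 156 KiB
disc_bytes = 8 * 10**12

print(disc_bytes // bytes_per_cube)   # 50,000,000 cubes
```

So a single 10000-dimensional cube is cheap, but a table (and any index) holding tens of millions of them would exhaust the disc.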