Re: [GENERAL] dynamic schema modeling and performance

From: Merlin Moncure
Subject: Re: [GENERAL] dynamic schema modeling and performance
Date:
Msg-id: CAHyXU0yewUUPQS98HhSMiwmoWkf0fQPA=C98JvJAc9SBdNBH9Q@mail.gmail.com
In reply to: [GENERAL] dynamic schema modeling and performance  (Rj Ewing <ewing.rj@gmail.com>)
Responses: Re: [GENERAL] dynamic schema modeling and performance  (Rj Ewing <ewing.rj@gmail.com>)
List: pgsql-general
On Tue, Apr 11, 2017 at 12:46 PM, Rj Ewing <ewing.rj@gmail.com> wrote:
> I'm looking for thoughts on the best way to handle dynamic schemas.
>
> The application I am developing revolves around user defined entities. Each
> entity is a tabular dataset with user defined columns and data types.
> Entities can also be related to each other through Parent-Child
> relationships. Some entities will be 100% user driven, while others (such as
> an entity representing a photo) will be partially user driven (all photo
> entities will have common fields + custom user additions).
>
> I was hoping to get opinions on whether postgresql would be a suitable
> backend. A couple of options I have thought of are:
>
> 1. Each entity is represented as a table in psql. The schema would be
> dynamically updated (with limits) when an entity mapping is updated. I
> believe that this would provide the best data constraints and allow the best
> data normalization. A concern I have is that there could be an enormous
> amount of tables generated and the performance impacts this might have in
> the future. I could then run elasticsearch as a denormalized cache for
> efficient querying and full-text-search.
>
> 2. Use a nosql database. This provides the "dynamic" schema aspect. A
> concern here is the lack of relation support, thus leading to a more
> denormalized data structure and the potential for the data to become
> corrupted.
>
> Any opinions on the use of psql for this case, or other options would be
> greatly appreciated!

Postgres can function as a nosql database -- you can use jsonb, for
example, to store data in such a way that the data model can be
changed without making schema adjustments.  Another way to do it is
the EAV (entity-attribute-value) pattern, as noted.  These might be
good strategies if you're not sure where you want to end up.
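The two options can be sketched side by side. This is a hypothetical
illustration, not code from the thread: it uses Python's stdlib sqlite3
(with SQLite's built-in JSON1 functions) as a stand-in for a Postgres
jsonb column, and all table and column names are made up.

```python
# Option A: one JSON document per row (Postgres: a jsonb column).
# Option B: EAV -- one row per (entity, attribute, value) triple.
# SQLite's json_extract() plays the role of Postgres's doc->>'key'.
import json
import sqlite3

conn = sqlite3.connect(":memory:")

# Option A: the whole user-defined record lives in one document column.
conn.execute("CREATE TABLE entity (id INTEGER PRIMARY KEY, doc TEXT)")
conn.execute(
    "INSERT INTO entity (doc) VALUES (?)",
    (json.dumps({"kind": "photo", "camera": "X100", "iso": 400}),),
)
# Postgres equivalent:
#   SELECT doc->>'camera' FROM entity WHERE doc->>'kind' = 'photo';
camera = conn.execute(
    "SELECT json_extract(doc, '$.camera') FROM entity "
    "WHERE json_extract(doc, '$.kind') = 'photo'"
).fetchone()[0]

# Option B: EAV spreads the same record over attribute rows.
conn.execute(
    "CREATE TABLE eav (entity_id INTEGER, attr TEXT, value TEXT, "
    "PRIMARY KEY (entity_id, attr))"
)
conn.executemany(
    "INSERT INTO eav VALUES (?, ?, ?)",
    [(1, "kind", "photo"), (1, "camera", "X100"), (1, "iso", "400")],
)
rows = conn.execute(
    "SELECT attr, value FROM eav WHERE entity_id = 1 ORDER BY attr"
).fetchall()
```

Either way, new user-defined fields appear without any ALTER TABLE; the
cost is that the database no longer knows the fields' types.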

It really comes down to this: how formal do you want your data model
to be?  Adding formality opens up performance optimizations, exposes
your data to the fantastic SQL language, and allows rigorous
assumptions to be made, and trusted, by external dependencies.
Formality also brings a degree of inflexibility, since your data has
to be forced into predetermined structures.
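That tradeoff shows up even in miniature. A hypothetical sketch (again
using stdlib sqlite3; the "photo"/"iso" names are invented): a formal,
constrained column lets the database reject bad data outright, an
assumption that a free-form document column cannot enforce for you.

```python
# A formal column with NOT NULL and a CHECK constraint encodes a
# rigorous assumption (iso is always a positive integer) that every
# external consumer of the table can then trust.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE photo (id INTEGER PRIMARY KEY, "
    "iso INTEGER NOT NULL CHECK (iso > 0))"
)

conn.execute("INSERT INTO photo (iso) VALUES (400)")  # accepted

rejected = False
try:
    conn.execute("INSERT INTO photo (iso) VALUES (-1)")  # violates CHECK
except sqlite3.IntegrityError:
    rejected = True

row_count = conn.execute("SELECT count(*) FROM photo").fetchone()[0]
```

The same -1 stored inside a JSON document would be accepted silently;
the check would have to live in every application that writes the data.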

merlin


In the pgsql-general list, by date sent:

Previous
From: Igor Neyman
Date:
Message: Re: [GENERAL] advice for efresh of materialized view
Next
From: "Ivan Fabris, Gruppo ColliniConsulting.it"
Date:
Message: [GENERAL] R: advice for efresh of materialized view